Feb 19 21:28:07 crc systemd[1]: Starting Kubernetes Kubelet...
Feb 19 21:28:07 crc restorecon[4769]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 19 21:28:07 crc restorecon[4769]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 19 21:28:07 crc restorecon[4769]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:07 crc 
restorecon[4769]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 21:28:07 crc restorecon[4769]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 19 21:28:07 crc restorecon[4769]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 21:28:07 crc restorecon[4769]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 21:28:07 crc 
restorecon[4769]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 19 
21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 19 21:28:07 crc restorecon[4769]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin
to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to
system_u:object_r:container_file_t:s0:c14,c22 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 
21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 21:28:08 crc restorecon[4769]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 21:28:08 crc restorecon[4769]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 19 21:28:08 crc 
restorecon[4769]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc 
restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc 
restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 21:28:08 crc restorecon[4769]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 19 21:28:08 crc restorecon[4769]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 
19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 
crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc 
restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc 
restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc 
restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc 
restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 21:28:08 crc restorecon[4769]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 21:28:08 crc 
restorecon[4769]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 21:28:08 crc restorecon[4769]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 19 21:28:08 crc restorecon[4769]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 19 21:28:08 crc restorecon[4769]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Feb 19 21:28:09 crc kubenswrapper[4795]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 19 21:28:09 crc kubenswrapper[4795]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Feb 19 21:28:09 crc kubenswrapper[4795]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 19 21:28:09 crc kubenswrapper[4795]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Feb 19 21:28:09 crc kubenswrapper[4795]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Feb 19 21:28:09 crc kubenswrapper[4795]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.253906 4795 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.259407 4795 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.259431 4795 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.259438 4795 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.259443 4795 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.259449 4795 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.259455 4795 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.259460 4795 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.259465 4795 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.259471 4795 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.259477 4795 feature_gate.go:330] 
unrecognized feature gate: MultiArchInstallAzure Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.259483 4795 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.259489 4795 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.259502 4795 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.259510 4795 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.259515 4795 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.259521 4795 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.259526 4795 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.259531 4795 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.259537 4795 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.259544 4795 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.259551 4795 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.259557 4795 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.259562 4795 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.259568 4795 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.259575 4795 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.259580 4795 feature_gate.go:330] unrecognized feature gate: Example Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.259586 4795 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.259592 4795 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.259599 4795 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.259605 4795 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.259613 4795 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.259618 4795 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.259623 4795 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.259630 4795 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.259636 4795 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.259642 4795 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.259648 4795 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.259653 4795 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.259659 4795 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.259665 4795 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.259671 4795 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.259678 4795 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.259685 4795 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.259692 4795 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.259698 4795 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.259704 4795 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.259709 4795 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.259714 4795 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.259720 4795 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.259725 4795 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.259730 4795 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.259736 4795 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.259740 4795 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.259746 4795 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.259751 4795 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.259756 4795 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.259761 4795 feature_gate.go:330] 
unrecognized feature gate: MultiArchInstallAWS Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.259767 4795 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.259772 4795 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.259777 4795 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.259782 4795 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.259787 4795 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.259792 4795 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.259798 4795 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.259803 4795 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.259808 4795 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.259813 4795 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.259818 4795 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.259823 4795 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.259828 4795 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.259833 4795 feature_gate.go:330] unrecognized feature gate: 
VSphereStaticIPs Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.260559 4795 flags.go:64] FLAG: --address="0.0.0.0" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.260579 4795 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.260592 4795 flags.go:64] FLAG: --anonymous-auth="true" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.260600 4795 flags.go:64] FLAG: --application-metrics-count-limit="100" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.260610 4795 flags.go:64] FLAG: --authentication-token-webhook="false" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.260616 4795 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.260625 4795 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.260633 4795 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.260639 4795 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.260645 4795 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.260652 4795 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.260658 4795 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.260664 4795 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.260671 4795 flags.go:64] FLAG: --cgroup-root="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.260677 4795 flags.go:64] FLAG: --cgroups-per-qos="true" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.260683 4795 flags.go:64] FLAG: --client-ca-file="" Feb 19 21:28:09 crc 
kubenswrapper[4795]: I0219 21:28:09.260688 4795 flags.go:64] FLAG: --cloud-config="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.260695 4795 flags.go:64] FLAG: --cloud-provider="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.260700 4795 flags.go:64] FLAG: --cluster-dns="[]" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.260708 4795 flags.go:64] FLAG: --cluster-domain="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.260714 4795 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.260720 4795 flags.go:64] FLAG: --config-dir="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.260726 4795 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.260733 4795 flags.go:64] FLAG: --container-log-max-files="5" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.260741 4795 flags.go:64] FLAG: --container-log-max-size="10Mi" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.260747 4795 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.260753 4795 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.260760 4795 flags.go:64] FLAG: --containerd-namespace="k8s.io" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.260765 4795 flags.go:64] FLAG: --contention-profiling="false" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.260771 4795 flags.go:64] FLAG: --cpu-cfs-quota="true" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.260777 4795 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.260783 4795 flags.go:64] FLAG: --cpu-manager-policy="none" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.260789 4795 flags.go:64] FLAG: --cpu-manager-policy-options="" Feb 19 
21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.260797 4795 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.260803 4795 flags.go:64] FLAG: --enable-controller-attach-detach="true" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.260808 4795 flags.go:64] FLAG: --enable-debugging-handlers="true" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.260815 4795 flags.go:64] FLAG: --enable-load-reader="false" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.260821 4795 flags.go:64] FLAG: --enable-server="true" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.260828 4795 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.260836 4795 flags.go:64] FLAG: --event-burst="100" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.260842 4795 flags.go:64] FLAG: --event-qps="50" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.260848 4795 flags.go:64] FLAG: --event-storage-age-limit="default=0" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.260854 4795 flags.go:64] FLAG: --event-storage-event-limit="default=0" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.260860 4795 flags.go:64] FLAG: --eviction-hard="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.260868 4795 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.260874 4795 flags.go:64] FLAG: --eviction-minimum-reclaim="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.260880 4795 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.260887 4795 flags.go:64] FLAG: --eviction-soft="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.260893 4795 flags.go:64] FLAG: --eviction-soft-grace-period="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.260900 4795 flags.go:64] FLAG: 
--exit-on-lock-contention="false" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.260906 4795 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.260911 4795 flags.go:64] FLAG: --experimental-mounter-path="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.260917 4795 flags.go:64] FLAG: --fail-cgroupv1="false" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.260924 4795 flags.go:64] FLAG: --fail-swap-on="true" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.260930 4795 flags.go:64] FLAG: --feature-gates="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.260937 4795 flags.go:64] FLAG: --file-check-frequency="20s" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.260943 4795 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.260949 4795 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.260955 4795 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.260962 4795 flags.go:64] FLAG: --healthz-port="10248" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.260968 4795 flags.go:64] FLAG: --help="false" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.260974 4795 flags.go:64] FLAG: --hostname-override="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.260979 4795 flags.go:64] FLAG: --housekeeping-interval="10s" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.260985 4795 flags.go:64] FLAG: --http-check-frequency="20s" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.260992 4795 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.260998 4795 flags.go:64] FLAG: --image-credential-provider-config="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261003 4795 flags.go:64] FLAG: 
--image-gc-high-threshold="85" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261009 4795 flags.go:64] FLAG: --image-gc-low-threshold="80" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261016 4795 flags.go:64] FLAG: --image-service-endpoint="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261022 4795 flags.go:64] FLAG: --kernel-memcg-notification="false" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261028 4795 flags.go:64] FLAG: --kube-api-burst="100" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261034 4795 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261040 4795 flags.go:64] FLAG: --kube-api-qps="50" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261046 4795 flags.go:64] FLAG: --kube-reserved="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261052 4795 flags.go:64] FLAG: --kube-reserved-cgroup="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261057 4795 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261064 4795 flags.go:64] FLAG: --kubelet-cgroups="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261069 4795 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261075 4795 flags.go:64] FLAG: --lock-file="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261081 4795 flags.go:64] FLAG: --log-cadvisor-usage="false" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261087 4795 flags.go:64] FLAG: --log-flush-frequency="5s" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261093 4795 flags.go:64] FLAG: --log-json-info-buffer-size="0" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261102 4795 flags.go:64] FLAG: --log-json-split-stream="false" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261108 4795 flags.go:64] FLAG: 
--log-text-info-buffer-size="0" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261113 4795 flags.go:64] FLAG: --log-text-split-stream="false" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261121 4795 flags.go:64] FLAG: --logging-format="text" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261127 4795 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261134 4795 flags.go:64] FLAG: --make-iptables-util-chains="true" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261140 4795 flags.go:64] FLAG: --manifest-url="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261146 4795 flags.go:64] FLAG: --manifest-url-header="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261154 4795 flags.go:64] FLAG: --max-housekeeping-interval="15s" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261177 4795 flags.go:64] FLAG: --max-open-files="1000000" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261185 4795 flags.go:64] FLAG: --max-pods="110" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261191 4795 flags.go:64] FLAG: --maximum-dead-containers="-1" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261197 4795 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261203 4795 flags.go:64] FLAG: --memory-manager-policy="None" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261209 4795 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261215 4795 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261221 4795 flags.go:64] FLAG: --node-ip="192.168.126.11" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261227 4795 flags.go:64] FLAG: 
--node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261240 4795 flags.go:64] FLAG: --node-status-max-images="50" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261247 4795 flags.go:64] FLAG: --node-status-update-frequency="10s" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261253 4795 flags.go:64] FLAG: --oom-score-adj="-999" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261259 4795 flags.go:64] FLAG: --pod-cidr="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261265 4795 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261273 4795 flags.go:64] FLAG: --pod-manifest-path="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261279 4795 flags.go:64] FLAG: --pod-max-pids="-1" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261285 4795 flags.go:64] FLAG: --pods-per-core="0" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261291 4795 flags.go:64] FLAG: --port="10250" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261297 4795 flags.go:64] FLAG: --protect-kernel-defaults="false" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261303 4795 flags.go:64] FLAG: --provider-id="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261309 4795 flags.go:64] FLAG: --qos-reserved="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261315 4795 flags.go:64] FLAG: --read-only-port="10255" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261321 4795 flags.go:64] FLAG: --register-node="true" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261327 4795 flags.go:64] FLAG: --register-schedulable="true" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261332 4795 flags.go:64] FLAG: 
--register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261342 4795 flags.go:64] FLAG: --registry-burst="10" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261349 4795 flags.go:64] FLAG: --registry-qps="5" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261355 4795 flags.go:64] FLAG: --reserved-cpus="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261360 4795 flags.go:64] FLAG: --reserved-memory="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261368 4795 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261375 4795 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261381 4795 flags.go:64] FLAG: --rotate-certificates="false" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261387 4795 flags.go:64] FLAG: --rotate-server-certificates="false" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261392 4795 flags.go:64] FLAG: --runonce="false" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261398 4795 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261405 4795 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261411 4795 flags.go:64] FLAG: --seccomp-default="false" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261417 4795 flags.go:64] FLAG: --serialize-image-pulls="true" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261423 4795 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261429 4795 flags.go:64] FLAG: --storage-driver-db="cadvisor" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261435 4795 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261441 
4795 flags.go:64] FLAG: --storage-driver-password="root" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261447 4795 flags.go:64] FLAG: --storage-driver-secure="false" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261453 4795 flags.go:64] FLAG: --storage-driver-table="stats" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261459 4795 flags.go:64] FLAG: --storage-driver-user="root" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261465 4795 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261471 4795 flags.go:64] FLAG: --sync-frequency="1m0s" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261477 4795 flags.go:64] FLAG: --system-cgroups="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261483 4795 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261492 4795 flags.go:64] FLAG: --system-reserved-cgroup="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261498 4795 flags.go:64] FLAG: --tls-cert-file="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261504 4795 flags.go:64] FLAG: --tls-cipher-suites="[]" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261511 4795 flags.go:64] FLAG: --tls-min-version="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261517 4795 flags.go:64] FLAG: --tls-private-key-file="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261523 4795 flags.go:64] FLAG: --topology-manager-policy="none" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261529 4795 flags.go:64] FLAG: --topology-manager-policy-options="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261564 4795 flags.go:64] FLAG: --topology-manager-scope="container" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261571 4795 flags.go:64] FLAG: --v="2" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261579 4795 
flags.go:64] FLAG: --version="false" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261588 4795 flags.go:64] FLAG: --vmodule="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261595 4795 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.261601 4795 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.261760 4795 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.261767 4795 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.261772 4795 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.261778 4795 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.261785 4795 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.261791 4795 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.261797 4795 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.261802 4795 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.261808 4795 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.261813 4795 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.261818 4795 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.261827 4795 feature_gate.go:330] 
unrecognized feature gate: MultiArchInstallAWS Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.261832 4795 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.261837 4795 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.261843 4795 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.261848 4795 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.261853 4795 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.261859 4795 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.261864 4795 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.261869 4795 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.261874 4795 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.261879 4795 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.261884 4795 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.261891 4795 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.261898 4795 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.261905 4795 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.261911 4795 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.261917 4795 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.261922 4795 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.261927 4795 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.261932 4795 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.261938 4795 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.261943 4795 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.261948 4795 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.261953 4795 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.261958 4795 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.261965 4795 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.261972 4795 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.261977 4795 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.261983 4795 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.261989 4795 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.261995 4795 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.262000 4795 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.262009 4795 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.262018 4795 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.262026 4795 feature_gate.go:330] unrecognized feature gate: Example Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.262035 4795 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.262042 4795 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.262049 4795 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.262055 4795 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.262062 4795 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.262069 4795 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.262076 4795 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.262083 4795 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.262089 4795 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.262094 4795 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.262099 4795 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.262105 4795 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.262110 4795 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.262115 4795 feature_gate.go:330] 
unrecognized feature gate: GCPLabelsTags Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.262120 4795 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.262127 4795 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.262133 4795 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.262138 4795 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.262143 4795 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.262148 4795 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.262154 4795 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.262178 4795 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.262184 4795 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.262189 4795 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.262195 4795 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.262986 4795 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false 
ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.274982 4795 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.275035 4795 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.275215 4795 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.275231 4795 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.275241 4795 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.275251 4795 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.275260 4795 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.275270 4795 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.275279 4795 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.275288 4795 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.275299 4795 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.275309 4795 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.275318 4795 feature_gate.go:330] unrecognized feature gate: 
NetworkSegmentation Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.275327 4795 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.275336 4795 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.275344 4795 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.275353 4795 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.275361 4795 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.275369 4795 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.275378 4795 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.275390 4795 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.275401 4795 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.275411 4795 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.275419 4795 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.275430 4795 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.275439 4795 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.275447 4795 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.275455 4795 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.275464 4795 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.275474 4795 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.275482 4795 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.275490 4795 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.275499 4795 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.275509 4795 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.275518 4795 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.275527 4795 feature_gate.go:330] 
unrecognized feature gate: SetEIPForNLBIngressController Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.275535 4795 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.275544 4795 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.275552 4795 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.275560 4795 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.275572 4795 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.275581 4795 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.275590 4795 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.275598 4795 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.275607 4795 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.275615 4795 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.275623 4795 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.275631 4795 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.275640 4795 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.275648 4795 feature_gate.go:330] unrecognized 
feature gate: Example Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.275657 4795 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.275667 4795 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.275677 4795 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.275686 4795 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.275694 4795 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.275702 4795 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.275711 4795 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.275719 4795 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.275727 4795 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.275735 4795 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.275744 4795 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.275752 4795 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.275761 4795 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.275769 4795 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 
21:28:09.275777 4795 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.275786 4795 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.275798 4795 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.275811 4795 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.275822 4795 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.275831 4795 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.275840 4795 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.275849 4795 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.275858 4795 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.275873 4795 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.276162 4795 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 19 21:28:09 crc 
kubenswrapper[4795]: W0219 21:28:09.276200 4795 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.276211 4795 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.276221 4795 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.276230 4795 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.276239 4795 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.276247 4795 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.276256 4795 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.276265 4795 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.276273 4795 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.276282 4795 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.276291 4795 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.276301 4795 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.276313 4795 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.276325 4795 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.276335 4795 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.276345 4795 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.276355 4795 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.276364 4795 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.276373 4795 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.276381 4795 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.276389 4795 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.276398 4795 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.276406 4795 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.276416 4795 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.276425 4795 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.276462 4795 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.276472 4795 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.276483 4795 
feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.276491 4795 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.276502 4795 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.276512 4795 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.276521 4795 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.276530 4795 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.276538 4795 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.276547 4795 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.276556 4795 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.276565 4795 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.276573 4795 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.276582 4795 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.276590 4795 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.276599 4795 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.276608 4795 feature_gate.go:330] unrecognized feature gate: 
OVNObservability Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.276616 4795 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.276627 4795 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.276638 4795 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.276648 4795 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.276657 4795 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.276668 4795 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.276679 4795 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.276689 4795 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.276698 4795 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.276708 4795 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.276717 4795 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.276727 4795 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.276736 4795 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 
21:28:09.276745 4795 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.276754 4795 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.276762 4795 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.276771 4795 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.276780 4795 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.276790 4795 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.276798 4795 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.276806 4795 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.276814 4795 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.276823 4795 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.276831 4795 feature_gate.go:330] unrecognized feature gate: Example Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.276839 4795 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.276848 4795 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.276856 4795 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.276865 4795 feature_gate.go:330] unrecognized feature gate: 
MultiArchInstallGCP Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.276877 4795 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.277212 4795 server.go:940] "Client rotation is on, will bootstrap in background" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.288348 4795 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.288523 4795 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
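The same set of `feature_gate.go:330] unrecognized feature gate: ...` warnings repeats three times above, once per feature-gate parse, before each `feature gates: {map[...]}` summary. A minimal sketch for deduplicating and counting those warnings from a captured excerpt; the three embedded sample entries are copied from the log above, and feeding in a full capture (e.g. via `journalctl -u kubelet`) instead of the inline variable is an assumption about how the log was collected:

```shell
# Summarize repeated "unrecognized feature gate" kubelet warnings.
# Sample entries copied verbatim from the journal above; a real run
# would pipe in the full capture, e.g.:
#   journalctl -u kubelet | grep 'feature_gate.go:330'
log='Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.262026 4795 feature_gate.go:330] unrecognized feature gate: Example
Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.262035 4795 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.275648 4795 feature_gate.go:330] unrecognized feature gate: Example'

# Strip everything up to the gate name, then count occurrences per gate.
printf '%s\n' "$log" \
  | sed -n 's/.*unrecognized feature gate: //p' \
  | sort | uniq -c | sort -rn
```

On the sample input this prints `Example` with a count of 2 and `UpgradeStatus` with a count of 1; on the full log each gate should appear once per parse pass.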
Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.290473 4795 server.go:997] "Starting client certificate rotation" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.290521 4795 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.290679 4795 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2026-01-10 15:45:51.324563235 +0000 UTC Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.290765 4795 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.319316 4795 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 19 21:28:09 crc kubenswrapper[4795]: E0219 21:28:09.321936 4795 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.69:6443: connect: connection refused" logger="UnhandledError" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.322589 4795 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.338399 4795 log.go:25] "Validated CRI v1 runtime API" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.380887 4795 log.go:25] "Validated CRI v1 image API" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.383026 4795 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.389696 4795 
fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-02-19-21-23-30-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.389728 4795 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.414650 4795 manager.go:217] Machine: {Timestamp:2026-02-19 21:28:09.410915793 +0000 UTC m=+0.603433747 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:5c733625-a853-45dd-88a0-4f8c78e571ae BootID:a0a3e1a0-f657-4990-877d-cc9b59bcb3d8 Filesystems:[{Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 
Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:83:4c:2b Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:83:4c:2b Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:54:d1:0f Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:47:9b:f5 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:af:e2:d5 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:41:b2:56 Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:4c:26:a3 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:d2:0b:28:b8:2b:ab Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:82:6a:3d:56:d4:d3 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] 
Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.415066 
4795 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.415350 4795 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.417043 4795 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.417261 4795 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.417304 4795 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"
imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.417520 4795 topology_manager.go:138] "Creating topology manager with none policy" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.417531 4795 container_manager_linux.go:303] "Creating device plugin manager" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.418011 4795 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.418039 4795 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.418979 4795 state_mem.go:36] "Initialized new in-memory state store" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.419084 4795 server.go:1245] "Using root directory" path="/var/lib/kubelet" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.422698 4795 kubelet.go:418] "Attempting to sync node with API server" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.422718 4795 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.422739 4795 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.422751 4795 kubelet.go:324] "Adding apiserver pod source" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.422761 4795
apiserver.go:42] "Waiting for node sync before watching apiserver pods" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.427790 4795 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.428761 4795 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.429108 4795 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.69:6443: connect: connection refused Feb 19 21:28:09 crc kubenswrapper[4795]: E0219 21:28:09.429397 4795 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.69:6443: connect: connection refused" logger="UnhandledError" Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.429100 4795 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.69:6443: connect: connection refused Feb 19 21:28:09 crc kubenswrapper[4795]: E0219 21:28:09.429502 4795 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.69:6443: connect: connection refused" logger="UnhandledError" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.431986 
4795 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.433345 4795 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.433372 4795 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.433383 4795 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.433393 4795 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.433407 4795 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.433417 4795 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.433426 4795 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.433440 4795 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.433449 4795 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.433458 4795 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.433470 4795 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.433478 4795 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.434375 4795 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Feb 19 21:28:09 crc kubenswrapper[4795]: 
I0219 21:28:09.434847 4795 server.go:1280] "Started kubelet" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.435977 4795 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.435894 4795 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Feb 19 21:28:09 crc systemd[1]: Started Kubernetes Kubelet. Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.436947 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.69:6443: connect: connection refused Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.437731 4795 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.440358 4795 server.go:460] "Adding debug handlers to kubelet server" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.440381 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.440426 4795 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.440473 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-18 16:41:41.267344623 +0000 UTC Feb 19 21:28:09 crc kubenswrapper[4795]: E0219 21:28:09.440686 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.441088 4795 volume_manager.go:287] "The desired_state_of_world populator starts" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.441134 4795 volume_manager.go:289] "Starting Kubelet Volume Manager" 
Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.441223 4795 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.446219 4795 factory.go:55] Registering systemd factory Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.446364 4795 factory.go:221] Registration of the systemd container factory successfully Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.446830 4795 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.69:6443: connect: connection refused Feb 19 21:28:09 crc kubenswrapper[4795]: E0219 21:28:09.446939 4795 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.69:6443: connect: connection refused" logger="UnhandledError" Feb 19 21:28:09 crc kubenswrapper[4795]: E0219 21:28:09.447678 4795 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.69:6443: connect: connection refused" interval="200ms" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.450031 4795 factory.go:153] Registering CRI-O factory Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.450074 4795 factory.go:221] Registration of the crio container factory successfully Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.450324 4795 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or 
directory Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.450447 4795 factory.go:103] Registering Raw factory Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.450550 4795 manager.go:1196] Started watching for new ooms in manager Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.451775 4795 manager.go:319] Starting recovery of all containers Feb 19 21:28:09 crc kubenswrapper[4795]: E0219 21:28:09.449932 4795 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.69:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.1895c30d17679f6e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-19 21:28:09.434816366 +0000 UTC m=+0.627334240,LastTimestamp:2026-02-19 21:28:09.434816366 +0000 UTC m=+0.627334240,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.461561 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.461782 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.461940 4795 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.462071 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.462251 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.462391 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.462522 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.462702 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.462835 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.463093 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.463234 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.463517 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.463684 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.463817 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.463929 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.464050 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.464159 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.464312 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.464434 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.464543 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.464654 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.464766 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.464885 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.465707 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.465737 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.465753 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.465770 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" 
volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.465783 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.465793 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.468585 4795 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.468637 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.468653 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.468664 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.468698 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.468710 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.468720 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.468730 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.468739 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.468749 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" 
volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.468758 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.468770 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.468783 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.468794 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.468804 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.468816 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" 
volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.468827 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.468836 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.468846 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.468860 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.468870 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.468879 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" 
seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.468889 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.468901 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.468916 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.468928 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.468941 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.468951 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.468965 4795 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.468976 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.468987 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.468997 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469009 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469019 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469029 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469038 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469049 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469060 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469073 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469084 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469094 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" 
volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469104 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469114 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469123 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469134 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469145 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469185 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" 
volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469196 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469229 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469238 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469247 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469257 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469266 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" 
seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469278 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469288 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469297 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469305 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469314 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469323 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 
21:28:09.469333 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469343 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469352 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469367 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469376 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469386 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469396 4795 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469405 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469414 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469424 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469435 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469445 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469455 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469465 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469474 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469484 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469493 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469509 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469521 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469532 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469542 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469552 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469562 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469572 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469580 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" 
volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469591 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469600 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469610 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469620 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469631 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469640 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" 
volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469649 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469659 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469670 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469680 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469690 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469699 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" 
seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469708 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469718 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469728 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469737 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469747 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469756 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 
21:28:09.469766 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469774 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469784 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469794 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469803 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469811 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469826 4795 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469835 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469844 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469852 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469862 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469870 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469878 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469887 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469896 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469905 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469914 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469923 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469935 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" 
volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469947 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469959 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469970 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469983 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.469994 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.470005 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" 
volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.470016 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.470026 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.470035 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.470045 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.470055 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.470065 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.470075 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.470090 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.470099 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.470113 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.470125 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.470136 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.470148 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.470178 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.470187 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.470196 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.470206 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.470219 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" 
volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.470231 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.470249 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.470262 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.470272 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.470281 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.470294 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" 
volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.470304 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.470317 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.470330 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.470342 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.470353 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.470371 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.470383 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.470393 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.470406 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.470419 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.470433 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.470445 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" 
seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.470456 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.470469 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.470481 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.470493 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.470504 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.470516 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: 
I0219 21:28:09.470528 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.470541 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.470554 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.470568 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.470580 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.470592 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.470605 4795 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.470659 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.470675 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.470690 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.470704 4795 reconstruct.go:97] "Volume reconstruction finished" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.470712 4795 reconciler.go:26] "Reconciler: start to sync state" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.477343 4795 manager.go:324] Recovery completed Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.485758 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.489908 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.489938 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.489947 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.490824 4795 cpu_manager.go:225] "Starting CPU manager" policy="none" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.490842 4795 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.490860 4795 state_mem.go:36] "Initialized new in-memory state store" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.506983 4795 policy_none.go:49] "None policy: Start" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.507381 4795 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.508924 4795 memory_manager.go:170] "Starting memorymanager" policy="None" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.508967 4795 state_mem.go:35] "Initializing new in-memory state store" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.510301 4795 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.510342 4795 status_manager.go:217] "Starting to sync pod status with apiserver" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.510387 4795 kubelet.go:2335] "Starting kubelet main sync loop" Feb 19 21:28:09 crc kubenswrapper[4795]: E0219 21:28:09.510538 4795 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Feb 19 21:28:09 crc kubenswrapper[4795]: W0219 21:28:09.513383 4795 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.69:6443: connect: connection refused Feb 19 21:28:09 crc kubenswrapper[4795]: E0219 21:28:09.513452 4795 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.69:6443: connect: connection refused" logger="UnhandledError" Feb 19 21:28:09 crc kubenswrapper[4795]: E0219 21:28:09.540894 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.566722 4795 manager.go:334] "Starting Device Plugin manager" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.566937 4795 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.566956 4795 server.go:79] "Starting device plugin registration server" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.567411 4795 eviction_manager.go:189] "Eviction manager: starting control loop" Feb 19 21:28:09 crc 
kubenswrapper[4795]: I0219 21:28:09.567437 4795 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.567700 4795 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.567806 4795 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.567821 4795 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Feb 19 21:28:09 crc kubenswrapper[4795]: E0219 21:28:09.581990 4795 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.611252 4795 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.611381 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.612529 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.612564 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.612576 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.612688 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" 
Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.613495 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.613550 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.613566 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.614900 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.614930 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.615026 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.615259 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.615311 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.615693 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.615718 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.615732 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.616042 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.616073 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.616086 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.616863 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.616899 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.616937 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.617068 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.617263 
4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.617308 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.618036 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.618048 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.618069 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.618054 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.618082 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.618085 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.618221 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.618456 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.618493 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.619014 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.619049 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.619063 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.619242 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.619266 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.619618 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.619644 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.619658 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.620021 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.620054 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" 
Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.620070 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:09 crc kubenswrapper[4795]: E0219 21:28:09.648628 4795 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.69:6443: connect: connection refused" interval="400ms" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.668147 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.669102 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.669215 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.669276 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.669349 4795 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 19 21:28:09 crc kubenswrapper[4795]: E0219 21:28:09.669926 4795 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.69:6443: connect: connection refused" node="crc" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.674014 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.674054 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.674081 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.674100 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.674120 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.674143 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " 
pod="openshift-etcd/etcd-crc" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.674180 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.674200 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.674221 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.674249 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.674268 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.674286 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.674306 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.674325 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.674343 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.775386 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.775457 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.775487 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.775516 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.775540 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.775539 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.775560 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod 
\"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.775599 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.775559 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.775620 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.775654 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.775634 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.775630 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.775696 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.775758 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.775796 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.775831 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.775864 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.775879 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.775908 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.775934 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.775950 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.776020 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.775985 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.776076 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.776103 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.776132 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.776000 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.776192 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.776280 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.870420 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.871693 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.871737 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.871749 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.871773 4795 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 19 21:28:09 crc kubenswrapper[4795]: E0219 21:28:09.872411 4795 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.69:6443: connect: connection refused" node="crc" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.952133 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.959864 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.982456 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.993982 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 21:28:09 crc kubenswrapper[4795]: I0219 21:28:09.998511 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 19 21:28:10 crc kubenswrapper[4795]: W0219 21:28:10.001705 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-b47a8142a7afb77d258a0a13a9dabc2ad44a1f4fdd97300a5380f1f8e0ed114c WatchSource:0}: Error finding container b47a8142a7afb77d258a0a13a9dabc2ad44a1f4fdd97300a5380f1f8e0ed114c: Status 404 returned error can't find the container with id b47a8142a7afb77d258a0a13a9dabc2ad44a1f4fdd97300a5380f1f8e0ed114c Feb 19 21:28:10 crc kubenswrapper[4795]: W0219 21:28:10.002269 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-8d7155c94d809efc56101f044f5f3d3f5349f8db497db23aac8b4ca9af3b95ad WatchSource:0}: Error finding container 8d7155c94d809efc56101f044f5f3d3f5349f8db497db23aac8b4ca9af3b95ad: Status 404 returned error can't find the container with id 8d7155c94d809efc56101f044f5f3d3f5349f8db497db23aac8b4ca9af3b95ad Feb 19 21:28:10 crc kubenswrapper[4795]: W0219 21:28:10.015068 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-938deda710b7d2e0ae84aedb2ed979bdbd151f4c177d29c2d7bb98cb26e88156 WatchSource:0}: Error finding container 938deda710b7d2e0ae84aedb2ed979bdbd151f4c177d29c2d7bb98cb26e88156: Status 404 returned error can't find 
the container with id 938deda710b7d2e0ae84aedb2ed979bdbd151f4c177d29c2d7bb98cb26e88156 Feb 19 21:28:10 crc kubenswrapper[4795]: W0219 21:28:10.020564 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-43eac1f67411d904d8e0fa83ad09f4ce6ffed5e108221c037a6a3d964d03acf0 WatchSource:0}: Error finding container 43eac1f67411d904d8e0fa83ad09f4ce6ffed5e108221c037a6a3d964d03acf0: Status 404 returned error can't find the container with id 43eac1f67411d904d8e0fa83ad09f4ce6ffed5e108221c037a6a3d964d03acf0 Feb 19 21:28:10 crc kubenswrapper[4795]: W0219 21:28:10.023288 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-2c227d41d7b6e9f82d3792eb097a80191a34d4db030e203cbd6fdd857458f057 WatchSource:0}: Error finding container 2c227d41d7b6e9f82d3792eb097a80191a34d4db030e203cbd6fdd857458f057: Status 404 returned error can't find the container with id 2c227d41d7b6e9f82d3792eb097a80191a34d4db030e203cbd6fdd857458f057 Feb 19 21:28:10 crc kubenswrapper[4795]: E0219 21:28:10.049455 4795 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.69:6443: connect: connection refused" interval="800ms" Feb 19 21:28:10 crc kubenswrapper[4795]: I0219 21:28:10.273556 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 21:28:10 crc kubenswrapper[4795]: I0219 21:28:10.275045 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:10 crc kubenswrapper[4795]: I0219 21:28:10.275099 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" 
Feb 19 21:28:10 crc kubenswrapper[4795]: I0219 21:28:10.275112 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:10 crc kubenswrapper[4795]: I0219 21:28:10.275142 4795 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 19 21:28:10 crc kubenswrapper[4795]: E0219 21:28:10.275661 4795 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.69:6443: connect: connection refused" node="crc" Feb 19 21:28:10 crc kubenswrapper[4795]: W0219 21:28:10.437689 4795 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.69:6443: connect: connection refused Feb 19 21:28:10 crc kubenswrapper[4795]: E0219 21:28:10.437770 4795 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.69:6443: connect: connection refused" logger="UnhandledError" Feb 19 21:28:10 crc kubenswrapper[4795]: I0219 21:28:10.437822 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.69:6443: connect: connection refused Feb 19 21:28:10 crc kubenswrapper[4795]: I0219 21:28:10.441191 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 08:26:25.123113566 +0000 UTC Feb 19 21:28:10 crc kubenswrapper[4795]: I0219 21:28:10.515258 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"b47a8142a7afb77d258a0a13a9dabc2ad44a1f4fdd97300a5380f1f8e0ed114c"} Feb 19 21:28:10 crc kubenswrapper[4795]: I0219 21:28:10.516146 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"2c227d41d7b6e9f82d3792eb097a80191a34d4db030e203cbd6fdd857458f057"} Feb 19 21:28:10 crc kubenswrapper[4795]: I0219 21:28:10.517717 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"43eac1f67411d904d8e0fa83ad09f4ce6ffed5e108221c037a6a3d964d03acf0"} Feb 19 21:28:10 crc kubenswrapper[4795]: I0219 21:28:10.519092 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"938deda710b7d2e0ae84aedb2ed979bdbd151f4c177d29c2d7bb98cb26e88156"} Feb 19 21:28:10 crc kubenswrapper[4795]: I0219 21:28:10.520200 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"8d7155c94d809efc56101f044f5f3d3f5349f8db497db23aac8b4ca9af3b95ad"} Feb 19 21:28:10 crc kubenswrapper[4795]: E0219 21:28:10.851154 4795 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.69:6443: connect: connection refused" interval="1.6s" Feb 19 21:28:10 crc kubenswrapper[4795]: W0219 21:28:10.894716 4795 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.69:6443: connect: connection refused Feb 19 21:28:10 crc kubenswrapper[4795]: E0219 21:28:10.895016 4795 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.69:6443: connect: connection refused" logger="UnhandledError" Feb 19 21:28:10 crc kubenswrapper[4795]: W0219 21:28:10.900401 4795 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.69:6443: connect: connection refused Feb 19 21:28:10 crc kubenswrapper[4795]: E0219 21:28:10.900544 4795 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.69:6443: connect: connection refused" logger="UnhandledError" Feb 19 21:28:10 crc kubenswrapper[4795]: W0219 21:28:10.950399 4795 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.69:6443: connect: connection refused Feb 19 21:28:10 crc kubenswrapper[4795]: E0219 21:28:10.950481 4795 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.69:6443: connect: connection 
refused" logger="UnhandledError" Feb 19 21:28:11 crc kubenswrapper[4795]: I0219 21:28:11.076714 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 21:28:11 crc kubenswrapper[4795]: I0219 21:28:11.078123 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:11 crc kubenswrapper[4795]: I0219 21:28:11.078195 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:11 crc kubenswrapper[4795]: I0219 21:28:11.078214 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:11 crc kubenswrapper[4795]: I0219 21:28:11.078248 4795 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 19 21:28:11 crc kubenswrapper[4795]: E0219 21:28:11.078839 4795 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.69:6443: connect: connection refused" node="crc" Feb 19 21:28:11 crc kubenswrapper[4795]: I0219 21:28:11.366067 4795 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 19 21:28:11 crc kubenswrapper[4795]: E0219 21:28:11.368398 4795 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.69:6443: connect: connection refused" logger="UnhandledError" Feb 19 21:28:11 crc kubenswrapper[4795]: I0219 21:28:11.438250 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 
38.102.83.69:6443: connect: connection refused Feb 19 21:28:11 crc kubenswrapper[4795]: I0219 21:28:11.441320 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 23:45:30.492106168 +0000 UTC Feb 19 21:28:11 crc kubenswrapper[4795]: I0219 21:28:11.524465 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5ad6763b33a3d85ce3446b4056d2703b9f342137b6a4aa23a3ae9870df8d0c29"} Feb 19 21:28:11 crc kubenswrapper[4795]: I0219 21:28:11.524541 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b44b2ef22256f35fadefb61f59a542aa0df8bc5648285e2624b924391b80766c"} Feb 19 21:28:11 crc kubenswrapper[4795]: I0219 21:28:11.524558 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d9ae61d10d79a82b8d2e11aee70d7bb13c429feaa4b7ce1a7746cadab1169d96"} Feb 19 21:28:11 crc kubenswrapper[4795]: I0219 21:28:11.524563 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 21:28:11 crc kubenswrapper[4795]: I0219 21:28:11.524571 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"8704324a3c5bb3f3023c0b000b233d650dede0825930494b987121784c094bdd"} Feb 19 21:28:11 crc kubenswrapper[4795]: I0219 21:28:11.525839 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:11 crc 
kubenswrapper[4795]: I0219 21:28:11.525877 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:11 crc kubenswrapper[4795]: I0219 21:28:11.525888 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:11 crc kubenswrapper[4795]: I0219 21:28:11.526609 4795 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="c85f029538174c7a4516d507fc84620e018838ff13e0a950cb25e184da8cd51f" exitCode=0 Feb 19 21:28:11 crc kubenswrapper[4795]: I0219 21:28:11.526665 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 21:28:11 crc kubenswrapper[4795]: I0219 21:28:11.526674 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"c85f029538174c7a4516d507fc84620e018838ff13e0a950cb25e184da8cd51f"} Feb 19 21:28:11 crc kubenswrapper[4795]: I0219 21:28:11.527658 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:11 crc kubenswrapper[4795]: I0219 21:28:11.527695 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:11 crc kubenswrapper[4795]: I0219 21:28:11.527708 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:11 crc kubenswrapper[4795]: I0219 21:28:11.528563 4795 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a" exitCode=0 Feb 19 21:28:11 crc kubenswrapper[4795]: I0219 21:28:11.528625 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a"} Feb 19 21:28:11 crc kubenswrapper[4795]: I0219 21:28:11.528776 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 21:28:11 crc kubenswrapper[4795]: I0219 21:28:11.530082 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:11 crc kubenswrapper[4795]: I0219 21:28:11.530133 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:11 crc kubenswrapper[4795]: I0219 21:28:11.530152 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:11 crc kubenswrapper[4795]: I0219 21:28:11.530589 4795 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f" exitCode=0 Feb 19 21:28:11 crc kubenswrapper[4795]: I0219 21:28:11.530665 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f"} Feb 19 21:28:11 crc kubenswrapper[4795]: I0219 21:28:11.530741 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 21:28:11 crc kubenswrapper[4795]: I0219 21:28:11.531741 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:11 crc kubenswrapper[4795]: I0219 21:28:11.531777 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:11 crc kubenswrapper[4795]: I0219 21:28:11.531791 4795 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:11 crc kubenswrapper[4795]: I0219 21:28:11.532370 4795 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="102e901dae8a92a3ee6f3d2dc5ef2b7b855d256f36c619ddd4723f960f064120" exitCode=0 Feb 19 21:28:11 crc kubenswrapper[4795]: I0219 21:28:11.532397 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"102e901dae8a92a3ee6f3d2dc5ef2b7b855d256f36c619ddd4723f960f064120"} Feb 19 21:28:11 crc kubenswrapper[4795]: I0219 21:28:11.532423 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 21:28:11 crc kubenswrapper[4795]: I0219 21:28:11.533296 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:11 crc kubenswrapper[4795]: I0219 21:28:11.533331 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:11 crc kubenswrapper[4795]: I0219 21:28:11.533345 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:11 crc kubenswrapper[4795]: I0219 21:28:11.533722 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 21:28:11 crc kubenswrapper[4795]: I0219 21:28:11.534806 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:11 crc kubenswrapper[4795]: I0219 21:28:11.534841 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:11 crc kubenswrapper[4795]: I0219 21:28:11.534855 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" 
Feb 19 21:28:11 crc kubenswrapper[4795]: I0219 21:28:11.848736 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 21:28:12 crc kubenswrapper[4795]: W0219 21:28:12.088554 4795 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.69:6443: connect: connection refused Feb 19 21:28:12 crc kubenswrapper[4795]: E0219 21:28:12.088683 4795 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.69:6443: connect: connection refused" logger="UnhandledError" Feb 19 21:28:12 crc kubenswrapper[4795]: I0219 21:28:12.305313 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 21:28:12 crc kubenswrapper[4795]: I0219 21:28:12.438644 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.69:6443: connect: connection refused Feb 19 21:28:12 crc kubenswrapper[4795]: I0219 21:28:12.441765 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 01:05:37.831126384 +0000 UTC Feb 19 21:28:12 crc kubenswrapper[4795]: E0219 21:28:12.452462 4795 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.69:6443: connect: connection refused" 
interval="3.2s" Feb 19 21:28:12 crc kubenswrapper[4795]: I0219 21:28:12.537092 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"3fd942d9f97c8b019171b7d06c9f4a4b85decbf641bd03a76701b7d27520db62"} Feb 19 21:28:12 crc kubenswrapper[4795]: I0219 21:28:12.537150 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"fd41d9aec7be2e2d5e5ec8b06259364f0b7a1bf4fadbd8fe8abb8ff625cd6072"} Feb 19 21:28:12 crc kubenswrapper[4795]: I0219 21:28:12.537184 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"7547886d43376602238ac541e375478233fda6564afeb045cb84192812dd7136"} Feb 19 21:28:12 crc kubenswrapper[4795]: I0219 21:28:12.537111 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 21:28:12 crc kubenswrapper[4795]: I0219 21:28:12.538009 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:12 crc kubenswrapper[4795]: I0219 21:28:12.538037 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:12 crc kubenswrapper[4795]: I0219 21:28:12.538047 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:12 crc kubenswrapper[4795]: I0219 21:28:12.540071 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650"} Feb 19 
21:28:12 crc kubenswrapper[4795]: I0219 21:28:12.540127 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0"} Feb 19 21:28:12 crc kubenswrapper[4795]: I0219 21:28:12.540140 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b"} Feb 19 21:28:12 crc kubenswrapper[4795]: I0219 21:28:12.540152 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c"} Feb 19 21:28:12 crc kubenswrapper[4795]: I0219 21:28:12.540157 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 21:28:12 crc kubenswrapper[4795]: I0219 21:28:12.540184 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce"} Feb 19 21:28:12 crc kubenswrapper[4795]: I0219 21:28:12.540823 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:12 crc kubenswrapper[4795]: I0219 21:28:12.540850 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:12 crc kubenswrapper[4795]: I0219 21:28:12.540861 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:12 crc kubenswrapper[4795]: I0219 
21:28:12.541402 4795 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3" exitCode=0 Feb 19 21:28:12 crc kubenswrapper[4795]: I0219 21:28:12.541421 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3"} Feb 19 21:28:12 crc kubenswrapper[4795]: I0219 21:28:12.541464 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 21:28:12 crc kubenswrapper[4795]: I0219 21:28:12.541945 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:12 crc kubenswrapper[4795]: I0219 21:28:12.541972 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:12 crc kubenswrapper[4795]: I0219 21:28:12.541985 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:12 crc kubenswrapper[4795]: I0219 21:28:12.542656 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"1bd81c001aed1af030c80aa990687a56dd4027ad3e379133e13d2c4aa2f777eb"} Feb 19 21:28:12 crc kubenswrapper[4795]: I0219 21:28:12.542684 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 21:28:12 crc kubenswrapper[4795]: I0219 21:28:12.542714 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 21:28:12 crc kubenswrapper[4795]: I0219 21:28:12.546355 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 19 21:28:12 crc kubenswrapper[4795]: I0219 21:28:12.546395 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:12 crc kubenswrapper[4795]: I0219 21:28:12.546417 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:12 crc kubenswrapper[4795]: I0219 21:28:12.546511 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:12 crc kubenswrapper[4795]: I0219 21:28:12.546528 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:12 crc kubenswrapper[4795]: I0219 21:28:12.546543 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:12 crc kubenswrapper[4795]: I0219 21:28:12.624243 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 21:28:12 crc kubenswrapper[4795]: I0219 21:28:12.624543 4795 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="Get \"https://192.168.126.11:6443/livez\": dial tcp 192.168.126.11:6443: connect: connection refused" start-of-body= Feb 19 21:28:12 crc kubenswrapper[4795]: I0219 21:28:12.624613 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/livez\": dial tcp 192.168.126.11:6443: connect: connection refused" Feb 19 21:28:12 crc kubenswrapper[4795]: W0219 21:28:12.633268 4795 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get 
"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.69:6443: connect: connection refused Feb 19 21:28:12 crc kubenswrapper[4795]: E0219 21:28:12.633390 4795 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.69:6443: connect: connection refused" logger="UnhandledError" Feb 19 21:28:12 crc kubenswrapper[4795]: I0219 21:28:12.679268 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 21:28:12 crc kubenswrapper[4795]: I0219 21:28:12.680947 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:12 crc kubenswrapper[4795]: I0219 21:28:12.680996 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:12 crc kubenswrapper[4795]: I0219 21:28:12.681008 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:12 crc kubenswrapper[4795]: I0219 21:28:12.681037 4795 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 19 21:28:12 crc kubenswrapper[4795]: E0219 21:28:12.681666 4795 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.69:6443: connect: connection refused" node="crc" Feb 19 21:28:13 crc kubenswrapper[4795]: I0219 21:28:13.441885 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 18:02:39.983665064 +0000 UTC Feb 19 21:28:13 crc kubenswrapper[4795]: I0219 21:28:13.548955 4795 generic.go:334] "Generic (PLEG): 
container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6" exitCode=0 Feb 19 21:28:13 crc kubenswrapper[4795]: I0219 21:28:13.549151 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 21:28:13 crc kubenswrapper[4795]: I0219 21:28:13.549287 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 21:28:13 crc kubenswrapper[4795]: I0219 21:28:13.549406 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 21:28:13 crc kubenswrapper[4795]: I0219 21:28:13.549505 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 21:28:13 crc kubenswrapper[4795]: I0219 21:28:13.549420 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 21:28:13 crc kubenswrapper[4795]: I0219 21:28:13.549742 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6"} Feb 19 21:28:13 crc kubenswrapper[4795]: I0219 21:28:13.549809 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 19 21:28:13 crc kubenswrapper[4795]: I0219 21:28:13.549830 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 21:28:13 crc kubenswrapper[4795]: I0219 21:28:13.551335 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:13 crc kubenswrapper[4795]: I0219 21:28:13.551382 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:13 
crc kubenswrapper[4795]: I0219 21:28:13.551393 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:13 crc kubenswrapper[4795]: I0219 21:28:13.552088 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:13 crc kubenswrapper[4795]: I0219 21:28:13.552093 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:13 crc kubenswrapper[4795]: I0219 21:28:13.552135 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:13 crc kubenswrapper[4795]: I0219 21:28:13.552146 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:13 crc kubenswrapper[4795]: I0219 21:28:13.552151 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:13 crc kubenswrapper[4795]: I0219 21:28:13.552215 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:13 crc kubenswrapper[4795]: I0219 21:28:13.552155 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:13 crc kubenswrapper[4795]: I0219 21:28:13.552254 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:13 crc kubenswrapper[4795]: I0219 21:28:13.552272 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:13 crc kubenswrapper[4795]: I0219 21:28:13.552216 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:13 crc kubenswrapper[4795]: I0219 21:28:13.552317 4795 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:13 crc kubenswrapper[4795]: I0219 21:28:13.552332 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:14 crc kubenswrapper[4795]: I0219 21:28:14.442083 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 05:37:39.256400512 +0000 UTC Feb 19 21:28:14 crc kubenswrapper[4795]: I0219 21:28:14.555214 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"689f35bbe7bc51411e25a7a6e4bce91b45d819ed6697ed322c4d5b4b3a244451"} Feb 19 21:28:14 crc kubenswrapper[4795]: I0219 21:28:14.555285 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"8fe281962bb5076796c2b27f81ffe63c9310bf49f5989714cbe2fc0fe5b6f32d"} Feb 19 21:28:14 crc kubenswrapper[4795]: I0219 21:28:14.555308 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e9415ce5a9989f2b286fff8f33781b5f1c510b61ef5e7af7050ed7f1a322dc6f"} Feb 19 21:28:14 crc kubenswrapper[4795]: I0219 21:28:14.555314 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 21:28:14 crc kubenswrapper[4795]: I0219 21:28:14.555328 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a9cd220ea624daba9c66de834e5fb2b0dd067d489fa005bacca4280b010e2ef7"} Feb 19 21:28:14 crc kubenswrapper[4795]: I0219 21:28:14.556301 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 
21:28:14 crc kubenswrapper[4795]: I0219 21:28:14.557902 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:14 crc kubenswrapper[4795]: I0219 21:28:14.557939 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:14 crc kubenswrapper[4795]: I0219 21:28:14.557951 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:14 crc kubenswrapper[4795]: I0219 21:28:14.558656 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:14 crc kubenswrapper[4795]: I0219 21:28:14.558711 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:14 crc kubenswrapper[4795]: I0219 21:28:14.558727 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:15 crc kubenswrapper[4795]: I0219 21:28:15.443001 4795 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 19 21:28:15 crc kubenswrapper[4795]: I0219 21:28:15.443007 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 15:58:58.534075262 +0000 UTC Feb 19 21:28:15 crc kubenswrapper[4795]: I0219 21:28:15.566130 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ac2cdc3d1e391f142caa70f86039fc213117c5a49f6e39eb770867385d781eca"} Feb 19 21:28:15 crc kubenswrapper[4795]: I0219 21:28:15.566274 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 21:28:15 crc kubenswrapper[4795]: I0219 21:28:15.567386 4795 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:15 crc kubenswrapper[4795]: I0219 21:28:15.567426 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:15 crc kubenswrapper[4795]: I0219 21:28:15.567438 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:15 crc kubenswrapper[4795]: I0219 21:28:15.882682 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 21:28:15 crc kubenswrapper[4795]: I0219 21:28:15.884026 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:15 crc kubenswrapper[4795]: I0219 21:28:15.884087 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:15 crc kubenswrapper[4795]: I0219 21:28:15.884108 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:15 crc kubenswrapper[4795]: I0219 21:28:15.884146 4795 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 19 21:28:15 crc kubenswrapper[4795]: I0219 21:28:15.926291 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 21:28:15 crc kubenswrapper[4795]: I0219 21:28:15.926504 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 21:28:15 crc kubenswrapper[4795]: I0219 21:28:15.927614 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:15 crc kubenswrapper[4795]: I0219 21:28:15.927659 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:15 crc kubenswrapper[4795]: 
I0219 21:28:15.927671 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:16 crc kubenswrapper[4795]: I0219 21:28:16.443832 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 13:12:11.883067042 +0000 UTC Feb 19 21:28:16 crc kubenswrapper[4795]: I0219 21:28:16.568742 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 21:28:16 crc kubenswrapper[4795]: I0219 21:28:16.569470 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:16 crc kubenswrapper[4795]: I0219 21:28:16.569497 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:16 crc kubenswrapper[4795]: I0219 21:28:16.569505 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:16 crc kubenswrapper[4795]: I0219 21:28:16.667336 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 21:28:16 crc kubenswrapper[4795]: I0219 21:28:16.667496 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 21:28:16 crc kubenswrapper[4795]: I0219 21:28:16.668513 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:16 crc kubenswrapper[4795]: I0219 21:28:16.668545 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:16 crc kubenswrapper[4795]: I0219 21:28:16.668555 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:17 crc kubenswrapper[4795]: I0219 21:28:17.168447 4795 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Feb 19 21:28:17 crc kubenswrapper[4795]: I0219 21:28:17.443981 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 01:59:55.431854002 +0000 UTC Feb 19 21:28:17 crc kubenswrapper[4795]: I0219 21:28:17.571095 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 21:28:17 crc kubenswrapper[4795]: I0219 21:28:17.572419 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:17 crc kubenswrapper[4795]: I0219 21:28:17.572459 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:17 crc kubenswrapper[4795]: I0219 21:28:17.572471 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:18 crc kubenswrapper[4795]: I0219 21:28:18.444932 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 00:15:27.553636558 +0000 UTC Feb 19 21:28:18 crc kubenswrapper[4795]: I0219 21:28:18.927299 4795 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 19 21:28:18 crc kubenswrapper[4795]: I0219 21:28:18.927392 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": 
context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 21:28:19 crc kubenswrapper[4795]: I0219 21:28:19.445791 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 18:44:38.801576063 +0000 UTC Feb 19 21:28:19 crc kubenswrapper[4795]: E0219 21:28:19.582601 4795 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 19 21:28:20 crc kubenswrapper[4795]: I0219 21:28:20.446503 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 09:26:21.225881096 +0000 UTC Feb 19 21:28:21 crc kubenswrapper[4795]: I0219 21:28:21.246265 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 21:28:21 crc kubenswrapper[4795]: I0219 21:28:21.246468 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 21:28:21 crc kubenswrapper[4795]: I0219 21:28:21.247889 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:21 crc kubenswrapper[4795]: I0219 21:28:21.248007 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:21 crc kubenswrapper[4795]: I0219 21:28:21.248032 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:21 crc kubenswrapper[4795]: I0219 21:28:21.254203 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 21:28:21 crc kubenswrapper[4795]: I0219 21:28:21.447124 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate 
expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 23:07:19.43318457 +0000 UTC Feb 19 21:28:21 crc kubenswrapper[4795]: I0219 21:28:21.577403 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Feb 19 21:28:21 crc kubenswrapper[4795]: I0219 21:28:21.577623 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 21:28:21 crc kubenswrapper[4795]: I0219 21:28:21.578569 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:21 crc kubenswrapper[4795]: I0219 21:28:21.578626 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:21 crc kubenswrapper[4795]: I0219 21:28:21.578644 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:21 crc kubenswrapper[4795]: I0219 21:28:21.579351 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 21:28:21 crc kubenswrapper[4795]: I0219 21:28:21.579946 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:21 crc kubenswrapper[4795]: I0219 21:28:21.579977 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:21 crc kubenswrapper[4795]: I0219 21:28:21.579993 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:21 crc kubenswrapper[4795]: I0219 21:28:21.585980 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 21:28:22 crc kubenswrapper[4795]: I0219 21:28:22.447501 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 
2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 07:16:37.641335874 +0000 UTC Feb 19 21:28:22 crc kubenswrapper[4795]: I0219 21:28:22.580926 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 21:28:22 crc kubenswrapper[4795]: I0219 21:28:22.581792 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:22 crc kubenswrapper[4795]: I0219 21:28:22.581824 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:22 crc kubenswrapper[4795]: I0219 21:28:22.581837 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:23 crc kubenswrapper[4795]: I0219 21:28:23.312359 4795 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 19 21:28:23 crc kubenswrapper[4795]: I0219 21:28:23.312436 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Feb 19 21:28:23 crc kubenswrapper[4795]: I0219 21:28:23.319002 4795 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path 
\"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 19 21:28:23 crc kubenswrapper[4795]: I0219 21:28:23.319049 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Feb 19 21:28:23 crc kubenswrapper[4795]: I0219 21:28:23.448066 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 06:44:28.037567679 +0000 UTC Feb 19 21:28:24 crc kubenswrapper[4795]: I0219 21:28:24.449273 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 05:47:42.017951031 +0000 UTC Feb 19 21:28:25 crc kubenswrapper[4795]: I0219 21:28:25.449973 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 10:42:14.916548207 +0000 UTC Feb 19 21:28:26 crc kubenswrapper[4795]: I0219 21:28:26.450674 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-17 18:55:43.436638474 +0000 UTC Feb 19 21:28:27 crc kubenswrapper[4795]: I0219 21:28:27.451269 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 09:14:24.512533075 +0000 UTC Feb 19 21:28:27 crc kubenswrapper[4795]: I0219 21:28:27.632214 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 21:28:27 crc kubenswrapper[4795]: I0219 21:28:27.632421 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 21:28:27 crc kubenswrapper[4795]: I0219 
21:28:27.634555 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:27 crc kubenswrapper[4795]: I0219 21:28:27.634637 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:27 crc kubenswrapper[4795]: I0219 21:28:27.634664 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:27 crc kubenswrapper[4795]: I0219 21:28:27.638877 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 21:28:28 crc kubenswrapper[4795]: E0219 21:28:28.301887 4795 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.303658 4795 trace.go:236] Trace[622355876]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (19-Feb-2026 21:28:15.779) (total time: 12524ms): Feb 19 21:28:28 crc kubenswrapper[4795]: Trace[622355876]: ---"Objects listed" error: 12524ms (21:28:28.303) Feb 19 21:28:28 crc kubenswrapper[4795]: Trace[622355876]: [12.524563965s] [12.524563965s] END Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.303685 4795 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.304742 4795 trace.go:236] Trace[1224879230]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (19-Feb-2026 21:28:13.437) (total time: 14867ms): Feb 19 21:28:28 crc kubenswrapper[4795]: Trace[1224879230]: ---"Objects listed" error: 14867ms (21:28:28.304) Feb 19 21:28:28 crc kubenswrapper[4795]: Trace[1224879230]: [14.867472332s] [14.867472332s] END Feb 19 21:28:28 
crc kubenswrapper[4795]: I0219 21:28:28.304761 4795 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.309653 4795 trace.go:236] Trace[1378771009]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (19-Feb-2026 21:28:17.137) (total time: 11172ms): Feb 19 21:28:28 crc kubenswrapper[4795]: Trace[1378771009]: ---"Objects listed" error: 11172ms (21:28:28.309) Feb 19 21:28:28 crc kubenswrapper[4795]: Trace[1378771009]: [11.172581909s] [11.172581909s] END Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.309686 4795 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.309705 4795 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.309962 4795 trace.go:236] Trace[1492480695]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (19-Feb-2026 21:28:13.684) (total time: 14625ms): Feb 19 21:28:28 crc kubenswrapper[4795]: Trace[1492480695]: ---"Objects listed" error: 14625ms (21:28:28.309) Feb 19 21:28:28 crc kubenswrapper[4795]: Trace[1492480695]: [14.625326642s] [14.625326642s] END Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.309989 4795 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 19 21:28:28 crc kubenswrapper[4795]: E0219 21:28:28.311112 4795 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.316403 4795 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.435747 4795 
apiserver.go:52] "Watching apiserver" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.444608 4795 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.444782 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"] Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.445076 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.445273 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.445078 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:28:28 crc kubenswrapper[4795]: E0219 21:28:28.445331 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.445357 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.445396 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.445403 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 19 21:28:28 crc kubenswrapper[4795]: E0219 21:28:28.445504 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:28:28 crc kubenswrapper[4795]: E0219 21:28:28.445539 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.447940 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.447987 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.448079 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.448231 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.448593 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.450302 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.450486 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.450670 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.450794 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.451802 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline 
is 2025-12-07 03:36:06.082769997 +0000 UTC Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.473021 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.485310 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.498038 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.509966 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.530197 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.542444 4795 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.543426 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.546701 4795 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get 
\"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:58612->192.168.126.11:17697: read: connection reset by peer" start-of-body= Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.546750 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:58612->192.168.126.11:17697: read: connection reset by peer" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.556146 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.566492 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.595784 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.597457 4795 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650" exitCode=255 Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.597489 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650"} Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.607146 4795 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.608570 4795 scope.go:117] "RemoveContainer" containerID="d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.609978 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.610397 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.610429 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.610450 4795 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.610465 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.610481 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.610517 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.610534 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.610552 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod 
\"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.610567 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.610581 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.610606 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.610620 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.610644 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.610660 4795 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.610674 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.610690 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.610717 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.610732 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.610748 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: 
\"43509403-f426-496e-be36-56cef71462f5\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.610762 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.610775 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.610801 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.610815 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.610829 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.610846 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.610861 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.610877 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.610893 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.610906 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611016 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611035 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611052 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611106 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611127 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611194 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611217 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611214 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611238 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611221 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611263 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611272 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611286 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611307 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611324 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 19 21:28:28 crc 
kubenswrapper[4795]: I0219 21:28:28.611358 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611381 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611401 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611422 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611445 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611459 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: 
"kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611468 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611491 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611499 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611512 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611582 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611608 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611619 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611665 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611706 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611737 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611762 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611784 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611818 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611846 4795 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611871 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611898 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611920 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611942 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611968 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: 
\"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611988 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612004 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612052 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612072 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612094 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612117 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612137 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612154 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612199 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612218 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612234 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" 
(UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612257 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612272 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612286 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612301 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612317 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612332 4795 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612351 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612368 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612384 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612399 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612453 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 19 21:28:28 crc 
kubenswrapper[4795]: I0219 21:28:28.612470 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612485 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612504 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612520 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612536 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612551 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612566 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612583 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612598 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612614 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612629 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 19 21:28:28 crc kubenswrapper[4795]: 
I0219 21:28:28.612645 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612663 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612679 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612696 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612712 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612728 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612746 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612763 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612780 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612795 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612810 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 19 21:28:28 crc 
kubenswrapper[4795]: I0219 21:28:28.612825 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612844 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612886 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612902 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612918 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612938 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612954 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612969 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612984 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.613002 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.613018 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: 
\"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.613033 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.613051 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.613068 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.613083 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.613099 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.613115 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.613130 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.613145 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.613176 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611791 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611812 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611819 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.611849 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612132 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612177 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612309 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612356 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612375 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612410 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612542 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612548 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612620 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612620 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612802 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612811 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.613286 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612818 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.613300 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.612904 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.613018 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.613327 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.613034 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.613195 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.613015 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.613424 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.613442 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.613472 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.613510 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.613558 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.613582 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.613600 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.613628 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.613711 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.613714 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.613818 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.613907 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.613926 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.613943 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.614033 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.614065 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.614126 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.614225 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.614245 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.613198 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.614314 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.614339 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 19 21:28:28 crc 
kubenswrapper[4795]: I0219 21:28:28.614360 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.614380 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.614403 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.614425 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.614447 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.614468 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod 
\"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.614475 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.614491 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.614516 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.614539 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.614542 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.614563 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.614586 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.614613 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.614633 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 19 21:28:28 crc kubenswrapper[4795]: E0219 21:28:28.614657 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:28:29.114640875 +0000 UTC m=+20.307158739 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.614677 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.614698 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.614720 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.614745 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.614772 4795 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.614794 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.614780 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.614817 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.614867 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.614905 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.614930 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.614951 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.614971 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.614992 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.615037 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.615073 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.615106 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.615129 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: 
\"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.615251 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.615278 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.615312 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.615339 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.615378 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.615400 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.615421 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.615448 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.615466 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.615482 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.615514 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.615531 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.615547 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.615574 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.615590 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.615611 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.615628 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.615726 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.615749 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.615773 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.615810 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.615832 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.616181 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.616213 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.616252 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.616652 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.617265 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.617482 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.617507 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.617573 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.617591 4795 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.617606 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.617641 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.617670 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.617691 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.617709 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.617726 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.617765 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.617789 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.617806 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.617822 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.617841 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.617858 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.617875 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.617893 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.617918 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.617935 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.617974 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.617993 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.618011 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.618029 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.618067 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.618078 4795 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.618087 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.618098 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.618107 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.618118 4795 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.618128 4795 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.618142 4795 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.618153 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.618212 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.618222 4795 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.618233 4795 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.619050 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.619101 4795 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.619202 4795 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.619230 4795 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.619268 4795 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.619285 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.619299 4795 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.619319 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" 
DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.619354 4795 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.619368 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.619435 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.619454 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.619469 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.619688 4795 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.619711 4795 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.619724 4795 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.619737 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.619782 4795 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.619800 4795 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.619880 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.619899 4795 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.619916 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.621504 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.621524 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.621542 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.621563 4795 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.621577 4795 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.621591 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.621606 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.621625 4795 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: 
\"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.621639 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.621652 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.621665 4795 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.621682 4795 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.621696 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.621713 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.621734 4795 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" 
Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.621746 4795 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.621761 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.621773 4795 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.621790 4795 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.621804 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.614868 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.614864 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.615256 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.615287 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.615590 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.615744 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.615851 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.615867 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.616031 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.616130 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.616200 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.616343 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.616490 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.616831 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.616839 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.616858 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.617037 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.617056 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.617230 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.617308 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.617576 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.617648 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.617696 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.618033 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.618220 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.618778 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.618859 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.619029 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.619859 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.619911 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.620494 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.620861 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.620920 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.621327 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.621416 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.621865 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.621968 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.622234 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.622929 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.623054 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.623102 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.623209 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.623403 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.623420 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.623604 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.623749 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.623758 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.623767 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.624066 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.624971 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.625274 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.629069 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.629333 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.630083 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.630550 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.630573 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.630580 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.630772 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.630813 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.630938 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.630939 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.631142 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.632459 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.632818 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.633054 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.633335 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.633385 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.633562 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.633661 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.633714 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.633911 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.631707 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.634342 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.634378 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.634428 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.634643 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.634765 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.634755 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.634774 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.635034 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.635128 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.630777 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.632205 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.635231 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.635288 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.635303 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.635335 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.636200 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.636212 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.636315 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.636351 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.636355 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.636351 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.623075 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.635234 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.636698 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.636841 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.636853 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.637039 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.637155 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.637188 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.637379 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.636109 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.637766 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.637870 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.638007 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.638072 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.637842 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.638151 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.638136 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.638410 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.638432 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.638538 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.638581 4795 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.638813 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: E0219 21:28:28.639216 4795 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.639305 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.639430 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.639587 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.639677 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: E0219 21:28:28.639736 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 21:28:29.139719316 +0000 UTC m=+20.332237180 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.639912 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: E0219 21:28:28.639970 4795 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 21:28:28 crc kubenswrapper[4795]: E0219 21:28:28.640191 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 21:28:29.140150029 +0000 UTC m=+20.332667903 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.640205 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.640410 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.640612 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.645255 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.645405 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.650663 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 19 21:28:28 crc kubenswrapper[4795]: E0219 21:28:28.654107 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 21:28:28 crc kubenswrapper[4795]: E0219 21:28:28.654129 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 21:28:28 crc kubenswrapper[4795]: E0219 21:28:28.654142 4795 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: 
[object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:28:28 crc kubenswrapper[4795]: E0219 21:28:28.654217 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-19 21:28:29.154200549 +0000 UTC m=+20.346718413 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:28:28 crc kubenswrapper[4795]: E0219 21:28:28.658007 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 21:28:28 crc kubenswrapper[4795]: E0219 21:28:28.658037 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 21:28:28 crc kubenswrapper[4795]: E0219 21:28:28.658049 4795 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:28:28 crc kubenswrapper[4795]: E0219 21:28:28.658099 4795 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-19 21:28:29.158082692 +0000 UTC m=+20.350600556 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.658778 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.665684 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.666379 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.666503 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.680265 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.680265 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.681748 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.681779 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.682921 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.683123 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.683841 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.684251 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.684982 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.685220 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.686323 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.686351 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.686337 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.686350 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.686628 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.687996 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.693200 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.693485 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.698690 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.701426 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.701904 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.710177 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.711741 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.722497 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.722542 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.722622 4795 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.722646 4795 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.722658 4795 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.722672 4795 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.722684 4795 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.722696 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.722708 4795 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.722702 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.722719 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.722762 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.722777 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.722793 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.722807 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.722820 4795 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.722840 4795 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.722853 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.722867 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.722879 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Feb 19 
21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.722893 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.722906 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.722918 4795 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.722931 4795 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.722943 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.722955 4795 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.722967 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 
21:28:28.722979 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.722991 4795 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723003 4795 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723015 4795 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723028 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723040 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723051 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723062 4795 reconciler_common.go:293] "Volume detached for volume \"cert\" 
(UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723074 4795 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723085 4795 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723099 4795 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723112 4795 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723123 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723136 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723148 4795 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723184 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723196 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723207 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723219 4795 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723230 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723243 4795 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723254 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node 
\"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723266 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723277 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723289 4795 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723300 4795 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723312 4795 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723323 4795 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723334 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723347 4795 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723359 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723370 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723382 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723395 4795 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723407 4795 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723418 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723432 4795 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723444 4795 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723461 4795 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723473 4795 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723484 4795 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723527 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723540 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723552 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723564 4795 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723576 4795 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723587 4795 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723599 4795 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723611 4795 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723624 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723637 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc 
kubenswrapper[4795]: I0219 21:28:28.723648 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723660 4795 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723671 4795 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723682 4795 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723693 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723705 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723716 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723730 4795 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723743 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723755 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723768 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723782 4795 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723794 4795 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723806 4795 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723817 4795 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723829 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723840 4795 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723851 4795 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723863 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723875 4795 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723886 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723898 4795 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on 
node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723909 4795 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723920 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723932 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723944 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723957 4795 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723971 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.723982 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: 
I0219 21:28:28.723993 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.724005 4795 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.724018 4795 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.724030 4795 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.724042 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.724055 4795 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.724068 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.724079 4795 reconciler_common.go:293] 
"Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.724090 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.724102 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.724113 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.724125 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.724136 4795 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.724150 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.724220 4795 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: 
\"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.724232 4795 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.724244 4795 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.724256 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.724268 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.724279 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.724291 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.724424 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" 
DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.724438 4795 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.724449 4795 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.724461 4795 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.724472 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.724484 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.724496 4795 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.724508 4795 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.724519 4795 reconciler_common.go:293] "Volume detached for volume 
\"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.724531 4795 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.724542 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.724556 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.724567 4795 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.724578 4795 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.724589 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.724601 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: 
\"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.761295 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.767993 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.773907 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 19 21:28:28 crc kubenswrapper[4795]: W0219 21:28:28.777884 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-1a2194b11ba3a035a9863493f71d4d63053a4ff1210ab5a016ee0a984f8f0136 WatchSource:0}: Error finding container 1a2194b11ba3a035a9863493f71d4d63053a4ff1210ab5a016ee0a984f8f0136: Status 404 returned error can't find the container with id 1a2194b11ba3a035a9863493f71d4d63053a4ff1210ab5a016ee0a984f8f0136 Feb 19 21:28:28 crc kubenswrapper[4795]: W0219 21:28:28.778641 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-4db24bb73e8db7b7311192e90d95536edc7b4e668b036dc5ed8311816045ea99 WatchSource:0}: Error finding container 4db24bb73e8db7b7311192e90d95536edc7b4e668b036dc5ed8311816045ea99: Status 404 returned error can't find the container with id 4db24bb73e8db7b7311192e90d95536edc7b4e668b036dc5ed8311816045ea99 Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.926930 4795 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller 
namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 19 21:28:28 crc kubenswrapper[4795]: I0219 21:28:28.926986 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.127969 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:28:29 crc kubenswrapper[4795]: E0219 21:28:29.128174 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:28:30.128132996 +0000 UTC m=+21.320650860 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.229259 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.229313 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.229340 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.229365 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:28:29 crc kubenswrapper[4795]: E0219 21:28:29.229506 4795 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 21:28:29 crc kubenswrapper[4795]: E0219 21:28:29.229503 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 21:28:29 crc kubenswrapper[4795]: E0219 21:28:29.229550 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 21:28:29 crc kubenswrapper[4795]: E0219 21:28:29.229555 4795 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 21:28:29 crc kubenswrapper[4795]: E0219 21:28:29.229581 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 21:28:29 crc kubenswrapper[4795]: E0219 21:28:29.229617 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 21:28:29 crc kubenswrapper[4795]: E0219 21:28:29.229632 4795 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:28:29 crc kubenswrapper[4795]: E0219 21:28:29.229564 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 21:28:30.229550983 +0000 UTC m=+21.422068847 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 21:28:29 crc kubenswrapper[4795]: E0219 21:28:29.229704 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 21:28:30.229680607 +0000 UTC m=+21.422198531 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 21:28:29 crc kubenswrapper[4795]: E0219 21:28:29.229719 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-19 21:28:30.229711398 +0000 UTC m=+21.422229272 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:28:29 crc kubenswrapper[4795]: E0219 21:28:29.229564 4795 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:28:29 crc kubenswrapper[4795]: E0219 21:28:29.229865 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-19 21:28:30.229831221 +0000 UTC m=+21.422349165 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.452435 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 14:34:53.534242081 +0000 UTC Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.514579 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.515378 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.516179 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.516760 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.517376 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.517878 4795 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.518742 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.519411 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.520097 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.522453 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.523187 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.523862 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.524494 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.525002 4795 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.525526 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.526011 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.526587 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.526994 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.527557 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.528082 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.528666 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.529219 4795 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.529639 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.530261 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.530678 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.531362 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.533495 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.533587 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.534457 4795 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.535380 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.536048 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.536716 4795 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.536855 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.540329 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.541105 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.541809 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Feb 19 21:28:29 
crc kubenswrapper[4795]: I0219 21:28:29.545082 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.546010 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.547276 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.548209 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.548738 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.549646 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.550332 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.551141 4795 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.552732 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.554087 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.554788 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.556028 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.556877 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.558416 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.559115 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.561537 4795 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.562758 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.565022 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.565402 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.566338 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.567634 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.582652 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"
/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-o
perator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' 
detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a
8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.598252 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.600812 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3"} Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.600855 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"1a2194b11ba3a035a9863493f71d4d63053a4ff1210ab5a016ee0a984f8f0136"} Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.603083 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.604628 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30"} Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.605196 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.606319 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5"} Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.606353 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697"} Feb 19 21:28:29 crc 
kubenswrapper[4795]: I0219 21:28:29.606368 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"c85a50be9fb5e82c64ae1cb4bd5344baef3eff3b4599c6a5e0402ed9421e65c6"} Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.607280 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"4db24bb73e8db7b7311192e90d95536edc7b4e668b036dc5ed8311816045ea99"} Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.610534 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.621634 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.638696 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containe
rID\\\":\\\"cri-o://7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.667683 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.682421 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.694785 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.705483 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.715604 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:29 crc kubenswrapper[4795]: I0219 21:28:29.725082 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:30 crc kubenswrapper[4795]: I0219 21:28:30.141356 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:28:30 crc kubenswrapper[4795]: E0219 21:28:30.141564 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:28:32.141534652 +0000 UTC m=+23.334052516 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:28:30 crc kubenswrapper[4795]: I0219 21:28:30.242618 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:28:30 crc kubenswrapper[4795]: I0219 21:28:30.242664 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:28:30 crc kubenswrapper[4795]: I0219 21:28:30.242698 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:28:30 crc kubenswrapper[4795]: I0219 21:28:30.242718 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:28:30 crc kubenswrapper[4795]: E0219 21:28:30.242784 4795 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 21:28:30 crc kubenswrapper[4795]: E0219 21:28:30.242863 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 21:28:32.242845686 +0000 UTC m=+23.435363550 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 21:28:30 crc kubenswrapper[4795]: E0219 21:28:30.242918 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 21:28:30 crc kubenswrapper[4795]: E0219 21:28:30.242795 4795 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 21:28:30 crc kubenswrapper[4795]: E0219 21:28:30.243042 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 21:28:32.243011901 +0000 UTC m=+23.435529795 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 21:28:30 crc kubenswrapper[4795]: E0219 21:28:30.242925 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 21:28:30 crc kubenswrapper[4795]: E0219 21:28:30.243103 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 21:28:30 crc kubenswrapper[4795]: E0219 21:28:30.243130 4795 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:28:30 crc kubenswrapper[4795]: E0219 21:28:30.242949 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 21:28:30 crc kubenswrapper[4795]: E0219 21:28:30.243208 4795 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:28:30 crc kubenswrapper[4795]: E0219 21:28:30.243228 4795 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-19 21:28:32.243207397 +0000 UTC m=+23.435725301 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:28:30 crc kubenswrapper[4795]: E0219 21:28:30.243297 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-19 21:28:32.243244648 +0000 UTC m=+23.435762512 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:28:30 crc kubenswrapper[4795]: I0219 21:28:30.452780 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 11:51:18.934310671 +0000 UTC Feb 19 21:28:30 crc kubenswrapper[4795]: I0219 21:28:30.511153 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:28:30 crc kubenswrapper[4795]: I0219 21:28:30.511197 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:28:30 crc kubenswrapper[4795]: I0219 21:28:30.511200 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:28:30 crc kubenswrapper[4795]: E0219 21:28:30.511302 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:28:30 crc kubenswrapper[4795]: E0219 21:28:30.511460 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:28:30 crc kubenswrapper[4795]: E0219 21:28:30.511674 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:28:31 crc kubenswrapper[4795]: I0219 21:28:31.453713 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 20:12:52.653068241 +0000 UTC Feb 19 21:28:31 crc kubenswrapper[4795]: I0219 21:28:31.603212 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Feb 19 21:28:31 crc kubenswrapper[4795]: I0219 21:28:31.614443 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9"} Feb 19 21:28:31 crc kubenswrapper[4795]: I0219 21:28:31.619316 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Feb 19 21:28:31 crc kubenswrapper[4795]: I0219 21:28:31.619990 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:31Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:31 crc kubenswrapper[4795]: I0219 21:28:31.620436 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Feb 19 21:28:31 crc kubenswrapper[4795]: I0219 21:28:31.635911 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:31Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:31 crc kubenswrapper[4795]: I0219 21:28:31.651533 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:31Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:31 crc kubenswrapper[4795]: I0219 21:28:31.663898 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:31Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:31 crc kubenswrapper[4795]: I0219 21:28:31.676278 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:31Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:31 crc kubenswrapper[4795]: I0219 21:28:31.690281 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:31Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:31 crc kubenswrapper[4795]: I0219 21:28:31.702559 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:31Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:31 crc kubenswrapper[4795]: I0219 21:28:31.716388 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:31Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:31 crc kubenswrapper[4795]: I0219 21:28:31.727554 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:31Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:31 crc kubenswrapper[4795]: I0219 21:28:31.739817 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:31Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:31 crc kubenswrapper[4795]: I0219 21:28:31.759292 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"405c08bd-3feb-45df-993f-2cb5e737cb6a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9415ce5a9989f2b286fff8f33781b5f1c510b61ef5e7af7050ed7f1a322dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe281962bb5076796c2b27f81ffe63c9310bf49f5989714cbe2fc0fe5b6f32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://689f35bbe7bc51411e25a7a6e4bce91b45d819ed6697ed322c4d5b4b3a244451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2cdc3d1e391f142caa70f86039fc213117c5a49f6e39eb770867385d781eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cd220ea624daba9c66de834e5fb2b0dd067d489fa005bacca4280b010e2ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:31Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:31 crc kubenswrapper[4795]: I0219 21:28:31.769968 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:31Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:31 crc kubenswrapper[4795]: I0219 21:28:31.782286 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:31Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:31 crc kubenswrapper[4795]: I0219 21:28:31.795567 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:31Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:31 crc kubenswrapper[4795]: I0219 21:28:31.808587 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:31Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:32 crc kubenswrapper[4795]: I0219 21:28:32.158110 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:28:32 crc kubenswrapper[4795]: E0219 21:28:32.158305 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:28:36.15828061 +0000 UTC m=+27.350798484 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:28:32 crc kubenswrapper[4795]: I0219 21:28:32.259238 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:28:32 crc kubenswrapper[4795]: I0219 21:28:32.259296 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:28:32 crc kubenswrapper[4795]: I0219 21:28:32.259322 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:28:32 crc kubenswrapper[4795]: I0219 21:28:32.259349 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:28:32 crc kubenswrapper[4795]: E0219 21:28:32.259469 4795 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 21:28:32 crc kubenswrapper[4795]: E0219 21:28:32.259437 4795 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 21:28:32 crc kubenswrapper[4795]: E0219 21:28:32.259490 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 21:28:32 crc kubenswrapper[4795]: E0219 21:28:32.259529 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 21:28:36.259516242 +0000 UTC m=+27.452034106 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 21:28:32 crc kubenswrapper[4795]: E0219 21:28:32.259538 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 21:28:32 crc kubenswrapper[4795]: E0219 21:28:32.259564 4795 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:28:32 crc kubenswrapper[4795]: E0219 21:28:32.259576 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 21:28:36.259546573 +0000 UTC m=+27.452064467 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 21:28:32 crc kubenswrapper[4795]: E0219 21:28:32.259494 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 21:28:32 crc kubenswrapper[4795]: E0219 21:28:32.259606 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 21:28:32 crc kubenswrapper[4795]: E0219 21:28:32.259619 4795 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:28:32 crc kubenswrapper[4795]: E0219 21:28:32.259654 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-19 21:28:36.259616705 +0000 UTC m=+27.452134609 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:28:32 crc kubenswrapper[4795]: E0219 21:28:32.259681 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-19 21:28:36.259670497 +0000 UTC m=+27.452188451 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:28:32 crc kubenswrapper[4795]: I0219 21:28:32.454575 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 18:06:54.616184067 +0000 UTC Feb 19 21:28:32 crc kubenswrapper[4795]: I0219 21:28:32.511429 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:28:32 crc kubenswrapper[4795]: I0219 21:28:32.511491 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:28:32 crc kubenswrapper[4795]: I0219 21:28:32.511572 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:28:32 crc kubenswrapper[4795]: E0219 21:28:32.512151 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:28:32 crc kubenswrapper[4795]: E0219 21:28:32.512327 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:28:32 crc kubenswrapper[4795]: E0219 21:28:32.512532 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:28:32 crc kubenswrapper[4795]: I0219 21:28:32.980527 4795 csr.go:261] certificate signing request csr-62htn is approved, waiting to be issued Feb 19 21:28:32 crc kubenswrapper[4795]: I0219 21:28:32.999227 4795 csr.go:257] certificate signing request csr-62htn is issued Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.377238 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-blzsk"] Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.377427 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-fxj5d"] Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.377572 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-blzsk" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.377646 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.379984 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.380239 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.380346 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.380458 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.380512 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.380975 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.380578 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.380989 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.392787 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-blzsk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9643227-37ca-4e4a-b9bc-371b18d67edc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmtf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-blzsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:33Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.424344 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"405c08bd-3feb-45df-993f-2cb5e737cb6a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9415ce5a9989f2b286fff8f33781b5f1c510b61ef5e7af7050ed7f1a322dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe281962bb5076796c2b27f81ffe63c9310bf49f5989714cbe2fc0fe5b6f32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://689f35bbe7bc51411e25a7a6e4bce91b45d819ed6697ed322c4d5b4b3a244451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2cdc3d1e391f142caa70f86039fc213117c5a49f6e39eb770867385d781eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cd220ea624daba9c66de834e5fb2b0dd067d489fa005bacca4280b010e2ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:33Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.438539 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:33Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.451878 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:33Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.454700 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 06:24:44.746485258 +0000 UTC Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.466384 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:33Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.484016 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:33Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.495531 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:33Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.512795 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:33Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.524721 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\
\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:33Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.539445 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7591bc58-96f5-486a-8653-0ad93938b019\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxj5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:33Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.561811 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"405c08bd-3feb-45df-993f-2cb5e737cb6a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9415ce5a9989f2b286fff8f33781b5f1c510b61ef5e7af7050ed7f1a322dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-p
od-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe281962bb5076796c2b27f81ffe63c9310bf49f5989714cbe2fc0fe5b6f32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://689f35bbe7bc51411e25a7a6e4bce91b45d819ed6697ed322c4d5b4b3a244451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2cdc3d1e391f142caa70f86039fc213117c5a49f6e39eb770867385d781eca\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cd220ea624daba9c66de834e5fb2b0dd067d489fa005bacca4280b010e2ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750
495433ab22261ce77b00beeefd3fc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:33Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.570125 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7591bc58-96f5-486a-8653-0ad93938b019-mcd-auth-proxy-config\") pod \"machine-config-daemon-fxj5d\" (UID: \"7591bc58-96f5-486a-8653-0ad93938b019\") " pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.570551 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e9643227-37ca-4e4a-b9bc-371b18d67edc-hosts-file\") pod \"node-resolver-blzsk\" (UID: \"e9643227-37ca-4e4a-b9bc-371b18d67edc\") " pod="openshift-dns/node-resolver-blzsk" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.570677 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7591bc58-96f5-486a-8653-0ad93938b019-proxy-tls\") pod \"machine-config-daemon-fxj5d\" (UID: \"7591bc58-96f5-486a-8653-0ad93938b019\") " pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.570754 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmtf8\" (UniqueName: \"kubernetes.io/projected/e9643227-37ca-4e4a-b9bc-371b18d67edc-kube-api-access-lmtf8\") pod \"node-resolver-blzsk\" (UID: \"e9643227-37ca-4e4a-b9bc-371b18d67edc\") " pod="openshift-dns/node-resolver-blzsk" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.570840 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/7591bc58-96f5-486a-8653-0ad93938b019-rootfs\") pod \"machine-config-daemon-fxj5d\" (UID: \"7591bc58-96f5-486a-8653-0ad93938b019\") " pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.570912 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwrxf\" (UniqueName: \"kubernetes.io/projected/7591bc58-96f5-486a-8653-0ad93938b019-kube-api-access-wwrxf\") pod \"machine-config-daemon-fxj5d\" (UID: \"7591bc58-96f5-486a-8653-0ad93938b019\") " pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.576388 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:33Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.587832 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:33Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.608797 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:33Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.641035 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-blzsk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9643227-37ca-4e4a-b9bc-371b18d67edc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmtf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-blzsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:33Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.672139 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e9643227-37ca-4e4a-b9bc-371b18d67edc-hosts-file\") pod \"node-resolver-blzsk\" (UID: \"e9643227-37ca-4e4a-b9bc-371b18d67edc\") " pod="openshift-dns/node-resolver-blzsk" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.672449 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/7591bc58-96f5-486a-8653-0ad93938b019-proxy-tls\") pod \"machine-config-daemon-fxj5d\" (UID: \"7591bc58-96f5-486a-8653-0ad93938b019\") " pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.672555 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmtf8\" (UniqueName: \"kubernetes.io/projected/e9643227-37ca-4e4a-b9bc-371b18d67edc-kube-api-access-lmtf8\") pod \"node-resolver-blzsk\" (UID: \"e9643227-37ca-4e4a-b9bc-371b18d67edc\") " pod="openshift-dns/node-resolver-blzsk" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.672655 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/7591bc58-96f5-486a-8653-0ad93938b019-rootfs\") pod \"machine-config-daemon-fxj5d\" (UID: \"7591bc58-96f5-486a-8653-0ad93938b019\") " pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.672749 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwrxf\" (UniqueName: \"kubernetes.io/projected/7591bc58-96f5-486a-8653-0ad93938b019-kube-api-access-wwrxf\") pod \"machine-config-daemon-fxj5d\" (UID: \"7591bc58-96f5-486a-8653-0ad93938b019\") " pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.672851 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7591bc58-96f5-486a-8653-0ad93938b019-mcd-auth-proxy-config\") pod \"machine-config-daemon-fxj5d\" (UID: \"7591bc58-96f5-486a-8653-0ad93938b019\") " pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.672297 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e9643227-37ca-4e4a-b9bc-371b18d67edc-hosts-file\") pod \"node-resolver-blzsk\" (UID: \"e9643227-37ca-4e4a-b9bc-371b18d67edc\") " pod="openshift-dns/node-resolver-blzsk" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.672761 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/7591bc58-96f5-486a-8653-0ad93938b019-rootfs\") pod \"machine-config-daemon-fxj5d\" (UID: \"7591bc58-96f5-486a-8653-0ad93938b019\") " pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.673597 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7591bc58-96f5-486a-8653-0ad93938b019-mcd-auth-proxy-config\") pod \"machine-config-daemon-fxj5d\" (UID: \"7591bc58-96f5-486a-8653-0ad93938b019\") " pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.676492 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7591bc58-96f5-486a-8653-0ad93938b019-proxy-tls\") pod \"machine-config-daemon-fxj5d\" (UID: \"7591bc58-96f5-486a-8653-0ad93938b019\") " pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.700151 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:33Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.705010 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmtf8\" (UniqueName: \"kubernetes.io/projected/e9643227-37ca-4e4a-b9bc-371b18d67edc-kube-api-access-lmtf8\") pod \"node-resolver-blzsk\" (UID: \"e9643227-37ca-4e4a-b9bc-371b18d67edc\") " pod="openshift-dns/node-resolver-blzsk" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.710122 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-blzsk" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.731094 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwrxf\" (UniqueName: \"kubernetes.io/projected/7591bc58-96f5-486a-8653-0ad93938b019-kube-api-access-wwrxf\") pod \"machine-config-daemon-fxj5d\" (UID: \"7591bc58-96f5-486a-8653-0ad93938b019\") " pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.774351 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:33Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.783980 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-l29c7"] Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.784613 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-5p6d9"] Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.784794 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-l29c7" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.784847 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-5p6d9" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.787319 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.787553 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.787785 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.787920 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.788134 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.790487 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.792787 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.792805 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T21:28:33Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.804636 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f5566b9
4b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:33Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.817335 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7591bc58-96f5-486a-8653-0ad93938b019\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxj5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:33Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.833214 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:33Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.845205 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:33Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.857530 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:33Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.873144 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:33Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.885658 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:33Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.901923 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:33Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.913414 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-blzsk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9643227-37ca-4e4a-b9bc-371b18d67edc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmtf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-blzsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:33Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.926458 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l29c7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:33Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.941883 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5p6d9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e967392b-9bd8-4111-b1b9-96d503a19668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qx6hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5p6d9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:33Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.970330 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"405c08bd-3feb-45df-993f-2cb5e737cb6a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9415ce5a9989f2b286fff8f33781b5f1c510b61ef5e7af7050ed7f1a322dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6
a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe281962bb5076796c2b27f81ffe63c9310bf49f5989714cbe2fc0fe5b6f32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://689f35bbe7bc51411e25a7a6e4bce91b45d819ed6697ed322c4d5b4b3a244451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f584
08f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2cdc3d1e391f142caa70f86039fc213117c5a49f6e39eb770867385d781eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cd220ea624daba9c66de834e5fb2b0dd067d489fa005bacca4280b010e2ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750
495433ab22261ce77b00beeefd3fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:33Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.974749 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-hostroot\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.974794 4795 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qx6hj\" (UniqueName: \"kubernetes.io/projected/e967392b-9bd8-4111-b1b9-96d503a19668-kube-api-access-qx6hj\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.974823 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3-cni-binary-copy\") pod \"multus-additional-cni-plugins-l29c7\" (UID: \"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\") " pod="openshift-multus/multus-additional-cni-plugins-l29c7" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.974847 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-multus-socket-dir-parent\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.974917 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-host-var-lib-kubelet\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.974973 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-host-run-multus-certs\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:33 crc kubenswrapper[4795]: 
I0219 21:28:33.975069 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3-os-release\") pod \"multus-additional-cni-plugins-l29c7\" (UID: \"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\") " pod="openshift-multus/multus-additional-cni-plugins-l29c7" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.975095 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3-system-cni-dir\") pod \"multus-additional-cni-plugins-l29c7\" (UID: \"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\") " pod="openshift-multus/multus-additional-cni-plugins-l29c7" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.975119 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-l29c7\" (UID: \"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\") " pod="openshift-multus/multus-additional-cni-plugins-l29c7" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.975141 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-system-cni-dir\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.975195 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e967392b-9bd8-4111-b1b9-96d503a19668-cni-binary-copy\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " 
pod="openshift-multus/multus-5p6d9" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.975224 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-host-run-k8s-cni-cncf-io\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.975246 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-multus-conf-dir\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.975271 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-host-run-netns\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.975302 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e967392b-9bd8-4111-b1b9-96d503a19668-multus-daemon-config\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.975352 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-l29c7\" (UID: \"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\") " 
pod="openshift-multus/multus-additional-cni-plugins-l29c7" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.975377 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-etc-kubernetes\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.975412 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3-cnibin\") pod \"multus-additional-cni-plugins-l29c7\" (UID: \"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\") " pod="openshift-multus/multus-additional-cni-plugins-l29c7" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.975445 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-multus-cni-dir\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.975468 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-cnibin\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.975496 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-os-release\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:33 crc 
kubenswrapper[4795]: I0219 21:28:33.975531 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4pzd\" (UniqueName: \"kubernetes.io/projected/c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3-kube-api-access-f4pzd\") pod \"multus-additional-cni-plugins-l29c7\" (UID: \"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\") " pod="openshift-multus/multus-additional-cni-plugins-l29c7" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.975563 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-host-var-lib-cni-multus\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.975595 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-host-var-lib-cni-bin\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:33 crc kubenswrapper[4795]: I0219 21:28:33.983432 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7591bc58-96f5-486a-8653-0ad93938b019\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxj5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:33Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.000099 4795 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-19 21:23:32 +0000 UTC, rotation deadline is 2026-12-20 18:37:00.677622872 +0000 UTC Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.000407 4795 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7293h8m26.677221874s for next certificate rotation Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.000266 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:33Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 
21:28:34.005346 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" Feb 19 21:28:34 crc kubenswrapper[4795]: W0219 21:28:34.014917 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7591bc58_96f5_486a_8653_0ad93938b019.slice/crio-d574fc53e044370889753eeb0c62a38953bf809ef1ccb2d1baf5e98adf4e9947 WatchSource:0}: Error finding container d574fc53e044370889753eeb0c62a38953bf809ef1ccb2d1baf5e98adf4e9947: Status 404 returned error can't find the container with id d574fc53e044370889753eeb0c62a38953bf809ef1ccb2d1baf5e98adf4e9947 Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.076914 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-cnibin\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.076978 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-os-release\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.077015 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-host-var-lib-cni-multus\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.077049 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4pzd\" (UniqueName: 
\"kubernetes.io/projected/c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3-kube-api-access-f4pzd\") pod \"multus-additional-cni-plugins-l29c7\" (UID: \"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\") " pod="openshift-multus/multus-additional-cni-plugins-l29c7" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.077102 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-host-var-lib-cni-bin\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.077146 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-hostroot\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.077192 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qx6hj\" (UniqueName: \"kubernetes.io/projected/e967392b-9bd8-4111-b1b9-96d503a19668-kube-api-access-qx6hj\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.077216 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3-cni-binary-copy\") pod \"multus-additional-cni-plugins-l29c7\" (UID: \"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\") " pod="openshift-multus/multus-additional-cni-plugins-l29c7" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.077238 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-multus-socket-dir-parent\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.077260 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-host-var-lib-kubelet\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.077283 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-host-run-multus-certs\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.077330 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3-os-release\") pod \"multus-additional-cni-plugins-l29c7\" (UID: \"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\") " pod="openshift-multus/multus-additional-cni-plugins-l29c7" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.077357 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3-system-cni-dir\") pod \"multus-additional-cni-plugins-l29c7\" (UID: \"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\") " pod="openshift-multus/multus-additional-cni-plugins-l29c7" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.077382 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-l29c7\" (UID: \"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\") " pod="openshift-multus/multus-additional-cni-plugins-l29c7" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.077405 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-system-cni-dir\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.077427 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e967392b-9bd8-4111-b1b9-96d503a19668-cni-binary-copy\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.077451 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-host-run-k8s-cni-cncf-io\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.077475 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-multus-conf-dir\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.077527 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-host-run-netns\") pod \"multus-5p6d9\" 
(UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.077552 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e967392b-9bd8-4111-b1b9-96d503a19668-multus-daemon-config\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.077584 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-l29c7\" (UID: \"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\") " pod="openshift-multus/multus-additional-cni-plugins-l29c7" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.077610 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-etc-kubernetes\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.077638 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3-cnibin\") pod \"multus-additional-cni-plugins-l29c7\" (UID: \"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\") " pod="openshift-multus/multus-additional-cni-plugins-l29c7" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.077662 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-multus-cni-dir\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 
19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.077957 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-hostroot\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.078072 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-multus-socket-dir-parent\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.078086 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-host-var-lib-cni-multus\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.078142 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3-system-cni-dir\") pod \"multus-additional-cni-plugins-l29c7\" (UID: \"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\") " pod="openshift-multus/multus-additional-cni-plugins-l29c7" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.078046 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-multus-conf-dir\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.078373 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-system-cni-dir\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.078416 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-cnibin\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.078448 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-os-release\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.078470 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-host-var-lib-kubelet\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.078489 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-host-var-lib-cni-bin\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.078506 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-host-run-multus-certs\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " 
pod="openshift-multus/multus-5p6d9" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.078549 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-etc-kubernetes\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.078600 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-host-run-netns\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.078728 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3-os-release\") pod \"multus-additional-cni-plugins-l29c7\" (UID: \"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\") " pod="openshift-multus/multus-additional-cni-plugins-l29c7" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.078811 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3-cni-binary-copy\") pod \"multus-additional-cni-plugins-l29c7\" (UID: \"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\") " pod="openshift-multus/multus-additional-cni-plugins-l29c7" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.078853 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3-cnibin\") pod \"multus-additional-cni-plugins-l29c7\" (UID: \"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\") " pod="openshift-multus/multus-additional-cni-plugins-l29c7" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.078934 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-host-run-k8s-cni-cncf-io\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.078956 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-l29c7\" (UID: \"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\") " pod="openshift-multus/multus-additional-cni-plugins-l29c7" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.079009 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-l29c7\" (UID: \"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\") " pod="openshift-multus/multus-additional-cni-plugins-l29c7" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.079097 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e967392b-9bd8-4111-b1b9-96d503a19668-cni-binary-copy\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.079307 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e967392b-9bd8-4111-b1b9-96d503a19668-multus-cni-dir\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.079469 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e967392b-9bd8-4111-b1b9-96d503a19668-multus-daemon-config\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.094104 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qx6hj\" (UniqueName: \"kubernetes.io/projected/e967392b-9bd8-4111-b1b9-96d503a19668-kube-api-access-qx6hj\") pod \"multus-5p6d9\" (UID: \"e967392b-9bd8-4111-b1b9-96d503a19668\") " pod="openshift-multus/multus-5p6d9" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.094315 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4pzd\" (UniqueName: \"kubernetes.io/projected/c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3-kube-api-access-f4pzd\") pod \"multus-additional-cni-plugins-l29c7\" (UID: \"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\") " pod="openshift-multus/multus-additional-cni-plugins-l29c7" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.095800 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-l29c7" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.105210 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-5p6d9" Feb 19 21:28:34 crc kubenswrapper[4795]: W0219 21:28:34.119303 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode967392b_9bd8_4111_b1b9_96d503a19668.slice/crio-4d4b0b683458da919192e168b950b3bebf6b1cfe448325e37e3777ebb0f2e9ff WatchSource:0}: Error finding container 4d4b0b683458da919192e168b950b3bebf6b1cfe448325e37e3777ebb0f2e9ff: Status 404 returned error can't find the container with id 4d4b0b683458da919192e168b950b3bebf6b1cfe448325e37e3777ebb0f2e9ff Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.144013 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4qphl"] Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.146118 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.147572 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.148235 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.148575 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.148672 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.148734 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.149070 4795 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.150011 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.165102 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf5bd36-b46b-4a06-8291-cae9f3988330\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4qphl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.175502 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.186682 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.199358 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.210726 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.220142 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.231276 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5p6d9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e967392b-9bd8-4111-b1b9-96d503a19668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qx6hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5p6d9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.251940 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"405c08bd-3feb-45df-993f-2cb5e737cb6a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9415ce5a9989f2b286fff8f33781b5f1c510b61ef5e7af7050ed7f1a322dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6
a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe281962bb5076796c2b27f81ffe63c9310bf49f5989714cbe2fc0fe5b6f32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://689f35bbe7bc51411e25a7a6e4bce91b45d819ed6697ed322c4d5b4b3a244451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f584
08f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2cdc3d1e391f142caa70f86039fc213117c5a49f6e39eb770867385d781eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cd220ea624daba9c66de834e5fb2b0dd067d489fa005bacca4280b010e2ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750
495433ab22261ce77b00beeefd3fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.266644 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.277936 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-blzsk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9643227-37ca-4e4a-b9bc-371b18d67edc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmtf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-blzsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.281472 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-kubelet\") pod \"ovnkube-node-4qphl\" (UID: 
\"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.281514 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-run-netns\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.281536 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/adf5bd36-b46b-4a06-8291-cae9f3988330-ovn-node-metrics-cert\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.281563 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/adf5bd36-b46b-4a06-8291-cae9f3988330-ovnkube-config\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.281611 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/adf5bd36-b46b-4a06-8291-cae9f3988330-ovnkube-script-lib\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.281635 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/adf5bd36-b46b-4a06-8291-cae9f3988330-env-overrides\") 
pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.281656 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nrjb\" (UniqueName: \"kubernetes.io/projected/adf5bd36-b46b-4a06-8291-cae9f3988330-kube-api-access-6nrjb\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.281694 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-log-socket\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.281728 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-etc-openvswitch\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.281749 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-run-ovn\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.281774 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-run-openvswitch\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.281797 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-run-systemd\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.281817 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.281849 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-slash\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.281878 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-cni-netd\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.281932 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-node-log\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.281959 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-systemd-units\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.281980 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-run-ovn-kubernetes\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.282001 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-var-lib-openvswitch\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.282032 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-cni-bin\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.294463 4795 
status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l29c7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.306063 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 
21:28:34.318239 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7591bc58-96f5-486a-8653-0ad93938b019\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxj5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.382645 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-etc-openvswitch\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.382734 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-run-ovn\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.382734 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-etc-openvswitch\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.382814 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-run-ovn\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.382754 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-run-openvswitch\") pod \"ovnkube-node-4qphl\" (UID: 
\"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.382952 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-run-systemd\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.382982 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.383006 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-run-openvswitch\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.383024 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-run-systemd\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.383020 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-slash\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.383069 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-slash\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.383081 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.383098 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-cni-netd\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.383123 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-cni-netd\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.383133 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-node-log\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc 
kubenswrapper[4795]: I0219 21:28:34.383154 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-systemd-units\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.383187 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-run-ovn-kubernetes\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.383205 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-var-lib-openvswitch\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.383220 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-cni-bin\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.383238 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-kubelet\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.383254 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-run-netns\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.383273 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/adf5bd36-b46b-4a06-8291-cae9f3988330-ovn-node-metrics-cert\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.383275 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-run-ovn-kubernetes\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.383239 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-node-log\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.383290 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/adf5bd36-b46b-4a06-8291-cae9f3988330-ovnkube-config\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.383306 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/adf5bd36-b46b-4a06-8291-cae9f3988330-ovnkube-script-lib\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.383310 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-cni-bin\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.383222 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-systemd-units\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.383322 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/adf5bd36-b46b-4a06-8291-cae9f3988330-env-overrides\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.383337 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nrjb\" (UniqueName: \"kubernetes.io/projected/adf5bd36-b46b-4a06-8291-cae9f3988330-kube-api-access-6nrjb\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.383363 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-log-socket\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.383400 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-log-socket\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.383258 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-var-lib-openvswitch\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.383339 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-kubelet\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.384106 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-run-netns\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.384272 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/adf5bd36-b46b-4a06-8291-cae9f3988330-ovnkube-script-lib\") pod \"ovnkube-node-4qphl\" (UID: 
\"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.384278 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/adf5bd36-b46b-4a06-8291-cae9f3988330-env-overrides\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.384462 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/adf5bd36-b46b-4a06-8291-cae9f3988330-ovnkube-config\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.387660 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/adf5bd36-b46b-4a06-8291-cae9f3988330-ovn-node-metrics-cert\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.397919 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nrjb\" (UniqueName: \"kubernetes.io/projected/adf5bd36-b46b-4a06-8291-cae9f3988330-kube-api-access-6nrjb\") pod \"ovnkube-node-4qphl\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.454990 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 00:33:46.495095331 +0000 UTC Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.511684 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.511795 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.511697 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:28:34 crc kubenswrapper[4795]: E0219 21:28:34.511874 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:28:34 crc kubenswrapper[4795]: E0219 21:28:34.512013 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:28:34 crc kubenswrapper[4795]: E0219 21:28:34.512132 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.537735 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:34 crc kubenswrapper[4795]: W0219 21:28:34.572591 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podadf5bd36_b46b_4a06_8291_cae9f3988330.slice/crio-740cf96667070bca195c02d1eaa75a11754b5d3aaf7c73fedd2e4e883f6a4193 WatchSource:0}: Error finding container 740cf96667070bca195c02d1eaa75a11754b5d3aaf7c73fedd2e4e883f6a4193: Status 404 returned error can't find the container with id 740cf96667070bca195c02d1eaa75a11754b5d3aaf7c73fedd2e4e883f6a4193 Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.623022 4795 generic.go:334] "Generic (PLEG): container finished" podID="c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3" containerID="40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9" exitCode=0 Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.623105 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" event={"ID":"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3","Type":"ContainerDied","Data":"40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9"} Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.623193 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" event={"ID":"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3","Type":"ContainerStarted","Data":"06cc03bf1c3b39c6d3842b732a42963fbbed69cfc78269fbb0494e80f0536205"} Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.624225 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-blzsk" 
event={"ID":"e9643227-37ca-4e4a-b9bc-371b18d67edc","Type":"ContainerStarted","Data":"f6b76680f2d3d45237abba3bfaeadf20719b560077c08f624ca2fbe4cc1af865"} Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.624265 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-blzsk" event={"ID":"e9643227-37ca-4e4a-b9bc-371b18d67edc","Type":"ContainerStarted","Data":"8fd79f423625c1967ec4fdeb8bb0c4df88f46a9fc1c2c400d9a475409b01da2f"} Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.625575 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" event={"ID":"adf5bd36-b46b-4a06-8291-cae9f3988330","Type":"ContainerStarted","Data":"740cf96667070bca195c02d1eaa75a11754b5d3aaf7c73fedd2e4e883f6a4193"} Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.627527 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerStarted","Data":"9964359f056111e41a0c304a0100fea26514cdc34f3e64f376a682913d6742f3"} Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.627562 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerStarted","Data":"6bbc1f90994253aba517d1de97087dbd617934343fdb8389588863bf1305b72e"} Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.627576 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerStarted","Data":"d574fc53e044370889753eeb0c62a38953bf809ef1ccb2d1baf5e98adf4e9947"} Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.628619 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5p6d9" 
event={"ID":"e967392b-9bd8-4111-b1b9-96d503a19668","Type":"ContainerStarted","Data":"ab48852eb20d566b09c7adda6c6afee64240486d91e3788538ed9b47fbcae59d"} Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.628660 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5p6d9" event={"ID":"e967392b-9bd8-4111-b1b9-96d503a19668","Type":"ContainerStarted","Data":"4d4b0b683458da919192e168b950b3bebf6b1cfe448325e37e3777ebb0f2e9ff"} Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.647294 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf5bd36-b46b-4a06-8291-cae9f3988330\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID
\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wai
ting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":
0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4qphl\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.662497 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.673492 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.688546 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.699806 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.710375 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.711566 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.713540 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.713577 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.713586 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.713719 4795 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.720321 4795 kubelet_node_status.go:115] "Node was previously registered" node="crc" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.720615 4795 kubelet_node_status.go:79] "Successfully registered node" node="crc" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.721591 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.721619 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.721627 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.721641 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.721652 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:34Z","lastTransitionTime":"2026-02-19T21:28:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.727122 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5p6d9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e967392b-9bd8-4111-b1b9-96d503a19668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qx6hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5p6d9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: E0219 21:28:34.741518 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0a3e1a0-f657-4990-877d-cc9b59bcb3d8\\\",\\\"systemUUID\\\":\\\"5c733625-a853-45dd-88a0-4f8c78e571ae\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.744550 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.744579 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.744589 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.744606 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.744617 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:34Z","lastTransitionTime":"2026-02-19T21:28:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.746590 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"405c08bd-3feb-45df-993f-2cb5e737cb6a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9415ce5a9989f2b286fff8f33781b5f1c510b61ef5e7af7050ed7f1a322dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe281962bb5076796c2b27f81ffe63c9310bf49f5989714cbe2fc0fe5b6f32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://689f35bbe7bc51411e25a7a6e4bce91b45d819ed6697ed322c4d5b4b3a244451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2cdc3d1e391f142caa70f86039fc213117c5a49f6e39eb770867385d781eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cd220ea624daba9c66de834e5fb2b0dd067d489fa005bacca4280b010e2ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: E0219 21:28:34.757046 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0a3e1a0-f657-4990-877d-cc9b59bcb3d8\\\",\\\"systemUUID\\\":\\\"5c733625-a853-45dd-88a0-4f8c78e571ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.757501 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.760331 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.760418 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.760435 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.760450 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.760542 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:34Z","lastTransitionTime":"2026-02-19T21:28:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.768241 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-blzsk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9643227-37ca-4e4a-b9bc-371b18d67edc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmtf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-blzsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: E0219 21:28:34.772997 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0a3e1a0-f657-4990-877d-cc9b59bcb3d8\\\",\\\"systemUUID\\\":\\\"5c733625-a853-45dd-88a0-4f8c78e571ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.776747 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.776783 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.776792 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.776805 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.776815 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:34Z","lastTransitionTime":"2026-02-19T21:28:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.787600 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l29c7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: E0219 21:28:34.789273 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0a3e1a0-f657-4990-877d-cc9b59bcb3d8\\\",\\\"systemUUID\\\":\\\"5c733625-a853-45dd-88a0-4f8c78e571ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.792471 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.792506 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.792519 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.792536 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.792550 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:34Z","lastTransitionTime":"2026-02-19T21:28:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.800286 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: E0219 21:28:34.803103 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0a3e1a0-f657-4990-877d-cc9b59bcb3d8\\\",\\\"systemUUID\\\":\\\"5c733625-a853-45dd-88a0-4f8c78e571ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: E0219 21:28:34.803530 4795 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.805041 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.805070 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.805080 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.805098 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.805111 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:34Z","lastTransitionTime":"2026-02-19T21:28:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.811828 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7591bc58-96f5-486a-8653-0ad93938b019\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxj5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.822758 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.833394 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.843824 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.854996 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5p6d9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e967392b-9bd8-4111-b1b9-96d503a19668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab48852eb20d566b09c7adda6c6afee64240486d91e3788538ed9b47fbcae59d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qx6hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5p6d9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.872538 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"405c08bd-3feb-45df-993f-2cb5e737cb6a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9415ce5a9989f2b286fff8f33781b5f1c510b61ef5e7af7050ed7f1a322dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe281962bb5076796c2b27f81ffe63c9310bf49f5989714cbe2fc0fe5b6f32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://689f35bbe7bc51411e25a7a6e4bce91b45d819ed6697ed322c4d5b4b3a244451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2cdc3d1e391f142caa70f86039fc213117c5a49f6e39eb770867385d781eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cd220ea624daba9c66de834e5fb2b0dd067d489fa005bacca4280b010e2ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:12Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.883049 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.892874 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-blzsk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9643227-37ca-4e4a-b9bc-371b18d67edc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6b76680f2d3d45237abba3bfaeadf20719b560077c08f624ca2fbe4cc1af865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmtf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-blzsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.905432 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l29c7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.906674 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.906720 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.906733 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.906752 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.906765 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:34Z","lastTransitionTime":"2026-02-19T21:28:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.917457 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.929139 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7591bc58-96f5-486a-8653-0ad93938b019\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9964359f056111e41a0c304a0100fea26514cdc34f3e64f376a682913d6742f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bbc1f90994253aba517d1de97087dbd61793434
3fdb8389588863bf1305b72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxj5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.944178 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf5bd36-b46b-4a06-8291-cae9f3988330\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4qphl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.956312 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:34 crc kubenswrapper[4795]: I0219 21:28:34.968822 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:34Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.008993 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.009030 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.009041 4795 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.009056 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.009068 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:35Z","lastTransitionTime":"2026-02-19T21:28:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.111366 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.111419 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.111428 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.111444 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.111453 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:35Z","lastTransitionTime":"2026-02-19T21:28:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.213759 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.213796 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.213807 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.213827 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.213839 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:35Z","lastTransitionTime":"2026-02-19T21:28:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.315849 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.316126 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.316136 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.316150 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.316158 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:35Z","lastTransitionTime":"2026-02-19T21:28:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.419110 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.419187 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.419197 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.419215 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.419225 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:35Z","lastTransitionTime":"2026-02-19T21:28:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.455558 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 14:14:55.983096506 +0000 UTC Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.522009 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.522053 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.522065 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.522083 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.522095 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:35Z","lastTransitionTime":"2026-02-19T21:28:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.624499 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.624554 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.624574 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.624606 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.624623 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:35Z","lastTransitionTime":"2026-02-19T21:28:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.633802 4795 generic.go:334] "Generic (PLEG): container finished" podID="c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3" containerID="30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e" exitCode=0 Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.633862 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" event={"ID":"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3","Type":"ContainerDied","Data":"30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e"} Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.636328 4795 generic.go:334] "Generic (PLEG): container finished" podID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerID="cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c" exitCode=0 Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.636494 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" event={"ID":"adf5bd36-b46b-4a06-8291-cae9f3988330","Type":"ContainerDied","Data":"cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c"} Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.670210 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"405c08bd-3feb-45df-993f-2cb5e737cb6a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9415ce5a9989f2b286fff8f33781b5f1c510b61ef5e7af7050ed7f1a322dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe281962bb5076796c2b27f81ffe63c9310bf49f5989714cbe2fc0fe5b6f32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://689f35bbe7bc51411e25a7a6e4bce91b45d819ed6697ed322c4d5b4b3a244451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2cdc3d1e391f142caa70f86039fc213117c5a49f6e39eb770867385d781eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cd220ea624daba9c66de834e5fb2b0dd067d489fa005bacca4280b010e2ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:35Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.685386 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:35Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.694315 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-blzsk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9643227-37ca-4e4a-b9bc-371b18d67edc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6b76680f2d3d45237abba3bfaeadf20719b560077c08f624ca2fbe4cc1af865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmtf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-blzsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:35Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.708489 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l29c7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:35Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.721394 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5p6d9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e967392b-9bd8-4111-b1b9-96d503a19668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab48852eb20d566b09c7adda6c6afee64240486d91e3788538ed9b47fbcae59d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qx6hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5p6d9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:35Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.726593 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:35 crc 
kubenswrapper[4795]: I0219 21:28:35.726633 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.726650 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.726666 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.726682 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:35Z","lastTransitionTime":"2026-02-19T21:28:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.736259 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:35Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.746195 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7591bc58-96f5-486a-8653-0ad93938b019\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9964359f056111e41a0c304a0100fea26514cdc34f3e64f376a682913d6742f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bbc1f90994253aba517d1de97087dbd61793434
3fdb8389588863bf1305b72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxj5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:35Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.757743 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:35Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.770755 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:35Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.790280 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf5bd36-b46b-4a06-8291-cae9f3988330\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4qphl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:35Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.802690 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:35Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.813756 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:35Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.823607 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T21:28:35Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.828861 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.828896 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.828904 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.828919 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.828928 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:35Z","lastTransitionTime":"2026-02-19T21:28:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.832763 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-blzsk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9643227-37ca-4e4a-b9bc-371b18d67edc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6b76680f2d3d45237abba3bfaeadf20719b560077c08f624ca2fbe4cc1af865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmtf8\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-blzsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:35Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.848646 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"re
ason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l29c7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:35Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.860638 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5p6d9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e967392b-9bd8-4111-b1b9-96d503a19668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab48852eb20d566b09c7adda6c6afee64240486d91e3788538ed9b47fbcae59d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qx6hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5p6d9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:35Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.879502 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"405c08bd-3feb-45df-993f-2cb5e737cb6a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9415ce5a9989f2b286fff8f33781b5f1c510b61ef5e7af7050ed7f1a322dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe281962bb5076796c2b27f81ffe63c9310bf49f5989714cbe2fc0fe5b6f32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://689f35bbe7bc51411e25a7a6e4bce91b45d819ed6697ed322c4d5b4b3a244451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2cdc3d1e391f142caa70f86039fc213117c5a49f6e39eb770867385d781eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cd220ea624daba9c66de834e5fb2b0dd067d489fa005bacca4280b010e2ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:35Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.891859 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:35Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.904205 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:35Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.913808 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7591bc58-96f5-486a-8653-0ad93938b019\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9964359f056111e41a0c304a0100fea26514cdc34f3e64f376a682913d6742f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bbc1f90994253aba517d1de97087dbd61793434
3fdb8389588863bf1305b72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxj5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:35Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.924733 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:35Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.929315 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.931079 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.931110 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.931122 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.931137 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.931154 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:35Z","lastTransitionTime":"2026-02-19T21:28:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.933462 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.935069 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.935817 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:35Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.953395 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf5bd36-b46b-4a06-8291-cae9f3988330\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4qphl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:35Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.963915 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:35Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.973326 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T21:28:35Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.983485 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:35Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.984752 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-jvnv5"] Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.985149 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-jvnv5" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.986233 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.986488 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.986692 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.986910 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.995554 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:35Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.997440 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/93ec0ce2-79c0-42a4-88e2-71065ec8ff9f-host\") pod \"node-ca-jvnv5\" (UID: \"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f\") " pod="openshift-image-registry/node-ca-jvnv5" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.997489 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89q2s\" (UniqueName: \"kubernetes.io/projected/93ec0ce2-79c0-42a4-88e2-71065ec8ff9f-kube-api-access-89q2s\") pod \"node-ca-jvnv5\" (UID: \"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f\") " pod="openshift-image-registry/node-ca-jvnv5" Feb 19 21:28:35 crc kubenswrapper[4795]: I0219 21:28:35.997518 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/93ec0ce2-79c0-42a4-88e2-71065ec8ff9f-serviceca\") pod \"node-ca-jvnv5\" (UID: \"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f\") " pod="openshift-image-registry/node-ca-jvnv5" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.032199 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf5bd36-b46b-4a06-8291-cae9f3988330\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4qphl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:36Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.033359 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.033388 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.033400 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.033416 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.033429 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:36Z","lastTransitionTime":"2026-02-19T21:28:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.067371 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:36Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.098861 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/93ec0ce2-79c0-42a4-88e2-71065ec8ff9f-host\") pod \"node-ca-jvnv5\" (UID: \"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f\") " pod="openshift-image-registry/node-ca-jvnv5" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.098931 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89q2s\" (UniqueName: \"kubernetes.io/projected/93ec0ce2-79c0-42a4-88e2-71065ec8ff9f-kube-api-access-89q2s\") pod \"node-ca-jvnv5\" (UID: \"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f\") " pod="openshift-image-registry/node-ca-jvnv5" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.098934 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/93ec0ce2-79c0-42a4-88e2-71065ec8ff9f-host\") pod \"node-ca-jvnv5\" (UID: \"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f\") " 
pod="openshift-image-registry/node-ca-jvnv5" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.098962 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/93ec0ce2-79c0-42a4-88e2-71065ec8ff9f-serviceca\") pod \"node-ca-jvnv5\" (UID: \"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f\") " pod="openshift-image-registry/node-ca-jvnv5" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.099956 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/93ec0ce2-79c0-42a4-88e2-71065ec8ff9f-serviceca\") pod \"node-ca-jvnv5\" (UID: \"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f\") " pod="openshift-image-registry/node-ca-jvnv5" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.107673 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f4
16f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:36Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.137081 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.137118 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.137128 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.137146 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.137147 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89q2s\" (UniqueName: 
\"kubernetes.io/projected/93ec0ce2-79c0-42a4-88e2-71065ec8ff9f-kube-api-access-89q2s\") pod \"node-ca-jvnv5\" (UID: \"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f\") " pod="openshift-image-registry/node-ca-jvnv5" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.137158 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:36Z","lastTransitionTime":"2026-02-19T21:28:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.171874 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:36Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.199736 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:28:36 crc kubenswrapper[4795]: E0219 21:28:36.199947 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:28:44.199927394 +0000 UTC m=+35.392445258 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.207847 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45f7fe05-a5f0-4296-80f5-9c8cd59afb95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ae61d10d79a82b8d2e11aee70d7bb13c429feaa4b7ce1a7746cadab1169d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-contr
oller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8704324a3c5bb3f3023c0b000b233d650dede0825930494b987121784c094bdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b2ef22256f35fadefb61f59a542aa0df8bc5648285e2624b924391b80766c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-p
od-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad6763b33a3d85ce3446b4056d2703b9f342137b6a4aa23a3ae9870df8d0c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:36Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.240394 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.240427 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.240436 4795 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.240451 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.240460 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:36Z","lastTransitionTime":"2026-02-19T21:28:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.250413 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c
04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:36Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.289754 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni 
whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termi
nated\\\":{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"n
ame\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l29c7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired 
or is not yet valid: current time 2026-02-19T21:28:36Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.297000 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-jvnv5" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.300672 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.300715 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.300757 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.300784 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:28:36 crc kubenswrapper[4795]: E0219 21:28:36.300868 4795 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 21:28:36 crc kubenswrapper[4795]: E0219 21:28:36.300882 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 21:28:36 crc kubenswrapper[4795]: E0219 21:28:36.300899 4795 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 21:28:36 crc kubenswrapper[4795]: E0219 21:28:36.300929 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 21:28:44.300913068 +0000 UTC m=+35.493430932 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 21:28:36 crc kubenswrapper[4795]: E0219 21:28:36.300905 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 21:28:36 crc kubenswrapper[4795]: E0219 21:28:36.300950 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 21:28:44.300941119 +0000 UTC m=+35.493458983 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 21:28:36 crc kubenswrapper[4795]: E0219 21:28:36.300953 4795 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:28:36 crc kubenswrapper[4795]: E0219 21:28:36.300959 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 21:28:36 crc kubenswrapper[4795]: E0219 21:28:36.301008 4795 projected.go:288] 
Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 21:28:36 crc kubenswrapper[4795]: E0219 21:28:36.301023 4795 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:28:36 crc kubenswrapper[4795]: E0219 21:28:36.300987 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-19 21:28:44.30097168 +0000 UTC m=+35.493489534 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:28:36 crc kubenswrapper[4795]: E0219 21:28:36.301093 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-19 21:28:44.301081393 +0000 UTC m=+35.493599257 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:28:36 crc kubenswrapper[4795]: W0219 21:28:36.308543 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93ec0ce2_79c0_42a4_88e2_71065ec8ff9f.slice/crio-d2a3fea10dcf077753d86894e22ca3d47666d60b886c420c35b0cd17318add40 WatchSource:0}: Error finding container d2a3fea10dcf077753d86894e22ca3d47666d60b886c420c35b0cd17318add40: Status 404 returned error can't find the container with id d2a3fea10dcf077753d86894e22ca3d47666d60b886c420c35b0cd17318add40 Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.329858 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5p6d9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e967392b-9bd8-4111-b1b9-96d503a19668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab48852eb20d566b09c7adda6c6afee64240486d91e3788538ed9b47fbcae59d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qx6hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5p6d9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:36Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.342719 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:36 crc 
kubenswrapper[4795]: I0219 21:28:36.342771 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.342785 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.342804 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.342816 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:36Z","lastTransitionTime":"2026-02-19T21:28:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.376016 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"405c08bd-3feb-45df-993f-2cb5e737cb6a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9415ce5a9989f2b286fff8f33781b5f1c510b61ef5e7af7050ed7f1a322dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe281962bb5076796c2b27f81ffe63c9310bf49f5989714cbe2fc0fe5b6f32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://689f35bbe7bc51411e25a7a6e4bce91b45d819ed6697ed322c4d5b4b3a244451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2cdc3d1e391f142caa70f86039fc213117c5a49f6e39eb770867385d781eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cd220ea624daba9c66de834e5fb2b0dd067d489fa005bacca4280b010e2ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:36Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.408086 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:36Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.445812 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.445846 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.445855 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 
21:28:36.445869 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.445877 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:36Z","lastTransitionTime":"2026-02-19T21:28:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.448409 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-blzsk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9643227-37ca-4e4a-b9bc-371b18d67edc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6b76680f2d3d45237abba3bfaeadf20719b560077c08f624ca2fbe4cc1af865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888c
f2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmtf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-blzsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:36Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.455818 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 07:59:09.9220675 +0000 UTC Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.488722 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:36Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.510950 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:28:36 crc kubenswrapper[4795]: E0219 21:28:36.511049 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.511107 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.511101 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:28:36 crc kubenswrapper[4795]: E0219 21:28:36.511157 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:28:36 crc kubenswrapper[4795]: E0219 21:28:36.511308 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.529186 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7591bc58-96f5-486a-8653-0ad93938b019\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9964359f056111e41a0c304a0100fea26514cdc34f3e64f376a682913d6742f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mo
untPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bbc1f90994253aba517d1de97087dbd617934343fdb8389588863bf1305b72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxj5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:36Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.547767 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:36 crc 
kubenswrapper[4795]: I0219 21:28:36.547799 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.547807 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.547821 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.547831 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:36Z","lastTransitionTime":"2026-02-19T21:28:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.572789 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"405c08bd-3feb-45df-993f-2cb5e737cb6a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9415ce5a9989f2b286fff8f33781b5f1c510b61ef5e7af7050ed7f1a322dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe281962bb5076796c2b27f81ffe63c9310bf49f5989714cbe2fc0fe5b6f32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://689f35bbe7bc51411e25a7a6e4bce91b45d819ed6697ed322c4d5b4b3a244451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2cdc3d1e391f142caa70f86039fc213117c5a49f6e39eb770867385d781eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cd220ea624daba9c66de834e5fb2b0dd067d489fa005bacca4280b010e2ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:36Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.608866 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:36Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.640990 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-jvnv5" event={"ID":"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f","Type":"ContainerStarted","Data":"558f4c8ca1cd10a6c978aa69d9df061716040e6eb4b31c34bb3b69008bae2c7c"} Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.641029 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-jvnv5" 
event={"ID":"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f","Type":"ContainerStarted","Data":"d2a3fea10dcf077753d86894e22ca3d47666d60b886c420c35b0cd17318add40"} Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.643958 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" event={"ID":"adf5bd36-b46b-4a06-8291-cae9f3988330","Type":"ContainerStarted","Data":"b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab"} Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.643991 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" event={"ID":"adf5bd36-b46b-4a06-8291-cae9f3988330","Type":"ContainerStarted","Data":"c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29"} Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.644005 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" event={"ID":"adf5bd36-b46b-4a06-8291-cae9f3988330","Type":"ContainerStarted","Data":"05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a"} Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.644017 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" event={"ID":"adf5bd36-b46b-4a06-8291-cae9f3988330","Type":"ContainerStarted","Data":"c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475"} Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.644027 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" event={"ID":"adf5bd36-b46b-4a06-8291-cae9f3988330","Type":"ContainerStarted","Data":"10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9"} Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.644036 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" 
event={"ID":"adf5bd36-b46b-4a06-8291-cae9f3988330","Type":"ContainerStarted","Data":"83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb"} Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.646118 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-blzsk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9643227-37ca-4e4a-b9bc-371b18d67edc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6b76680f2d3d45237abba3bfaeadf20719b560077c08f624ca2fbe4cc1af865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mo
untPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmtf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-blzsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:36Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.646309 4795 generic.go:334] "Generic (PLEG): container finished" podID="c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3" containerID="c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561" exitCode=0 Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.646339 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" event={"ID":"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3","Type":"ContainerDied","Data":"c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561"} Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.649239 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.649269 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.649282 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.649295 4795 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.649306 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:36Z","lastTransitionTime":"2026-02-19T21:28:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.687733 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"re
ason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l29c7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:36Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.731338 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5p6d9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e967392b-9bd8-4111-b1b9-96d503a19668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab48852eb20d566b09c7adda6c6afee64240486d91e3788538ed9b47fbcae59d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qx6hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5p6d9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:36Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.751129 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:36 crc 
kubenswrapper[4795]: I0219 21:28:36.751191 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.751206 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.751223 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.751234 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:36Z","lastTransitionTime":"2026-02-19T21:28:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.766724 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jvnv5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89q2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jvnv5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:36Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.811330 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:36Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.848341 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7591bc58-96f5-486a-8653-0ad93938b019\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9964359f056111e41a0c304a0100fea26514cdc34f3e64f376a682913d6742f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bbc1f90994253aba517d1de97087dbd61793434
3fdb8389588863bf1305b72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxj5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:36Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.852923 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.852946 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.852954 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:36 crc 
kubenswrapper[4795]: I0219 21:28:36.852969 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.852980 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:36Z","lastTransitionTime":"2026-02-19T21:28:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.890430 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:36Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.926906 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:36Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.955397 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.955432 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.955449 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.955464 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.955473 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:36Z","lastTransitionTime":"2026-02-19T21:28:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:36 crc kubenswrapper[4795]: I0219 21:28:36.972208 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf5bd36-b46b-4a06-8291-cae9f3988330\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4qphl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:36Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.009621 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:37Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.050135 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45f7fe05-a5f0-4296-80f5-9c8cd59afb95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ae61d10d79a82b8d2e11aee70d7bb13c429feaa4b7ce1a7746cadab1169d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8704324a3c5bb3f3023c0b000b233d650dede0825930494b987121784c094bdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b2ef22256f35fadefb61f59a542aa0df8bc5648285e2624b924391b80766c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad6763b33a3d85ce3446b4056d2703b9f342137b6a4aa23a3ae9870df8d0c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:37Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.058536 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.058596 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.058618 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.058646 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.058670 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:37Z","lastTransitionTime":"2026-02-19T21:28:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.092602 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:37Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.133005 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\
\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:37Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.161216 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.161269 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.161281 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.161299 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.161310 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:37Z","lastTransitionTime":"2026-02-19T21:28:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.168800 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:37Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.210136 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45f7fe05-a5f0-4296-80f5-9c8cd59afb95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ae61d10d79a82b8d2e11aee70d7bb13c429feaa4b7ce1a7746cadab1169d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8704324a3c5bb3f3023c0b000b233d650dede0825930494b987121784c094bdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b2ef22256f35fadefb61f59a542aa0df8bc5648285e2624b924391b80766c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad6763b33a3d85ce3446b4056d2703b9f342137b6a4aa23a3ae9870df8d0c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:37Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.249660 4795 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:37Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.263874 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.263922 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.263936 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.263955 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.263968 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:37Z","lastTransitionTime":"2026-02-19T21:28:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.290944 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:37Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.330365 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5p6d9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e967392b-9bd8-4111-b1b9-96d503a19668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab48852eb20d566b09c7adda6c6afee64240486d91e3788538ed9b47fbcae59d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qx6hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"
}}\" for pod \"openshift-multus\"/\"multus-5p6d9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:37Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.366095 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.366142 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.366155 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.366187 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.366197 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:37Z","lastTransitionTime":"2026-02-19T21:28:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.369956 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jvnv5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://558f4c8ca1cd10a6c978aa69d9df061716040e6eb4b31c34bb3b69008bae2c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89q2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jvnv5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:37Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.414728 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"405c08bd-3feb-45df-993f-2cb5e737cb6a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9415ce5a9989f2b286fff8f33781b5f1c510b61ef5e7af7050ed7f1a322dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe281962bb5076796c2b27f81ffe63c9310bf49f5989714cbe2fc0fe5b6f32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://689f35bbe7bc51411e25a7a6e4bce91b45d819ed6697ed322c4d5b4b3a244451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1
847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2cdc3d1e391f142caa70f86039fc213117c5a49f6e39eb770867385d781eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cd220ea624daba9c66de834e5fb2b0dd067d489fa005bacca4280b010e2ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0b7c4
a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:37Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.450136 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:37Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.456323 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 05:54:06.083236843 +0000 UTC Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.469075 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.469118 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.469129 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.469146 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.469157 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:37Z","lastTransitionTime":"2026-02-19T21:28:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.489798 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-blzsk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9643227-37ca-4e4a-b9bc-371b18d67edc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6b76680f2d3d45237abba3bfaeadf20719b560077c08f624ca2fbe4cc1af865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmtf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-blzsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:37Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.528490 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l29c7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:37Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.571979 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.572036 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.572060 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.572089 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.572110 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:37Z","lastTransitionTime":"2026-02-19T21:28:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.575106 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:37Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.611942 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7591bc58-96f5-486a-8653-0ad93938b019\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9964359f056111e41a0c304a0100fea26514cdc34f3e64f376a682913d6742f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bbc1f90994253aba517d1de97087dbd61793434
3fdb8389588863bf1305b72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxj5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:37Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.652498 4795 generic.go:334] "Generic (PLEG): container finished" podID="c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3" containerID="f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf" exitCode=0 Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.652551 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" 
event={"ID":"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3","Type":"ContainerDied","Data":"f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf"} Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.665879 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf5bd36-b46b-4a06-8291-cae9f3988330\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4qphl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:37Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.675215 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.675240 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.675250 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.675264 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.675274 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:37Z","lastTransitionTime":"2026-02-19T21:28:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.688657 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:37Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.732185 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:37Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.769855 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:37Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.777850 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.777875 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.777882 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.777897 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.777906 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:37Z","lastTransitionTime":"2026-02-19T21:28:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.809618 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:37Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.855742 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf5bd36-b46b-4a06-8291-cae9f3988330\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4qphl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:37Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.880131 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.880187 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.880198 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.880216 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.880227 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:37Z","lastTransitionTime":"2026-02-19T21:28:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.889464 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45f7fe05-a5f0-4296-80f5-9c8cd59afb95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ae61d10d79a82b8d2e11aee70d7bb13c429feaa4b7ce1a7746cadab1169d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8704324a3c5
bb3f3023c0b000b233d650dede0825930494b987121784c094bdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b2ef22256f35fadefb61f59a542aa0df8bc5648285e2624b924391b80766c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad6763b33a3d85ce3446b4056d2703b9f342137b6a4aa23a3ae9870df8d0c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:37Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.928971 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:37Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.967517 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:37Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.982636 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.982671 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.982684 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.982700 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:37 crc kubenswrapper[4795]: I0219 21:28:37.982712 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:37Z","lastTransitionTime":"2026-02-19T21:28:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.011435 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:38Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.050108 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:38Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.084967 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.084998 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.085008 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 
21:28:38.085024 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.085037 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:38Z","lastTransitionTime":"2026-02-19T21:28:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.127444 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-blzsk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9643227-37ca-4e4a-b9bc-371b18d67edc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6b76680f2d3d45237abba3bfaeadf20719b560077c08f624ca2fbe4cc1af865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888c
f2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmtf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-blzsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:38Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.144513 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-l29c7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:38Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.173244 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5p6d9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e967392b-9bd8-4111-b1b9-96d503a19668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab48852eb20d566b09c7adda6c6afee64240486d91e3788538ed9b47fbcae59d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"resta
rtCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qx6hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\
\\"}}\" for pod \"openshift-multus\"/\"multus-5p6d9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:38Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.187527 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.187568 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.187579 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.187597 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.187609 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:38Z","lastTransitionTime":"2026-02-19T21:28:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.210410 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jvnv5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://558f4c8ca1cd10a6c978aa69d9df061716040e6eb4b31c34bb3b69008bae2c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89q2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jvnv5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:38Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.262490 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"405c08bd-3feb-45df-993f-2cb5e737cb6a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9415ce5a9989f2b286fff8f33781b5f1c510b61ef5e7af7050ed7f1a322dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe281962bb5076796c2b27f81ffe63c9310bf49f5989714cbe2fc0fe5b6f32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://689f35bbe7bc51411e25a7a6e4bce91b45d819ed6697ed322c4d5b4b3a244451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1
847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2cdc3d1e391f142caa70f86039fc213117c5a49f6e39eb770867385d781eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cd220ea624daba9c66de834e5fb2b0dd067d489fa005bacca4280b010e2ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0b7c4
a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:38Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.288399 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7591bc58-96f5-486a-8653-0ad93938b019\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9964359f056111e41a0c304a0100fea26514cdc34f3e64f376a682913d6742f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bbc1f90994253aba517d1de97087dbd61793434
3fdb8389588863bf1305b72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxj5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:38Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.289520 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.289564 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.289579 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:38 crc 
kubenswrapper[4795]: I0219 21:28:38.289597 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.289609 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:38Z","lastTransitionTime":"2026-02-19T21:28:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.330952 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:38Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 
21:28:38.392041 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.392073 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.392084 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.392098 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.392107 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:38Z","lastTransitionTime":"2026-02-19T21:28:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.456603 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 06:53:23.93784557 +0000 UTC Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.494705 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.494755 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.494769 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.494788 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.494801 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:38Z","lastTransitionTime":"2026-02-19T21:28:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.510953 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.510983 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.510953 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:28:38 crc kubenswrapper[4795]: E0219 21:28:38.511096 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:28:38 crc kubenswrapper[4795]: E0219 21:28:38.511153 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:28:38 crc kubenswrapper[4795]: E0219 21:28:38.511268 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.597145 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.597198 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.597207 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.597220 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.597232 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:38Z","lastTransitionTime":"2026-02-19T21:28:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.658431 4795 generic.go:334] "Generic (PLEG): container finished" podID="c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3" containerID="649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094" exitCode=0 Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.658499 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" event={"ID":"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3","Type":"ContainerDied","Data":"649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094"} Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.662900 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" event={"ID":"adf5bd36-b46b-4a06-8291-cae9f3988330","Type":"ContainerStarted","Data":"d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00"} Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.674872 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7591bc58-96f5-486a-8653-0ad93938b019\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9964359f056111e41a0c304a0100fea26514cdc34f3e64f376a682913d6742f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bbc1f90994253aba517d1de97087dbd61793434
3fdb8389588863bf1305b72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxj5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:38Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.691694 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:38Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.699032 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.699102 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.699119 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.699145 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.699210 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:38Z","lastTransitionTime":"2026-02-19T21:28:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.702532 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:38Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.728658 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:38Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.749659 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf5bd36-b46b-4a06-8291-cae9f3988330\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4qphl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:38Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.764538 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45f7fe05-a5f0-4296-80f5-9c8cd59afb95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ae61d10d79a82b8d2e11aee70d7bb13c429feaa4b7ce1a7746cadab1169d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8704324a3c5bb3f3023c0b000b233d650dede0825930494b987121784c094bdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b2ef22256f35fadefb61f59a542aa0df8bc5648285e2624b924391b80766c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad6763b33a3d85ce3446b4056d2703b9f342137b6a4aa23a3ae9870df8d0c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:38Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.779189 4795 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:38Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.789360 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:38Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.801782 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.801822 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.801831 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.801845 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.801879 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:38Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.801856 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:38Z","lastTransitionTime":"2026-02-19T21:28:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.811990 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:38Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.821229 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-blzsk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9643227-37ca-4e4a-b9bc-371b18d67edc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6b76680f2d3d45237abba3bfaeadf20719b560077c08f624ca2fbe4cc1af865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmtf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-blzsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:38Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.833380 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l29c7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:38Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.848392 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5p6d9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e967392b-9bd8-4111-b1b9-96d503a19668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab4885
2eb20d566b09c7adda6c6afee64240486d91e3788538ed9b47fbcae59d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountP
ath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qx6hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5p6d9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:38Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.886467 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jvnv5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://558f
4c8ca1cd10a6c978aa69d9df061716040e6eb4b31c34bb3b69008bae2c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89q2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jvnv5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:38Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.904157 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.904214 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.904222 4795 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.904239 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.904248 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:38Z","lastTransitionTime":"2026-02-19T21:28:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:38 crc kubenswrapper[4795]: I0219 21:28:38.935923 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"405c08bd-3feb-45df-993f-2cb5e737cb6a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9415ce5a9989f2b286fff8f33781b5f1c510b61ef5e7af7050ed7f1a322dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b900922
72e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe281962bb5076796c2b27f81ffe63c9310bf49f5989714cbe2fc0fe5b6f32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://689f35bbe7bc51411e25a7a6e4bce91b45d819ed6697ed322c4d5b4b3a244451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\
\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2cdc3d1e391f142caa70f86039fc213117c5a49f6e39eb770867385d781eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cd220ea624daba9c66de834e5fb2b0dd067d489fa005bacca4280b010e2ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beee
fd3fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:38Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.006392 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.006431 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.006442 4795 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.006455 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.006463 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:39Z","lastTransitionTime":"2026-02-19T21:28:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.108357 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.108386 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.108396 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.108409 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.108419 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:39Z","lastTransitionTime":"2026-02-19T21:28:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.210916 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.210940 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.210948 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.210960 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.210968 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:39Z","lastTransitionTime":"2026-02-19T21:28:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.290399 4795 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.314015 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.314052 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.314061 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.314075 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.314083 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:39Z","lastTransitionTime":"2026-02-19T21:28:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.421828 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.421861 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.421874 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.421890 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.421901 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:39Z","lastTransitionTime":"2026-02-19T21:28:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.457569 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 15:40:52.051265018 +0000 UTC Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.524885 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.524926 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.524935 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.524950 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.524961 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:39Z","lastTransitionTime":"2026-02-19T21:28:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.528786 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:39Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.540100 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7591bc58-96f5-486a-8653-0ad93938b019\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9964359f056111e41a0c304a0100fea26514cdc34f3e64f376a682913d6742f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bbc1f90994253aba517d1de97087dbd61793434
3fdb8389588863bf1305b72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxj5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:39Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.553306 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:39Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.564727 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:39Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.590779 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf5bd36-b46b-4a06-8291-cae9f3988330\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4qphl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:39Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.605727 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:39Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.622243 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45f7fe05-a5f0-4296-80f5-9c8cd59afb95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ae61d10d79a82b8d2e11aee70d7bb13c429feaa4b7ce1a7746cadab1169d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8704324a3c5bb3f3023c0b000b233d650dede0825930494b987121784c094bdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b2ef22256f35fadefb61f59a542aa0df8bc5648285e2624b924391b80766c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad6763b33a3d85ce3446b4056d2703b9f342137b6a4aa23a3ae9870df8d0c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:39Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.627791 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.627825 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.627835 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.627852 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.627863 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:39Z","lastTransitionTime":"2026-02-19T21:28:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.634912 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:39Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.646357 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\
\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:39Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.668018 4795 generic.go:334] "Generic (PLEG): container finished" podID="c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3" containerID="01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72" exitCode=0 Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.668155 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" event={"ID":"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3","Type":"ContainerDied","Data":"01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72"} Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.680836 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"405c08bd-3feb-45df-993f-2cb5e737cb6a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9415ce5a9989f2b286fff8f33781b5f1c510b61ef5e7af7050ed7f1a322dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe281962bb5076796c2b27f81ffe63c9310bf49f5989714cbe2fc0fe5b6f32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://689f35bbe7bc51411e25a7a6e4bce91b45d819ed6697ed322c4d5b4b3a244451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2cdc3d1e391f142caa70f86039fc213117c5a49f6e39eb770867385d781eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cd220ea624daba9c66de834e5fb2b0dd067d489fa005bacca4280b010e2ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:39Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.698002 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:39Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.710318 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-blzsk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9643227-37ca-4e4a-b9bc-371b18d67edc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6b76680f2d3d45237abba3bfaeadf20719b560077c08f624ca2fbe4cc1af865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmtf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-blzsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:39Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.725269 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l29c7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:39Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.730678 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.730718 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.730728 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.730744 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.730756 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:39Z","lastTransitionTime":"2026-02-19T21:28:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.743365 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5p6d9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e967392b-9bd8-4111-b1b9-96d503a19668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab48852eb20d566b09c7adda6c6afee64240486d91e3788538ed9b47fbcae59d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qx6hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5p6d9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:39Z 
is after 2025-08-24T17:21:41Z" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.756143 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jvnv5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://558f4c8ca1cd10a6c978aa69d9df061716040e6eb4b31c34bb3b69008bae2c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89q2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jvnv5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:39Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.768117 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45f7fe05-a5f0-4296-80f5-9c8cd59afb95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ae61d10d79a82b8d2e11aee70d7bb13c429feaa4b7ce1a7746cadab1169d96\\\",\\\"image\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8704324a3c5bb3f3023c0b000b233d650dede0825930494b987121784c094bdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b2ef22256f35fadefb61f59a542aa0df8bc5648285e2624b924391b80766c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\
\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad6763b33a3d85ce3446b4056d2703b9f342137b6a4aa23a3ae9870df8d0c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:39Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.779807 4795 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:39Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.792509 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:39Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.804063 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:39Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.815227 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:39Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.824462 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-blzsk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9643227-37ca-4e4a-b9bc-371b18d67edc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6b76680f2d3d45237abba3bfaeadf20719b560077c08f624ca2fbe4cc1af865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmtf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-blzsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:39Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.833030 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.833067 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.833077 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.833092 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.833106 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:39Z","lastTransitionTime":"2026-02-19T21:28:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.839864 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l29c7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:39Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.853019 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5p6d9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e967392b-9bd8-4111-b1b9-96d503a19668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab48852eb20d566b09c7adda6c6afee64240486d91e3788538ed9b47fbcae59d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qx6hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5p6d9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:39Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.886420 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jvnv5" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://558f4c8ca1cd10a6c978aa69d9df061716040e6eb4b31c34bb3b69008bae2c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89q2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jvnv5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:39Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.935027 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.935079 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.935092 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.935106 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.935116 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:39Z","lastTransitionTime":"2026-02-19T21:28:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.939871 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"405c08bd-3feb-45df-993f-2cb5e737cb6a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9415ce5a9989f2b286fff8f33781b5f1c510b61ef5e7af7050ed7f1a322dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe281962bb5076796c2b27f81ffe63c9310bf49f5989714cbe2fc0fe5b6f32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://689f35bbe7bc51411e25a7a6e4bce91b45d819ed6697ed322c4d5b4b3a244451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2cdc3d1e391f142caa70f86039fc213117c5a49f6e39eb770867385d781eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cd220ea624daba9c66de834e5fb2b0dd067d489fa005bacca4280b010e2ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:39Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:39 crc kubenswrapper[4795]: I0219 21:28:39.965645 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7591bc58-96f5-486a-8653-0ad93938b019\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9964359f056111e41a0c304a0100fea26514cdc34f3e64f376a682913d6742f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bbc1f90994253aba517d1de97087dbd61793434
3fdb8389588863bf1305b72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxj5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:39Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.007827 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:40Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.038521 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.038568 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.038581 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.038599 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.038612 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:40Z","lastTransitionTime":"2026-02-19T21:28:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.046391 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:40Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.088117 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:40Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.133991 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf5bd36-b46b-4a06-8291-cae9f3988330\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4qphl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:40Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.140676 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.140728 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.140743 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.140764 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.140779 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:40Z","lastTransitionTime":"2026-02-19T21:28:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.242986 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.243020 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.243028 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.243042 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.243052 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:40Z","lastTransitionTime":"2026-02-19T21:28:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.346628 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.346672 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.346682 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.346702 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.346713 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:40Z","lastTransitionTime":"2026-02-19T21:28:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.449304 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.449333 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.449343 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.449355 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.449364 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:40Z","lastTransitionTime":"2026-02-19T21:28:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.458655 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 23:37:41.334687827 +0000 UTC Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.510652 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.510686 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.510714 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:28:40 crc kubenswrapper[4795]: E0219 21:28:40.510892 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:28:40 crc kubenswrapper[4795]: E0219 21:28:40.511018 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:28:40 crc kubenswrapper[4795]: E0219 21:28:40.511274 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.552656 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.552730 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.552754 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.552786 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.552809 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:40Z","lastTransitionTime":"2026-02-19T21:28:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.655484 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.656020 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.656037 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.656056 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.656068 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:40Z","lastTransitionTime":"2026-02-19T21:28:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.677372 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" event={"ID":"adf5bd36-b46b-4a06-8291-cae9f3988330","Type":"ContainerStarted","Data":"2a0af6987edcb00218a308872d80fd4728d48000c0454e675a8f68bf30dc2359"} Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.682460 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" event={"ID":"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3","Type":"ContainerStarted","Data":"47780602108de299051f362241c1b15d1d233ab44173752ba23240ff2aad0af7"} Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.717617 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"na
me\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:40Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:40 crc 
kubenswrapper[4795]: I0219 21:28:40.739869 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7591bc58-96f5-486a-8653-0ad93938b019\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9964359f056111e41a0c304a0100fea26514cdc34f3e64f376a682913d6742f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bbc1f90994253aba517d1de97087dbd617934343fdb8389588863bf1305b72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxj5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:40Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.758128 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.758177 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" 
Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.758188 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.758202 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.758211 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:40Z","lastTransitionTime":"2026-02-19T21:28:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.761809 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:40Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.774545 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:40Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.793455 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf5bd36-b46b-4a06-8291-cae9f3988330\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4qphl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:40Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.805367 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:40Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.814150 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T21:28:40Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.826273 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:40Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.836658 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45f7fe05-a5f0-4296-80f5-9c8cd59afb95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ae61d10d79a82b8d2e11aee70d7bb13c429feaa4b7ce1a7746cadab1169d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8704324a3c5bb3f3023c0b000b233d650dede0825930494b987121784c094bdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b2ef22256f35fadefb61f59a542aa0df8bc5648285e2624b924391b80766c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad6763b33a3d85ce3446b4056d2703b9f342137b6a4aa23a3ae9870df8d0c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:40Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.846219 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-blzsk" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9643227-37ca-4e4a-b9bc-371b18d67edc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6b76680f2d3d45237abba3bfaeadf20719b560077c08f624ca2fbe4cc1af865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmtf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\
"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-blzsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:40Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.859894 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.859922 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.859933 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.859945 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.859954 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:40Z","lastTransitionTime":"2026-02-19T21:28:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.864992 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47780602108de299051f362241c1b15d1d233ab44173752ba23240ff2aad0af7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l29c7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:40Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.876651 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5p6d9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e967392b-9bd8-4111-b1b9-96d503a19668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab48852eb20d566b09c7adda6c6afee64240486d91e3788538ed9b47fbcae59d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7
eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qx6hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\"
:\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5p6d9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:40Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.885410 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jvnv5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://558f4c8ca1cd10a6c978aa69d9df061716040e6eb4b31c34bb3b69008bae2c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89q2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jvnv5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:40Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.903599 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"405c08bd-3feb-45df-993f-2cb5e737cb6a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9415ce5a9989f2b286fff8f33781b5f1c510b61ef5e7af7050ed7f1a322dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe281962bb5076796c2b27f81ffe63c9310bf49f5989714cbe2fc0fe5b6f32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://689f35bbe7bc51411e25a7a6e4bce91b45d819ed6697ed322c4d5b4b3a244451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2cdc3d1e391f142caa70f86039fc213117c5a49f6e39eb770867385d781eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cd220ea624daba9c66de834e5fb2b0dd067d489fa005bacca4280b010e2ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:40Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.915215 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:40Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.919076 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.931871 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:40Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.952473 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf5bd36-b46b-4a06-8291-cae9f3988330\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4qphl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:40Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.962673 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.962716 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.962725 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.962739 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.962748 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:40Z","lastTransitionTime":"2026-02-19T21:28:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.965685 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:40Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.980035 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T21:28:40Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:40 crc kubenswrapper[4795]: I0219 21:28:40.994229 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\"
:\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiser
ver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 
21:28:28.530865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:40Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.008099 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45f7fe05-a5f0-4296-80f5-9c8cd59afb95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ae61d10d79a82b8d2e11aee70d7bb13c429feaa4b7ce1a7746cadab1169d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8704324a3c5bb3f3023c0b000b233d650dede0825930494b987121784c094bdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b2ef22256f35fadefb61f59a542aa0df8bc5648285e2624b924391b80766c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad6763b33a3d85ce3446b4056d2703b9f342137b6a4aa23a3ae9870df8d0c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:41Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.024927 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:41Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.051293 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47780602108de299051f362241c1b15d1d233ab44173752ba23240ff2aad0af7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-rele
ase\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l29c7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:41Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.064694 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.064759 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.064777 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.064800 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.064819 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:41Z","lastTransitionTime":"2026-02-19T21:28:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.092698 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5p6d9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e967392b-9bd8-4111-b1b9-96d503a19668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab48852eb20d566b09c7adda6c6afee64240486d91e3788538ed9b47fbcae59d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qx6hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5p6d9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:41Z 
is after 2025-08-24T17:21:41Z" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.132473 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jvnv5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://558f4c8ca1cd10a6c978aa69d9df061716040e6eb4b31c34bb3b69008bae2c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89q2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jvnv5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:41Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.167383 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.167430 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.167444 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.167462 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.167475 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:41Z","lastTransitionTime":"2026-02-19T21:28:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.177206 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"405c08bd-3feb-45df-993f-2cb5e737cb6a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9415ce5a9989f2b286fff8f33781b5f1c510b61ef5e7af7050ed7f1a322dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe281962bb5076796c2b27f81ffe63c9310bf49f5989714cbe2fc0fe5b6f32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://689f35bbe7bc51411e25a7a6e4bce91b45d819ed6697ed322c4d5b4b3a244451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2cdc3d1e391f142caa70f86039fc213117c5a49f6e39eb770867385d781eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cd220ea624daba9c66de834e5fb2b0dd067d489fa005bacca4280b010e2ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:41Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.207805 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:41Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.249476 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-blzsk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9643227-37ca-4e4a-b9bc-371b18d67edc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6b76680f2d3d45237abba3bfaeadf20719b560077c08f624ca2fbe4cc1af865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmtf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-blzsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:41Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.270446 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.270486 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.270497 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.270513 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.270524 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:41Z","lastTransitionTime":"2026-02-19T21:28:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.292448 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:41Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.328012 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7591bc58-96f5-486a-8653-0ad93938b019\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9964359f056111e41a0c304a0100fea26514cdc34f3e64f376a682913d6742f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bbc1f90994253aba517d1de97087dbd61793434
3fdb8389588863bf1305b72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxj5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:41Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.372501 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.372533 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.372541 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:41 crc 
kubenswrapper[4795]: I0219 21:28:41.372556 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.372567 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:41Z","lastTransitionTime":"2026-02-19T21:28:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.458874 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 18:58:14.291355369 +0000 UTC Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.476498 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.476563 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.476588 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.476617 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.476639 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:41Z","lastTransitionTime":"2026-02-19T21:28:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.579281 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.579348 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.579367 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.579392 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.579410 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:41Z","lastTransitionTime":"2026-02-19T21:28:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.682154 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.682241 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.682259 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.682283 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.682305 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:41Z","lastTransitionTime":"2026-02-19T21:28:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.685375 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.685451 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.702449 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7591bc58-96f5-486a-8653-0ad93938b019\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9964359f056111e41a0c304a0100fea26514cdc34f3e64f376a682913d6742f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"na
me\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bbc1f90994253aba517d1de97087dbd617934343fdb8389588863bf1305b72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxj5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-02-19T21:28:41Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.713999 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.715057 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.727716 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"e
nv-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:41Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.745532 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:41Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.767744 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:41Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.785815 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.785921 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.785952 4795 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.785981 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.786007 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:41Z","lastTransitionTime":"2026-02-19T21:28:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.800314 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf5bd36-b46b-4a06-8291-cae9f3988330\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"stat
e\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d
1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a0af6987edcb00218a308872d80fd4728d48000c0454e675a8f68bf30dc2359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"
},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cd
fe00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mount
Path\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4qphl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:41Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.819743 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45f7fe05-a5f0-4296-80f5-9c8cd59afb95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ae61d10d79a82b8d2e11aee70d7bb13c429feaa4b7ce1a7746cadab1169d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8704324a3c5bb3f3023c0b000b233d650dede0825930494b987121784c094bdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b2ef22256f35fadefb61f59a542aa0df8bc5648285e2624b924391b80766c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad6763b33a3d85ce3446b4056d2703b9f342137b6a4aa23a3ae9870df8d0c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:41Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.839348 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:41Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.856033 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:41Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.879961 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\
":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"co
ntainerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initConta
inerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:41Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.889042 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.889081 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.889090 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:41 crc 
kubenswrapper[4795]: I0219 21:28:41.889141 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.889153 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:41Z","lastTransitionTime":"2026-02-19T21:28:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.897903 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:41Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.913663 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-blzsk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9643227-37ca-4e4a-b9bc-371b18d67edc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6b76680f2d3d45237abba3bfaeadf20719b560077c08f624ca2fbe4cc1af865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmtf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-blzsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:41Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.930381 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47780602108de299051f362241c1b15d1d233ab44173752ba23240ff2aad0af7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade50328702
63b86e7109a5844eae140cb7acdac094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l29c7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:41Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.942431 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5p6d9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e967392b-9bd8-4111-b1b9-96d503a19668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab48852eb20d566b09c7adda6c6afee64240486d91e3788538ed9b47fbcae59d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qx6hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5p6d9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:41Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.952302 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jvnv5" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://558f4c8ca1cd10a6c978aa69d9df061716040e6eb4b31c34bb3b69008bae2c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89q2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jvnv5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:41Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.972585 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"405c08bd-3feb-45df-993f-2cb5e737cb6a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9415ce5a9989f2b286fff8f33781b5f1c510b61ef5e7af7050ed7f1a322dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\
"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe281962bb5076796c2b27f81ffe63c9310bf49f5989714cbe2fc0fe5b6f32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://689f35bbe7bc51411e25a7a6e4bce91b45d819ed6697ed322c4d5b4b3a244451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-0
2-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2cdc3d1e391f142caa70f86039fc213117c5a49f6e39eb770867385d781eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cd220ea624daba9c66de834e5fb2b0dd067d489fa005bacca4280b010e2ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\"
:\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:41Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.986009 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5p6d9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e967392b-9bd8-4111-b1b9-96d503a19668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab48852eb20d566b09c7adda6c6afee64240486d91e3788538ed9b47fbcae59d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qx6hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5p6d9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:41Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.991772 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:41 crc 
kubenswrapper[4795]: I0219 21:28:41.991815 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.991827 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.991844 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:41 crc kubenswrapper[4795]: I0219 21:28:41.991857 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:41Z","lastTransitionTime":"2026-02-19T21:28:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.005833 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jvnv5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://558f4c8ca1cd10a6c978aa69d9df061716040e6eb4b31c34bb3b69008bae2c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89q2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jvnv5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:42Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.058369 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"405c08bd-3feb-45df-993f-2cb5e737cb6a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9415ce5a9989f2b286fff8f33781b5f1c510b61ef5e7af7050ed7f1a322dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe281962bb5076796c2b27f81ffe63c9310bf49f5989714cbe2fc0fe5b6f32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://689f35bbe7bc51411e25a7a6e4bce91b45d819ed6697ed322c4d5b4b3a244451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21
:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2cdc3d1e391f142caa70f86039fc213117c5a49f6e39eb770867385d781eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cd220ea624daba9c66de834e5fb2b0dd067d489fa005bacca4280b010e2ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:42Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.089685 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:42Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.094270 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.094317 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.094333 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.094355 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.094372 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:42Z","lastTransitionTime":"2026-02-19T21:28:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.129723 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-blzsk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9643227-37ca-4e4a-b9bc-371b18d67edc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6b76680f2d3d45237abba3bfaeadf20719b560077c08f624ca2fbe4cc1af865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmtf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-blzsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:42Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.171380 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47
780602108de299051f362241c1b15d1d233ab44173752ba23240ff2aad0af7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203
bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-
cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l29c7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:42Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.197148 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.197233 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.197253 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.197274 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:42 crc kubenswrapper[4795]: 
I0219 21:28:42.197289 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:42Z","lastTransitionTime":"2026-02-19T21:28:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.211414 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:42Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.247939 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7591bc58-96f5-486a-8653-0ad93938b019\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9964359f056111e41a0c304a0100fea26514cdc34f3e64f376a682913d6742f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bbc1f90994253aba517d1de97087dbd61793434
3fdb8389588863bf1305b72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxj5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:42Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.295052 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf5bd36-b46b-4a06-8291-cae9f3988330\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a0af6987edcb00218a308872d80fd4728d48000c0454e675a8f68bf30dc2359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4qphl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:42Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.300069 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.300121 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.300133 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.300150 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.300176 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:42Z","lastTransitionTime":"2026-02-19T21:28:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.331898 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:42Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.369325 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:42Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.402823 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.402863 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.402874 4795 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.402892 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.402907 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:42Z","lastTransitionTime":"2026-02-19T21:28:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.410904 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 
21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:42Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.447791 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45f7fe05-a5f0-4296-80f5-9c8cd59afb95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ae61d10d79a82b8d2e11aee70d7bb13c429feaa4b7ce1a7746cadab1169d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad
6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8704324a3c5bb3f3023c0b000b233d650dede0825930494b987121784c094bdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b2ef22256f35fadefb61f59a542aa0df8bc5648285e2624b924391b80766c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,
\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad6763b33a3d85ce3446b4056d2703b9f342137b6a4aa23a3ae9870df8d0c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:42Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.459113 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 
05:53:03 +0000 UTC, rotation deadline is 2026-01-09 06:55:26.423708245 +0000 UTC Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.497609 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:42Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.505087 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.505123 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.505132 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.505147 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.505158 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:42Z","lastTransitionTime":"2026-02-19T21:28:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.511460 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.511476 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.511594 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:28:42 crc kubenswrapper[4795]: E0219 21:28:42.511707 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:28:42 crc kubenswrapper[4795]: E0219 21:28:42.511857 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:28:42 crc kubenswrapper[4795]: E0219 21:28:42.511987 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.530940 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:42Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.607855 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.607892 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.607903 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.607919 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.607931 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:42Z","lastTransitionTime":"2026-02-19T21:28:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.687422 4795 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.711034 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.711073 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.711085 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.711110 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.711121 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:42Z","lastTransitionTime":"2026-02-19T21:28:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.813365 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.813416 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.813431 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.813455 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.813472 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:42Z","lastTransitionTime":"2026-02-19T21:28:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.916060 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.916116 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.916125 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.916139 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:42 crc kubenswrapper[4795]: I0219 21:28:42.916150 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:42Z","lastTransitionTime":"2026-02-19T21:28:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.018627 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.018665 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.018675 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.018691 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.018730 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:43Z","lastTransitionTime":"2026-02-19T21:28:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.121066 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.121099 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.121108 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.121121 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.121130 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:43Z","lastTransitionTime":"2026-02-19T21:28:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.223031 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.223077 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.223085 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.223100 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.223110 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:43Z","lastTransitionTime":"2026-02-19T21:28:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.326393 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.326444 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.326456 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.326473 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.326488 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:43Z","lastTransitionTime":"2026-02-19T21:28:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.428796 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.429000 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.429060 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.429132 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.429230 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:43Z","lastTransitionTime":"2026-02-19T21:28:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.460187 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 13:50:48.413691279 +0000 UTC Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.531826 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.532014 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.532073 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.532146 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.532233 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:43Z","lastTransitionTime":"2026-02-19T21:28:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.634313 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.634361 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.634377 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.634398 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.634413 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:43Z","lastTransitionTime":"2026-02-19T21:28:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.691749 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4qphl_adf5bd36-b46b-4a06-8291-cae9f3988330/ovnkube-controller/0.log" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.694369 4795 generic.go:334] "Generic (PLEG): container finished" podID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerID="2a0af6987edcb00218a308872d80fd4728d48000c0454e675a8f68bf30dc2359" exitCode=1 Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.694412 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" event={"ID":"adf5bd36-b46b-4a06-8291-cae9f3988330","Type":"ContainerDied","Data":"2a0af6987edcb00218a308872d80fd4728d48000c0454e675a8f68bf30dc2359"} Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.695123 4795 scope.go:117] "RemoveContainer" containerID="2a0af6987edcb00218a308872d80fd4728d48000c0454e675a8f68bf30dc2359" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.716620 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:43Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.728104 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7591bc58-96f5-486a-8653-0ad93938b019\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9964359f056111e41a0c304a0100fea26514cdc34f3e64f376a682913d6742f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bbc1f90994253aba517d1de97087dbd61793434
3fdb8389588863bf1305b72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxj5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:43Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.737114 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.737270 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.737362 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:43 crc 
kubenswrapper[4795]: I0219 21:28:43.737428 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.737508 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:43Z","lastTransitionTime":"2026-02-19T21:28:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.740959 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:43Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.755938 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:43Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.774142 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf5bd36-b46b-4a06-8291-cae9f3988330\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a0af6987edcb00218a308872d80fd4728d48000c0454e675a8f68bf30dc2359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a0af6987edcb00218a308872d80fd4728d48000c0454e675a8f68bf30dc2359\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\" 6105 handler.go:208] Removed *v1.Node event handler 2\\\\nI0219 21:28:43.391192 6105 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0219 21:28:43.391243 6105 handler.go:190] Sending *v1.Namespace 
event handler 1 for removal\\\\nI0219 21:28:43.391249 6105 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0219 21:28:43.391260 6105 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0219 21:28:43.391273 6105 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0219 21:28:43.391288 6105 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0219 21:28:43.391303 6105 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0219 21:28:43.391308 6105 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0219 21:28:43.391308 6105 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0219 21:28:43.391320 6105 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0219 21:28:43.391330 6105 factory.go:656] Stopping watch factory\\\\nI0219 21:28:43.391334 6105 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0219 21:28:43.391323 6105 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0219 21:28:43.391350 6105 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0219 21:28:43.391356 6105 ovnkube.go:599] Stopped 
ovnkube\\\\nI0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e23
54cccab8b7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4qphl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:43Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.787948 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e
901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:43Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.799376 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45f7fe05-a5f0-4296-80f5-9c8cd59afb95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ae61d10d79a82b8d2e11aee70d7bb13c429feaa4b7ce1a7746cadab1169d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8704324a3c5bb3f3023c0b000b233d650dede0825930494b987121784c094bdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b2ef22256f35fadefb61f59a542aa0df8bc5648285e2624b924391b80766c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad6763b33a3d85ce3446b4056d2703b9f342137b6a4aa23a3ae9870df8d0c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:43Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.812717 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:43Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.824398 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:43Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.840306 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.840344 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.840356 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.840373 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.840385 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:43Z","lastTransitionTime":"2026-02-19T21:28:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.851562 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"405c08bd-3feb-45df-993f-2cb5e737cb6a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9415ce5a9989f2b286fff8f33781b5f1c510b61ef5e7af7050ed7f1a322dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe281962bb5076796c2b27f81ffe63c9310bf49f5989714cbe2fc0fe5b6f32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://689f35bbe7bc51411e25a7a6e4bce91b45d819ed6697ed322c4d5b4b3a244451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2cdc3d1e391f142caa70f86039fc213117c5a49f6e39eb770867385d781eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cd220ea624daba9c66de834e5fb2b0dd067d489fa005bacca4280b010e2ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:43Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.865002 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:43Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.876485 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-blzsk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9643227-37ca-4e4a-b9bc-371b18d67edc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6b76680f2d3d45237abba3bfaeadf20719b560077c08f624ca2fbe4cc1af865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmtf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-blzsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:43Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.894618 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47780602108de299051f362241c1b15d1d233ab44173752ba23240ff2aad0af7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade50328702
63b86e7109a5844eae140cb7acdac094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l29c7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:43Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.909025 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5p6d9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e967392b-9bd8-4111-b1b9-96d503a19668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab48852eb20d566b09c7adda6c6afee64240486d91e3788538ed9b47fbcae59d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qx6hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5p6d9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:43Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.921117 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jvnv5" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://558f4c8ca1cd10a6c978aa69d9df061716040e6eb4b31c34bb3b69008bae2c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89q2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jvnv5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:43Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.943141 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.943219 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.943237 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.943263 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:43 crc kubenswrapper[4795]: I0219 21:28:43.943280 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:43Z","lastTransitionTime":"2026-02-19T21:28:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.045087 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.045118 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.045126 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.045139 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.045148 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:44Z","lastTransitionTime":"2026-02-19T21:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.147649 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.147685 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.147696 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.147712 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.147720 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:44Z","lastTransitionTime":"2026-02-19T21:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.249527 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.249558 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.249567 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.249580 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.249588 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:44Z","lastTransitionTime":"2026-02-19T21:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.281419 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:28:44 crc kubenswrapper[4795]: E0219 21:28:44.281614 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-19 21:29:00.281597885 +0000 UTC m=+51.474115749 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.351241 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.351281 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.351291 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.351303 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.351312 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:44Z","lastTransitionTime":"2026-02-19T21:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.382276 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.382328 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.382359 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.382381 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:28:44 crc kubenswrapper[4795]: E0219 21:28:44.382417 4795 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Feb 19 21:28:44 crc kubenswrapper[4795]: E0219 21:28:44.382441 4795 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 21:28:44 crc kubenswrapper[4795]: E0219 21:28:44.382467 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 21:28:44 crc kubenswrapper[4795]: E0219 21:28:44.382476 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 21:29:00.382459995 +0000 UTC m=+51.574977859 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 21:28:44 crc kubenswrapper[4795]: E0219 21:28:44.382485 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 21:28:44 crc kubenswrapper[4795]: E0219 21:28:44.382491 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 21:29:00.382485476 +0000 UTC m=+51.575003340 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 21:28:44 crc kubenswrapper[4795]: E0219 21:28:44.382495 4795 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:28:44 crc kubenswrapper[4795]: E0219 21:28:44.382494 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 21:28:44 crc kubenswrapper[4795]: E0219 21:28:44.382543 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-19 21:29:00.382534908 +0000 UTC m=+51.575052772 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:28:44 crc kubenswrapper[4795]: E0219 21:28:44.382546 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 21:28:44 crc kubenswrapper[4795]: E0219 21:28:44.382565 4795 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:28:44 crc kubenswrapper[4795]: E0219 21:28:44.382628 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-19 21:29:00.38261461 +0000 UTC m=+51.575132474 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.453187 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.453226 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.453236 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.453253 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.453266 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:44Z","lastTransitionTime":"2026-02-19T21:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.460439 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 21:11:09.56090668 +0000 UTC Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.511051 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.511137 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.511051 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:28:44 crc kubenswrapper[4795]: E0219 21:28:44.511214 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:28:44 crc kubenswrapper[4795]: E0219 21:28:44.511263 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:28:44 crc kubenswrapper[4795]: E0219 21:28:44.511341 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.555640 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.555684 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.555699 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.555715 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.555726 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:44Z","lastTransitionTime":"2026-02-19T21:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.658019 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.658056 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.658067 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.658084 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.658094 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:44Z","lastTransitionTime":"2026-02-19T21:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.697525 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4qphl_adf5bd36-b46b-4a06-8291-cae9f3988330/ovnkube-controller/1.log" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.698058 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4qphl_adf5bd36-b46b-4a06-8291-cae9f3988330/ovnkube-controller/0.log" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.700267 4795 generic.go:334] "Generic (PLEG): container finished" podID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerID="175d9b8b72214e8c405a22f93d7de35ce4d1c1c7ae10ba8f56f2245d20c54517" exitCode=1 Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.700299 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" event={"ID":"adf5bd36-b46b-4a06-8291-cae9f3988330","Type":"ContainerDied","Data":"175d9b8b72214e8c405a22f93d7de35ce4d1c1c7ae10ba8f56f2245d20c54517"} Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.700338 4795 scope.go:117] "RemoveContainer" containerID="2a0af6987edcb00218a308872d80fd4728d48000c0454e675a8f68bf30dc2359" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.700878 4795 scope.go:117] "RemoveContainer" containerID="175d9b8b72214e8c405a22f93d7de35ce4d1c1c7ae10ba8f56f2245d20c54517" Feb 19 21:28:44 crc kubenswrapper[4795]: E0219 21:28:44.701045 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-4qphl_openshift-ovn-kubernetes(adf5bd36-b46b-4a06-8291-cae9f3988330)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.714263 4795 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:44Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.723901 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.i
o/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:44Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.736300 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-d
ir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure 
cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\
\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:44Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.747397 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45f7fe05-a5f0-4296-80f5-9c8cd59afb95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ae61d10d79a82b8d2e11aee70d7bb13c429feaa4b7ce1a7746cadab1169d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8704324a3c5bb3f3023c0b000b233d650dede0825930494b987121784c094bdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b2ef22256f35fadefb61f59a542aa0df8bc5648285e2624b924391b80766c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad6763b33a3d85ce3446b4056d2703b9f342137b6a4aa23a3ae9870df8d0c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:44Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.756336 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-blzsk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9643227-37ca-4e4a-b9bc-371b18d67edc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6b76680f2d3d45237abba3bfaeadf20719b560077c08f624ca2fbe4cc1af865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmtf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-blzsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:44Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.760013 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.760044 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.760055 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.760072 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.760084 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:44Z","lastTransitionTime":"2026-02-19T21:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.770062 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47780602108de299051f362241c1b15d1d233ab44173752ba23240ff2aad0af7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l29c7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:44Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.780494 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5p6d9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e967392b-9bd8-4111-b1b9-96d503a19668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab48852eb20d566b09c7adda6c6afee64240486d91e3788538ed9b47fbcae59d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7
eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qx6hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\"
:\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5p6d9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:44Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.787736 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jvnv5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://558f4c8ca1cd10a6c978aa69d9df061716040e6eb4b31c34bb3b69008bae2c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89q2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jvnv5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:44Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.805762 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"405c08bd-3feb-45df-993f-2cb5e737cb6a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9415ce5a9989f2b286fff8f33781b5f1c510b61ef5e7af7050ed7f1a322dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe281962bb5076796c2b27f81ffe63c9310bf49f5989714cbe2fc0fe5b6f32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://689f35bbe7bc51411e25a7a6e4bce91b45d819ed6697ed322c4d5b4b3a244451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2cdc3d1e391f142caa70f86039fc213117c5a49f6e39eb770867385d781eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cd220ea624daba9c66de834e5fb2b0dd067d489fa005bacca4280b010e2ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:44Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.816264 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:44Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.826683 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:44Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.836886 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7591bc58-96f5-486a-8653-0ad93938b019\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9964359f056111e41a0c304a0100fea26514cdc34f3e64f376a682913d6742f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bbc1f90994253aba517d1de97087dbd61793434
3fdb8389588863bf1305b72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxj5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:44Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.847235 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:44Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.855790 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.855837 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.855846 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.855860 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.855869 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:44Z","lastTransitionTime":"2026-02-19T21:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.857331 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:44Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:44 crc kubenswrapper[4795]: E0219 21:28:44.866434 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0a3e1a0-f657-4990-877d-cc9b59bcb3d8\\\",\\\"systemUUID\\\":\\\"5c733625-a853-45dd-88a0-4f8c78e571ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:44Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.869093 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.869116 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.869125 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.869138 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.869148 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:44Z","lastTransitionTime":"2026-02-19T21:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.875857 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf5bd36-b46b-4a06-8291-cae9f3988330\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://175d9b8b72214e8c405a22f93d7de35ce4d1c1c7ae10ba8f56f2245d20c54517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a0af6987edcb00218a308872d80fd4728d48000c0454e675a8f68bf30dc2359\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T21:28:43Z\\\",\\\"message\\\":\\\" 6105 handler.go:208] Removed *v1.Node event handler 2\\\\nI0219 21:28:43.391192 6105 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0219 21:28:43.391243 6105 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0219 21:28:43.391249 6105 handler.go:190] Sending *v1.Namespace event handler 5 for 
removal\\\\nI0219 21:28:43.391260 6105 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0219 21:28:43.391273 6105 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0219 21:28:43.391288 6105 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0219 21:28:43.391303 6105 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0219 21:28:43.391308 6105 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0219 21:28:43.391308 6105 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0219 21:28:43.391320 6105 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0219 21:28:43.391330 6105 factory.go:656] Stopping watch factory\\\\nI0219 21:28:43.391334 6105 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0219 21:28:43.391323 6105 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0219 21:28:43.391350 6105 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0219 21:28:43.391356 6105 ovnkube.go:599] Stopped ovnkube\\\\nI0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://175d9b8b72214e8c405a22f93d7de35ce4d1c1c7ae10ba8f56f2245d20c54517\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"message\\\":\\\"cal for Pod openshift-multus/multus-additional-cni-plugins-l29c7 in node crc\\\\nI0219 21:28:44.491789 6229 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-additional-cni-plugins-l29c7 after 0 failed attempt(s)\\\\nI0219 21:28:44.491795 6229 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-additional-cni-plugins-l29c7\\\\nI0219 21:28:44.491778 6229 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service 
k8s.ovn.org/owner:openshift-console/console]} name:Service_openshift-console/console_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.194:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d7d7b270-1480-47f8-bdf9-690dbab310cb}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0219 21:28:44.491057 6229 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} wa\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\
\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.i
o/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4qphl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:44Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:44 crc kubenswrapper[4795]: E0219 21:28:44.878839 4795 kubelet_node_status.go:585] "Error updating node 
status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0a3e1a0-f657-4990-877d-cc9b59bcb3d8\\\",\\\"systemUUID\\\":\\\"5c733625-a853-45dd-88a0-4f8c78e571ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:44Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.881968 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.881992 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.882003 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.882017 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.882026 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:44Z","lastTransitionTime":"2026-02-19T21:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:44 crc kubenswrapper[4795]: E0219 21:28:44.892231 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0a3e1a0-f657-4990-877d-cc9b59bcb3d8\\\",\\\"systemUUID\\\":\\\"5c733625-a853-45dd-88a0-4f8c78e571ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:44Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.894954 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.894980 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.894988 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.894998 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.895005 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:44Z","lastTransitionTime":"2026-02-19T21:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:44 crc kubenswrapper[4795]: E0219 21:28:44.904789 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0a3e1a0-f657-4990-877d-cc9b59bcb3d8\\\",\\\"systemUUID\\\":\\\"5c733625-a853-45dd-88a0-4f8c78e571ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:44Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.908085 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.908115 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.908124 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.908138 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.908147 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:44Z","lastTransitionTime":"2026-02-19T21:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:44 crc kubenswrapper[4795]: E0219 21:28:44.917640 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0a3e1a0-f657-4990-877d-cc9b59bcb3d8\\\",\\\"systemUUID\\\":\\\"5c733625-a853-45dd-88a0-4f8c78e571ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:44Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:44 crc kubenswrapper[4795]: E0219 21:28:44.917745 4795 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.918967 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.918995 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.919004 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.919018 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:44 crc kubenswrapper[4795]: I0219 21:28:44.919026 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:44Z","lastTransitionTime":"2026-02-19T21:28:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.021449 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.021499 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.021509 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.021522 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.021530 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:45Z","lastTransitionTime":"2026-02-19T21:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.124320 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.124362 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.124373 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.124391 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.124405 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:45Z","lastTransitionTime":"2026-02-19T21:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.226224 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.226444 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.226548 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.226636 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.226729 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:45Z","lastTransitionTime":"2026-02-19T21:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.328993 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.329033 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.329044 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.329058 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.329068 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:45Z","lastTransitionTime":"2026-02-19T21:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.431052 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.431082 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.431089 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.431102 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.431110 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:45Z","lastTransitionTime":"2026-02-19T21:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.461525 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 21:57:45.846761219 +0000 UTC Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.534682 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.534756 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.534777 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.534805 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.534827 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:45Z","lastTransitionTime":"2026-02-19T21:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.637725 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.637791 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.637806 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.637831 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.637849 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:45Z","lastTransitionTime":"2026-02-19T21:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.708430 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4qphl_adf5bd36-b46b-4a06-8291-cae9f3988330/ovnkube-controller/1.log" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.713523 4795 scope.go:117] "RemoveContainer" containerID="175d9b8b72214e8c405a22f93d7de35ce4d1c1c7ae10ba8f56f2245d20c54517" Feb 19 21:28:45 crc kubenswrapper[4795]: E0219 21:28:45.713744 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-4qphl_openshift-ovn-kubernetes(adf5bd36-b46b-4a06-8291-cae9f3988330)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.735698 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e
901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:45Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.740390 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.740429 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.740443 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.740464 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.740478 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:45Z","lastTransitionTime":"2026-02-19T21:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.754641 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45f7fe05-a5f0-4296-80f5-9c8cd59afb95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ae61d10d79a82b8d2e11aee70d7bb13c429feaa4b7ce1a7746cadab1169d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8704324a3c5
bb3f3023c0b000b233d650dede0825930494b987121784c094bdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b2ef22256f35fadefb61f59a542aa0df8bc5648285e2624b924391b80766c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad6763b33a3d85ce3446b4056d2703b9f342137b6a4aa23a3ae9870df8d0c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:45Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.769469 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:45Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.782475 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:45Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.801130 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5p6d9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e967392b-9bd8-4111-b1b9-96d503a19668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab48852eb20d566b09c7adda6c6afee64240486d91e3788538ed9b47fbcae59d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multu
s\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qx6hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"start
Time\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5p6d9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:45Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.812392 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jvnv5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://558f4c8ca1cd10a6c978aa69d9df061716040e6eb4b31c34bb3b69008bae2c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\
\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89q2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jvnv5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:45Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.835007 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"405c08bd-3feb-45df-993f-2cb5e737cb6a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9415ce5a9989f2b286fff8f33781b5f1c510b61ef5e7af7050ed7f1a322dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe281962bb5076796c2b27f81ffe63c9310bf49f5989714cbe2fc0fe5b6f32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://689f35bbe7bc51411e25a7a6e4bce91b45d819ed6697ed322c4d5b4b3a244451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2cdc3d1e391f142caa70f86039fc213117c5a49f6e39eb770867385d781eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cd220ea624daba9c66de834e5fb2b0dd067d489fa005bacca4280b010e2ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:45Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.843084 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.843126 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.843139 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.843156 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.843196 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:45Z","lastTransitionTime":"2026-02-19T21:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.855043 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:45Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.868255 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-blzsk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9643227-37ca-4e4a-b9bc-371b18d67edc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6b76680f2d3d45237abba3bfaeadf20719b560077c08f624ca2fbe4cc1af865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmtf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-blzsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:45Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.887001 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47780602108de299051f362241c1b15d1d233ab44173752ba23240ff2aad0af7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade50328702
63b86e7109a5844eae140cb7acdac094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l29c7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:45Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.901952 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:45Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.914479 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfpbz"] Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.914931 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfpbz" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.916891 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.916996 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.917491 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7591bc58-96f5-486a-8653-0ad93938b019\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9964359f056111e41a0c304a0100fea26514cdc34f3e64f376a682913d6742f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66
438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bbc1f90994253aba517d1de97087dbd617934343fdb8389588863bf1305b72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxj5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:45Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.942125 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf5bd36-b46b-4a06-8291-cae9f3988330\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://175d9b8b72214e8c405a22f93d7de35ce4d1c1c7ae10ba8f56f2245d20c54517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://175d9b8b72214e8c405a22f93d7de35ce4d1c1c7ae10ba8f56f2245d20c54517\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"message\\\":\\\"cal for Pod openshift-multus/multus-additional-cni-plugins-l29c7 in node crc\\\\nI0219 21:28:44.491789 6229 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-additional-cni-plugins-l29c7 after 0 failed attempt(s)\\\\nI0219 21:28:44.491795 6229 default_network_controller.go:776] Recording success event on pod 
openshift-multus/multus-additional-cni-plugins-l29c7\\\\nI0219 21:28:44.491778 6229 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-console/console]} name:Service_openshift-console/console_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.194:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d7d7b270-1480-47f8-bdf9-690dbab310cb}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0219 21:28:44.491057 6229 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} wa\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4qphl_openshift-ovn-kubernetes(adf5bd36-b46b-4a06-8291-cae9f3988330)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdba0068125036886d
c7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4qphl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:45Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.945492 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.945552 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.945566 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.945581 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.945590 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:45Z","lastTransitionTime":"2026-02-19T21:28:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.961353 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:45Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.972941 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:45Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.981460 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-blzsk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9643227-37ca-4e4a-b9bc-371b18d67edc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6b76680f2d3d45237abba3bfaeadf20719b560077c08f624ca2fbe4cc1af865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmtf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-blzsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:45Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.999595 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/67ec5415-424e-40b7-9beb-171cd1f3dbe9-env-overrides\") pod \"ovnkube-control-plane-749d76644c-nfpbz\" (UID: \"67ec5415-424e-40b7-9beb-171cd1f3dbe9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfpbz" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.999706 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/67ec5415-424e-40b7-9beb-171cd1f3dbe9-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-nfpbz\" (UID: \"67ec5415-424e-40b7-9beb-171cd1f3dbe9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfpbz" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.999866 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/67ec5415-424e-40b7-9beb-171cd1f3dbe9-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-nfpbz\" (UID: \"67ec5415-424e-40b7-9beb-171cd1f3dbe9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfpbz" Feb 19 21:28:45 crc kubenswrapper[4795]: I0219 21:28:45.999930 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-kmblv\" (UniqueName: \"kubernetes.io/projected/67ec5415-424e-40b7-9beb-171cd1f3dbe9-kube-api-access-kmblv\") pod \"ovnkube-control-plane-749d76644c-nfpbz\" (UID: \"67ec5415-424e-40b7-9beb-171cd1f3dbe9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfpbz" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.000695 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47780602108de299051f362241c1b15d1d233ab44173752ba23240ff2aad0af7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPat
h\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c
2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.
d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l29c7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:45Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.015305 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5p6d9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e967392b-9bd8-4111-b1b9-96d503a19668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab48852eb20d566b09c7adda6c6a
fee64240486d91e3788538ed9b47fbcae59d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/s
ecrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qx6hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5p6d9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:46Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.026501 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jvnv5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://558f4c8ca1cd10a6c978aa69d9
df061716040e6eb4b31c34bb3b69008bae2c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89q2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jvnv5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:46Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.047440 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.047480 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.047492 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.047510 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.047523 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:46Z","lastTransitionTime":"2026-02-19T21:28:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.053240 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"405c08bd-3feb-45df-993f-2cb5e737cb6a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9415ce5a9989f2b286fff8f33781b5f1c510b61ef5e7af7050ed7f1a322dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a6731473
1ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe281962bb5076796c2b27f81ffe63c9310bf49f5989714cbe2fc0fe5b6f32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://689f35bbe7bc51411e25a7a6e4bce91b45d819ed6697ed322c4d5b4b3a244451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2cdc3d1e391f142caa70f86039fc213117c5a49f6e39eb770867385d781eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cd220ea624daba9c66de834e5fb2b0dd067d489fa005bacca4280b010e2ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"ce
rt-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"image\
\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:46Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.069991 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:46Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.101530 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/67ec5415-424e-40b7-9beb-171cd1f3dbe9-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-nfpbz\" (UID: \"67ec5415-424e-40b7-9beb-171cd1f3dbe9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfpbz" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.101679 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/67ec5415-424e-40b7-9beb-171cd1f3dbe9-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-nfpbz\" (UID: \"67ec5415-424e-40b7-9beb-171cd1f3dbe9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfpbz" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.101717 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmblv\" (UniqueName: 
\"kubernetes.io/projected/67ec5415-424e-40b7-9beb-171cd1f3dbe9-kube-api-access-kmblv\") pod \"ovnkube-control-plane-749d76644c-nfpbz\" (UID: \"67ec5415-424e-40b7-9beb-171cd1f3dbe9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfpbz" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.101757 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/67ec5415-424e-40b7-9beb-171cd1f3dbe9-env-overrides\") pod \"ovnkube-control-plane-749d76644c-nfpbz\" (UID: \"67ec5415-424e-40b7-9beb-171cd1f3dbe9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfpbz" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.102676 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/67ec5415-424e-40b7-9beb-171cd1f3dbe9-env-overrides\") pod \"ovnkube-control-plane-749d76644c-nfpbz\" (UID: \"67ec5415-424e-40b7-9beb-171cd1f3dbe9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfpbz" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.102691 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/67ec5415-424e-40b7-9beb-171cd1f3dbe9-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-nfpbz\" (UID: \"67ec5415-424e-40b7-9beb-171cd1f3dbe9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfpbz" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.106681 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/67ec5415-424e-40b7-9beb-171cd1f3dbe9-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-nfpbz\" (UID: \"67ec5415-424e-40b7-9beb-171cd1f3dbe9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfpbz" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 
21:28:46.119941 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d
1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:46Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.136652 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7591bc58-96f5-486a-8653-0ad93938b019\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9964359f056111e41a0c304a0100fea26514cdc34f3e64f376a682913d6742f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bbc1f90994253aba517d1de97087dbd61793434
3fdb8389588863bf1305b72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxj5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:46Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.141695 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmblv\" (UniqueName: \"kubernetes.io/projected/67ec5415-424e-40b7-9beb-171cd1f3dbe9-kube-api-access-kmblv\") pod \"ovnkube-control-plane-749d76644c-nfpbz\" (UID: \"67ec5415-424e-40b7-9beb-171cd1f3dbe9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfpbz" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.150613 4795 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.150670 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.150683 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.150700 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.150712 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:46Z","lastTransitionTime":"2026-02-19T21:28:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.151536 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:46Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.163680 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:46Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.179343 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf5bd36-b46b-4a06-8291-cae9f3988330\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://175d9b8b72214e8c405a22f93d7de35ce4d1c1c7ae10ba8f56f2245d20c54517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://175d9b8b72214e8c405a22f93d7de35ce4d1c1c7ae10ba8f56f2245d20c54517\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"message\\\":\\\"cal for Pod openshift-multus/multus-additional-cni-plugins-l29c7 in node crc\\\\nI0219 21:28:44.491789 6229 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-additional-cni-plugins-l29c7 after 0 failed attempt(s)\\\\nI0219 21:28:44.491795 6229 default_network_controller.go:776] Recording success event on pod 
openshift-multus/multus-additional-cni-plugins-l29c7\\\\nI0219 21:28:44.491778 6229 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-console/console]} name:Service_openshift-console/console_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.194:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d7d7b270-1480-47f8-bdf9-690dbab310cb}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0219 21:28:44.491057 6229 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} wa\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4qphl_openshift-ovn-kubernetes(adf5bd36-b46b-4a06-8291-cae9f3988330)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdba0068125036886d
c7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4qphl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:46Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.190029 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfpbz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67ec5415-424e-40b7-9beb-171cd1f3dbe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmblv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmblv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nfpbz\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:46Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.203310 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\
\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:46Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.215789 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\"
:true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:46Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.228812 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfpbz" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.232477 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\
":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apise
rver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 
21:28:28.530865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:46Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:46 crc kubenswrapper[4795]: W0219 21:28:46.239576 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67ec5415_424e_40b7_9beb_171cd1f3dbe9.slice/crio-46647e0b4c0675fefd39ff37753ffb51485edbf8b70456292a38fb7676ee7efd WatchSource:0}: Error finding container 46647e0b4c0675fefd39ff37753ffb51485edbf8b70456292a38fb7676ee7efd: Status 404 returned error can't find the container with id 46647e0b4c0675fefd39ff37753ffb51485edbf8b70456292a38fb7676ee7efd Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.245981 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45f7fe05-a5f0-4296-80f5-9c8cd59afb95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ae61d10d79a82b8d2e11aee70d7bb13c429feaa4b7ce1a7746cadab1169d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8704324a3c5bb3f3023c0b000b233d650dede0825930494b987121784c094bdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b2ef22256f35fadefb61f59a542aa0df8bc5648285e2624b924391b80766c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad6763b33a3d85ce3446b4056d2703b9f342137b6a4aa23a3ae9870df8d0c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:46Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.253038 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.253076 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.253086 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.253100 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.253110 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:46Z","lastTransitionTime":"2026-02-19T21:28:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.355663 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.355700 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.355709 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.355726 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.355735 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:46Z","lastTransitionTime":"2026-02-19T21:28:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.492962 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 09:16:55.111097551 +0000 UTC Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.495716 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.495784 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.495800 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.495823 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.495836 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:46Z","lastTransitionTime":"2026-02-19T21:28:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.511631 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:28:46 crc kubenswrapper[4795]: E0219 21:28:46.511751 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.511810 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:28:46 crc kubenswrapper[4795]: E0219 21:28:46.511937 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.512929 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:28:46 crc kubenswrapper[4795]: E0219 21:28:46.513049 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.597849 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.597896 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.597912 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.597935 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.597954 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:46Z","lastTransitionTime":"2026-02-19T21:28:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.638333 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.700044 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.700085 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.700096 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.700115 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.700126 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:46Z","lastTransitionTime":"2026-02-19T21:28:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.716539 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfpbz" event={"ID":"67ec5415-424e-40b7-9beb-171cd1f3dbe9","Type":"ContainerStarted","Data":"ce91447a4948ba71025642de9bcde76acb613b1a110eab19b334f244114636f4"} Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.716588 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfpbz" event={"ID":"67ec5415-424e-40b7-9beb-171cd1f3dbe9","Type":"ContainerStarted","Data":"230058a44868b325845b4045f4c763a9a9dbc95cfdf8fe94d86cee2799e9b1b6"} Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.716601 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfpbz" event={"ID":"67ec5415-424e-40b7-9beb-171cd1f3dbe9","Type":"ContainerStarted","Data":"46647e0b4c0675fefd39ff37753ffb51485edbf8b70456292a38fb7676ee7efd"} Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.717105 4795 scope.go:117] "RemoveContainer" containerID="175d9b8b72214e8c405a22f93d7de35ce4d1c1c7ae10ba8f56f2245d20c54517" Feb 19 21:28:46 crc kubenswrapper[4795]: E0219 21:28:46.717260 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-4qphl_openshift-ovn-kubernetes(adf5bd36-b46b-4a06-8291-cae9f3988330)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.736318 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"405c08bd-3feb-45df-993f-2cb5e737cb6a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9415ce5a9989f2b286fff8f33781b5f1c510b61ef5e7af7050ed7f1a322dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe281962bb5076796c2b27f81ffe63c9310bf49f5989714cbe2fc0fe5b6f32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://689f35bbe7bc51411e25a7a6e4bce91b45d819ed6697ed322c4d5b4b3a244451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2cdc3d1e391f142caa70f86039fc213117c5a49f6e39eb770867385d781eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cd220ea624daba9c66de834e5fb2b0dd067d489fa005bacca4280b010e2ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:46Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.746701 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:46Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.756256 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-blzsk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9643227-37ca-4e4a-b9bc-371b18d67edc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6b76680f2d3d45237abba3bfaeadf20719b560077c08f624ca2fbe4cc1af865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmtf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-blzsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:46Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.768480 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47780602108de299051f362241c1b15d1d233ab44173752ba23240ff2aad0af7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade50328702
63b86e7109a5844eae140cb7acdac094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l29c7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:46Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.778906 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5p6d9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e967392b-9bd8-4111-b1b9-96d503a19668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab48852eb20d566b09c7adda6c6afee64240486d91e3788538ed9b47fbcae59d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qx6hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5p6d9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:46Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.787809 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jvnv5" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://558f4c8ca1cd10a6c978aa69d9df061716040e6eb4b31c34bb3b69008bae2c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89q2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jvnv5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:46Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.797576 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
6-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:46Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.802431 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.802461 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.802470 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.802483 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.802491 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:46Z","lastTransitionTime":"2026-02-19T21:28:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.808991 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7591bc58-96f5-486a-8653-0ad93938b019\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9964359f056111e41a0c304a0100fea26514cdc34f3e64f376a682913d6742f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bbc1f90994253aba517d1de97087dbd617934343fdb8389588863bf1305b72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxj5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:46Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.819417 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:46Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.829996 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:46Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.853904 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf5bd36-b46b-4a06-8291-cae9f3988330\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://175d9b8b72214e8c405a22f93d7de35ce4d1c1c7ae10ba8f56f2245d20c54517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://175d9b8b72214e8c405a22f93d7de35ce4d1c1c7ae10ba8f56f2245d20c54517\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"message\\\":\\\"cal for Pod openshift-multus/multus-additional-cni-plugins-l29c7 in node crc\\\\nI0219 21:28:44.491789 6229 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-additional-cni-plugins-l29c7 after 0 failed attempt(s)\\\\nI0219 21:28:44.491795 6229 default_network_controller.go:776] Recording success event on pod 
openshift-multus/multus-additional-cni-plugins-l29c7\\\\nI0219 21:28:44.491778 6229 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-console/console]} name:Service_openshift-console/console_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.194:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d7d7b270-1480-47f8-bdf9-690dbab310cb}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0219 21:28:44.491057 6229 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} wa\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4qphl_openshift-ovn-kubernetes(adf5bd36-b46b-4a06-8291-cae9f3988330)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdba0068125036886d
c7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4qphl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:46Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.864634 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfpbz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67ec5415-424e-40b7-9beb-171cd1f3dbe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://230058a44868b325845b4045f4c763a9a9dbc95cfdf8fe94d86cee2799e9b1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmblv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce91447a4948ba71025642de9bcde76acb613
b1a110eab19b334f244114636f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmblv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nfpbz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:46Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.882418 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e
901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:46Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.895475 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45f7fe05-a5f0-4296-80f5-9c8cd59afb95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ae61d10d79a82b8d2e11aee70d7bb13c429feaa4b7ce1a7746cadab1169d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8704324a3c5bb3f3023c0b000b233d650dede0825930494b987121784c094bdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b2ef22256f35fadefb61f59a542aa0df8bc5648285e2624b924391b80766c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad6763b33a3d85ce3446b4056d2703b9f342137b6a4aa23a3ae9870df8d0c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:46Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.904906 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.904936 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.904946 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.904963 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.904974 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:46Z","lastTransitionTime":"2026-02-19T21:28:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.908813 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:46Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:46 crc kubenswrapper[4795]: I0219 21:28:46.918293 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"ipta
bles-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:46Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.008055 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.008116 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.008134 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.008160 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.008250 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:47Z","lastTransitionTime":"2026-02-19T21:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.111085 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.111123 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.111132 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.111146 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.111155 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:47Z","lastTransitionTime":"2026-02-19T21:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.214329 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.214373 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.214381 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.214399 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.214409 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:47Z","lastTransitionTime":"2026-02-19T21:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.316600 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.316656 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.316665 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.316678 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.316689 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:47Z","lastTransitionTime":"2026-02-19T21:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.369223 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-ff4bs"] Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.369647 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:28:47 crc kubenswrapper[4795]: E0219 21:28:47.369700 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.383932 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ff4bs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b1b4346-e02e-4614-b2ff-e4628046a92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rw77f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rw77f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ff4bs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:47Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:47 crc 
kubenswrapper[4795]: I0219 21:28:47.414583 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"405c08bd-3feb-45df-993f-2cb5e737cb6a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9415ce5a9989f2b286fff8f33781b5f1c510b61ef5e7af7050ed7f1a322dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://8fe281962bb5076796c2b27f81ffe63c9310bf49f5989714cbe2fc0fe5b6f32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://689f35bbe7bc51411e25a7a6e4bce91b45d819ed6697ed322c4d5b4b3a244451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2cdc3d1e391f142caa70f86039fc213117c5a49f6e39eb770867385d781eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cd220ea624daba9c66de834e5fb2b0dd067d489fa005bacca4280b010e2ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:47Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.416111 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b1b4346-e02e-4614-b2ff-e4628046a92f-metrics-certs\") pod \"network-metrics-daemon-ff4bs\" (UID: \"1b1b4346-e02e-4614-b2ff-e4628046a92f\") " pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.416233 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rw77f\" (UniqueName: \"kubernetes.io/projected/1b1b4346-e02e-4614-b2ff-e4628046a92f-kube-api-access-rw77f\") pod \"network-metrics-daemon-ff4bs\" (UID: \"1b1b4346-e02e-4614-b2ff-e4628046a92f\") " pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.418671 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.418725 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:47 crc 
kubenswrapper[4795]: I0219 21:28:47.418739 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.418757 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.418770 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:47Z","lastTransitionTime":"2026-02-19T21:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.431596 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:47Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.444829 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-blzsk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9643227-37ca-4e4a-b9bc-371b18d67edc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6b76680f2d3d45237abba3bfaeadf20719b560077c08f624ca2fbe4cc1af865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmtf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-blzsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:47Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.461349 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47780602108de299051f362241c1b15d1d233ab44173752ba23240ff2aad0af7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade50328702
63b86e7109a5844eae140cb7acdac094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l29c7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:47Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.479656 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5p6d9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e967392b-9bd8-4111-b1b9-96d503a19668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab48852eb20d566b09c7adda6c6afee64240486d91e3788538ed9b47fbcae59d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qx6hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5p6d9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:47Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.490500 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jvnv5" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://558f4c8ca1cd10a6c978aa69d9df061716040e6eb4b31c34bb3b69008bae2c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89q2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jvnv5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:47Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.493974 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 10:13:56.151111468 +0000 UTC Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.503610 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:47Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.517051 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7591bc58-96f5-486a-8653-0ad93938b019\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9964359f056111e41a0c304a0100fea26514cdc34f3e64f376a682913d6742f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bbc1f90994253aba517d1de97087dbd61793434
3fdb8389588863bf1305b72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxj5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:47Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.517534 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b1b4346-e02e-4614-b2ff-e4628046a92f-metrics-certs\") pod \"network-metrics-daemon-ff4bs\" (UID: \"1b1b4346-e02e-4614-b2ff-e4628046a92f\") " pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.517586 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-rw77f\" (UniqueName: \"kubernetes.io/projected/1b1b4346-e02e-4614-b2ff-e4628046a92f-kube-api-access-rw77f\") pod \"network-metrics-daemon-ff4bs\" (UID: \"1b1b4346-e02e-4614-b2ff-e4628046a92f\") " pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:28:47 crc kubenswrapper[4795]: E0219 21:28:47.517684 4795 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 21:28:47 crc kubenswrapper[4795]: E0219 21:28:47.517772 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b1b4346-e02e-4614-b2ff-e4628046a92f-metrics-certs podName:1b1b4346-e02e-4614-b2ff-e4628046a92f nodeName:}" failed. No retries permitted until 2026-02-19 21:28:48.017749655 +0000 UTC m=+39.210267589 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1b1b4346-e02e-4614-b2ff-e4628046a92f-metrics-certs") pod "network-metrics-daemon-ff4bs" (UID: "1b1b4346-e02e-4614-b2ff-e4628046a92f") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.521212 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.521256 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.521268 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.521283 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.521293 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:47Z","lastTransitionTime":"2026-02-19T21:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.534089 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:47Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.541636 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rw77f\" (UniqueName: \"kubernetes.io/projected/1b1b4346-e02e-4614-b2ff-e4628046a92f-kube-api-access-rw77f\") pod \"network-metrics-daemon-ff4bs\" (UID: \"1b1b4346-e02e-4614-b2ff-e4628046a92f\") " pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.550880 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:47Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.569565 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf5bd36-b46b-4a06-8291-cae9f3988330\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://175d9b8b72214e8c405a22f93d7de35ce4d1c1c7ae10ba8f56f2245d20c54517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://175d9b8b72214e8c405a22f93d7de35ce4d1c1c7ae10ba8f56f2245d20c54517\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"message\\\":\\\"cal for Pod openshift-multus/multus-additional-cni-plugins-l29c7 in node crc\\\\nI0219 21:28:44.491789 6229 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-additional-cni-plugins-l29c7 after 0 failed attempt(s)\\\\nI0219 21:28:44.491795 6229 default_network_controller.go:776] Recording success event on pod 
openshift-multus/multus-additional-cni-plugins-l29c7\\\\nI0219 21:28:44.491778 6229 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-console/console]} name:Service_openshift-console/console_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.194:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d7d7b270-1480-47f8-bdf9-690dbab310cb}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0219 21:28:44.491057 6229 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} wa\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4qphl_openshift-ovn-kubernetes(adf5bd36-b46b-4a06-8291-cae9f3988330)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdba0068125036886d
c7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4qphl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:47Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.585353 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfpbz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67ec5415-424e-40b7-9beb-171cd1f3dbe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://230058a44868b325845b4045f4c763a9a9dbc95cfdf8fe94d86cee2799e9b1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmblv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce91447a4948ba71025642de9bcde76acb613
b1a110eab19b334f244114636f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmblv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nfpbz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:47Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.599759 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e
901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:47Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.613019 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45f7fe05-a5f0-4296-80f5-9c8cd59afb95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ae61d10d79a82b8d2e11aee70d7bb13c429feaa4b7ce1a7746cadab1169d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8704324a3c5bb3f3023c0b000b233d650dede0825930494b987121784c094bdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b2ef22256f35fadefb61f59a542aa0df8bc5648285e2624b924391b80766c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad6763b33a3d85ce3446b4056d2703b9f342137b6a4aa23a3ae9870df8d0c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:47Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.623250 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.623312 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.623332 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.623358 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.623377 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:47Z","lastTransitionTime":"2026-02-19T21:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.628907 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:47Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.641626 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"ipta
bles-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:47Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.725069 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.725102 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.725112 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.725124 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.725138 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:47Z","lastTransitionTime":"2026-02-19T21:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.826934 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.826981 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.826989 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.827007 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.827016 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:47Z","lastTransitionTime":"2026-02-19T21:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.929439 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.929470 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.929479 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.929492 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:47 crc kubenswrapper[4795]: I0219 21:28:47.929501 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:47Z","lastTransitionTime":"2026-02-19T21:28:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.022841 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b1b4346-e02e-4614-b2ff-e4628046a92f-metrics-certs\") pod \"network-metrics-daemon-ff4bs\" (UID: \"1b1b4346-e02e-4614-b2ff-e4628046a92f\") " pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:28:48 crc kubenswrapper[4795]: E0219 21:28:48.022961 4795 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 21:28:48 crc kubenswrapper[4795]: E0219 21:28:48.023013 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b1b4346-e02e-4614-b2ff-e4628046a92f-metrics-certs podName:1b1b4346-e02e-4614-b2ff-e4628046a92f nodeName:}" failed. No retries permitted until 2026-02-19 21:28:49.022999035 +0000 UTC m=+40.215516889 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1b1b4346-e02e-4614-b2ff-e4628046a92f-metrics-certs") pod "network-metrics-daemon-ff4bs" (UID: "1b1b4346-e02e-4614-b2ff-e4628046a92f") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.031842 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.031878 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.031887 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.031899 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.031908 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:48Z","lastTransitionTime":"2026-02-19T21:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.134122 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.134176 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.134187 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.134202 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.134211 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:48Z","lastTransitionTime":"2026-02-19T21:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.236244 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.236284 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.236292 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.236305 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.236315 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:48Z","lastTransitionTime":"2026-02-19T21:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.339018 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.339046 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.339056 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.339070 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.339079 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:48Z","lastTransitionTime":"2026-02-19T21:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.440985 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.441042 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.441063 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.441092 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.441114 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:48Z","lastTransitionTime":"2026-02-19T21:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.494264 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 16:28:20.792601536 +0000 UTC Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.510853 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.510907 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.510867 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:28:48 crc kubenswrapper[4795]: E0219 21:28:48.511120 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:28:48 crc kubenswrapper[4795]: E0219 21:28:48.511229 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:28:48 crc kubenswrapper[4795]: E0219 21:28:48.511302 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.543528 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.543568 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.543578 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.543594 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.543605 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:48Z","lastTransitionTime":"2026-02-19T21:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.645823 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.645895 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.645920 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.645949 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.645972 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:48Z","lastTransitionTime":"2026-02-19T21:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.748981 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.749031 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.749043 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.749062 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.749074 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:48Z","lastTransitionTime":"2026-02-19T21:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.851645 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.851694 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.851704 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.851718 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.851726 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:48Z","lastTransitionTime":"2026-02-19T21:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.953795 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.953825 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.953833 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.953847 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:48 crc kubenswrapper[4795]: I0219 21:28:48.953857 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:48Z","lastTransitionTime":"2026-02-19T21:28:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.031613 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b1b4346-e02e-4614-b2ff-e4628046a92f-metrics-certs\") pod \"network-metrics-daemon-ff4bs\" (UID: \"1b1b4346-e02e-4614-b2ff-e4628046a92f\") " pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:28:49 crc kubenswrapper[4795]: E0219 21:28:49.031794 4795 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 21:28:49 crc kubenswrapper[4795]: E0219 21:28:49.031854 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b1b4346-e02e-4614-b2ff-e4628046a92f-metrics-certs podName:1b1b4346-e02e-4614-b2ff-e4628046a92f nodeName:}" failed. No retries permitted until 2026-02-19 21:28:51.031836298 +0000 UTC m=+42.224354162 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1b1b4346-e02e-4614-b2ff-e4628046a92f-metrics-certs") pod "network-metrics-daemon-ff4bs" (UID: "1b1b4346-e02e-4614-b2ff-e4628046a92f") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.055631 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.055664 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.055674 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.055687 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.055695 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:49Z","lastTransitionTime":"2026-02-19T21:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.158041 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.158081 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.158091 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.158108 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.158120 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:49Z","lastTransitionTime":"2026-02-19T21:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.260504 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.260535 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.260543 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.260556 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.260566 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:49Z","lastTransitionTime":"2026-02-19T21:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.363106 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.363137 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.363146 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.363158 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.363183 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:49Z","lastTransitionTime":"2026-02-19T21:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.464887 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.464919 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.464928 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.464941 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.464949 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:49Z","lastTransitionTime":"2026-02-19T21:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.494913 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 13:42:06.836952082 +0000 UTC Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.511297 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:28:49 crc kubenswrapper[4795]: E0219 21:28:49.511412 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.531178 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:49Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.548937 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:49Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.567667 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.567983 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.568104 4795 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.568259 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.568368 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:49Z","lastTransitionTime":"2026-02-19T21:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.574878 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf5bd36-b46b-4a06-8291-cae9f3988330\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with 
unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-nod
e-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ov
n-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://175d9b8b72214e8c405a22f93d7de35ce4d1c1c7ae10ba8f56f2245d20c54517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://175d9b8b72214e8c405a22f93d7de35ce4d1c1c7ae10ba8f56f2245d20c54517\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"message\\\":\\\"cal for Pod openshift-multus/multus-additional-cni-plugins-l29c7 in node crc\\\\nI0219 21:28:44.491789 6229 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-additional-cni-plugins-l29c7 after 0 failed attempt(s)\\\\nI0219 21:28:44.491795 6229 default_network_controller.go:776] Recording success event on pod 
openshift-multus/multus-additional-cni-plugins-l29c7\\\\nI0219 21:28:44.491778 6229 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-console/console]} name:Service_openshift-console/console_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.194:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d7d7b270-1480-47f8-bdf9-690dbab310cb}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0219 21:28:44.491057 6229 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} wa\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4qphl_openshift-ovn-kubernetes(adf5bd36-b46b-4a06-8291-cae9f3988330)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdba0068125036886d
c7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4qphl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:49Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.593561 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfpbz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67ec5415-424e-40b7-9beb-171cd1f3dbe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://230058a44868b325845b4045f4c763a9a9dbc95cfdf8fe94d86cee2799e9b1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmblv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce91447a4948ba71025642de9bcde76acb613
b1a110eab19b334f244114636f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmblv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nfpbz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:49Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.617063 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:49Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.630300 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:49Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.643032 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\
":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"co
ntainerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initConta
inerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:49Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.656487 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45f7fe05-a5f0-4296-80f5-9c8cd59afb95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ae61d10d79a82b8d2e11aee70d7bb13c429feaa4b7ce1a7746cadab1169d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8704324a3c5bb3f3023c0b000b233d650dede0825930494b987121784c094bdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b2ef22256f35fadefb61f59a542aa0df8bc5648285e2624b924391b80766c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad6763b33a3d85ce3446b4056d2703b9f342137b6a4aa23a3ae9870df8d0c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:49Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.669642 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-blzsk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9643227-37ca-4e4a-b9bc-371b18d67edc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6b76680f2d3d45237abba3bfaeadf20719b560077c08f624ca2fbe4cc1af865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmtf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-blzsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:49Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.672035 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.672092 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.672118 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.672148 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.672202 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:49Z","lastTransitionTime":"2026-02-19T21:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.686279 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47780602108de299051f362241c1b15d1d233ab44173752ba23240ff2aad0af7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l29c7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:49Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.703976 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5p6d9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e967392b-9bd8-4111-b1b9-96d503a19668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab48852eb20d566b09c7adda6c6afee64240486d91e3788538ed9b47fbcae59d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7
eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qx6hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\"
:\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5p6d9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:49Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.716505 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jvnv5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://558f4c8ca1cd10a6c978aa69d9df061716040e6eb4b31c34bb3b69008bae2c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89q2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jvnv5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:49Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.734641 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ff4bs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b1b4346-e02e-4614-b2ff-e4628046a92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rw77f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rw77f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ff4bs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:49Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:49 crc 
kubenswrapper[4795]: I0219 21:28:49.755537 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"405c08bd-3feb-45df-993f-2cb5e737cb6a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9415ce5a9989f2b286fff8f33781b5f1c510b61ef5e7af7050ed7f1a322dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://8fe281962bb5076796c2b27f81ffe63c9310bf49f5989714cbe2fc0fe5b6f32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://689f35bbe7bc51411e25a7a6e4bce91b45d819ed6697ed322c4d5b4b3a244451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2cdc3d1e391f142caa70f86039fc213117c5a49f6e39eb770867385d781eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cd220ea624daba9c66de834e5fb2b0dd067d489fa005bacca4280b010e2ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:49Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.771371 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:49Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.774820 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.774842 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.774850 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 
21:28:49.774863 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.774872 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:49Z","lastTransitionTime":"2026-02-19T21:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.786511 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:49Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.796263 4795 
status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7591bc58-96f5-486a-8653-0ad93938b019\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9964359f056111e41a0c304a0100fea26514cdc34f3e64f376a682913d6742f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acces
s-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bbc1f90994253aba517d1de97087dbd617934343fdb8389588863bf1305b72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxj5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:49Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.877359 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.877390 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 
21:28:49.877400 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.877415 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.877426 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:49Z","lastTransitionTime":"2026-02-19T21:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.979824 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.979866 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.979877 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.979894 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:49 crc kubenswrapper[4795]: I0219 21:28:49.979905 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:49Z","lastTransitionTime":"2026-02-19T21:28:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.082731 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.082794 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.082812 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.082834 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.082849 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:50Z","lastTransitionTime":"2026-02-19T21:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.185400 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.185727 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.185823 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.185964 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.186062 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:50Z","lastTransitionTime":"2026-02-19T21:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.289223 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.289265 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.289281 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.289304 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.289321 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:50Z","lastTransitionTime":"2026-02-19T21:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.392281 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.392334 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.392359 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.392385 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.392406 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:50Z","lastTransitionTime":"2026-02-19T21:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.495000 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.495025 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.495032 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.495045 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.494997 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 00:42:39.723753432 +0000 UTC Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.495053 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:50Z","lastTransitionTime":"2026-02-19T21:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.511527 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.511544 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:28:50 crc kubenswrapper[4795]: E0219 21:28:50.511605 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.511612 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:28:50 crc kubenswrapper[4795]: E0219 21:28:50.511709 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:28:50 crc kubenswrapper[4795]: E0219 21:28:50.511791 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.597822 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.597880 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.597903 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.597931 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.597952 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:50Z","lastTransitionTime":"2026-02-19T21:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.700248 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.700301 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.700321 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.700344 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.700362 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:50Z","lastTransitionTime":"2026-02-19T21:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.803239 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.803275 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.803288 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.803307 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.803319 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:50Z","lastTransitionTime":"2026-02-19T21:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.906445 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.906487 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.906499 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.906522 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:50 crc kubenswrapper[4795]: I0219 21:28:50.906534 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:50Z","lastTransitionTime":"2026-02-19T21:28:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.009131 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.009180 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.009193 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.009210 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.009223 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:51Z","lastTransitionTime":"2026-02-19T21:28:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.050504 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b1b4346-e02e-4614-b2ff-e4628046a92f-metrics-certs\") pod \"network-metrics-daemon-ff4bs\" (UID: \"1b1b4346-e02e-4614-b2ff-e4628046a92f\") " pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:28:51 crc kubenswrapper[4795]: E0219 21:28:51.050676 4795 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 21:28:51 crc kubenswrapper[4795]: E0219 21:28:51.050730 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b1b4346-e02e-4614-b2ff-e4628046a92f-metrics-certs podName:1b1b4346-e02e-4614-b2ff-e4628046a92f nodeName:}" failed. No retries permitted until 2026-02-19 21:28:55.050713337 +0000 UTC m=+46.243231211 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1b1b4346-e02e-4614-b2ff-e4628046a92f-metrics-certs") pod "network-metrics-daemon-ff4bs" (UID: "1b1b4346-e02e-4614-b2ff-e4628046a92f") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.111741 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.111792 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.111804 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.111820 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.111831 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:51Z","lastTransitionTime":"2026-02-19T21:28:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.214958 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.215010 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.215029 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.215047 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.215059 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:51Z","lastTransitionTime":"2026-02-19T21:28:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.317829 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.318063 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.318127 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.318225 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.318295 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:51Z","lastTransitionTime":"2026-02-19T21:28:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.420375 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.420412 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.420421 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.420436 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.420446 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:51Z","lastTransitionTime":"2026-02-19T21:28:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.495321 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 06:21:17.996560819 +0000 UTC Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.510874 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:28:51 crc kubenswrapper[4795]: E0219 21:28:51.511052 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.521989 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.522028 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.522040 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.522056 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.522068 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:51Z","lastTransitionTime":"2026-02-19T21:28:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.624077 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.624104 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.624120 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.624139 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.624149 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:51Z","lastTransitionTime":"2026-02-19T21:28:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.726771 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.726820 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.726833 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.726852 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.726864 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:51Z","lastTransitionTime":"2026-02-19T21:28:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.828919 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.828949 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.828957 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.828970 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.828979 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:51Z","lastTransitionTime":"2026-02-19T21:28:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.931233 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.931282 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.931294 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.931312 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:51 crc kubenswrapper[4795]: I0219 21:28:51.931327 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:51Z","lastTransitionTime":"2026-02-19T21:28:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.033133 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.033525 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.033627 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.033716 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.033810 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:52Z","lastTransitionTime":"2026-02-19T21:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.136657 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.136702 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.136709 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.136721 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.136730 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:52Z","lastTransitionTime":"2026-02-19T21:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.238707 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.238958 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.239049 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.239135 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.239289 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:52Z","lastTransitionTime":"2026-02-19T21:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.340952 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.340994 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.341003 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.341017 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.341026 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:52Z","lastTransitionTime":"2026-02-19T21:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.443335 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.443474 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.443502 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.443529 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.443550 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:52Z","lastTransitionTime":"2026-02-19T21:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.495977 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 14:15:34.803411244 +0000 UTC Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.511368 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.511372 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:28:52 crc kubenswrapper[4795]: E0219 21:28:52.511838 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.511390 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:28:52 crc kubenswrapper[4795]: E0219 21:28:52.511691 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:28:52 crc kubenswrapper[4795]: E0219 21:28:52.512020 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.545921 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.545954 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.545962 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.545977 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.545987 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:52Z","lastTransitionTime":"2026-02-19T21:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.648422 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.648450 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.648458 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.648470 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.648479 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:52Z","lastTransitionTime":"2026-02-19T21:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.750753 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.750780 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.750788 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.750800 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.750809 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:52Z","lastTransitionTime":"2026-02-19T21:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.853514 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.853815 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.853915 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.854022 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.854147 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:52Z","lastTransitionTime":"2026-02-19T21:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.957190 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.957226 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.957235 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.957258 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:52 crc kubenswrapper[4795]: I0219 21:28:52.957269 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:52Z","lastTransitionTime":"2026-02-19T21:28:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.064478 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.064710 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.064785 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.064864 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.064929 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:53Z","lastTransitionTime":"2026-02-19T21:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.166687 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.166740 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.166752 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.166764 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.166772 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:53Z","lastTransitionTime":"2026-02-19T21:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.268684 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.268925 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.269003 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.269075 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.269145 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:53Z","lastTransitionTime":"2026-02-19T21:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.371453 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.371698 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.371801 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.371885 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.371963 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:53Z","lastTransitionTime":"2026-02-19T21:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.474432 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.474689 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.474758 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.474834 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.474902 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:53Z","lastTransitionTime":"2026-02-19T21:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.496546 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 03:19:07.853274193 +0000 UTC Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.511054 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:28:53 crc kubenswrapper[4795]: E0219 21:28:53.511346 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.577084 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.577153 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.577205 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.577230 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.577248 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:53Z","lastTransitionTime":"2026-02-19T21:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.680838 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.680894 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.680906 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.680921 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.680932 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:53Z","lastTransitionTime":"2026-02-19T21:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.783290 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.783350 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.783368 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.783391 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.783411 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:53Z","lastTransitionTime":"2026-02-19T21:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.886857 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.886897 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.886911 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.886928 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.886940 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:53Z","lastTransitionTime":"2026-02-19T21:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.989871 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.989931 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.989950 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.989975 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:53 crc kubenswrapper[4795]: I0219 21:28:53.989996 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:53Z","lastTransitionTime":"2026-02-19T21:28:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.092702 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.092782 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.092804 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.092832 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.092853 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:54Z","lastTransitionTime":"2026-02-19T21:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.196303 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.196352 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.196364 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.196382 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.196394 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:54Z","lastTransitionTime":"2026-02-19T21:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.298729 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.298798 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.298815 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.298837 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.298853 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:54Z","lastTransitionTime":"2026-02-19T21:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.401775 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.402044 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.402109 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.402205 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.402284 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:54Z","lastTransitionTime":"2026-02-19T21:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.497160 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 01:27:12.140175061 +0000 UTC Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.505283 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.505320 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.505336 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.505358 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.505374 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:54Z","lastTransitionTime":"2026-02-19T21:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.510794 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.510917 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:28:54 crc kubenswrapper[4795]: E0219 21:28:54.511084 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.510794 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:28:54 crc kubenswrapper[4795]: E0219 21:28:54.511299 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:28:54 crc kubenswrapper[4795]: E0219 21:28:54.510944 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.607572 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.607605 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.607617 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.607632 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.607643 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:54Z","lastTransitionTime":"2026-02-19T21:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.710564 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.710630 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.710656 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.710684 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.710707 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:54Z","lastTransitionTime":"2026-02-19T21:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.814048 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.814094 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.814109 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.814128 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.814144 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:54Z","lastTransitionTime":"2026-02-19T21:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.917686 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.917748 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.917765 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.917795 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:54 crc kubenswrapper[4795]: I0219 21:28:54.917813 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:54Z","lastTransitionTime":"2026-02-19T21:28:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.021129 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.021211 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.021230 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.021252 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.021275 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:55Z","lastTransitionTime":"2026-02-19T21:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.056274 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.056315 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.056330 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.056349 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.056363 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:55Z","lastTransitionTime":"2026-02-19T21:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:55 crc kubenswrapper[4795]: E0219 21:28:55.078441 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0a3e1a0-f657-4990-877d-cc9b59bcb3d8\\\",\\\"systemUUID\\\":\\\"5c733625-a853-45dd-88a0-4f8c78e571ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:55Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.082699 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.082737 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.082749 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.082766 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.082778 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:55Z","lastTransitionTime":"2026-02-19T21:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.086900 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b1b4346-e02e-4614-b2ff-e4628046a92f-metrics-certs\") pod \"network-metrics-daemon-ff4bs\" (UID: \"1b1b4346-e02e-4614-b2ff-e4628046a92f\") " pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:28:55 crc kubenswrapper[4795]: E0219 21:28:55.087065 4795 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 21:28:55 crc kubenswrapper[4795]: E0219 21:28:55.087217 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b1b4346-e02e-4614-b2ff-e4628046a92f-metrics-certs podName:1b1b4346-e02e-4614-b2ff-e4628046a92f nodeName:}" failed. No retries permitted until 2026-02-19 21:29:03.087139168 +0000 UTC m=+54.279657072 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1b1b4346-e02e-4614-b2ff-e4628046a92f-metrics-certs") pod "network-metrics-daemon-ff4bs" (UID: "1b1b4346-e02e-4614-b2ff-e4628046a92f") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 21:28:55 crc kubenswrapper[4795]: E0219 21:28:55.103257 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb4
9c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\"
:[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d4
6c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\
\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0a3e1a0-f657-4990-877d-c
c9b59bcb3d8\\\",\\\"systemUUID\\\":\\\"5c733625-a853-45dd-88a0-4f8c78e571ae\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:55Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.107607 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.107687 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.107699 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.107715 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.107726 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:55Z","lastTransitionTime":"2026-02-19T21:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:55 crc kubenswrapper[4795]: E0219 21:28:55.123605 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0a3e1a0-f657-4990-877d-cc9b59bcb3d8\\\",\\\"systemUUID\\\":\\\"5c733625-a853-45dd-88a0-4f8c78e571ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:55Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.127126 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.127208 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.127234 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.127257 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.127275 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:55Z","lastTransitionTime":"2026-02-19T21:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:55 crc kubenswrapper[4795]: E0219 21:28:55.141400 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0a3e1a0-f657-4990-877d-cc9b59bcb3d8\\\",\\\"systemUUID\\\":\\\"5c733625-a853-45dd-88a0-4f8c78e571ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:55Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.145032 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.145072 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.145088 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.145106 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.145120 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:55Z","lastTransitionTime":"2026-02-19T21:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:55 crc kubenswrapper[4795]: E0219 21:28:55.158939 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:28:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0a3e1a0-f657-4990-877d-cc9b59bcb3d8\\\",\\\"systemUUID\\\":\\\"5c733625-a853-45dd-88a0-4f8c78e571ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:55Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:55 crc kubenswrapper[4795]: E0219 21:28:55.159107 4795 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.161214 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.161390 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.161483 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.161587 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.161670 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:55Z","lastTransitionTime":"2026-02-19T21:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.264113 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.264154 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.264175 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.264192 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.264203 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:55Z","lastTransitionTime":"2026-02-19T21:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.367744 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.367793 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.367804 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.367825 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.367837 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:55Z","lastTransitionTime":"2026-02-19T21:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.470484 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.470524 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.470535 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.470552 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.470565 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:55Z","lastTransitionTime":"2026-02-19T21:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.498308 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 07:42:28.506358376 +0000 UTC Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.511690 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:28:55 crc kubenswrapper[4795]: E0219 21:28:55.511842 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.573620 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.573681 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.573700 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.573731 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.573749 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:55Z","lastTransitionTime":"2026-02-19T21:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.676757 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.676792 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.676800 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.676813 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.676822 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:55Z","lastTransitionTime":"2026-02-19T21:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.780076 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.780114 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.780126 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.780143 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.780156 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:55Z","lastTransitionTime":"2026-02-19T21:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.883247 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.883307 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.883331 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.883359 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.883381 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:55Z","lastTransitionTime":"2026-02-19T21:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.985580 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.985650 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.985671 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.985695 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:55 crc kubenswrapper[4795]: I0219 21:28:55.985713 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:55Z","lastTransitionTime":"2026-02-19T21:28:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.088724 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.088796 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.088825 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.088855 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.088876 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:56Z","lastTransitionTime":"2026-02-19T21:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.191622 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.191695 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.191732 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.191767 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.191788 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:56Z","lastTransitionTime":"2026-02-19T21:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.294319 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.294397 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.294420 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.294450 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.294472 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:56Z","lastTransitionTime":"2026-02-19T21:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.397870 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.397950 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.397973 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.398053 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.398461 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:56Z","lastTransitionTime":"2026-02-19T21:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.498601 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 20:01:50.853028521 +0000 UTC Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.501327 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.501398 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.501423 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.501453 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.501474 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:56Z","lastTransitionTime":"2026-02-19T21:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.511273 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.511305 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.511305 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:28:56 crc kubenswrapper[4795]: E0219 21:28:56.511466 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:28:56 crc kubenswrapper[4795]: E0219 21:28:56.511638 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:28:56 crc kubenswrapper[4795]: E0219 21:28:56.511773 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.603691 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.603753 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.603777 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.603807 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.603829 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:56Z","lastTransitionTime":"2026-02-19T21:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.707458 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.707517 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.707533 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.707558 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.707601 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:56Z","lastTransitionTime":"2026-02-19T21:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.810426 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.810479 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.810495 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.810518 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.810538 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:56Z","lastTransitionTime":"2026-02-19T21:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.913599 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.913664 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.913688 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.913716 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:56 crc kubenswrapper[4795]: I0219 21:28:56.913842 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:56Z","lastTransitionTime":"2026-02-19T21:28:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.021362 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.021423 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.021433 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.021446 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.021455 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:57Z","lastTransitionTime":"2026-02-19T21:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.125065 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.125225 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.125252 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.125282 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.125717 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:57Z","lastTransitionTime":"2026-02-19T21:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.228594 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.228664 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.228688 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.228720 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.228742 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:57Z","lastTransitionTime":"2026-02-19T21:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.332540 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.332591 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.332607 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.332628 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.332644 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:57Z","lastTransitionTime":"2026-02-19T21:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.436120 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.436210 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.436229 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.436255 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.436300 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:57Z","lastTransitionTime":"2026-02-19T21:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.499401 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 13:33:57.459071703 +0000 UTC Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.510610 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:28:57 crc kubenswrapper[4795]: E0219 21:28:57.510835 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.539356 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.539424 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.539444 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.539473 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.539491 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:57Z","lastTransitionTime":"2026-02-19T21:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.642198 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.642255 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.642272 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.642296 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.642317 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:57Z","lastTransitionTime":"2026-02-19T21:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.745712 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.745842 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.745879 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.745898 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.745911 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:57Z","lastTransitionTime":"2026-02-19T21:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.848815 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.848871 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.848889 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.848918 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.848935 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:57Z","lastTransitionTime":"2026-02-19T21:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.952280 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.952349 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.952370 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.952400 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:57 crc kubenswrapper[4795]: I0219 21:28:57.952422 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:57Z","lastTransitionTime":"2026-02-19T21:28:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.055289 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.055335 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.055373 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.055392 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.055403 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:58Z","lastTransitionTime":"2026-02-19T21:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.158917 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.158996 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.159012 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.159065 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.159082 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:58Z","lastTransitionTime":"2026-02-19T21:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.262513 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.262582 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.262598 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.262620 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.262634 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:58Z","lastTransitionTime":"2026-02-19T21:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.365326 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.365381 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.365397 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.365420 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.365438 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:58Z","lastTransitionTime":"2026-02-19T21:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.468412 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.468483 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.468500 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.468529 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.468551 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:58Z","lastTransitionTime":"2026-02-19T21:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.500009 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 00:32:32.094501688 +0000 UTC Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.511395 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.511490 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:28:58 crc kubenswrapper[4795]: E0219 21:28:58.511613 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:28:58 crc kubenswrapper[4795]: E0219 21:28:58.512252 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.512779 4795 scope.go:117] "RemoveContainer" containerID="175d9b8b72214e8c405a22f93d7de35ce4d1c1c7ae10ba8f56f2245d20c54517" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.513333 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:28:58 crc kubenswrapper[4795]: E0219 21:28:58.513488 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.572519 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.572785 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.572796 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.572843 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.572856 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:58Z","lastTransitionTime":"2026-02-19T21:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.675560 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.675601 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.675611 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.675627 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.675639 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:58Z","lastTransitionTime":"2026-02-19T21:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.755405 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4qphl_adf5bd36-b46b-4a06-8291-cae9f3988330/ovnkube-controller/1.log" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.760139 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" event={"ID":"adf5bd36-b46b-4a06-8291-cae9f3988330","Type":"ContainerStarted","Data":"b3f7e5606fe334a1c2d8901263d401f8fa2a441992646852386a77967eaa84a0"} Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.760796 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.775538 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:58Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.778687 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.778740 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.778758 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:58 crc 
kubenswrapper[4795]: I0219 21:28:58.778780 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.778798 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:58Z","lastTransitionTime":"2026-02-19T21:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.789698 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:58Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.812647 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf5bd36-b46b-4a06-8291-cae9f3988330\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3f7e5606fe334a1c2d8901263d401f8fa2a441992646852386a77967eaa84a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://175d9b8b72214e8c405a22f93d7de35ce4d1c1c7ae10ba8f56f2245d20c54517\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"message\\\":\\\"cal for Pod openshift-multus/multus-additional-cni-plugins-l29c7 in node crc\\\\nI0219 21:28:44.491789 6229 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-additional-cni-plugins-l29c7 after 0 failed attempt(s)\\\\nI0219 21:28:44.491795 6229 default_network_controller.go:776] Recording success event on pod 
openshift-multus/multus-additional-cni-plugins-l29c7\\\\nI0219 21:28:44.491778 6229 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-console/console]} name:Service_openshift-console/console_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.194:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d7d7b270-1480-47f8-bdf9-690dbab310cb}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0219 21:28:44.491057 6229 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} 
wa\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4qphl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:58Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.824689 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfpbz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67ec5415-424e-40b7-9beb-171cd1f3dbe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://230058a44868b325845b4045f4c763a9a9dbc95cfdf8fe94d86cee2799e9b1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmblv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce91447a4948ba71025642de9bcde76acb613
b1a110eab19b334f244114636f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmblv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nfpbz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:58Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.838075 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45f7fe05-a5f0-4296-80f5-9c8cd59afb95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ae61d10d79a82b8d2e11aee70d7bb13c429feaa4b7ce1a7746cadab1169d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8704324a3c5bb3f3023c0b000b233d650dede0825930494b987121784c094bdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b2ef22256f35fadefb61f59a542aa0df8bc5648285e2624b924391b80766c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad6763b33a3d85ce3446b4056d2703b9f342137b6a4aa23a3ae9870df8d0c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:58Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.851049 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:58Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.863746 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:58Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.877368 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\
":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"co
ntainerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initConta
inerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:58Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.881109 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.881309 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.881430 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:58 crc 
kubenswrapper[4795]: I0219 21:28:58.881547 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.881662 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:58Z","lastTransitionTime":"2026-02-19T21:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.890839 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:58Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.902549 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-blzsk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9643227-37ca-4e4a-b9bc-371b18d67edc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6b76680f2d3d45237abba3bfaeadf20719b560077c08f624ca2fbe4cc1af865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmtf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-blzsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:58Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.927406 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47780602108de299051f362241c1b15d1d233ab44173752ba23240ff2aad0af7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade50328702
63b86e7109a5844eae140cb7acdac094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l29c7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:58Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.944125 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5p6d9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e967392b-9bd8-4111-b1b9-96d503a19668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab48852eb20d566b09c7adda6c6afee64240486d91e3788538ed9b47fbcae59d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qx6hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5p6d9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:58Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.956377 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jvnv5" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://558f4c8ca1cd10a6c978aa69d9df061716040e6eb4b31c34bb3b69008bae2c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89q2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jvnv5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:58Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.972266 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ff4bs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b1b4346-e02e-4614-b2ff-e4628046a92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rw77f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rw77f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ff4bs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:58Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:58 crc 
kubenswrapper[4795]: I0219 21:28:58.983714 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.983765 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.983777 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.983796 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.983808 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:58Z","lastTransitionTime":"2026-02-19T21:28:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:58 crc kubenswrapper[4795]: I0219 21:28:58.991158 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"405c08bd-3feb-45df-993f-2cb5e737cb6a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9415ce5a9989f2b286fff8f33781b5f1c510b61ef5e7af7050ed7f1a322dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe281962bb5076796c2b27f81ffe63c9310bf49f5989714cbe2fc0fe5b6f32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://689f35bbe7bc51411e25a7a6e4bce91b45d819ed6697ed322c4d5b4b3a244451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2cdc3d1e391f142caa70f86039fc213117c5a49f6e39eb770867385d781eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cd220ea624daba9c66de834e5fb2b0dd067d489fa005bacca4280b010e2ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:58Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.001405 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7591bc58-96f5-486a-8653-0ad93938b019\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9964359f056111e41a0c304a0100fea26514cdc34f3e64f376a682913d6742f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bbc1f90994253aba517d1de97087dbd61793434
3fdb8389588863bf1305b72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxj5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:58Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.012062 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.086377 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.086607 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.086667 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.086736 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.086837 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:59Z","lastTransitionTime":"2026-02-19T21:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.189465 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.189497 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.189508 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.189523 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.189535 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:59Z","lastTransitionTime":"2026-02-19T21:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.291547 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.291858 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.291939 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.292012 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.292085 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:59Z","lastTransitionTime":"2026-02-19T21:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.394359 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.394418 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.394459 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.394483 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.394499 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:59Z","lastTransitionTime":"2026-02-19T21:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.496755 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.496819 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.496836 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.496862 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.496880 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:59Z","lastTransitionTime":"2026-02-19T21:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.501015 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 16:28:43.243495386 +0000 UTC Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.511578 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:28:59 crc kubenswrapper[4795]: E0219 21:28:59.511756 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.525234 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7591bc58-96f5-486a-8653-0ad93938b019\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9964359f056111e41a0c304a0100fea26514cdc34f3e64f376a682913d6742f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls
\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bbc1f90994253aba517d1de97087dbd617934343fdb8389588863bf1305b72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxj5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.539681 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.552572 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.566077 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.587841 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf5bd36-b46b-4a06-8291-cae9f3988330\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3f7e5606fe334a1c2d8901263d401f8fa2a441992646852386a77967eaa84a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://175d9b8b72214e8c405a22f93d7de35ce4d1c1c7ae10ba8f56f2245d20c54517\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"message\\\":\\\"cal for Pod openshift-multus/multus-additional-cni-plugins-l29c7 in node crc\\\\nI0219 21:28:44.491789 6229 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-additional-cni-plugins-l29c7 after 0 failed attempt(s)\\\\nI0219 21:28:44.491795 6229 default_network_controller.go:776] Recording success event on pod 
openshift-multus/multus-additional-cni-plugins-l29c7\\\\nI0219 21:28:44.491778 6229 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-console/console]} name:Service_openshift-console/console_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.194:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d7d7b270-1480-47f8-bdf9-690dbab310cb}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0219 21:28:44.491057 6229 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} 
wa\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4qphl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.599248 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.599459 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.599658 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.599857 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.600035 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:59Z","lastTransitionTime":"2026-02-19T21:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.602338 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfpbz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67ec5415-424e-40b7-9beb-171cd1f3dbe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://230058a44868b325845b4045f4c763a9a9dbc95cfdf8fe94d86cee2799e9b1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmblv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce91447a4948ba71025642de9bcde76acb613b1a110eab19b334f244114636f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmblv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nfpbz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.615977 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45f7fe05-a5f0-4296-80f5-9c8cd59afb95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ae61d10d79a82b8d2e11aee70d7bb13c429feaa4b7ce1a7746cadab1169d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\
\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8704324a3c5bb3f3023c0b000b233d650dede0825930494b987121784c094bdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b2ef22256f35fadefb61f59a542aa0df8bc5648285e2624b924391b80766c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad6763b33a3d85ce3446b4056d2703b9f342137b6a4aa23a3ae9870df8d0c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578
bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.629986 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.642073 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.654570 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\
":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"co
ntainerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initConta
inerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.669357 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.693895 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-blzsk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9643227-37ca-4e4a-b9bc-371b18d67edc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6b76680f2d3d45237abba3bfaeadf20719b560077c08f624ca2fbe4cc1af865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmtf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-blzsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.703317 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.703371 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.703381 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.703397 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.703410 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:59Z","lastTransitionTime":"2026-02-19T21:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.727325 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47780602108de299051f362241c1b15d1d233ab44173752ba23240ff2aad0af7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l29c7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.742813 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5p6d9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e967392b-9bd8-4111-b1b9-96d503a19668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab48852eb20d566b09c7adda6c6afee64240486d91e3788538ed9b47fbcae59d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7
eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qx6hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\"
:\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5p6d9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.754461 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jvnv5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://558f4c8ca1cd10a6c978aa69d9df061716040e6eb4b31c34bb3b69008bae2c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89q2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jvnv5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.764583 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ff4bs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b1b4346-e02e-4614-b2ff-e4628046a92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rw77f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rw77f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ff4bs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:59 crc 
kubenswrapper[4795]: I0219 21:28:59.764699 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4qphl_adf5bd36-b46b-4a06-8291-cae9f3988330/ovnkube-controller/2.log" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.765340 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4qphl_adf5bd36-b46b-4a06-8291-cae9f3988330/ovnkube-controller/1.log" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.768850 4795 generic.go:334] "Generic (PLEG): container finished" podID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerID="b3f7e5606fe334a1c2d8901263d401f8fa2a441992646852386a77967eaa84a0" exitCode=1 Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.768933 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" event={"ID":"adf5bd36-b46b-4a06-8291-cae9f3988330","Type":"ContainerDied","Data":"b3f7e5606fe334a1c2d8901263d401f8fa2a441992646852386a77967eaa84a0"} Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.769103 4795 scope.go:117] "RemoveContainer" containerID="175d9b8b72214e8c405a22f93d7de35ce4d1c1c7ae10ba8f56f2245d20c54517" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.769403 4795 scope.go:117] "RemoveContainer" containerID="b3f7e5606fe334a1c2d8901263d401f8fa2a441992646852386a77967eaa84a0" Feb 19 21:28:59 crc kubenswrapper[4795]: E0219 21:28:59.769535 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-4qphl_openshift-ovn-kubernetes(adf5bd36-b46b-4a06-8291-cae9f3988330)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.785713 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"405c08bd-3feb-45df-993f-2cb5e737cb6a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9415ce5a9989f2b286fff8f33781b5f1c510b61ef5e7af7050ed7f1a322dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe281962bb5076796c2b27f81ffe63c9310bf49f5989714cbe2fc0fe5b6f32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://689f35bbe7bc51411e25a7a6e4bce91b45d819ed6697ed322c4d5b4b3a244451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2cdc3d1e391f142caa70f86039fc213117c5a49f6e39eb770867385d781eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cd220ea624daba9c66de834e5fb2b0dd067d489fa005bacca4280b010e2ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"exitCode\\\":0,\\
\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",
\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.796379 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\
"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.805406 
4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7591bc58-96f5-486a-8653-0ad93938b019\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9964359f056111e41a0c304a0100fea26514cdc34f3e64f376a682913d6742f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-
access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bbc1f90994253aba517d1de97087dbd617934343fdb8389588863bf1305b72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxj5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.805700 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.805725 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:59 crc kubenswrapper[4795]: 
I0219 21:28:59.805734 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.805749 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.805777 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:59Z","lastTransitionTime":"2026-02-19T21:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.821154 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf5bd36-b46b-4a06-8291-cae9f3988330\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3f7e5606fe334a1c2d8901263d401f8fa2a441992646852386a77967eaa84a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://175d9b8b72214e8c405a22f93d7de35ce4d1c1c7ae10ba8f56f2245d20c54517\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"message\\\":\\\"cal for Pod openshift-multus/multus-additional-cni-plugins-l29c7 in node crc\\\\nI0219 
21:28:44.491789 6229 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-additional-cni-plugins-l29c7 after 0 failed attempt(s)\\\\nI0219 21:28:44.491795 6229 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-additional-cni-plugins-l29c7\\\\nI0219 21:28:44.491778 6229 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-console/console]} name:Service_openshift-console/console_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.194:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d7d7b270-1480-47f8-bdf9-690dbab310cb}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0219 21:28:44.491057 6229 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} wa\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3f7e5606fe334a1c2d8901263d401f8fa2a441992646852386a77967eaa84a0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T21:28:59Z\\\",\\\"message\\\":\\\"oup\\\\\\\"}}}\\\\nI0219 21:28:59.338551 6443 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI0219 21:28:59.338560 6443 services_controller.go:452] Built service 
openshift-config-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0219 21:28:59.338574 6443 services_controller.go:453] Built service openshift-config-operator/metrics template LB for network=default: []services.LB{}\\\\nF0219 21:28:59.338577 6443 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z]\\\\nI0219 21:28:59.338588 6443 services_controller.go:454] Service 
open\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\
"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7
c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4qphl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.829993 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfpbz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67ec5415-424e-40b7-9beb-171cd1f3dbe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://230058a44868b325845b4045f4c763a9a9dbc95cfdf8fe94d86cee2799e9b1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmblv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce91447a4948ba71025642de9bcde76acb613
b1a110eab19b334f244114636f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmblv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nfpbz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.840122 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.851744 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.863815 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e
901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.878689 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45f7fe05-a5f0-4296-80f5-9c8cd59afb95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ae61d10d79a82b8d2e11aee70d7bb13c429feaa4b7ce1a7746cadab1169d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8704324a3c5bb3f3023c0b000b233d650dede0825930494b987121784c094bdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b2ef22256f35fadefb61f59a542aa0df8bc5648285e2624b924391b80766c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad6763b33a3d85ce3446b4056d2703b9f342137b6a4aa23a3ae9870df8d0c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.891129 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.900365 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.911908 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.911958 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.911969 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.911984 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.911995 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:28:59Z","lastTransitionTime":"2026-02-19T21:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.916958 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5p6d9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e967392b-9bd8-4111-b1b9-96d503a19668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab48852eb20d566b09c7adda6c6afee64240486d91e3788538ed9b47fbcae59d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qx6hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5p6d9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z 
is after 2025-08-24T17:21:41Z" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.928109 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jvnv5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://558f4c8ca1cd10a6c978aa69d9df061716040e6eb4b31c34bb3b69008bae2c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89q2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jvnv5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.937522 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ff4bs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b1b4346-e02e-4614-b2ff-e4628046a92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rw77f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rw77f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ff4bs\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.953711 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"405c08bd-3feb-45df-993f-2cb5e737cb6a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9415ce5a9989f2b286fff8f33781b5f1c510b61ef5e7af7050ed7f1a322dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\
":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe281962bb5076796c2b27f81ffe63c9310bf49f5989714cbe2fc0fe5b6f32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://689f35bbe7bc51411e25a7a6e4bce91b45d819ed6697ed322c4d5b4b3a244451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2cdc3d1e391f142caa70f86039fc213
117c5a49f6e39eb770867385d781eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cd220ea624daba9c66de834e5fb2b0dd067d489fa005bacca4280b010e2ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e
49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.963606 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.973243 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-blzsk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9643227-37ca-4e4a-b9bc-371b18d67edc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6b76680f2d3d45237abba3bfaeadf20719b560077c08f624ca2fbe4cc1af865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmtf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-blzsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z" Feb 19 21:28:59 crc kubenswrapper[4795]: I0219 21:28:59.988585 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47780602108de299051f362241c1b15d1d233ab44173752ba23240ff2aad0af7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade50328702
63b86e7109a5844eae140cb7acdac094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l29c7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.014568 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.014610 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.014619 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.014633 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.014643 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:00Z","lastTransitionTime":"2026-02-19T21:29:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.116657 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.116694 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.116703 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.116717 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.116727 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:00Z","lastTransitionTime":"2026-02-19T21:29:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.219155 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.219208 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.219218 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.219235 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.219249 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:00Z","lastTransitionTime":"2026-02-19T21:29:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.320999 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.321043 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.321056 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.321072 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.321083 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:00Z","lastTransitionTime":"2026-02-19T21:29:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.345331 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:29:00 crc kubenswrapper[4795]: E0219 21:29:00.345545 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-19 21:29:32.345520516 +0000 UTC m=+83.538038390 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.423775 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.423816 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.423825 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.423840 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.423850 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:00Z","lastTransitionTime":"2026-02-19T21:29:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.446051 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.446139 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:29:00 crc kubenswrapper[4795]: E0219 21:29:00.446234 4795 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 21:29:00 crc kubenswrapper[4795]: E0219 21:29:00.446312 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 21:29:32.446293354 +0000 UTC m=+83.638811208 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.446306 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.446372 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:29:00 crc kubenswrapper[4795]: E0219 21:29:00.446487 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 21:29:00 crc kubenswrapper[4795]: E0219 21:29:00.446536 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 21:29:00 crc kubenswrapper[4795]: E0219 21:29:00.446556 4795 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:29:00 crc kubenswrapper[4795]: E0219 21:29:00.446484 4795 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 21:29:00 crc kubenswrapper[4795]: E0219 21:29:00.446569 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 21:29:00 crc kubenswrapper[4795]: E0219 21:29:00.446614 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 21:29:00 crc kubenswrapper[4795]: E0219 21:29:00.446629 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-19 21:29:32.446605593 +0000 UTC m=+83.639123527 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:29:00 crc kubenswrapper[4795]: E0219 21:29:00.446638 4795 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:29:00 crc kubenswrapper[4795]: E0219 21:29:00.446655 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 21:29:32.446644834 +0000 UTC m=+83.639162818 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 21:29:00 crc kubenswrapper[4795]: E0219 21:29:00.446727 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-19 21:29:32.446700966 +0000 UTC m=+83.639218870 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.501365 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 16:44:03.167294156 +0000 UTC Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.511040 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.511040 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:29:00 crc kubenswrapper[4795]: E0219 21:29:00.511260 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.511062 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:29:00 crc kubenswrapper[4795]: E0219 21:29:00.511336 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:29:00 crc kubenswrapper[4795]: E0219 21:29:00.511368 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.526706 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.526771 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.526783 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.526801 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.526813 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:00Z","lastTransitionTime":"2026-02-19T21:29:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.629711 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.629756 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.629774 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.629796 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.629813 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:00Z","lastTransitionTime":"2026-02-19T21:29:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.630133 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.643549 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.646693 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\
"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:00Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.668071 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89
c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"stat
e\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for 
mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:00Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.690540 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45f7fe05-a5f0-4296-80f5-9c8cd59afb95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ae61d10d79a82b8d2e11aee70d7bb13c429feaa4b7ce1a7746cadab1169d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad
6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8704324a3c5bb3f3023c0b000b233d650dede0825930494b987121784c094bdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b2ef22256f35fadefb61f59a542aa0df8bc5648285e2624b924391b80766c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,
\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad6763b33a3d85ce3446b4056d2703b9f342137b6a4aa23a3ae9870df8d0c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:00Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.715687 4795 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:00Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.732825 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.732883 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.732905 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.732935 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.732958 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:00Z","lastTransitionTime":"2026-02-19T21:29:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.742252 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47780602108de299051f362241c1b15d1d233ab44173752ba23240ff2aad0af7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l29c7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:00Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.764771 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5p6d9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e967392b-9bd8-4111-b1b9-96d503a19668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab48852eb20d566b09c7adda6c6afee64240486d91e3788538ed9b47fbcae59d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7
eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qx6hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\"
:\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5p6d9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:00Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.778512 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4qphl_adf5bd36-b46b-4a06-8291-cae9f3988330/ovnkube-controller/2.log" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.784563 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jvnv5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://558f4c8ca1cd10a6c978aa69d9df061716040e6eb4b31c34b
b3b69008bae2c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89q2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jvnv5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:00Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.786360 4795 scope.go:117] "RemoveContainer" containerID="b3f7e5606fe334a1c2d8901263d401f8fa2a441992646852386a77967eaa84a0" Feb 19 21:29:00 crc kubenswrapper[4795]: E0219 21:29:00.786671 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4qphl_openshift-ovn-kubernetes(adf5bd36-b46b-4a06-8291-cae9f3988330)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.802779 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ff4bs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b1b4346-e02e-4614-b2ff-e4628046a92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rw77f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rw77f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ff4bs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:00Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:00 crc 
kubenswrapper[4795]: I0219 21:29:00.834282 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"405c08bd-3feb-45df-993f-2cb5e737cb6a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9415ce5a9989f2b286fff8f33781b5f1c510b61ef5e7af7050ed7f1a322dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://8fe281962bb5076796c2b27f81ffe63c9310bf49f5989714cbe2fc0fe5b6f32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://689f35bbe7bc51411e25a7a6e4bce91b45d819ed6697ed322c4d5b4b3a244451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2cdc3d1e391f142caa70f86039fc213117c5a49f6e39eb770867385d781eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cd220ea624daba9c66de834e5fb2b0dd067d489fa005bacca4280b010e2ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:00Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.836298 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.836345 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.836359 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.836379 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.836393 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:00Z","lastTransitionTime":"2026-02-19T21:29:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.855351 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:00Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.871764 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-blzsk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9643227-37ca-4e4a-b9bc-371b18d67edc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6b76680f2d3d45237abba3bfaeadf20719b560077c08f624ca2fbe4cc1af865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmtf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-blzsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:00Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.890797 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:00Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.904606 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7591bc58-96f5-486a-8653-0ad93938b019\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9964359f056111e41a0c304a0100fea26514cdc34f3e64f376a682913d6742f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bbc1f90994253aba517d1de97087dbd61793434
3fdb8389588863bf1305b72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxj5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:00Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.919805 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:00Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.936778 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf5bd36-b46b-4a06-8291-cae9f3988330\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3f7e5606fe334a1c2d8901263d401f8fa2a441992646852386a77967eaa84a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://175d9b8b72214e8c405a22f93d7de35ce4d1c1c7ae10ba8f56f2245d20c54517\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T21:28:44Z\\\",\\\"message\\\":\\\"cal for Pod openshift-multus/multus-additional-cni-plugins-l29c7 in node crc\\\\nI0219 21:28:44.491789 6229 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-additional-cni-plugins-l29c7 after 0 failed attempt(s)\\\\nI0219 21:28:44.491795 6229 default_network_controller.go:776] Recording success event on pod 
openshift-multus/multus-additional-cni-plugins-l29c7\\\\nI0219 21:28:44.491778 6229 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-console/console]} name:Service_openshift-console/console_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.194:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d7d7b270-1480-47f8-bdf9-690dbab310cb}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0219 21:28:44.491057 6229 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} wa\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3f7e5606fe334a1c2d8901263d401f8fa2a441992646852386a77967eaa84a0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T21:28:59Z\\\",\\\"message\\\":\\\"oup\\\\\\\"}}}\\\\nI0219 21:28:59.338551 6443 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI0219 21:28:59.338560 6443 services_controller.go:452] Built service openshift-config-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0219 21:28:59.338574 6443 services_controller.go:453] Built service openshift-config-operator/metrics template LB for network=default: []services.LB{}\\\\nF0219 21:28:59.338577 6443 
ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z]\\\\nI0219 21:28:59.338588 6443 services_controller.go:454] Service open\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\
":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4qphl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:00Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.937866 4795 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.937893 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.937903 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.937918 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.937930 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:00Z","lastTransitionTime":"2026-02-19T21:29:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.947081 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfpbz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67ec5415-424e-40b7-9beb-171cd1f3dbe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://230058a44868b325845b4045f4c763a9a9dbc95cfdf8fe94d86cee2799e9b1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmblv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce91447a4948ba71025642de9bcde76acb613b1a110eab19b334f244114636f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmblv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nfpbz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:00Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.956150 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:00Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.965639 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:00Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.976825 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-blzsk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9643227-37ca-4e4a-b9bc-371b18d67edc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6b76680f2d3d45237abba3bfaeadf20719b560077c08f624ca2fbe4cc1af865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmtf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-blzsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:00Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:00 crc kubenswrapper[4795]: I0219 21:29:00.989738 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47780602108de299051f362241c1b15d1d233ab44173752ba23240ff2aad0af7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade50328702
63b86e7109a5844eae140cb7acdac094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l29c7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:00Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.001475 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5p6d9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e967392b-9bd8-4111-b1b9-96d503a19668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab48852eb20d566b09c7adda6c6afee64240486d91e3788538ed9b47fbcae59d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qx6hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5p6d9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:01Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.008883 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jvnv5" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://558f4c8ca1cd10a6c978aa69d9df061716040e6eb4b31c34bb3b69008bae2c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89q2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jvnv5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:01Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.017531 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ff4bs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b1b4346-e02e-4614-b2ff-e4628046a92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rw77f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rw77f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ff4bs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:01Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:01 crc 
kubenswrapper[4795]: I0219 21:29:01.032747 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"405c08bd-3feb-45df-993f-2cb5e737cb6a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9415ce5a9989f2b286fff8f33781b5f1c510b61ef5e7af7050ed7f1a322dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://8fe281962bb5076796c2b27f81ffe63c9310bf49f5989714cbe2fc0fe5b6f32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://689f35bbe7bc51411e25a7a6e4bce91b45d819ed6697ed322c4d5b4b3a244451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2cdc3d1e391f142caa70f86039fc213117c5a49f6e39eb770867385d781eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cd220ea624daba9c66de834e5fb2b0dd067d489fa005bacca4280b010e2ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:01Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.040276 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.040419 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.040496 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.040590 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.040669 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:01Z","lastTransitionTime":"2026-02-19T21:29:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.042357 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7591bc58-96f5-486a-8653-0ad93938b019\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9964359f056111e41a0c304a0100fea26514cdc34f3e64f376a682913d6742f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bbc1f90994253aba517d1de97087dbd617934343fdb8389588863bf1305b72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxj5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:01Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.051983 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:01Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.060944 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"538a8ea5-cec4-4706-8187-425879d18d4e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7547886d43376602238ac541e375478233fda6564afeb045cb84192812dd7136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd41d9aec7be2e2d5e5ec8b06259364f0b7a1bf4fadbd8fe8abb8ff625cd6072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd942d9f97c8b019171b7d06c9f4a4b85decbf641bd03a76701b7d27520db62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c85f029538174c7a4516d507fc84620e018838ff13e0a950cb25e184da8cd51f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://c85f029538174c7a4516d507fc84620e018838ff13e0a950cb25e184da8cd51f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:01Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.070028 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:01Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.079411 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:01Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.093552 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf5bd36-b46b-4a06-8291-cae9f3988330\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3f7e5606fe334a1c2d8901263d401f8fa2a441992646852386a77967eaa84a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3f7e5606fe334a1c2d8901263d401f8fa2a441992646852386a77967eaa84a0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T21:28:59Z\\\",\\\"message\\\":\\\"oup\\\\\\\"}}}\\\\nI0219 21:28:59.338551 6443 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI0219 21:28:59.338560 6443 services_controller.go:452] Built service openshift-config-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0219 21:28:59.338574 6443 
services_controller.go:453] Built service openshift-config-operator/metrics template LB for network=default: []services.LB{}\\\\nF0219 21:28:59.338577 6443 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z]\\\\nI0219 21:28:59.338588 6443 services_controller.go:454] Service open\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4qphl_openshift-ovn-kubernetes(adf5bd36-b46b-4a06-8291-cae9f3988330)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdba0068125036886d
c7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4qphl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:01Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.102762 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfpbz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67ec5415-424e-40b7-9beb-171cd1f3dbe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://230058a44868b325845b4045f4c763a9a9dbc95cfdf8fe94d86cee2799e9b1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmblv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce91447a4948ba71025642de9bcde76acb613
b1a110eab19b334f244114636f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmblv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nfpbz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:01Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.112523 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45f7fe05-a5f0-4296-80f5-9c8cd59afb95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ae61d10d79a82b8d2e11aee70d7bb13c429feaa4b7ce1a7746cadab1169d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8704324a3c5bb3f3023c0b000b233d650dede0825930494b987121784c094bdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b2ef22256f35fadefb61f59a542aa0df8bc5648285e2624b924391b80766c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad6763b33a3d85ce3446b4056d2703b9f342137b6a4aa23a3ae9870df8d0c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:01Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.122692 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:01Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.131483 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:01Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.143062 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.143236 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.143304 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.143404 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.143785 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:01Z","lastTransitionTime":"2026-02-19T21:29:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.143701 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:01Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.246519 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.246764 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.246778 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.246791 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.246801 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:01Z","lastTransitionTime":"2026-02-19T21:29:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.349602 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.349645 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.349659 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.349676 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.349687 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:01Z","lastTransitionTime":"2026-02-19T21:29:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.452268 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.452628 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.453002 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.453378 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.453591 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:01Z","lastTransitionTime":"2026-02-19T21:29:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.502445 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 00:04:47.283024595 +0000 UTC Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.511027 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:29:01 crc kubenswrapper[4795]: E0219 21:29:01.511149 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.555456 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.555690 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.555752 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.555825 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.555893 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:01Z","lastTransitionTime":"2026-02-19T21:29:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.659613 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.659662 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.659677 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.659697 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.659712 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:01Z","lastTransitionTime":"2026-02-19T21:29:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.762565 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.762614 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.762630 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.762649 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.762663 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:01Z","lastTransitionTime":"2026-02-19T21:29:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.865444 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.865505 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.865523 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.865547 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.865563 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:01Z","lastTransitionTime":"2026-02-19T21:29:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.968155 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.968246 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.968270 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.968300 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:01 crc kubenswrapper[4795]: I0219 21:29:01.968321 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:01Z","lastTransitionTime":"2026-02-19T21:29:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.070453 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.070518 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.070540 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.070568 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.070591 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:02Z","lastTransitionTime":"2026-02-19T21:29:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.173860 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.173917 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.173936 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.173960 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.173978 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:02Z","lastTransitionTime":"2026-02-19T21:29:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.277025 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.277066 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.277078 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.277094 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.277107 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:02Z","lastTransitionTime":"2026-02-19T21:29:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.381133 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.381246 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.381275 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.381307 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.381330 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:02Z","lastTransitionTime":"2026-02-19T21:29:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.484263 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.484306 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.484317 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.484336 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.484349 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:02Z","lastTransitionTime":"2026-02-19T21:29:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.502921 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 15:44:37.145204642 +0000 UTC Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.511300 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:29:02 crc kubenswrapper[4795]: E0219 21:29:02.511880 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.511681 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:29:02 crc kubenswrapper[4795]: E0219 21:29:02.512060 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.511302 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:29:02 crc kubenswrapper[4795]: E0219 21:29:02.512216 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.586806 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.586841 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.586857 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.586875 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.586887 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:02Z","lastTransitionTime":"2026-02-19T21:29:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.689987 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.690043 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.690061 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.690084 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.690101 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:02Z","lastTransitionTime":"2026-02-19T21:29:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.792002 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.792065 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.792085 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.792108 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.792126 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:02Z","lastTransitionTime":"2026-02-19T21:29:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.894995 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.895059 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.895081 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.895109 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.895131 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:02Z","lastTransitionTime":"2026-02-19T21:29:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.998416 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.998488 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.998511 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.998540 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:02 crc kubenswrapper[4795]: I0219 21:29:02.998583 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:02Z","lastTransitionTime":"2026-02-19T21:29:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.101812 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.101870 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.101888 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.101910 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.101928 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:03Z","lastTransitionTime":"2026-02-19T21:29:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.180766 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b1b4346-e02e-4614-b2ff-e4628046a92f-metrics-certs\") pod \"network-metrics-daemon-ff4bs\" (UID: \"1b1b4346-e02e-4614-b2ff-e4628046a92f\") " pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:29:03 crc kubenswrapper[4795]: E0219 21:29:03.181021 4795 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 21:29:03 crc kubenswrapper[4795]: E0219 21:29:03.181111 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b1b4346-e02e-4614-b2ff-e4628046a92f-metrics-certs podName:1b1b4346-e02e-4614-b2ff-e4628046a92f nodeName:}" failed. No retries permitted until 2026-02-19 21:29:19.181086317 +0000 UTC m=+70.373604221 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1b1b4346-e02e-4614-b2ff-e4628046a92f-metrics-certs") pod "network-metrics-daemon-ff4bs" (UID: "1b1b4346-e02e-4614-b2ff-e4628046a92f") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.205388 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.205438 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.205455 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.205478 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.205502 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:03Z","lastTransitionTime":"2026-02-19T21:29:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.308711 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.308774 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.308794 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.308819 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.308838 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:03Z","lastTransitionTime":"2026-02-19T21:29:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.411754 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.412111 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.412381 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.412591 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.412859 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:03Z","lastTransitionTime":"2026-02-19T21:29:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.504029 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 13:08:21.640712588 +0000 UTC Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.510979 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:29:03 crc kubenswrapper[4795]: E0219 21:29:03.511189 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.516131 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.516215 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.516236 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.516263 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.516284 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:03Z","lastTransitionTime":"2026-02-19T21:29:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.618714 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.618758 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.618773 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.618797 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.618813 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:03Z","lastTransitionTime":"2026-02-19T21:29:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.722144 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.722254 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.722283 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.722313 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.722336 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:03Z","lastTransitionTime":"2026-02-19T21:29:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.825102 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.825140 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.825148 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.825187 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.825198 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:03Z","lastTransitionTime":"2026-02-19T21:29:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.927767 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.928195 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.928215 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.928233 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:03 crc kubenswrapper[4795]: I0219 21:29:03.928246 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:03Z","lastTransitionTime":"2026-02-19T21:29:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.031077 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.031562 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.031581 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.031605 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.031621 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:04Z","lastTransitionTime":"2026-02-19T21:29:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.133096 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.133125 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.133133 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.133147 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.133157 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:04Z","lastTransitionTime":"2026-02-19T21:29:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.235863 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.235933 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.235956 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.235981 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.236000 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:04Z","lastTransitionTime":"2026-02-19T21:29:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.338768 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.339104 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.339208 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.339314 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.339391 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:04Z","lastTransitionTime":"2026-02-19T21:29:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.441620 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.441657 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.441665 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.441680 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.441689 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:04Z","lastTransitionTime":"2026-02-19T21:29:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.505438 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 05:54:58.080200067 +0000 UTC Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.510768 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:29:04 crc kubenswrapper[4795]: E0219 21:29:04.510943 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.511015 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.511127 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:29:04 crc kubenswrapper[4795]: E0219 21:29:04.511352 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:29:04 crc kubenswrapper[4795]: E0219 21:29:04.511517 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.544923 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.544966 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.544975 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.544990 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.545002 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:04Z","lastTransitionTime":"2026-02-19T21:29:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.648053 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.648099 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.648117 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.648140 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.648158 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:04Z","lastTransitionTime":"2026-02-19T21:29:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.750728 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.750767 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.750777 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.750791 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.750802 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:04Z","lastTransitionTime":"2026-02-19T21:29:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.852708 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.852774 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.852795 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.852821 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.852845 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:04Z","lastTransitionTime":"2026-02-19T21:29:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.957987 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.958024 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.958033 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.958047 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:04 crc kubenswrapper[4795]: I0219 21:29:04.958055 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:04Z","lastTransitionTime":"2026-02-19T21:29:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.060537 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.060579 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.060595 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.060620 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.060638 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:05Z","lastTransitionTime":"2026-02-19T21:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.162920 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.162978 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.163001 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.163028 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.163049 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:05Z","lastTransitionTime":"2026-02-19T21:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.265846 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.265882 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.265891 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.265912 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.265930 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:05Z","lastTransitionTime":"2026-02-19T21:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.288925 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.288991 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.289015 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.289046 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.289071 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:05Z","lastTransitionTime":"2026-02-19T21:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:05 crc kubenswrapper[4795]: E0219 21:29:05.309713 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0a3e1a0-f657-4990-877d-cc9b59bcb3d8\\\",\\\"systemUUID\\\":\\\"5c733625-a853-45dd-88a0-4f8c78e571ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:05Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.314696 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.314734 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.314743 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.314762 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.314780 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:05Z","lastTransitionTime":"2026-02-19T21:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:05 crc kubenswrapper[4795]: E0219 21:29:05.332104 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0a3e1a0-f657-4990-877d-cc9b59bcb3d8\\\",\\\"systemUUID\\\":\\\"5c733625-a853-45dd-88a0-4f8c78e571ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:05Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.336705 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.336756 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.336775 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.336803 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.336822 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:05Z","lastTransitionTime":"2026-02-19T21:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:05 crc kubenswrapper[4795]: E0219 21:29:05.353815 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0a3e1a0-f657-4990-877d-cc9b59bcb3d8\\\",\\\"systemUUID\\\":\\\"5c733625-a853-45dd-88a0-4f8c78e571ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:05Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.358771 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.358831 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.358850 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.358874 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.358895 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:05Z","lastTransitionTime":"2026-02-19T21:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:05 crc kubenswrapper[4795]: E0219 21:29:05.376505 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{ ...status patch payload identical to the first attempt above (conditions, allocatable/capacity, images, nodeInfo)... }\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:05Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.382514 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.382747 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.382779 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.382812 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.382834 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:05Z","lastTransitionTime":"2026-02-19T21:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:05 crc kubenswrapper[4795]: E0219 21:29:05.403020 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{ ...status patch payload identical to the first attempt above (conditions, allocatable/capacity, images, nodeInfo)... }\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:05Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:05 crc kubenswrapper[4795]: E0219 21:29:05.403396 4795 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.405654 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.405705 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.405725 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.405748 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.405764 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:05Z","lastTransitionTime":"2026-02-19T21:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.505657 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 09:15:58.84426875 +0000 UTC Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.507772 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.507802 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.507813 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.507872 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.507882 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:05Z","lastTransitionTime":"2026-02-19T21:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.511388 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:29:05 crc kubenswrapper[4795]: E0219 21:29:05.511512 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.611103 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.611139 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.611152 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.611183 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.611194 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:05Z","lastTransitionTime":"2026-02-19T21:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.713784 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.713827 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.713838 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.713855 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.713864 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:05Z","lastTransitionTime":"2026-02-19T21:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.816563 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.816618 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.816635 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.816660 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.816676 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:05Z","lastTransitionTime":"2026-02-19T21:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.920344 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.920403 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.920422 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.920446 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:05 crc kubenswrapper[4795]: I0219 21:29:05.920465 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:05Z","lastTransitionTime":"2026-02-19T21:29:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.023417 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.023480 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.023496 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.023517 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.023534 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:06Z","lastTransitionTime":"2026-02-19T21:29:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.127489 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.127592 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.127622 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.127655 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.127678 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:06Z","lastTransitionTime":"2026-02-19T21:29:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.230625 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.230681 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.230716 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.230744 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.230768 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:06Z","lastTransitionTime":"2026-02-19T21:29:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.333467 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.333520 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.333538 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.333563 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.333579 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:06Z","lastTransitionTime":"2026-02-19T21:29:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.435836 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.435885 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.435906 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.435925 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.435938 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:06Z","lastTransitionTime":"2026-02-19T21:29:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.506559 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 00:53:17.385416089 +0000 UTC Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.510902 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.510902 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:29:06 crc kubenswrapper[4795]: E0219 21:29:06.511057 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.511077 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:29:06 crc kubenswrapper[4795]: E0219 21:29:06.511148 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:29:06 crc kubenswrapper[4795]: E0219 21:29:06.511261 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.538712 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.538759 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.538783 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.538806 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.538821 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:06Z","lastTransitionTime":"2026-02-19T21:29:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.641827 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.641871 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.641882 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.641899 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.641911 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:06Z","lastTransitionTime":"2026-02-19T21:29:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.745347 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.745427 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.745439 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.745457 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.745469 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:06Z","lastTransitionTime":"2026-02-19T21:29:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.848194 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.848246 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.848258 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.848279 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.848291 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:06Z","lastTransitionTime":"2026-02-19T21:29:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.951635 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.951681 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.951696 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.951720 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:06 crc kubenswrapper[4795]: I0219 21:29:06.951735 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:06Z","lastTransitionTime":"2026-02-19T21:29:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.055405 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.055455 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.055471 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.055497 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.055516 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:07Z","lastTransitionTime":"2026-02-19T21:29:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.158525 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.158625 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.158701 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.158733 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.158750 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:07Z","lastTransitionTime":"2026-02-19T21:29:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.261663 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.261713 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.261731 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.261756 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.261774 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:07Z","lastTransitionTime":"2026-02-19T21:29:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.364137 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.364254 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.364279 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.364306 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.364323 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:07Z","lastTransitionTime":"2026-02-19T21:29:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.467264 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.467316 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.467335 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.467358 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.467377 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:07Z","lastTransitionTime":"2026-02-19T21:29:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.507144 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 17:00:12.980213872 +0000 UTC Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.511770 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:29:07 crc kubenswrapper[4795]: E0219 21:29:07.512012 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.571764 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.571831 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.571856 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.571894 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.571919 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:07Z","lastTransitionTime":"2026-02-19T21:29:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.673797 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.673827 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.673837 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.673850 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.673859 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:07Z","lastTransitionTime":"2026-02-19T21:29:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.775901 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.775934 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.775943 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.775956 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.775965 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:07Z","lastTransitionTime":"2026-02-19T21:29:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.878340 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.878377 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.878388 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.878405 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.878416 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:07Z","lastTransitionTime":"2026-02-19T21:29:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.981149 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.981231 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.981248 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.981272 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:07 crc kubenswrapper[4795]: I0219 21:29:07.981288 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:07Z","lastTransitionTime":"2026-02-19T21:29:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.084357 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.084776 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.084927 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.085085 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.085254 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:08Z","lastTransitionTime":"2026-02-19T21:29:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.188894 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.188957 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.188974 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.189038 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.189060 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:08Z","lastTransitionTime":"2026-02-19T21:29:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.293143 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.293264 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.293283 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.293306 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.293323 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:08Z","lastTransitionTime":"2026-02-19T21:29:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.396572 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.396634 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.396651 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.396674 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.396693 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:08Z","lastTransitionTime":"2026-02-19T21:29:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.499865 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.499929 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.499961 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.499990 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.500007 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:08Z","lastTransitionTime":"2026-02-19T21:29:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.508242 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 11:22:33.292224279 +0000 UTC Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.512130 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.512236 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:29:08 crc kubenswrapper[4795]: E0219 21:29:08.512501 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.512137 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:29:08 crc kubenswrapper[4795]: E0219 21:29:08.512584 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:29:08 crc kubenswrapper[4795]: E0219 21:29:08.512625 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.603273 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.603356 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.603369 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.603407 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.603419 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:08Z","lastTransitionTime":"2026-02-19T21:29:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.705773 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.705832 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.705849 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.705871 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.705888 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:08Z","lastTransitionTime":"2026-02-19T21:29:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.808533 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.808562 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.808570 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.808582 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.808590 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:08Z","lastTransitionTime":"2026-02-19T21:29:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.910992 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.911053 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.911079 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.911108 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:08 crc kubenswrapper[4795]: I0219 21:29:08.911129 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:08Z","lastTransitionTime":"2026-02-19T21:29:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.014012 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.014081 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.014107 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.014137 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.014584 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:09Z","lastTransitionTime":"2026-02-19T21:29:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.121867 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.122033 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.122054 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.122078 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.122094 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:09Z","lastTransitionTime":"2026-02-19T21:29:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.224583 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.224662 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.224679 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.224712 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.224731 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:09Z","lastTransitionTime":"2026-02-19T21:29:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.328099 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.328143 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.328161 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.328217 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.328235 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:09Z","lastTransitionTime":"2026-02-19T21:29:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.431150 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.431227 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.431247 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.431270 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.431284 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:09Z","lastTransitionTime":"2026-02-19T21:29:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.508598 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 09:42:45.041659164 +0000 UTC Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.510848 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:29:09 crc kubenswrapper[4795]: E0219 21:29:09.511108 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.528612 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:09Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.534775 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.534831 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.534850 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.534874 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.534892 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:09Z","lastTransitionTime":"2026-02-19T21:29:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.545025 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-blzsk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9643227-37ca-4e4a-b9bc-371b18d67edc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6b76680f2d3d45237abba3bfaeadf20719b560077c08f624ca2fbe4cc1af865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmtf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-blzsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:09Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.560979 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47
780602108de299051f362241c1b15d1d233ab44173752ba23240ff2aad0af7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203
bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-
cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l29c7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:09Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.577693 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5p6d9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e967392b-9bd8-4111-b1b9-96d503a19668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab48852eb20d566b09c7adda6c6afee64240486d91e3788538ed9b47fbcae59d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qx6hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5p6d9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:09Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.594024 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jvnv5" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://558f4c8ca1cd10a6c978aa69d9df061716040e6eb4b31c34bb3b69008bae2c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89q2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jvnv5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:09Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.604840 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ff4bs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b1b4346-e02e-4614-b2ff-e4628046a92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rw77f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rw77f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ff4bs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:09Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:09 crc 
kubenswrapper[4795]: I0219 21:29:09.621757 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"405c08bd-3feb-45df-993f-2cb5e737cb6a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9415ce5a9989f2b286fff8f33781b5f1c510b61ef5e7af7050ed7f1a322dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://8fe281962bb5076796c2b27f81ffe63c9310bf49f5989714cbe2fc0fe5b6f32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://689f35bbe7bc51411e25a7a6e4bce91b45d819ed6697ed322c4d5b4b3a244451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2cdc3d1e391f142caa70f86039fc213117c5a49f6e39eb770867385d781eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cd220ea624daba9c66de834e5fb2b0dd067d489fa005bacca4280b010e2ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:09Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.633375 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7591bc58-96f5-486a-8653-0ad93938b019\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9964359f056111e41a0c304a0100fea26514cdc34f3e64f376a682913d6742f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bbc1f90994253aba517d1de97087dbd61793434
3fdb8389588863bf1305b72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxj5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:09Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.642394 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.642422 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.642432 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:09 crc 
kubenswrapper[4795]: I0219 21:29:09.642447 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.642456 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:09Z","lastTransitionTime":"2026-02-19T21:29:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.646592 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:09Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 
21:29:09.663261 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"538a8ea5-cec4-4706-8187-425879d18d4e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7547886d43376602238ac541e375478233fda6564afeb045cb84192812dd7136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd41d9aec7be2e2d5e5ec8b06259364f0b7a1bf4fadbd8fe8abb8ff625cd6072\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd942d9f97c8b019171b7d06c9f4a4b85decbf641bd03a76701b7d27520db62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c85f029538174c7a4516d507fc84620e018838ff13e0a950cb25e184da8cd51f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\
"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85f029538174c7a4516d507fc84620e018838ff13e0a950cb25e184da8cd51f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:09Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.675441 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:09Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.686646 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:09Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.706387 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf5bd36-b46b-4a06-8291-cae9f3988330\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3f7e5606fe334a1c2d8901263d401f8fa2a441992646852386a77967eaa84a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3f7e5606fe334a1c2d8901263d401f8fa2a441992646852386a77967eaa84a0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T21:28:59Z\\\",\\\"message\\\":\\\"oup\\\\\\\"}}}\\\\nI0219 21:28:59.338551 6443 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI0219 21:28:59.338560 6443 services_controller.go:452] Built service openshift-config-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0219 21:28:59.338574 6443 
services_controller.go:453] Built service openshift-config-operator/metrics template LB for network=default: []services.LB{}\\\\nF0219 21:28:59.338577 6443 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z]\\\\nI0219 21:28:59.338588 6443 services_controller.go:454] Service open\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4qphl_openshift-ovn-kubernetes(adf5bd36-b46b-4a06-8291-cae9f3988330)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdba0068125036886d
c7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4qphl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:09Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.717049 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfpbz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67ec5415-424e-40b7-9beb-171cd1f3dbe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://230058a44868b325845b4045f4c763a9a9dbc95cfdf8fe94d86cee2799e9b1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmblv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce91447a4948ba71025642de9bcde76acb613
b1a110eab19b334f244114636f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmblv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nfpbz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:09Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.730740 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45f7fe05-a5f0-4296-80f5-9c8cd59afb95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ae61d10d79a82b8d2e11aee70d7bb13c429feaa4b7ce1a7746cadab1169d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8704324a3c5bb3f3023c0b000b233d650dede0825930494b987121784c094bdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b2ef22256f35fadefb61f59a542aa0df8bc5648285e2624b924391b80766c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad6763b33a3d85ce3446b4056d2703b9f342137b6a4aa23a3ae9870df8d0c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:09Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.744063 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:09Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.745722 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.745768 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.745780 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.745800 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.745819 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:09Z","lastTransitionTime":"2026-02-19T21:29:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.755828 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:09Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.769754 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\"
:{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://818124d6882
99297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{
\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:09Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.847900 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.847934 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.847944 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:09 crc kubenswrapper[4795]: 
I0219 21:29:09.847959 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.847969 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:09Z","lastTransitionTime":"2026-02-19T21:29:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.949934 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.949997 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.950019 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.950046 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:09 crc kubenswrapper[4795]: I0219 21:29:09.950068 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:09Z","lastTransitionTime":"2026-02-19T21:29:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.052671 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.052703 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.052711 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.052724 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.052733 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:10Z","lastTransitionTime":"2026-02-19T21:29:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.154713 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.154746 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.154756 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.154769 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.154777 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:10Z","lastTransitionTime":"2026-02-19T21:29:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.256900 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.256948 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.256960 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.256979 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.256991 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:10Z","lastTransitionTime":"2026-02-19T21:29:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.358728 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.358765 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.358774 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.358786 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.358795 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:10Z","lastTransitionTime":"2026-02-19T21:29:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.461045 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.461075 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.461083 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.461098 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.461107 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:10Z","lastTransitionTime":"2026-02-19T21:29:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.509983 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 22:42:33.973508553 +0000 UTC Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.511256 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.511284 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.511338 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:29:10 crc kubenswrapper[4795]: E0219 21:29:10.511419 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:29:10 crc kubenswrapper[4795]: E0219 21:29:10.511507 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:29:10 crc kubenswrapper[4795]: E0219 21:29:10.511599 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.564312 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.564358 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.564368 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.564382 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.564392 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:10Z","lastTransitionTime":"2026-02-19T21:29:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.667427 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.667484 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.667500 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.667526 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.667543 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:10Z","lastTransitionTime":"2026-02-19T21:29:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.770665 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.770715 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.770728 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.770747 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.770760 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:10Z","lastTransitionTime":"2026-02-19T21:29:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.873718 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.873781 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.873803 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.873838 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.873861 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:10Z","lastTransitionTime":"2026-02-19T21:29:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.976314 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.976361 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.976373 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.976393 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:10 crc kubenswrapper[4795]: I0219 21:29:10.976406 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:10Z","lastTransitionTime":"2026-02-19T21:29:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.081571 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.081619 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.081633 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.081681 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.081694 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:11Z","lastTransitionTime":"2026-02-19T21:29:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.186349 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.186393 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.186402 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.186419 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.186430 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:11Z","lastTransitionTime":"2026-02-19T21:29:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.289683 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.289755 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.289777 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.289806 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.289827 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:11Z","lastTransitionTime":"2026-02-19T21:29:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.392733 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.392795 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.392813 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.392838 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.392856 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:11Z","lastTransitionTime":"2026-02-19T21:29:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.495707 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.495773 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.495798 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.495834 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.495861 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:11Z","lastTransitionTime":"2026-02-19T21:29:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.510835 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-17 00:45:14.525645727 +0000 UTC Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.511089 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:29:11 crc kubenswrapper[4795]: E0219 21:29:11.511365 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.599187 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.599228 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.599240 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.599257 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.599268 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:11Z","lastTransitionTime":"2026-02-19T21:29:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.702604 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.702641 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.702652 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.702667 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.702678 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:11Z","lastTransitionTime":"2026-02-19T21:29:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.805000 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.805046 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.805064 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.805086 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.805103 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:11Z","lastTransitionTime":"2026-02-19T21:29:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.908347 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.908389 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.908404 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.908426 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:11 crc kubenswrapper[4795]: I0219 21:29:11.908442 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:11Z","lastTransitionTime":"2026-02-19T21:29:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.012282 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.012314 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.012325 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.012339 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.012349 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:12Z","lastTransitionTime":"2026-02-19T21:29:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.114700 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.114748 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.114764 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.114787 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.114804 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:12Z","lastTransitionTime":"2026-02-19T21:29:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.217319 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.217372 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.217396 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.217430 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.217454 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:12Z","lastTransitionTime":"2026-02-19T21:29:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.320656 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.320714 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.320732 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.320755 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.320773 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:12Z","lastTransitionTime":"2026-02-19T21:29:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.423874 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.423940 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.423955 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.424026 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.424040 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:12Z","lastTransitionTime":"2026-02-19T21:29:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.511536 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:29:12 crc kubenswrapper[4795]: E0219 21:29:12.511658 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.511846 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:29:12 crc kubenswrapper[4795]: E0219 21:29:12.511895 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.512021 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:29:12 crc kubenswrapper[4795]: E0219 21:29:12.512079 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.512249 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 00:09:09.471252734 +0000 UTC Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.526188 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.526207 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.526214 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.526224 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.526232 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:12Z","lastTransitionTime":"2026-02-19T21:29:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.628374 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.628412 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.628420 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.628434 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.628445 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:12Z","lastTransitionTime":"2026-02-19T21:29:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.730397 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.730442 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.730457 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.730476 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.730486 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:12Z","lastTransitionTime":"2026-02-19T21:29:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.832796 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.832839 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.832852 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.832871 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.832883 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:12Z","lastTransitionTime":"2026-02-19T21:29:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.935565 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.935870 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.936003 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.936209 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:12 crc kubenswrapper[4795]: I0219 21:29:12.936358 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:12Z","lastTransitionTime":"2026-02-19T21:29:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.039457 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.039751 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.039858 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.039961 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.040042 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:13Z","lastTransitionTime":"2026-02-19T21:29:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.143125 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.143207 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.143225 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.143252 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.143270 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:13Z","lastTransitionTime":"2026-02-19T21:29:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.246547 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.246594 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.246607 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.246625 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.246644 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:13Z","lastTransitionTime":"2026-02-19T21:29:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.349498 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.349556 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.349578 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.349601 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.349618 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:13Z","lastTransitionTime":"2026-02-19T21:29:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.452865 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.452910 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.452921 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.452940 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.452952 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:13Z","lastTransitionTime":"2026-02-19T21:29:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.510695 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:29:13 crc kubenswrapper[4795]: E0219 21:29:13.511151 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.516330 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 07:56:49.459256912 +0000 UTC Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.555927 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.555997 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.556013 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.556031 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.556043 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:13Z","lastTransitionTime":"2026-02-19T21:29:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.658271 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.658306 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.658314 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.658325 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.658333 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:13Z","lastTransitionTime":"2026-02-19T21:29:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.760685 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.760732 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.760754 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.760770 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.760779 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:13Z","lastTransitionTime":"2026-02-19T21:29:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.862528 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.862566 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.862577 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.862593 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.862605 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:13Z","lastTransitionTime":"2026-02-19T21:29:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.965142 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.965197 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.965207 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.965221 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:13 crc kubenswrapper[4795]: I0219 21:29:13.965231 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:13Z","lastTransitionTime":"2026-02-19T21:29:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.067369 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.067403 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.067412 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.067424 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.067432 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:14Z","lastTransitionTime":"2026-02-19T21:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.170063 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.170088 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.170096 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.170108 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.170117 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:14Z","lastTransitionTime":"2026-02-19T21:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.272281 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.272313 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.272321 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.272335 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.272344 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:14Z","lastTransitionTime":"2026-02-19T21:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.374484 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.374516 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.374532 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.374548 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.374559 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:14Z","lastTransitionTime":"2026-02-19T21:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.477136 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.477201 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.477215 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.477231 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.477241 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:14Z","lastTransitionTime":"2026-02-19T21:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.510646 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.510694 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.510734 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:29:14 crc kubenswrapper[4795]: E0219 21:29:14.510757 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:29:14 crc kubenswrapper[4795]: E0219 21:29:14.510813 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:29:14 crc kubenswrapper[4795]: E0219 21:29:14.510917 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.517176 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 18:19:41.952522536 +0000 UTC Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.579343 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.579409 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.579421 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.579436 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.579445 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:14Z","lastTransitionTime":"2026-02-19T21:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.682682 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.682971 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.682984 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.682999 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.683011 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:14Z","lastTransitionTime":"2026-02-19T21:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.785394 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.785430 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.785439 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.785453 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.785461 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:14Z","lastTransitionTime":"2026-02-19T21:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.887418 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.887466 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.887478 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.887496 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.887508 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:14Z","lastTransitionTime":"2026-02-19T21:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.989639 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.989699 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.989716 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.989738 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:14 crc kubenswrapper[4795]: I0219 21:29:14.989755 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:14Z","lastTransitionTime":"2026-02-19T21:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.092254 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.092323 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.092342 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.092366 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.092382 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:15Z","lastTransitionTime":"2026-02-19T21:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.194541 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.194582 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.194595 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.194611 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.194631 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:15Z","lastTransitionTime":"2026-02-19T21:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.296780 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.296920 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.296933 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.296948 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.296958 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:15Z","lastTransitionTime":"2026-02-19T21:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.399949 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.399988 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.399998 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.400014 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.400024 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:15Z","lastTransitionTime":"2026-02-19T21:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.497765 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.497806 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.497816 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.497831 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.497841 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:15Z","lastTransitionTime":"2026-02-19T21:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:15 crc kubenswrapper[4795]: E0219 21:29:15.511140 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0a3e1a0-f657-4990-877d-cc9b59bcb3d8\\\",\\\"systemUUID\\\":\\\"5c733625-a853-45dd-88a0-4f8c78e571ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:15Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.511611 4795 scope.go:117] "RemoveContainer" containerID="b3f7e5606fe334a1c2d8901263d401f8fa2a441992646852386a77967eaa84a0" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.511725 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:29:15 crc kubenswrapper[4795]: E0219 21:29:15.511858 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-4qphl_openshift-ovn-kubernetes(adf5bd36-b46b-4a06-8291-cae9f3988330)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" Feb 19 21:29:15 crc kubenswrapper[4795]: E0219 21:29:15.511883 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.515595 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.515650 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.515670 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.515709 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.515727 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:15Z","lastTransitionTime":"2026-02-19T21:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.517283 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 09:22:51.128643495 +0000 UTC Feb 19 21:29:15 crc kubenswrapper[4795]: E0219 21:29:15.538038 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177
c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c3
7e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeByt
es\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0a3e1a0-f657-4990-877d-cc9b59bcb3d8\\\",
\\\"systemUUID\\\":\\\"5c733625-a853-45dd-88a0-4f8c78e571ae\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:15Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.541388 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.541443 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.541462 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.541484 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.541500 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:15Z","lastTransitionTime":"2026-02-19T21:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:15 crc kubenswrapper[4795]: E0219 21:29:15.557314 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0a3e1a0-f657-4990-877d-cc9b59bcb3d8\\\",\\\"systemUUID\\\":\\\"5c733625-a853-45dd-88a0-4f8c78e571ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:15Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.561097 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.561129 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.561140 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.561156 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.561191 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:15Z","lastTransitionTime":"2026-02-19T21:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:15 crc kubenswrapper[4795]: E0219 21:29:15.583994 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0a3e1a0-f657-4990-877d-cc9b59bcb3d8\\\",\\\"systemUUID\\\":\\\"5c733625-a853-45dd-88a0-4f8c78e571ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:15Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.587975 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.588020 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.588032 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.588049 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.588061 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:15Z","lastTransitionTime":"2026-02-19T21:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:15 crc kubenswrapper[4795]: E0219 21:29:15.603713 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0a3e1a0-f657-4990-877d-cc9b59bcb3d8\\\",\\\"systemUUID\\\":\\\"5c733625-a853-45dd-88a0-4f8c78e571ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:15Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:15 crc kubenswrapper[4795]: E0219 21:29:15.603958 4795 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.605634 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.605704 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.605722 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.605747 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.605766 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:15Z","lastTransitionTime":"2026-02-19T21:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.708049 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.708093 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.708103 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.708119 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.708132 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:15Z","lastTransitionTime":"2026-02-19T21:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.810811 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.811100 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.811222 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.811358 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.811487 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:15Z","lastTransitionTime":"2026-02-19T21:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.913696 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.913955 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.914227 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.914445 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:15 crc kubenswrapper[4795]: I0219 21:29:15.914568 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:15Z","lastTransitionTime":"2026-02-19T21:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.017408 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.017443 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.017452 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.017468 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.017478 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:16Z","lastTransitionTime":"2026-02-19T21:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.120725 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.121199 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.121444 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.122064 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.122588 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:16Z","lastTransitionTime":"2026-02-19T21:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.225851 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.226105 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.226193 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.226264 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.226347 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:16Z","lastTransitionTime":"2026-02-19T21:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.329114 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.329180 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.329198 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.329217 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.329229 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:16Z","lastTransitionTime":"2026-02-19T21:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.431362 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.431390 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.431398 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.431412 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.431420 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:16Z","lastTransitionTime":"2026-02-19T21:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.511178 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:29:16 crc kubenswrapper[4795]: E0219 21:29:16.511275 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.511303 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.511332 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:29:16 crc kubenswrapper[4795]: E0219 21:29:16.511375 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:29:16 crc kubenswrapper[4795]: E0219 21:29:16.511491 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.518537 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 12:08:48.955260037 +0000 UTC Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.533143 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.533545 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.533639 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.533764 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.533863 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:16Z","lastTransitionTime":"2026-02-19T21:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.636699 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.636949 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.637021 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.637087 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.637155 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:16Z","lastTransitionTime":"2026-02-19T21:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.739644 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.739681 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.739691 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.739705 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.739715 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:16Z","lastTransitionTime":"2026-02-19T21:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.842991 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.843032 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.843044 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.843061 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.843074 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:16Z","lastTransitionTime":"2026-02-19T21:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.944814 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.944871 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.944879 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.944891 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:16 crc kubenswrapper[4795]: I0219 21:29:16.944899 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:16Z","lastTransitionTime":"2026-02-19T21:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.047060 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.047413 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.047527 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.047622 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.047714 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:17Z","lastTransitionTime":"2026-02-19T21:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.150465 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.150792 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.150951 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.151246 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.151388 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:17Z","lastTransitionTime":"2026-02-19T21:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.253456 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.253485 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.253493 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.253519 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.253529 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:17Z","lastTransitionTime":"2026-02-19T21:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.356190 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.356220 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.356228 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.356259 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.356269 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:17Z","lastTransitionTime":"2026-02-19T21:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.458559 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.458591 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.458605 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.458619 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.458630 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:17Z","lastTransitionTime":"2026-02-19T21:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.511253 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:29:17 crc kubenswrapper[4795]: E0219 21:29:17.511450 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.519610 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 21:22:51.006375964 +0000 UTC Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.560694 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.560714 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.560721 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.560738 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.560747 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:17Z","lastTransitionTime":"2026-02-19T21:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.662796 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.662825 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.662836 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.662849 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.662856 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:17Z","lastTransitionTime":"2026-02-19T21:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.764716 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.764779 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.764794 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.764814 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.764825 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:17Z","lastTransitionTime":"2026-02-19T21:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.867989 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.868028 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.868037 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.868050 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.868060 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:17Z","lastTransitionTime":"2026-02-19T21:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.970626 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.970669 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.970679 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.970695 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:17 crc kubenswrapper[4795]: I0219 21:29:17.970705 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:17Z","lastTransitionTime":"2026-02-19T21:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.072609 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.072659 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.072675 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.072696 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.072712 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:18Z","lastTransitionTime":"2026-02-19T21:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.174822 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.174869 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.174919 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.174936 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.174948 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:18Z","lastTransitionTime":"2026-02-19T21:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.277072 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.277110 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.277121 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.277138 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.277148 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:18Z","lastTransitionTime":"2026-02-19T21:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.379089 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.379122 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.379131 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.379143 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.379152 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:18Z","lastTransitionTime":"2026-02-19T21:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.481543 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.481586 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.481599 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.481616 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.481626 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:18Z","lastTransitionTime":"2026-02-19T21:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.511055 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.511107 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:29:18 crc kubenswrapper[4795]: E0219 21:29:18.511215 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.511288 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:29:18 crc kubenswrapper[4795]: E0219 21:29:18.511392 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:29:18 crc kubenswrapper[4795]: E0219 21:29:18.511547 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.520119 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 16:10:37.38792007 +0000 UTC Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.583494 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.583534 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.583543 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.583557 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.583565 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:18Z","lastTransitionTime":"2026-02-19T21:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.685844 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.685888 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.685899 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.685917 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.685928 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:18Z","lastTransitionTime":"2026-02-19T21:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.788995 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.789038 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.789048 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.789066 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.789081 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:18Z","lastTransitionTime":"2026-02-19T21:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.892033 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.892065 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.892073 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.892089 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.892098 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:18Z","lastTransitionTime":"2026-02-19T21:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.994973 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.995028 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.995045 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.995070 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:18 crc kubenswrapper[4795]: I0219 21:29:18.995087 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:18Z","lastTransitionTime":"2026-02-19T21:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.097705 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.097741 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.097752 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.097766 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.097777 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:19Z","lastTransitionTime":"2026-02-19T21:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.199717 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.199751 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.199759 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.199772 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.199780 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:19Z","lastTransitionTime":"2026-02-19T21:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.251911 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b1b4346-e02e-4614-b2ff-e4628046a92f-metrics-certs\") pod \"network-metrics-daemon-ff4bs\" (UID: \"1b1b4346-e02e-4614-b2ff-e4628046a92f\") " pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:29:19 crc kubenswrapper[4795]: E0219 21:29:19.252073 4795 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 21:29:19 crc kubenswrapper[4795]: E0219 21:29:19.252156 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b1b4346-e02e-4614-b2ff-e4628046a92f-metrics-certs podName:1b1b4346-e02e-4614-b2ff-e4628046a92f nodeName:}" failed. No retries permitted until 2026-02-19 21:29:51.252141645 +0000 UTC m=+102.444659509 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1b1b4346-e02e-4614-b2ff-e4628046a92f-metrics-certs") pod "network-metrics-daemon-ff4bs" (UID: "1b1b4346-e02e-4614-b2ff-e4628046a92f") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.301536 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.301573 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.301584 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.301598 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.301609 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:19Z","lastTransitionTime":"2026-02-19T21:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.404247 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.404279 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.404290 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.404305 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.404316 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:19Z","lastTransitionTime":"2026-02-19T21:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.505891 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.505931 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.505939 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.505954 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.505963 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:19Z","lastTransitionTime":"2026-02-19T21:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.511313 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:29:19 crc kubenswrapper[4795]: E0219 21:29:19.511422 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.520988 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 20:16:01.400728976 +0000 UTC Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.532143 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"405c08bd-3feb-45df-993f-2cb5e737cb6a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9415ce5a9989f2b286fff8f33781b5f1c510b61ef5e7af7050ed7f1a322dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe281962bb5076796c2b27f81ffe63c9310bf49f5989714cbe2fc0fe5b6f32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://689f35bbe7bc51411e25a7a6e4bce91b45d819ed6697ed322c4d5b4b3a244451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-di
r\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2cdc3d1e391f142caa70f86039fc213117c5a49f6e39eb770867385d781eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cd220ea624daba9c66de834e5fb2b0dd067d489fa005bacca4280b010e2ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:19Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.548526 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:19Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.557425 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-blzsk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9643227-37ca-4e4a-b9bc-371b18d67edc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6b76680f2d3d45237abba3bfaeadf20719b560077c08f624ca2fbe4cc1af865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmtf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-blzsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:19Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.570821 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47780602108de299051f362241c1b15d1d233ab44173752ba23240ff2aad0af7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade50328702
63b86e7109a5844eae140cb7acdac094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l29c7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:19Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.582248 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5p6d9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e967392b-9bd8-4111-b1b9-96d503a19668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab48852eb20d566b09c7adda6c6afee64240486d91e3788538ed9b47fbcae59d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qx6hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5p6d9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:19Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.597558 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jvnv5" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://558f4c8ca1cd10a6c978aa69d9df061716040e6eb4b31c34bb3b69008bae2c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89q2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jvnv5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:19Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.608023 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ff4bs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b1b4346-e02e-4614-b2ff-e4628046a92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rw77f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rw77f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ff4bs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:19Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:19 crc 
kubenswrapper[4795]: I0219 21:29:19.608400 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.608475 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.608537 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.608600 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.608656 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:19Z","lastTransitionTime":"2026-02-19T21:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.619493 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:19Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.629756 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7591bc58-96f5-486a-8653-0ad93938b019\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9964359f056111e41a0c304a0100fea26514cdc34f3e64f376a682913d6742f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bbc1f90994253aba517d1de97087dbd61793434
3fdb8389588863bf1305b72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxj5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:19Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.641045 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"538a8ea5-cec4-4706-8187-425879d18d4e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7547886d43376602238ac541e375478233fda6564afeb045cb84192812dd7136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd41d9aec7be2e2d5e5ec8b06259364f0b7a1bf4fadbd8fe8abb8ff625cd6072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd942d9f97c8b019171b7d06c9f4a4b85decbf641bd03a76701b7d27520db62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c85f029538174c7a4516d507fc84620e018838ff13e0a950cb25e184da8cd51f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://c85f029538174c7a4516d507fc84620e018838ff13e0a950cb25e184da8cd51f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:19Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.652098 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:19Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.662826 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:19Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.681793 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf5bd36-b46b-4a06-8291-cae9f3988330\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3f7e5606fe334a1c2d8901263d401f8fa2a441992646852386a77967eaa84a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3f7e5606fe334a1c2d8901263d401f8fa2a441992646852386a77967eaa84a0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T21:28:59Z\\\",\\\"message\\\":\\\"oup\\\\\\\"}}}\\\\nI0219 21:28:59.338551 6443 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI0219 21:28:59.338560 6443 services_controller.go:452] Built service openshift-config-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0219 21:28:59.338574 6443 
services_controller.go:453] Built service openshift-config-operator/metrics template LB for network=default: []services.LB{}\\\\nF0219 21:28:59.338577 6443 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z]\\\\nI0219 21:28:59.338588 6443 services_controller.go:454] Service open\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4qphl_openshift-ovn-kubernetes(adf5bd36-b46b-4a06-8291-cae9f3988330)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdba0068125036886d
c7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4qphl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:19Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.692455 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfpbz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67ec5415-424e-40b7-9beb-171cd1f3dbe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://230058a44868b325845b4045f4c763a9a9dbc95cfdf8fe94d86cee2799e9b1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmblv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce91447a4948ba71025642de9bcde76acb613
b1a110eab19b334f244114636f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmblv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nfpbz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:19Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.706444 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e
901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:19Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.714839 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.714874 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.714882 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.714896 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.714904 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:19Z","lastTransitionTime":"2026-02-19T21:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.718820 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45f7fe05-a5f0-4296-80f5-9c8cd59afb95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ae61d10d79a82b8d2e11aee70d7bb13c429feaa4b7ce1a7746cadab1169d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8704324a3c5
bb3f3023c0b000b233d650dede0825930494b987121784c094bdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b2ef22256f35fadefb61f59a542aa0df8bc5648285e2624b924391b80766c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad6763b33a3d85ce3446b4056d2703b9f342137b6a4aa23a3ae9870df8d0c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:19Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.730372 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:19Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.739641 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:19Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.817420 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.817683 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.817779 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.817870 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.817957 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:19Z","lastTransitionTime":"2026-02-19T21:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.920362 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.920398 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.920410 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.920425 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:19 crc kubenswrapper[4795]: I0219 21:29:19.920434 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:19Z","lastTransitionTime":"2026-02-19T21:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.021984 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.022193 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.022283 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.022353 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.022407 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:20Z","lastTransitionTime":"2026-02-19T21:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.124786 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.124820 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.124829 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.124843 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.124881 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:20Z","lastTransitionTime":"2026-02-19T21:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.226990 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.227017 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.227025 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.227040 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.227049 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:20Z","lastTransitionTime":"2026-02-19T21:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.329584 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.329627 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.329644 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.329666 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.329682 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:20Z","lastTransitionTime":"2026-02-19T21:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.431612 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.431663 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.431677 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.431695 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.431708 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:20Z","lastTransitionTime":"2026-02-19T21:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.511371 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.511447 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:29:20 crc kubenswrapper[4795]: E0219 21:29:20.511479 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:29:20 crc kubenswrapper[4795]: E0219 21:29:20.511581 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.511922 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:29:20 crc kubenswrapper[4795]: E0219 21:29:20.512222 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.521566 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 01:11:40.660456072 +0000 UTC Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.534214 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.534246 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.534257 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.534283 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.534295 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:20Z","lastTransitionTime":"2026-02-19T21:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.637612 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.637858 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.637947 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.638031 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.638122 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:20Z","lastTransitionTime":"2026-02-19T21:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.740492 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.740522 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.740530 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.740542 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.740550 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:20Z","lastTransitionTime":"2026-02-19T21:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.840382 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5p6d9_e967392b-9bd8-4111-b1b9-96d503a19668/kube-multus/0.log" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.840427 4795 generic.go:334] "Generic (PLEG): container finished" podID="e967392b-9bd8-4111-b1b9-96d503a19668" containerID="ab48852eb20d566b09c7adda6c6afee64240486d91e3788538ed9b47fbcae59d" exitCode=1 Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.840453 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5p6d9" event={"ID":"e967392b-9bd8-4111-b1b9-96d503a19668","Type":"ContainerDied","Data":"ab48852eb20d566b09c7adda6c6afee64240486d91e3788538ed9b47fbcae59d"} Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.840803 4795 scope.go:117] "RemoveContainer" containerID="ab48852eb20d566b09c7adda6c6afee64240486d91e3788538ed9b47fbcae59d" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.842406 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.842441 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.842451 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.842475 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.842486 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:20Z","lastTransitionTime":"2026-02-19T21:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.859670 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47780602108de299051f362241c1b15d1d233ab44173752ba23240ff2aad0af7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"n
ame\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3
bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\
",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l29c7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:20Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.877346 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5p6d9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e967392b-9bd8-4111-b1b9-96d503a19668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab48852eb20d566b09c7adda6c6afee64240486d91e3788538ed9b47fbcae59d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab48852eb20d566b09c7adda6c6afee64240486d91e3788538ed9b47fbcae59d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T21:29:20Z\\\",\\\"message\\\":\\\"2026-02-19T21:28:35+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8c80439f-0348-449a-8009-7180f0a7269a\\\\n2026-02-19T21:28:35+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8c80439f-0348-449a-8009-7180f0a7269a to /host/opt/cni/bin/\\\\n2026-02-19T21:28:35Z [verbose] multus-daemon started\\\\n2026-02-19T21:28:35Z [verbose] Readiness Indicator file check\\\\n2026-02-19T21:29:20Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qx6hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5p6d9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:20Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.888220 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jvnv5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://558f4c8ca1cd10a6c978aa69d9df061716040e6eb4b31c34bb3b69008bae2c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89q2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jvnv5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:20Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.896981 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ff4bs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b1b4346-e02e-4614-b2ff-e4628046a92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rw77f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rw77f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ff4bs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:20Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:20 crc 
kubenswrapper[4795]: I0219 21:29:20.913396 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"405c08bd-3feb-45df-993f-2cb5e737cb6a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9415ce5a9989f2b286fff8f33781b5f1c510b61ef5e7af7050ed7f1a322dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://8fe281962bb5076796c2b27f81ffe63c9310bf49f5989714cbe2fc0fe5b6f32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://689f35bbe7bc51411e25a7a6e4bce91b45d819ed6697ed322c4d5b4b3a244451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2cdc3d1e391f142caa70f86039fc213117c5a49f6e39eb770867385d781eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cd220ea624daba9c66de834e5fb2b0dd067d489fa005bacca4280b010e2ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:20Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.923382 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:20Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.931424 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-blzsk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9643227-37ca-4e4a-b9bc-371b18d67edc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6b76680f2d3d45237abba3bfaeadf20719b560077c08f624ca2fbe4cc1af865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmtf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-blzsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:20Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.943793 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:20Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.944937 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.944955 4795 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.944965 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.944979 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.945004 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:20Z","lastTransitionTime":"2026-02-19T21:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.955269 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7591bc58-96f5-486a-8653-0ad93938b019\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9964359f056111e41a0c304a0100fea26514cdc34f3e64f376a682913d6742f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bbc1f90994253aba517d1de97087dbd61793434
3fdb8389588863bf1305b72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxj5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:20Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.967216 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:20Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:20 crc kubenswrapper[4795]: I0219 21:29:20.990353 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf5bd36-b46b-4a06-8291-cae9f3988330\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3f7e5606fe334a1c2d8901263d401f8fa2a441992646852386a77967eaa84a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3f7e5606fe334a1c2d8901263d401f8fa2a441992646852386a77967eaa84a0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T21:28:59Z\\\",\\\"message\\\":\\\"oup\\\\\\\"}}}\\\\nI0219 21:28:59.338551 6443 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI0219 21:28:59.338560 6443 services_controller.go:452] Built service openshift-config-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0219 21:28:59.338574 6443 
services_controller.go:453] Built service openshift-config-operator/metrics template LB for network=default: []services.LB{}\\\\nF0219 21:28:59.338577 6443 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z]\\\\nI0219 21:28:59.338588 6443 services_controller.go:454] Service open\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4qphl_openshift-ovn-kubernetes(adf5bd36-b46b-4a06-8291-cae9f3988330)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdba0068125036886d
c7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4qphl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:20Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.002477 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfpbz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67ec5415-424e-40b7-9beb-171cd1f3dbe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://230058a44868b325845b4045f4c763a9a9dbc95cfdf8fe94d86cee2799e9b1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmblv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce91447a4948ba71025642de9bcde76acb613
b1a110eab19b334f244114636f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmblv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nfpbz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:21Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.016063 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"538a8ea5-cec4-4706-8187-425879d18d4e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7547886d43376602238ac541e375478233fda6564afeb045cb84192812dd7136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd41d9aec7be2e2d5e5ec8b06259364f0b7a1bf4fadbd8fe8abb8ff625cd6072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd942d9f97c8b019171b7d06c9f4a4b85decbf641bd03a76701b7d27520db62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c85f029538174c7a4516d507fc84620e018838ff13e0a950cb25e184da8cd51f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://c85f029538174c7a4516d507fc84620e018838ff13e0a950cb25e184da8cd51f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:21Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.029068 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:21Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.040682 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T21:29:21Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.047108 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.047133 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.047141 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.047154 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.047173 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:21Z","lastTransitionTime":"2026-02-19T21:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.051850 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:21Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.063226 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45f7fe05-a5f0-4296-80f5-9c8cd59afb95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ae61d10d79a82b8d2e11aee70d7bb13c429feaa4b7ce1a7746cadab1169d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8704324a3c5bb3f3023c0b000b233d650dede0825930494b987121784c094bdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b2ef22256f35fadefb61f59a542aa0df8bc5648285e2624b924391b80766c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad6763b33a3d85ce3446b4056d2703b9f342137b6a4aa23a3ae9870df8d0c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:21Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.074835 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:21Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.149620 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.149674 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.149683 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.149697 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.149705 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:21Z","lastTransitionTime":"2026-02-19T21:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.252573 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.252602 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.252610 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.252623 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.252632 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:21Z","lastTransitionTime":"2026-02-19T21:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.355603 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.355650 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.355659 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.355675 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.355684 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:21Z","lastTransitionTime":"2026-02-19T21:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.458516 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.458574 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.458586 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.458608 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.458620 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:21Z","lastTransitionTime":"2026-02-19T21:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.510834 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:29:21 crc kubenswrapper[4795]: E0219 21:29:21.510983 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.521726 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 07:15:39.942938654 +0000 UTC Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.561493 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.561520 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.561529 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.561546 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.561555 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:21Z","lastTransitionTime":"2026-02-19T21:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.664094 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.664150 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.664174 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.664191 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.664202 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:21Z","lastTransitionTime":"2026-02-19T21:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.767217 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.767556 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.767653 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.767746 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.767825 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:21Z","lastTransitionTime":"2026-02-19T21:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.846299 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5p6d9_e967392b-9bd8-4111-b1b9-96d503a19668/kube-multus/0.log" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.846365 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5p6d9" event={"ID":"e967392b-9bd8-4111-b1b9-96d503a19668","Type":"ContainerStarted","Data":"02e3717583e910ca7356ee58e22d63f40252488fb8540be7a30ab0fc918b908b"} Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.858102 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-blzsk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9643227-37ca-4e4a-b9bc-371b18d67edc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6b76680f2d3d45237abba3bfaeadf20719b560077c08f624ca2fbe4cc1af865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmtf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-blzsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:21Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.870490 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.870641 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.870744 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.870833 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.870919 4795 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:21Z","lastTransitionTime":"2026-02-19T21:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.872650 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47780602108de299051f362241c1b15d1d233ab44173752ba23240ff2aad0af7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb0
85a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":
\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"image\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPat
h\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l29c7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:21Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.884757 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5p6d9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e967392b-9bd8-4111-b1b9-96d503a19668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02e3717583e910ca7356ee58e22d63f40252488fb8540be7a30ab0fc918b908b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab48852eb20d566b09c7adda6c6afee64240486d91e3788538ed9b47fbcae59d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T21:29:20Z\\\",\\\"message\\\":\\\"2026-02-19T21:28:35+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8c80439f-0348-449a-8009-7180f0a7269a\\\\n2026-02-19T21:28:35+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8c80439f-0348-449a-8009-7180f0a7269a to /host/opt/cni/bin/\\\\n2026-02-19T21:28:35Z [verbose] multus-daemon started\\\\n2026-02-19T21:28:35Z [verbose] 
Readiness Indicator file check\\\\n2026-02-19T21:29:20Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:29:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qx6hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5p6d9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:21Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.896589 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jvnv5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://558f4c8ca1cd10a6c
978aa69d9df061716040e6eb4b31c34bb3b69008bae2c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89q2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jvnv5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:21Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.911075 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ff4bs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b1b4346-e02e-4614-b2ff-e4628046a92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rw77f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rw77f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ff4bs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:21Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:21 crc 
kubenswrapper[4795]: I0219 21:29:21.929970 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"405c08bd-3feb-45df-993f-2cb5e737cb6a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9415ce5a9989f2b286fff8f33781b5f1c510b61ef5e7af7050ed7f1a322dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://8fe281962bb5076796c2b27f81ffe63c9310bf49f5989714cbe2fc0fe5b6f32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://689f35bbe7bc51411e25a7a6e4bce91b45d819ed6697ed322c4d5b4b3a244451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2cdc3d1e391f142caa70f86039fc213117c5a49f6e39eb770867385d781eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cd220ea624daba9c66de834e5fb2b0dd067d489fa005bacca4280b010e2ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:21Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.942849 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:21Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.954592 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:21Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.964336 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7591bc58-96f5-486a-8653-0ad93938b019\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9964359f056111e41a0c304a0100fea26514cdc34f3e64f376a682913d6742f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bbc1f90994253aba517d1de97087dbd61793434
3fdb8389588863bf1305b72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxj5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:21Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.972948 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.973387 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.973556 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:21 crc 
kubenswrapper[4795]: I0219 21:29:21.974009 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.974391 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:21Z","lastTransitionTime":"2026-02-19T21:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.976437 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:21Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:21 crc kubenswrapper[4795]: I0219 21:29:21.986889 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:21Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.003596 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf5bd36-b46b-4a06-8291-cae9f3988330\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3f7e5606fe334a1c2d8901263d401f8fa2a441992646852386a77967eaa84a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3f7e5606fe334a1c2d8901263d401f8fa2a441992646852386a77967eaa84a0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T21:28:59Z\\\",\\\"message\\\":\\\"oup\\\\\\\"}}}\\\\nI0219 21:28:59.338551 6443 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI0219 21:28:59.338560 6443 services_controller.go:452] Built service openshift-config-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0219 21:28:59.338574 6443 
services_controller.go:453] Built service openshift-config-operator/metrics template LB for network=default: []services.LB{}\\\\nF0219 21:28:59.338577 6443 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z]\\\\nI0219 21:28:59.338588 6443 services_controller.go:454] Service open\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4qphl_openshift-ovn-kubernetes(adf5bd36-b46b-4a06-8291-cae9f3988330)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdba0068125036886d
c7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4qphl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:22Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.012934 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfpbz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67ec5415-424e-40b7-9beb-171cd1f3dbe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://230058a44868b325845b4045f4c763a9a9dbc95cfdf8fe94d86cee2799e9b1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmblv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce91447a4948ba71025642de9bcde76acb613
b1a110eab19b334f244114636f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmblv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nfpbz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:22Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.022396 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"538a8ea5-cec4-4706-8187-425879d18d4e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7547886d43376602238ac541e375478233fda6564afeb045cb84192812dd7136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd41d9aec7be2e2d5e5ec8b06259364f0b7a1bf4fadbd8fe8abb8ff625cd6072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd942d9f97c8b019171b7d06c9f4a4b85decbf641bd03a76701b7d27520db62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c85f029538174c7a4516d507fc84620e018838ff13e0a950cb25e184da8cd51f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://c85f029538174c7a4516d507fc84620e018838ff13e0a950cb25e184da8cd51f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:22Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.033367 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:22Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.042522 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:22Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.053032 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\
":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"co
ntainerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initConta
inerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:22Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.064331 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45f7fe05-a5f0-4296-80f5-9c8cd59afb95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ae61d10d79a82b8d2e11aee70d7bb13c429feaa4b7ce1a7746cadab1169d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8704324a3c5bb3f3023c0b000b233d650dede0825930494b987121784c094bdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b2ef22256f35fadefb61f59a542aa0df8bc5648285e2624b924391b80766c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad6763b33a3d85ce3446b4056d2703b9f342137b6a4aa23a3ae9870df8d0c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:22Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.076832 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.076972 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.077066 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.077182 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.077267 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:22Z","lastTransitionTime":"2026-02-19T21:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.180433 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.180458 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.180467 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.180481 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.180489 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:22Z","lastTransitionTime":"2026-02-19T21:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.283254 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.283281 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.283289 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.283302 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.283311 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:22Z","lastTransitionTime":"2026-02-19T21:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.385746 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.386033 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.386131 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.386254 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.386318 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:22Z","lastTransitionTime":"2026-02-19T21:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.488346 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.488652 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.488741 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.488846 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.488933 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:22Z","lastTransitionTime":"2026-02-19T21:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.510886 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.510930 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:29:22 crc kubenswrapper[4795]: E0219 21:29:22.511000 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:29:22 crc kubenswrapper[4795]: E0219 21:29:22.511066 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.511326 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:29:22 crc kubenswrapper[4795]: E0219 21:29:22.511490 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.523062 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-18 17:11:27.599983745 +0000 UTC Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.591504 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.591547 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.591556 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.591571 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.591582 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:22Z","lastTransitionTime":"2026-02-19T21:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.693654 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.693960 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.694020 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.694079 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.694133 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:22Z","lastTransitionTime":"2026-02-19T21:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.797625 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.797675 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.797692 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.797715 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.797733 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:22Z","lastTransitionTime":"2026-02-19T21:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.901891 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.901951 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.901968 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.901994 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:22 crc kubenswrapper[4795]: I0219 21:29:22.902011 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:22Z","lastTransitionTime":"2026-02-19T21:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.004339 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.004389 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.004400 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.004416 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.004429 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:23Z","lastTransitionTime":"2026-02-19T21:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.107567 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.107614 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.107625 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.107643 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.107655 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:23Z","lastTransitionTime":"2026-02-19T21:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.210525 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.210575 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.210590 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.210610 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.210627 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:23Z","lastTransitionTime":"2026-02-19T21:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.314449 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.314482 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.314491 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.314504 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.314512 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:23Z","lastTransitionTime":"2026-02-19T21:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.416842 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.416887 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.416920 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.416936 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.416947 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:23Z","lastTransitionTime":"2026-02-19T21:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.510766 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:29:23 crc kubenswrapper[4795]: E0219 21:29:23.510893 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.518888 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.518922 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.518932 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.518944 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.518954 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:23Z","lastTransitionTime":"2026-02-19T21:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.524079 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 10:12:48.009016533 +0000 UTC Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.621380 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.621417 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.621425 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.621445 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.621455 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:23Z","lastTransitionTime":"2026-02-19T21:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.724562 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.724638 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.724663 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.724693 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.724716 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:23Z","lastTransitionTime":"2026-02-19T21:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.827056 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.827096 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.827106 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.827122 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.827131 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:23Z","lastTransitionTime":"2026-02-19T21:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.928904 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.928935 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.928944 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.928961 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:23 crc kubenswrapper[4795]: I0219 21:29:23.928969 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:23Z","lastTransitionTime":"2026-02-19T21:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.031552 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.031588 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.031599 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.031615 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.031626 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:24Z","lastTransitionTime":"2026-02-19T21:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.133653 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.133710 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.133730 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.133750 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.133761 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:24Z","lastTransitionTime":"2026-02-19T21:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.236239 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.236291 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.236305 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.236331 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.236347 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:24Z","lastTransitionTime":"2026-02-19T21:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.338657 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.338716 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.338729 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.338748 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.338761 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:24Z","lastTransitionTime":"2026-02-19T21:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.441218 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.441265 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.441281 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.441296 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.441304 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:24Z","lastTransitionTime":"2026-02-19T21:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.511427 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.511427 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:29:24 crc kubenswrapper[4795]: E0219 21:29:24.511600 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.511689 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:29:24 crc kubenswrapper[4795]: E0219 21:29:24.511818 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:29:24 crc kubenswrapper[4795]: E0219 21:29:24.511893 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.525074 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 04:30:32.870465938 +0000 UTC Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.543955 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.543993 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.544002 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.544018 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.544027 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:24Z","lastTransitionTime":"2026-02-19T21:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.646939 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.646999 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.647017 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.647042 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.647060 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:24Z","lastTransitionTime":"2026-02-19T21:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.750620 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.750717 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.750738 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.750763 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.750782 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:24Z","lastTransitionTime":"2026-02-19T21:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.860202 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.860284 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.860305 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.860331 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.860352 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:24Z","lastTransitionTime":"2026-02-19T21:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.963507 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.963566 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.963584 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.963608 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:24 crc kubenswrapper[4795]: I0219 21:29:24.963625 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:24Z","lastTransitionTime":"2026-02-19T21:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.066621 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.066684 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.066702 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.066728 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.066746 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:25Z","lastTransitionTime":"2026-02-19T21:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.170598 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.170658 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.170674 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.170698 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.170715 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:25Z","lastTransitionTime":"2026-02-19T21:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.273953 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.274147 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.274233 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.274265 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.274321 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:25Z","lastTransitionTime":"2026-02-19T21:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.378711 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.378773 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.378790 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.378814 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.378831 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:25Z","lastTransitionTime":"2026-02-19T21:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.481970 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.482031 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.482049 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.482074 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.482092 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:25Z","lastTransitionTime":"2026-02-19T21:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.511372 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:29:25 crc kubenswrapper[4795]: E0219 21:29:25.511561 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.525440 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 09:42:02.795383918 +0000 UTC Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.585966 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.586038 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.586063 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.586091 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.586112 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:25Z","lastTransitionTime":"2026-02-19T21:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.690130 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.690242 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.690270 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.690297 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.690316 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:25Z","lastTransitionTime":"2026-02-19T21:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.792592 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.792623 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.792634 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.792650 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.792662 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:25Z","lastTransitionTime":"2026-02-19T21:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.895417 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.895483 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.895505 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.895537 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.895561 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:25Z","lastTransitionTime":"2026-02-19T21:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.958252 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.958345 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.958363 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.958386 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.958565 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:25Z","lastTransitionTime":"2026-02-19T21:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:25 crc kubenswrapper[4795]: E0219 21:29:25.978679 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0a3e1a0-f657-4990-877d-cc9b59bcb3d8\\\",\\\"systemUUID\\\":\\\"5c733625-a853-45dd-88a0-4f8c78e571ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:25Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.984127 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.984223 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.984242 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.984261 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:25 crc kubenswrapper[4795]: I0219 21:29:25.984313 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:25Z","lastTransitionTime":"2026-02-19T21:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:25 crc kubenswrapper[4795]: E0219 21:29:25.998590 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0a3e1a0-f657-4990-877d-cc9b59bcb3d8\\\",\\\"systemUUID\\\":\\\"5c733625-a853-45dd-88a0-4f8c78e571ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:25Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.003465 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.003498 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.003522 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.003536 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.003544 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:26Z","lastTransitionTime":"2026-02-19T21:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:26 crc kubenswrapper[4795]: E0219 21:29:26.018932 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0a3e1a0-f657-4990-877d-cc9b59bcb3d8\\\",\\\"systemUUID\\\":\\\"5c733625-a853-45dd-88a0-4f8c78e571ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:26Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.023860 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.023895 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.023907 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.023925 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.023938 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:26Z","lastTransitionTime":"2026-02-19T21:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:26 crc kubenswrapper[4795]: E0219 21:29:26.043075 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0a3e1a0-f657-4990-877d-cc9b59bcb3d8\\\",\\\"systemUUID\\\":\\\"5c733625-a853-45dd-88a0-4f8c78e571ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:26Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.047271 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.047305 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.047319 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.047337 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.047350 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:26Z","lastTransitionTime":"2026-02-19T21:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:26 crc kubenswrapper[4795]: E0219 21:29:26.064196 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0a3e1a0-f657-4990-877d-cc9b59bcb3d8\\\",\\\"systemUUID\\\":\\\"5c733625-a853-45dd-88a0-4f8c78e571ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:26Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:26 crc kubenswrapper[4795]: E0219 21:29:26.064527 4795 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.066875 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.066927 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.066951 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.066977 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.066999 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:26Z","lastTransitionTime":"2026-02-19T21:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.169202 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.169276 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.169296 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.169320 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.169338 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:26Z","lastTransitionTime":"2026-02-19T21:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.271658 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.271715 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.271734 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.271758 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.271781 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:26Z","lastTransitionTime":"2026-02-19T21:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.374423 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.374470 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.374487 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.374513 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.374530 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:26Z","lastTransitionTime":"2026-02-19T21:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.477548 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.477616 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.477638 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.477666 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.477690 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:26Z","lastTransitionTime":"2026-02-19T21:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.510983 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.511052 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:29:26 crc kubenswrapper[4795]: E0219 21:29:26.511377 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:29:26 crc kubenswrapper[4795]: E0219 21:29:26.511548 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.511130 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:29:26 crc kubenswrapper[4795]: E0219 21:29:26.512077 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.525674 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 14:42:15.284865661 +0000 UTC Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.580385 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.580424 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.580434 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.580450 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.580460 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:26Z","lastTransitionTime":"2026-02-19T21:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.686897 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.686981 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.687005 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.687039 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.687187 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:26Z","lastTransitionTime":"2026-02-19T21:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.790653 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.790710 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.790726 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.790749 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.790767 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:26Z","lastTransitionTime":"2026-02-19T21:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.892858 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.892912 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.892929 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.892951 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.892968 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:26Z","lastTransitionTime":"2026-02-19T21:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.996309 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.996370 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.996387 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.996410 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:26 crc kubenswrapper[4795]: I0219 21:29:26.996427 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:26Z","lastTransitionTime":"2026-02-19T21:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.099363 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.099422 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.099438 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.099460 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.099477 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:27Z","lastTransitionTime":"2026-02-19T21:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.202743 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.202813 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.202835 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.202865 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.202894 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:27Z","lastTransitionTime":"2026-02-19T21:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.305918 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.305984 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.306005 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.306031 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.306048 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:27Z","lastTransitionTime":"2026-02-19T21:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.408720 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.408777 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.408798 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.408821 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.408838 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:27Z","lastTransitionTime":"2026-02-19T21:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.510780 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:29:27 crc kubenswrapper[4795]: E0219 21:29:27.510971 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.511516 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.511553 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.511568 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.511588 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.511604 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:27Z","lastTransitionTime":"2026-02-19T21:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.526378 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 17:07:09.152403131 +0000 UTC Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.614696 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.614749 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.614766 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.614785 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.614795 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:27Z","lastTransitionTime":"2026-02-19T21:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.717143 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.717246 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.717268 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.717296 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.717319 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:27Z","lastTransitionTime":"2026-02-19T21:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.819483 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.819557 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.819568 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.819584 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.819594 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:27Z","lastTransitionTime":"2026-02-19T21:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.921974 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.922012 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.922021 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.922033 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:27 crc kubenswrapper[4795]: I0219 21:29:27.922042 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:27Z","lastTransitionTime":"2026-02-19T21:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.024586 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.024660 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.024672 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.024688 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.024697 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:28Z","lastTransitionTime":"2026-02-19T21:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.126865 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.126927 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.126945 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.126973 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.126990 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:28Z","lastTransitionTime":"2026-02-19T21:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.229505 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.229580 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.229603 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.229631 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.229653 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:28Z","lastTransitionTime":"2026-02-19T21:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.332197 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.332235 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.332244 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.332259 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.332267 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:28Z","lastTransitionTime":"2026-02-19T21:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.434149 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.434221 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.434240 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.434260 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.434273 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:28Z","lastTransitionTime":"2026-02-19T21:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.511599 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.511657 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.511622 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:29:28 crc kubenswrapper[4795]: E0219 21:29:28.511795 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:29:28 crc kubenswrapper[4795]: E0219 21:29:28.511891 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:29:28 crc kubenswrapper[4795]: E0219 21:29:28.512279 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.512707 4795 scope.go:117] "RemoveContainer" containerID="b3f7e5606fe334a1c2d8901263d401f8fa2a441992646852386a77967eaa84a0" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.527557 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 02:20:50.184964946 +0000 UTC Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.536825 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.536878 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.536895 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.536919 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.536936 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:28Z","lastTransitionTime":"2026-02-19T21:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.639184 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.639221 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.639229 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.639243 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.639252 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:28Z","lastTransitionTime":"2026-02-19T21:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.742294 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.742338 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.742349 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.742366 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.742378 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:28Z","lastTransitionTime":"2026-02-19T21:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.844620 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.844654 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.844665 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.844702 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.844711 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:28Z","lastTransitionTime":"2026-02-19T21:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.872380 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4qphl_adf5bd36-b46b-4a06-8291-cae9f3988330/ovnkube-controller/2.log" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.875502 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" event={"ID":"adf5bd36-b46b-4a06-8291-cae9f3988330","Type":"ContainerStarted","Data":"b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d"} Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.875872 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.899705 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55
b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:28Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.919853 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e
901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:28Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.938791 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45f7fe05-a5f0-4296-80f5-9c8cd59afb95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ae61d10d79a82b8d2e11aee70d7bb13c429feaa4b7ce1a7746cadab1169d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8704324a3c5bb3f3023c0b000b233d650dede0825930494b987121784c094bdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b2ef22256f35fadefb61f59a542aa0df8bc5648285e2624b924391b80766c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad6763b33a3d85ce3446b4056d2703b9f342137b6a4aa23a3ae9870df8d0c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:28Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.946926 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.946980 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.946993 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.947010 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.947021 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:28Z","lastTransitionTime":"2026-02-19T21:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.957613 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:28Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.975902 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47780602108de299051f362241c1b15d1d233ab44173752ba23240ff2aad0af7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"image\\\":\\\"quay.io/openshift-release-de
v/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026
-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac09
4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:38Z\\\"}}
,\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l29c7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:28Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:28 crc kubenswrapper[4795]: I0219 21:29:28.990248 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5p6d9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e967392b-9bd8-4111-b1b9-96d503a19668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02e3717583e910ca7356ee58e22d63f40252488fb8540be7a30ab0fc918b908b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab48852eb20d566b09c7adda6c6afee64240486d91e3788538ed9b47fbcae59d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T21:29:20Z\\\",\\\"message\\\":\\\"2026-02-19T21:28:35+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8c80439f-0348-449a-8009-7180f0a7269a\\\\n2026-02-19T21:28:35+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8c80439f-0348-449a-8009-7180f0a7269a to /host/opt/cni/bin/\\\\n2026-02-19T21:28:35Z [verbose] multus-daemon started\\\\n2026-02-19T21:28:35Z [verbose] 
Readiness Indicator file check\\\\n2026-02-19T21:29:20Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:29:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qx6hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5p6d9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:28Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.001744 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jvnv5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://558f4c8ca1cd10a6c
978aa69d9df061716040e6eb4b31c34bb3b69008bae2c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89q2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jvnv5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.012279 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ff4bs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b1b4346-e02e-4614-b2ff-e4628046a92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rw77f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rw77f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ff4bs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:29 crc 
kubenswrapper[4795]: I0219 21:29:29.029024 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"405c08bd-3feb-45df-993f-2cb5e737cb6a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9415ce5a9989f2b286fff8f33781b5f1c510b61ef5e7af7050ed7f1a322dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://8fe281962bb5076796c2b27f81ffe63c9310bf49f5989714cbe2fc0fe5b6f32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://689f35bbe7bc51411e25a7a6e4bce91b45d819ed6697ed322c4d5b4b3a244451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2cdc3d1e391f142caa70f86039fc213117c5a49f6e39eb770867385d781eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cd220ea624daba9c66de834e5fb2b0dd067d489fa005bacca4280b010e2ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.041228 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.048667 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.048722 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.048739 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 
21:29:29.048760 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.048778 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:29Z","lastTransitionTime":"2026-02-19T21:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.053064 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-blzsk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9643227-37ca-4e4a-b9bc-371b18d67edc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6b76680f2d3d45237abba3bfaeadf20719b560077c08f624ca2fbe4cc1af865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888c
f2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmtf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-blzsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.068571 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.082363 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7591bc58-96f5-486a-8653-0ad93938b019\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9964359f056111e41a0c304a0100fea26514cdc34f3e64f376a682913d6742f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bbc1f90994253aba517d1de97087dbd61793434
3fdb8389588863bf1305b72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxj5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.093927 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.113703 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf5bd36-b46b-4a06-8291-cae9f3988330\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3f7e5606fe334a1c2d8901263d401f8fa2a441992646852386a77967eaa84a0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T21:28:59Z\\\",\\\"message\\\":\\\"oup\\\\\\\"}}}\\\\nI0219 21:28:59.338551 6443 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI0219 21:28:59.338560 6443 services_controller.go:452] Built service openshift-config-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0219 21:28:59.338574 6443 
services_controller.go:453] Built service openshift-config-operator/metrics template LB for network=default: []services.LB{}\\\\nF0219 21:28:59.338577 6443 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z]\\\\nI0219 21:28:59.338588 6443 services_controller.go:454] Service 
open\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:29:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\
\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready
\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4qphl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.125419 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfpbz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67ec5415-424e-40b7-9beb-171cd1f3dbe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://230058a44868b325845b4045f4c763a9a9dbc95cfdf8fe94d86cee2799e9b1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmblv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce91447a4948ba71025642de9bcde76acb613
b1a110eab19b334f244114636f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmblv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nfpbz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.139249 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"538a8ea5-cec4-4706-8187-425879d18d4e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7547886d43376602238ac541e375478233fda6564afeb045cb84192812dd7136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd41d9aec7be2e2d5e5ec8b06259364f0b7a1bf4fadbd8fe8abb8ff625cd6072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd942d9f97c8b019171b7d06c9f4a4b85decbf641bd03a76701b7d27520db62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c85f029538174c7a4516d507fc84620e018838ff13e0a950cb25e184da8cd51f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://c85f029538174c7a4516d507fc84620e018838ff13e0a950cb25e184da8cd51f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.152694 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.152734 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.152745 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.152762 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.152774 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:29Z","lastTransitionTime":"2026-02-19T21:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.156944 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.254894 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.254936 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.254947 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.254963 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.254975 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:29Z","lastTransitionTime":"2026-02-19T21:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.357512 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.357549 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.357557 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.357571 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.357582 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:29Z","lastTransitionTime":"2026-02-19T21:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.460644 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.460694 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.460710 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.460732 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.460749 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:29Z","lastTransitionTime":"2026-02-19T21:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.511231 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:29:29 crc kubenswrapper[4795]: E0219 21:29:29.511363 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.524705 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f55
66b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.528454 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 21:26:29.53218513 +0000 UTC Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.541766 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7591bc58-96f5-486a-8653-0ad93938b019\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9964359f056111e41a0c304a0100fea26514cdc34f3e64f376a682913d6742f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bbc1f90994253aba517d1de97087dbd61793434
3fdb8389588863bf1305b72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxj5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.553424 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfpbz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67ec5415-424e-40b7-9beb-171cd1f3dbe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://230058a44868b325845b4045f4c763a9a9dbc95cfdf8fe94d86cee2799e9b1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmblv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce91447a4948ba71025642de9bcde76acb613
b1a110eab19b334f244114636f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmblv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nfpbz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.563340 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.563391 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.563404 4795 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.563418 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.563427 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:29Z","lastTransitionTime":"2026-02-19T21:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.569337 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"538a8ea5-cec4-4706-8187-425879d18d4e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7547886d43376602238ac541e375478233fda6564afeb045cb84192812dd7136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd41d9aec7be2e2d5e5ec8b06259364f0b7a1bf4fadbd8fe8abb8ff625cd6072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd942d9f97c8b019171b7d06c9f4a4b85decbf641bd03a76701b7d27520db62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\"
:{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c85f029538174c7a4516d507fc84620e018838ff13e0a950cb25e184da8cd51f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85f029538174c7a4516d507fc84620e018838ff13e0a950cb25e184da8cd51f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.580929 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.590850 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.612244 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf5bd36-b46b-4a06-8291-cae9f3988330\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3f7e5606fe334a1c2d8901263d401f8fa2a441992646852386a77967eaa84a0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T21:28:59Z\\\",\\\"message\\\":\\\"oup\\\\\\\"}}}\\\\nI0219 21:28:59.338551 6443 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI0219 21:28:59.338560 6443 services_controller.go:452] Built service openshift-config-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0219 21:28:59.338574 6443 
services_controller.go:453] Built service openshift-config-operator/metrics template LB for network=default: []services.LB{}\\\\nF0219 21:28:59.338577 6443 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z]\\\\nI0219 21:28:59.338588 6443 services_controller.go:454] Service 
open\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:29:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\
\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready
\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4qphl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.626272 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e
901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.636559 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45f7fe05-a5f0-4296-80f5-9c8cd59afb95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ae61d10d79a82b8d2e11aee70d7bb13c429feaa4b7ce1a7746cadab1169d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8704324a3c5bb3f3023c0b000b233d650dede0825930494b987121784c094bdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b2ef22256f35fadefb61f59a542aa0df8bc5648285e2624b924391b80766c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad6763b33a3d85ce3446b4056d2703b9f342137b6a4aa23a3ae9870df8d0c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.665508 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.665552 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.665569 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.665590 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.665605 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:29Z","lastTransitionTime":"2026-02-19T21:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.673081 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.695049 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"ipta
bles-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.704786 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jvnv5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://558f4c8ca1cd10a6c978aa69d9df061716040e6eb4b31c34bb3b69008bae2c7c\\\",\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89q2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jvnv5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.714610 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ff4bs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b1b4346-e02e-4614-b2ff-e4628046a92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rw77f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rw77f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ff4bs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:29 crc 
kubenswrapper[4795]: I0219 21:29:29.732179 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"405c08bd-3feb-45df-993f-2cb5e737cb6a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9415ce5a9989f2b286fff8f33781b5f1c510b61ef5e7af7050ed7f1a322dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://8fe281962bb5076796c2b27f81ffe63c9310bf49f5989714cbe2fc0fe5b6f32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://689f35bbe7bc51411e25a7a6e4bce91b45d819ed6697ed322c4d5b4b3a244451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2cdc3d1e391f142caa70f86039fc213117c5a49f6e39eb770867385d781eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cd220ea624daba9c66de834e5fb2b0dd067d489fa005bacca4280b010e2ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.743017 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.752868 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-blzsk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9643227-37ca-4e4a-b9bc-371b18d67edc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6b76680f2d3d45237abba3bfaeadf20719b560077c08f624ca2fbe4cc1af865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmtf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-blzsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.768458 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.768532 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.768560 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.768593 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.768617 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:29Z","lastTransitionTime":"2026-02-19T21:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.768673 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47780602108de299051f362241c1b15d1d233ab44173752ba23240ff2aad0af7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l29c7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.781479 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5p6d9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e967392b-9bd8-4111-b1b9-96d503a19668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02e3717583e910ca7356ee58e22d63f40252488fb8540be7a30ab0fc918b908b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7
eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab48852eb20d566b09c7adda6c6afee64240486d91e3788538ed9b47fbcae59d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T21:29:20Z\\\",\\\"message\\\":\\\"2026-02-19T21:28:35+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8c80439f-0348-449a-8009-7180f0a7269a\\\\n2026-02-19T21:28:35+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8c80439f-0348-449a-8009-7180f0a7269a to /host/opt/cni/bin/\\\\n2026-02-19T21:28:35Z [verbose] multus-daemon started\\\\n2026-02-19T21:28:35Z [verbose] Readiness Indicator file check\\\\n2026-02-19T21:29:20Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:29:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-
lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qx6hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5p6d9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.871144 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.871191 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.871202 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.871215 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.871225 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:29Z","lastTransitionTime":"2026-02-19T21:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.881391 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4qphl_adf5bd36-b46b-4a06-8291-cae9f3988330/ovnkube-controller/3.log" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.882214 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4qphl_adf5bd36-b46b-4a06-8291-cae9f3988330/ovnkube-controller/2.log" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.884911 4795 generic.go:334] "Generic (PLEG): container finished" podID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerID="b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d" exitCode=1 Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.884957 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" event={"ID":"adf5bd36-b46b-4a06-8291-cae9f3988330","Type":"ContainerDied","Data":"b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d"} Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.885001 4795 scope.go:117] "RemoveContainer" containerID="b3f7e5606fe334a1c2d8901263d401f8fa2a441992646852386a77967eaa84a0" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.885659 4795 scope.go:117] 
"RemoveContainer" containerID="b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d" Feb 19 21:29:29 crc kubenswrapper[4795]: E0219 21:29:29.885880 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-4qphl_openshift-ovn-kubernetes(adf5bd36-b46b-4a06-8291-cae9f3988330)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.901093 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\
\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.910712 4795 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7591bc58-96f5-486a-8653-0ad93938b019\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9964359f056111e41a0c304a0100fea26514cdc34f3e64f376a682913d6742f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bbc1f90994253aba517d1de97087dbd617934343fdb8389588863bf1305b72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxj5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.924423 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"538a8ea5-cec4-4706-8187-425879d18d4e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7547886d43376602238ac541e375478233fda6564afeb045cb84192812dd7136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd41d9aec7be2e2d5e5ec8b06259364f0b7a1bf4fadbd8fe8abb8ff625cd6072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd942d9f97c8b019171b7d06c9f4a4b85decbf641bd03a76701b7d27520db62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c85f029538174c7a4516d507fc84620e018838ff13e0a950cb25e184da8cd51f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://c85f029538174c7a4516d507fc84620e018838ff13e0a950cb25e184da8cd51f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.939545 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.953424 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.973269 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf5bd36-b46b-4a06-8291-cae9f3988330\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3f7e5606fe334a1c2d8901263d401f8fa2a441992646852386a77967eaa84a0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T21:28:59Z\\\",\\\"message\\\":\\\"oup\\\\\\\"}}}\\\\nI0219 21:28:59.338551 6443 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI0219 21:28:59.338560 6443 services_controller.go:452] Built service openshift-config-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0219 21:28:59.338574 6443 
services_controller.go:453] Built service openshift-config-operator/metrics template LB for network=default: []services.LB{}\\\\nF0219 21:28:59.338577 6443 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:28:59Z is after 2025-08-24T17:21:41Z]\\\\nI0219 21:28:59.338588 6443 services_controller.go:454] Service open\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T21:29:29Z\\\",\\\"message\\\":\\\"dding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-l29c7\\\\nI0219 21:29:29.388497 6838 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-l29c7 in node crc\\\\nI0219 21:29:29.388507 6838 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-additional-cni-plugins-l29c7 after 0 failed attempt(s)\\\\nI0219 21:29:29.388515 6838 default_network_controller.go:776] Recording success event on pod 
openshift-multus/multus-additional-cni-plugins-l29c7\\\\nI0219 21:29:29.388436 6838 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI0219 21:29:29.388402 6838 base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nI0219 21:29:29.388466 6838 ovn.go:134] Ensuring zone local for Pod openshift-image-registry/node-ca-jvnv5 in node crc\\\\nI0219 21:29:29.388540 6838 obj_retry.go:386] Retry successful for *v1.Pod openshift-image-registry/node-ca-jvnv5 after 0 failed attempt(s)\\\\nI0219 21:29:29.388544 6838 default_network_controller.go:776] Recording success event on pod openshift-image-registry/node-ca-jvnv5\\\\nI0219 21:29:29.388279 6838 obj_retry.go:303] R\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:29:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cn
i/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4qphl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.973444 4795 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.973465 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.973474 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.973487 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.973496 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:29Z","lastTransitionTime":"2026-02-19T21:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.985260 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfpbz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67ec5415-424e-40b7-9beb-171cd1f3dbe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://230058a44868b325845b4045f4c763a9a9dbc95cfdf8fe94d86cee2799e9b1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmblv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce91447a4948ba71025642de9bcde76acb613b1a110eab19b334f244114636f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmblv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nfpbz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:29 crc kubenswrapper[4795]: I0219 21:29:29.998506 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f781
4a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T
21:28:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] 
MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6
859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:29Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.013726 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45f7fe05-a5f0-4296-80f5-9c8cd59afb95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ae61d10d79a82b8d2e11aee70d7bb13c429feaa4b7ce1a7746cadab1169d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8704324a3c5bb3f3023c0b000b233d650dede0825930494b987121784c094bdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b2ef22256f35fadefb61f59a542aa0df8bc5648285e2624b924391b80766c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad6763b33a3d85ce3446b4056d2703b9f342137b6a4aa23a3ae9870df8d0c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:30Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.025740 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:30Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.037383 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:30Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.058603 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"405c08bd-3feb-45df-993f-2cb5e737cb6a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9415ce5a9989f2b286fff8f33781b5f1c510b61ef5e7af7050ed7f1a322dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fe281962bb5076796c2b27f81ffe63c9310bf49f5989714cbe2fc0fe5b6f32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://689f35bbe7bc51411e25a7a6e4bce91b45d819ed6697ed322c4d5b4b3a244451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2cdc3d1e391f142caa70f86039fc213117c5a49f6e39eb770867385d781eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cd220ea624daba9c66de834e5fb2b0dd067d489fa005bacca4280b010e2ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff
98006fe6c09924349ac2b99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:30Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.069070 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:30Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.075419 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.075452 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:30 crc 
kubenswrapper[4795]: I0219 21:29:30.075462 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.075494 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.075503 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:30Z","lastTransitionTime":"2026-02-19T21:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.077815 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-blzsk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9643227-37ca-4e4a-b9bc-371b18d67edc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6b76680f2d3d45237abba3bfaeadf20719b560077c08f624ca2fbe4cc1af865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmtf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-blzsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:30Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.090828 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47780602108de299051f362241c1b15d1d233ab44173752ba23240ff2aad0af7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade50328702
63b86e7109a5844eae140cb7acdac094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l29c7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:30Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.101691 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5p6d9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e967392b-9bd8-4111-b1b9-96d503a19668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02e3717583e910ca7356ee58e22d63f40252488fb8540be7a30ab0fc918b908b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab48852eb20d566b09c7adda6c6afee64240486d91e3788538ed9b47fbcae59d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T21:29:20Z\\\",\\\"message\\\":\\\"2026-02-19T21:28:35+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8c80439f-0348-449a-8009-7180f0a7269a\\\\n2026-02-19T21:28:35+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8c80439f-0348-449a-8009-7180f0a7269a to /host/opt/cni/bin/\\\\n2026-02-19T21:28:35Z [verbose] multus-daemon started\\\\n2026-02-19T21:28:35Z [verbose] 
Readiness Indicator file check\\\\n2026-02-19T21:29:20Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:29:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qx6hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5p6d9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:30Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.109966 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jvnv5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://558f4c8ca1cd10a6c
978aa69d9df061716040e6eb4b31c34bb3b69008bae2c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89q2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jvnv5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:30Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.120829 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ff4bs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b1b4346-e02e-4614-b2ff-e4628046a92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rw77f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rw77f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ff4bs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:30Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:30 crc 
kubenswrapper[4795]: I0219 21:29:30.178293 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.178611 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.178749 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.178903 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.179054 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:30Z","lastTransitionTime":"2026-02-19T21:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.281447 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.281491 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.281500 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.281513 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.281522 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:30Z","lastTransitionTime":"2026-02-19T21:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.383694 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.383762 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.383784 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.383811 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.383833 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:30Z","lastTransitionTime":"2026-02-19T21:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.486406 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.486727 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.486907 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.487124 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.487334 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:30Z","lastTransitionTime":"2026-02-19T21:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.510871 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.510980 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.510983 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:29:30 crc kubenswrapper[4795]: E0219 21:29:30.511690 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:29:30 crc kubenswrapper[4795]: E0219 21:29:30.511803 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:29:30 crc kubenswrapper[4795]: E0219 21:29:30.511923 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.529488 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 04:08:42.738903615 +0000 UTC Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.590051 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.590095 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.590108 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.590126 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.590138 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:30Z","lastTransitionTime":"2026-02-19T21:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.692805 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.692862 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.692879 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.692904 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.692922 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:30Z","lastTransitionTime":"2026-02-19T21:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.795095 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.795128 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.795139 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.795153 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.795182 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:30Z","lastTransitionTime":"2026-02-19T21:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.889415 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4qphl_adf5bd36-b46b-4a06-8291-cae9f3988330/ovnkube-controller/3.log" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.892915 4795 scope.go:117] "RemoveContainer" containerID="b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d" Feb 19 21:29:30 crc kubenswrapper[4795]: E0219 21:29:30.893081 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-4qphl_openshift-ovn-kubernetes(adf5bd36-b46b-4a06-8291-cae9f3988330)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.896727 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.896747 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.896755 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.896767 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.896775 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:30Z","lastTransitionTime":"2026-02-19T21:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.909754 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfpbz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67ec5415-424e-40b7-9beb-171cd1f3dbe9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://230058a44868b325845b4045f4c763a9a9dbc95cfdf8fe94d86cee2799e9b1b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmblv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce91447a4948ba71025642de9bcde76acb613b1a110eab19b334f244114636f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmblv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nfpbz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:30Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.923313 4795 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"538a8ea5-cec4-4706-8187-425879d18d4e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7547886d43376602238ac541e375478233fda6564afeb045cb84192812dd7136\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd41d9aec7be2e2d5e5ec8b06259364f0b7a1bf4fadbd8fe8abb8ff625cd6072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd
8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fd942d9f97c8b019171b7d06c9f4a4b85decbf641bd03a76701b7d27520db62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c85f029538174c7a4516d507fc84620e018838ff13e0a950cb25e184da8cd51f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c85f029538174c7a4516d507fc84620e018838ff13e0a950cb25e184da8cd51f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:30Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.936425 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:30Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.950803 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:30Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.973107 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf5bd36-b46b-4a06-8291-cae9f3988330\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T21:29:29Z\\\",\\\"message\\\":\\\"dding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-l29c7\\\\nI0219 21:29:29.388497 6838 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-l29c7 in node crc\\\\nI0219 21:29:29.388507 6838 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-additional-cni-plugins-l29c7 
after 0 failed attempt(s)\\\\nI0219 21:29:29.388515 6838 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-additional-cni-plugins-l29c7\\\\nI0219 21:29:29.388436 6838 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI0219 21:29:29.388402 6838 base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nI0219 21:29:29.388466 6838 ovn.go:134] Ensuring zone local for Pod openshift-image-registry/node-ca-jvnv5 in node crc\\\\nI0219 21:29:29.388540 6838 obj_retry.go:386] Retry successful for *v1.Pod openshift-image-registry/node-ca-jvnv5 after 0 failed attempt(s)\\\\nI0219 21:29:29.388544 6838 default_network_controller.go:776] Recording success event on pod openshift-image-registry/node-ca-jvnv5\\\\nI0219 21:29:29.388279 6838 obj_retry.go:303] R\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:29:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4qphl_openshift-ovn-kubernetes(adf5bd36-b46b-4a06-8291-cae9f3988330)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdba0068125036886d
c7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nrjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4qphl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:30Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.989816 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 21:28:28.530865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e
901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:30Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.998593 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.998634 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.998647 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.998665 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:30 crc kubenswrapper[4795]: I0219 21:29:30.998680 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:30Z","lastTransitionTime":"2026-02-19T21:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.006350 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45f7fe05-a5f0-4296-80f5-9c8cd59afb95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ae61d10d79a82b8d2e11aee70d7bb13c429feaa4b7ce1a7746cadab1169d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8704324a3c5
bb3f3023c0b000b233d650dede0825930494b987121784c094bdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b2ef22256f35fadefb61f59a542aa0df8bc5648285e2624b924391b80766c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad6763b33a3d85ce3446b4056d2703b9f342137b6a4aa23a3ae9870df8d0c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:31Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.023411 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63eeceac048a3aa1ff9551ffba02cda05b9e08b57f78bbbcc368b04462386bb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:31Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.037350 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b239675e1392344b3b1fa6b12caea0851212adb7e3e28178545804fa4f61e4c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:31Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.048431 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jvnv5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93ec0ce2-79c0-42a4-88e2-71065ec8ff9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://558f4c8ca1cd10a6c978aa69d9df061716040e6eb4b31c34bb3b69008bae2c7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"n
ode-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-89q2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jvnv5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:31Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.066630 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ff4bs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b1b4346-e02e-4614-b2ff-e4628046a92f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rw77f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rw77f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ff4bs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:31Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:31 crc 
kubenswrapper[4795]: I0219 21:29:31.095622 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"405c08bd-3feb-45df-993f-2cb5e737cb6a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9415ce5a9989f2b286fff8f33781b5f1c510b61ef5e7af7050ed7f1a322dc6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://8fe281962bb5076796c2b27f81ffe63c9310bf49f5989714cbe2fc0fe5b6f32d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://689f35bbe7bc51411e25a7a6e4bce91b45d819ed6697ed322c4d5b4b3a244451\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2cdc3d1e391f142caa70f86039fc213117c5a49f6e39eb770867385d781eca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cd220ea624daba9c66de834e5fb2b0dd067d489fa005bacca4280b010e2ef7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://567145a211099c88276a4f1d5a32abfbd7826bff98006fe6c09924349ac2b99f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb46f6bb20905b6950c3ac6f7a5430c69a5fb18d311f2b44ac411843dc091b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b7c4a8d46693f4eb998c15f26f008b750495433ab22261ce77b00beeefd3fc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:31Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.105418 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.105456 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.105472 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.105516 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.105530 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:31Z","lastTransitionTime":"2026-02-19T21:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.115687 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:31Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.131423 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-blzsk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9643227-37ca-4e4a-b9bc-371b18d67edc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6b76680f2d3d45237abba3bfaeadf20719b560077c08f624ca2fbe4cc1af865\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmtf8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-blzsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:31Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.147640 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-l29c7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ca8b5d-3bc5-40de-8217-3ba5dd7724b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47780602108de299051f362241c1b15d1d233ab44173752ba23240ff2aad0af7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40e292afd4826a76af82d83725ea35a4f1d054f557b50afba41642065500f8a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30cf5f90bb084f7e56c134dcf40e490158fcef7475c3245d37541899add3223e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7fc880209eaceb11ccd1b7227c30f72e84b3d3451a1bace24ca8bbd97c35561\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f675454a9299b77e4a088ca80882e39a740a411f973679a25a6f5fbe0330c8bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade50328702
63b86e7109a5844eae140cb7acdac094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://649bcc41f6d05767ecc0aade5032870263b86e7109a5844eae140cb7acdac094\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01538a8d9ff9ffaa050d68dbc632cce9cfa99784bc6dfe56e649b5ed6ecc3d72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-19T21:28:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f4pzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-l29c7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:31Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.167030 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5p6d9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e967392b-9bd8-4111-b1b9-96d503a19668\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02e3717583e910ca7356ee58e22d63f40252488fb8540be7a30ab0fc918b908b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab48852eb20d566b09c7adda6c6afee64240486d91e3788538ed9b47fbcae59d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T21:29:20Z\\\",\\\"message\\\":\\\"2026-02-19T21:28:35+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8c80439f-0348-449a-8009-7180f0a7269a\\\\n2026-02-19T21:28:35+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8c80439f-0348-449a-8009-7180f0a7269a to /host/opt/cni/bin/\\\\n2026-02-19T21:28:35Z [verbose] multus-daemon started\\\\n2026-02-19T21:28:35Z [verbose] 
Readiness Indicator file check\\\\n2026-02-19T21:29:20Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:29:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qx6hj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5p6d9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:31Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.188498 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3f806cd585c75d6301715b64d4be95c86629169c43691f45c2d03bf9abe4f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7da030f8f5566b94b6e8a2b2a779890a7869d7c498ae11b6a880e7218faff697\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:31Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.202998 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7591bc58-96f5-486a-8653-0ad93938b019\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9964359f056111e41a0c304a0100fea26514cdc34f3e64f376a682913d6742f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6bbc1f90994253aba517d1de97087dbd61793434
3fdb8389588863bf1305b72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wwrxf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:33Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fxj5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:31Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.207916 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.207975 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.207995 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:31 crc 
kubenswrapper[4795]: I0219 21:29:31.208020 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.208037 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:31Z","lastTransitionTime":"2026-02-19T21:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.311492 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.311983 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.312079 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.312188 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.312283 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:31Z","lastTransitionTime":"2026-02-19T21:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.415182 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.415224 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.415234 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.415248 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.415258 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:31Z","lastTransitionTime":"2026-02-19T21:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.511556 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:29:31 crc kubenswrapper[4795]: E0219 21:29:31.511874 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.516843 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.516895 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.516914 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.516934 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.516970 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:31Z","lastTransitionTime":"2026-02-19T21:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.526716 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.529665 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 22:33:06.832992682 +0000 UTC Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.619874 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.619929 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.619946 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.619970 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.619989 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:31Z","lastTransitionTime":"2026-02-19T21:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.722749 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.722795 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.722806 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.722819 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.722829 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:31Z","lastTransitionTime":"2026-02-19T21:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.825380 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.825429 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.825448 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.825473 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.825491 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:31Z","lastTransitionTime":"2026-02-19T21:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.928028 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.928096 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.928120 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.928153 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:31 crc kubenswrapper[4795]: I0219 21:29:31.928210 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:31Z","lastTransitionTime":"2026-02-19T21:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.032785 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.032828 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.032840 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.032856 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.032868 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:32Z","lastTransitionTime":"2026-02-19T21:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.136322 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.136655 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.136843 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.137010 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.137128 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:32Z","lastTransitionTime":"2026-02-19T21:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.240942 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.240999 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.241017 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.241037 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.241049 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:32Z","lastTransitionTime":"2026-02-19T21:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.343813 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.343866 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.343879 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.343897 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.343911 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:32Z","lastTransitionTime":"2026-02-19T21:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.381496 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:29:32 crc kubenswrapper[4795]: E0219 21:29:32.381922 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-19 21:30:36.381887344 +0000 UTC m=+147.574405248 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.447278 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.447326 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.447337 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.447355 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.447367 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:32Z","lastTransitionTime":"2026-02-19T21:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.482348 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.482425 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.482474 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:29:32 crc kubenswrapper[4795]: E0219 21:29:32.482543 4795 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 21:29:32 crc kubenswrapper[4795]: E0219 21:29:32.482630 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 21:30:36.482612434 +0000 UTC m=+147.675130298 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.482552 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:29:32 crc kubenswrapper[4795]: E0219 21:29:32.482685 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 21:29:32 crc kubenswrapper[4795]: E0219 21:29:32.482692 4795 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 21:29:32 crc kubenswrapper[4795]: E0219 21:29:32.482713 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 21:29:32 crc kubenswrapper[4795]: E0219 21:29:32.482737 4795 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:29:32 crc kubenswrapper[4795]: E0219 21:29:32.482782 4795 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 21:30:36.482762068 +0000 UTC m=+147.675279952 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 21:29:32 crc kubenswrapper[4795]: E0219 21:29:32.482805 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-19 21:30:36.482796459 +0000 UTC m=+147.675314333 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:29:32 crc kubenswrapper[4795]: E0219 21:29:32.482860 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 21:29:32 crc kubenswrapper[4795]: E0219 21:29:32.482885 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 21:29:32 crc kubenswrapper[4795]: E0219 21:29:32.482900 4795 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:29:32 crc kubenswrapper[4795]: E0219 21:29:32.482969 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-19 21:30:36.482948864 +0000 UTC m=+147.675466728 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.511185 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:29:32 crc kubenswrapper[4795]: E0219 21:29:32.511299 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.511356 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.511364 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:29:32 crc kubenswrapper[4795]: E0219 21:29:32.511578 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:29:32 crc kubenswrapper[4795]: E0219 21:29:32.511665 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.530416 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 12:48:54.001068845 +0000 UTC Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.548953 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.549109 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.549187 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.549264 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.549330 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:32Z","lastTransitionTime":"2026-02-19T21:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.651638 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.651681 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.651692 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.651710 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.651721 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:32Z","lastTransitionTime":"2026-02-19T21:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.754011 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.754319 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.754472 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.754633 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.754765 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:32Z","lastTransitionTime":"2026-02-19T21:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.858509 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.858548 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.858559 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.858576 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.858587 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:32Z","lastTransitionTime":"2026-02-19T21:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.961467 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.961530 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.961550 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.961576 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:32 crc kubenswrapper[4795]: I0219 21:29:32.961595 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:32Z","lastTransitionTime":"2026-02-19T21:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.064870 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.064925 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.064942 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.064966 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.064983 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:33Z","lastTransitionTime":"2026-02-19T21:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.168461 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.168520 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.168536 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.168562 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.168580 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:33Z","lastTransitionTime":"2026-02-19T21:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.271535 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.271631 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.271643 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.271661 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.271675 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:33Z","lastTransitionTime":"2026-02-19T21:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.374536 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.374626 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.374643 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.374669 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.374688 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:33Z","lastTransitionTime":"2026-02-19T21:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.477868 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.477934 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.477956 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.477983 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.478003 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:33Z","lastTransitionTime":"2026-02-19T21:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.510812 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:29:33 crc kubenswrapper[4795]: E0219 21:29:33.511067 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.531235 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 06:34:30.991199645 +0000 UTC Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.580804 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.580892 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.580916 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.580939 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.580956 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:33Z","lastTransitionTime":"2026-02-19T21:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.684537 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.684598 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.684615 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.684638 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.684655 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:33Z","lastTransitionTime":"2026-02-19T21:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.788112 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.788155 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.788209 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.788238 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.788256 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:33Z","lastTransitionTime":"2026-02-19T21:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.891722 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.891780 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.891800 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.891826 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.891843 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:33Z","lastTransitionTime":"2026-02-19T21:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.994989 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.995067 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.995088 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.995119 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:33 crc kubenswrapper[4795]: I0219 21:29:33.995142 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:33Z","lastTransitionTime":"2026-02-19T21:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.098601 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.098660 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.098677 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.098703 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.098720 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:34Z","lastTransitionTime":"2026-02-19T21:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.202568 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.202632 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.202648 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.202670 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.202687 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:34Z","lastTransitionTime":"2026-02-19T21:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.305775 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.305832 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.305849 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.305873 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.305889 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:34Z","lastTransitionTime":"2026-02-19T21:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.408431 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.408497 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.408514 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.408539 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.408557 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:34Z","lastTransitionTime":"2026-02-19T21:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.510553 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.510681 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:29:34 crc kubenswrapper[4795]: E0219 21:29:34.510876 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.510992 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:29:34 crc kubenswrapper[4795]: E0219 21:29:34.511091 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:29:34 crc kubenswrapper[4795]: E0219 21:29:34.511143 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.512366 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.512433 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.512454 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.512478 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.512495 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:34Z","lastTransitionTime":"2026-02-19T21:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.531903 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 17:36:52.018493243 +0000 UTC Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.615420 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.615488 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.615510 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.615537 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.615553 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:34Z","lastTransitionTime":"2026-02-19T21:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.718285 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.718352 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.718369 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.718396 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.718415 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:34Z","lastTransitionTime":"2026-02-19T21:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.821696 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.821758 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.821776 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.821801 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.821820 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:34Z","lastTransitionTime":"2026-02-19T21:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.924623 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.924673 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.924685 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.924704 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:34 crc kubenswrapper[4795]: I0219 21:29:34.924721 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:34Z","lastTransitionTime":"2026-02-19T21:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.028645 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.028694 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.028707 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.028724 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.028736 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:35Z","lastTransitionTime":"2026-02-19T21:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.131790 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.131857 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.131873 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.131898 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.131916 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:35Z","lastTransitionTime":"2026-02-19T21:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.235308 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.235381 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.235398 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.235424 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.235444 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:35Z","lastTransitionTime":"2026-02-19T21:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.339159 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.339249 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.339266 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.339290 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.339307 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:35Z","lastTransitionTime":"2026-02-19T21:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.442058 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.442113 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.442130 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.442155 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.442208 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:35Z","lastTransitionTime":"2026-02-19T21:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.511500 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:29:35 crc kubenswrapper[4795]: E0219 21:29:35.511697 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.532831 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 20:07:13.390109887 +0000 UTC Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.545758 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.545843 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.545869 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.545902 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.545979 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:35Z","lastTransitionTime":"2026-02-19T21:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.649148 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.649278 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.649300 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.649322 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.649341 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:35Z","lastTransitionTime":"2026-02-19T21:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.751772 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.751835 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.751846 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.751860 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.751871 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:35Z","lastTransitionTime":"2026-02-19T21:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.854863 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.854918 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.854928 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.854944 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.854953 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:35Z","lastTransitionTime":"2026-02-19T21:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.957557 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.957599 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.957623 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.957638 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:35 crc kubenswrapper[4795]: I0219 21:29:35.957649 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:35Z","lastTransitionTime":"2026-02-19T21:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.060545 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.060608 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.060625 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.060649 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.060666 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:36Z","lastTransitionTime":"2026-02-19T21:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.163451 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.163479 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.163487 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.163500 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.163508 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:36Z","lastTransitionTime":"2026-02-19T21:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.262422 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.262480 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.262498 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.262524 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.262547 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:36Z","lastTransitionTime":"2026-02-19T21:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:36 crc kubenswrapper[4795]: E0219 21:29:36.279961 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0a3e1a0-f657-4990-877d-cc9b59bcb3d8\\\",\\\"systemUUID\\\":\\\"5c733625-a853-45dd-88a0-4f8c78e571ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:36Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.283647 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.283927 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.284133 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.284420 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.284653 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:36Z","lastTransitionTime":"2026-02-19T21:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:36 crc kubenswrapper[4795]: E0219 21:29:36.298076 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0a3e1a0-f657-4990-877d-cc9b59bcb3d8\\\",\\\"systemUUID\\\":\\\"5c733625-a853-45dd-88a0-4f8c78e571ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:36Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.302978 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.303331 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.303587 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.303761 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.303922 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:36Z","lastTransitionTime":"2026-02-19T21:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:36 crc kubenswrapper[4795]: E0219 21:29:36.318003 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0a3e1a0-f657-4990-877d-cc9b59bcb3d8\\\",\\\"systemUUID\\\":\\\"5c733625-a853-45dd-88a0-4f8c78e571ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:36Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.322551 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.322580 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.322591 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.322608 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.322620 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:36Z","lastTransitionTime":"2026-02-19T21:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:36 crc kubenswrapper[4795]: E0219 21:29:36.337925 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0a3e1a0-f657-4990-877d-cc9b59bcb3d8\\\",\\\"systemUUID\\\":\\\"5c733625-a853-45dd-88a0-4f8c78e571ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:36Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.342150 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.342240 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.342250 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.342265 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.342275 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:36Z","lastTransitionTime":"2026-02-19T21:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:36 crc kubenswrapper[4795]: E0219 21:29:36.355557 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T21:29:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T21:29:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0a3e1a0-f657-4990-877d-cc9b59bcb3d8\\\",\\\"systemUUID\\\":\\\"5c733625-a853-45dd-88a0-4f8c78e571ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:36Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:36 crc kubenswrapper[4795]: E0219 21:29:36.355734 4795 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.357513 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.357594 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.357611 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.357633 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.357649 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:36Z","lastTransitionTime":"2026-02-19T21:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.461161 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.461769 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.461912 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.462040 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.462209 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:36Z","lastTransitionTime":"2026-02-19T21:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.510614 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.510619 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.510715 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:29:36 crc kubenswrapper[4795]: E0219 21:29:36.510954 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:29:36 crc kubenswrapper[4795]: E0219 21:29:36.511196 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:29:36 crc kubenswrapper[4795]: E0219 21:29:36.511637 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.535244 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 01:03:59.930389504 +0000 UTC Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.565678 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.565730 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.565742 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.565759 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.565769 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:36Z","lastTransitionTime":"2026-02-19T21:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.669293 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.669363 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.669381 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.669407 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.669429 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:36Z","lastTransitionTime":"2026-02-19T21:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.771937 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.771976 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.771984 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.772000 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.772009 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:36Z","lastTransitionTime":"2026-02-19T21:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.874971 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.875019 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.875030 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.875047 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.875059 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:36Z","lastTransitionTime":"2026-02-19T21:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.977597 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.977645 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.977657 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.977679 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:36 crc kubenswrapper[4795]: I0219 21:29:36.977692 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:36Z","lastTransitionTime":"2026-02-19T21:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.079987 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.080045 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.080060 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.080081 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.080097 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:37Z","lastTransitionTime":"2026-02-19T21:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.183106 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.183382 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.183460 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.183535 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.183596 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:37Z","lastTransitionTime":"2026-02-19T21:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.287420 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.287475 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.287528 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.287554 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.287604 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:37Z","lastTransitionTime":"2026-02-19T21:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.390716 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.390754 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.390765 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.390785 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.390796 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:37Z","lastTransitionTime":"2026-02-19T21:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.493458 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.493522 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.493541 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.493566 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.493583 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:37Z","lastTransitionTime":"2026-02-19T21:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.511445 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:29:37 crc kubenswrapper[4795]: E0219 21:29:37.511636 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.535927 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 02:17:45.367206751 +0000 UTC Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.596458 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.596509 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.596530 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.596556 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.596573 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:37Z","lastTransitionTime":"2026-02-19T21:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.699763 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.699809 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.699828 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.699853 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.699871 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:37Z","lastTransitionTime":"2026-02-19T21:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.801918 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.801998 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.802017 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.802062 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.802079 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:37Z","lastTransitionTime":"2026-02-19T21:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.904854 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.904921 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.904944 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.905008 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:37 crc kubenswrapper[4795]: I0219 21:29:37.905034 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:37Z","lastTransitionTime":"2026-02-19T21:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.008554 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.008615 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.008632 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.008657 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.008674 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:38Z","lastTransitionTime":"2026-02-19T21:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.112746 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.113133 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.113310 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.113447 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.113600 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:38Z","lastTransitionTime":"2026-02-19T21:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.217034 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.217136 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.217218 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.217266 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.217292 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:38Z","lastTransitionTime":"2026-02-19T21:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.321003 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.321066 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.321083 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.321112 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.321130 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:38Z","lastTransitionTime":"2026-02-19T21:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.423628 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.423687 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.423739 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.423765 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.423789 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:38Z","lastTransitionTime":"2026-02-19T21:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.510762 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.510821 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.510864 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:29:38 crc kubenswrapper[4795]: E0219 21:29:38.510956 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:29:38 crc kubenswrapper[4795]: E0219 21:29:38.511076 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:29:38 crc kubenswrapper[4795]: E0219 21:29:38.511183 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.526186 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.526232 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.526249 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.526272 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.526294 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:38Z","lastTransitionTime":"2026-02-19T21:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.536909 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 22:01:22.62978722 +0000 UTC Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.629928 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.629985 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.629998 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.630018 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.630030 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:38Z","lastTransitionTime":"2026-02-19T21:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.732579 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.732612 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.732624 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.732638 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.732648 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:38Z","lastTransitionTime":"2026-02-19T21:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.835580 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.835644 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.835662 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.835687 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.835705 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:38Z","lastTransitionTime":"2026-02-19T21:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.938455 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.938528 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.938546 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.938563 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:38 crc kubenswrapper[4795]: I0219 21:29:38.938607 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:38Z","lastTransitionTime":"2026-02-19T21:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.041546 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.041622 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.041639 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.041654 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.041665 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:39Z","lastTransitionTime":"2026-02-19T21:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.144994 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.145066 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.145087 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.145113 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.145130 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:39Z","lastTransitionTime":"2026-02-19T21:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.248093 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.248143 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.248155 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.248203 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.248216 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:39Z","lastTransitionTime":"2026-02-19T21:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.351031 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.351081 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.351110 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.351131 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.351144 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:39Z","lastTransitionTime":"2026-02-19T21:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.459382 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.459443 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.459466 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.459514 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.459537 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:39Z","lastTransitionTime":"2026-02-19T21:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.510979 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:29:39 crc kubenswrapper[4795]: E0219 21:29:39.511640 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.533361 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c2292-d8c1-42cd-9a68-25e9dee8a334\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath
\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-ap
iserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T21:28:28Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 21:28:23.063012 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 21:28:23.063787 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2397988729/tls.crt::/tmp/serving-cert-2397988729/tls.key\\\\\\\"\\\\nI0219 21:28:28.507843 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 21:28:28.514117 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 21:28:28.514142 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 21:28:28.514191 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 21:28:28.514210 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 21:28:28.530780 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 21:28:28.530820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 21:28:28.530844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 21:28:28.530857 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 
21:28:28.530865 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 21:28:28.530875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0219 21:28:28.531454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0219 21:28:28.531988 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T21:28:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:39Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.537081 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 21:08:06.320435583 +0000 UTC Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.549962 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45f7fe05-a5f0-4296-80f5-9c8cd59afb95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T21:28:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ae61d10d79a82b8d2e11aee70d7bb13c429feaa4b7ce1a7746cadab1169d96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8704324a3c5bb3f3023c0b000b233d650dede0825930494b987121784c094bdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44b2ef22256f35fadefb61f59a542aa0df8bc5648285e2624b924391b80766c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ad6763b33a3d85ce3446b4056d2703b9f342137b6a4aa23a3ae9870df8d0c29\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T21:28:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T21:28:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T21:29:39Z is after 2025-08-24T17:21:41Z" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.564648 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.564731 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.564744 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.564762 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.564775 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:39Z","lastTransitionTime":"2026-02-19T21:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.656473 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=68.656455011 podStartE2EDuration="1m8.656455011s" podCreationTimestamp="2026-02-19 21:28:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:29:39.655509475 +0000 UTC m=+90.848027329" watchObservedRunningTime="2026-02-19 21:29:39.656455011 +0000 UTC m=+90.848972875" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.667365 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.667388 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.667396 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.667408 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.667417 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:39Z","lastTransitionTime":"2026-02-19T21:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.696433 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-l29c7" podStartSLOduration=66.696410872 podStartE2EDuration="1m6.696410872s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:29:39.695243779 +0000 UTC m=+90.887761663" watchObservedRunningTime="2026-02-19 21:29:39.696410872 +0000 UTC m=+90.888928736" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.696842 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-blzsk" podStartSLOduration=66.696833774 podStartE2EDuration="1m6.696833774s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:29:39.678632659 +0000 UTC m=+90.871150593" watchObservedRunningTime="2026-02-19 21:29:39.696833774 +0000 UTC m=+90.889351648" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.709539 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-5p6d9" podStartSLOduration=66.709515242 podStartE2EDuration="1m6.709515242s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:29:39.707683991 +0000 UTC m=+90.900201875" watchObservedRunningTime="2026-02-19 21:29:39.709515242 +0000 UTC m=+90.902033136" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.718970 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-jvnv5" podStartSLOduration=66.718948899 podStartE2EDuration="1m6.718948899s" 
podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:29:39.718103625 +0000 UTC m=+90.910621499" watchObservedRunningTime="2026-02-19 21:29:39.718948899 +0000 UTC m=+90.911466773" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.749940 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podStartSLOduration=66.749922266 podStartE2EDuration="1m6.749922266s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:29:39.749817343 +0000 UTC m=+90.942335247" watchObservedRunningTime="2026-02-19 21:29:39.749922266 +0000 UTC m=+90.942440130" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.762841 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=8.76281736 podStartE2EDuration="8.76281736s" podCreationTimestamp="2026-02-19 21:29:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:29:39.761961466 +0000 UTC m=+90.954479330" watchObservedRunningTime="2026-02-19 21:29:39.76281736 +0000 UTC m=+90.955335264" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.770031 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.770228 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.770309 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.770387 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.770444 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:39Z","lastTransitionTime":"2026-02-19T21:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.779642 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=39.779618336 podStartE2EDuration="39.779618336s" podCreationTimestamp="2026-02-19 21:29:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:29:39.778441902 +0000 UTC m=+90.970959786" watchObservedRunningTime="2026-02-19 21:29:39.779618336 +0000 UTC m=+90.972136240" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.832983 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfpbz" podStartSLOduration=66.832963415 podStartE2EDuration="1m6.832963415s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:29:39.832039439 +0000 UTC m=+91.024557303" watchObservedRunningTime="2026-02-19 21:29:39.832963415 +0000 UTC m=+91.025481299" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.872680 4795 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.872915 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.872982 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.873055 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.873128 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:39Z","lastTransitionTime":"2026-02-19T21:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.975727 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.975774 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.975786 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.975806 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:39 crc kubenswrapper[4795]: I0219 21:29:39.975818 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:39Z","lastTransitionTime":"2026-02-19T21:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.077568 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.077796 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.077903 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.077984 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.078060 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:40Z","lastTransitionTime":"2026-02-19T21:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.180767 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.180794 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.180802 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.180814 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.180822 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:40Z","lastTransitionTime":"2026-02-19T21:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.283525 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.283863 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.284024 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.284192 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.284334 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:40Z","lastTransitionTime":"2026-02-19T21:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.386594 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.386964 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.387103 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.387285 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.387422 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:40Z","lastTransitionTime":"2026-02-19T21:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.490084 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.490291 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.490373 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.490449 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.490526 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:40Z","lastTransitionTime":"2026-02-19T21:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.511490 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.511526 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:29:40 crc kubenswrapper[4795]: E0219 21:29:40.511656 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:29:40 crc kubenswrapper[4795]: E0219 21:29:40.511759 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.511526 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:29:40 crc kubenswrapper[4795]: E0219 21:29:40.512260 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.537719 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 14:30:04.713145803 +0000 UTC Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.593717 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.593783 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.593804 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.593830 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.593850 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:40Z","lastTransitionTime":"2026-02-19T21:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.698031 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.698088 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.698105 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.698130 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.698147 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:40Z","lastTransitionTime":"2026-02-19T21:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.801152 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.801454 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.801521 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.801598 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.801675 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:40Z","lastTransitionTime":"2026-02-19T21:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.905301 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.905361 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.905379 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.905402 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:40 crc kubenswrapper[4795]: I0219 21:29:40.905420 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:40Z","lastTransitionTime":"2026-02-19T21:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.007551 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.007798 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.007865 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.007938 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.008000 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:41Z","lastTransitionTime":"2026-02-19T21:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.111133 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.111248 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.111273 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.111301 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.111323 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:41Z","lastTransitionTime":"2026-02-19T21:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.214490 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.214757 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.214825 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.214894 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.214953 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:41Z","lastTransitionTime":"2026-02-19T21:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.317629 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.317693 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.317716 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.317743 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.317765 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:41Z","lastTransitionTime":"2026-02-19T21:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.420158 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.420228 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.420237 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.420253 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.420262 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:41Z","lastTransitionTime":"2026-02-19T21:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.510924 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:29:41 crc kubenswrapper[4795]: E0219 21:29:41.511040 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.522871 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.522915 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.522949 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.522968 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.522979 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:41Z","lastTransitionTime":"2026-02-19T21:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.538759 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 16:47:44.756201663 +0000 UTC Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.625499 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.625541 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.625554 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.625573 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.625584 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:41Z","lastTransitionTime":"2026-02-19T21:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.728309 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.728353 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.728367 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.728381 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.728393 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:41Z","lastTransitionTime":"2026-02-19T21:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.831146 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.831209 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.831217 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.831232 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.831241 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:41Z","lastTransitionTime":"2026-02-19T21:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.933522 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.933600 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.933614 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.933631 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:41 crc kubenswrapper[4795]: I0219 21:29:41.933646 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:41Z","lastTransitionTime":"2026-02-19T21:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.036666 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.036723 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.036736 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.036760 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.036777 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:42Z","lastTransitionTime":"2026-02-19T21:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.139383 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.139442 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.139460 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.139488 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.139514 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:42Z","lastTransitionTime":"2026-02-19T21:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.243368 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.243426 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.243444 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.243467 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.243484 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:42Z","lastTransitionTime":"2026-02-19T21:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.346298 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.346382 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.346403 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.346428 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.346446 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:42Z","lastTransitionTime":"2026-02-19T21:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.448645 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.448679 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.448687 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.448701 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.448710 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:42Z","lastTransitionTime":"2026-02-19T21:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.511526 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.511905 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.512208 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:29:42 crc kubenswrapper[4795]: E0219 21:29:42.512312 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:29:42 crc kubenswrapper[4795]: E0219 21:29:42.512439 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:29:42 crc kubenswrapper[4795]: E0219 21:29:42.512541 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.512616 4795 scope.go:117] "RemoveContainer" containerID="b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d" Feb 19 21:29:42 crc kubenswrapper[4795]: E0219 21:29:42.512799 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-4qphl_openshift-ovn-kubernetes(adf5bd36-b46b-4a06-8291-cae9f3988330)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.539434 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 22:10:18.445984639 +0000 UTC Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.551458 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.551501 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.551518 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.551542 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.551558 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:42Z","lastTransitionTime":"2026-02-19T21:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.653526 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.653562 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.653571 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.653583 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.653592 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:42Z","lastTransitionTime":"2026-02-19T21:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.755206 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.755274 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.755284 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.755298 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.755307 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:42Z","lastTransitionTime":"2026-02-19T21:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.857328 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.857358 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.857368 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.857385 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.857397 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:42Z","lastTransitionTime":"2026-02-19T21:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.959697 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.959722 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.959731 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.959743 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:42 crc kubenswrapper[4795]: I0219 21:29:42.959752 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:42Z","lastTransitionTime":"2026-02-19T21:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.061895 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.061922 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.061931 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.061944 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.061952 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:43Z","lastTransitionTime":"2026-02-19T21:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.164207 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.164255 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.164268 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.164285 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.164302 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:43Z","lastTransitionTime":"2026-02-19T21:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.266469 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.266515 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.266526 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.266543 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.266554 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:43Z","lastTransitionTime":"2026-02-19T21:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.368813 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.368845 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.368855 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.368871 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.368881 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:43Z","lastTransitionTime":"2026-02-19T21:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.471098 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.471135 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.471143 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.471157 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.471185 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:43Z","lastTransitionTime":"2026-02-19T21:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.510847 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:29:43 crc kubenswrapper[4795]: E0219 21:29:43.510991 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.540679 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 22:56:32.84282829 +0000 UTC Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.573212 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.573233 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.573242 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.573255 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.573283 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:43Z","lastTransitionTime":"2026-02-19T21:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.675210 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.675447 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.675509 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.675574 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.675642 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:43Z","lastTransitionTime":"2026-02-19T21:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.777451 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.777516 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.777539 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.777573 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.777595 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:43Z","lastTransitionTime":"2026-02-19T21:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.880379 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.880413 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.880425 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.880441 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.880453 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:43Z","lastTransitionTime":"2026-02-19T21:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.983062 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.983124 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.983143 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.983218 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:43 crc kubenswrapper[4795]: I0219 21:29:43.983244 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:43Z","lastTransitionTime":"2026-02-19T21:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.086518 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.086565 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.086574 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.086588 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.086598 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:44Z","lastTransitionTime":"2026-02-19T21:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.189663 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.189710 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.189722 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.189741 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.189753 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:44Z","lastTransitionTime":"2026-02-19T21:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.292998 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.293073 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.293091 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.293114 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.293132 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:44Z","lastTransitionTime":"2026-02-19T21:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.396526 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.396588 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.396605 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.396629 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.396647 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:44Z","lastTransitionTime":"2026-02-19T21:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.499844 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.499874 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.499884 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.499900 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.499909 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:44Z","lastTransitionTime":"2026-02-19T21:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.511470 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.511551 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.511485 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:29:44 crc kubenswrapper[4795]: E0219 21:29:44.511708 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:29:44 crc kubenswrapper[4795]: E0219 21:29:44.511829 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:29:44 crc kubenswrapper[4795]: E0219 21:29:44.511967 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.541230 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 07:48:40.414171664 +0000 UTC Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.602435 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.602508 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.602526 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.602551 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.602570 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:44Z","lastTransitionTime":"2026-02-19T21:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.706203 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.706281 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.706303 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.706333 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.706355 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:44Z","lastTransitionTime":"2026-02-19T21:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.809664 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.810068 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.810328 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.810552 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.810756 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:44Z","lastTransitionTime":"2026-02-19T21:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.914449 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.914504 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.914522 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.914546 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:44 crc kubenswrapper[4795]: I0219 21:29:44.914563 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:44Z","lastTransitionTime":"2026-02-19T21:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.018357 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.018752 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.018950 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.019152 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.019407 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:45Z","lastTransitionTime":"2026-02-19T21:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.121665 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.121969 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.122088 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.122227 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.122344 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:45Z","lastTransitionTime":"2026-02-19T21:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.225315 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.225637 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.225819 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.225980 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.226144 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:45Z","lastTransitionTime":"2026-02-19T21:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.329037 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.329107 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.329127 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.329154 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.329221 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:45Z","lastTransitionTime":"2026-02-19T21:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.431375 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.431430 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.431446 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.431470 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.431487 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:45Z","lastTransitionTime":"2026-02-19T21:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.511068 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:29:45 crc kubenswrapper[4795]: E0219 21:29:45.511466 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.534020 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.534050 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.534058 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.534069 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.534078 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:45Z","lastTransitionTime":"2026-02-19T21:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.542812 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 08:10:57.798662672 +0000 UTC Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.636692 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.636726 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.636735 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.636749 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.636759 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:45Z","lastTransitionTime":"2026-02-19T21:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.738820 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.738858 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.738869 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.738884 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.738897 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:45Z","lastTransitionTime":"2026-02-19T21:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.841040 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.841100 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.841117 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.841141 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.841158 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:45Z","lastTransitionTime":"2026-02-19T21:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.943689 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.943903 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.943994 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.944055 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:45 crc kubenswrapper[4795]: I0219 21:29:45.944119 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:45Z","lastTransitionTime":"2026-02-19T21:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.047813 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.047861 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.047870 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.047886 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.047895 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:46Z","lastTransitionTime":"2026-02-19T21:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.150994 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.151348 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.151642 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.151780 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.151927 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:46Z","lastTransitionTime":"2026-02-19T21:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.254379 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.254416 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.254434 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.254452 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.254464 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:46Z","lastTransitionTime":"2026-02-19T21:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.357465 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.357540 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.357562 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.357589 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.357611 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:46Z","lastTransitionTime":"2026-02-19T21:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.461142 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.461272 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.461298 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.461329 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.461350 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:46Z","lastTransitionTime":"2026-02-19T21:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.510863 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.510902 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.510998 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:29:46 crc kubenswrapper[4795]: E0219 21:29:46.511061 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:29:46 crc kubenswrapper[4795]: E0219 21:29:46.511266 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:29:46 crc kubenswrapper[4795]: E0219 21:29:46.511450 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.544037 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 14:31:49.604200175 +0000 UTC Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.564415 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.564638 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.564700 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.564795 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.564889 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:46Z","lastTransitionTime":"2026-02-19T21:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.667726 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.667767 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.667774 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.667786 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.667796 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:46Z","lastTransitionTime":"2026-02-19T21:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.690224 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.690483 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.690822 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.691141 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.691489 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T21:29:46Z","lastTransitionTime":"2026-02-19T21:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.747679 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-c4gh6"] Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.748055 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c4gh6" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.750570 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.750575 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.750831 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.753231 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.773320 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=78.773305587 podStartE2EDuration="1m18.773305587s" podCreationTimestamp="2026-02-19 21:28:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:29:46.772686729 +0000 UTC m=+97.965204633" watchObservedRunningTime="2026-02-19 21:29:46.773305587 +0000 UTC m=+97.965823451" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.790124 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=71.790106442 podStartE2EDuration="1m11.790106442s" podCreationTimestamp="2026-02-19 21:28:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:29:46.789721231 +0000 UTC m=+97.982239115" watchObservedRunningTime="2026-02-19 21:29:46.790106442 +0000 UTC 
m=+97.982624306" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.838320 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8a207a66-b652-4d53-9424-48b3f88d4c93-service-ca\") pod \"cluster-version-operator-5c965bbfc6-c4gh6\" (UID: \"8a207a66-b652-4d53-9424-48b3f88d4c93\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c4gh6" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.838370 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8a207a66-b652-4d53-9424-48b3f88d4c93-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-c4gh6\" (UID: \"8a207a66-b652-4d53-9424-48b3f88d4c93\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c4gh6" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.838414 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/8a207a66-b652-4d53-9424-48b3f88d4c93-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-c4gh6\" (UID: \"8a207a66-b652-4d53-9424-48b3f88d4c93\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c4gh6" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.838471 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/8a207a66-b652-4d53-9424-48b3f88d4c93-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-c4gh6\" (UID: \"8a207a66-b652-4d53-9424-48b3f88d4c93\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c4gh6" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.838495 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a207a66-b652-4d53-9424-48b3f88d4c93-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-c4gh6\" (UID: \"8a207a66-b652-4d53-9424-48b3f88d4c93\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c4gh6" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.939036 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8a207a66-b652-4d53-9424-48b3f88d4c93-service-ca\") pod \"cluster-version-operator-5c965bbfc6-c4gh6\" (UID: \"8a207a66-b652-4d53-9424-48b3f88d4c93\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c4gh6" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.939082 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8a207a66-b652-4d53-9424-48b3f88d4c93-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-c4gh6\" (UID: \"8a207a66-b652-4d53-9424-48b3f88d4c93\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c4gh6" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.939118 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/8a207a66-b652-4d53-9424-48b3f88d4c93-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-c4gh6\" (UID: \"8a207a66-b652-4d53-9424-48b3f88d4c93\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c4gh6" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.939134 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/8a207a66-b652-4d53-9424-48b3f88d4c93-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-c4gh6\" (UID: \"8a207a66-b652-4d53-9424-48b3f88d4c93\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c4gh6" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.939150 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a207a66-b652-4d53-9424-48b3f88d4c93-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-c4gh6\" (UID: \"8a207a66-b652-4d53-9424-48b3f88d4c93\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c4gh6" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.939288 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/8a207a66-b652-4d53-9424-48b3f88d4c93-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-c4gh6\" (UID: \"8a207a66-b652-4d53-9424-48b3f88d4c93\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c4gh6" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.939337 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/8a207a66-b652-4d53-9424-48b3f88d4c93-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-c4gh6\" (UID: \"8a207a66-b652-4d53-9424-48b3f88d4c93\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c4gh6" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.939884 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8a207a66-b652-4d53-9424-48b3f88d4c93-service-ca\") pod \"cluster-version-operator-5c965bbfc6-c4gh6\" (UID: \"8a207a66-b652-4d53-9424-48b3f88d4c93\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c4gh6" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.947734 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8a207a66-b652-4d53-9424-48b3f88d4c93-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-c4gh6\" (UID: \"8a207a66-b652-4d53-9424-48b3f88d4c93\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c4gh6" Feb 19 21:29:46 crc kubenswrapper[4795]: I0219 21:29:46.957757 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8a207a66-b652-4d53-9424-48b3f88d4c93-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-c4gh6\" (UID: \"8a207a66-b652-4d53-9424-48b3f88d4c93\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c4gh6" Feb 19 21:29:47 crc kubenswrapper[4795]: I0219 21:29:47.068527 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c4gh6" Feb 19 21:29:47 crc kubenswrapper[4795]: W0219 21:29:47.088060 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a207a66_b652_4d53_9424_48b3f88d4c93.slice/crio-e34a32ae52ae70c6baaa514d5575954eeeee61e00f13acd641a6a1b2305f1382 WatchSource:0}: Error finding container e34a32ae52ae70c6baaa514d5575954eeeee61e00f13acd641a6a1b2305f1382: Status 404 returned error can't find the container with id e34a32ae52ae70c6baaa514d5575954eeeee61e00f13acd641a6a1b2305f1382 Feb 19 21:29:47 crc kubenswrapper[4795]: I0219 21:29:47.511775 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:29:47 crc kubenswrapper[4795]: E0219 21:29:47.511947 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:29:47 crc kubenswrapper[4795]: I0219 21:29:47.544717 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 09:24:55.411879249 +0000 UTC Feb 19 21:29:47 crc kubenswrapper[4795]: I0219 21:29:47.544814 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Feb 19 21:29:47 crc kubenswrapper[4795]: I0219 21:29:47.552879 4795 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 19 21:29:47 crc kubenswrapper[4795]: I0219 21:29:47.947589 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c4gh6" event={"ID":"8a207a66-b652-4d53-9424-48b3f88d4c93","Type":"ContainerStarted","Data":"fd72049883e0afb17c0fdce1f500ad100bfded9e38a3818a9f0a3e29a04b09c7"} Feb 19 21:29:47 crc kubenswrapper[4795]: I0219 21:29:47.947666 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c4gh6" event={"ID":"8a207a66-b652-4d53-9424-48b3f88d4c93","Type":"ContainerStarted","Data":"e34a32ae52ae70c6baaa514d5575954eeeee61e00f13acd641a6a1b2305f1382"} Feb 19 21:29:48 crc kubenswrapper[4795]: I0219 21:29:48.510815 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:29:48 crc kubenswrapper[4795]: E0219 21:29:48.511027 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:29:48 crc kubenswrapper[4795]: I0219 21:29:48.511333 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:29:48 crc kubenswrapper[4795]: I0219 21:29:48.511403 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:29:48 crc kubenswrapper[4795]: E0219 21:29:48.511532 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:29:48 crc kubenswrapper[4795]: E0219 21:29:48.511739 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:29:49 crc kubenswrapper[4795]: I0219 21:29:49.511030 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:29:49 crc kubenswrapper[4795]: E0219 21:29:49.511930 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:29:50 crc kubenswrapper[4795]: I0219 21:29:50.510985 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:29:50 crc kubenswrapper[4795]: I0219 21:29:50.511046 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:29:50 crc kubenswrapper[4795]: I0219 21:29:50.511058 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:29:50 crc kubenswrapper[4795]: E0219 21:29:50.511122 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:29:50 crc kubenswrapper[4795]: E0219 21:29:50.511243 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:29:50 crc kubenswrapper[4795]: E0219 21:29:50.511343 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:29:51 crc kubenswrapper[4795]: I0219 21:29:51.288357 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b1b4346-e02e-4614-b2ff-e4628046a92f-metrics-certs\") pod \"network-metrics-daemon-ff4bs\" (UID: \"1b1b4346-e02e-4614-b2ff-e4628046a92f\") " pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:29:51 crc kubenswrapper[4795]: E0219 21:29:51.288506 4795 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 21:29:51 crc kubenswrapper[4795]: E0219 21:29:51.288582 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b1b4346-e02e-4614-b2ff-e4628046a92f-metrics-certs podName:1b1b4346-e02e-4614-b2ff-e4628046a92f nodeName:}" failed. No retries permitted until 2026-02-19 21:30:55.28856317 +0000 UTC m=+166.481081044 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1b1b4346-e02e-4614-b2ff-e4628046a92f-metrics-certs") pod "network-metrics-daemon-ff4bs" (UID: "1b1b4346-e02e-4614-b2ff-e4628046a92f") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 21:29:51 crc kubenswrapper[4795]: I0219 21:29:51.510934 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:29:51 crc kubenswrapper[4795]: E0219 21:29:51.511095 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:29:52 crc kubenswrapper[4795]: I0219 21:29:52.510660 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:29:52 crc kubenswrapper[4795]: E0219 21:29:52.510791 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:29:52 crc kubenswrapper[4795]: I0219 21:29:52.510668 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:29:52 crc kubenswrapper[4795]: I0219 21:29:52.510660 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:29:52 crc kubenswrapper[4795]: E0219 21:29:52.510855 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:29:52 crc kubenswrapper[4795]: E0219 21:29:52.511023 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:29:53 crc kubenswrapper[4795]: I0219 21:29:53.510810 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:29:53 crc kubenswrapper[4795]: E0219 21:29:53.511110 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:29:54 crc kubenswrapper[4795]: I0219 21:29:54.510586 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:29:54 crc kubenswrapper[4795]: I0219 21:29:54.510652 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:29:54 crc kubenswrapper[4795]: I0219 21:29:54.510651 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:29:54 crc kubenswrapper[4795]: E0219 21:29:54.510795 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:29:54 crc kubenswrapper[4795]: E0219 21:29:54.510906 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:29:54 crc kubenswrapper[4795]: E0219 21:29:54.511035 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:29:55 crc kubenswrapper[4795]: I0219 21:29:55.511464 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:29:55 crc kubenswrapper[4795]: E0219 21:29:55.512215 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:29:55 crc kubenswrapper[4795]: I0219 21:29:55.512725 4795 scope.go:117] "RemoveContainer" containerID="b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d" Feb 19 21:29:55 crc kubenswrapper[4795]: E0219 21:29:55.513008 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-4qphl_openshift-ovn-kubernetes(adf5bd36-b46b-4a06-8291-cae9f3988330)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" Feb 19 21:29:56 crc kubenswrapper[4795]: I0219 21:29:56.511292 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:29:56 crc kubenswrapper[4795]: I0219 21:29:56.511344 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:29:56 crc kubenswrapper[4795]: I0219 21:29:56.511292 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:29:56 crc kubenswrapper[4795]: E0219 21:29:56.511559 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:29:56 crc kubenswrapper[4795]: E0219 21:29:56.511719 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:29:56 crc kubenswrapper[4795]: E0219 21:29:56.511842 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:29:57 crc kubenswrapper[4795]: I0219 21:29:57.511476 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:29:57 crc kubenswrapper[4795]: E0219 21:29:57.511615 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:29:58 crc kubenswrapper[4795]: I0219 21:29:58.511084 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:29:58 crc kubenswrapper[4795]: I0219 21:29:58.511640 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:29:58 crc kubenswrapper[4795]: E0219 21:29:58.511728 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:29:58 crc kubenswrapper[4795]: I0219 21:29:58.511900 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:29:58 crc kubenswrapper[4795]: E0219 21:29:58.511958 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:29:58 crc kubenswrapper[4795]: E0219 21:29:58.512062 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:29:59 crc kubenswrapper[4795]: I0219 21:29:59.511378 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:29:59 crc kubenswrapper[4795]: E0219 21:29:59.512775 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:30:00 crc kubenswrapper[4795]: I0219 21:30:00.511553 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:30:00 crc kubenswrapper[4795]: I0219 21:30:00.511552 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:30:00 crc kubenswrapper[4795]: I0219 21:30:00.511645 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:30:00 crc kubenswrapper[4795]: E0219 21:30:00.511885 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:30:00 crc kubenswrapper[4795]: E0219 21:30:00.512003 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:30:00 crc kubenswrapper[4795]: E0219 21:30:00.512676 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:30:01 crc kubenswrapper[4795]: I0219 21:30:01.511744 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:30:01 crc kubenswrapper[4795]: E0219 21:30:01.512321 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:30:02 crc kubenswrapper[4795]: I0219 21:30:02.511282 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:30:02 crc kubenswrapper[4795]: I0219 21:30:02.511339 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:30:02 crc kubenswrapper[4795]: E0219 21:30:02.511480 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:30:02 crc kubenswrapper[4795]: I0219 21:30:02.511351 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:30:02 crc kubenswrapper[4795]: E0219 21:30:02.511629 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:30:02 crc kubenswrapper[4795]: E0219 21:30:02.511671 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:30:03 crc kubenswrapper[4795]: I0219 21:30:03.511274 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:30:03 crc kubenswrapper[4795]: E0219 21:30:03.511452 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:30:04 crc kubenswrapper[4795]: I0219 21:30:04.511242 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:30:04 crc kubenswrapper[4795]: I0219 21:30:04.511339 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:30:04 crc kubenswrapper[4795]: E0219 21:30:04.511450 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:30:04 crc kubenswrapper[4795]: E0219 21:30:04.511715 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:30:04 crc kubenswrapper[4795]: I0219 21:30:04.512362 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:30:04 crc kubenswrapper[4795]: E0219 21:30:04.512541 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:30:05 crc kubenswrapper[4795]: I0219 21:30:05.511050 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:30:05 crc kubenswrapper[4795]: E0219 21:30:05.511253 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:30:06 crc kubenswrapper[4795]: I0219 21:30:06.511019 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:30:06 crc kubenswrapper[4795]: I0219 21:30:06.511128 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:30:06 crc kubenswrapper[4795]: I0219 21:30:06.511134 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:30:06 crc kubenswrapper[4795]: E0219 21:30:06.511438 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:30:06 crc kubenswrapper[4795]: E0219 21:30:06.511526 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:30:06 crc kubenswrapper[4795]: E0219 21:30:06.511616 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:30:07 crc kubenswrapper[4795]: I0219 21:30:07.004647 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5p6d9_e967392b-9bd8-4111-b1b9-96d503a19668/kube-multus/1.log" Feb 19 21:30:07 crc kubenswrapper[4795]: I0219 21:30:07.005691 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5p6d9_e967392b-9bd8-4111-b1b9-96d503a19668/kube-multus/0.log" Feb 19 21:30:07 crc kubenswrapper[4795]: I0219 21:30:07.005753 4795 generic.go:334] "Generic (PLEG): container finished" podID="e967392b-9bd8-4111-b1b9-96d503a19668" containerID="02e3717583e910ca7356ee58e22d63f40252488fb8540be7a30ab0fc918b908b" exitCode=1 Feb 19 21:30:07 crc kubenswrapper[4795]: I0219 21:30:07.005809 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5p6d9" event={"ID":"e967392b-9bd8-4111-b1b9-96d503a19668","Type":"ContainerDied","Data":"02e3717583e910ca7356ee58e22d63f40252488fb8540be7a30ab0fc918b908b"} Feb 19 21:30:07 crc kubenswrapper[4795]: I0219 21:30:07.005910 4795 scope.go:117] "RemoveContainer" containerID="ab48852eb20d566b09c7adda6c6afee64240486d91e3788538ed9b47fbcae59d" Feb 19 21:30:07 crc kubenswrapper[4795]: I0219 21:30:07.007217 4795 scope.go:117] "RemoveContainer" containerID="02e3717583e910ca7356ee58e22d63f40252488fb8540be7a30ab0fc918b908b" Feb 19 21:30:07 crc kubenswrapper[4795]: E0219 21:30:07.007658 
4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-5p6d9_openshift-multus(e967392b-9bd8-4111-b1b9-96d503a19668)\"" pod="openshift-multus/multus-5p6d9" podUID="e967392b-9bd8-4111-b1b9-96d503a19668" Feb 19 21:30:07 crc kubenswrapper[4795]: I0219 21:30:07.031303 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c4gh6" podStartSLOduration=94.031278446 podStartE2EDuration="1m34.031278446s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:29:47.967341117 +0000 UTC m=+99.159859011" watchObservedRunningTime="2026-02-19 21:30:07.031278446 +0000 UTC m=+118.223796340" Feb 19 21:30:07 crc kubenswrapper[4795]: I0219 21:30:07.511043 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:30:07 crc kubenswrapper[4795]: E0219 21:30:07.511211 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:30:08 crc kubenswrapper[4795]: I0219 21:30:08.010791 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5p6d9_e967392b-9bd8-4111-b1b9-96d503a19668/kube-multus/1.log" Feb 19 21:30:08 crc kubenswrapper[4795]: I0219 21:30:08.511383 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:30:08 crc kubenswrapper[4795]: I0219 21:30:08.511476 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:30:08 crc kubenswrapper[4795]: I0219 21:30:08.511396 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:30:08 crc kubenswrapper[4795]: E0219 21:30:08.511589 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:30:08 crc kubenswrapper[4795]: E0219 21:30:08.511719 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:30:08 crc kubenswrapper[4795]: E0219 21:30:08.511865 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:30:09 crc kubenswrapper[4795]: E0219 21:30:09.478727 4795 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Feb 19 21:30:09 crc kubenswrapper[4795]: I0219 21:30:09.511555 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:30:09 crc kubenswrapper[4795]: E0219 21:30:09.512508 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:30:09 crc kubenswrapper[4795]: E0219 21:30:09.596555 4795 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 19 21:30:10 crc kubenswrapper[4795]: I0219 21:30:10.511436 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:30:10 crc kubenswrapper[4795]: I0219 21:30:10.511466 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:30:10 crc kubenswrapper[4795]: I0219 21:30:10.511506 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:30:10 crc kubenswrapper[4795]: E0219 21:30:10.511763 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:30:10 crc kubenswrapper[4795]: E0219 21:30:10.511838 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:30:10 crc kubenswrapper[4795]: E0219 21:30:10.512007 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:30:10 crc kubenswrapper[4795]: I0219 21:30:10.513309 4795 scope.go:117] "RemoveContainer" containerID="b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d" Feb 19 21:30:11 crc kubenswrapper[4795]: I0219 21:30:11.022582 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4qphl_adf5bd36-b46b-4a06-8291-cae9f3988330/ovnkube-controller/3.log" Feb 19 21:30:11 crc kubenswrapper[4795]: I0219 21:30:11.026807 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" event={"ID":"adf5bd36-b46b-4a06-8291-cae9f3988330","Type":"ContainerStarted","Data":"3a55f14e05508c6035437477613aca541a57e877c0402dec1f23f8bee280ec81"} Feb 19 21:30:11 crc kubenswrapper[4795]: I0219 21:30:11.027291 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:30:11 crc kubenswrapper[4795]: I0219 21:30:11.068812 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" podStartSLOduration=98.068794743 podStartE2EDuration="1m38.068794743s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:11.067787985 +0000 UTC m=+122.260305849" watchObservedRunningTime="2026-02-19 21:30:11.068794743 +0000 UTC m=+122.261312607" Feb 19 21:30:11 crc kubenswrapper[4795]: I0219 21:30:11.414973 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-ff4bs"] Feb 19 21:30:11 crc kubenswrapper[4795]: I0219 21:30:11.415085 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:30:11 crc kubenswrapper[4795]: E0219 21:30:11.415189 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:30:12 crc kubenswrapper[4795]: I0219 21:30:12.511041 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:30:12 crc kubenswrapper[4795]: I0219 21:30:12.511067 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:30:12 crc kubenswrapper[4795]: I0219 21:30:12.511041 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:30:12 crc kubenswrapper[4795]: E0219 21:30:12.511193 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:30:12 crc kubenswrapper[4795]: E0219 21:30:12.511291 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:30:12 crc kubenswrapper[4795]: E0219 21:30:12.511356 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:30:13 crc kubenswrapper[4795]: I0219 21:30:13.511766 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:30:13 crc kubenswrapper[4795]: E0219 21:30:13.512001 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:30:14 crc kubenswrapper[4795]: I0219 21:30:14.511661 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:30:14 crc kubenswrapper[4795]: I0219 21:30:14.511691 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:30:14 crc kubenswrapper[4795]: E0219 21:30:14.511938 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:30:14 crc kubenswrapper[4795]: E0219 21:30:14.511975 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:30:14 crc kubenswrapper[4795]: I0219 21:30:14.511699 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:30:14 crc kubenswrapper[4795]: E0219 21:30:14.512410 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:30:14 crc kubenswrapper[4795]: E0219 21:30:14.597881 4795 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 19 21:30:15 crc kubenswrapper[4795]: I0219 21:30:15.510777 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:30:15 crc kubenswrapper[4795]: E0219 21:30:15.510978 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:30:16 crc kubenswrapper[4795]: I0219 21:30:16.511350 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:30:16 crc kubenswrapper[4795]: I0219 21:30:16.511360 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:30:16 crc kubenswrapper[4795]: E0219 21:30:16.511533 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:30:16 crc kubenswrapper[4795]: I0219 21:30:16.511375 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:30:16 crc kubenswrapper[4795]: E0219 21:30:16.511599 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:30:16 crc kubenswrapper[4795]: E0219 21:30:16.511709 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:30:16 crc kubenswrapper[4795]: I0219 21:30:16.661925 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:30:17 crc kubenswrapper[4795]: I0219 21:30:17.511482 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:30:17 crc kubenswrapper[4795]: E0219 21:30:17.511619 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:30:18 crc kubenswrapper[4795]: I0219 21:30:18.510624 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:30:18 crc kubenswrapper[4795]: I0219 21:30:18.510671 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:30:18 crc kubenswrapper[4795]: E0219 21:30:18.510812 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:30:18 crc kubenswrapper[4795]: I0219 21:30:18.510858 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:30:18 crc kubenswrapper[4795]: E0219 21:30:18.511045 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:30:18 crc kubenswrapper[4795]: E0219 21:30:18.511216 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:30:19 crc kubenswrapper[4795]: I0219 21:30:19.510856 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:30:19 crc kubenswrapper[4795]: E0219 21:30:19.513059 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:30:19 crc kubenswrapper[4795]: E0219 21:30:19.599435 4795 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 19 21:30:20 crc kubenswrapper[4795]: I0219 21:30:20.511132 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:30:20 crc kubenswrapper[4795]: I0219 21:30:20.511211 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:30:20 crc kubenswrapper[4795]: I0219 21:30:20.511150 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:30:20 crc kubenswrapper[4795]: E0219 21:30:20.511343 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:30:20 crc kubenswrapper[4795]: E0219 21:30:20.511601 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:30:20 crc kubenswrapper[4795]: E0219 21:30:20.511790 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:30:20 crc kubenswrapper[4795]: I0219 21:30:20.512112 4795 scope.go:117] "RemoveContainer" containerID="02e3717583e910ca7356ee58e22d63f40252488fb8540be7a30ab0fc918b908b" Feb 19 21:30:21 crc kubenswrapper[4795]: I0219 21:30:21.058976 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5p6d9_e967392b-9bd8-4111-b1b9-96d503a19668/kube-multus/1.log" Feb 19 21:30:21 crc kubenswrapper[4795]: I0219 21:30:21.059345 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5p6d9" event={"ID":"e967392b-9bd8-4111-b1b9-96d503a19668","Type":"ContainerStarted","Data":"2822f57eb80a87e2600f1bdd3a3189e4bded6fbc0e489210ee08ea133cbf2aa9"} Feb 19 21:30:21 crc kubenswrapper[4795]: I0219 21:30:21.511036 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:30:21 crc kubenswrapper[4795]: E0219 21:30:21.511262 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:30:22 crc kubenswrapper[4795]: I0219 21:30:22.510867 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:30:22 crc kubenswrapper[4795]: I0219 21:30:22.510920 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:30:22 crc kubenswrapper[4795]: I0219 21:30:22.510981 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:30:22 crc kubenswrapper[4795]: E0219 21:30:22.511028 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:30:22 crc kubenswrapper[4795]: E0219 21:30:22.511102 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:30:22 crc kubenswrapper[4795]: E0219 21:30:22.511239 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:30:23 crc kubenswrapper[4795]: I0219 21:30:23.510922 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:30:23 crc kubenswrapper[4795]: E0219 21:30:23.511130 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff4bs" podUID="1b1b4346-e02e-4614-b2ff-e4628046a92f" Feb 19 21:30:24 crc kubenswrapper[4795]: I0219 21:30:24.511599 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:30:24 crc kubenswrapper[4795]: I0219 21:30:24.511599 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:30:24 crc kubenswrapper[4795]: I0219 21:30:24.511622 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:30:24 crc kubenswrapper[4795]: E0219 21:30:24.511863 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 21:30:24 crc kubenswrapper[4795]: E0219 21:30:24.511962 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 21:30:24 crc kubenswrapper[4795]: E0219 21:30:24.511906 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 21:30:25 crc kubenswrapper[4795]: I0219 21:30:25.511052 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:30:25 crc kubenswrapper[4795]: I0219 21:30:25.514460 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 19 21:30:25 crc kubenswrapper[4795]: I0219 21:30:25.518869 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 19 21:30:26 crc kubenswrapper[4795]: I0219 21:30:26.510580 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:30:26 crc kubenswrapper[4795]: I0219 21:30:26.510665 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:30:26 crc kubenswrapper[4795]: I0219 21:30:26.510665 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:30:26 crc kubenswrapper[4795]: I0219 21:30:26.512926 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 19 21:30:26 crc kubenswrapper[4795]: I0219 21:30:26.513020 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 19 21:30:26 crc kubenswrapper[4795]: I0219 21:30:26.513055 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 19 21:30:26 crc kubenswrapper[4795]: I0219 21:30:26.514489 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.609885 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.650811 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-5gtgb"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.651218 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gtgb" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.652951 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-vtqjw"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.653987 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-qks97"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.654774 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7ph8l"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.655254 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vtqjw" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.655644 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-qks97" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.655924 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-rxhk8"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.656036 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-7ph8l" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.656900 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-t48rm"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.657002 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rxhk8" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.657019 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.657030 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.657073 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.658539 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-22qsd"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.659462 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-22qsd" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.660029 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-t48rm" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.659618 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.659690 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.664743 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-pb7s7"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.659746 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.665763 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.669913 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.670252 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.670282 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.670326 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.685060 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.685208 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.687207 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.689107 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.690202 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-89r2g"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.690798 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-89r2g" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.691599 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.694329 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.695519 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.695619 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.695759 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.696058 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.698849 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.699345 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.699602 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.699836 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.700040 4795 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.700342 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.700339 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.700343 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.700629 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.700636 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.700896 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.700841 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.700943 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.700888 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.701016 4795 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication-operator"/"authentication-operator-config" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.701065 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.701114 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.701227 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.701251 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.701353 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.701496 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.703566 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-bks74"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.704243 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-rvkhj"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.704745 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-rt6qz"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.704848 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.704923 4795 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.704990 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.705089 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.705110 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-bks74" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.705207 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.705281 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.705331 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-rvkhj" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.705390 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.705284 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-rt6qz" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.705488 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.705587 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.718747 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.719102 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.719386 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.719468 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.719502 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.725703 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jdtt8"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.726392 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-7mzng"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.726952 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-7mzng" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.722045 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.727455 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.726313 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.727738 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jdtt8" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.728095 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.729120 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.729279 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.729426 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.729532 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.729579 4795 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver"/"trusted-ca-bundle" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.745911 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sf27h"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.747466 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-8l99c"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.749012 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-4lbxt"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.749801 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4lbxt" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.745992 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.751031 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sf27h" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.751378 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8l99c" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.746802 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.747713 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.748700 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.762144 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.763342 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-pb7s7\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.763387 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ec60d287-0f21-467c-8030-84b8726af567-console-config\") pod \"console-f9d7485db-rvkhj\" (UID: \"ec60d287-0f21-467c-8030-84b8726af567\") " pod="openshift-console/console-f9d7485db-rvkhj" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.763411 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3d1887a7-d8a5-45f6-97fc-c32a870089ef-encryption-config\") pod 
\"apiserver-7bbb656c7d-5gtgb\" (UID: \"3d1887a7-d8a5-45f6-97fc-c32a870089ef\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gtgb" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.763435 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ed6485ab-c517-41cd-a755-d5dc9557456b-node-pullsecrets\") pod \"apiserver-76f77b778f-t48rm\" (UID: \"ed6485ab-c517-41cd-a755-d5dc9557456b\") " pod="openshift-apiserver/apiserver-76f77b778f-t48rm" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.763457 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-pb7s7\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.763480 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/958d08af-86cf-467f-947b-4485163fd695-machine-approver-tls\") pod \"machine-approver-56656f9798-rxhk8\" (UID: \"958d08af-86cf-467f-947b-4485163fd695\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rxhk8" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.763499 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed6485ab-c517-41cd-a755-d5dc9557456b-trusted-ca-bundle\") pod \"apiserver-76f77b778f-t48rm\" (UID: \"ed6485ab-c517-41cd-a755-d5dc9557456b\") " pod="openshift-apiserver/apiserver-76f77b778f-t48rm" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.763518 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/85a5ad8f-8c61-4e28-8e23-9a51e7796e37-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-qks97\" (UID: \"85a5ad8f-8c61-4e28-8e23-9a51e7796e37\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qks97" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.763539 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4w4sp\" (UniqueName: \"kubernetes.io/projected/96e1b4b4-8d48-4955-b756-71d21a5aea0b-kube-api-access-4w4sp\") pod \"downloads-7954f5f757-rt6qz\" (UID: \"96e1b4b4-8d48-4955-b756-71d21a5aea0b\") " pod="openshift-console/downloads-7954f5f757-rt6qz" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.763558 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/85a5ad8f-8c61-4e28-8e23-9a51e7796e37-serving-cert\") pod \"authentication-operator-69f744f599-qks97\" (UID: \"85a5ad8f-8c61-4e28-8e23-9a51e7796e37\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qks97" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.763586 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e726eeb3-dfb2-4c3a-bea7-a5c945f25d52-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-89r2g\" (UID: \"e726eeb3-dfb2-4c3a-bea7-a5c945f25d52\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-89r2g" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.763606 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-cliconfig\") 
pod \"oauth-openshift-558db77b4-pb7s7\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.763632 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f28c04c3-66f1-4c29-b7d1-cac5aa342370-config\") pod \"console-operator-58897d9998-bks74\" (UID: \"f28c04c3-66f1-4c29-b7d1-cac5aa342370\") " pod="openshift-console-operator/console-operator-58897d9998-bks74" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.763652 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f28c04c3-66f1-4c29-b7d1-cac5aa342370-serving-cert\") pod \"console-operator-58897d9998-bks74\" (UID: \"f28c04c3-66f1-4c29-b7d1-cac5aa342370\") " pod="openshift-console-operator/console-operator-58897d9998-bks74" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.763672 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/958d08af-86cf-467f-947b-4485163fd695-config\") pod \"machine-approver-56656f9798-rxhk8\" (UID: \"958d08af-86cf-467f-947b-4485163fd695\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rxhk8" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.763689 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab78fbf6-65df-4306-a7b8-c7bd98cfdf49-config\") pod \"machine-api-operator-5694c8668f-7mzng\" (UID: \"ab78fbf6-65df-4306-a7b8-c7bd98cfdf49\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7mzng" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.763709 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/ab78fbf6-65df-4306-a7b8-c7bd98cfdf49-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-7mzng\" (UID: \"ab78fbf6-65df-4306-a7b8-c7bd98cfdf49\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7mzng" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.763729 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8p5td\" (UniqueName: \"kubernetes.io/projected/ec60d287-0f21-467c-8030-84b8726af567-kube-api-access-8p5td\") pod \"console-f9d7485db-rvkhj\" (UID: \"ec60d287-0f21-467c-8030-84b8726af567\") " pod="openshift-console/console-f9d7485db-rvkhj" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.763747 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86ffb50f-47f6-47b2-9141-1de9999a13e0-config\") pod \"controller-manager-879f6c89f-7ph8l\" (UID: \"86ffb50f-47f6-47b2-9141-1de9999a13e0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7ph8l" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.763766 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cxdq\" (UniqueName: \"kubernetes.io/projected/958d08af-86cf-467f-947b-4485163fd695-kube-api-access-6cxdq\") pod \"machine-approver-56656f9798-rxhk8\" (UID: \"958d08af-86cf-467f-947b-4485163fd695\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rxhk8" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.763789 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-pb7s7\" (UID: 
\"9de314c5-1440-476b-b98b-7804f5d95145\") " pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.763827 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3d1887a7-d8a5-45f6-97fc-c32a870089ef-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-5gtgb\" (UID: \"3d1887a7-d8a5-45f6-97fc-c32a870089ef\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gtgb" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.763849 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xv7l\" (UniqueName: \"kubernetes.io/projected/f28c04c3-66f1-4c29-b7d1-cac5aa342370-kube-api-access-8xv7l\") pod \"console-operator-58897d9998-bks74\" (UID: \"f28c04c3-66f1-4c29-b7d1-cac5aa342370\") " pod="openshift-console-operator/console-operator-58897d9998-bks74" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.763868 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3d1887a7-d8a5-45f6-97fc-c32a870089ef-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-5gtgb\" (UID: \"3d1887a7-d8a5-45f6-97fc-c32a870089ef\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gtgb" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.763889 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec60d287-0f21-467c-8030-84b8726af567-trusted-ca-bundle\") pod \"console-f9d7485db-rvkhj\" (UID: \"ec60d287-0f21-467c-8030-84b8726af567\") " pod="openshift-console/console-f9d7485db-rvkhj" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.763919 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" 
(UniqueName: \"kubernetes.io/secret/3d1887a7-d8a5-45f6-97fc-c32a870089ef-etcd-client\") pod \"apiserver-7bbb656c7d-5gtgb\" (UID: \"3d1887a7-d8a5-45f6-97fc-c32a870089ef\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gtgb" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.763942 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-pb7s7\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.763963 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-546h9\" (UniqueName: \"kubernetes.io/projected/102f7fb5-3031-4853-b112-2aa910aa63a7-kube-api-access-546h9\") pod \"route-controller-manager-6576b87f9c-vtqjw\" (UID: \"102f7fb5-3031-4853-b112-2aa910aa63a7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vtqjw" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.763984 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ed6485ab-c517-41cd-a755-d5dc9557456b-audit-dir\") pod \"apiserver-76f77b778f-t48rm\" (UID: \"ed6485ab-c517-41cd-a755-d5dc9557456b\") " pod="openshift-apiserver/apiserver-76f77b778f-t48rm" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764002 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-pb7s7\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764024 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ab78fbf6-65df-4306-a7b8-c7bd98cfdf49-images\") pod \"machine-api-operator-5694c8668f-7mzng\" (UID: \"ab78fbf6-65df-4306-a7b8-c7bd98cfdf49\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7mzng" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764043 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4e811d29-20d6-4576-be0f-dc59cf11b497-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-jdtt8\" (UID: \"4e811d29-20d6-4576-be0f-dc59cf11b497\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jdtt8" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764064 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ec60d287-0f21-467c-8030-84b8726af567-console-oauth-config\") pod \"console-f9d7485db-rvkhj\" (UID: \"ec60d287-0f21-467c-8030-84b8726af567\") " pod="openshift-console/console-f9d7485db-rvkhj" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764084 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86ffb50f-47f6-47b2-9141-1de9999a13e0-serving-cert\") pod \"controller-manager-879f6c89f-7ph8l\" (UID: \"86ffb50f-47f6-47b2-9141-1de9999a13e0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7ph8l" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764106 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/85a5ad8f-8c61-4e28-8e23-9a51e7796e37-service-ca-bundle\") pod \"authentication-operator-69f744f599-qks97\" (UID: \"85a5ad8f-8c61-4e28-8e23-9a51e7796e37\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qks97" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764127 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/ed6485ab-c517-41cd-a755-d5dc9557456b-audit\") pod \"apiserver-76f77b778f-t48rm\" (UID: \"ed6485ab-c517-41cd-a755-d5dc9557456b\") " pod="openshift-apiserver/apiserver-76f77b778f-t48rm" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764148 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-pb7s7\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764186 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qgn8\" (UniqueName: \"kubernetes.io/projected/e736a4ca-7c18-423c-a5de-aeafd8d6a42e-kube-api-access-6qgn8\") pod \"cluster-samples-operator-665b6dd947-22qsd\" (UID: \"e736a4ca-7c18-423c-a5de-aeafd8d6a42e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-22qsd" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764207 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85a5ad8f-8c61-4e28-8e23-9a51e7796e37-config\") pod \"authentication-operator-69f744f599-qks97\" (UID: \"85a5ad8f-8c61-4e28-8e23-9a51e7796e37\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-qks97" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764227 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/86ffb50f-47f6-47b2-9141-1de9999a13e0-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-7ph8l\" (UID: \"86ffb50f-47f6-47b2-9141-1de9999a13e0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7ph8l" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764244 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxfbg\" (UniqueName: \"kubernetes.io/projected/9de314c5-1440-476b-b98b-7804f5d95145-kube-api-access-cxfbg\") pod \"oauth-openshift-558db77b4-pb7s7\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764260 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f28c04c3-66f1-4c29-b7d1-cac5aa342370-trusted-ca\") pod \"console-operator-58897d9998-bks74\" (UID: \"f28c04c3-66f1-4c29-b7d1-cac5aa342370\") " pod="openshift-console-operator/console-operator-58897d9998-bks74" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764275 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed6485ab-c517-41cd-a755-d5dc9557456b-config\") pod \"apiserver-76f77b778f-t48rm\" (UID: \"ed6485ab-c517-41cd-a755-d5dc9557456b\") " pod="openshift-apiserver/apiserver-76f77b778f-t48rm" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764288 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjzvv\" (UniqueName: 
\"kubernetes.io/projected/ab78fbf6-65df-4306-a7b8-c7bd98cfdf49-kube-api-access-qjzvv\") pod \"machine-api-operator-5694c8668f-7mzng\" (UID: \"ab78fbf6-65df-4306-a7b8-c7bd98cfdf49\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7mzng" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764303 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-pb7s7\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764319 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7n56\" (UniqueName: \"kubernetes.io/projected/85a5ad8f-8c61-4e28-8e23-9a51e7796e37-kube-api-access-r7n56\") pod \"authentication-operator-69f744f599-qks97\" (UID: \"85a5ad8f-8c61-4e28-8e23-9a51e7796e37\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qks97" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764336 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-pb7s7\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764352 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9de314c5-1440-476b-b98b-7804f5d95145-audit-dir\") pod \"oauth-openshift-558db77b4-pb7s7\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764381 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnzj5\" (UniqueName: \"kubernetes.io/projected/ed6485ab-c517-41cd-a755-d5dc9557456b-kube-api-access-rnzj5\") pod \"apiserver-76f77b778f-t48rm\" (UID: \"ed6485ab-c517-41cd-a755-d5dc9557456b\") " pod="openshift-apiserver/apiserver-76f77b778f-t48rm" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764431 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e726eeb3-dfb2-4c3a-bea7-a5c945f25d52-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-89r2g\" (UID: \"e726eeb3-dfb2-4c3a-bea7-a5c945f25d52\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-89r2g" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764455 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbbjp\" (UniqueName: \"kubernetes.io/projected/4e811d29-20d6-4576-be0f-dc59cf11b497-kube-api-access-tbbjp\") pod \"openshift-apiserver-operator-796bbdcf4f-jdtt8\" (UID: \"4e811d29-20d6-4576-be0f-dc59cf11b497\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jdtt8" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764474 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ec60d287-0f21-467c-8030-84b8726af567-oauth-serving-cert\") pod \"console-f9d7485db-rvkhj\" (UID: \"ec60d287-0f21-467c-8030-84b8726af567\") " pod="openshift-console/console-f9d7485db-rvkhj" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764495 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d1887a7-d8a5-45f6-97fc-c32a870089ef-serving-cert\") pod \"apiserver-7bbb656c7d-5gtgb\" (UID: \"3d1887a7-d8a5-45f6-97fc-c32a870089ef\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gtgb" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764513 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ed6485ab-c517-41cd-a755-d5dc9557456b-etcd-client\") pod \"apiserver-76f77b778f-t48rm\" (UID: \"ed6485ab-c517-41cd-a755-d5dc9557456b\") " pod="openshift-apiserver/apiserver-76f77b778f-t48rm" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764535 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rvcj\" (UniqueName: \"kubernetes.io/projected/86ffb50f-47f6-47b2-9141-1de9999a13e0-kube-api-access-4rvcj\") pod \"controller-manager-879f6c89f-7ph8l\" (UID: \"86ffb50f-47f6-47b2-9141-1de9999a13e0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7ph8l" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764556 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-pb7s7\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764577 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3d1887a7-d8a5-45f6-97fc-c32a870089ef-audit-dir\") pod \"apiserver-7bbb656c7d-5gtgb\" (UID: \"3d1887a7-d8a5-45f6-97fc-c32a870089ef\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gtgb" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764596 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/958d08af-86cf-467f-947b-4485163fd695-auth-proxy-config\") pod \"machine-approver-56656f9798-rxhk8\" (UID: \"958d08af-86cf-467f-947b-4485163fd695\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rxhk8" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764615 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ed6485ab-c517-41cd-a755-d5dc9557456b-encryption-config\") pod \"apiserver-76f77b778f-t48rm\" (UID: \"ed6485ab-c517-41cd-a755-d5dc9557456b\") " pod="openshift-apiserver/apiserver-76f77b778f-t48rm" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764636 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/102f7fb5-3031-4853-b112-2aa910aa63a7-config\") pod \"route-controller-manager-6576b87f9c-vtqjw\" (UID: \"102f7fb5-3031-4853-b112-2aa910aa63a7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vtqjw" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764665 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3d1887a7-d8a5-45f6-97fc-c32a870089ef-audit-policies\") pod \"apiserver-7bbb656c7d-5gtgb\" (UID: \"3d1887a7-d8a5-45f6-97fc-c32a870089ef\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gtgb" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764684 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/ed6485ab-c517-41cd-a755-d5dc9557456b-serving-cert\") pod \"apiserver-76f77b778f-t48rm\" (UID: \"ed6485ab-c517-41cd-a755-d5dc9557456b\") " pod="openshift-apiserver/apiserver-76f77b778f-t48rm" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764704 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ec60d287-0f21-467c-8030-84b8726af567-service-ca\") pod \"console-f9d7485db-rvkhj\" (UID: \"ec60d287-0f21-467c-8030-84b8726af567\") " pod="openshift-console/console-f9d7485db-rvkhj" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764724 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ed6485ab-c517-41cd-a755-d5dc9557456b-etcd-serving-ca\") pod \"apiserver-76f77b778f-t48rm\" (UID: \"ed6485ab-c517-41cd-a755-d5dc9557456b\") " pod="openshift-apiserver/apiserver-76f77b778f-t48rm" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764744 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/ed6485ab-c517-41cd-a755-d5dc9557456b-image-import-ca\") pod \"apiserver-76f77b778f-t48rm\" (UID: \"ed6485ab-c517-41cd-a755-d5dc9557456b\") " pod="openshift-apiserver/apiserver-76f77b778f-t48rm" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764763 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9de314c5-1440-476b-b98b-7804f5d95145-audit-policies\") pod \"oauth-openshift-558db77b4-pb7s7\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764783 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e811d29-20d6-4576-be0f-dc59cf11b497-config\") pod \"openshift-apiserver-operator-796bbdcf4f-jdtt8\" (UID: \"4e811d29-20d6-4576-be0f-dc59cf11b497\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jdtt8" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764806 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e736a4ca-7c18-423c-a5de-aeafd8d6a42e-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-22qsd\" (UID: \"e736a4ca-7c18-423c-a5de-aeafd8d6a42e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-22qsd" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764827 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/e726eeb3-dfb2-4c3a-bea7-a5c945f25d52-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-89r2g\" (UID: \"e726eeb3-dfb2-4c3a-bea7-a5c945f25d52\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-89r2g" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764849 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/102f7fb5-3031-4853-b112-2aa910aa63a7-client-ca\") pod \"route-controller-manager-6576b87f9c-vtqjw\" (UID: \"102f7fb5-3031-4853-b112-2aa910aa63a7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vtqjw" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764874 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/ec60d287-0f21-467c-8030-84b8726af567-console-serving-cert\") pod \"console-f9d7485db-rvkhj\" (UID: \"ec60d287-0f21-467c-8030-84b8726af567\") " pod="openshift-console/console-f9d7485db-rvkhj" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764892 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/102f7fb5-3031-4853-b112-2aa910aa63a7-serving-cert\") pod \"route-controller-manager-6576b87f9c-vtqjw\" (UID: \"102f7fb5-3031-4853-b112-2aa910aa63a7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vtqjw" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764909 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czvcn\" (UniqueName: \"kubernetes.io/projected/3d1887a7-d8a5-45f6-97fc-c32a870089ef-kube-api-access-czvcn\") pod \"apiserver-7bbb656c7d-5gtgb\" (UID: \"3d1887a7-d8a5-45f6-97fc-c32a870089ef\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gtgb" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764929 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/86ffb50f-47f6-47b2-9141-1de9999a13e0-client-ca\") pod \"controller-manager-879f6c89f-7ph8l\" (UID: \"86ffb50f-47f6-47b2-9141-1de9999a13e0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7ph8l" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764948 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4c8ll\" (UniqueName: \"kubernetes.io/projected/e726eeb3-dfb2-4c3a-bea7-a5c945f25d52-kube-api-access-4c8ll\") pod \"cluster-image-registry-operator-dc59b4c8b-89r2g\" (UID: \"e726eeb3-dfb2-4c3a-bea7-a5c945f25d52\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-89r2g" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.764966 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-pb7s7\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.765971 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.766596 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4h49m"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.767152 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.767742 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.769069 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.769946 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.770075 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.769082 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 19 21:30:27 crc kubenswrapper[4795]: 
I0219 21:30:27.769091 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.769101 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.769114 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.770686 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.770712 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.772528 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.773393 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.773711 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.774622 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.774773 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6np95"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.774811 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 19 21:30:27 crc 
kubenswrapper[4795]: I0219 21:30:27.775251 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6np95" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.778924 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-5gtgb"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.778968 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-r8dcx"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.779473 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xjnbz"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.779894 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xjnbz" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.780499 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-r8dcx" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.783920 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-sd5jz"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.784302 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-tg6hf"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.784662 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-vhbxd"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.785033 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vhbxd" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.785426 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-sd5jz" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.785599 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tg6hf" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.787746 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.788138 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-ggndt"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.788779 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-ggndt" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.789773 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-x54th"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.790271 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-x54th" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.793265 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-6b8wp"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.793951 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6b8wp" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.805119 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.805331 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.812688 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.812699 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.812846 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.813021 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.814359 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.814913 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.815441 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-42wfj"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.815976 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-vtqjw"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.816072 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-42wfj" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.816231 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.816362 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nxhdc"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.816532 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.816794 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.816879 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nxhdc" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.816952 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.817323 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.817540 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.827937 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5t4db"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.828353 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-94s6c"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.828717 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-qks97"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.828798 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-94s6c" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.829740 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5t4db" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.844337 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-bks74"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.844407 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-5qppq"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.844506 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.844767 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.846454 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-5qppq" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.846869 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m4bg2"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.847982 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m4bg2" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.849094 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-7mzng"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.851443 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jdtt8"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.851494 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-22qsd"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.853096 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pgfxb"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.855191 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pgfxb" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.858475 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-2cbkw"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.859564 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-2cbkw" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.860516 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525610-9dlfj"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.894928 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525610-9dlfj" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.895797 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/ed6485ab-c517-41cd-a755-d5dc9557456b-audit\") pod \"apiserver-76f77b778f-t48rm\" (UID: \"ed6485ab-c517-41cd-a755-d5dc9557456b\") " pod="openshift-apiserver/apiserver-76f77b778f-t48rm" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.895831 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qgn8\" (UniqueName: \"kubernetes.io/projected/e736a4ca-7c18-423c-a5de-aeafd8d6a42e-kube-api-access-6qgn8\") pod \"cluster-samples-operator-665b6dd947-22qsd\" (UID: \"e736a4ca-7c18-423c-a5de-aeafd8d6a42e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-22qsd" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.895851 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85a5ad8f-8c61-4e28-8e23-9a51e7796e37-config\") pod \"authentication-operator-69f744f599-qks97\" (UID: \"85a5ad8f-8c61-4e28-8e23-9a51e7796e37\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qks97" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.895867 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/86ffb50f-47f6-47b2-9141-1de9999a13e0-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-7ph8l\" (UID: \"86ffb50f-47f6-47b2-9141-1de9999a13e0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7ph8l" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.895882 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-pb7s7\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.895897 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f28c04c3-66f1-4c29-b7d1-cac5aa342370-trusted-ca\") pod \"console-operator-58897d9998-bks74\" (UID: \"f28c04c3-66f1-4c29-b7d1-cac5aa342370\") " pod="openshift-console-operator/console-operator-58897d9998-bks74" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.895910 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed6485ab-c517-41cd-a755-d5dc9557456b-config\") pod \"apiserver-76f77b778f-t48rm\" (UID: \"ed6485ab-c517-41cd-a755-d5dc9557456b\") " pod="openshift-apiserver/apiserver-76f77b778f-t48rm" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.895926 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjzvv\" (UniqueName: \"kubernetes.io/projected/ab78fbf6-65df-4306-a7b8-c7bd98cfdf49-kube-api-access-qjzvv\") pod \"machine-api-operator-5694c8668f-7mzng\" (UID: \"ab78fbf6-65df-4306-a7b8-c7bd98cfdf49\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7mzng" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.895935 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-wp452"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.896517 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-wp452" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.895941 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-pb7s7\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.896603 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxfbg\" (UniqueName: \"kubernetes.io/projected/9de314c5-1440-476b-b98b-7804f5d95145-kube-api-access-cxfbg\") pod \"oauth-openshift-558db77b4-pb7s7\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.896624 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7n56\" (UniqueName: \"kubernetes.io/projected/85a5ad8f-8c61-4e28-8e23-9a51e7796e37-kube-api-access-r7n56\") pod \"authentication-operator-69f744f599-qks97\" (UID: \"85a5ad8f-8c61-4e28-8e23-9a51e7796e37\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qks97" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.896641 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-pb7s7\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.896661 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" 
(UniqueName: \"kubernetes.io/host-path/9de314c5-1440-476b-b98b-7804f5d95145-audit-dir\") pod \"oauth-openshift-558db77b4-pb7s7\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.896697 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnzj5\" (UniqueName: \"kubernetes.io/projected/ed6485ab-c517-41cd-a755-d5dc9557456b-kube-api-access-rnzj5\") pod \"apiserver-76f77b778f-t48rm\" (UID: \"ed6485ab-c517-41cd-a755-d5dc9557456b\") " pod="openshift-apiserver/apiserver-76f77b778f-t48rm" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.896712 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e726eeb3-dfb2-4c3a-bea7-a5c945f25d52-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-89r2g\" (UID: \"e726eeb3-dfb2-4c3a-bea7-a5c945f25d52\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-89r2g" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.896729 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbbjp\" (UniqueName: \"kubernetes.io/projected/4e811d29-20d6-4576-be0f-dc59cf11b497-kube-api-access-tbbjp\") pod \"openshift-apiserver-operator-796bbdcf4f-jdtt8\" (UID: \"4e811d29-20d6-4576-be0f-dc59cf11b497\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jdtt8" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.896745 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ec60d287-0f21-467c-8030-84b8726af567-oauth-serving-cert\") pod \"console-f9d7485db-rvkhj\" (UID: \"ec60d287-0f21-467c-8030-84b8726af567\") " pod="openshift-console/console-f9d7485db-rvkhj" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 
21:30:27.896761 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d1887a7-d8a5-45f6-97fc-c32a870089ef-serving-cert\") pod \"apiserver-7bbb656c7d-5gtgb\" (UID: \"3d1887a7-d8a5-45f6-97fc-c32a870089ef\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gtgb" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.896776 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ed6485ab-c517-41cd-a755-d5dc9557456b-etcd-client\") pod \"apiserver-76f77b778f-t48rm\" (UID: \"ed6485ab-c517-41cd-a755-d5dc9557456b\") " pod="openshift-apiserver/apiserver-76f77b778f-t48rm" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.896791 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3d1887a7-d8a5-45f6-97fc-c32a870089ef-audit-dir\") pod \"apiserver-7bbb656c7d-5gtgb\" (UID: \"3d1887a7-d8a5-45f6-97fc-c32a870089ef\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gtgb" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.896806 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rvcj\" (UniqueName: \"kubernetes.io/projected/86ffb50f-47f6-47b2-9141-1de9999a13e0-kube-api-access-4rvcj\") pod \"controller-manager-879f6c89f-7ph8l\" (UID: \"86ffb50f-47f6-47b2-9141-1de9999a13e0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7ph8l" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.896822 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-pb7s7\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.896840 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/102f7fb5-3031-4853-b112-2aa910aa63a7-config\") pod \"route-controller-manager-6576b87f9c-vtqjw\" (UID: \"102f7fb5-3031-4853-b112-2aa910aa63a7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vtqjw" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.896853 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/958d08af-86cf-467f-947b-4485163fd695-auth-proxy-config\") pod \"machine-approver-56656f9798-rxhk8\" (UID: \"958d08af-86cf-467f-947b-4485163fd695\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rxhk8" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.896869 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ed6485ab-c517-41cd-a755-d5dc9557456b-encryption-config\") pod \"apiserver-76f77b778f-t48rm\" (UID: \"ed6485ab-c517-41cd-a755-d5dc9557456b\") " pod="openshift-apiserver/apiserver-76f77b778f-t48rm" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.896890 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3d1887a7-d8a5-45f6-97fc-c32a870089ef-audit-policies\") pod \"apiserver-7bbb656c7d-5gtgb\" (UID: \"3d1887a7-d8a5-45f6-97fc-c32a870089ef\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gtgb" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.896904 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed6485ab-c517-41cd-a755-d5dc9557456b-serving-cert\") pod 
\"apiserver-76f77b778f-t48rm\" (UID: \"ed6485ab-c517-41cd-a755-d5dc9557456b\") " pod="openshift-apiserver/apiserver-76f77b778f-t48rm" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.896921 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ec60d287-0f21-467c-8030-84b8726af567-service-ca\") pod \"console-f9d7485db-rvkhj\" (UID: \"ec60d287-0f21-467c-8030-84b8726af567\") " pod="openshift-console/console-f9d7485db-rvkhj" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.896937 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ed6485ab-c517-41cd-a755-d5dc9557456b-etcd-serving-ca\") pod \"apiserver-76f77b778f-t48rm\" (UID: \"ed6485ab-c517-41cd-a755-d5dc9557456b\") " pod="openshift-apiserver/apiserver-76f77b778f-t48rm" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.896951 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/ed6485ab-c517-41cd-a755-d5dc9557456b-image-import-ca\") pod \"apiserver-76f77b778f-t48rm\" (UID: \"ed6485ab-c517-41cd-a755-d5dc9557456b\") " pod="openshift-apiserver/apiserver-76f77b778f-t48rm" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.896967 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9de314c5-1440-476b-b98b-7804f5d95145-audit-policies\") pod \"oauth-openshift-558db77b4-pb7s7\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.896983 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e736a4ca-7c18-423c-a5de-aeafd8d6a42e-samples-operator-tls\") pod 
\"cluster-samples-operator-665b6dd947-22qsd\" (UID: \"e736a4ca-7c18-423c-a5de-aeafd8d6a42e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-22qsd" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897000 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/e726eeb3-dfb2-4c3a-bea7-a5c945f25d52-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-89r2g\" (UID: \"e726eeb3-dfb2-4c3a-bea7-a5c945f25d52\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-89r2g" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897015 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e811d29-20d6-4576-be0f-dc59cf11b497-config\") pod \"openshift-apiserver-operator-796bbdcf4f-jdtt8\" (UID: \"4e811d29-20d6-4576-be0f-dc59cf11b497\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jdtt8" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897031 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/102f7fb5-3031-4853-b112-2aa910aa63a7-client-ca\") pod \"route-controller-manager-6576b87f9c-vtqjw\" (UID: \"102f7fb5-3031-4853-b112-2aa910aa63a7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vtqjw" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897046 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/102f7fb5-3031-4853-b112-2aa910aa63a7-serving-cert\") pod \"route-controller-manager-6576b87f9c-vtqjw\" (UID: \"102f7fb5-3031-4853-b112-2aa910aa63a7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vtqjw" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897064 
4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czvcn\" (UniqueName: \"kubernetes.io/projected/3d1887a7-d8a5-45f6-97fc-c32a870089ef-kube-api-access-czvcn\") pod \"apiserver-7bbb656c7d-5gtgb\" (UID: \"3d1887a7-d8a5-45f6-97fc-c32a870089ef\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gtgb" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897078 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/86ffb50f-47f6-47b2-9141-1de9999a13e0-client-ca\") pod \"controller-manager-879f6c89f-7ph8l\" (UID: \"86ffb50f-47f6-47b2-9141-1de9999a13e0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7ph8l" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897094 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4c8ll\" (UniqueName: \"kubernetes.io/projected/e726eeb3-dfb2-4c3a-bea7-a5c945f25d52-kube-api-access-4c8ll\") pod \"cluster-image-registry-operator-dc59b4c8b-89r2g\" (UID: \"e726eeb3-dfb2-4c3a-bea7-a5c945f25d52\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-89r2g" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897108 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-pb7s7\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897123 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ec60d287-0f21-467c-8030-84b8726af567-console-serving-cert\") pod \"console-f9d7485db-rvkhj\" (UID: \"ec60d287-0f21-467c-8030-84b8726af567\") " 
pod="openshift-console/console-f9d7485db-rvkhj" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897140 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-pb7s7\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897154 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ec60d287-0f21-467c-8030-84b8726af567-console-config\") pod \"console-f9d7485db-rvkhj\" (UID: \"ec60d287-0f21-467c-8030-84b8726af567\") " pod="openshift-console/console-f9d7485db-rvkhj" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897185 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3d1887a7-d8a5-45f6-97fc-c32a870089ef-encryption-config\") pod \"apiserver-7bbb656c7d-5gtgb\" (UID: \"3d1887a7-d8a5-45f6-97fc-c32a870089ef\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gtgb" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897202 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ed6485ab-c517-41cd-a755-d5dc9557456b-node-pullsecrets\") pod \"apiserver-76f77b778f-t48rm\" (UID: \"ed6485ab-c517-41cd-a755-d5dc9557456b\") " pod="openshift-apiserver/apiserver-76f77b778f-t48rm" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897218 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-service-ca\") pod 
\"oauth-openshift-558db77b4-pb7s7\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897235 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/958d08af-86cf-467f-947b-4485163fd695-machine-approver-tls\") pod \"machine-approver-56656f9798-rxhk8\" (UID: \"958d08af-86cf-467f-947b-4485163fd695\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rxhk8" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897251 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed6485ab-c517-41cd-a755-d5dc9557456b-trusted-ca-bundle\") pod \"apiserver-76f77b778f-t48rm\" (UID: \"ed6485ab-c517-41cd-a755-d5dc9557456b\") " pod="openshift-apiserver/apiserver-76f77b778f-t48rm" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897267 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4w4sp\" (UniqueName: \"kubernetes.io/projected/96e1b4b4-8d48-4955-b756-71d21a5aea0b-kube-api-access-4w4sp\") pod \"downloads-7954f5f757-rt6qz\" (UID: \"96e1b4b4-8d48-4955-b756-71d21a5aea0b\") " pod="openshift-console/downloads-7954f5f757-rt6qz" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897281 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/85a5ad8f-8c61-4e28-8e23-9a51e7796e37-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-qks97\" (UID: \"85a5ad8f-8c61-4e28-8e23-9a51e7796e37\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qks97" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897297 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/85a5ad8f-8c61-4e28-8e23-9a51e7796e37-serving-cert\") pod \"authentication-operator-69f744f599-qks97\" (UID: \"85a5ad8f-8c61-4e28-8e23-9a51e7796e37\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qks97" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897328 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e726eeb3-dfb2-4c3a-bea7-a5c945f25d52-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-89r2g\" (UID: \"e726eeb3-dfb2-4c3a-bea7-a5c945f25d52\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-89r2g" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897318 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dfw9n"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897344 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-pb7s7\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897360 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f28c04c3-66f1-4c29-b7d1-cac5aa342370-serving-cert\") pod \"console-operator-58897d9998-bks74\" (UID: \"f28c04c3-66f1-4c29-b7d1-cac5aa342370\") " pod="openshift-console-operator/console-operator-58897d9998-bks74" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897379 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f28c04c3-66f1-4c29-b7d1-cac5aa342370-config\") pod 
\"console-operator-58897d9998-bks74\" (UID: \"f28c04c3-66f1-4c29-b7d1-cac5aa342370\") " pod="openshift-console-operator/console-operator-58897d9998-bks74" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897395 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/958d08af-86cf-467f-947b-4485163fd695-config\") pod \"machine-approver-56656f9798-rxhk8\" (UID: \"958d08af-86cf-467f-947b-4485163fd695\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rxhk8" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897409 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab78fbf6-65df-4306-a7b8-c7bd98cfdf49-config\") pod \"machine-api-operator-5694c8668f-7mzng\" (UID: \"ab78fbf6-65df-4306-a7b8-c7bd98cfdf49\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7mzng" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897425 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/ab78fbf6-65df-4306-a7b8-c7bd98cfdf49-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-7mzng\" (UID: \"ab78fbf6-65df-4306-a7b8-c7bd98cfdf49\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7mzng" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897440 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8p5td\" (UniqueName: \"kubernetes.io/projected/ec60d287-0f21-467c-8030-84b8726af567-kube-api-access-8p5td\") pod \"console-f9d7485db-rvkhj\" (UID: \"ec60d287-0f21-467c-8030-84b8726af567\") " pod="openshift-console/console-f9d7485db-rvkhj" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897457 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-pb7s7\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897477 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3d1887a7-d8a5-45f6-97fc-c32a870089ef-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-5gtgb\" (UID: \"3d1887a7-d8a5-45f6-97fc-c32a870089ef\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gtgb" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897495 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xv7l\" (UniqueName: \"kubernetes.io/projected/f28c04c3-66f1-4c29-b7d1-cac5aa342370-kube-api-access-8xv7l\") pod \"console-operator-58897d9998-bks74\" (UID: \"f28c04c3-66f1-4c29-b7d1-cac5aa342370\") " pod="openshift-console-operator/console-operator-58897d9998-bks74" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897510 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86ffb50f-47f6-47b2-9141-1de9999a13e0-config\") pod \"controller-manager-879f6c89f-7ph8l\" (UID: \"86ffb50f-47f6-47b2-9141-1de9999a13e0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7ph8l" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897526 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cxdq\" (UniqueName: \"kubernetes.io/projected/958d08af-86cf-467f-947b-4485163fd695-kube-api-access-6cxdq\") pod \"machine-approver-56656f9798-rxhk8\" (UID: \"958d08af-86cf-467f-947b-4485163fd695\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rxhk8" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 
21:30:27.897551 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3d1887a7-d8a5-45f6-97fc-c32a870089ef-etcd-client\") pod \"apiserver-7bbb656c7d-5gtgb\" (UID: \"3d1887a7-d8a5-45f6-97fc-c32a870089ef\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gtgb" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897565 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3d1887a7-d8a5-45f6-97fc-c32a870089ef-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-5gtgb\" (UID: \"3d1887a7-d8a5-45f6-97fc-c32a870089ef\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gtgb" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897581 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec60d287-0f21-467c-8030-84b8726af567-trusted-ca-bundle\") pod \"console-f9d7485db-rvkhj\" (UID: \"ec60d287-0f21-467c-8030-84b8726af567\") " pod="openshift-console/console-f9d7485db-rvkhj" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897596 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-pb7s7\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897615 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-546h9\" (UniqueName: \"kubernetes.io/projected/102f7fb5-3031-4853-b112-2aa910aa63a7-kube-api-access-546h9\") pod \"route-controller-manager-6576b87f9c-vtqjw\" (UID: \"102f7fb5-3031-4853-b112-2aa910aa63a7\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vtqjw" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897630 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ed6485ab-c517-41cd-a755-d5dc9557456b-audit-dir\") pod \"apiserver-76f77b778f-t48rm\" (UID: \"ed6485ab-c517-41cd-a755-d5dc9557456b\") " pod="openshift-apiserver/apiserver-76f77b778f-t48rm" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897646 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-pb7s7\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897662 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ec60d287-0f21-467c-8030-84b8726af567-console-oauth-config\") pod \"console-f9d7485db-rvkhj\" (UID: \"ec60d287-0f21-467c-8030-84b8726af567\") " pod="openshift-console/console-f9d7485db-rvkhj" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897679 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86ffb50f-47f6-47b2-9141-1de9999a13e0-serving-cert\") pod \"controller-manager-879f6c89f-7ph8l\" (UID: \"86ffb50f-47f6-47b2-9141-1de9999a13e0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7ph8l" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897694 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ab78fbf6-65df-4306-a7b8-c7bd98cfdf49-images\") pod 
\"machine-api-operator-5694c8668f-7mzng\" (UID: \"ab78fbf6-65df-4306-a7b8-c7bd98cfdf49\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7mzng" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897709 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4e811d29-20d6-4576-be0f-dc59cf11b497-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-jdtt8\" (UID: \"4e811d29-20d6-4576-be0f-dc59cf11b497\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jdtt8" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897724 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/85a5ad8f-8c61-4e28-8e23-9a51e7796e37-service-ca-bundle\") pod \"authentication-operator-69f744f599-qks97\" (UID: \"85a5ad8f-8c61-4e28-8e23-9a51e7796e37\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qks97" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.897757 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f28c04c3-66f1-4c29-b7d1-cac5aa342370-trusted-ca\") pod \"console-operator-58897d9998-bks74\" (UID: \"f28c04c3-66f1-4c29-b7d1-cac5aa342370\") " pod="openshift-console-operator/console-operator-58897d9998-bks74" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.898016 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-dfw9n" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.898242 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed6485ab-c517-41cd-a755-d5dc9557456b-config\") pod \"apiserver-76f77b778f-t48rm\" (UID: \"ed6485ab-c517-41cd-a755-d5dc9557456b\") " pod="openshift-apiserver/apiserver-76f77b778f-t48rm" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.895910 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.898404 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/85a5ad8f-8c61-4e28-8e23-9a51e7796e37-service-ca-bundle\") pod \"authentication-operator-69f744f599-qks97\" (UID: \"85a5ad8f-8c61-4e28-8e23-9a51e7796e37\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qks97" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.896621 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.898767 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.898897 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.899121 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ec60d287-0f21-467c-8030-84b8726af567-oauth-serving-cert\") pod \"console-f9d7485db-rvkhj\" (UID: 
\"ec60d287-0f21-467c-8030-84b8726af567\") " pod="openshift-console/console-f9d7485db-rvkhj" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.899664 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-pb7s7\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.900092 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-pb7s7"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.902043 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3d1887a7-d8a5-45f6-97fc-c32a870089ef-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-5gtgb\" (UID: \"3d1887a7-d8a5-45f6-97fc-c32a870089ef\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gtgb" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.903339 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d1887a7-d8a5-45f6-97fc-c32a870089ef-serving-cert\") pod \"apiserver-7bbb656c7d-5gtgb\" (UID: \"3d1887a7-d8a5-45f6-97fc-c32a870089ef\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gtgb" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.905762 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/958d08af-86cf-467f-947b-4485163fd695-machine-approver-tls\") pod \"machine-approver-56656f9798-rxhk8\" (UID: \"958d08af-86cf-467f-947b-4485163fd695\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rxhk8" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.906867 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e736a4ca-7c18-423c-a5de-aeafd8d6a42e-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-22qsd\" (UID: \"e736a4ca-7c18-423c-a5de-aeafd8d6a42e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-22qsd" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.906964 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3d1887a7-d8a5-45f6-97fc-c32a870089ef-audit-dir\") pod \"apiserver-7bbb656c7d-5gtgb\" (UID: \"3d1887a7-d8a5-45f6-97fc-c32a870089ef\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gtgb" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.907156 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f28c04c3-66f1-4c29-b7d1-cac5aa342370-serving-cert\") pod \"console-operator-58897d9998-bks74\" (UID: \"f28c04c3-66f1-4c29-b7d1-cac5aa342370\") " pod="openshift-console-operator/console-operator-58897d9998-bks74" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.907347 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ed6485ab-c517-41cd-a755-d5dc9557456b-etcd-client\") pod \"apiserver-76f77b778f-t48rm\" (UID: \"ed6485ab-c517-41cd-a755-d5dc9557456b\") " pod="openshift-apiserver/apiserver-76f77b778f-t48rm" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.908638 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86ffb50f-47f6-47b2-9141-1de9999a13e0-config\") pod \"controller-manager-879f6c89f-7ph8l\" (UID: \"86ffb50f-47f6-47b2-9141-1de9999a13e0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7ph8l" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.908633 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/85a5ad8f-8c61-4e28-8e23-9a51e7796e37-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-qks97\" (UID: \"85a5ad8f-8c61-4e28-8e23-9a51e7796e37\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qks97" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.909545 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-pb7s7\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.910248 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/e726eeb3-dfb2-4c3a-bea7-a5c945f25d52-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-89r2g\" (UID: \"e726eeb3-dfb2-4c3a-bea7-a5c945f25d52\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-89r2g" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.910476 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/102f7fb5-3031-4853-b112-2aa910aa63a7-config\") pod \"route-controller-manager-6576b87f9c-vtqjw\" (UID: \"102f7fb5-3031-4853-b112-2aa910aa63a7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vtqjw" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.910972 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e811d29-20d6-4576-be0f-dc59cf11b497-config\") pod \"openshift-apiserver-operator-796bbdcf4f-jdtt8\" (UID: 
\"4e811d29-20d6-4576-be0f-dc59cf11b497\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jdtt8" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.911100 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/958d08af-86cf-467f-947b-4485163fd695-auth-proxy-config\") pod \"machine-approver-56656f9798-rxhk8\" (UID: \"958d08af-86cf-467f-947b-4485163fd695\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rxhk8" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.911238 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/85a5ad8f-8c61-4e28-8e23-9a51e7796e37-serving-cert\") pod \"authentication-operator-69f744f599-qks97\" (UID: \"85a5ad8f-8c61-4e28-8e23-9a51e7796e37\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qks97" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.911602 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/102f7fb5-3031-4853-b112-2aa910aa63a7-client-ca\") pod \"route-controller-manager-6576b87f9c-vtqjw\" (UID: \"102f7fb5-3031-4853-b112-2aa910aa63a7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vtqjw" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.911619 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed6485ab-c517-41cd-a755-d5dc9557456b-trusted-ca-bundle\") pod \"apiserver-76f77b778f-t48rm\" (UID: \"ed6485ab-c517-41cd-a755-d5dc9557456b\") " pod="openshift-apiserver/apiserver-76f77b778f-t48rm" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.912256 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/e726eeb3-dfb2-4c3a-bea7-a5c945f25d52-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-89r2g\" (UID: \"e726eeb3-dfb2-4c3a-bea7-a5c945f25d52\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-89r2g" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.912302 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f28c04c3-66f1-4c29-b7d1-cac5aa342370-config\") pod \"console-operator-58897d9998-bks74\" (UID: \"f28c04c3-66f1-4c29-b7d1-cac5aa342370\") " pod="openshift-console-operator/console-operator-58897d9998-bks74" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.912702 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/958d08af-86cf-467f-947b-4485163fd695-config\") pod \"machine-approver-56656f9798-rxhk8\" (UID: \"958d08af-86cf-467f-947b-4485163fd695\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rxhk8" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.912789 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-pb7s7\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.912852 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9de314c5-1440-476b-b98b-7804f5d95145-audit-dir\") pod \"oauth-openshift-558db77b4-pb7s7\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.913455 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-image-registry/image-registry-697d97f7c8-4h49m"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.914708 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85a5ad8f-8c61-4e28-8e23-9a51e7796e37-config\") pod \"authentication-operator-69f744f599-qks97\" (UID: \"85a5ad8f-8c61-4e28-8e23-9a51e7796e37\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qks97" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.915628 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/86ffb50f-47f6-47b2-9141-1de9999a13e0-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-7ph8l\" (UID: \"86ffb50f-47f6-47b2-9141-1de9999a13e0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7ph8l" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.904281 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/ed6485ab-c517-41cd-a755-d5dc9557456b-audit\") pod \"apiserver-76f77b778f-t48rm\" (UID: \"ed6485ab-c517-41cd-a755-d5dc9557456b\") " pod="openshift-apiserver/apiserver-76f77b778f-t48rm" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.915709 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/102f7fb5-3031-4853-b112-2aa910aa63a7-serving-cert\") pod \"route-controller-manager-6576b87f9c-vtqjw\" (UID: \"102f7fb5-3031-4853-b112-2aa910aa63a7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vtqjw" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.916011 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab78fbf6-65df-4306-a7b8-c7bd98cfdf49-config\") pod \"machine-api-operator-5694c8668f-7mzng\" (UID: 
\"ab78fbf6-65df-4306-a7b8-c7bd98cfdf49\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7mzng" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.916469 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ed6485ab-c517-41cd-a755-d5dc9557456b-audit-dir\") pod \"apiserver-76f77b778f-t48rm\" (UID: \"ed6485ab-c517-41cd-a755-d5dc9557456b\") " pod="openshift-apiserver/apiserver-76f77b778f-t48rm" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.916513 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-pb7s7\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.917040 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3d1887a7-d8a5-45f6-97fc-c32a870089ef-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-5gtgb\" (UID: \"3d1887a7-d8a5-45f6-97fc-c32a870089ef\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gtgb" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.917083 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/86ffb50f-47f6-47b2-9141-1de9999a13e0-client-ca\") pod \"controller-manager-879f6c89f-7ph8l\" (UID: \"86ffb50f-47f6-47b2-9141-1de9999a13e0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7ph8l" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.917617 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ec60d287-0f21-467c-8030-84b8726af567-service-ca\") pod 
\"console-f9d7485db-rvkhj\" (UID: \"ec60d287-0f21-467c-8030-84b8726af567\") " pod="openshift-console/console-f9d7485db-rvkhj" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.917967 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec60d287-0f21-467c-8030-84b8726af567-trusted-ca-bundle\") pod \"console-f9d7485db-rvkhj\" (UID: \"ec60d287-0f21-467c-8030-84b8726af567\") " pod="openshift-console/console-f9d7485db-rvkhj" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.918041 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ec60d287-0f21-467c-8030-84b8726af567-console-serving-cert\") pod \"console-f9d7485db-rvkhj\" (UID: \"ec60d287-0f21-467c-8030-84b8726af567\") " pod="openshift-console/console-f9d7485db-rvkhj" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.918472 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3d1887a7-d8a5-45f6-97fc-c32a870089ef-audit-policies\") pod \"apiserver-7bbb656c7d-5gtgb\" (UID: \"3d1887a7-d8a5-45f6-97fc-c32a870089ef\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gtgb" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.918907 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ec60d287-0f21-467c-8030-84b8726af567-console-config\") pod \"console-f9d7485db-rvkhj\" (UID: \"ec60d287-0f21-467c-8030-84b8726af567\") " pod="openshift-console/console-f9d7485db-rvkhj" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.919043 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-pb7s7\" (UID: 
\"9de314c5-1440-476b-b98b-7804f5d95145\") " pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.919253 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.919561 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.919663 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.920358 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-pb7s7\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.920571 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-pb7s7\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.921602 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed6485ab-c517-41cd-a755-d5dc9557456b-serving-cert\") pod \"apiserver-76f77b778f-t48rm\" (UID: \"ed6485ab-c517-41cd-a755-d5dc9557456b\") " pod="openshift-apiserver/apiserver-76f77b778f-t48rm" Feb 19 21:30:27 crc 
kubenswrapper[4795]: I0219 21:30:27.921754 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-pb7s7\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.921826 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/ed6485ab-c517-41cd-a755-d5dc9557456b-image-import-ca\") pod \"apiserver-76f77b778f-t48rm\" (UID: \"ed6485ab-c517-41cd-a755-d5dc9557456b\") " pod="openshift-apiserver/apiserver-76f77b778f-t48rm" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.922243 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9de314c5-1440-476b-b98b-7804f5d95145-audit-policies\") pod \"oauth-openshift-558db77b4-pb7s7\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.922298 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ed6485ab-c517-41cd-a755-d5dc9557456b-node-pullsecrets\") pod \"apiserver-76f77b778f-t48rm\" (UID: \"ed6485ab-c517-41cd-a755-d5dc9557456b\") " pod="openshift-apiserver/apiserver-76f77b778f-t48rm" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.922573 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ab78fbf6-65df-4306-a7b8-c7bd98cfdf49-images\") pod \"machine-api-operator-5694c8668f-7mzng\" (UID: \"ab78fbf6-65df-4306-a7b8-c7bd98cfdf49\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-7mzng" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.922718 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ec60d287-0f21-467c-8030-84b8726af567-console-oauth-config\") pod \"console-f9d7485db-rvkhj\" (UID: \"ec60d287-0f21-467c-8030-84b8726af567\") " pod="openshift-console/console-f9d7485db-rvkhj" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.923138 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ed6485ab-c517-41cd-a755-d5dc9557456b-etcd-serving-ca\") pod \"apiserver-76f77b778f-t48rm\" (UID: \"ed6485ab-c517-41cd-a755-d5dc9557456b\") " pod="openshift-apiserver/apiserver-76f77b778f-t48rm" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.923903 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86ffb50f-47f6-47b2-9141-1de9999a13e0-serving-cert\") pod \"controller-manager-879f6c89f-7ph8l\" (UID: \"86ffb50f-47f6-47b2-9141-1de9999a13e0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7ph8l" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.924393 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3d1887a7-d8a5-45f6-97fc-c32a870089ef-etcd-client\") pod \"apiserver-7bbb656c7d-5gtgb\" (UID: \"3d1887a7-d8a5-45f6-97fc-c32a870089ef\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gtgb" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.924967 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sf27h"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.926774 4795 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-image-registry"/"installation-pull-secrets" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.928469 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-pb7s7\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.929117 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4e811d29-20d6-4576-be0f-dc59cf11b497-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-jdtt8\" (UID: \"4e811d29-20d6-4576-be0f-dc59cf11b497\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jdtt8" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.930203 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3d1887a7-d8a5-45f6-97fc-c32a870089ef-encryption-config\") pod \"apiserver-7bbb656c7d-5gtgb\" (UID: \"3d1887a7-d8a5-45f6-97fc-c32a870089ef\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gtgb" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.930262 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-pb7s7\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.930325 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/ab78fbf6-65df-4306-a7b8-c7bd98cfdf49-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-7mzng\" (UID: \"ab78fbf6-65df-4306-a7b8-c7bd98cfdf49\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7mzng" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.930710 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ed6485ab-c517-41cd-a755-d5dc9557456b-encryption-config\") pod \"apiserver-76f77b778f-t48rm\" (UID: \"ed6485ab-c517-41cd-a755-d5dc9557456b\") " pod="openshift-apiserver/apiserver-76f77b778f-t48rm" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.930752 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-sd5jz"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.932027 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-pb7s7\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.934493 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.935507 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-hxxkd"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.936515 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-hxxkd" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.937397 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6np95"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.939395 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-c6n7h"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.940178 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-c6n7h" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.943239 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-rvkhj"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.943271 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-t48rm"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.944199 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xjnbz"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.945426 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-6b8wp"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.946388 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-4lbxt"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.947349 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-vhbxd"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.948331 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-rt6qz"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.949348 4795 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7ph8l"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.950368 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-hxxkd"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.951372 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525610-9dlfj"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.952346 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-7m8cj"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.952888 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.953184 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-7m8cj" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.953390 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-st76p"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.953988 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-st76p" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.955870 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-ggndt"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.957119 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pgfxb"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.959002 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-89r2g"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.959955 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-8l99c"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.962836 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-tg6hf"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.963944 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nxhdc"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.965465 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m4bg2"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.966678 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-x54th"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.968246 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-94s6c"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.969683 4795 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-ingress-canary/ingress-canary-7m8cj"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.970827 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-2cbkw"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.972660 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.981577 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5t4db"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.982710 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-c6n7h"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.983941 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-42wfj"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.985008 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-wp452"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.986124 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-5qppq"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.987026 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dfw9n"] Feb 19 21:30:27 crc kubenswrapper[4795]: I0219 21:30:27.993380 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.013487 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 19 21:30:28 crc kubenswrapper[4795]: 
I0219 21:30:28.033091 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.052995 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.073210 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.092591 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.113353 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.133086 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.153047 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.173980 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.194398 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.212954 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.233716 4795 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress"/"service-ca-bundle" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.252930 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.273555 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.293287 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.314207 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.333119 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.353717 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.374113 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.392977 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.413195 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.434072 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.453850 4795 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.473562 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.493612 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.514261 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.533439 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.554330 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.574387 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.593751 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.613972 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.633329 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.652775 4795 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.673855 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.692919 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.714057 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.733480 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.753070 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.774035 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.794009 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.813110 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.832046 4795 request.go:700] Waited for 1.014814646s due to client-side throttling, not priority and fairness, request: 
GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-storage-version-migrator-operator/secrets?fieldSelector=metadata.name%3Dkube-storage-version-migrator-operator-dockercfg-2bh8d&limit=500&resourceVersion=0 Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.833653 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.853731 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.873711 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.893046 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.913373 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.953060 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.973900 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 19 21:30:28 crc kubenswrapper[4795]: I0219 21:30:28.993901 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.013917 4795 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.053097 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.073426 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.093506 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.113456 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.133766 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.153417 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.173643 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.194458 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.213952 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.234450 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 19 21:30:29 crc 
kubenswrapper[4795]: I0219 21:30:29.254376 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.274067 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.293972 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.314023 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.333387 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.353262 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.391833 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e726eeb3-dfb2-4c3a-bea7-a5c945f25d52-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-89r2g\" (UID: \"e726eeb3-dfb2-4c3a-bea7-a5c945f25d52\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-89r2g" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.410869 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxfbg\" (UniqueName: \"kubernetes.io/projected/9de314c5-1440-476b-b98b-7804f5d95145-kube-api-access-cxfbg\") pod \"oauth-openshift-558db77b4-pb7s7\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.431724 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7n56\" (UniqueName: \"kubernetes.io/projected/85a5ad8f-8c61-4e28-8e23-9a51e7796e37-kube-api-access-r7n56\") pod \"authentication-operator-69f744f599-qks97\" (UID: \"85a5ad8f-8c61-4e28-8e23-9a51e7796e37\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qks97" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.433380 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.455964 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.472123 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjzvv\" (UniqueName: \"kubernetes.io/projected/ab78fbf6-65df-4306-a7b8-c7bd98cfdf49-kube-api-access-qjzvv\") pod \"machine-api-operator-5694c8668f-7mzng\" (UID: \"ab78fbf6-65df-4306-a7b8-c7bd98cfdf49\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7mzng" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.488016 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbbjp\" (UniqueName: \"kubernetes.io/projected/4e811d29-20d6-4576-be0f-dc59cf11b497-kube-api-access-tbbjp\") pod \"openshift-apiserver-operator-796bbdcf4f-jdtt8\" (UID: \"4e811d29-20d6-4576-be0f-dc59cf11b497\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jdtt8" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.493921 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.497371 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-7mzng" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.505572 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jdtt8" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.514606 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.542621 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.555907 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.573338 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-qks97" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.596466 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qgn8\" (UniqueName: \"kubernetes.io/projected/e736a4ca-7c18-423c-a5de-aeafd8d6a42e-kube-api-access-6qgn8\") pod \"cluster-samples-operator-665b6dd947-22qsd\" (UID: \"e736a4ca-7c18-423c-a5de-aeafd8d6a42e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-22qsd" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.610824 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xv7l\" (UniqueName: \"kubernetes.io/projected/f28c04c3-66f1-4c29-b7d1-cac5aa342370-kube-api-access-8xv7l\") pod \"console-operator-58897d9998-bks74\" (UID: \"f28c04c3-66f1-4c29-b7d1-cac5aa342370\") " pod="openshift-console-operator/console-operator-58897d9998-bks74" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 
21:30:29.629362 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rvcj\" (UniqueName: \"kubernetes.io/projected/86ffb50f-47f6-47b2-9141-1de9999a13e0-kube-api-access-4rvcj\") pod \"controller-manager-879f6c89f-7ph8l\" (UID: \"86ffb50f-47f6-47b2-9141-1de9999a13e0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7ph8l" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.649463 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-22qsd" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.650323 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4w4sp\" (UniqueName: \"kubernetes.io/projected/96e1b4b4-8d48-4955-b756-71d21a5aea0b-kube-api-access-4w4sp\") pod \"downloads-7954f5f757-rt6qz\" (UID: \"96e1b4b4-8d48-4955-b756-71d21a5aea0b\") " pod="openshift-console/downloads-7954f5f757-rt6qz" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.652922 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-pb7s7"] Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.670801 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cxdq\" (UniqueName: \"kubernetes.io/projected/958d08af-86cf-467f-947b-4485163fd695-kube-api-access-6cxdq\") pod \"machine-approver-56656f9798-rxhk8\" (UID: \"958d08af-86cf-467f-947b-4485163fd695\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rxhk8" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.683961 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-7mzng"] Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.689085 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnzj5\" (UniqueName: 
\"kubernetes.io/projected/ed6485ab-c517-41cd-a755-d5dc9557456b-kube-api-access-rnzj5\") pod \"apiserver-76f77b778f-t48rm\" (UID: \"ed6485ab-c517-41cd-a755-d5dc9557456b\") " pod="openshift-apiserver/apiserver-76f77b778f-t48rm" Feb 19 21:30:29 crc kubenswrapper[4795]: W0219 21:30:29.690865 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab78fbf6_65df_4306_a7b8_c7bd98cfdf49.slice/crio-e08cbad049afd6e4df92a67723860d8461febe9f1e199e1df3fb5da0e93d79e9 WatchSource:0}: Error finding container e08cbad049afd6e4df92a67723860d8461febe9f1e199e1df3fb5da0e93d79e9: Status 404 returned error can't find the container with id e08cbad049afd6e4df92a67723860d8461febe9f1e199e1df3fb5da0e93d79e9 Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.727520 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czvcn\" (UniqueName: \"kubernetes.io/projected/3d1887a7-d8a5-45f6-97fc-c32a870089ef-kube-api-access-czvcn\") pod \"apiserver-7bbb656c7d-5gtgb\" (UID: \"3d1887a7-d8a5-45f6-97fc-c32a870089ef\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gtgb" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.749761 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-546h9\" (UniqueName: \"kubernetes.io/projected/102f7fb5-3031-4853-b112-2aa910aa63a7-kube-api-access-546h9\") pod \"route-controller-manager-6576b87f9c-vtqjw\" (UID: \"102f7fb5-3031-4853-b112-2aa910aa63a7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vtqjw" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.752266 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8p5td\" (UniqueName: \"kubernetes.io/projected/ec60d287-0f21-467c-8030-84b8726af567-kube-api-access-8p5td\") pod \"console-f9d7485db-rvkhj\" (UID: \"ec60d287-0f21-467c-8030-84b8726af567\") " 
pod="openshift-console/console-f9d7485db-rvkhj" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.768520 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4c8ll\" (UniqueName: \"kubernetes.io/projected/e726eeb3-dfb2-4c3a-bea7-a5c945f25d52-kube-api-access-4c8ll\") pod \"cluster-image-registry-operator-dc59b4c8b-89r2g\" (UID: \"e726eeb3-dfb2-4c3a-bea7-a5c945f25d52\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-89r2g" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.774789 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.777396 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-bks74" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.785550 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-rvkhj" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.790832 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jdtt8"] Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.791103 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-rt6qz" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.793238 4795 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.813190 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.814331 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gtgb" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.826884 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vtqjw" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.828920 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-qks97"] Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.832556 4795 request.go:700] Waited for 1.892220907s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-dns/secrets?fieldSelector=metadata.name%3Ddns-default-metrics-tls&limit=500&resourceVersion=0 Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.834129 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 19 21:30:29 crc kubenswrapper[4795]: W0219 21:30:29.839578 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e811d29_20d6_4576_be0f_dc59cf11b497.slice/crio-a225b9dbb5829dc9a010e0cea574920e124adc579b745a974853939c6761b15b WatchSource:0}: Error finding container a225b9dbb5829dc9a010e0cea574920e124adc579b745a974853939c6761b15b: Status 404 returned error can't find the container with id a225b9dbb5829dc9a010e0cea574920e124adc579b745a974853939c6761b15b Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.853795 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.874622 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.893896 4795 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.907435 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-7ph8l" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.907789 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-22qsd"] Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.913896 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.929257 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rxhk8" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.933952 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.955584 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.973957 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-t48rm" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.975635 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 19 21:30:29 crc kubenswrapper[4795]: I0219 21:30:29.993966 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.013152 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.066534 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-89r2g" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.106866 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rxhk8" event={"ID":"958d08af-86cf-467f-947b-4485163fd695","Type":"ContainerStarted","Data":"6ef3a2b7973f10cbab1a50172dcfa9231c814a27302bbe3f3d2d2766a6779de2"} Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.111962 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-7mzng" event={"ID":"ab78fbf6-65df-4306-a7b8-c7bd98cfdf49","Type":"ContainerStarted","Data":"5e9a49f9ee76f8861850372260d1dc3a6cb6e8528a5cfb294577c6562c884c7e"} Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.111988 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-7mzng" event={"ID":"ab78fbf6-65df-4306-a7b8-c7bd98cfdf49","Type":"ContainerStarted","Data":"e399854ac6989202c88f9416e637fc539ba3651ba6b6d92efc53ec026b8cbe64"} Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.111998 4795 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-7mzng" event={"ID":"ab78fbf6-65df-4306-a7b8-c7bd98cfdf49","Type":"ContainerStarted","Data":"e08cbad049afd6e4df92a67723860d8461febe9f1e199e1df3fb5da0e93d79e9"} Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.117863 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" event={"ID":"9de314c5-1440-476b-b98b-7804f5d95145","Type":"ContainerStarted","Data":"4877e8c5ce5d915ffce78d7fcaa41becc17c2fa95a99e16fe8f7fd820d5bc200"} Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.117923 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.117937 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" event={"ID":"9de314c5-1440-476b-b98b-7804f5d95145","Type":"ContainerStarted","Data":"704e722c476db821d8e6f00d8c80db7e6888aef51f3367193fd7b4f2cac02bc3"} Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.121006 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-22qsd" event={"ID":"e736a4ca-7c18-423c-a5de-aeafd8d6a42e","Type":"ContainerStarted","Data":"a47acb3ecc1d10e216fd9499681158ffe4290a98dd0b6be6c113f96e5a89d287"} Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.123931 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/96b2274a-7361-4463-8562-1319e967066b-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-94s6c\" (UID: \"96b2274a-7361-4463-8562-1319e967066b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-94s6c" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 
21:30:30.123966 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1f61e27e-ff25-4d76-814b-ed72e576547c-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xjnbz\" (UID: \"1f61e27e-ff25-4d76-814b-ed72e576547c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xjnbz" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.124001 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b7390208-03b5-4ffa-9259-f5f1d9354c52-serving-cert\") pod \"etcd-operator-b45778765-sd5jz\" (UID: \"b7390208-03b5-4ffa-9259-f5f1d9354c52\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sd5jz" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.124020 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vppps\" (UniqueName: \"kubernetes.io/projected/a3a4a159-1ab3-412f-ac71-11a7a41012ea-kube-api-access-vppps\") pod \"router-default-5444994796-r8dcx\" (UID: \"a3a4a159-1ab3-412f-ac71-11a7a41012ea\") " pod="openshift-ingress/router-default-5444994796-r8dcx" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.124057 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pz87x\" (UniqueName: \"kubernetes.io/projected/821fa263-1235-4a62-8818-1f41d7e77a62-kube-api-access-pz87x\") pod \"openshift-controller-manager-operator-756b6f6bc6-sf27h\" (UID: \"821fa263-1235-4a62-8818-1f41d7e77a62\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sf27h" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.124104 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ts7rd\" (UniqueName: 
\"kubernetes.io/projected/4491a495-df05-43ef-bff7-2438317eac71-kube-api-access-ts7rd\") pod \"dns-operator-744455d44c-ggndt\" (UID: \"4491a495-df05-43ef-bff7-2438317eac71\") " pod="openshift-dns-operator/dns-operator-744455d44c-ggndt" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.124120 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgtbb\" (UniqueName: \"kubernetes.io/projected/b9f6ef9a-86de-4d04-ba12-30197f5c83ed-kube-api-access-hgtbb\") pod \"machine-config-controller-84d6567774-tg6hf\" (UID: \"b9f6ef9a-86de-4d04-ba12-30197f5c83ed\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tg6hf" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.124134 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fee0aadb-9ec0-4e6c-bb48-27b74560a4ac-auth-proxy-config\") pod \"machine-config-operator-74547568cd-vhbxd\" (UID: \"fee0aadb-9ec0-4e6c-bb48-27b74560a4ac\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vhbxd" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.124149 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b7390208-03b5-4ffa-9259-f5f1d9354c52-etcd-client\") pod \"etcd-operator-b45778765-sd5jz\" (UID: \"b7390208-03b5-4ffa-9259-f5f1d9354c52\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sd5jz" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.124178 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/35ca1bac-5479-4e7f-80a5-c1811edc9e8e-available-featuregates\") pod \"openshift-config-operator-7777fb866f-8l99c\" (UID: \"35ca1bac-5479-4e7f-80a5-c1811edc9e8e\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-8l99c" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.124196 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/49a1bc45-62f5-45d2-a475-b2b562cd9b98-srv-cert\") pod \"catalog-operator-68c6474976-5t4db\" (UID: \"49a1bc45-62f5-45d2-a475-b2b562cd9b98\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5t4db" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.124241 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4kqs\" (UniqueName: \"kubernetes.io/projected/80407681-6091-46cc-836f-757ec4d16604-kube-api-access-b4kqs\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.124255 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4491a495-df05-43ef-bff7-2438317eac71-metrics-tls\") pod \"dns-operator-744455d44c-ggndt\" (UID: \"4491a495-df05-43ef-bff7-2438317eac71\") " pod="openshift-dns-operator/dns-operator-744455d44c-ggndt" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.124279 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/80407681-6091-46cc-836f-757ec4d16604-registry-certificates\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.124301 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/275a566e-d165-434a-95c1-9154ede6e14e-bound-sa-token\") pod \"ingress-operator-5b745b69d9-4lbxt\" (UID: \"275a566e-d165-434a-95c1-9154ede6e14e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4lbxt" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.124327 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89594551-78e6-49af-9376-477cf01d2dc5-config\") pod \"kube-apiserver-operator-766d6c64bb-6np95\" (UID: \"89594551-78e6-49af-9376-477cf01d2dc5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6np95" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.124361 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/821fa263-1235-4a62-8818-1f41d7e77a62-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-sf27h\" (UID: \"821fa263-1235-4a62-8818-1f41d7e77a62\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sf27h" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.124385 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7390208-03b5-4ffa-9259-f5f1d9354c52-config\") pod \"etcd-operator-b45778765-sd5jz\" (UID: \"b7390208-03b5-4ffa-9259-f5f1d9354c52\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sd5jz" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.124400 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a3a4a159-1ab3-412f-ac71-11a7a41012ea-service-ca-bundle\") pod \"router-default-5444994796-r8dcx\" (UID: \"a3a4a159-1ab3-412f-ac71-11a7a41012ea\") " pod="openshift-ingress/router-default-5444994796-r8dcx" 
Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.124416 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9be22b71-1386-4be4-a38a-6f3b97669b9c-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-x54th\" (UID: \"9be22b71-1386-4be4-a38a-6f3b97669b9c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-x54th" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.124446 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/89594551-78e6-49af-9376-477cf01d2dc5-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-6np95\" (UID: \"89594551-78e6-49af-9376-477cf01d2dc5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6np95" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.124460 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b9f6ef9a-86de-4d04-ba12-30197f5c83ed-proxy-tls\") pod \"machine-config-controller-84d6567774-tg6hf\" (UID: \"b9f6ef9a-86de-4d04-ba12-30197f5c83ed\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tg6hf" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.124475 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/0da0af7f-f8f8-492d-bd44-1e81ab242a24-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-42wfj\" (UID: \"0da0af7f-f8f8-492d-bd44-1e81ab242a24\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-42wfj" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.124507 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74cgb\" (UniqueName: \"kubernetes.io/projected/116348e7-5632-4244-8ae4-f81b06c6df4d-kube-api-access-74cgb\") pod \"migrator-59844c95c7-6b8wp\" (UID: \"116348e7-5632-4244-8ae4-f81b06c6df4d\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6b8wp" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.124523 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hxj5\" (UniqueName: \"kubernetes.io/projected/49a1bc45-62f5-45d2-a475-b2b562cd9b98-kube-api-access-5hxj5\") pod \"catalog-operator-68c6474976-5t4db\" (UID: \"49a1bc45-62f5-45d2-a475-b2b562cd9b98\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5t4db" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.124539 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4tq7\" (UniqueName: \"kubernetes.io/projected/73ed383c-c1e3-4f47-86d3-6faa77121e28-kube-api-access-q4tq7\") pod \"kube-storage-version-migrator-operator-b67b599dd-nxhdc\" (UID: \"73ed383c-c1e3-4f47-86d3-6faa77121e28\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nxhdc" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.124563 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9be22b71-1386-4be4-a38a-6f3b97669b9c-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-x54th\" (UID: \"9be22b71-1386-4be4-a38a-6f3b97669b9c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-x54th" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.124643 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/fee0aadb-9ec0-4e6c-bb48-27b74560a4ac-images\") pod \"machine-config-operator-74547568cd-vhbxd\" (UID: \"fee0aadb-9ec0-4e6c-bb48-27b74560a4ac\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vhbxd" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.124674 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjl9d\" (UniqueName: \"kubernetes.io/projected/275a566e-d165-434a-95c1-9154ede6e14e-kube-api-access-pjl9d\") pod \"ingress-operator-5b745b69d9-4lbxt\" (UID: \"275a566e-d165-434a-95c1-9154ede6e14e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4lbxt" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.124710 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73ed383c-c1e3-4f47-86d3-6faa77121e28-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-nxhdc\" (UID: \"73ed383c-c1e3-4f47-86d3-6faa77121e28\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nxhdc" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.124726 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lt2hd\" (UniqueName: \"kubernetes.io/projected/fee0aadb-9ec0-4e6c-bb48-27b74560a4ac-kube-api-access-lt2hd\") pod \"machine-config-operator-74547568cd-vhbxd\" (UID: \"fee0aadb-9ec0-4e6c-bb48-27b74560a4ac\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vhbxd" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.124749 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a3a4a159-1ab3-412f-ac71-11a7a41012ea-metrics-certs\") pod \"router-default-5444994796-r8dcx\" (UID: 
\"a3a4a159-1ab3-412f-ac71-11a7a41012ea\") " pod="openshift-ingress/router-default-5444994796-r8dcx" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.124764 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f61e27e-ff25-4d76-814b-ed72e576547c-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xjnbz\" (UID: \"1f61e27e-ff25-4d76-814b-ed72e576547c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xjnbz" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.124779 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/275a566e-d165-434a-95c1-9154ede6e14e-trusted-ca\") pod \"ingress-operator-5b745b69d9-4lbxt\" (UID: \"275a566e-d165-434a-95c1-9154ede6e14e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4lbxt" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.124795 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f61e27e-ff25-4d76-814b-ed72e576547c-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xjnbz\" (UID: \"1f61e27e-ff25-4d76-814b-ed72e576547c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xjnbz" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.124810 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/80407681-6091-46cc-836f-757ec4d16604-trusted-ca\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.124836 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.124895 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89594551-78e6-49af-9376-477cf01d2dc5-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-6np95\" (UID: \"89594551-78e6-49af-9376-477cf01d2dc5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6np95" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.124960 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/80407681-6091-46cc-836f-757ec4d16604-bound-sa-token\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.124975 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrjbp\" (UniqueName: \"kubernetes.io/projected/35ca1bac-5479-4e7f-80a5-c1811edc9e8e-kube-api-access-nrjbp\") pod \"openshift-config-operator-7777fb866f-8l99c\" (UID: \"35ca1bac-5479-4e7f-80a5-c1811edc9e8e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8l99c" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.125001 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6k8fb\" (UniqueName: \"kubernetes.io/projected/96b2274a-7361-4463-8562-1319e967066b-kube-api-access-6k8fb\") pod 
\"package-server-manager-789f6589d5-94s6c\" (UID: \"96b2274a-7361-4463-8562-1319e967066b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-94s6c" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.125054 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/b7390208-03b5-4ffa-9259-f5f1d9354c52-etcd-service-ca\") pod \"etcd-operator-b45778765-sd5jz\" (UID: \"b7390208-03b5-4ffa-9259-f5f1d9354c52\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sd5jz" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.125071 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swsnw\" (UniqueName: \"kubernetes.io/projected/b7390208-03b5-4ffa-9259-f5f1d9354c52-kube-api-access-swsnw\") pod \"etcd-operator-b45778765-sd5jz\" (UID: \"b7390208-03b5-4ffa-9259-f5f1d9354c52\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sd5jz" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.125087 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b9f6ef9a-86de-4d04-ba12-30197f5c83ed-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-tg6hf\" (UID: \"b9f6ef9a-86de-4d04-ba12-30197f5c83ed\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tg6hf" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.125101 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/80407681-6091-46cc-836f-757ec4d16604-installation-pull-secrets\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:30 crc 
kubenswrapper[4795]: I0219 21:30:30.125116 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/a3a4a159-1ab3-412f-ac71-11a7a41012ea-default-certificate\") pod \"router-default-5444994796-r8dcx\" (UID: \"a3a4a159-1ab3-412f-ac71-11a7a41012ea\") " pod="openshift-ingress/router-default-5444994796-r8dcx" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.125134 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/b7390208-03b5-4ffa-9259-f5f1d9354c52-etcd-ca\") pod \"etcd-operator-b45778765-sd5jz\" (UID: \"b7390208-03b5-4ffa-9259-f5f1d9354c52\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sd5jz" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.125150 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/80407681-6091-46cc-836f-757ec4d16604-ca-trust-extracted\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.125178 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/49a1bc45-62f5-45d2-a475-b2b562cd9b98-profile-collector-cert\") pod \"catalog-operator-68c6474976-5t4db\" (UID: \"49a1bc45-62f5-45d2-a475-b2b562cd9b98\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5t4db" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.125194 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/a3a4a159-1ab3-412f-ac71-11a7a41012ea-stats-auth\") pod 
\"router-default-5444994796-r8dcx\" (UID: \"a3a4a159-1ab3-412f-ac71-11a7a41012ea\") " pod="openshift-ingress/router-default-5444994796-r8dcx" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.125210 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fee0aadb-9ec0-4e6c-bb48-27b74560a4ac-proxy-tls\") pod \"machine-config-operator-74547568cd-vhbxd\" (UID: \"fee0aadb-9ec0-4e6c-bb48-27b74560a4ac\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vhbxd" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.125225 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/821fa263-1235-4a62-8818-1f41d7e77a62-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-sf27h\" (UID: \"821fa263-1235-4a62-8818-1f41d7e77a62\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sf27h" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.125268 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/80407681-6091-46cc-836f-757ec4d16604-registry-tls\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.125283 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/73ed383c-c1e3-4f47-86d3-6faa77121e28-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-nxhdc\" (UID: \"73ed383c-c1e3-4f47-86d3-6faa77121e28\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nxhdc" Feb 19 21:30:30 crc 
kubenswrapper[4795]: I0219 21:30:30.125313 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hh8cm\" (UniqueName: \"kubernetes.io/projected/0da0af7f-f8f8-492d-bd44-1e81ab242a24-kube-api-access-hh8cm\") pod \"control-plane-machine-set-operator-78cbb6b69f-42wfj\" (UID: \"0da0af7f-f8f8-492d-bd44-1e81ab242a24\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-42wfj" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.125338 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/275a566e-d165-434a-95c1-9154ede6e14e-metrics-tls\") pod \"ingress-operator-5b745b69d9-4lbxt\" (UID: \"275a566e-d165-434a-95c1-9154ede6e14e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4lbxt" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.125353 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9be22b71-1386-4be4-a38a-6f3b97669b9c-config\") pod \"kube-controller-manager-operator-78b949d7b-x54th\" (UID: \"9be22b71-1386-4be4-a38a-6f3b97669b9c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-x54th" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.125370 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/35ca1bac-5479-4e7f-80a5-c1811edc9e8e-serving-cert\") pod \"openshift-config-operator-7777fb866f-8l99c\" (UID: \"35ca1bac-5479-4e7f-80a5-c1811edc9e8e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8l99c" Feb 19 21:30:30 crc kubenswrapper[4795]: E0219 21:30:30.127839 4795 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:30.627827569 +0000 UTC m=+141.820345433 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.129570 4795 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-pb7s7 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.10:6443/healthz\": dial tcp 10.217.0.10:6443: connect: connection refused" start-of-body= Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.129618 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" podUID="9de314c5-1440-476b-b98b-7804f5d95145" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.10:6443/healthz\": dial tcp 10.217.0.10:6443: connect: connection refused" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.132289 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-qks97" event={"ID":"85a5ad8f-8c61-4e28-8e23-9a51e7796e37","Type":"ContainerStarted","Data":"1e9c5c5010f2025652496676ae8ef5a35367e452fc08339397f613200f034bf7"} Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.132367 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-qks97" 
event={"ID":"85a5ad8f-8c61-4e28-8e23-9a51e7796e37","Type":"ContainerStarted","Data":"009cee7f31631bbb3b8adf58cf6d07a80b4ad1881ea2472cb834a312f36c3b3b"} Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.135076 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jdtt8" event={"ID":"4e811d29-20d6-4576-be0f-dc59cf11b497","Type":"ContainerStarted","Data":"f98138c9981b8d5841eb213d8855b1938ee68b7d61ad23adca721db8dc42dd50"} Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.135150 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jdtt8" event={"ID":"4e811d29-20d6-4576-be0f-dc59cf11b497","Type":"ContainerStarted","Data":"a225b9dbb5829dc9a010e0cea574920e124adc579b745a974853939c6761b15b"} Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.226087 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.226348 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pz87x\" (UniqueName: \"kubernetes.io/projected/821fa263-1235-4a62-8818-1f41d7e77a62-kube-api-access-pz87x\") pod \"openshift-controller-manager-operator-756b6f6bc6-sf27h\" (UID: \"821fa263-1235-4a62-8818-1f41d7e77a62\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sf27h" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.226422 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ts7rd\" (UniqueName: 
\"kubernetes.io/projected/4491a495-df05-43ef-bff7-2438317eac71-kube-api-access-ts7rd\") pod \"dns-operator-744455d44c-ggndt\" (UID: \"4491a495-df05-43ef-bff7-2438317eac71\") " pod="openshift-dns-operator/dns-operator-744455d44c-ggndt" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.226466 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgtbb\" (UniqueName: \"kubernetes.io/projected/b9f6ef9a-86de-4d04-ba12-30197f5c83ed-kube-api-access-hgtbb\") pod \"machine-config-controller-84d6567774-tg6hf\" (UID: \"b9f6ef9a-86de-4d04-ba12-30197f5c83ed\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tg6hf" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.226494 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b7390208-03b5-4ffa-9259-f5f1d9354c52-etcd-client\") pod \"etcd-operator-b45778765-sd5jz\" (UID: \"b7390208-03b5-4ffa-9259-f5f1d9354c52\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sd5jz" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.226520 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fee0aadb-9ec0-4e6c-bb48-27b74560a4ac-auth-proxy-config\") pod \"machine-config-operator-74547568cd-vhbxd\" (UID: \"fee0aadb-9ec0-4e6c-bb48-27b74560a4ac\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vhbxd" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.226550 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/adcffec1-66db-4e9b-9502-be3c9b008dde-metrics-tls\") pod \"dns-default-c6n7h\" (UID: \"adcffec1-66db-4e9b-9502-be3c9b008dde\") " pod="openshift-dns/dns-default-c6n7h" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.226576 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/35ca1bac-5479-4e7f-80a5-c1811edc9e8e-available-featuregates\") pod \"openshift-config-operator-7777fb866f-8l99c\" (UID: \"35ca1bac-5479-4e7f-80a5-c1811edc9e8e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8l99c" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.226602 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-258mh\" (UniqueName: \"kubernetes.io/projected/9ed4dfab-8b23-46d5-a983-db2ec1371ce2-kube-api-access-258mh\") pod \"ingress-canary-7m8cj\" (UID: \"9ed4dfab-8b23-46d5-a983-db2ec1371ce2\") " pod="openshift-ingress-canary/ingress-canary-7m8cj" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.226642 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/49a1bc45-62f5-45d2-a475-b2b562cd9b98-srv-cert\") pod \"catalog-operator-68c6474976-5t4db\" (UID: \"49a1bc45-62f5-45d2-a475-b2b562cd9b98\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5t4db" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.226671 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ht9nk\" (UniqueName: \"kubernetes.io/projected/3eac6f38-0d5b-4d93-bc1d-e1b8017adfe8-kube-api-access-ht9nk\") pod \"machine-config-server-st76p\" (UID: \"3eac6f38-0d5b-4d93-bc1d-e1b8017adfe8\") " pod="openshift-machine-config-operator/machine-config-server-st76p" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.226697 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/3eac6f38-0d5b-4d93-bc1d-e1b8017adfe8-certs\") pod \"machine-config-server-st76p\" (UID: \"3eac6f38-0d5b-4d93-bc1d-e1b8017adfe8\") " 
pod="openshift-machine-config-operator/machine-config-server-st76p" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.226738 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4kqs\" (UniqueName: \"kubernetes.io/projected/80407681-6091-46cc-836f-757ec4d16604-kube-api-access-b4kqs\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.226763 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4491a495-df05-43ef-bff7-2438317eac71-metrics-tls\") pod \"dns-operator-744455d44c-ggndt\" (UID: \"4491a495-df05-43ef-bff7-2438317eac71\") " pod="openshift-dns-operator/dns-operator-744455d44c-ggndt" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.226786 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/80407681-6091-46cc-836f-757ec4d16604-registry-certificates\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.226809 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/275a566e-d165-434a-95c1-9154ede6e14e-bound-sa-token\") pod \"ingress-operator-5b745b69d9-4lbxt\" (UID: \"275a566e-d165-434a-95c1-9154ede6e14e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4lbxt" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.226835 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89594551-78e6-49af-9376-477cf01d2dc5-config\") pod 
\"kube-apiserver-operator-766d6c64bb-6np95\" (UID: \"89594551-78e6-49af-9376-477cf01d2dc5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6np95" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.226858 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/5a3a8d91-b500-48db-9ceb-cc105b2eeb3a-mountpoint-dir\") pod \"csi-hostpathplugin-hxxkd\" (UID: \"5a3a8d91-b500-48db-9ceb-cc105b2eeb3a\") " pod="hostpath-provisioner/csi-hostpathplugin-hxxkd" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.226898 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/5a3a8d91-b500-48db-9ceb-cc105b2eeb3a-csi-data-dir\") pod \"csi-hostpathplugin-hxxkd\" (UID: \"5a3a8d91-b500-48db-9ceb-cc105b2eeb3a\") " pod="hostpath-provisioner/csi-hostpathplugin-hxxkd" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.226920 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htpcf\" (UniqueName: \"kubernetes.io/projected/a82f8242-05f0-48f7-8f86-cf472309b8e3-kube-api-access-htpcf\") pod \"service-ca-operator-777779d784-2cbkw\" (UID: \"a82f8242-05f0-48f7-8f86-cf472309b8e3\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-2cbkw" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.226963 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/821fa263-1235-4a62-8818-1f41d7e77a62-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-sf27h\" (UID: \"821fa263-1235-4a62-8818-1f41d7e77a62\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sf27h" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.227030 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7390208-03b5-4ffa-9259-f5f1d9354c52-config\") pod \"etcd-operator-b45778765-sd5jz\" (UID: \"b7390208-03b5-4ffa-9259-f5f1d9354c52\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sd5jz" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.227055 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/5667f763-a535-49eb-90a2-b78f1ebad0b7-signing-key\") pod \"service-ca-9c57cc56f-wp452\" (UID: \"5667f763-a535-49eb-90a2-b78f1ebad0b7\") " pod="openshift-service-ca/service-ca-9c57cc56f-wp452" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.227093 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a3a4a159-1ab3-412f-ac71-11a7a41012ea-service-ca-bundle\") pod \"router-default-5444994796-r8dcx\" (UID: \"a3a4a159-1ab3-412f-ac71-11a7a41012ea\") " pod="openshift-ingress/router-default-5444994796-r8dcx" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.227115 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9be22b71-1386-4be4-a38a-6f3b97669b9c-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-x54th\" (UID: \"9be22b71-1386-4be4-a38a-6f3b97669b9c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-x54th" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.227138 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b9f6ef9a-86de-4d04-ba12-30197f5c83ed-proxy-tls\") pod \"machine-config-controller-84d6567774-tg6hf\" (UID: \"b9f6ef9a-86de-4d04-ba12-30197f5c83ed\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tg6hf" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.227180 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/0da0af7f-f8f8-492d-bd44-1e81ab242a24-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-42wfj\" (UID: \"0da0af7f-f8f8-492d-bd44-1e81ab242a24\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-42wfj" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.227209 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/89594551-78e6-49af-9376-477cf01d2dc5-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-6np95\" (UID: \"89594551-78e6-49af-9376-477cf01d2dc5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6np95" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.227261 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hxj5\" (UniqueName: \"kubernetes.io/projected/49a1bc45-62f5-45d2-a475-b2b562cd9b98-kube-api-access-5hxj5\") pod \"catalog-operator-68c6474976-5t4db\" (UID: \"49a1bc45-62f5-45d2-a475-b2b562cd9b98\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5t4db" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.227288 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4tq7\" (UniqueName: \"kubernetes.io/projected/73ed383c-c1e3-4f47-86d3-6faa77121e28-kube-api-access-q4tq7\") pod \"kube-storage-version-migrator-operator-b67b599dd-nxhdc\" (UID: \"73ed383c-c1e3-4f47-86d3-6faa77121e28\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nxhdc" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 
21:30:30.227316 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74cgb\" (UniqueName: \"kubernetes.io/projected/116348e7-5632-4244-8ae4-f81b06c6df4d-kube-api-access-74cgb\") pod \"migrator-59844c95c7-6b8wp\" (UID: \"116348e7-5632-4244-8ae4-f81b06c6df4d\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6b8wp" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.227345 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9be22b71-1386-4be4-a38a-6f3b97669b9c-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-x54th\" (UID: \"9be22b71-1386-4be4-a38a-6f3b97669b9c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-x54th" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.227371 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b11366da-6972-44ce-8e8c-151de77fa689-webhook-cert\") pod \"packageserver-d55dfcdfc-pgfxb\" (UID: \"b11366da-6972-44ce-8e8c-151de77fa689\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pgfxb" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.227429 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/fee0aadb-9ec0-4e6c-bb48-27b74560a4ac-images\") pod \"machine-config-operator-74547568cd-vhbxd\" (UID: \"fee0aadb-9ec0-4e6c-bb48-27b74560a4ac\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vhbxd" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.227452 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-675sf\" (UniqueName: \"kubernetes.io/projected/c66857b5-461d-4c11-8fa3-52cd619bba60-kube-api-access-675sf\") pod 
\"multus-admission-controller-857f4d67dd-5qppq\" (UID: \"c66857b5-461d-4c11-8fa3-52cd619bba60\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-5qppq" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.227491 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7005ed62-dcc4-4fb5-ac2b-3aba9de5708a-srv-cert\") pod \"olm-operator-6b444d44fb-m4bg2\" (UID: \"7005ed62-dcc4-4fb5-ac2b-3aba9de5708a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m4bg2" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.227514 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjl9d\" (UniqueName: \"kubernetes.io/projected/275a566e-d165-434a-95c1-9154ede6e14e-kube-api-access-pjl9d\") pod \"ingress-operator-5b745b69d9-4lbxt\" (UID: \"275a566e-d165-434a-95c1-9154ede6e14e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4lbxt" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.227542 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73ed383c-c1e3-4f47-86d3-6faa77121e28-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-nxhdc\" (UID: \"73ed383c-c1e3-4f47-86d3-6faa77121e28\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nxhdc" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.227565 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lt2hd\" (UniqueName: \"kubernetes.io/projected/fee0aadb-9ec0-4e6c-bb48-27b74560a4ac-kube-api-access-lt2hd\") pod \"machine-config-operator-74547568cd-vhbxd\" (UID: \"fee0aadb-9ec0-4e6c-bb48-27b74560a4ac\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vhbxd" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.227619 
4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a3a4a159-1ab3-412f-ac71-11a7a41012ea-metrics-certs\") pod \"router-default-5444994796-r8dcx\" (UID: \"a3a4a159-1ab3-412f-ac71-11a7a41012ea\") " pod="openshift-ingress/router-default-5444994796-r8dcx" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.227643 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f61e27e-ff25-4d76-814b-ed72e576547c-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xjnbz\" (UID: \"1f61e27e-ff25-4d76-814b-ed72e576547c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xjnbz" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.227668 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7005ed62-dcc4-4fb5-ac2b-3aba9de5708a-profile-collector-cert\") pod \"olm-operator-6b444d44fb-m4bg2\" (UID: \"7005ed62-dcc4-4fb5-ac2b-3aba9de5708a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m4bg2" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.227693 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/275a566e-d165-434a-95c1-9154ede6e14e-trusted-ca\") pod \"ingress-operator-5b745b69d9-4lbxt\" (UID: \"275a566e-d165-434a-95c1-9154ede6e14e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4lbxt" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.227727 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f61e27e-ff25-4d76-814b-ed72e576547c-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xjnbz\" (UID: \"1f61e27e-ff25-4d76-814b-ed72e576547c\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xjnbz" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.227750 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/80407681-6091-46cc-836f-757ec4d16604-trusted-ca\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.227794 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b11366da-6972-44ce-8e8c-151de77fa689-apiservice-cert\") pod \"packageserver-d55dfcdfc-pgfxb\" (UID: \"b11366da-6972-44ce-8e8c-151de77fa689\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pgfxb" Feb 19 21:30:30 crc kubenswrapper[4795]: E0219 21:30:30.227842 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:30.727822768 +0000 UTC m=+141.920340632 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.227745 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/35ca1bac-5479-4e7f-80a5-c1811edc9e8e-available-featuregates\") pod \"openshift-config-operator-7777fb866f-8l99c\" (UID: \"35ca1bac-5479-4e7f-80a5-c1811edc9e8e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8l99c" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.228382 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8bcw\" (UniqueName: \"kubernetes.io/projected/7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca-kube-api-access-t8bcw\") pod \"marketplace-operator-79b997595-dfw9n\" (UID: \"7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca\") " pod="openshift-marketplace/marketplace-operator-79b997595-dfw9n" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.228401 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgb98\" (UniqueName: \"kubernetes.io/projected/5a3a8d91-b500-48db-9ceb-cc105b2eeb3a-kube-api-access-zgb98\") pod \"csi-hostpathplugin-hxxkd\" (UID: \"5a3a8d91-b500-48db-9ceb-cc105b2eeb3a\") " pod="hostpath-provisioner/csi-hostpathplugin-hxxkd" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.228417 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a-secret-volume\") pod \"collect-profiles-29525610-9dlfj\" (UID: \"05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525610-9dlfj" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.228481 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89594551-78e6-49af-9376-477cf01d2dc5-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-6np95\" (UID: \"89594551-78e6-49af-9376-477cf01d2dc5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6np95" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.228532 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7l8c6\" (UniqueName: \"kubernetes.io/projected/5667f763-a535-49eb-90a2-b78f1ebad0b7-kube-api-access-7l8c6\") pod \"service-ca-9c57cc56f-wp452\" (UID: \"5667f763-a535-49eb-90a2-b78f1ebad0b7\") " pod="openshift-service-ca/service-ca-9c57cc56f-wp452" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.228550 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/80407681-6091-46cc-836f-757ec4d16604-bound-sa-token\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.228567 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrjbp\" (UniqueName: \"kubernetes.io/projected/35ca1bac-5479-4e7f-80a5-c1811edc9e8e-kube-api-access-nrjbp\") pod \"openshift-config-operator-7777fb866f-8l99c\" (UID: \"35ca1bac-5479-4e7f-80a5-c1811edc9e8e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8l99c" Feb 19 21:30:30 crc kubenswrapper[4795]: 
I0219 21:30:30.228623 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6k8fb\" (UniqueName: \"kubernetes.io/projected/96b2274a-7361-4463-8562-1319e967066b-kube-api-access-6k8fb\") pod \"package-server-manager-789f6589d5-94s6c\" (UID: \"96b2274a-7361-4463-8562-1319e967066b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-94s6c" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.228642 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-dfw9n\" (UID: \"7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca\") " pod="openshift-marketplace/marketplace-operator-79b997595-dfw9n" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.228662 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thnz2\" (UniqueName: \"kubernetes.io/projected/adcffec1-66db-4e9b-9502-be3c9b008dde-kube-api-access-thnz2\") pod \"dns-default-c6n7h\" (UID: \"adcffec1-66db-4e9b-9502-be3c9b008dde\") " pod="openshift-dns/dns-default-c6n7h" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.228680 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a82f8242-05f0-48f7-8f86-cf472309b8e3-config\") pod \"service-ca-operator-777779d784-2cbkw\" (UID: \"a82f8242-05f0-48f7-8f86-cf472309b8e3\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-2cbkw" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.228718 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/b7390208-03b5-4ffa-9259-f5f1d9354c52-etcd-service-ca\") pod \"etcd-operator-b45778765-sd5jz\" (UID: 
\"b7390208-03b5-4ffa-9259-f5f1d9354c52\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sd5jz" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.228736 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swsnw\" (UniqueName: \"kubernetes.io/projected/b7390208-03b5-4ffa-9259-f5f1d9354c52-kube-api-access-swsnw\") pod \"etcd-operator-b45778765-sd5jz\" (UID: \"b7390208-03b5-4ffa-9259-f5f1d9354c52\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sd5jz" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.228752 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b9f6ef9a-86de-4d04-ba12-30197f5c83ed-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-tg6hf\" (UID: \"b9f6ef9a-86de-4d04-ba12-30197f5c83ed\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tg6hf" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.229248 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73ed383c-c1e3-4f47-86d3-6faa77121e28-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-nxhdc\" (UID: \"73ed383c-c1e3-4f47-86d3-6faa77121e28\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nxhdc" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.233084 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/80407681-6091-46cc-836f-757ec4d16604-registry-certificates\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.235972 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/9be22b71-1386-4be4-a38a-6f3b97669b9c-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-x54th\" (UID: \"9be22b71-1386-4be4-a38a-6f3b97669b9c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-x54th" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.236058 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b7390208-03b5-4ffa-9259-f5f1d9354c52-etcd-client\") pod \"etcd-operator-b45778765-sd5jz\" (UID: \"b7390208-03b5-4ffa-9259-f5f1d9354c52\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sd5jz" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.237474 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fee0aadb-9ec0-4e6c-bb48-27b74560a4ac-auth-proxy-config\") pod \"machine-config-operator-74547568cd-vhbxd\" (UID: \"fee0aadb-9ec0-4e6c-bb48-27b74560a4ac\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vhbxd" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.237675 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89594551-78e6-49af-9376-477cf01d2dc5-config\") pod \"kube-apiserver-operator-766d6c64bb-6np95\" (UID: \"89594551-78e6-49af-9376-477cf01d2dc5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6np95" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.243203 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/49a1bc45-62f5-45d2-a475-b2b562cd9b98-srv-cert\") pod \"catalog-operator-68c6474976-5t4db\" (UID: \"49a1bc45-62f5-45d2-a475-b2b562cd9b98\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5t4db" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 
21:30:30.243482 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/fee0aadb-9ec0-4e6c-bb48-27b74560a4ac-images\") pod \"machine-config-operator-74547568cd-vhbxd\" (UID: \"fee0aadb-9ec0-4e6c-bb48-27b74560a4ac\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vhbxd" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.244277 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/0da0af7f-f8f8-492d-bd44-1e81ab242a24-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-42wfj\" (UID: \"0da0af7f-f8f8-492d-bd44-1e81ab242a24\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-42wfj" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.248120 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-rvkhj"] Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.252291 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7ph8l"] Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.254660 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f61e27e-ff25-4d76-814b-ed72e576547c-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xjnbz\" (UID: \"1f61e27e-ff25-4d76-814b-ed72e576547c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xjnbz" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.255534 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/275a566e-d165-434a-95c1-9154ede6e14e-trusted-ca\") pod \"ingress-operator-5b745b69d9-4lbxt\" (UID: \"275a566e-d165-434a-95c1-9154ede6e14e\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4lbxt" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.255542 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4491a495-df05-43ef-bff7-2438317eac71-metrics-tls\") pod \"dns-operator-744455d44c-ggndt\" (UID: \"4491a495-df05-43ef-bff7-2438317eac71\") " pod="openshift-dns-operator/dns-operator-744455d44c-ggndt" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.256288 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/a3a4a159-1ab3-412f-ac71-11a7a41012ea-default-certificate\") pod \"router-default-5444994796-r8dcx\" (UID: \"a3a4a159-1ab3-412f-ac71-11a7a41012ea\") " pod="openshift-ingress/router-default-5444994796-r8dcx" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.257346 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/821fa263-1235-4a62-8818-1f41d7e77a62-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-sf27h\" (UID: \"821fa263-1235-4a62-8818-1f41d7e77a62\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sf27h" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.256442 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/b7390208-03b5-4ffa-9259-f5f1d9354c52-etcd-ca\") pod \"etcd-operator-b45778765-sd5jz\" (UID: \"b7390208-03b5-4ffa-9259-f5f1d9354c52\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sd5jz" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.257598 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a3a4a159-1ab3-412f-ac71-11a7a41012ea-service-ca-bundle\") pod 
\"router-default-5444994796-r8dcx\" (UID: \"a3a4a159-1ab3-412f-ac71-11a7a41012ea\") " pod="openshift-ingress/router-default-5444994796-r8dcx" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.257660 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/5a3a8d91-b500-48db-9ceb-cc105b2eeb3a-plugins-dir\") pod \"csi-hostpathplugin-hxxkd\" (UID: \"5a3a8d91-b500-48db-9ceb-cc105b2eeb3a\") " pod="hostpath-provisioner/csi-hostpathplugin-hxxkd" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.257730 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/80407681-6091-46cc-836f-757ec4d16604-installation-pull-secrets\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.257770 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/3eac6f38-0d5b-4d93-bc1d-e1b8017adfe8-node-bootstrap-token\") pod \"machine-config-server-st76p\" (UID: \"3eac6f38-0d5b-4d93-bc1d-e1b8017adfe8\") " pod="openshift-machine-config-operator/machine-config-server-st76p" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.258322 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/80407681-6091-46cc-836f-757ec4d16604-ca-trust-extracted\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.258356 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/49a1bc45-62f5-45d2-a475-b2b562cd9b98-profile-collector-cert\") pod \"catalog-operator-68c6474976-5t4db\" (UID: \"49a1bc45-62f5-45d2-a475-b2b562cd9b98\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5t4db" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.258388 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/a3a4a159-1ab3-412f-ac71-11a7a41012ea-stats-auth\") pod \"router-default-5444994796-r8dcx\" (UID: \"a3a4a159-1ab3-412f-ac71-11a7a41012ea\") " pod="openshift-ingress/router-default-5444994796-r8dcx" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.258507 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5a3a8d91-b500-48db-9ceb-cc105b2eeb3a-registration-dir\") pod \"csi-hostpathplugin-hxxkd\" (UID: \"5a3a8d91-b500-48db-9ceb-cc105b2eeb3a\") " pod="hostpath-provisioner/csi-hostpathplugin-hxxkd" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.258720 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/80407681-6091-46cc-836f-757ec4d16604-ca-trust-extracted\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.258750 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/adcffec1-66db-4e9b-9502-be3c9b008dde-config-volume\") pod \"dns-default-c6n7h\" (UID: \"adcffec1-66db-4e9b-9502-be3c9b008dde\") " pod="openshift-dns/dns-default-c6n7h" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.258806 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/821fa263-1235-4a62-8818-1f41d7e77a62-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-sf27h\" (UID: \"821fa263-1235-4a62-8818-1f41d7e77a62\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sf27h" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.258864 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fee0aadb-9ec0-4e6c-bb48-27b74560a4ac-proxy-tls\") pod \"machine-config-operator-74547568cd-vhbxd\" (UID: \"fee0aadb-9ec0-4e6c-bb48-27b74560a4ac\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vhbxd" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.258886 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5a3a8d91-b500-48db-9ceb-cc105b2eeb3a-socket-dir\") pod \"csi-hostpathplugin-hxxkd\" (UID: \"5a3a8d91-b500-48db-9ceb-cc105b2eeb3a\") " pod="hostpath-provisioner/csi-hostpathplugin-hxxkd" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.258955 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b9f6ef9a-86de-4d04-ba12-30197f5c83ed-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-tg6hf\" (UID: \"b9f6ef9a-86de-4d04-ba12-30197f5c83ed\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tg6hf" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.258965 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/5667f763-a535-49eb-90a2-b78f1ebad0b7-signing-cabundle\") pod \"service-ca-9c57cc56f-wp452\" (UID: \"5667f763-a535-49eb-90a2-b78f1ebad0b7\") " 
pod="openshift-service-ca/service-ca-9c57cc56f-wp452" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.260287 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b9f6ef9a-86de-4d04-ba12-30197f5c83ed-proxy-tls\") pod \"machine-config-controller-84d6567774-tg6hf\" (UID: \"b9f6ef9a-86de-4d04-ba12-30197f5c83ed\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tg6hf" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.260694 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/821fa263-1235-4a62-8818-1f41d7e77a62-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-sf27h\" (UID: \"821fa263-1235-4a62-8818-1f41d7e77a62\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sf27h" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.261082 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/80407681-6091-46cc-836f-757ec4d16604-trusted-ca\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.261105 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ed4dfab-8b23-46d5-a983-db2ec1371ce2-cert\") pod \"ingress-canary-7m8cj\" (UID: \"9ed4dfab-8b23-46d5-a983-db2ec1371ce2\") " pod="openshift-ingress-canary/ingress-canary-7m8cj" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.261306 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/80407681-6091-46cc-836f-757ec4d16604-registry-tls\") pod \"image-registry-697d97f7c8-4h49m\" (UID: 
\"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.261396 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9bnm\" (UniqueName: \"kubernetes.io/projected/05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a-kube-api-access-w9bnm\") pod \"collect-profiles-29525610-9dlfj\" (UID: \"05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525610-9dlfj" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.261445 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/73ed383c-c1e3-4f47-86d3-6faa77121e28-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-nxhdc\" (UID: \"73ed383c-c1e3-4f47-86d3-6faa77121e28\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nxhdc" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.261475 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-dfw9n\" (UID: \"7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca\") " pod="openshift-marketplace/marketplace-operator-79b997595-dfw9n" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.261529 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hh8cm\" (UniqueName: \"kubernetes.io/projected/0da0af7f-f8f8-492d-bd44-1e81ab242a24-kube-api-access-hh8cm\") pod \"control-plane-machine-set-operator-78cbb6b69f-42wfj\" (UID: \"0da0af7f-f8f8-492d-bd44-1e81ab242a24\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-42wfj" Feb 19 21:30:30 crc kubenswrapper[4795]: 
I0219 21:30:30.261713 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c66857b5-461d-4c11-8fa3-52cd619bba60-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-5qppq\" (UID: \"c66857b5-461d-4c11-8fa3-52cd619bba60\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-5qppq" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.261756 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/275a566e-d165-434a-95c1-9154ede6e14e-metrics-tls\") pod \"ingress-operator-5b745b69d9-4lbxt\" (UID: \"275a566e-d165-434a-95c1-9154ede6e14e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4lbxt" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.261778 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9be22b71-1386-4be4-a38a-6f3b97669b9c-config\") pod \"kube-controller-manager-operator-78b949d7b-x54th\" (UID: \"9be22b71-1386-4be4-a38a-6f3b97669b9c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-x54th" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.261805 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/35ca1bac-5479-4e7f-80a5-c1811edc9e8e-serving-cert\") pod \"openshift-config-operator-7777fb866f-8l99c\" (UID: \"35ca1bac-5479-4e7f-80a5-c1811edc9e8e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8l99c" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.261825 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a-config-volume\") pod \"collect-profiles-29525610-9dlfj\" (UID: 
\"05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525610-9dlfj" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.261843 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvhgm\" (UniqueName: \"kubernetes.io/projected/7005ed62-dcc4-4fb5-ac2b-3aba9de5708a-kube-api-access-zvhgm\") pod \"olm-operator-6b444d44fb-m4bg2\" (UID: \"7005ed62-dcc4-4fb5-ac2b-3aba9de5708a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m4bg2" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.262701 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/49a1bc45-62f5-45d2-a475-b2b562cd9b98-profile-collector-cert\") pod \"catalog-operator-68c6474976-5t4db\" (UID: \"49a1bc45-62f5-45d2-a475-b2b562cd9b98\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5t4db" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.263450 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/a3a4a159-1ab3-412f-ac71-11a7a41012ea-stats-auth\") pod \"router-default-5444994796-r8dcx\" (UID: \"a3a4a159-1ab3-412f-ac71-11a7a41012ea\") " pod="openshift-ingress/router-default-5444994796-r8dcx" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.263456 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1f61e27e-ff25-4d76-814b-ed72e576547c-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xjnbz\" (UID: \"1f61e27e-ff25-4d76-814b-ed72e576547c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xjnbz" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.263481 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/96b2274a-7361-4463-8562-1319e967066b-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-94s6c\" (UID: \"96b2274a-7361-4463-8562-1319e967066b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-94s6c" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.263519 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xckwg\" (UniqueName: \"kubernetes.io/projected/b11366da-6972-44ce-8e8c-151de77fa689-kube-api-access-xckwg\") pod \"packageserver-d55dfcdfc-pgfxb\" (UID: \"b11366da-6972-44ce-8e8c-151de77fa689\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pgfxb" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.264914 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a82f8242-05f0-48f7-8f86-cf472309b8e3-serving-cert\") pod \"service-ca-operator-777779d784-2cbkw\" (UID: \"a82f8242-05f0-48f7-8f86-cf472309b8e3\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-2cbkw" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.264972 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b7390208-03b5-4ffa-9259-f5f1d9354c52-serving-cert\") pod \"etcd-operator-b45778765-sd5jz\" (UID: \"b7390208-03b5-4ffa-9259-f5f1d9354c52\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sd5jz" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.264998 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/b11366da-6972-44ce-8e8c-151de77fa689-tmpfs\") pod \"packageserver-d55dfcdfc-pgfxb\" (UID: \"b11366da-6972-44ce-8e8c-151de77fa689\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pgfxb" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.265073 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vppps\" (UniqueName: \"kubernetes.io/projected/a3a4a159-1ab3-412f-ac71-11a7a41012ea-kube-api-access-vppps\") pod \"router-default-5444994796-r8dcx\" (UID: \"a3a4a159-1ab3-412f-ac71-11a7a41012ea\") " pod="openshift-ingress/router-default-5444994796-r8dcx" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.266036 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/b7390208-03b5-4ffa-9259-f5f1d9354c52-etcd-ca\") pod \"etcd-operator-b45778765-sd5jz\" (UID: \"b7390208-03b5-4ffa-9259-f5f1d9354c52\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sd5jz" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.266453 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9be22b71-1386-4be4-a38a-6f3b97669b9c-config\") pod \"kube-controller-manager-operator-78b949d7b-x54th\" (UID: \"9be22b71-1386-4be4-a38a-6f3b97669b9c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-x54th" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.266542 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/b7390208-03b5-4ffa-9259-f5f1d9354c52-etcd-service-ca\") pod \"etcd-operator-b45778765-sd5jz\" (UID: \"b7390208-03b5-4ffa-9259-f5f1d9354c52\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sd5jz" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.266621 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-t48rm"] Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.266966 4795 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/80407681-6091-46cc-836f-757ec4d16604-installation-pull-secrets\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.267424 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/80407681-6091-46cc-836f-757ec4d16604-registry-tls\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.267675 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89594551-78e6-49af-9376-477cf01d2dc5-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-6np95\" (UID: \"89594551-78e6-49af-9376-477cf01d2dc5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6np95" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.269247 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/96b2274a-7361-4463-8562-1319e967066b-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-94s6c\" (UID: \"96b2274a-7361-4463-8562-1319e967066b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-94s6c" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.269469 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7390208-03b5-4ffa-9259-f5f1d9354c52-config\") pod \"etcd-operator-b45778765-sd5jz\" (UID: \"b7390208-03b5-4ffa-9259-f5f1d9354c52\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sd5jz" Feb 19 21:30:30 crc 
kubenswrapper[4795]: I0219 21:30:30.270066 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f61e27e-ff25-4d76-814b-ed72e576547c-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xjnbz\" (UID: \"1f61e27e-ff25-4d76-814b-ed72e576547c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xjnbz" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.270997 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/73ed383c-c1e3-4f47-86d3-6faa77121e28-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-nxhdc\" (UID: \"73ed383c-c1e3-4f47-86d3-6faa77121e28\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nxhdc" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.271648 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/a3a4a159-1ab3-412f-ac71-11a7a41012ea-default-certificate\") pod \"router-default-5444994796-r8dcx\" (UID: \"a3a4a159-1ab3-412f-ac71-11a7a41012ea\") " pod="openshift-ingress/router-default-5444994796-r8dcx" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.274379 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hxj5\" (UniqueName: \"kubernetes.io/projected/49a1bc45-62f5-45d2-a475-b2b562cd9b98-kube-api-access-5hxj5\") pod \"catalog-operator-68c6474976-5t4db\" (UID: \"49a1bc45-62f5-45d2-a475-b2b562cd9b98\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5t4db" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.274771 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fee0aadb-9ec0-4e6c-bb48-27b74560a4ac-proxy-tls\") pod 
\"machine-config-operator-74547568cd-vhbxd\" (UID: \"fee0aadb-9ec0-4e6c-bb48-27b74560a4ac\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vhbxd" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.275121 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/35ca1bac-5479-4e7f-80a5-c1811edc9e8e-serving-cert\") pod \"openshift-config-operator-7777fb866f-8l99c\" (UID: \"35ca1bac-5479-4e7f-80a5-c1811edc9e8e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8l99c" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.275419 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a3a4a159-1ab3-412f-ac71-11a7a41012ea-metrics-certs\") pod \"router-default-5444994796-r8dcx\" (UID: \"a3a4a159-1ab3-412f-ac71-11a7a41012ea\") " pod="openshift-ingress/router-default-5444994796-r8dcx" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.278224 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b7390208-03b5-4ffa-9259-f5f1d9354c52-serving-cert\") pod \"etcd-operator-b45778765-sd5jz\" (UID: \"b7390208-03b5-4ffa-9259-f5f1d9354c52\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sd5jz" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.286324 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/275a566e-d165-434a-95c1-9154ede6e14e-metrics-tls\") pod \"ingress-operator-5b745b69d9-4lbxt\" (UID: \"275a566e-d165-434a-95c1-9154ede6e14e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4lbxt" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.291888 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4tq7\" (UniqueName: 
\"kubernetes.io/projected/73ed383c-c1e3-4f47-86d3-6faa77121e28-kube-api-access-q4tq7\") pod \"kube-storage-version-migrator-operator-b67b599dd-nxhdc\" (UID: \"73ed383c-c1e3-4f47-86d3-6faa77121e28\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nxhdc" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.328277 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-89r2g"] Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.331066 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74cgb\" (UniqueName: \"kubernetes.io/projected/116348e7-5632-4244-8ae4-f81b06c6df4d-kube-api-access-74cgb\") pod \"migrator-59844c95c7-6b8wp\" (UID: \"116348e7-5632-4244-8ae4-f81b06c6df4d\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6b8wp" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.332467 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pz87x\" (UniqueName: \"kubernetes.io/projected/821fa263-1235-4a62-8818-1f41d7e77a62-kube-api-access-pz87x\") pod \"openshift-controller-manager-operator-756b6f6bc6-sf27h\" (UID: \"821fa263-1235-4a62-8818-1f41d7e77a62\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sf27h" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.336324 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-bks74"] Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.346817 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-rt6qz"] Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.347803 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-vtqjw"] Feb 19 21:30:30 crc 
kubenswrapper[4795]: I0219 21:30:30.349808 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-5gtgb"] Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.351480 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ts7rd\" (UniqueName: \"kubernetes.io/projected/4491a495-df05-43ef-bff7-2438317eac71-kube-api-access-ts7rd\") pod \"dns-operator-744455d44c-ggndt\" (UID: \"4491a495-df05-43ef-bff7-2438317eac71\") " pod="openshift-dns-operator/dns-operator-744455d44c-ggndt" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.365786 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a82f8242-05f0-48f7-8f86-cf472309b8e3-serving-cert\") pod \"service-ca-operator-777779d784-2cbkw\" (UID: \"a82f8242-05f0-48f7-8f86-cf472309b8e3\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-2cbkw" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.365833 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/b11366da-6972-44ce-8e8c-151de77fa689-tmpfs\") pod \"packageserver-d55dfcdfc-pgfxb\" (UID: \"b11366da-6972-44ce-8e8c-151de77fa689\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pgfxb" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.365860 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/adcffec1-66db-4e9b-9502-be3c9b008dde-metrics-tls\") pod \"dns-default-c6n7h\" (UID: \"adcffec1-66db-4e9b-9502-be3c9b008dde\") " pod="openshift-dns/dns-default-c6n7h" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.365877 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ht9nk\" (UniqueName: 
\"kubernetes.io/projected/3eac6f38-0d5b-4d93-bc1d-e1b8017adfe8-kube-api-access-ht9nk\") pod \"machine-config-server-st76p\" (UID: \"3eac6f38-0d5b-4d93-bc1d-e1b8017adfe8\") " pod="openshift-machine-config-operator/machine-config-server-st76p" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.365893 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-258mh\" (UniqueName: \"kubernetes.io/projected/9ed4dfab-8b23-46d5-a983-db2ec1371ce2-kube-api-access-258mh\") pod \"ingress-canary-7m8cj\" (UID: \"9ed4dfab-8b23-46d5-a983-db2ec1371ce2\") " pod="openshift-ingress-canary/ingress-canary-7m8cj" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.365910 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/3eac6f38-0d5b-4d93-bc1d-e1b8017adfe8-certs\") pod \"machine-config-server-st76p\" (UID: \"3eac6f38-0d5b-4d93-bc1d-e1b8017adfe8\") " pod="openshift-machine-config-operator/machine-config-server-st76p" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.366396 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/5a3a8d91-b500-48db-9ceb-cc105b2eeb3a-mountpoint-dir\") pod \"csi-hostpathplugin-hxxkd\" (UID: \"5a3a8d91-b500-48db-9ceb-cc105b2eeb3a\") " pod="hostpath-provisioner/csi-hostpathplugin-hxxkd" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.366424 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/5a3a8d91-b500-48db-9ceb-cc105b2eeb3a-csi-data-dir\") pod \"csi-hostpathplugin-hxxkd\" (UID: \"5a3a8d91-b500-48db-9ceb-cc105b2eeb3a\") " pod="hostpath-provisioner/csi-hostpathplugin-hxxkd" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.366440 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htpcf\" (UniqueName: 
\"kubernetes.io/projected/a82f8242-05f0-48f7-8f86-cf472309b8e3-kube-api-access-htpcf\") pod \"service-ca-operator-777779d784-2cbkw\" (UID: \"a82f8242-05f0-48f7-8f86-cf472309b8e3\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-2cbkw" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.366459 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/5667f763-a535-49eb-90a2-b78f1ebad0b7-signing-key\") pod \"service-ca-9c57cc56f-wp452\" (UID: \"5667f763-a535-49eb-90a2-b78f1ebad0b7\") " pod="openshift-service-ca/service-ca-9c57cc56f-wp452" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.366516 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b11366da-6972-44ce-8e8c-151de77fa689-webhook-cert\") pod \"packageserver-d55dfcdfc-pgfxb\" (UID: \"b11366da-6972-44ce-8e8c-151de77fa689\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pgfxb" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.366558 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-675sf\" (UniqueName: \"kubernetes.io/projected/c66857b5-461d-4c11-8fa3-52cd619bba60-kube-api-access-675sf\") pod \"multus-admission-controller-857f4d67dd-5qppq\" (UID: \"c66857b5-461d-4c11-8fa3-52cd619bba60\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-5qppq" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.366583 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7005ed62-dcc4-4fb5-ac2b-3aba9de5708a-srv-cert\") pod \"olm-operator-6b444d44fb-m4bg2\" (UID: \"7005ed62-dcc4-4fb5-ac2b-3aba9de5708a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m4bg2" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.366600 4795 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/5a3a8d91-b500-48db-9ceb-cc105b2eeb3a-csi-data-dir\") pod \"csi-hostpathplugin-hxxkd\" (UID: \"5a3a8d91-b500-48db-9ceb-cc105b2eeb3a\") " pod="hostpath-provisioner/csi-hostpathplugin-hxxkd" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.366620 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7005ed62-dcc4-4fb5-ac2b-3aba9de5708a-profile-collector-cert\") pod \"olm-operator-6b444d44fb-m4bg2\" (UID: \"7005ed62-dcc4-4fb5-ac2b-3aba9de5708a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m4bg2" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.366657 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/5a3a8d91-b500-48db-9ceb-cc105b2eeb3a-mountpoint-dir\") pod \"csi-hostpathplugin-hxxkd\" (UID: \"5a3a8d91-b500-48db-9ceb-cc105b2eeb3a\") " pod="hostpath-provisioner/csi-hostpathplugin-hxxkd" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.366657 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.366702 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b11366da-6972-44ce-8e8c-151de77fa689-apiservice-cert\") pod \"packageserver-d55dfcdfc-pgfxb\" (UID: \"b11366da-6972-44ce-8e8c-151de77fa689\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pgfxb" Feb 19 21:30:30 crc 
kubenswrapper[4795]: I0219 21:30:30.366731 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8bcw\" (UniqueName: \"kubernetes.io/projected/7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca-kube-api-access-t8bcw\") pod \"marketplace-operator-79b997595-dfw9n\" (UID: \"7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca\") " pod="openshift-marketplace/marketplace-operator-79b997595-dfw9n" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.366771 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgb98\" (UniqueName: \"kubernetes.io/projected/5a3a8d91-b500-48db-9ceb-cc105b2eeb3a-kube-api-access-zgb98\") pod \"csi-hostpathplugin-hxxkd\" (UID: \"5a3a8d91-b500-48db-9ceb-cc105b2eeb3a\") " pod="hostpath-provisioner/csi-hostpathplugin-hxxkd" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.366795 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a-secret-volume\") pod \"collect-profiles-29525610-9dlfj\" (UID: \"05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525610-9dlfj" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.366841 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7l8c6\" (UniqueName: \"kubernetes.io/projected/5667f763-a535-49eb-90a2-b78f1ebad0b7-kube-api-access-7l8c6\") pod \"service-ca-9c57cc56f-wp452\" (UID: \"5667f763-a535-49eb-90a2-b78f1ebad0b7\") " pod="openshift-service-ca/service-ca-9c57cc56f-wp452" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.366891 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-dfw9n\" (UID: 
\"7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca\") " pod="openshift-marketplace/marketplace-operator-79b997595-dfw9n" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.366920 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thnz2\" (UniqueName: \"kubernetes.io/projected/adcffec1-66db-4e9b-9502-be3c9b008dde-kube-api-access-thnz2\") pod \"dns-default-c6n7h\" (UID: \"adcffec1-66db-4e9b-9502-be3c9b008dde\") " pod="openshift-dns/dns-default-c6n7h" Feb 19 21:30:30 crc kubenswrapper[4795]: E0219 21:30:30.366942 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:30.866926788 +0000 UTC m=+142.059444742 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.366971 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a82f8242-05f0-48f7-8f86-cf472309b8e3-config\") pod \"service-ca-operator-777779d784-2cbkw\" (UID: \"a82f8242-05f0-48f7-8f86-cf472309b8e3\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-2cbkw" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.367094 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/5a3a8d91-b500-48db-9ceb-cc105b2eeb3a-plugins-dir\") pod 
\"csi-hostpathplugin-hxxkd\" (UID: \"5a3a8d91-b500-48db-9ceb-cc105b2eeb3a\") " pod="hostpath-provisioner/csi-hostpathplugin-hxxkd" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.367119 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/3eac6f38-0d5b-4d93-bc1d-e1b8017adfe8-node-bootstrap-token\") pod \"machine-config-server-st76p\" (UID: \"3eac6f38-0d5b-4d93-bc1d-e1b8017adfe8\") " pod="openshift-machine-config-operator/machine-config-server-st76p" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.367143 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5a3a8d91-b500-48db-9ceb-cc105b2eeb3a-socket-dir\") pod \"csi-hostpathplugin-hxxkd\" (UID: \"5a3a8d91-b500-48db-9ceb-cc105b2eeb3a\") " pod="hostpath-provisioner/csi-hostpathplugin-hxxkd" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.367204 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5a3a8d91-b500-48db-9ceb-cc105b2eeb3a-registration-dir\") pod \"csi-hostpathplugin-hxxkd\" (UID: \"5a3a8d91-b500-48db-9ceb-cc105b2eeb3a\") " pod="hostpath-provisioner/csi-hostpathplugin-hxxkd" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.367228 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/adcffec1-66db-4e9b-9502-be3c9b008dde-config-volume\") pod \"dns-default-c6n7h\" (UID: \"adcffec1-66db-4e9b-9502-be3c9b008dde\") " pod="openshift-dns/dns-default-c6n7h" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.367248 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/5667f763-a535-49eb-90a2-b78f1ebad0b7-signing-cabundle\") pod \"service-ca-9c57cc56f-wp452\" 
(UID: \"5667f763-a535-49eb-90a2-b78f1ebad0b7\") " pod="openshift-service-ca/service-ca-9c57cc56f-wp452" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.367273 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ed4dfab-8b23-46d5-a983-db2ec1371ce2-cert\") pod \"ingress-canary-7m8cj\" (UID: \"9ed4dfab-8b23-46d5-a983-db2ec1371ce2\") " pod="openshift-ingress-canary/ingress-canary-7m8cj" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.367303 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9bnm\" (UniqueName: \"kubernetes.io/projected/05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a-kube-api-access-w9bnm\") pod \"collect-profiles-29525610-9dlfj\" (UID: \"05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525610-9dlfj" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.367330 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-dfw9n\" (UID: \"7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca\") " pod="openshift-marketplace/marketplace-operator-79b997595-dfw9n" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.367361 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c66857b5-461d-4c11-8fa3-52cd619bba60-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-5qppq\" (UID: \"c66857b5-461d-4c11-8fa3-52cd619bba60\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-5qppq" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.367380 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a-config-volume\") pod \"collect-profiles-29525610-9dlfj\" (UID: \"05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525610-9dlfj" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.367400 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvhgm\" (UniqueName: \"kubernetes.io/projected/7005ed62-dcc4-4fb5-ac2b-3aba9de5708a-kube-api-access-zvhgm\") pod \"olm-operator-6b444d44fb-m4bg2\" (UID: \"7005ed62-dcc4-4fb5-ac2b-3aba9de5708a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m4bg2" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.367427 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xckwg\" (UniqueName: \"kubernetes.io/projected/b11366da-6972-44ce-8e8c-151de77fa689-kube-api-access-xckwg\") pod \"packageserver-d55dfcdfc-pgfxb\" (UID: \"b11366da-6972-44ce-8e8c-151de77fa689\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pgfxb" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.368275 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5a3a8d91-b500-48db-9ceb-cc105b2eeb3a-socket-dir\") pod \"csi-hostpathplugin-hxxkd\" (UID: \"5a3a8d91-b500-48db-9ceb-cc105b2eeb3a\") " pod="hostpath-provisioner/csi-hostpathplugin-hxxkd" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.368355 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5a3a8d91-b500-48db-9ceb-cc105b2eeb3a-registration-dir\") pod \"csi-hostpathplugin-hxxkd\" (UID: \"5a3a8d91-b500-48db-9ceb-cc105b2eeb3a\") " pod="hostpath-provisioner/csi-hostpathplugin-hxxkd" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.368796 4795 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/b11366da-6972-44ce-8e8c-151de77fa689-tmpfs\") pod \"packageserver-d55dfcdfc-pgfxb\" (UID: \"b11366da-6972-44ce-8e8c-151de77fa689\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pgfxb" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.369093 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a-config-volume\") pod \"collect-profiles-29525610-9dlfj\" (UID: \"05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525610-9dlfj" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.369329 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/adcffec1-66db-4e9b-9502-be3c9b008dde-config-volume\") pod \"dns-default-c6n7h\" (UID: \"adcffec1-66db-4e9b-9502-be3c9b008dde\") " pod="openshift-dns/dns-default-c6n7h" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.370147 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a82f8242-05f0-48f7-8f86-cf472309b8e3-serving-cert\") pod \"service-ca-operator-777779d784-2cbkw\" (UID: \"a82f8242-05f0-48f7-8f86-cf472309b8e3\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-2cbkw" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.370225 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/5667f763-a535-49eb-90a2-b78f1ebad0b7-signing-cabundle\") pod \"service-ca-9c57cc56f-wp452\" (UID: \"5667f763-a535-49eb-90a2-b78f1ebad0b7\") " pod="openshift-service-ca/service-ca-9c57cc56f-wp452" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.370586 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" 
(UniqueName: \"kubernetes.io/secret/3eac6f38-0d5b-4d93-bc1d-e1b8017adfe8-certs\") pod \"machine-config-server-st76p\" (UID: \"3eac6f38-0d5b-4d93-bc1d-e1b8017adfe8\") " pod="openshift-machine-config-operator/machine-config-server-st76p" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.370668 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/5a3a8d91-b500-48db-9ceb-cc105b2eeb3a-plugins-dir\") pod \"csi-hostpathplugin-hxxkd\" (UID: \"5a3a8d91-b500-48db-9ceb-cc105b2eeb3a\") " pod="hostpath-provisioner/csi-hostpathplugin-hxxkd" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.371137 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a82f8242-05f0-48f7-8f86-cf472309b8e3-config\") pod \"service-ca-operator-777779d784-2cbkw\" (UID: \"a82f8242-05f0-48f7-8f86-cf472309b8e3\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-2cbkw" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.371461 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-dfw9n\" (UID: \"7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca\") " pod="openshift-marketplace/marketplace-operator-79b997595-dfw9n" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.371732 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgtbb\" (UniqueName: \"kubernetes.io/projected/b9f6ef9a-86de-4d04-ba12-30197f5c83ed-kube-api-access-hgtbb\") pod \"machine-config-controller-84d6567774-tg6hf\" (UID: \"b9f6ef9a-86de-4d04-ba12-30197f5c83ed\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tg6hf" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.371771 4795 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-dfw9n\" (UID: \"7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca\") " pod="openshift-marketplace/marketplace-operator-79b997595-dfw9n" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.374661 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/adcffec1-66db-4e9b-9502-be3c9b008dde-metrics-tls\") pod \"dns-default-c6n7h\" (UID: \"adcffec1-66db-4e9b-9502-be3c9b008dde\") " pod="openshift-dns/dns-default-c6n7h" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.377113 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b11366da-6972-44ce-8e8c-151de77fa689-apiservice-cert\") pod \"packageserver-d55dfcdfc-pgfxb\" (UID: \"b11366da-6972-44ce-8e8c-151de77fa689\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pgfxb" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.377147 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a-secret-volume\") pod \"collect-profiles-29525610-9dlfj\" (UID: \"05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525610-9dlfj" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.378081 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7005ed62-dcc4-4fb5-ac2b-3aba9de5708a-srv-cert\") pod \"olm-operator-6b444d44fb-m4bg2\" (UID: \"7005ed62-dcc4-4fb5-ac2b-3aba9de5708a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m4bg2" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.380276 4795 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c66857b5-461d-4c11-8fa3-52cd619bba60-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-5qppq\" (UID: \"c66857b5-461d-4c11-8fa3-52cd619bba60\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-5qppq" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.380768 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7005ed62-dcc4-4fb5-ac2b-3aba9de5708a-profile-collector-cert\") pod \"olm-operator-6b444d44fb-m4bg2\" (UID: \"7005ed62-dcc4-4fb5-ac2b-3aba9de5708a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m4bg2" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.380773 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/3eac6f38-0d5b-4d93-bc1d-e1b8017adfe8-node-bootstrap-token\") pod \"machine-config-server-st76p\" (UID: \"3eac6f38-0d5b-4d93-bc1d-e1b8017adfe8\") " pod="openshift-machine-config-operator/machine-config-server-st76p" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.380989 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/5667f763-a535-49eb-90a2-b78f1ebad0b7-signing-key\") pod \"service-ca-9c57cc56f-wp452\" (UID: \"5667f763-a535-49eb-90a2-b78f1ebad0b7\") " pod="openshift-service-ca/service-ca-9c57cc56f-wp452" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.382946 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b11366da-6972-44ce-8e8c-151de77fa689-webhook-cert\") pod \"packageserver-d55dfcdfc-pgfxb\" (UID: \"b11366da-6972-44ce-8e8c-151de77fa689\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pgfxb" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.382961 
4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ed4dfab-8b23-46d5-a983-db2ec1371ce2-cert\") pod \"ingress-canary-7m8cj\" (UID: \"9ed4dfab-8b23-46d5-a983-db2ec1371ce2\") " pod="openshift-ingress-canary/ingress-canary-7m8cj" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.390279 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/89594551-78e6-49af-9376-477cf01d2dc5-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-6np95\" (UID: \"89594551-78e6-49af-9376-477cf01d2dc5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6np95" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.407529 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4kqs\" (UniqueName: \"kubernetes.io/projected/80407681-6091-46cc-836f-757ec4d16604-kube-api-access-b4kqs\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.422240 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sf27h" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.426488 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/275a566e-d165-434a-95c1-9154ede6e14e-bound-sa-token\") pod \"ingress-operator-5b745b69d9-4lbxt\" (UID: \"275a566e-d165-434a-95c1-9154ede6e14e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4lbxt" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.443340 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6np95" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.448138 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6k8fb\" (UniqueName: \"kubernetes.io/projected/96b2274a-7361-4463-8562-1319e967066b-kube-api-access-6k8fb\") pod \"package-server-manager-789f6589d5-94s6c\" (UID: \"96b2274a-7361-4463-8562-1319e967066b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-94s6c" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.468701 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:30 crc kubenswrapper[4795]: E0219 21:30:30.468971 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:30.968939761 +0000 UTC m=+142.161457625 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.469085 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:30 crc kubenswrapper[4795]: E0219 21:30:30.469550 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:30.969532866 +0000 UTC m=+142.162050740 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.470845 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrjbp\" (UniqueName: \"kubernetes.io/projected/35ca1bac-5479-4e7f-80a5-c1811edc9e8e-kube-api-access-nrjbp\") pod \"openshift-config-operator-7777fb866f-8l99c\" (UID: \"35ca1bac-5479-4e7f-80a5-c1811edc9e8e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8l99c" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.477737 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tg6hf" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.486493 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-ggndt" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.487368 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjl9d\" (UniqueName: \"kubernetes.io/projected/275a566e-d165-434a-95c1-9154ede6e14e-kube-api-access-pjl9d\") pod \"ingress-operator-5b745b69d9-4lbxt\" (UID: \"275a566e-d165-434a-95c1-9154ede6e14e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4lbxt" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.498349 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6b8wp" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.511053 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lt2hd\" (UniqueName: \"kubernetes.io/projected/fee0aadb-9ec0-4e6c-bb48-27b74560a4ac-kube-api-access-lt2hd\") pod \"machine-config-operator-74547568cd-vhbxd\" (UID: \"fee0aadb-9ec0-4e6c-bb48-27b74560a4ac\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vhbxd" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.512744 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nxhdc" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.524387 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-94s6c" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.526963 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5t4db" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.531774 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/80407681-6091-46cc-836f-757ec4d16604-bound-sa-token\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.548122 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9be22b71-1386-4be4-a38a-6f3b97669b9c-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-x54th\" (UID: \"9be22b71-1386-4be4-a38a-6f3b97669b9c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-x54th" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.570040 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swsnw\" (UniqueName: \"kubernetes.io/projected/b7390208-03b5-4ffa-9259-f5f1d9354c52-kube-api-access-swsnw\") pod \"etcd-operator-b45778765-sd5jz\" (UID: \"b7390208-03b5-4ffa-9259-f5f1d9354c52\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sd5jz" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.571291 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:30 crc kubenswrapper[4795]: E0219 21:30:30.571453 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:31.071441086 +0000 UTC m=+142.263958950 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.571642 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:30 crc kubenswrapper[4795]: E0219 21:30:30.572192 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:31.072184516 +0000 UTC m=+142.264702380 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.621170 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hh8cm\" (UniqueName: \"kubernetes.io/projected/0da0af7f-f8f8-492d-bd44-1e81ab242a24-kube-api-access-hh8cm\") pod \"control-plane-machine-set-operator-78cbb6b69f-42wfj\" (UID: \"0da0af7f-f8f8-492d-bd44-1e81ab242a24\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-42wfj" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.641506 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1f61e27e-ff25-4d76-814b-ed72e576547c-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xjnbz\" (UID: \"1f61e27e-ff25-4d76-814b-ed72e576547c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xjnbz" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.650857 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vppps\" (UniqueName: \"kubernetes.io/projected/a3a4a159-1ab3-412f-ac71-11a7a41012ea-kube-api-access-vppps\") pod \"router-default-5444994796-r8dcx\" (UID: \"a3a4a159-1ab3-412f-ac71-11a7a41012ea\") " pod="openshift-ingress/router-default-5444994796-r8dcx" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.673002 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:30 crc kubenswrapper[4795]: E0219 21:30:30.673502 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:31.17348424 +0000 UTC m=+142.366002114 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.693783 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-258mh\" (UniqueName: \"kubernetes.io/projected/9ed4dfab-8b23-46d5-a983-db2ec1371ce2-kube-api-access-258mh\") pod \"ingress-canary-7m8cj\" (UID: \"9ed4dfab-8b23-46d5-a983-db2ec1371ce2\") " pod="openshift-ingress-canary/ingress-canary-7m8cj" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.696897 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ht9nk\" (UniqueName: \"kubernetes.io/projected/3eac6f38-0d5b-4d93-bc1d-e1b8017adfe8-kube-api-access-ht9nk\") pod \"machine-config-server-st76p\" (UID: \"3eac6f38-0d5b-4d93-bc1d-e1b8017adfe8\") " pod="openshift-machine-config-operator/machine-config-server-st76p" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.713407 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4lbxt" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.729429 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8l99c" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.729920 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thnz2\" (UniqueName: \"kubernetes.io/projected/adcffec1-66db-4e9b-9502-be3c9b008dde-kube-api-access-thnz2\") pod \"dns-default-c6n7h\" (UID: \"adcffec1-66db-4e9b-9502-be3c9b008dde\") " pod="openshift-dns/dns-default-c6n7h" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.742253 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8bcw\" (UniqueName: \"kubernetes.io/projected/7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca-kube-api-access-t8bcw\") pod \"marketplace-operator-79b997595-dfw9n\" (UID: \"7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca\") " pod="openshift-marketplace/marketplace-operator-79b997595-dfw9n" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.750609 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xjnbz" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.756476 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-r8dcx" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.756688 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgb98\" (UniqueName: \"kubernetes.io/projected/5a3a8d91-b500-48db-9ceb-cc105b2eeb3a-kube-api-access-zgb98\") pod \"csi-hostpathplugin-hxxkd\" (UID: \"5a3a8d91-b500-48db-9ceb-cc105b2eeb3a\") " pod="hostpath-provisioner/csi-hostpathplugin-hxxkd" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.762766 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vhbxd" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.772562 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-sd5jz" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.774987 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.777274 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xckwg\" (UniqueName: \"kubernetes.io/projected/b11366da-6972-44ce-8e8c-151de77fa689-kube-api-access-xckwg\") pod \"packageserver-d55dfcdfc-pgfxb\" (UID: \"b11366da-6972-44ce-8e8c-151de77fa689\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pgfxb" Feb 19 21:30:30 crc kubenswrapper[4795]: E0219 21:30:30.780336 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:31.280315108 +0000 UTC m=+142.472832972 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.790755 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-x54th" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.791648 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9bnm\" (UniqueName: \"kubernetes.io/projected/05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a-kube-api-access-w9bnm\") pod \"collect-profiles-29525610-9dlfj\" (UID: \"05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525610-9dlfj" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.804963 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-42wfj" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.810055 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htpcf\" (UniqueName: \"kubernetes.io/projected/a82f8242-05f0-48f7-8f86-cf472309b8e3-kube-api-access-htpcf\") pod \"service-ca-operator-777779d784-2cbkw\" (UID: \"a82f8242-05f0-48f7-8f86-cf472309b8e3\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-2cbkw" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.819159 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-tg6hf"] Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.832248 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7l8c6\" (UniqueName: \"kubernetes.io/projected/5667f763-a535-49eb-90a2-b78f1ebad0b7-kube-api-access-7l8c6\") pod \"service-ca-9c57cc56f-wp452\" (UID: \"5667f763-a535-49eb-90a2-b78f1ebad0b7\") " pod="openshift-service-ca/service-ca-9c57cc56f-wp452" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.847060 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pgfxb" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.854301 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-2cbkw" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.860534 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525610-9dlfj" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.863133 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-675sf\" (UniqueName: \"kubernetes.io/projected/c66857b5-461d-4c11-8fa3-52cd619bba60-kube-api-access-675sf\") pod \"multus-admission-controller-857f4d67dd-5qppq\" (UID: \"c66857b5-461d-4c11-8fa3-52cd619bba60\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-5qppq" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.870715 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-wp452" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.875903 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:30 crc kubenswrapper[4795]: E0219 21:30:30.876234 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:31.376219731 +0000 UTC m=+142.568737595 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.876334 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-dfw9n" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.895836 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvhgm\" (UniqueName: \"kubernetes.io/projected/7005ed62-dcc4-4fb5-ac2b-3aba9de5708a-kube-api-access-zvhgm\") pod \"olm-operator-6b444d44fb-m4bg2\" (UID: \"7005ed62-dcc4-4fb5-ac2b-3aba9de5708a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m4bg2" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.897444 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-hxxkd" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.904087 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-c6n7h" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.910994 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-7m8cj" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.921008 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-st76p" Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.973221 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sf27h"] Feb 19 21:30:30 crc kubenswrapper[4795]: I0219 21:30:30.978419 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:30 crc kubenswrapper[4795]: E0219 21:30:30.978833 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:31.478820159 +0000 UTC m=+142.671338023 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:31 crc kubenswrapper[4795]: W0219 21:30:31.058663 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod821fa263_1235_4a62_8818_1f41d7e77a62.slice/crio-56286eeaa421704e013da8463ab6cc556f56e9d2aed38e4b3489266c84c1b9ea WatchSource:0}: Error finding container 56286eeaa421704e013da8463ab6cc556f56e9d2aed38e4b3489266c84c1b9ea: Status 404 returned error can't find the container with id 56286eeaa421704e013da8463ab6cc556f56e9d2aed38e4b3489266c84c1b9ea Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.079375 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:31 crc kubenswrapper[4795]: E0219 21:30:31.080071 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:31.580046861 +0000 UTC m=+142.772564725 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.080362 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:31 crc kubenswrapper[4795]: E0219 21:30:31.081012 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:31.580997426 +0000 UTC m=+142.773515290 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.134824 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-5qppq" Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.140338 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m4bg2" Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.152023 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-6b8wp"] Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.160052 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6np95"] Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.161319 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-89r2g" event={"ID":"e726eeb3-dfb2-4c3a-bea7-a5c945f25d52","Type":"ContainerStarted","Data":"b9ca794a7257f326161a0498af5987554c0c1c2f210b07e97380bc7713ff16a4"} Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.161394 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-89r2g" event={"ID":"e726eeb3-dfb2-4c3a-bea7-a5c945f25d52","Type":"ContainerStarted","Data":"fd805e4fcb58a725d5225bfaf1940704f4aa9a8e3cd2c3338fff969733e167b2"} Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.166176 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-ggndt"] Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.173854 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vtqjw" event={"ID":"102f7fb5-3031-4853-b112-2aa910aa63a7","Type":"ContainerStarted","Data":"09bfebe5361b480de9eacdb44947d498846179bb75edaddc250a409a01766d3b"} Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.173896 4795 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vtqjw" event={"ID":"102f7fb5-3031-4853-b112-2aa910aa63a7","Type":"ContainerStarted","Data":"f4d807af544e927e81a81905631510e6f7454a6d612cc5078b9fdd6b9b356c32"} Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.174622 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vtqjw" Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.176402 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-r8dcx" event={"ID":"a3a4a159-1ab3-412f-ac71-11a7a41012ea","Type":"ContainerStarted","Data":"71a2c0fae3102f165644cba6fd7af15af48a7844131e7534c4403e9ec4772574"} Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.176537 4795 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-vtqjw container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.176579 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vtqjw" podUID="102f7fb5-3031-4853-b112-2aa910aa63a7" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.177788 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nxhdc"] Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.178283 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gtgb" event={"ID":"3d1887a7-d8a5-45f6-97fc-c32a870089ef","Type":"ContainerDied","Data":"f0cde8ac048a3cf993681070ef0d56921f0043611917efe89b9b8e0a5e28375d"} Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.178133 4795 generic.go:334] "Generic (PLEG): container finished" podID="3d1887a7-d8a5-45f6-97fc-c32a870089ef" containerID="f0cde8ac048a3cf993681070ef0d56921f0043611917efe89b9b8e0a5e28375d" exitCode=0 Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.178461 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gtgb" event={"ID":"3d1887a7-d8a5-45f6-97fc-c32a870089ef","Type":"ContainerStarted","Data":"c722c0dd0327b20acc81d77a82fc7b1282c9c0cf149cc086e65963352858e300"} Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.181659 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:31 crc kubenswrapper[4795]: E0219 21:30:31.182769 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:31.682751241 +0000 UTC m=+142.875269105 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.183470 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-rt6qz" event={"ID":"96e1b4b4-8d48-4955-b756-71d21a5aea0b","Type":"ContainerStarted","Data":"3dd0685b4530dd1fe342bc28f461e5de1e5f379ed3a082e2a45f0460350baa66"} Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.183504 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-rt6qz" event={"ID":"96e1b4b4-8d48-4955-b756-71d21a5aea0b","Type":"ContainerStarted","Data":"83a817e2970ad2a1632d8b71f15be18e56f12022d5a772355cc6f4cad1d10c52"} Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.183841 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-rt6qz" Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.186377 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-22qsd" event={"ID":"e736a4ca-7c18-423c-a5de-aeafd8d6a42e","Type":"ContainerStarted","Data":"6e0e3ff5de17f106edb0c69e22256075514d4cef13c40131bda5b1e6dc246118"} Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.186422 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-22qsd" event={"ID":"e736a4ca-7c18-423c-a5de-aeafd8d6a42e","Type":"ContainerStarted","Data":"9df260857e4677af91be5709179f5553fea66fe8e1c2ee47b052bc4921fcb804"} Feb 19 21:30:31 crc 
kubenswrapper[4795]: I0219 21:30:31.187333 4795 patch_prober.go:28] interesting pod/downloads-7954f5f757-rt6qz container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.187374 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-rt6qz" podUID="96e1b4b4-8d48-4955-b756-71d21a5aea0b" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.188893 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-bks74" event={"ID":"f28c04c3-66f1-4c29-b7d1-cac5aa342370","Type":"ContainerStarted","Data":"ac07608a22e647356a2a68507010c9abd136e30894df52889b17fff4c24d2ba3"} Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.188932 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-bks74" event={"ID":"f28c04c3-66f1-4c29-b7d1-cac5aa342370","Type":"ContainerStarted","Data":"40bb6474669f29cc42e7cb33423eacd5d40546cce81b359830e2220fb1a5d2aa"} Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.189740 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-bks74" Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.191120 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sf27h" event={"ID":"821fa263-1235-4a62-8818-1f41d7e77a62","Type":"ContainerStarted","Data":"56286eeaa421704e013da8463ab6cc556f56e9d2aed38e4b3489266c84c1b9ea"} Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.191344 4795 
patch_prober.go:28] interesting pod/console-operator-58897d9998-bks74 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.191389 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-bks74" podUID="f28c04c3-66f1-4c29-b7d1-cac5aa342370" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.193171 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rxhk8" event={"ID":"958d08af-86cf-467f-947b-4485163fd695","Type":"ContainerStarted","Data":"2804453379dfda5aae2485de4530be2b31c99e85c377d29abc8a6728e9210670"} Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.193217 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rxhk8" event={"ID":"958d08af-86cf-467f-947b-4485163fd695","Type":"ContainerStarted","Data":"325a439f8d758572082bb2ce43737b9bc7dabbb98d24390b4525636cbb073668"} Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.203502 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-7ph8l" event={"ID":"86ffb50f-47f6-47b2-9141-1de9999a13e0","Type":"ContainerStarted","Data":"71e5ceb34779a2773b15e3bbb07890cf9d8cbc1dd13d422577cbf3017f33a99e"} Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.203551 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-7ph8l" 
event={"ID":"86ffb50f-47f6-47b2-9141-1de9999a13e0","Type":"ContainerStarted","Data":"f50c3553621e34238711ac41e2e592ef162e4af963002aedb152ce56da5992e5"} Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.204582 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-7ph8l" Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.209444 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-rvkhj" event={"ID":"ec60d287-0f21-467c-8030-84b8726af567","Type":"ContainerStarted","Data":"4035d8859f231da873e32f045fcb444bd306ddab0862e7e765fb5b6fd021c463"} Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.209473 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-rvkhj" event={"ID":"ec60d287-0f21-467c-8030-84b8726af567","Type":"ContainerStarted","Data":"01605015262ec0d283a1299b16fa7df4e9785d87441123f54856fb5a6f2abf61"} Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.210849 4795 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-7ph8l container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused" start-of-body= Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.210914 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-7ph8l" podUID="86ffb50f-47f6-47b2-9141-1de9999a13e0" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused" Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.221910 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tg6hf" 
event={"ID":"b9f6ef9a-86de-4d04-ba12-30197f5c83ed","Type":"ContainerStarted","Data":"601826ee4b11b39daa395ad3da07b7f46f6892d67fffb97a81226278879a919c"} Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.224211 4795 generic.go:334] "Generic (PLEG): container finished" podID="ed6485ab-c517-41cd-a755-d5dc9557456b" containerID="aba703f930910bb23f6018706ca395beec254504d78904c6db0462f6f7a5b4eb" exitCode=0 Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.224305 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-t48rm" event={"ID":"ed6485ab-c517-41cd-a755-d5dc9557456b","Type":"ContainerDied","Data":"aba703f930910bb23f6018706ca395beec254504d78904c6db0462f6f7a5b4eb"} Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.224322 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-t48rm" event={"ID":"ed6485ab-c517-41cd-a755-d5dc9557456b","Type":"ContainerStarted","Data":"796f3e65b9756bbc57b6b03fbd8211042c936c8c64a0f444a7aca8b40a2f60fd"} Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.244559 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" Feb 19 21:30:31 crc kubenswrapper[4795]: W0219 21:30:31.270574 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod116348e7_5632_4244_8ae4_f81b06c6df4d.slice/crio-bd9be097580ab759fac27216a30d387fcf80c02c8acc7458d5fa0d5543a8dd06 WatchSource:0}: Error finding container bd9be097580ab759fac27216a30d387fcf80c02c8acc7458d5fa0d5543a8dd06: Status 404 returned error can't find the container with id bd9be097580ab759fac27216a30d387fcf80c02c8acc7458d5fa0d5543a8dd06 Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.285356 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:31 crc kubenswrapper[4795]: E0219 21:30:31.292234 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:31.792217659 +0000 UTC m=+142.984735523 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.317897 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5t4db"] Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.325489 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-4lbxt"] Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.338636 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-94s6c"] Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.387356 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" 
(UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:31 crc kubenswrapper[4795]: E0219 21:30:31.387635 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:31.887617708 +0000 UTC m=+143.080135572 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.388087 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:31 crc kubenswrapper[4795]: E0219 21:30:31.388659 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:31.888415809 +0000 UTC m=+143.080933673 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.451652 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xjnbz"] Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.454840 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-8l99c"] Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.490449 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:31 crc kubenswrapper[4795]: E0219 21:30:31.491044 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:31.991022637 +0000 UTC m=+143.183540501 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.535871 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-qks97" podStartSLOduration=118.535852467 podStartE2EDuration="1m58.535852467s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:31.531674468 +0000 UTC m=+142.724192332" watchObservedRunningTime="2026-02-19 21:30:31.535852467 +0000 UTC m=+142.728370331" Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.572179 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" podStartSLOduration=118.572160735 podStartE2EDuration="1m58.572160735s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:31.571999711 +0000 UTC m=+142.764517575" watchObservedRunningTime="2026-02-19 21:30:31.572160735 +0000 UTC m=+142.764678599" Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.591804 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: 
\"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:31 crc kubenswrapper[4795]: E0219 21:30:31.592188 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:32.092156997 +0000 UTC m=+143.284674861 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.625765 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pgfxb"] Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.625809 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-sd5jz"] Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.625819 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-42wfj"] Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.692267 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:31 crc kubenswrapper[4795]: E0219 21:30:31.692573 4795 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:32.192553937 +0000 UTC m=+143.385071811 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.793076 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:31 crc kubenswrapper[4795]: E0219 21:30:31.793672 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:32.293660096 +0000 UTC m=+143.486177960 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.894740 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:31 crc kubenswrapper[4795]: E0219 21:30:31.895185 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:32.395160185 +0000 UTC m=+143.587678049 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:31 crc kubenswrapper[4795]: I0219 21:30:31.998340 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:31 crc kubenswrapper[4795]: E0219 21:30:31.998698 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:32.498683107 +0000 UTC m=+143.691200971 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.070217 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-vhbxd"] Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.091673 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-x54th"] Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.092481 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jdtt8" podStartSLOduration=119.092469015 podStartE2EDuration="1m59.092469015s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:32.07580698 +0000 UTC m=+143.268324844" watchObservedRunningTime="2026-02-19 21:30:32.092469015 +0000 UTC m=+143.284986879" Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.100136 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:32 crc kubenswrapper[4795]: E0219 21:30:32.100678 4795 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:32.600662639 +0000 UTC m=+143.793180503 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.203605 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:32 crc kubenswrapper[4795]: E0219 21:30:32.203909 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:32.703896373 +0000 UTC m=+143.896414237 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.206312 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-wp452"] Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.218547 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-2cbkw"] Feb 19 21:30:32 crc kubenswrapper[4795]: W0219 21:30:32.251100 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5667f763_a535_49eb_90a2_b78f1ebad0b7.slice/crio-d657e6145213a444981d32c25878e6c8aebbf41ec7496db4ae641bcf17c4600e WatchSource:0}: Error finding container d657e6145213a444981d32c25878e6c8aebbf41ec7496db4ae641bcf17c4600e: Status 404 returned error can't find the container with id d657e6145213a444981d32c25878e6c8aebbf41ec7496db4ae641bcf17c4600e Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.252387 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dfw9n"] Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.253263 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nxhdc" event={"ID":"73ed383c-c1e3-4f47-86d3-6faa77121e28","Type":"ContainerStarted","Data":"681975eecd2824ee2b18364c0676ca10317e27cadf641e4850de46e5b1bd2a25"} Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.254654 4795 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xjnbz" event={"ID":"1f61e27e-ff25-4d76-814b-ed72e576547c","Type":"ContainerStarted","Data":"3d3e2d6d25f5674899e7d1b668557321521d58f757d7282a588fd063dac3e037"} Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.256379 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6b8wp" event={"ID":"116348e7-5632-4244-8ae4-f81b06c6df4d","Type":"ContainerStarted","Data":"bd9be097580ab759fac27216a30d387fcf80c02c8acc7458d5fa0d5543a8dd06"} Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.266223 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m4bg2"] Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.268624 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-x54th" event={"ID":"9be22b71-1386-4be4-a38a-6f3b97669b9c","Type":"ContainerStarted","Data":"e247c0bda061b1d394a25f5efe898c368927a8ca75f985d4742ad2e630ac465f"} Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.270848 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pgfxb" event={"ID":"b11366da-6972-44ce-8e8c-151de77fa689","Type":"ContainerStarted","Data":"53abc20ac9ab5c5e9c00220bd7928ad8a28c3b011f55813f000e904fdef09783"} Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.271855 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5t4db" event={"ID":"49a1bc45-62f5-45d2-a475-b2b562cd9b98","Type":"ContainerStarted","Data":"e9139851c2bef4af0ae2f2c689b117f01c8077e3e83da00d8dbbcfc4f6566b25"} Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.280035 4795 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6np95" event={"ID":"89594551-78e6-49af-9376-477cf01d2dc5","Type":"ContainerStarted","Data":"a5f1bf1469e5995d24e08040279d8cd83af4d6df342a409a32ab2dc9474cb5e7"} Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.281004 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-sd5jz" event={"ID":"b7390208-03b5-4ffa-9259-f5f1d9354c52","Type":"ContainerStarted","Data":"12f4a241cedf4df064621df47c24f7aad9281d889676cd2bd4da009eecfcd016"} Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.282475 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4lbxt" event={"ID":"275a566e-d165-434a-95c1-9154ede6e14e","Type":"ContainerStarted","Data":"483c845d4c052aa50a28451932712fe5b0a864fbc1b6a919d8f6f91b5590fef9"} Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.291927 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-st76p" event={"ID":"3eac6f38-0d5b-4d93-bc1d-e1b8017adfe8","Type":"ContainerStarted","Data":"f6c3de970402f5f2a706b9a0f207f80e122bf7818363fc8c9eaed089aeab076c"} Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.296952 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-7m8cj"] Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.300855 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-94s6c" event={"ID":"96b2274a-7361-4463-8562-1319e967066b","Type":"ContainerStarted","Data":"a32167185a23f95c01f66f7d9775147e491c7807b907029f8e4f552e46ba8539"} Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.304342 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:32 crc kubenswrapper[4795]: E0219 21:30:32.304795 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:32.804773106 +0000 UTC m=+143.997290970 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.306246 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-5qppq"] Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.308454 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525610-9dlfj"] Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.315800 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-c6n7h"] Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.318245 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vhbxd" event={"ID":"fee0aadb-9ec0-4e6c-bb48-27b74560a4ac","Type":"ContainerStarted","Data":"3547188c0442dcc93380f2aecc8c910e14e5fe0d19b0fe22602f18e9a63827a2"} Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.337692 4795 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-42wfj" event={"ID":"0da0af7f-f8f8-492d-bd44-1e81ab242a24","Type":"ContainerStarted","Data":"6921e6d27e7dee7ceb4444fc64f8fad7f7e3c6af8d2c46d1d831032cbced4404"} Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.338299 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-hxxkd"] Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.346658 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8l99c" event={"ID":"35ca1bac-5479-4e7f-80a5-c1811edc9e8e","Type":"ContainerStarted","Data":"a4eb8515b2b6362f98c6cdac68d937019825faa1528d1b43504d50f91d117161"} Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.366105 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-r8dcx" event={"ID":"a3a4a159-1ab3-412f-ac71-11a7a41012ea","Type":"ContainerStarted","Data":"ffa2030a5fa12af377a387f5f235d33162796cf624478203f6186832d3f12ade"} Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.394791 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-ggndt" event={"ID":"4491a495-df05-43ef-bff7-2438317eac71","Type":"ContainerStarted","Data":"61b87cb2f15b2aa6261b9cfd630153a7503830a3de1e63bb8282a31307bf5406"} Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.397368 4795 patch_prober.go:28] interesting pod/downloads-7954f5f757-rt6qz container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.397416 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-rt6qz" podUID="96e1b4b4-8d48-4955-b756-71d21a5aea0b" 
containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.397653 4795 patch_prober.go:28] interesting pod/console-operator-58897d9998-bks74 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.397717 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-bks74" podUID="f28c04c3-66f1-4c29-b7d1-cac5aa342370" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.403993 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vtqjw" Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.410857 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:32 crc kubenswrapper[4795]: E0219 21:30:32.415495 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:32.915480496 +0000 UTC m=+144.107998360 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.424398 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-7mzng" podStartSLOduration=119.424377748 podStartE2EDuration="1m59.424377748s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:32.388141492 +0000 UTC m=+143.580659366" watchObservedRunningTime="2026-02-19 21:30:32.424377748 +0000 UTC m=+143.616895612" Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.491204 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-7ph8l" Feb 19 21:30:32 crc kubenswrapper[4795]: W0219 21:30:32.501120 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ed4dfab_8b23_46d5_a983_db2ec1371ce2.slice/crio-6f0f435341f7c3d5c312608d16c6301b721319ab948b919042c34efe2e1663a6 WatchSource:0}: Error finding container 6f0f435341f7c3d5c312608d16c6301b721319ab948b919042c34efe2e1663a6: Status 404 returned error can't find the container with id 6f0f435341f7c3d5c312608d16c6301b721319ab948b919042c34efe2e1663a6 Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.544530 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-r8dcx" 
podStartSLOduration=119.544509983 podStartE2EDuration="1m59.544509983s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:32.481815277 +0000 UTC m=+143.674333141" watchObservedRunningTime="2026-02-19 21:30:32.544509983 +0000 UTC m=+143.737027847" Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.549602 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:32 crc kubenswrapper[4795]: E0219 21:30:32.560873 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:33.060840999 +0000 UTC m=+144.253358863 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.561043 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:32 crc kubenswrapper[4795]: E0219 21:30:32.561923 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:33.061898427 +0000 UTC m=+144.254416291 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.593303 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-7ph8l" podStartSLOduration=119.593288336 podStartE2EDuration="1m59.593288336s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:32.59266533 +0000 UTC m=+143.785183214" watchObservedRunningTime="2026-02-19 21:30:32.593288336 +0000 UTC m=+143.785806200" Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.602012 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-89r2g" podStartSLOduration=119.601993703 podStartE2EDuration="1m59.601993703s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:32.548788455 +0000 UTC m=+143.741306319" watchObservedRunningTime="2026-02-19 21:30:32.601993703 +0000 UTC m=+143.794511568" Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.628112 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-rvkhj" podStartSLOduration=119.628094205 podStartE2EDuration="1m59.628094205s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:32.627687634 +0000 UTC m=+143.820205498" watchObservedRunningTime="2026-02-19 21:30:32.628094205 +0000 UTC m=+143.820612069" Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.641728 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-rt6qz" podStartSLOduration=119.64171288 podStartE2EDuration="1m59.64171288s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:32.639598315 +0000 UTC m=+143.832116179" watchObservedRunningTime="2026-02-19 21:30:32.64171288 +0000 UTC m=+143.834230744" Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.664797 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:32 crc kubenswrapper[4795]: E0219 21:30:32.665100 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:33.16508664 +0000 UTC m=+144.357604504 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.699567 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-bks74" podStartSLOduration=119.69955275 podStartE2EDuration="1m59.69955275s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:32.698334818 +0000 UTC m=+143.890852682" watchObservedRunningTime="2026-02-19 21:30:32.69955275 +0000 UTC m=+143.892070614" Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.728812 4795 csr.go:261] certificate signing request csr-42z98 is approved, waiting to be issued Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.735589 4795 csr.go:257] certificate signing request csr-42z98 is issued Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.743148 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-22qsd" podStartSLOduration=119.743125587 podStartE2EDuration="1m59.743125587s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:32.736040212 +0000 UTC m=+143.928558076" watchObservedRunningTime="2026-02-19 21:30:32.743125587 +0000 UTC m=+143.935643441" Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.757870 4795 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-r8dcx" Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.760399 4795 patch_prober.go:28] interesting pod/router-default-5444994796-r8dcx container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.760447 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r8dcx" podUID="a3a4a159-1ab3-412f-ac71-11a7a41012ea" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.766655 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:32 crc kubenswrapper[4795]: E0219 21:30:32.767067 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:33.267051441 +0000 UTC m=+144.459569305 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.843861 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rxhk8" podStartSLOduration=120.843846856 podStartE2EDuration="2m0.843846856s" podCreationTimestamp="2026-02-19 21:28:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:32.843165588 +0000 UTC m=+144.035683452" watchObservedRunningTime="2026-02-19 21:30:32.843846856 +0000 UTC m=+144.036364720" Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.867424 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:32 crc kubenswrapper[4795]: E0219 21:30:32.867607 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:33.367578305 +0000 UTC m=+144.560096169 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.867678 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:32 crc kubenswrapper[4795]: E0219 21:30:32.868179 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:33.36816055 +0000 UTC m=+144.560678414 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.885426 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vtqjw" podStartSLOduration=119.885407281 podStartE2EDuration="1m59.885407281s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:32.884055475 +0000 UTC m=+144.076573339" watchObservedRunningTime="2026-02-19 21:30:32.885407281 +0000 UTC m=+144.077925145" Feb 19 21:30:32 crc kubenswrapper[4795]: I0219 21:30:32.968864 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:32 crc kubenswrapper[4795]: E0219 21:30:32.969953 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:33.469908266 +0000 UTC m=+144.662426130 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.075909 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:33 crc kubenswrapper[4795]: E0219 21:30:33.076197 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:33.57618513 +0000 UTC m=+144.768702994 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.176875 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:33 crc kubenswrapper[4795]: E0219 21:30:33.177596 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:33.677581756 +0000 UTC m=+144.870099620 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.283660 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:33 crc kubenswrapper[4795]: E0219 21:30:33.284076 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:33.784065145 +0000 UTC m=+144.976582999 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.390925 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:33 crc kubenswrapper[4795]: E0219 21:30:33.391318 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:33.891300724 +0000 UTC m=+145.083818588 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.416367 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xjnbz" event={"ID":"1f61e27e-ff25-4d76-814b-ed72e576547c","Type":"ContainerStarted","Data":"8bcd435b33ea136204c8f0e8c059624b480764535edbe83ef981101bcd6f4ce9"} Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.425917 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-st76p" event={"ID":"3eac6f38-0d5b-4d93-bc1d-e1b8017adfe8","Type":"ContainerStarted","Data":"9cea8165b9fb5b94d36107b221ff7de0d864648d3305ffab89f15a34c5a73433"} Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.427471 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-x54th" event={"ID":"9be22b71-1386-4be4-a38a-6f3b97669b9c","Type":"ContainerStarted","Data":"1009d387cfe21828d5eaa8b00af7c1c54ece5e43bb881a0d6dafae5d4967d158"} Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.429981 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-42wfj" event={"ID":"0da0af7f-f8f8-492d-bd44-1e81ab242a24","Type":"ContainerStarted","Data":"ca5407bd9e071e7e6f3aca525f18b8d4c2bd120c4fb40c28ae454708659ee98e"} Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.432672 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/marketplace-operator-79b997595-dfw9n" event={"ID":"7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca","Type":"ContainerStarted","Data":"79fa299dc315e3fe30e63332104d23b13faff115fe64d3c739841e3e2664edb0"} Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.432703 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-dfw9n" event={"ID":"7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca","Type":"ContainerStarted","Data":"365d5c2e07de412e6c9e8f0e65078f4ceb7110e13c2cb20266daf040eaf8acbd"} Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.433527 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-dfw9n" Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.441825 4795 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-dfw9n container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.43:8080/healthz\": dial tcp 10.217.0.43:8080: connect: connection refused" start-of-body= Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.441868 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-dfw9n" podUID="7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.43:8080/healthz\": dial tcp 10.217.0.43:8080: connect: connection refused" Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.478674 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gtgb" event={"ID":"3d1887a7-d8a5-45f6-97fc-c32a870089ef","Type":"ContainerStarted","Data":"a8be11a5ecbdba1a373b058454c84bbf743649dbe6d8762cfdb2e2bf1d51f7b7"} Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.485357 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xjnbz" podStartSLOduration=120.485334519 podStartE2EDuration="2m0.485334519s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:33.445388276 +0000 UTC m=+144.637906140" watchObservedRunningTime="2026-02-19 21:30:33.485334519 +0000 UTC m=+144.677852383" Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.490993 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525610-9dlfj" event={"ID":"05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a","Type":"ContainerStarted","Data":"8f02da0738e9d178691c0389499a8480ffbd2f961710f2202fa44516bf81c4c6"} Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.491043 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525610-9dlfj" event={"ID":"05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a","Type":"ContainerStarted","Data":"169f9f502d25a3abb7242d9edf9ee834076b8950d2fb019424ade7bd23fc1053"} Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.492851 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:33 crc kubenswrapper[4795]: E0219 21:30:33.496851 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:33.996837209 +0000 UTC m=+145.189355073 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.513618 4795 generic.go:334] "Generic (PLEG): container finished" podID="35ca1bac-5479-4e7f-80a5-c1811edc9e8e" containerID="95e19a883af5daf03192e0708c4543079b32d50d08aa677517a9f6338924bec4" exitCode=0 Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.525583 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8l99c" event={"ID":"35ca1bac-5479-4e7f-80a5-c1811edc9e8e","Type":"ContainerDied","Data":"95e19a883af5daf03192e0708c4543079b32d50d08aa677517a9f6338924bec4"} Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.529477 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-ggndt" event={"ID":"4491a495-df05-43ef-bff7-2438317eac71","Type":"ContainerStarted","Data":"8521c9a65633694ba5c47d95e1712947e71138c020f2ba77a274ce7f257f522b"} Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.532678 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-c6n7h" event={"ID":"adcffec1-66db-4e9b-9502-be3c9b008dde","Type":"ContainerStarted","Data":"77eff233c81ca07bf464ee0b564513104cf6fb1e8d21b9eff4fbfe696fb19053"} Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.533271 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-c6n7h" event={"ID":"adcffec1-66db-4e9b-9502-be3c9b008dde","Type":"ContainerStarted","Data":"be39b5fb6d1e66cf1cc04be43ee6115a7e63a9f4b93f4c6ea7c9ca27a58dd0ea"} 
Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.548436 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-st76p" podStartSLOduration=6.548417715 podStartE2EDuration="6.548417715s" podCreationTimestamp="2026-02-19 21:30:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:33.547955933 +0000 UTC m=+144.740473797" watchObservedRunningTime="2026-02-19 21:30:33.548417715 +0000 UTC m=+144.740935579" Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.548880 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-x54th" podStartSLOduration=120.548875647 podStartE2EDuration="2m0.548875647s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:33.48961698 +0000 UTC m=+144.682134844" watchObservedRunningTime="2026-02-19 21:30:33.548875647 +0000 UTC m=+144.741393511" Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.549383 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-2cbkw" event={"ID":"a82f8242-05f0-48f7-8f86-cf472309b8e3","Type":"ContainerStarted","Data":"73e9ab517dfe3bd917e8cb4a706967758fa3729eaf10d2d76e6118ebe151cb27"} Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.549442 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-2cbkw" event={"ID":"a82f8242-05f0-48f7-8f86-cf472309b8e3","Type":"ContainerStarted","Data":"94abb0779964b77bc9695db82705b627d6a09f50ec89fb6e0bd009a50c32336a"} Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.570845 4795 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nxhdc" event={"ID":"73ed383c-c1e3-4f47-86d3-6faa77121e28","Type":"ContainerStarted","Data":"86d1b0f9d34d3e67ed2159559a57b46d5af93d1e4591581042d64f48b0477702"} Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.591967 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-sd5jz" event={"ID":"b7390208-03b5-4ffa-9259-f5f1d9354c52","Type":"ContainerStarted","Data":"d5e5e4f50109a1f19e3f14be3f9a45e9fc807df79c25f4e093ae90602509eae3"} Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.596619 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:33 crc kubenswrapper[4795]: E0219 21:30:33.597937 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:34.097921397 +0000 UTC m=+145.290439261 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.600896 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-42wfj" podStartSLOduration=120.600880484 podStartE2EDuration="2m0.600880484s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:33.600159946 +0000 UTC m=+144.792677810" watchObservedRunningTime="2026-02-19 21:30:33.600880484 +0000 UTC m=+144.793398348" Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.610227 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6b8wp" event={"ID":"116348e7-5632-4244-8ae4-f81b06c6df4d","Type":"ContainerStarted","Data":"cd491d499a82aa69bb3da98d33c76fbfe67523b991060de30a6ef7599fedd161"} Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.610270 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6b8wp" event={"ID":"116348e7-5632-4244-8ae4-f81b06c6df4d","Type":"ContainerStarted","Data":"a0559ba05ff203b441c0fb43c75de1acf6cfe4fb5ffc0c89ca80f83d757b62c6"} Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.632816 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-dfw9n" podStartSLOduration=120.632799417 
podStartE2EDuration="2m0.632799417s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:33.632674554 +0000 UTC m=+144.825192428" watchObservedRunningTime="2026-02-19 21:30:33.632799417 +0000 UTC m=+144.825317281" Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.659687 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m4bg2" event={"ID":"7005ed62-dcc4-4fb5-ac2b-3aba9de5708a","Type":"ContainerStarted","Data":"ee9e3a4801a9d28bda13d120b281173089455737a945dafa5ddd8f9905238b39"} Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.661294 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m4bg2" Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.676263 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6np95" event={"ID":"89594551-78e6-49af-9376-477cf01d2dc5","Type":"ContainerStarted","Data":"7d33e8b545692ae287cea893cce811ac6b8db4d57335286c08afaf3a2c5b4415"} Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.680418 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-94s6c" event={"ID":"96b2274a-7361-4463-8562-1319e967066b","Type":"ContainerStarted","Data":"316fbea86e860fc1f5090b074e37c6e1b9b7f3e461c7ed53f8b719ad99b89de0"} Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.680448 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-94s6c" event={"ID":"96b2274a-7361-4463-8562-1319e967066b","Type":"ContainerStarted","Data":"73b6566dab869dee9b85d42f0e46ae7dcb08a351ce23a9ea17d70d9e5449f3b1"} Feb 19 21:30:33 crc 
kubenswrapper[4795]: I0219 21:30:33.680837 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-94s6c" Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.683791 4795 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-m4bg2 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" start-of-body= Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.683826 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m4bg2" podUID="7005ed62-dcc4-4fb5-ac2b-3aba9de5708a" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.684070 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-hxxkd" event={"ID":"5a3a8d91-b500-48db-9ceb-cc105b2eeb3a","Type":"ContainerStarted","Data":"8cb0ebaa94d308e7f429c893028dba537b16caa1897a2e1ce60c2be387bea7db"} Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.695726 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-5qppq" event={"ID":"c66857b5-461d-4c11-8fa3-52cd619bba60","Type":"ContainerStarted","Data":"c64024d8c4819b2555225a8768af2f8162944f51b37f9c298c87f8232cba18a9"} Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.695939 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-5qppq" event={"ID":"c66857b5-461d-4c11-8fa3-52cd619bba60","Type":"ContainerStarted","Data":"f9fdc7e0c291cf9b2f22329e019263dbbe6aa801c5e605ee9aaf41b6bfbdbf95"} Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 
21:30:33.698282 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:33 crc kubenswrapper[4795]: E0219 21:30:33.699369 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:34.199355865 +0000 UTC m=+145.391873729 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.700433 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-sd5jz" podStartSLOduration=120.700417972 podStartE2EDuration="2m0.700417972s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:33.659269868 +0000 UTC m=+144.851787732" watchObservedRunningTime="2026-02-19 21:30:33.700417972 +0000 UTC m=+144.892935836" Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.700539 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-2cbkw" podStartSLOduration=120.700535435 podStartE2EDuration="2m0.700535435s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:33.698448151 +0000 UTC m=+144.890966015" watchObservedRunningTime="2026-02-19 21:30:33.700535435 +0000 UTC m=+144.893053299" Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.729779 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gtgb" podStartSLOduration=120.729765747 podStartE2EDuration="2m0.729765747s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:33.725888436 +0000 UTC m=+144.918406300" watchObservedRunningTime="2026-02-19 21:30:33.729765747 +0000 UTC m=+144.922283611" Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.736770 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-19 21:25:32 +0000 UTC, rotation deadline is 2026-12-08 16:07:39.213325082 +0000 UTC Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.736798 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7002h37m5.476529271s for next certificate rotation Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.746401 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tg6hf" event={"ID":"b9f6ef9a-86de-4d04-ba12-30197f5c83ed","Type":"ContainerStarted","Data":"0a86514fc846cf89d9663375fe844770178044662956d35d21e21865cb19e91c"} Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.746447 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tg6hf" event={"ID":"b9f6ef9a-86de-4d04-ba12-30197f5c83ed","Type":"ContainerStarted","Data":"59df8637026a3509abce7c5c573f9d70f6008512281964698f67187d18a8d84c"} Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.758333 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vhbxd" event={"ID":"fee0aadb-9ec0-4e6c-bb48-27b74560a4ac","Type":"ContainerStarted","Data":"8f759ccd97553ffb79952af3bb4312047bf76086b9b0e676f54bb06778350bda"} Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.761142 4795 patch_prober.go:28] interesting pod/router-default-5444994796-r8dcx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 21:30:33 crc kubenswrapper[4795]: [-]has-synced failed: reason withheld Feb 19 21:30:33 crc kubenswrapper[4795]: [+]process-running ok Feb 19 21:30:33 crc kubenswrapper[4795]: healthz check failed Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.761185 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r8dcx" podUID="a3a4a159-1ab3-412f-ac71-11a7a41012ea" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.770553 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4lbxt" event={"ID":"275a566e-d165-434a-95c1-9154ede6e14e","Type":"ContainerStarted","Data":"41bd1e2fb64001b31b3537cbeb0b8f4e9d68c4a266e136ed56c8c548d3c92155"} Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.781432 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-wp452" 
event={"ID":"5667f763-a535-49eb-90a2-b78f1ebad0b7","Type":"ContainerStarted","Data":"6e3321abaa556d00dd5bb4aa054f1e474b25b71bb48beab02c6a824ce6afbf16"} Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.781483 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-wp452" event={"ID":"5667f763-a535-49eb-90a2-b78f1ebad0b7","Type":"ContainerStarted","Data":"d657e6145213a444981d32c25878e6c8aebbf41ec7496db4ae641bcf17c4600e"} Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.782367 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6b8wp" podStartSLOduration=120.78234473 podStartE2EDuration="2m0.78234473s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:33.78083369 +0000 UTC m=+144.973351554" watchObservedRunningTime="2026-02-19 21:30:33.78234473 +0000 UTC m=+144.974862594" Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.783031 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-7m8cj" event={"ID":"9ed4dfab-8b23-46d5-a983-db2ec1371ce2","Type":"ContainerStarted","Data":"035f03143c1dcd4a5cdd1970765872e48d09ffd08d71a34d5c15ff08da5fa9e3"} Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.783051 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-7m8cj" event={"ID":"9ed4dfab-8b23-46d5-a983-db2ec1371ce2","Type":"ContainerStarted","Data":"6f0f435341f7c3d5c312608d16c6301b721319ab948b919042c34efe2e1663a6"} Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.791925 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-t48rm" 
event={"ID":"ed6485ab-c517-41cd-a755-d5dc9557456b","Type":"ContainerStarted","Data":"c7c59f37f95c70304ef3b294ce77051edb09d011f297b891921c5ea231510190"} Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.799039 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sf27h" event={"ID":"821fa263-1235-4a62-8818-1f41d7e77a62","Type":"ContainerStarted","Data":"cb29f0bd525d1d4d261a058ab9d4851a3b8b426a577612388490598276033eb9"} Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.801095 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:33 crc kubenswrapper[4795]: E0219 21:30:33.802116 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:34.302077335 +0000 UTC m=+145.494595199 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.802223 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:33 crc kubenswrapper[4795]: E0219 21:30:33.806908 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:34.30688983 +0000 UTC m=+145.499407764 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.874562 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5t4db" event={"ID":"49a1bc45-62f5-45d2-a475-b2b562cd9b98","Type":"ContainerStarted","Data":"0d5b40b626fbf1ff6723285343e564ff35f9a801d6e5d76de81dafa7b485fcdd"} Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.875226 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5t4db" Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.902393 4795 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-5t4db container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.38:8443/healthz\": dial tcp 10.217.0.38:8443: connect: connection refused" start-of-body= Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.902449 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5t4db" podUID="49a1bc45-62f5-45d2-a475-b2b562cd9b98" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.38:8443/healthz\": dial tcp 10.217.0.38:8443: connect: connection refused" Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.905421 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:33 crc kubenswrapper[4795]: E0219 21:30:33.905581 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:34.405561436 +0000 UTC m=+145.598079300 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.906703 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:33 crc kubenswrapper[4795]: E0219 21:30:33.907045 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:34.407033954 +0000 UTC m=+145.599551818 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.920662 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nxhdc" podStartSLOduration=120.920642729 podStartE2EDuration="2m0.920642729s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:33.915660609 +0000 UTC m=+145.108178483" watchObservedRunningTime="2026-02-19 21:30:33.920642729 +0000 UTC m=+145.113160603" Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.920767 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29525610-9dlfj" podStartSLOduration=33.920761652 podStartE2EDuration="33.920761652s" podCreationTimestamp="2026-02-19 21:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:33.846475163 +0000 UTC m=+145.038993027" watchObservedRunningTime="2026-02-19 21:30:33.920761652 +0000 UTC m=+145.113279516" Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.976229 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pgfxb" 
event={"ID":"b11366da-6972-44ce-8e8c-151de77fa689","Type":"ContainerStarted","Data":"38e286dbef09b9d06c2af60f210cfc7b002762c25563f2396a11593922be7fae"} Feb 19 21:30:33 crc kubenswrapper[4795]: I0219 21:30:33.977242 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pgfxb" Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.000792 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4lbxt" podStartSLOduration=121.000777711 podStartE2EDuration="2m1.000777711s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:33.946905585 +0000 UTC m=+145.139423469" watchObservedRunningTime="2026-02-19 21:30:34.000777711 +0000 UTC m=+145.193295575" Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.010006 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:34 crc kubenswrapper[4795]: E0219 21:30:34.010695 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:34.510668519 +0000 UTC m=+145.703186383 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.035316 4795 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-pgfxb container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.28:5443/healthz\": dial tcp 10.217.0.28:5443: connect: connection refused" start-of-body= Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.035647 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pgfxb" podUID="b11366da-6972-44ce-8e8c-151de77fa689" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.28:5443/healthz\": dial tcp 10.217.0.28:5443: connect: connection refused" Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.072515 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-t48rm" podStartSLOduration=121.072497343 podStartE2EDuration="2m1.072497343s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:34.068056457 +0000 UTC m=+145.260574321" watchObservedRunningTime="2026-02-19 21:30:34.072497343 +0000 UTC m=+145.265015207" Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.074182 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6np95" podStartSLOduration=121.074175346 podStartE2EDuration="2m1.074175346s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:33.999748034 +0000 UTC m=+145.192265898" watchObservedRunningTime="2026-02-19 21:30:34.074175346 +0000 UTC m=+145.266693210" Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.115680 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tg6hf" podStartSLOduration=121.115665539 podStartE2EDuration="2m1.115665539s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:34.096704814 +0000 UTC m=+145.289222678" watchObservedRunningTime="2026-02-19 21:30:34.115665539 +0000 UTC m=+145.308183403" Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.116708 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-wp452" podStartSLOduration=121.116703626 podStartE2EDuration="2m1.116703626s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:34.113945624 +0000 UTC m=+145.306463488" watchObservedRunningTime="2026-02-19 21:30:34.116703626 +0000 UTC m=+145.309221490" Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.118246 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:34 crc kubenswrapper[4795]: E0219 21:30:34.119584 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:34.619569381 +0000 UTC m=+145.812087245 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.160020 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vhbxd" podStartSLOduration=121.160004157 podStartE2EDuration="2m1.160004157s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:34.159633897 +0000 UTC m=+145.352151761" watchObservedRunningTime="2026-02-19 21:30:34.160004157 +0000 UTC m=+145.352522021" Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.163704 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sf27h" podStartSLOduration=121.163695993 podStartE2EDuration="2m1.163695993s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:34.137087998 +0000 UTC m=+145.329605862" watchObservedRunningTime="2026-02-19 21:30:34.163695993 +0000 UTC m=+145.356213847" Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.186961 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5t4db" podStartSLOduration=121.1869467 podStartE2EDuration="2m1.1869467s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:34.18579208 +0000 UTC m=+145.378309944" watchObservedRunningTime="2026-02-19 21:30:34.1869467 +0000 UTC m=+145.379464564" Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.219727 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:34 crc kubenswrapper[4795]: E0219 21:30:34.219871 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:34.719857839 +0000 UTC m=+145.912375693 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.220124 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:34 crc kubenswrapper[4795]: E0219 21:30:34.220425 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:34.720417093 +0000 UTC m=+145.912934957 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.234305 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-7m8cj" podStartSLOduration=7.234292215 podStartE2EDuration="7.234292215s" podCreationTimestamp="2026-02-19 21:30:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:34.231311358 +0000 UTC m=+145.423829212" watchObservedRunningTime="2026-02-19 21:30:34.234292215 +0000 UTC m=+145.426810079" Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.234771 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m4bg2" podStartSLOduration=121.234766418 podStartE2EDuration="2m1.234766418s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:34.20956532 +0000 UTC m=+145.402083184" watchObservedRunningTime="2026-02-19 21:30:34.234766418 +0000 UTC m=+145.427284282" Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.305249 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pgfxb" podStartSLOduration=121.305236427 podStartE2EDuration="2m1.305236427s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:34.301645093 +0000 UTC m=+145.494162957" watchObservedRunningTime="2026-02-19 21:30:34.305236427 +0000 UTC m=+145.497754291" Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.305657 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-94s6c" podStartSLOduration=121.305639568 podStartE2EDuration="2m1.305639568s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:34.269454443 +0000 UTC m=+145.461972307" watchObservedRunningTime="2026-02-19 21:30:34.305639568 +0000 UTC m=+145.498157432" Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.321472 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:34 crc kubenswrapper[4795]: E0219 21:30:34.321887 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:34.821872911 +0000 UTC m=+146.014390775 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.423117 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:34 crc kubenswrapper[4795]: E0219 21:30:34.423819 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:34.923808072 +0000 UTC m=+146.116325936 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.524656 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:34 crc kubenswrapper[4795]: E0219 21:30:34.525071 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:35.025056484 +0000 UTC m=+146.217574348 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.627855 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:34 crc kubenswrapper[4795]: E0219 21:30:34.628269 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:35.128254998 +0000 UTC m=+146.320772862 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.728516 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:34 crc kubenswrapper[4795]: E0219 21:30:34.729021 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:35.229005657 +0000 UTC m=+146.421523521 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.762656 4795 patch_prober.go:28] interesting pod/router-default-5444994796-r8dcx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 21:30:34 crc kubenswrapper[4795]: [-]has-synced failed: reason withheld Feb 19 21:30:34 crc kubenswrapper[4795]: [+]process-running ok Feb 19 21:30:34 crc kubenswrapper[4795]: healthz check failed Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.762717 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r8dcx" podUID="a3a4a159-1ab3-412f-ac71-11a7a41012ea" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.815041 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gtgb" Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.815420 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gtgb" Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.829771 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:34 crc kubenswrapper[4795]: E0219 21:30:34.830383 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:35.330367733 +0000 UTC m=+146.522885587 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.930736 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:34 crc kubenswrapper[4795]: E0219 21:30:34.931097 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:35.431076151 +0000 UTC m=+146.623594015 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.975071 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-t48rm" Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.975394 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-t48rm" Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.976467 4795 patch_prober.go:28] interesting pod/apiserver-76f77b778f-t48rm container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="Get \"https://10.217.0.8:8443/livez\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.976499 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-t48rm" podUID="ed6485ab-c517-41cd-a755-d5dc9557456b" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.8:8443/livez\": dial tcp 10.217.0.8:8443: connect: connection refused" Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.982834 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8l99c" event={"ID":"35ca1bac-5479-4e7f-80a5-c1811edc9e8e","Type":"ContainerStarted","Data":"baac9fd9e19418cf7815f351248782eb3254e0242609ec82088db0d409d83853"} Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.983611 4795 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8l99c" Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.985169 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-ggndt" event={"ID":"4491a495-df05-43ef-bff7-2438317eac71","Type":"ContainerStarted","Data":"447e03980132dc620ddb750533dc7366ebeea6d6100dd620bc4d50401b5dfb1c"} Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.987377 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4lbxt" event={"ID":"275a566e-d165-434a-95c1-9154ede6e14e","Type":"ContainerStarted","Data":"6109fd1152f22fb3b193c8608c764935c95c318e0c9d1e71816f145bf442ea90"} Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.989600 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-5qppq" event={"ID":"c66857b5-461d-4c11-8fa3-52cd619bba60","Type":"ContainerStarted","Data":"effc347963e485f919cb2646b27c3004a96259c88215be04078cf0db6a682e85"} Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.991259 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-c6n7h" event={"ID":"adcffec1-66db-4e9b-9502-be3c9b008dde","Type":"ContainerStarted","Data":"562372e76d08c2a484afcd0157b6171d224a10266b2bfcba849a4ba8b46225f2"} Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.991632 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-c6n7h" Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.992583 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-hxxkd" event={"ID":"5a3a8d91-b500-48db-9ceb-cc105b2eeb3a","Type":"ContainerStarted","Data":"59d30cab330fef3987235993f1393803a98398cc08f27df1b7f764a182964aae"} Feb 19 21:30:34 crc kubenswrapper[4795]: I0219 21:30:34.994154 4795 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vhbxd" event={"ID":"fee0aadb-9ec0-4e6c-bb48-27b74560a4ac","Type":"ContainerStarted","Data":"d48d1d12d674b168b9ecf175194c27f5d1f3329e9a58960e0b5c14b09de4ca7b"} Feb 19 21:30:35 crc kubenswrapper[4795]: I0219 21:30:35.002845 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m4bg2" event={"ID":"7005ed62-dcc4-4fb5-ac2b-3aba9de5708a","Type":"ContainerStarted","Data":"90410357321a2ffb19fdf39a1cd4975b34f403ce0d02edc585087774898072f3"} Feb 19 21:30:35 crc kubenswrapper[4795]: I0219 21:30:35.004077 4795 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-m4bg2 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" start-of-body= Feb 19 21:30:35 crc kubenswrapper[4795]: I0219 21:30:35.004110 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m4bg2" podUID="7005ed62-dcc4-4fb5-ac2b-3aba9de5708a" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" Feb 19 21:30:35 crc kubenswrapper[4795]: I0219 21:30:35.008840 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-t48rm" event={"ID":"ed6485ab-c517-41cd-a755-d5dc9557456b","Type":"ContainerStarted","Data":"a3566673f0303cd80cc2ea7f3733afdf3ad6e57c924761ff75cb63c35e05fbeb"} Feb 19 21:30:35 crc kubenswrapper[4795]: I0219 21:30:35.011802 4795 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-dfw9n container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.43:8080/healthz\": dial tcp 10.217.0.43:8080: 
connect: connection refused" start-of-body= Feb 19 21:30:35 crc kubenswrapper[4795]: I0219 21:30:35.011845 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-dfw9n" podUID="7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.43:8080/healthz\": dial tcp 10.217.0.43:8080: connect: connection refused" Feb 19 21:30:35 crc kubenswrapper[4795]: I0219 21:30:35.037485 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:35 crc kubenswrapper[4795]: E0219 21:30:35.037921 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:35.53790652 +0000 UTC m=+146.730424384 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:35 crc kubenswrapper[4795]: I0219 21:30:35.052456 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5t4db" Feb 19 21:30:35 crc kubenswrapper[4795]: I0219 21:30:35.139050 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:35 crc kubenswrapper[4795]: E0219 21:30:35.141168 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:35.641150244 +0000 UTC m=+146.833668108 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:35 crc kubenswrapper[4795]: I0219 21:30:35.186881 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-5qppq" podStartSLOduration=122.186861697 podStartE2EDuration="2m2.186861697s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:35.184636859 +0000 UTC m=+146.377154753" watchObservedRunningTime="2026-02-19 21:30:35.186861697 +0000 UTC m=+146.379379561" Feb 19 21:30:35 crc kubenswrapper[4795]: I0219 21:30:35.187540 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8l99c" podStartSLOduration=122.187532425 podStartE2EDuration="2m2.187532425s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:35.093563582 +0000 UTC m=+146.286081446" watchObservedRunningTime="2026-02-19 21:30:35.187532425 +0000 UTC m=+146.380050289" Feb 19 21:30:35 crc kubenswrapper[4795]: I0219 21:30:35.241946 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: 
\"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:35 crc kubenswrapper[4795]: E0219 21:30:35.242353 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:35.742339245 +0000 UTC m=+146.934857109 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:35 crc kubenswrapper[4795]: I0219 21:30:35.250226 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-ggndt" podStartSLOduration=122.250212531 podStartE2EDuration="2m2.250212531s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:35.24866379 +0000 UTC m=+146.441181654" watchObservedRunningTime="2026-02-19 21:30:35.250212531 +0000 UTC m=+146.442730385" Feb 19 21:30:35 crc kubenswrapper[4795]: I0219 21:30:35.311421 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gtgb" Feb 19 21:30:35 crc kubenswrapper[4795]: I0219 21:30:35.342750 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:35 crc kubenswrapper[4795]: E0219 21:30:35.343184 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:35.843161037 +0000 UTC m=+147.035678901 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:35 crc kubenswrapper[4795]: I0219 21:30:35.440413 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-c6n7h" podStartSLOduration=8.440394235 podStartE2EDuration="8.440394235s" podCreationTimestamp="2026-02-19 21:30:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:35.3624284 +0000 UTC m=+146.554946264" watchObservedRunningTime="2026-02-19 21:30:35.440394235 +0000 UTC m=+146.632912099" Feb 19 21:30:35 crc kubenswrapper[4795]: I0219 21:30:35.443950 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:35 crc kubenswrapper[4795]: E0219 21:30:35.444348 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:35.944328437 +0000 UTC m=+147.136846301 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:35 crc kubenswrapper[4795]: I0219 21:30:35.545230 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:35 crc kubenswrapper[4795]: E0219 21:30:35.545416 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:36.045387625 +0000 UTC m=+147.237905489 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:35 crc kubenswrapper[4795]: I0219 21:30:35.545549 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:35 crc kubenswrapper[4795]: E0219 21:30:35.545831 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:36.045818946 +0000 UTC m=+147.238336810 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:35 crc kubenswrapper[4795]: I0219 21:30:35.647107 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:35 crc kubenswrapper[4795]: E0219 21:30:35.647427 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:36.147396657 +0000 UTC m=+147.339914521 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:35 crc kubenswrapper[4795]: I0219 21:30:35.647537 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:35 crc kubenswrapper[4795]: E0219 21:30:35.647867 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:36.14785933 +0000 UTC m=+147.340377194 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:35 crc kubenswrapper[4795]: I0219 21:30:35.748396 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:35 crc kubenswrapper[4795]: E0219 21:30:35.748583 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:36.248556808 +0000 UTC m=+147.441074672 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:35 crc kubenswrapper[4795]: I0219 21:30:35.748715 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:35 crc kubenswrapper[4795]: E0219 21:30:35.749030 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:36.2490209 +0000 UTC m=+147.441538764 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:35 crc kubenswrapper[4795]: I0219 21:30:35.759304 4795 patch_prober.go:28] interesting pod/router-default-5444994796-r8dcx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 21:30:35 crc kubenswrapper[4795]: [-]has-synced failed: reason withheld Feb 19 21:30:35 crc kubenswrapper[4795]: [+]process-running ok Feb 19 21:30:35 crc kubenswrapper[4795]: healthz check failed Feb 19 21:30:35 crc kubenswrapper[4795]: I0219 21:30:35.759364 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r8dcx" podUID="a3a4a159-1ab3-412f-ac71-11a7a41012ea" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 21:30:35 crc kubenswrapper[4795]: I0219 21:30:35.850101 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:35 crc kubenswrapper[4795]: E0219 21:30:35.850291 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-19 21:30:36.350262932 +0000 UTC m=+147.542780796 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:35 crc kubenswrapper[4795]: I0219 21:30:35.850383 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:35 crc kubenswrapper[4795]: E0219 21:30:35.850637 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:36.350627952 +0000 UTC m=+147.543145916 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:35 crc kubenswrapper[4795]: I0219 21:30:35.951837 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:35 crc kubenswrapper[4795]: E0219 21:30:35.952037 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:36.452010908 +0000 UTC m=+147.644528772 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:35 crc kubenswrapper[4795]: I0219 21:30:35.952262 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:35 crc kubenswrapper[4795]: E0219 21:30:35.952593 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:36.452580863 +0000 UTC m=+147.645098727 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.015282 4795 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-pgfxb container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.28:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.015326 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pgfxb" podUID="b11366da-6972-44ce-8e8c-151de77fa689" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.28:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.016205 4795 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-dfw9n container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.43:8080/healthz\": dial tcp 10.217.0.43:8080: connect: connection refused" start-of-body= Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.016246 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-dfw9n" podUID="7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca" containerName="marketplace-operator" probeResult="failure" output="Get 
\"http://10.217.0.43:8080/healthz\": dial tcp 10.217.0.43:8080: connect: connection refused" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.020233 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m4bg2" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.029414 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gtgb" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.052788 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:36 crc kubenswrapper[4795]: E0219 21:30:36.053119 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:36.553104416 +0000 UTC m=+147.745622280 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.065670 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zzmtm"] Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.066551 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zzmtm" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.069045 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pgfxb" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.069513 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.095972 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zzmtm"] Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.155135 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.155666 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8c7f503-32c4-4ca2-8435-9918cae8d931-catalog-content\") pod \"certified-operators-zzmtm\" (UID: \"e8c7f503-32c4-4ca2-8435-9918cae8d931\") " pod="openshift-marketplace/certified-operators-zzmtm" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.155695 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8c7f503-32c4-4ca2-8435-9918cae8d931-utilities\") pod \"certified-operators-zzmtm\" (UID: \"e8c7f503-32c4-4ca2-8435-9918cae8d931\") " pod="openshift-marketplace/certified-operators-zzmtm" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.155865 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krx4q\" (UniqueName: \"kubernetes.io/projected/e8c7f503-32c4-4ca2-8435-9918cae8d931-kube-api-access-krx4q\") pod \"certified-operators-zzmtm\" (UID: \"e8c7f503-32c4-4ca2-8435-9918cae8d931\") " pod="openshift-marketplace/certified-operators-zzmtm" Feb 19 21:30:36 crc kubenswrapper[4795]: E0219 21:30:36.161612 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:36.661596168 +0000 UTC m=+147.854114032 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.256990 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.257273 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krx4q\" (UniqueName: \"kubernetes.io/projected/e8c7f503-32c4-4ca2-8435-9918cae8d931-kube-api-access-krx4q\") pod \"certified-operators-zzmtm\" (UID: \"e8c7f503-32c4-4ca2-8435-9918cae8d931\") " pod="openshift-marketplace/certified-operators-zzmtm" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.257396 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8c7f503-32c4-4ca2-8435-9918cae8d931-catalog-content\") pod \"certified-operators-zzmtm\" (UID: \"e8c7f503-32c4-4ca2-8435-9918cae8d931\") " pod="openshift-marketplace/certified-operators-zzmtm" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.257434 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8c7f503-32c4-4ca2-8435-9918cae8d931-utilities\") pod \"certified-operators-zzmtm\" (UID: \"e8c7f503-32c4-4ca2-8435-9918cae8d931\") " 
pod="openshift-marketplace/certified-operators-zzmtm" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.257970 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8c7f503-32c4-4ca2-8435-9918cae8d931-utilities\") pod \"certified-operators-zzmtm\" (UID: \"e8c7f503-32c4-4ca2-8435-9918cae8d931\") " pod="openshift-marketplace/certified-operators-zzmtm" Feb 19 21:30:36 crc kubenswrapper[4795]: E0219 21:30:36.258089 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:36.758073726 +0000 UTC m=+147.950591590 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.258639 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8c7f503-32c4-4ca2-8435-9918cae8d931-catalog-content\") pod \"certified-operators-zzmtm\" (UID: \"e8c7f503-32c4-4ca2-8435-9918cae8d931\") " pod="openshift-marketplace/certified-operators-zzmtm" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.311362 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krx4q\" (UniqueName: \"kubernetes.io/projected/e8c7f503-32c4-4ca2-8435-9918cae8d931-kube-api-access-krx4q\") pod \"certified-operators-zzmtm\" (UID: 
\"e8c7f503-32c4-4ca2-8435-9918cae8d931\") " pod="openshift-marketplace/certified-operators-zzmtm" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.360902 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:36 crc kubenswrapper[4795]: E0219 21:30:36.361241 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:36.861212268 +0000 UTC m=+148.053730132 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.380603 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zzmtm" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.446370 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-fq62x"] Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.447492 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fq62x" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.461468 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:36 crc kubenswrapper[4795]: E0219 21:30:36.461603 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:36.961581048 +0000 UTC m=+148.154098912 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.461778 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:36 crc kubenswrapper[4795]: E0219 21:30:36.462077 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:36.96206998 +0000 UTC m=+148.154587844 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.500941 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fq62x"] Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.563085 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:36 crc kubenswrapper[4795]: E0219 21:30:36.563273 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:37.063233461 +0000 UTC m=+148.255751325 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.563602 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.563646 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:36 crc kubenswrapper[4795]: E0219 21:30:36.563926 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:37.063914648 +0000 UTC m=+148.256432512 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.563667 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.564120 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e47d73b-026b-4bc2-b1f0-c69efadc0ce7-catalog-content\") pod \"certified-operators-fq62x\" (UID: \"8e47d73b-026b-4bc2-b1f0-c69efadc0ce7\") " pod="openshift-marketplace/certified-operators-fq62x" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.564147 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.564212 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/8e47d73b-026b-4bc2-b1f0-c69efadc0ce7-utilities\") pod \"certified-operators-fq62x\" (UID: \"8e47d73b-026b-4bc2-b1f0-c69efadc0ce7\") " pod="openshift-marketplace/certified-operators-fq62x" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.564730 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.564761 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jr9vx\" (UniqueName: \"kubernetes.io/projected/8e47d73b-026b-4bc2-b1f0-c69efadc0ce7-kube-api-access-jr9vx\") pod \"certified-operators-fq62x\" (UID: \"8e47d73b-026b-4bc2-b1f0-c69efadc0ce7\") " pod="openshift-marketplace/certified-operators-fq62x" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.568640 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.569722 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.569838 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.572889 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.664018 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-c9sh5"] Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.672329 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-c9sh5" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.673883 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.674208 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jr9vx\" (UniqueName: \"kubernetes.io/projected/8e47d73b-026b-4bc2-b1f0-c69efadc0ce7-kube-api-access-jr9vx\") pod \"certified-operators-fq62x\" (UID: \"8e47d73b-026b-4bc2-b1f0-c69efadc0ce7\") " pod="openshift-marketplace/certified-operators-fq62x" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.674302 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e47d73b-026b-4bc2-b1f0-c69efadc0ce7-catalog-content\") pod \"certified-operators-fq62x\" (UID: \"8e47d73b-026b-4bc2-b1f0-c69efadc0ce7\") " pod="openshift-marketplace/certified-operators-fq62x" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.674333 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e47d73b-026b-4bc2-b1f0-c69efadc0ce7-utilities\") pod \"certified-operators-fq62x\" (UID: \"8e47d73b-026b-4bc2-b1f0-c69efadc0ce7\") " pod="openshift-marketplace/certified-operators-fq62x" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.674688 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e47d73b-026b-4bc2-b1f0-c69efadc0ce7-utilities\") pod \"certified-operators-fq62x\" (UID: \"8e47d73b-026b-4bc2-b1f0-c69efadc0ce7\") " 
pod="openshift-marketplace/certified-operators-fq62x" Feb 19 21:30:36 crc kubenswrapper[4795]: E0219 21:30:36.674756 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:37.174744381 +0000 UTC m=+148.367262245 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.675099 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e47d73b-026b-4bc2-b1f0-c69efadc0ce7-catalog-content\") pod \"certified-operators-fq62x\" (UID: \"8e47d73b-026b-4bc2-b1f0-c69efadc0ce7\") " pod="openshift-marketplace/certified-operators-fq62x" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.675528 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-c9sh5"] Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.684938 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.708025 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jr9vx\" (UniqueName: \"kubernetes.io/projected/8e47d73b-026b-4bc2-b1f0-c69efadc0ce7-kube-api-access-jr9vx\") pod \"certified-operators-fq62x\" (UID: \"8e47d73b-026b-4bc2-b1f0-c69efadc0ce7\") 
" pod="openshift-marketplace/certified-operators-fq62x" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.724089 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.734097 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.740484 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.763682 4795 patch_prober.go:28] interesting pod/router-default-5444994796-r8dcx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 21:30:36 crc kubenswrapper[4795]: [-]has-synced failed: reason withheld Feb 19 21:30:36 crc kubenswrapper[4795]: [+]process-running ok Feb 19 21:30:36 crc kubenswrapper[4795]: healthz check failed Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.763738 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r8dcx" podUID="a3a4a159-1ab3-412f-ac71-11a7a41012ea" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.774036 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fq62x" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.778957 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lns2m\" (UniqueName: \"kubernetes.io/projected/7ae7ca82-f2b1-4fec-9f66-732017519586-kube-api-access-lns2m\") pod \"community-operators-c9sh5\" (UID: \"7ae7ca82-f2b1-4fec-9f66-732017519586\") " pod="openshift-marketplace/community-operators-c9sh5" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.779000 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ae7ca82-f2b1-4fec-9f66-732017519586-utilities\") pod \"community-operators-c9sh5\" (UID: \"7ae7ca82-f2b1-4fec-9f66-732017519586\") " pod="openshift-marketplace/community-operators-c9sh5" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.779043 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ae7ca82-f2b1-4fec-9f66-732017519586-catalog-content\") pod \"community-operators-c9sh5\" (UID: \"7ae7ca82-f2b1-4fec-9f66-732017519586\") " pod="openshift-marketplace/community-operators-c9sh5" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.779074 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:36 crc kubenswrapper[4795]: E0219 21:30:36.779379 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:37.279367372 +0000 UTC m=+148.471885236 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.829640 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5j7b9"] Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.830525 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5j7b9" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.847345 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5j7b9"] Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.881621 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.881846 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ae7ca82-f2b1-4fec-9f66-732017519586-catalog-content\") pod \"community-operators-c9sh5\" (UID: \"7ae7ca82-f2b1-4fec-9f66-732017519586\") " pod="openshift-marketplace/community-operators-c9sh5" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.881929 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lns2m\" (UniqueName: \"kubernetes.io/projected/7ae7ca82-f2b1-4fec-9f66-732017519586-kube-api-access-lns2m\") pod \"community-operators-c9sh5\" (UID: \"7ae7ca82-f2b1-4fec-9f66-732017519586\") " pod="openshift-marketplace/community-operators-c9sh5" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.881962 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ae7ca82-f2b1-4fec-9f66-732017519586-utilities\") pod \"community-operators-c9sh5\" (UID: \"7ae7ca82-f2b1-4fec-9f66-732017519586\") " pod="openshift-marketplace/community-operators-c9sh5" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.882359 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ae7ca82-f2b1-4fec-9f66-732017519586-utilities\") pod \"community-operators-c9sh5\" (UID: \"7ae7ca82-f2b1-4fec-9f66-732017519586\") " pod="openshift-marketplace/community-operators-c9sh5" Feb 19 21:30:36 crc kubenswrapper[4795]: E0219 21:30:36.882423 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:37.382409141 +0000 UTC m=+148.574927005 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.882617 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ae7ca82-f2b1-4fec-9f66-732017519586-catalog-content\") pod \"community-operators-c9sh5\" (UID: \"7ae7ca82-f2b1-4fec-9f66-732017519586\") " pod="openshift-marketplace/community-operators-c9sh5" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.949138 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lns2m\" (UniqueName: \"kubernetes.io/projected/7ae7ca82-f2b1-4fec-9f66-732017519586-kube-api-access-lns2m\") pod \"community-operators-c9sh5\" (UID: \"7ae7ca82-f2b1-4fec-9f66-732017519586\") " pod="openshift-marketplace/community-operators-c9sh5" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.983024 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.983105 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fa17669-dc5e-46a8-a76d-befdbc69aeed-catalog-content\") pod 
\"community-operators-5j7b9\" (UID: \"1fa17669-dc5e-46a8-a76d-befdbc69aeed\") " pod="openshift-marketplace/community-operators-5j7b9" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.983125 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9dxw\" (UniqueName: \"kubernetes.io/projected/1fa17669-dc5e-46a8-a76d-befdbc69aeed-kube-api-access-l9dxw\") pod \"community-operators-5j7b9\" (UID: \"1fa17669-dc5e-46a8-a76d-befdbc69aeed\") " pod="openshift-marketplace/community-operators-5j7b9" Feb 19 21:30:36 crc kubenswrapper[4795]: I0219 21:30:36.983144 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fa17669-dc5e-46a8-a76d-befdbc69aeed-utilities\") pod \"community-operators-5j7b9\" (UID: \"1fa17669-dc5e-46a8-a76d-befdbc69aeed\") " pod="openshift-marketplace/community-operators-5j7b9" Feb 19 21:30:36 crc kubenswrapper[4795]: E0219 21:30:36.983428 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:37.483416927 +0000 UTC m=+148.675934781 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:37 crc kubenswrapper[4795]: I0219 21:30:37.018502 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-c9sh5" Feb 19 21:30:37 crc kubenswrapper[4795]: I0219 21:30:37.034559 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zzmtm"] Feb 19 21:30:37 crc kubenswrapper[4795]: I0219 21:30:37.049561 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-hxxkd" event={"ID":"5a3a8d91-b500-48db-9ceb-cc105b2eeb3a","Type":"ContainerStarted","Data":"5e33ec90e2349cffba1bed6b000e6ddc3fa76b571da946385658fa940b065a10"} Feb 19 21:30:37 crc kubenswrapper[4795]: I0219 21:30:37.049618 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-hxxkd" event={"ID":"5a3a8d91-b500-48db-9ceb-cc105b2eeb3a","Type":"ContainerStarted","Data":"5887ede1a359c126f9c483aacc645e76ff321fd23ce5eccfe6599ca7db6dc0e0"} Feb 19 21:30:37 crc kubenswrapper[4795]: I0219 21:30:37.069966 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8l99c" Feb 19 21:30:37 crc kubenswrapper[4795]: I0219 21:30:37.084365 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:37 crc kubenswrapper[4795]: I0219 21:30:37.084604 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fa17669-dc5e-46a8-a76d-befdbc69aeed-catalog-content\") pod \"community-operators-5j7b9\" (UID: \"1fa17669-dc5e-46a8-a76d-befdbc69aeed\") " pod="openshift-marketplace/community-operators-5j7b9" Feb 19 21:30:37 crc kubenswrapper[4795]: I0219 21:30:37.084653 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-l9dxw\" (UniqueName: \"kubernetes.io/projected/1fa17669-dc5e-46a8-a76d-befdbc69aeed-kube-api-access-l9dxw\") pod \"community-operators-5j7b9\" (UID: \"1fa17669-dc5e-46a8-a76d-befdbc69aeed\") " pod="openshift-marketplace/community-operators-5j7b9" Feb 19 21:30:37 crc kubenswrapper[4795]: I0219 21:30:37.084681 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fa17669-dc5e-46a8-a76d-befdbc69aeed-utilities\") pod \"community-operators-5j7b9\" (UID: \"1fa17669-dc5e-46a8-a76d-befdbc69aeed\") " pod="openshift-marketplace/community-operators-5j7b9" Feb 19 21:30:37 crc kubenswrapper[4795]: I0219 21:30:37.085447 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fa17669-dc5e-46a8-a76d-befdbc69aeed-utilities\") pod \"community-operators-5j7b9\" (UID: \"1fa17669-dc5e-46a8-a76d-befdbc69aeed\") " pod="openshift-marketplace/community-operators-5j7b9" Feb 19 21:30:37 crc kubenswrapper[4795]: E0219 21:30:37.085737 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:37.585723998 +0000 UTC m=+148.778241862 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:37 crc kubenswrapper[4795]: I0219 21:30:37.086054 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fa17669-dc5e-46a8-a76d-befdbc69aeed-catalog-content\") pod \"community-operators-5j7b9\" (UID: \"1fa17669-dc5e-46a8-a76d-befdbc69aeed\") " pod="openshift-marketplace/community-operators-5j7b9" Feb 19 21:30:37 crc kubenswrapper[4795]: I0219 21:30:37.101808 4795 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Feb 19 21:30:37 crc kubenswrapper[4795]: I0219 21:30:37.123807 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9dxw\" (UniqueName: \"kubernetes.io/projected/1fa17669-dc5e-46a8-a76d-befdbc69aeed-kube-api-access-l9dxw\") pod \"community-operators-5j7b9\" (UID: \"1fa17669-dc5e-46a8-a76d-befdbc69aeed\") " pod="openshift-marketplace/community-operators-5j7b9" Feb 19 21:30:37 crc kubenswrapper[4795]: I0219 21:30:37.170108 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5j7b9" Feb 19 21:30:37 crc kubenswrapper[4795]: I0219 21:30:37.186153 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:37 crc kubenswrapper[4795]: E0219 21:30:37.193564 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:37.693548162 +0000 UTC m=+148.886066026 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:37 crc kubenswrapper[4795]: I0219 21:30:37.294990 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:37 crc kubenswrapper[4795]: E0219 21:30:37.295186 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:37.795148053 +0000 UTC m=+148.987665917 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:37 crc kubenswrapper[4795]: I0219 21:30:37.295465 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:37 crc kubenswrapper[4795]: E0219 21:30:37.295875 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:37.795863261 +0000 UTC m=+148.988381125 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:37 crc kubenswrapper[4795]: I0219 21:30:37.394026 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fq62x"] Feb 19 21:30:37 crc kubenswrapper[4795]: I0219 21:30:37.396940 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:37 crc kubenswrapper[4795]: E0219 21:30:37.397041 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:37.897024452 +0000 UTC m=+149.089542316 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:37 crc kubenswrapper[4795]: I0219 21:30:37.397325 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:37 crc kubenswrapper[4795]: E0219 21:30:37.397591 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:37.897583046 +0000 UTC m=+149.090100910 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:37 crc kubenswrapper[4795]: I0219 21:30:37.479912 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-c9sh5"] Feb 19 21:30:37 crc kubenswrapper[4795]: I0219 21:30:37.498287 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:37 crc kubenswrapper[4795]: E0219 21:30:37.498630 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 21:30:37.998615743 +0000 UTC m=+149.191133607 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:37 crc kubenswrapper[4795]: I0219 21:30:37.579796 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5j7b9"] Feb 19 21:30:37 crc kubenswrapper[4795]: W0219 21:30:37.596175 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1fa17669_dc5e_46a8_a76d_befdbc69aeed.slice/crio-825dc9912efe54135ca23a728bf83c290bc47d9c0fb6c14bb961a4a89a196a6f WatchSource:0}: Error finding container 825dc9912efe54135ca23a728bf83c290bc47d9c0fb6c14bb961a4a89a196a6f: Status 404 returned error can't find the container with id 825dc9912efe54135ca23a728bf83c290bc47d9c0fb6c14bb961a4a89a196a6f Feb 19 21:30:37 crc kubenswrapper[4795]: I0219 21:30:37.599036 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:37 crc kubenswrapper[4795]: E0219 21:30:37.599344 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 21:30:38.099333292 +0000 UTC m=+149.291851156 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4h49m" (UID: "80407681-6091-46cc-836f-757ec4d16604") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 21:30:37 crc kubenswrapper[4795]: I0219 21:30:37.650401 4795 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-02-19T21:30:37.101834458Z","Handler":null,"Name":""} Feb 19 21:30:37 crc kubenswrapper[4795]: I0219 21:30:37.653018 4795 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Feb 19 21:30:37 crc kubenswrapper[4795]: I0219 21:30:37.653044 4795 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Feb 19 21:30:37 crc kubenswrapper[4795]: I0219 21:30:37.700658 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 21:30:37 crc kubenswrapper[4795]: I0219 21:30:37.706464 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: 
"8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 19 21:30:37 crc kubenswrapper[4795]: I0219 21:30:37.761839 4795 patch_prober.go:28] interesting pod/router-default-5444994796-r8dcx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 21:30:37 crc kubenswrapper[4795]: [-]has-synced failed: reason withheld Feb 19 21:30:37 crc kubenswrapper[4795]: [+]process-running ok Feb 19 21:30:37 crc kubenswrapper[4795]: healthz check failed Feb 19 21:30:37 crc kubenswrapper[4795]: I0219 21:30:37.761925 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r8dcx" podUID="a3a4a159-1ab3-412f-ac71-11a7a41012ea" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 21:30:37 crc kubenswrapper[4795]: I0219 21:30:37.802565 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:37 crc kubenswrapper[4795]: I0219 21:30:37.806645 4795 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 19 21:30:37 crc kubenswrapper[4795]: I0219 21:30:37.806690 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:37 crc kubenswrapper[4795]: I0219 21:30:37.830493 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4h49m\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:37 crc kubenswrapper[4795]: I0219 21:30:37.936944 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.051952 4795 generic.go:334] "Generic (PLEG): container finished" podID="8e47d73b-026b-4bc2-b1f0-c69efadc0ce7" containerID="7884e76708116ab40efc68a93802f7cd42d9acd2ddce9816192d25b3558b0941" exitCode=0 Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.052013 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fq62x" event={"ID":"8e47d73b-026b-4bc2-b1f0-c69efadc0ce7","Type":"ContainerDied","Data":"7884e76708116ab40efc68a93802f7cd42d9acd2ddce9816192d25b3558b0941"} Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.052346 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fq62x" event={"ID":"8e47d73b-026b-4bc2-b1f0-c69efadc0ce7","Type":"ContainerStarted","Data":"56f8a93ddff4618883796150de8b693b1c3a76f9b5f00a99b738a48400fed9ee"} Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.054344 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.060538 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-hxxkd" event={"ID":"5a3a8d91-b500-48db-9ceb-cc105b2eeb3a","Type":"ContainerStarted","Data":"eeeda87c478c0e1dada2824cab3e810a39e2f842567d1a3bb0ad3612f1a76b4a"} Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.063387 4795 generic.go:334] "Generic (PLEG): container finished" podID="7ae7ca82-f2b1-4fec-9f66-732017519586" containerID="4275386b7725c8ec761b57381bafb0a0755373bf551591680c5fa169d19954f2" exitCode=0 Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.063441 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c9sh5" 
event={"ID":"7ae7ca82-f2b1-4fec-9f66-732017519586","Type":"ContainerDied","Data":"4275386b7725c8ec761b57381bafb0a0755373bf551591680c5fa169d19954f2"} Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.063459 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c9sh5" event={"ID":"7ae7ca82-f2b1-4fec-9f66-732017519586","Type":"ContainerStarted","Data":"28fe4330f368c52acd76e8871982506eeb01297bf98866cc2f51b2139ec4aa1c"} Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.068547 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"27be47797c25c6a79413f9b396c516b314e3ef3560b58fd64ee67c1fe2df8d32"} Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.068571 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"a3ed799ffa6ca54f767f59fc5711fe1659ebf77723f8d0f85d3fc88c2808c6fe"} Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.079998 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"51b6a9c659607d7c5797c52ed691c081d0bf0058928d445bdf4dcf47ab7ea3a9"} Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.080034 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"28fdba99ff83e7589075fc6aa68f08d877c3c4ad185af341ba92b78c15e02b95"} Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.083544 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"abcb30a3d78b3a1d89fac137a91a0fcba6f1c414993c128184e69434203178ba"} Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.083565 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"df08c48e80cceee41ffef0a876495946fd36f0962bd8df4b127cce127913da82"} Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.083739 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.092342 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-hxxkd" podStartSLOduration=11.092322229 podStartE2EDuration="11.092322229s" podCreationTimestamp="2026-02-19 21:30:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:38.088614522 +0000 UTC m=+149.281132386" watchObservedRunningTime="2026-02-19 21:30:38.092322229 +0000 UTC m=+149.284840103" Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.095912 4795 generic.go:334] "Generic (PLEG): container finished" podID="e8c7f503-32c4-4ca2-8435-9918cae8d931" containerID="843a490c0d5b242211efc7cdd05db863b65b298c17a2a1bc1bcd8caa9584d3d3" exitCode=0 Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.096006 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zzmtm" event={"ID":"e8c7f503-32c4-4ca2-8435-9918cae8d931","Type":"ContainerDied","Data":"843a490c0d5b242211efc7cdd05db863b65b298c17a2a1bc1bcd8caa9584d3d3"} Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.096034 4795 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-zzmtm" event={"ID":"e8c7f503-32c4-4ca2-8435-9918cae8d931","Type":"ContainerStarted","Data":"1cc35dfba309639dbd91f218472ea2c5630e482b3631fbe826d36494a013aa5f"} Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.098029 4795 generic.go:334] "Generic (PLEG): container finished" podID="1fa17669-dc5e-46a8-a76d-befdbc69aeed" containerID="ed0bfebb52fd6133757aa14996934704806932b85d024110638b142b237490a2" exitCode=0 Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.098734 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5j7b9" event={"ID":"1fa17669-dc5e-46a8-a76d-befdbc69aeed","Type":"ContainerDied","Data":"ed0bfebb52fd6133757aa14996934704806932b85d024110638b142b237490a2"} Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.098771 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5j7b9" event={"ID":"1fa17669-dc5e-46a8-a76d-befdbc69aeed","Type":"ContainerStarted","Data":"825dc9912efe54135ca23a728bf83c290bc47d9c0fb6c14bb961a4a89a196a6f"} Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.173044 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4h49m"] Feb 19 21:30:38 crc kubenswrapper[4795]: W0219 21:30:38.190405 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80407681_6091_46cc_836f_757ec4d16604.slice/crio-f75c3a05080941863a756e5365daccf1f7896cc60502e7fce756c537524233eb WatchSource:0}: Error finding container f75c3a05080941863a756e5365daccf1f7896cc60502e7fce756c537524233eb: Status 404 returned error can't find the container with id f75c3a05080941863a756e5365daccf1f7896cc60502e7fce756c537524233eb Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.230162 4795 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-marketplace-bmzl7"] Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.238680 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bmzl7" Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.240463 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.250867 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bmzl7"] Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.324161 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5vg2\" (UniqueName: \"kubernetes.io/projected/12e5472a-2c4b-4b71-91fb-06c3d5fcca54-kube-api-access-m5vg2\") pod \"redhat-marketplace-bmzl7\" (UID: \"12e5472a-2c4b-4b71-91fb-06c3d5fcca54\") " pod="openshift-marketplace/redhat-marketplace-bmzl7" Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.324359 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12e5472a-2c4b-4b71-91fb-06c3d5fcca54-catalog-content\") pod \"redhat-marketplace-bmzl7\" (UID: \"12e5472a-2c4b-4b71-91fb-06c3d5fcca54\") " pod="openshift-marketplace/redhat-marketplace-bmzl7" Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.324457 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12e5472a-2c4b-4b71-91fb-06c3d5fcca54-utilities\") pod \"redhat-marketplace-bmzl7\" (UID: \"12e5472a-2c4b-4b71-91fb-06c3d5fcca54\") " pod="openshift-marketplace/redhat-marketplace-bmzl7" Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.425902 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12e5472a-2c4b-4b71-91fb-06c3d5fcca54-catalog-content\") pod \"redhat-marketplace-bmzl7\" (UID: \"12e5472a-2c4b-4b71-91fb-06c3d5fcca54\") " pod="openshift-marketplace/redhat-marketplace-bmzl7" Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.425957 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12e5472a-2c4b-4b71-91fb-06c3d5fcca54-utilities\") pod \"redhat-marketplace-bmzl7\" (UID: \"12e5472a-2c4b-4b71-91fb-06c3d5fcca54\") " pod="openshift-marketplace/redhat-marketplace-bmzl7" Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.426032 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5vg2\" (UniqueName: \"kubernetes.io/projected/12e5472a-2c4b-4b71-91fb-06c3d5fcca54-kube-api-access-m5vg2\") pod \"redhat-marketplace-bmzl7\" (UID: \"12e5472a-2c4b-4b71-91fb-06c3d5fcca54\") " pod="openshift-marketplace/redhat-marketplace-bmzl7" Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.426485 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12e5472a-2c4b-4b71-91fb-06c3d5fcca54-catalog-content\") pod \"redhat-marketplace-bmzl7\" (UID: \"12e5472a-2c4b-4b71-91fb-06c3d5fcca54\") " pod="openshift-marketplace/redhat-marketplace-bmzl7" Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.426514 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12e5472a-2c4b-4b71-91fb-06c3d5fcca54-utilities\") pod \"redhat-marketplace-bmzl7\" (UID: \"12e5472a-2c4b-4b71-91fb-06c3d5fcca54\") " pod="openshift-marketplace/redhat-marketplace-bmzl7" Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.446491 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5vg2\" (UniqueName: 
\"kubernetes.io/projected/12e5472a-2c4b-4b71-91fb-06c3d5fcca54-kube-api-access-m5vg2\") pod \"redhat-marketplace-bmzl7\" (UID: \"12e5472a-2c4b-4b71-91fb-06c3d5fcca54\") " pod="openshift-marketplace/redhat-marketplace-bmzl7" Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.557340 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bmzl7" Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.622551 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-v698q"] Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.626511 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v698q" Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.661554 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-v698q"] Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.732706 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d858d3ea-6432-49a9-9b32-2e36b61c6e57-utilities\") pod \"redhat-marketplace-v698q\" (UID: \"d858d3ea-6432-49a9-9b32-2e36b61c6e57\") " pod="openshift-marketplace/redhat-marketplace-v698q" Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.732756 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d858d3ea-6432-49a9-9b32-2e36b61c6e57-catalog-content\") pod \"redhat-marketplace-v698q\" (UID: \"d858d3ea-6432-49a9-9b32-2e36b61c6e57\") " pod="openshift-marketplace/redhat-marketplace-v698q" Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.732779 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnffn\" (UniqueName: 
\"kubernetes.io/projected/d858d3ea-6432-49a9-9b32-2e36b61c6e57-kube-api-access-hnffn\") pod \"redhat-marketplace-v698q\" (UID: \"d858d3ea-6432-49a9-9b32-2e36b61c6e57\") " pod="openshift-marketplace/redhat-marketplace-v698q" Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.760685 4795 patch_prober.go:28] interesting pod/router-default-5444994796-r8dcx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 21:30:38 crc kubenswrapper[4795]: [-]has-synced failed: reason withheld Feb 19 21:30:38 crc kubenswrapper[4795]: [+]process-running ok Feb 19 21:30:38 crc kubenswrapper[4795]: healthz check failed Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.760756 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r8dcx" podUID="a3a4a159-1ab3-412f-ac71-11a7a41012ea" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.774860 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bmzl7"] Feb 19 21:30:38 crc kubenswrapper[4795]: W0219 21:30:38.806869 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12e5472a_2c4b_4b71_91fb_06c3d5fcca54.slice/crio-e5d0662258d341fed3d8a6fa8a495b8db4873d0c1396a7edd323344c9bdab748 WatchSource:0}: Error finding container e5d0662258d341fed3d8a6fa8a495b8db4873d0c1396a7edd323344c9bdab748: Status 404 returned error can't find the container with id e5d0662258d341fed3d8a6fa8a495b8db4873d0c1396a7edd323344c9bdab748 Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.833666 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/d858d3ea-6432-49a9-9b32-2e36b61c6e57-utilities\") pod \"redhat-marketplace-v698q\" (UID: \"d858d3ea-6432-49a9-9b32-2e36b61c6e57\") " pod="openshift-marketplace/redhat-marketplace-v698q" Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.833711 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d858d3ea-6432-49a9-9b32-2e36b61c6e57-catalog-content\") pod \"redhat-marketplace-v698q\" (UID: \"d858d3ea-6432-49a9-9b32-2e36b61c6e57\") " pod="openshift-marketplace/redhat-marketplace-v698q" Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.833736 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnffn\" (UniqueName: \"kubernetes.io/projected/d858d3ea-6432-49a9-9b32-2e36b61c6e57-kube-api-access-hnffn\") pod \"redhat-marketplace-v698q\" (UID: \"d858d3ea-6432-49a9-9b32-2e36b61c6e57\") " pod="openshift-marketplace/redhat-marketplace-v698q" Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.834257 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d858d3ea-6432-49a9-9b32-2e36b61c6e57-catalog-content\") pod \"redhat-marketplace-v698q\" (UID: \"d858d3ea-6432-49a9-9b32-2e36b61c6e57\") " pod="openshift-marketplace/redhat-marketplace-v698q" Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.834715 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d858d3ea-6432-49a9-9b32-2e36b61c6e57-utilities\") pod \"redhat-marketplace-v698q\" (UID: \"d858d3ea-6432-49a9-9b32-2e36b61c6e57\") " pod="openshift-marketplace/redhat-marketplace-v698q" Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.854772 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnffn\" (UniqueName: 
\"kubernetes.io/projected/d858d3ea-6432-49a9-9b32-2e36b61c6e57-kube-api-access-hnffn\") pod \"redhat-marketplace-v698q\" (UID: \"d858d3ea-6432-49a9-9b32-2e36b61c6e57\") " pod="openshift-marketplace/redhat-marketplace-v698q" Feb 19 21:30:38 crc kubenswrapper[4795]: I0219 21:30:38.955930 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v698q" Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.110412 4795 generic.go:334] "Generic (PLEG): container finished" podID="05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a" containerID="8f02da0738e9d178691c0389499a8480ffbd2f961710f2202fa44516bf81c4c6" exitCode=0 Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.110479 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525610-9dlfj" event={"ID":"05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a","Type":"ContainerDied","Data":"8f02da0738e9d178691c0389499a8480ffbd2f961710f2202fa44516bf81c4c6"} Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.114205 4795 generic.go:334] "Generic (PLEG): container finished" podID="12e5472a-2c4b-4b71-91fb-06c3d5fcca54" containerID="7d3a5224f2284bc0e0b27ce31d0a2ef7f65f8de8a465113fd61e0813421d7fde" exitCode=0 Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.114288 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bmzl7" event={"ID":"12e5472a-2c4b-4b71-91fb-06c3d5fcca54","Type":"ContainerDied","Data":"7d3a5224f2284bc0e0b27ce31d0a2ef7f65f8de8a465113fd61e0813421d7fde"} Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.114315 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bmzl7" event={"ID":"12e5472a-2c4b-4b71-91fb-06c3d5fcca54","Type":"ContainerStarted","Data":"e5d0662258d341fed3d8a6fa8a495b8db4873d0c1396a7edd323344c9bdab748"} Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.153243 4795 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" event={"ID":"80407681-6091-46cc-836f-757ec4d16604","Type":"ContainerStarted","Data":"65bbbf8046489156a597ecad5046b6cb27a3c794bdbc97d6bd9820be41cf0ce9"} Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.153293 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" event={"ID":"80407681-6091-46cc-836f-757ec4d16604","Type":"ContainerStarted","Data":"f75c3a05080941863a756e5365daccf1f7896cc60502e7fce756c537524233eb"} Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.180844 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" podStartSLOduration=126.180823239 podStartE2EDuration="2m6.180823239s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:39.17857686 +0000 UTC m=+150.371094734" watchObservedRunningTime="2026-02-19 21:30:39.180823239 +0000 UTC m=+150.373341103" Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.340987 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-v698q"] Feb 19 21:30:39 crc kubenswrapper[4795]: W0219 21:30:39.355051 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd858d3ea_6432_49a9_9b32_2e36b61c6e57.slice/crio-28460660f6d692ba6e8f6b91806cd443070cee612e5427f6f9dabd127b7a144e WatchSource:0}: Error finding container 28460660f6d692ba6e8f6b91806cd443070cee612e5427f6f9dabd127b7a144e: Status 404 returned error can't find the container with id 28460660f6d692ba6e8f6b91806cd443070cee612e5427f6f9dabd127b7a144e Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.434496 4795 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-5tclw"] Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.436479 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5tclw" Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.437501 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5tclw"] Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.438961 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.544517 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.545398 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd50e9de-1cec-4f7c-8308-e0e5eb7961b9-utilities\") pod \"redhat-operators-5tclw\" (UID: \"dd50e9de-1cec-4f7c-8308-e0e5eb7961b9\") " pod="openshift-marketplace/redhat-operators-5tclw" Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.545553 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd50e9de-1cec-4f7c-8308-e0e5eb7961b9-catalog-content\") pod \"redhat-operators-5tclw\" (UID: \"dd50e9de-1cec-4f7c-8308-e0e5eb7961b9\") " pod="openshift-marketplace/redhat-operators-5tclw" Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.545581 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ddlx\" (UniqueName: \"kubernetes.io/projected/dd50e9de-1cec-4f7c-8308-e0e5eb7961b9-kube-api-access-5ddlx\") pod \"redhat-operators-5tclw\" (UID: 
\"dd50e9de-1cec-4f7c-8308-e0e5eb7961b9\") " pod="openshift-marketplace/redhat-operators-5tclw" Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.646607 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd50e9de-1cec-4f7c-8308-e0e5eb7961b9-catalog-content\") pod \"redhat-operators-5tclw\" (UID: \"dd50e9de-1cec-4f7c-8308-e0e5eb7961b9\") " pod="openshift-marketplace/redhat-operators-5tclw" Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.646659 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ddlx\" (UniqueName: \"kubernetes.io/projected/dd50e9de-1cec-4f7c-8308-e0e5eb7961b9-kube-api-access-5ddlx\") pod \"redhat-operators-5tclw\" (UID: \"dd50e9de-1cec-4f7c-8308-e0e5eb7961b9\") " pod="openshift-marketplace/redhat-operators-5tclw" Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.646720 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd50e9de-1cec-4f7c-8308-e0e5eb7961b9-utilities\") pod \"redhat-operators-5tclw\" (UID: \"dd50e9de-1cec-4f7c-8308-e0e5eb7961b9\") " pod="openshift-marketplace/redhat-operators-5tclw" Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.648118 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd50e9de-1cec-4f7c-8308-e0e5eb7961b9-utilities\") pod \"redhat-operators-5tclw\" (UID: \"dd50e9de-1cec-4f7c-8308-e0e5eb7961b9\") " pod="openshift-marketplace/redhat-operators-5tclw" Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.648588 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd50e9de-1cec-4f7c-8308-e0e5eb7961b9-catalog-content\") pod \"redhat-operators-5tclw\" (UID: \"dd50e9de-1cec-4f7c-8308-e0e5eb7961b9\") " 
pod="openshift-marketplace/redhat-operators-5tclw" Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.678355 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ddlx\" (UniqueName: \"kubernetes.io/projected/dd50e9de-1cec-4f7c-8308-e0e5eb7961b9-kube-api-access-5ddlx\") pod \"redhat-operators-5tclw\" (UID: \"dd50e9de-1cec-4f7c-8308-e0e5eb7961b9\") " pod="openshift-marketplace/redhat-operators-5tclw" Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.698418 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.699334 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.701528 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.702065 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.705163 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.756226 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bd535c51-1ece-4449-823a-cf80a095eaeb-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"bd535c51-1ece-4449-823a-cf80a095eaeb\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.756321 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/bd535c51-1ece-4449-823a-cf80a095eaeb-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"bd535c51-1ece-4449-823a-cf80a095eaeb\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.757345 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5tclw" Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.763408 4795 patch_prober.go:28] interesting pod/router-default-5444994796-r8dcx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 21:30:39 crc kubenswrapper[4795]: [-]has-synced failed: reason withheld Feb 19 21:30:39 crc kubenswrapper[4795]: [+]process-running ok Feb 19 21:30:39 crc kubenswrapper[4795]: healthz check failed Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.763799 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r8dcx" podUID="a3a4a159-1ab3-412f-ac71-11a7a41012ea" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.784275 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-bks74" Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.785809 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-rvkhj" Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.785864 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-rvkhj" Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.786763 4795 patch_prober.go:28] interesting pod/console-f9d7485db-rvkhj container/console namespace/openshift-console: Startup 
probe status=failure output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.786828 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-rvkhj" podUID="ec60d287-0f21-467c-8030-84b8726af567" containerName="console" probeResult="failure" output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.792471 4795 patch_prober.go:28] interesting pod/downloads-7954f5f757-rt6qz container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.792509 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-rt6qz" podUID="96e1b4b4-8d48-4955-b756-71d21a5aea0b" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.792515 4795 patch_prober.go:28] interesting pod/downloads-7954f5f757-rt6qz container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.792551 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-rt6qz" podUID="96e1b4b4-8d48-4955-b756-71d21a5aea0b" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.831280 4795 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/redhat-operators-zpkx6"] Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.859566 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bd535c51-1ece-4449-823a-cf80a095eaeb-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"bd535c51-1ece-4449-823a-cf80a095eaeb\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.859841 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bd535c51-1ece-4449-823a-cf80a095eaeb-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"bd535c51-1ece-4449-823a-cf80a095eaeb\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.867526 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bd535c51-1ece-4449-823a-cf80a095eaeb-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"bd535c51-1ece-4449-823a-cf80a095eaeb\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.869647 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zpkx6"] Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.869769 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zpkx6" Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.917727 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bd535c51-1ece-4449-823a-cf80a095eaeb-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"bd535c51-1ece-4449-823a-cf80a095eaeb\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.961790 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jst7w\" (UniqueName: \"kubernetes.io/projected/ac83daf6-848e-4977-8bb9-a7b4db89618f-kube-api-access-jst7w\") pod \"redhat-operators-zpkx6\" (UID: \"ac83daf6-848e-4977-8bb9-a7b4db89618f\") " pod="openshift-marketplace/redhat-operators-zpkx6" Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.961890 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac83daf6-848e-4977-8bb9-a7b4db89618f-utilities\") pod \"redhat-operators-zpkx6\" (UID: \"ac83daf6-848e-4977-8bb9-a7b4db89618f\") " pod="openshift-marketplace/redhat-operators-zpkx6" Feb 19 21:30:39 crc kubenswrapper[4795]: I0219 21:30:39.961915 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac83daf6-848e-4977-8bb9-a7b4db89618f-catalog-content\") pod \"redhat-operators-zpkx6\" (UID: \"ac83daf6-848e-4977-8bb9-a7b4db89618f\") " pod="openshift-marketplace/redhat-operators-zpkx6" Feb 19 21:30:40 crc kubenswrapper[4795]: I0219 21:30:40.008558 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-t48rm" Feb 19 21:30:40 crc kubenswrapper[4795]: I0219 21:30:40.020242 4795 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-t48rm" Feb 19 21:30:40 crc kubenswrapper[4795]: I0219 21:30:40.045635 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 21:30:40 crc kubenswrapper[4795]: I0219 21:30:40.064523 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac83daf6-848e-4977-8bb9-a7b4db89618f-utilities\") pod \"redhat-operators-zpkx6\" (UID: \"ac83daf6-848e-4977-8bb9-a7b4db89618f\") " pod="openshift-marketplace/redhat-operators-zpkx6" Feb 19 21:30:40 crc kubenswrapper[4795]: I0219 21:30:40.064613 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac83daf6-848e-4977-8bb9-a7b4db89618f-catalog-content\") pod \"redhat-operators-zpkx6\" (UID: \"ac83daf6-848e-4977-8bb9-a7b4db89618f\") " pod="openshift-marketplace/redhat-operators-zpkx6" Feb 19 21:30:40 crc kubenswrapper[4795]: I0219 21:30:40.064729 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jst7w\" (UniqueName: \"kubernetes.io/projected/ac83daf6-848e-4977-8bb9-a7b4db89618f-kube-api-access-jst7w\") pod \"redhat-operators-zpkx6\" (UID: \"ac83daf6-848e-4977-8bb9-a7b4db89618f\") " pod="openshift-marketplace/redhat-operators-zpkx6" Feb 19 21:30:40 crc kubenswrapper[4795]: I0219 21:30:40.066519 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac83daf6-848e-4977-8bb9-a7b4db89618f-utilities\") pod \"redhat-operators-zpkx6\" (UID: \"ac83daf6-848e-4977-8bb9-a7b4db89618f\") " pod="openshift-marketplace/redhat-operators-zpkx6" Feb 19 21:30:40 crc kubenswrapper[4795]: I0219 21:30:40.066971 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/ac83daf6-848e-4977-8bb9-a7b4db89618f-catalog-content\") pod \"redhat-operators-zpkx6\" (UID: \"ac83daf6-848e-4977-8bb9-a7b4db89618f\") " pod="openshift-marketplace/redhat-operators-zpkx6" Feb 19 21:30:40 crc kubenswrapper[4795]: I0219 21:30:40.111878 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jst7w\" (UniqueName: \"kubernetes.io/projected/ac83daf6-848e-4977-8bb9-a7b4db89618f-kube-api-access-jst7w\") pod \"redhat-operators-zpkx6\" (UID: \"ac83daf6-848e-4977-8bb9-a7b4db89618f\") " pod="openshift-marketplace/redhat-operators-zpkx6" Feb 19 21:30:40 crc kubenswrapper[4795]: I0219 21:30:40.209729 4795 generic.go:334] "Generic (PLEG): container finished" podID="d858d3ea-6432-49a9-9b32-2e36b61c6e57" containerID="3f2cfda31513c4d8d601e01af9b97c3ba687e6c071cb1cb6f9eb5d2af4229073" exitCode=0 Feb 19 21:30:40 crc kubenswrapper[4795]: I0219 21:30:40.210670 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v698q" event={"ID":"d858d3ea-6432-49a9-9b32-2e36b61c6e57","Type":"ContainerDied","Data":"3f2cfda31513c4d8d601e01af9b97c3ba687e6c071cb1cb6f9eb5d2af4229073"} Feb 19 21:30:40 crc kubenswrapper[4795]: I0219 21:30:40.210702 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v698q" event={"ID":"d858d3ea-6432-49a9-9b32-2e36b61c6e57","Type":"ContainerStarted","Data":"28460660f6d692ba6e8f6b91806cd443070cee612e5427f6f9dabd127b7a144e"} Feb 19 21:30:40 crc kubenswrapper[4795]: I0219 21:30:40.211096 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:40 crc kubenswrapper[4795]: I0219 21:30:40.236217 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zpkx6" Feb 19 21:30:40 crc kubenswrapper[4795]: I0219 21:30:40.387204 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5tclw"] Feb 19 21:30:40 crc kubenswrapper[4795]: I0219 21:30:40.582476 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 19 21:30:40 crc kubenswrapper[4795]: I0219 21:30:40.671467 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zpkx6"] Feb 19 21:30:40 crc kubenswrapper[4795]: I0219 21:30:40.712254 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525610-9dlfj" Feb 19 21:30:40 crc kubenswrapper[4795]: W0219 21:30:40.712606 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac83daf6_848e_4977_8bb9_a7b4db89618f.slice/crio-5f603a895c6c8cc9d233f8386c8619bd99f45021b342fc0088eeb4edeed70ce6 WatchSource:0}: Error finding container 5f603a895c6c8cc9d233f8386c8619bd99f45021b342fc0088eeb4edeed70ce6: Status 404 returned error can't find the container with id 5f603a895c6c8cc9d233f8386c8619bd99f45021b342fc0088eeb4edeed70ce6 Feb 19 21:30:40 crc kubenswrapper[4795]: I0219 21:30:40.757271 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-r8dcx" Feb 19 21:30:40 crc kubenswrapper[4795]: I0219 21:30:40.765146 4795 patch_prober.go:28] interesting pod/router-default-5444994796-r8dcx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 21:30:40 crc kubenswrapper[4795]: [-]has-synced failed: reason withheld Feb 19 21:30:40 crc kubenswrapper[4795]: [+]process-running ok 
Feb 19 21:30:40 crc kubenswrapper[4795]: healthz check failed Feb 19 21:30:40 crc kubenswrapper[4795]: I0219 21:30:40.765765 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r8dcx" podUID="a3a4a159-1ab3-412f-ac71-11a7a41012ea" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 21:30:40 crc kubenswrapper[4795]: I0219 21:30:40.792568 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a-secret-volume\") pod \"05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a\" (UID: \"05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a\") " Feb 19 21:30:40 crc kubenswrapper[4795]: I0219 21:30:40.792638 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a-config-volume\") pod \"05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a\" (UID: \"05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a\") " Feb 19 21:30:40 crc kubenswrapper[4795]: I0219 21:30:40.792672 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9bnm\" (UniqueName: \"kubernetes.io/projected/05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a-kube-api-access-w9bnm\") pod \"05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a\" (UID: \"05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a\") " Feb 19 21:30:40 crc kubenswrapper[4795]: I0219 21:30:40.798452 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a-config-volume" (OuterVolumeSpecName: "config-volume") pod "05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a" (UID: "05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:30:40 crc kubenswrapper[4795]: I0219 21:30:40.800232 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a-kube-api-access-w9bnm" (OuterVolumeSpecName: "kube-api-access-w9bnm") pod "05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a" (UID: "05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a"). InnerVolumeSpecName "kube-api-access-w9bnm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:30:40 crc kubenswrapper[4795]: I0219 21:30:40.801022 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a" (UID: "05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:30:40 crc kubenswrapper[4795]: I0219 21:30:40.882152 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-dfw9n" Feb 19 21:30:40 crc kubenswrapper[4795]: I0219 21:30:40.898346 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9bnm\" (UniqueName: \"kubernetes.io/projected/05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a-kube-api-access-w9bnm\") on node \"crc\" DevicePath \"\"" Feb 19 21:30:40 crc kubenswrapper[4795]: I0219 21:30:40.898463 4795 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 19 21:30:40 crc kubenswrapper[4795]: I0219 21:30:40.898473 4795 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a-config-volume\") on node \"crc\" DevicePath \"\"" Feb 19 21:30:41 crc 
kubenswrapper[4795]: I0219 21:30:41.222046 4795 generic.go:334] "Generic (PLEG): container finished" podID="ac83daf6-848e-4977-8bb9-a7b4db89618f" containerID="b9d88a868f45e4d7fdf10f8b6b1fdb6c04b5f7137907622ee25cca321d0d04fd" exitCode=0 Feb 19 21:30:41 crc kubenswrapper[4795]: I0219 21:30:41.222236 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zpkx6" event={"ID":"ac83daf6-848e-4977-8bb9-a7b4db89618f","Type":"ContainerDied","Data":"b9d88a868f45e4d7fdf10f8b6b1fdb6c04b5f7137907622ee25cca321d0d04fd"} Feb 19 21:30:41 crc kubenswrapper[4795]: I0219 21:30:41.222463 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zpkx6" event={"ID":"ac83daf6-848e-4977-8bb9-a7b4db89618f","Type":"ContainerStarted","Data":"5f603a895c6c8cc9d233f8386c8619bd99f45021b342fc0088eeb4edeed70ce6"} Feb 19 21:30:41 crc kubenswrapper[4795]: I0219 21:30:41.229057 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"bd535c51-1ece-4449-823a-cf80a095eaeb","Type":"ContainerStarted","Data":"8c3aef20d6522b3352b0833650c257ae48627d27a744a1ceabc69c6a6ad6e576"} Feb 19 21:30:41 crc kubenswrapper[4795]: I0219 21:30:41.229081 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"bd535c51-1ece-4449-823a-cf80a095eaeb","Type":"ContainerStarted","Data":"5c4de5efa33ea824be10af821058f85db139c88f413554e1f8448d076363ae4c"} Feb 19 21:30:41 crc kubenswrapper[4795]: I0219 21:30:41.232230 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525610-9dlfj" event={"ID":"05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a","Type":"ContainerDied","Data":"169f9f502d25a3abb7242d9edf9ee834076b8950d2fb019424ade7bd23fc1053"} Feb 19 21:30:41 crc kubenswrapper[4795]: I0219 21:30:41.232254 4795 pod_container_deletor.go:80] "Container not found 
in pod's containers" containerID="169f9f502d25a3abb7242d9edf9ee834076b8950d2fb019424ade7bd23fc1053" Feb 19 21:30:41 crc kubenswrapper[4795]: I0219 21:30:41.232291 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525610-9dlfj" Feb 19 21:30:41 crc kubenswrapper[4795]: I0219 21:30:41.249591 4795 generic.go:334] "Generic (PLEG): container finished" podID="dd50e9de-1cec-4f7c-8308-e0e5eb7961b9" containerID="e32b30648a4cba474dcfdca283954a8496e4f09ea18f9bc9f3563fdbbe7453f6" exitCode=0 Feb 19 21:30:41 crc kubenswrapper[4795]: I0219 21:30:41.250454 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5tclw" event={"ID":"dd50e9de-1cec-4f7c-8308-e0e5eb7961b9","Type":"ContainerDied","Data":"e32b30648a4cba474dcfdca283954a8496e4f09ea18f9bc9f3563fdbbe7453f6"} Feb 19 21:30:41 crc kubenswrapper[4795]: I0219 21:30:41.250636 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5tclw" event={"ID":"dd50e9de-1cec-4f7c-8308-e0e5eb7961b9","Type":"ContainerStarted","Data":"d567b35b55d0a2cbb795271be9aef7ece38aac167cf48328a27f71f0b916ce76"} Feb 19 21:30:41 crc kubenswrapper[4795]: I0219 21:30:41.273360 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.273333902 podStartE2EDuration="2.273333902s" podCreationTimestamp="2026-02-19 21:30:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:41.258328031 +0000 UTC m=+152.450845895" watchObservedRunningTime="2026-02-19 21:30:41.273333902 +0000 UTC m=+152.465851776" Feb 19 21:30:41 crc kubenswrapper[4795]: I0219 21:30:41.291627 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 19 21:30:41 crc 
kubenswrapper[4795]: E0219 21:30:41.291849 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a" containerName="collect-profiles" Feb 19 21:30:41 crc kubenswrapper[4795]: I0219 21:30:41.291865 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a" containerName="collect-profiles" Feb 19 21:30:41 crc kubenswrapper[4795]: I0219 21:30:41.291970 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a" containerName="collect-profiles" Feb 19 21:30:41 crc kubenswrapper[4795]: I0219 21:30:41.293721 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 19 21:30:41 crc kubenswrapper[4795]: I0219 21:30:41.297572 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 19 21:30:41 crc kubenswrapper[4795]: I0219 21:30:41.299848 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 19 21:30:41 crc kubenswrapper[4795]: I0219 21:30:41.314971 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 19 21:30:41 crc kubenswrapper[4795]: I0219 21:30:41.402810 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6ac441c3-3e7c-482d-a324-0c383d0be8ef-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"6ac441c3-3e7c-482d-a324-0c383d0be8ef\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 19 21:30:41 crc kubenswrapper[4795]: I0219 21:30:41.403216 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6ac441c3-3e7c-482d-a324-0c383d0be8ef-kubelet-dir\") pod 
\"revision-pruner-8-crc\" (UID: \"6ac441c3-3e7c-482d-a324-0c383d0be8ef\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 19 21:30:41 crc kubenswrapper[4795]: I0219 21:30:41.505059 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6ac441c3-3e7c-482d-a324-0c383d0be8ef-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"6ac441c3-3e7c-482d-a324-0c383d0be8ef\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 19 21:30:41 crc kubenswrapper[4795]: I0219 21:30:41.505111 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6ac441c3-3e7c-482d-a324-0c383d0be8ef-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"6ac441c3-3e7c-482d-a324-0c383d0be8ef\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 19 21:30:41 crc kubenswrapper[4795]: I0219 21:30:41.505265 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6ac441c3-3e7c-482d-a324-0c383d0be8ef-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"6ac441c3-3e7c-482d-a324-0c383d0be8ef\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 19 21:30:41 crc kubenswrapper[4795]: I0219 21:30:41.520830 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6ac441c3-3e7c-482d-a324-0c383d0be8ef-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"6ac441c3-3e7c-482d-a324-0c383d0be8ef\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 19 21:30:41 crc kubenswrapper[4795]: I0219 21:30:41.612813 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 19 21:30:41 crc kubenswrapper[4795]: I0219 21:30:41.760441 4795 patch_prober.go:28] interesting pod/router-default-5444994796-r8dcx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 21:30:41 crc kubenswrapper[4795]: [+]has-synced ok Feb 19 21:30:41 crc kubenswrapper[4795]: [+]process-running ok Feb 19 21:30:41 crc kubenswrapper[4795]: healthz check failed Feb 19 21:30:41 crc kubenswrapper[4795]: I0219 21:30:41.760790 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r8dcx" podUID="a3a4a159-1ab3-412f-ac71-11a7a41012ea" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 21:30:41 crc kubenswrapper[4795]: I0219 21:30:41.929301 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 19 21:30:42 crc kubenswrapper[4795]: I0219 21:30:42.258611 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"bd535c51-1ece-4449-823a-cf80a095eaeb","Type":"ContainerDied","Data":"8c3aef20d6522b3352b0833650c257ae48627d27a744a1ceabc69c6a6ad6e576"} Feb 19 21:30:42 crc kubenswrapper[4795]: I0219 21:30:42.259074 4795 generic.go:334] "Generic (PLEG): container finished" podID="bd535c51-1ece-4449-823a-cf80a095eaeb" containerID="8c3aef20d6522b3352b0833650c257ae48627d27a744a1ceabc69c6a6ad6e576" exitCode=0 Feb 19 21:30:42 crc kubenswrapper[4795]: I0219 21:30:42.261923 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"6ac441c3-3e7c-482d-a324-0c383d0be8ef","Type":"ContainerStarted","Data":"aeab273dd8030b77f965cee3b847bb8f7437db9bf610dae5a29b1f66ab406354"} Feb 19 21:30:42 crc 
kubenswrapper[4795]: I0219 21:30:42.764028 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-r8dcx" Feb 19 21:30:42 crc kubenswrapper[4795]: I0219 21:30:42.772640 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-r8dcx" Feb 19 21:30:43 crc kubenswrapper[4795]: I0219 21:30:43.287123 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"6ac441c3-3e7c-482d-a324-0c383d0be8ef","Type":"ContainerStarted","Data":"029945b018ee1703eb6b861ff838efae767f8cd12918eeea6f22170a6224219e"} Feb 19 21:30:43 crc kubenswrapper[4795]: I0219 21:30:43.306117 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.306095587 podStartE2EDuration="2.306095587s" podCreationTimestamp="2026-02-19 21:30:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:30:43.299474335 +0000 UTC m=+154.491992209" watchObservedRunningTime="2026-02-19 21:30:43.306095587 +0000 UTC m=+154.498613451" Feb 19 21:30:44 crc kubenswrapper[4795]: I0219 21:30:44.294388 4795 generic.go:334] "Generic (PLEG): container finished" podID="6ac441c3-3e7c-482d-a324-0c383d0be8ef" containerID="029945b018ee1703eb6b861ff838efae767f8cd12918eeea6f22170a6224219e" exitCode=0 Feb 19 21:30:44 crc kubenswrapper[4795]: I0219 21:30:44.294478 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"6ac441c3-3e7c-482d-a324-0c383d0be8ef","Type":"ContainerDied","Data":"029945b018ee1703eb6b861ff838efae767f8cd12918eeea6f22170a6224219e"} Feb 19 21:30:45 crc kubenswrapper[4795]: I0219 21:30:45.907910 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-dns/dns-default-c6n7h" Feb 19 21:30:49 crc kubenswrapper[4795]: I0219 21:30:49.796711 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-rt6qz" Feb 19 21:30:49 crc kubenswrapper[4795]: I0219 21:30:49.841736 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-rvkhj" Feb 19 21:30:49 crc kubenswrapper[4795]: I0219 21:30:49.845967 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-rvkhj" Feb 19 21:30:50 crc kubenswrapper[4795]: I0219 21:30:50.767962 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 21:30:50 crc kubenswrapper[4795]: I0219 21:30:50.872230 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bd535c51-1ece-4449-823a-cf80a095eaeb-kube-api-access\") pod \"bd535c51-1ece-4449-823a-cf80a095eaeb\" (UID: \"bd535c51-1ece-4449-823a-cf80a095eaeb\") " Feb 19 21:30:50 crc kubenswrapper[4795]: I0219 21:30:50.872737 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bd535c51-1ece-4449-823a-cf80a095eaeb-kubelet-dir\") pod \"bd535c51-1ece-4449-823a-cf80a095eaeb\" (UID: \"bd535c51-1ece-4449-823a-cf80a095eaeb\") " Feb 19 21:30:50 crc kubenswrapper[4795]: I0219 21:30:50.872986 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bd535c51-1ece-4449-823a-cf80a095eaeb-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "bd535c51-1ece-4449-823a-cf80a095eaeb" (UID: "bd535c51-1ece-4449-823a-cf80a095eaeb"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:30:50 crc kubenswrapper[4795]: I0219 21:30:50.873127 4795 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bd535c51-1ece-4449-823a-cf80a095eaeb-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 19 21:30:50 crc kubenswrapper[4795]: I0219 21:30:50.882370 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd535c51-1ece-4449-823a-cf80a095eaeb-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "bd535c51-1ece-4449-823a-cf80a095eaeb" (UID: "bd535c51-1ece-4449-823a-cf80a095eaeb"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:30:50 crc kubenswrapper[4795]: I0219 21:30:50.974611 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bd535c51-1ece-4449-823a-cf80a095eaeb-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 19 21:30:51 crc kubenswrapper[4795]: I0219 21:30:51.374796 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"bd535c51-1ece-4449-823a-cf80a095eaeb","Type":"ContainerDied","Data":"5c4de5efa33ea824be10af821058f85db139c88f413554e1f8448d076363ae4c"} Feb 19 21:30:51 crc kubenswrapper[4795]: I0219 21:30:51.374839 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c4de5efa33ea824be10af821058f85db139c88f413554e1f8448d076363ae4c" Feb 19 21:30:51 crc kubenswrapper[4795]: I0219 21:30:51.374866 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 21:30:51 crc kubenswrapper[4795]: I0219 21:30:51.547459 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 19 21:30:51 crc kubenswrapper[4795]: I0219 21:30:51.580610 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6ac441c3-3e7c-482d-a324-0c383d0be8ef-kubelet-dir\") pod \"6ac441c3-3e7c-482d-a324-0c383d0be8ef\" (UID: \"6ac441c3-3e7c-482d-a324-0c383d0be8ef\") " Feb 19 21:30:51 crc kubenswrapper[4795]: I0219 21:30:51.580974 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6ac441c3-3e7c-482d-a324-0c383d0be8ef-kube-api-access\") pod \"6ac441c3-3e7c-482d-a324-0c383d0be8ef\" (UID: \"6ac441c3-3e7c-482d-a324-0c383d0be8ef\") " Feb 19 21:30:51 crc kubenswrapper[4795]: I0219 21:30:51.580761 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6ac441c3-3e7c-482d-a324-0c383d0be8ef-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "6ac441c3-3e7c-482d-a324-0c383d0be8ef" (UID: "6ac441c3-3e7c-482d-a324-0c383d0be8ef"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:30:51 crc kubenswrapper[4795]: I0219 21:30:51.584687 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ac441c3-3e7c-482d-a324-0c383d0be8ef-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "6ac441c3-3e7c-482d-a324-0c383d0be8ef" (UID: "6ac441c3-3e7c-482d-a324-0c383d0be8ef"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:30:51 crc kubenswrapper[4795]: I0219 21:30:51.683020 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6ac441c3-3e7c-482d-a324-0c383d0be8ef-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 19 21:30:51 crc kubenswrapper[4795]: I0219 21:30:51.683049 4795 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6ac441c3-3e7c-482d-a324-0c383d0be8ef-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 19 21:30:52 crc kubenswrapper[4795]: I0219 21:30:52.380804 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"6ac441c3-3e7c-482d-a324-0c383d0be8ef","Type":"ContainerDied","Data":"aeab273dd8030b77f965cee3b847bb8f7437db9bf610dae5a29b1f66ab406354"} Feb 19 21:30:52 crc kubenswrapper[4795]: I0219 21:30:52.380858 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 19 21:30:52 crc kubenswrapper[4795]: I0219 21:30:52.380870 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aeab273dd8030b77f965cee3b847bb8f7437db9bf610dae5a29b1f66ab406354" Feb 19 21:30:55 crc kubenswrapper[4795]: I0219 21:30:55.329657 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b1b4346-e02e-4614-b2ff-e4628046a92f-metrics-certs\") pod \"network-metrics-daemon-ff4bs\" (UID: \"1b1b4346-e02e-4614-b2ff-e4628046a92f\") " pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:30:55 crc kubenswrapper[4795]: I0219 21:30:55.333598 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b1b4346-e02e-4614-b2ff-e4628046a92f-metrics-certs\") pod \"network-metrics-daemon-ff4bs\" (UID: \"1b1b4346-e02e-4614-b2ff-e4628046a92f\") " pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:30:55 crc kubenswrapper[4795]: I0219 21:30:55.527795 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff4bs" Feb 19 21:30:57 crc kubenswrapper[4795]: I0219 21:30:57.942776 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:30:58 crc kubenswrapper[4795]: I0219 21:30:58.427477 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 21:30:58 crc kubenswrapper[4795]: I0219 21:30:58.427527 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 21:31:08 crc kubenswrapper[4795]: E0219 21:31:08.491890 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Feb 19 21:31:08 crc kubenswrapper[4795]: E0219 21:31:08.492798 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jst7w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-zpkx6_openshift-marketplace(ac83daf6-848e-4977-8bb9-a7b4db89618f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 19 21:31:08 crc kubenswrapper[4795]: E0219 21:31:08.494118 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-zpkx6" podUID="ac83daf6-848e-4977-8bb9-a7b4db89618f" Feb 19 21:31:08 crc 
kubenswrapper[4795]: I0219 21:31:08.777421 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-ff4bs"] Feb 19 21:31:08 crc kubenswrapper[4795]: W0219 21:31:08.785633 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b1b4346_e02e_4614_b2ff_e4628046a92f.slice/crio-1216a86ba52aa2b06a57fc597c8fa689ed6ddd544ad0f4709f859040348d817f WatchSource:0}: Error finding container 1216a86ba52aa2b06a57fc597c8fa689ed6ddd544ad0f4709f859040348d817f: Status 404 returned error can't find the container with id 1216a86ba52aa2b06a57fc597c8fa689ed6ddd544ad0f4709f859040348d817f Feb 19 21:31:09 crc kubenswrapper[4795]: I0219 21:31:09.467107 4795 generic.go:334] "Generic (PLEG): container finished" podID="1fa17669-dc5e-46a8-a76d-befdbc69aeed" containerID="90b9312e0f5408fc251999d131140fd68f045334ecdf8f5a7e0995574996fb05" exitCode=0 Feb 19 21:31:09 crc kubenswrapper[4795]: I0219 21:31:09.467475 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5j7b9" event={"ID":"1fa17669-dc5e-46a8-a76d-befdbc69aeed","Type":"ContainerDied","Data":"90b9312e0f5408fc251999d131140fd68f045334ecdf8f5a7e0995574996fb05"} Feb 19 21:31:09 crc kubenswrapper[4795]: I0219 21:31:09.470131 4795 generic.go:334] "Generic (PLEG): container finished" podID="e8c7f503-32c4-4ca2-8435-9918cae8d931" containerID="2bc16809b25a9de1b1eb527dd20e661703d80bd6e75ac128fd97a218812a8320" exitCode=0 Feb 19 21:31:09 crc kubenswrapper[4795]: I0219 21:31:09.470200 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zzmtm" event={"ID":"e8c7f503-32c4-4ca2-8435-9918cae8d931","Type":"ContainerDied","Data":"2bc16809b25a9de1b1eb527dd20e661703d80bd6e75ac128fd97a218812a8320"} Feb 19 21:31:09 crc kubenswrapper[4795]: I0219 21:31:09.472851 4795 generic.go:334] "Generic (PLEG): container finished" 
podID="12e5472a-2c4b-4b71-91fb-06c3d5fcca54" containerID="cb4588aca4785a571a304fb53b83b17a5d4fe720455aa15c94f9117f8cbe5514" exitCode=0 Feb 19 21:31:09 crc kubenswrapper[4795]: I0219 21:31:09.472875 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bmzl7" event={"ID":"12e5472a-2c4b-4b71-91fb-06c3d5fcca54","Type":"ContainerDied","Data":"cb4588aca4785a571a304fb53b83b17a5d4fe720455aa15c94f9117f8cbe5514"} Feb 19 21:31:09 crc kubenswrapper[4795]: I0219 21:31:09.475534 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ff4bs" event={"ID":"1b1b4346-e02e-4614-b2ff-e4628046a92f","Type":"ContainerStarted","Data":"4d69db8866e7c06900356daa727588178e6f7179fc0418bc2e66851ee0f66ee5"} Feb 19 21:31:09 crc kubenswrapper[4795]: I0219 21:31:09.475599 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ff4bs" event={"ID":"1b1b4346-e02e-4614-b2ff-e4628046a92f","Type":"ContainerStarted","Data":"9cfc570c13b0a6e9e34f1e123b7f4cab69a16b6ca9c1d2cc2e68267610fe8d15"} Feb 19 21:31:09 crc kubenswrapper[4795]: I0219 21:31:09.475615 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ff4bs" event={"ID":"1b1b4346-e02e-4614-b2ff-e4628046a92f","Type":"ContainerStarted","Data":"1216a86ba52aa2b06a57fc597c8fa689ed6ddd544ad0f4709f859040348d817f"} Feb 19 21:31:09 crc kubenswrapper[4795]: I0219 21:31:09.480212 4795 generic.go:334] "Generic (PLEG): container finished" podID="8e47d73b-026b-4bc2-b1f0-c69efadc0ce7" containerID="b730d866cb5930ce6b3bd30068b74b30f568d9da17e9b40a916b2db10b9b7380" exitCode=0 Feb 19 21:31:09 crc kubenswrapper[4795]: I0219 21:31:09.480237 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fq62x" event={"ID":"8e47d73b-026b-4bc2-b1f0-c69efadc0ce7","Type":"ContainerDied","Data":"b730d866cb5930ce6b3bd30068b74b30f568d9da17e9b40a916b2db10b9b7380"} 
Feb 19 21:31:09 crc kubenswrapper[4795]: I0219 21:31:09.484608 4795 generic.go:334] "Generic (PLEG): container finished" podID="d858d3ea-6432-49a9-9b32-2e36b61c6e57" containerID="76fd4bbcb0278e373cff16d83c7ceda3fc1a11cdd6076ee88c76e62b714b1bad" exitCode=0 Feb 19 21:31:09 crc kubenswrapper[4795]: I0219 21:31:09.484698 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v698q" event={"ID":"d858d3ea-6432-49a9-9b32-2e36b61c6e57","Type":"ContainerDied","Data":"76fd4bbcb0278e373cff16d83c7ceda3fc1a11cdd6076ee88c76e62b714b1bad"} Feb 19 21:31:09 crc kubenswrapper[4795]: I0219 21:31:09.490415 4795 generic.go:334] "Generic (PLEG): container finished" podID="7ae7ca82-f2b1-4fec-9f66-732017519586" containerID="099f13240cf3aea087d163d3e1052bf3c9c7278f2cd6182ccc06266ec67326ae" exitCode=0 Feb 19 21:31:09 crc kubenswrapper[4795]: I0219 21:31:09.490551 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c9sh5" event={"ID":"7ae7ca82-f2b1-4fec-9f66-732017519586","Type":"ContainerDied","Data":"099f13240cf3aea087d163d3e1052bf3c9c7278f2cd6182ccc06266ec67326ae"} Feb 19 21:31:09 crc kubenswrapper[4795]: I0219 21:31:09.498709 4795 generic.go:334] "Generic (PLEG): container finished" podID="dd50e9de-1cec-4f7c-8308-e0e5eb7961b9" containerID="1fec75a7b046544f542f990a4f1a07786af212b43341a6cab76f9622d84d42c4" exitCode=0 Feb 19 21:31:09 crc kubenswrapper[4795]: I0219 21:31:09.499551 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5tclw" event={"ID":"dd50e9de-1cec-4f7c-8308-e0e5eb7961b9","Type":"ContainerDied","Data":"1fec75a7b046544f542f990a4f1a07786af212b43341a6cab76f9622d84d42c4"} Feb 19 21:31:09 crc kubenswrapper[4795]: E0219 21:31:09.510805 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image 
\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-zpkx6" podUID="ac83daf6-848e-4977-8bb9-a7b4db89618f" Feb 19 21:31:09 crc kubenswrapper[4795]: I0219 21:31:09.525310 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-ff4bs" podStartSLOduration=156.525288879 podStartE2EDuration="2m36.525288879s" podCreationTimestamp="2026-02-19 21:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:31:09.507742581 +0000 UTC m=+180.700260465" watchObservedRunningTime="2026-02-19 21:31:09.525288879 +0000 UTC m=+180.717806753" Feb 19 21:31:10 crc kubenswrapper[4795]: I0219 21:31:10.524935 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zzmtm" event={"ID":"e8c7f503-32c4-4ca2-8435-9918cae8d931","Type":"ContainerStarted","Data":"e2b3b578e0a130165bd461f501cf99b487cf4f990927f2541dd89aab2055e28d"} Feb 19 21:31:10 crc kubenswrapper[4795]: I0219 21:31:10.532424 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-94s6c" Feb 19 21:31:10 crc kubenswrapper[4795]: I0219 21:31:10.549459 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fq62x" event={"ID":"8e47d73b-026b-4bc2-b1f0-c69efadc0ce7","Type":"ContainerStarted","Data":"13c02d833399591c84abe5da0c16dc2fb6486b3bcc08499be35a9d2502d4f09f"} Feb 19 21:31:10 crc kubenswrapper[4795]: I0219 21:31:10.581051 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zzmtm" podStartSLOduration=2.43949081 podStartE2EDuration="34.581012803s" podCreationTimestamp="2026-02-19 21:30:36 +0000 UTC" firstStartedPulling="2026-02-19 21:30:38.097073263 +0000 UTC m=+149.289591117" 
lastFinishedPulling="2026-02-19 21:31:10.238595236 +0000 UTC m=+181.431113110" observedRunningTime="2026-02-19 21:31:10.553772082 +0000 UTC m=+181.746289966" watchObservedRunningTime="2026-02-19 21:31:10.581012803 +0000 UTC m=+181.773530697" Feb 19 21:31:10 crc kubenswrapper[4795]: I0219 21:31:10.619777 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-fq62x" podStartSLOduration=2.344405468 podStartE2EDuration="34.619743154s" podCreationTimestamp="2026-02-19 21:30:36 +0000 UTC" firstStartedPulling="2026-02-19 21:30:38.05405459 +0000 UTC m=+149.246572454" lastFinishedPulling="2026-02-19 21:31:10.329392276 +0000 UTC m=+181.521910140" observedRunningTime="2026-02-19 21:31:10.596923008 +0000 UTC m=+181.789440902" watchObservedRunningTime="2026-02-19 21:31:10.619743154 +0000 UTC m=+181.812261008" Feb 19 21:31:11 crc kubenswrapper[4795]: I0219 21:31:11.555711 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5j7b9" event={"ID":"1fa17669-dc5e-46a8-a76d-befdbc69aeed","Type":"ContainerStarted","Data":"f41b709a11c74f3734b7f736873f5d029cecf42b67def1cf94218b316be19474"} Feb 19 21:31:11 crc kubenswrapper[4795]: I0219 21:31:11.557752 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bmzl7" event={"ID":"12e5472a-2c4b-4b71-91fb-06c3d5fcca54","Type":"ContainerStarted","Data":"124253c24d0ba04a222871263e3eb4ccb45c5fd2f1555778cc72b79051abee81"} Feb 19 21:31:11 crc kubenswrapper[4795]: I0219 21:31:11.559449 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v698q" event={"ID":"d858d3ea-6432-49a9-9b32-2e36b61c6e57","Type":"ContainerStarted","Data":"a7485d12d59f4055683cb6ae750e079b0729d5deb16ed6d10099b8d940c03a00"} Feb 19 21:31:11 crc kubenswrapper[4795]: I0219 21:31:11.561212 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-c9sh5" event={"ID":"7ae7ca82-f2b1-4fec-9f66-732017519586","Type":"ContainerStarted","Data":"e2d47055ec62ef6c48d9090ad6f5104fd8d79584ae42ab1689cf9d640447aeec"} Feb 19 21:31:11 crc kubenswrapper[4795]: I0219 21:31:11.563927 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5tclw" event={"ID":"dd50e9de-1cec-4f7c-8308-e0e5eb7961b9","Type":"ContainerStarted","Data":"7c6c77a2d2c99d511b04824d382fd22558c9e465bd1270597992cd286f73dd2f"} Feb 19 21:31:11 crc kubenswrapper[4795]: I0219 21:31:11.573661 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bmzl7" podStartSLOduration=2.3790651130000002 podStartE2EDuration="33.573648021s" podCreationTimestamp="2026-02-19 21:30:38 +0000 UTC" firstStartedPulling="2026-02-19 21:30:39.116845559 +0000 UTC m=+150.309363423" lastFinishedPulling="2026-02-19 21:31:10.311428437 +0000 UTC m=+181.503946331" observedRunningTime="2026-02-19 21:31:10.621004567 +0000 UTC m=+181.813522471" watchObservedRunningTime="2026-02-19 21:31:11.573648021 +0000 UTC m=+182.766165885" Feb 19 21:31:11 crc kubenswrapper[4795]: I0219 21:31:11.576756 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5j7b9" podStartSLOduration=3.138516995 podStartE2EDuration="35.576749902s" podCreationTimestamp="2026-02-19 21:30:36 +0000 UTC" firstStartedPulling="2026-02-19 21:30:38.099827935 +0000 UTC m=+149.292345809" lastFinishedPulling="2026-02-19 21:31:10.538060842 +0000 UTC m=+181.730578716" observedRunningTime="2026-02-19 21:31:11.572693576 +0000 UTC m=+182.765211440" watchObservedRunningTime="2026-02-19 21:31:11.576749902 +0000 UTC m=+182.769267766" Feb 19 21:31:11 crc kubenswrapper[4795]: I0219 21:31:11.595621 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-v698q" 
podStartSLOduration=3.064592825 podStartE2EDuration="33.595601034s" podCreationTimestamp="2026-02-19 21:30:38 +0000 UTC" firstStartedPulling="2026-02-19 21:30:40.234600812 +0000 UTC m=+151.427118676" lastFinishedPulling="2026-02-19 21:31:10.765609021 +0000 UTC m=+181.958126885" observedRunningTime="2026-02-19 21:31:11.592494643 +0000 UTC m=+182.785012497" watchObservedRunningTime="2026-02-19 21:31:11.595601034 +0000 UTC m=+182.788118898" Feb 19 21:31:11 crc kubenswrapper[4795]: I0219 21:31:11.611843 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-c9sh5" podStartSLOduration=3.030864155 podStartE2EDuration="35.611822307s" podCreationTimestamp="2026-02-19 21:30:36 +0000 UTC" firstStartedPulling="2026-02-19 21:30:38.06629394 +0000 UTC m=+149.258811804" lastFinishedPulling="2026-02-19 21:31:10.647252092 +0000 UTC m=+181.839769956" observedRunningTime="2026-02-19 21:31:11.611788096 +0000 UTC m=+182.804305970" watchObservedRunningTime="2026-02-19 21:31:11.611822307 +0000 UTC m=+182.804340171" Feb 19 21:31:11 crc kubenswrapper[4795]: I0219 21:31:11.636745 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5tclw" podStartSLOduration=3.480116909 podStartE2EDuration="32.636724877s" podCreationTimestamp="2026-02-19 21:30:39 +0000 UTC" firstStartedPulling="2026-02-19 21:30:41.265940009 +0000 UTC m=+152.458457873" lastFinishedPulling="2026-02-19 21:31:10.422547977 +0000 UTC m=+181.615065841" observedRunningTime="2026-02-19 21:31:11.635712631 +0000 UTC m=+182.828230505" watchObservedRunningTime="2026-02-19 21:31:11.636724877 +0000 UTC m=+182.829242741" Feb 19 21:31:16 crc kubenswrapper[4795]: I0219 21:31:16.381630 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zzmtm" Feb 19 21:31:16 crc kubenswrapper[4795]: I0219 21:31:16.382239 4795 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/certified-operators-zzmtm" Feb 19 21:31:16 crc kubenswrapper[4795]: I0219 21:31:16.470744 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 19 21:31:16 crc kubenswrapper[4795]: E0219 21:31:16.470940 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ac441c3-3e7c-482d-a324-0c383d0be8ef" containerName="pruner" Feb 19 21:31:16 crc kubenswrapper[4795]: I0219 21:31:16.470951 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ac441c3-3e7c-482d-a324-0c383d0be8ef" containerName="pruner" Feb 19 21:31:16 crc kubenswrapper[4795]: E0219 21:31:16.470960 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd535c51-1ece-4449-823a-cf80a095eaeb" containerName="pruner" Feb 19 21:31:16 crc kubenswrapper[4795]: I0219 21:31:16.470966 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd535c51-1ece-4449-823a-cf80a095eaeb" containerName="pruner" Feb 19 21:31:16 crc kubenswrapper[4795]: I0219 21:31:16.471054 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd535c51-1ece-4449-823a-cf80a095eaeb" containerName="pruner" Feb 19 21:31:16 crc kubenswrapper[4795]: I0219 21:31:16.471068 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ac441c3-3e7c-482d-a324-0c383d0be8ef" containerName="pruner" Feb 19 21:31:16 crc kubenswrapper[4795]: I0219 21:31:16.471451 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 19 21:31:16 crc kubenswrapper[4795]: I0219 21:31:16.475096 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 19 21:31:16 crc kubenswrapper[4795]: I0219 21:31:16.475431 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 19 21:31:16 crc kubenswrapper[4795]: I0219 21:31:16.486495 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 19 21:31:16 crc kubenswrapper[4795]: I0219 21:31:16.503998 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9b7f5192-259d-44ff-9e42-5ab977c95519-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"9b7f5192-259d-44ff-9e42-5ab977c95519\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 19 21:31:16 crc kubenswrapper[4795]: I0219 21:31:16.504279 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9b7f5192-259d-44ff-9e42-5ab977c95519-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"9b7f5192-259d-44ff-9e42-5ab977c95519\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 19 21:31:16 crc kubenswrapper[4795]: I0219 21:31:16.605688 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9b7f5192-259d-44ff-9e42-5ab977c95519-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"9b7f5192-259d-44ff-9e42-5ab977c95519\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 19 21:31:16 crc kubenswrapper[4795]: I0219 21:31:16.605771 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/9b7f5192-259d-44ff-9e42-5ab977c95519-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"9b7f5192-259d-44ff-9e42-5ab977c95519\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 19 21:31:16 crc kubenswrapper[4795]: I0219 21:31:16.605868 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9b7f5192-259d-44ff-9e42-5ab977c95519-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"9b7f5192-259d-44ff-9e42-5ab977c95519\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 19 21:31:16 crc kubenswrapper[4795]: I0219 21:31:16.623981 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9b7f5192-259d-44ff-9e42-5ab977c95519-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"9b7f5192-259d-44ff-9e42-5ab977c95519\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 19 21:31:16 crc kubenswrapper[4795]: I0219 21:31:16.757486 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 21:31:16 crc kubenswrapper[4795]: I0219 21:31:16.776473 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-fq62x" Feb 19 21:31:16 crc kubenswrapper[4795]: I0219 21:31:16.776536 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-fq62x" Feb 19 21:31:16 crc kubenswrapper[4795]: I0219 21:31:16.778695 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zzmtm" Feb 19 21:31:16 crc kubenswrapper[4795]: I0219 21:31:16.787868 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 19 21:31:16 crc kubenswrapper[4795]: I0219 21:31:16.829510 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-fq62x" Feb 19 21:31:16 crc kubenswrapper[4795]: I0219 21:31:16.830515 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zzmtm" Feb 19 21:31:17 crc kubenswrapper[4795]: I0219 21:31:17.020483 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-c9sh5" Feb 19 21:31:17 crc kubenswrapper[4795]: I0219 21:31:17.020813 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-c9sh5" Feb 19 21:31:17 crc kubenswrapper[4795]: I0219 21:31:17.064798 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-c9sh5" Feb 19 21:31:17 crc kubenswrapper[4795]: I0219 21:31:17.170657 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5j7b9" Feb 19 21:31:17 crc kubenswrapper[4795]: I0219 21:31:17.170771 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5j7b9" Feb 19 21:31:17 crc kubenswrapper[4795]: I0219 21:31:17.209344 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5j7b9" Feb 19 21:31:17 crc kubenswrapper[4795]: I0219 21:31:17.227034 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 19 21:31:17 crc kubenswrapper[4795]: W0219 21:31:17.233186 4795 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-pod9b7f5192_259d_44ff_9e42_5ab977c95519.slice/crio-080d0dae8e8a181a5f82b3abfaa74c44eff0ab9f2582c06c29aee612f9a99152 WatchSource:0}: Error finding container 080d0dae8e8a181a5f82b3abfaa74c44eff0ab9f2582c06c29aee612f9a99152: Status 404 returned error can't find the container with id 080d0dae8e8a181a5f82b3abfaa74c44eff0ab9f2582c06c29aee612f9a99152 Feb 19 21:31:17 crc kubenswrapper[4795]: I0219 21:31:17.598523 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"9b7f5192-259d-44ff-9e42-5ab977c95519","Type":"ContainerStarted","Data":"080d0dae8e8a181a5f82b3abfaa74c44eff0ab9f2582c06c29aee612f9a99152"} Feb 19 21:31:17 crc kubenswrapper[4795]: I0219 21:31:17.640875 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-c9sh5" Feb 19 21:31:17 crc kubenswrapper[4795]: I0219 21:31:17.642894 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5j7b9" Feb 19 21:31:17 crc kubenswrapper[4795]: I0219 21:31:17.643334 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-fq62x" Feb 19 21:31:18 crc kubenswrapper[4795]: I0219 21:31:18.427398 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fq62x"] Feb 19 21:31:18 crc kubenswrapper[4795]: I0219 21:31:18.557974 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bmzl7" Feb 19 21:31:18 crc kubenswrapper[4795]: I0219 21:31:18.558041 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bmzl7" Feb 19 21:31:18 crc kubenswrapper[4795]: I0219 21:31:18.597417 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-bmzl7" Feb 19 21:31:18 crc kubenswrapper[4795]: I0219 21:31:18.603732 4795 generic.go:334] "Generic (PLEG): container finished" podID="9b7f5192-259d-44ff-9e42-5ab977c95519" containerID="f9d8e5687bb8b8d8eea28ea3abe15bf40dbdf4c746016f8ddecd8da2c5b538dd" exitCode=0 Feb 19 21:31:18 crc kubenswrapper[4795]: I0219 21:31:18.603821 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"9b7f5192-259d-44ff-9e42-5ab977c95519","Type":"ContainerDied","Data":"f9d8e5687bb8b8d8eea28ea3abe15bf40dbdf4c746016f8ddecd8da2c5b538dd"} Feb 19 21:31:18 crc kubenswrapper[4795]: I0219 21:31:18.649599 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bmzl7" Feb 19 21:31:18 crc kubenswrapper[4795]: I0219 21:31:18.920537 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-pb7s7"] Feb 19 21:31:18 crc kubenswrapper[4795]: I0219 21:31:18.956086 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-v698q" Feb 19 21:31:18 crc kubenswrapper[4795]: I0219 21:31:18.956154 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-v698q" Feb 19 21:31:19 crc kubenswrapper[4795]: I0219 21:31:19.000784 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-v698q" Feb 19 21:31:19 crc kubenswrapper[4795]: I0219 21:31:19.428554 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5j7b9"] Feb 19 21:31:19 crc kubenswrapper[4795]: I0219 21:31:19.614749 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-fq62x" podUID="8e47d73b-026b-4bc2-b1f0-c69efadc0ce7" 
containerName="registry-server" containerID="cri-o://13c02d833399591c84abe5da0c16dc2fb6486b3bcc08499be35a9d2502d4f09f" gracePeriod=2 Feb 19 21:31:19 crc kubenswrapper[4795]: I0219 21:31:19.665351 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-v698q" Feb 19 21:31:19 crc kubenswrapper[4795]: I0219 21:31:19.758624 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5tclw" Feb 19 21:31:19 crc kubenswrapper[4795]: I0219 21:31:19.758673 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5tclw" Feb 19 21:31:19 crc kubenswrapper[4795]: I0219 21:31:19.823970 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5tclw" Feb 19 21:31:19 crc kubenswrapper[4795]: I0219 21:31:19.933064 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 19 21:31:19 crc kubenswrapper[4795]: I0219 21:31:19.945873 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9b7f5192-259d-44ff-9e42-5ab977c95519-kubelet-dir\") pod \"9b7f5192-259d-44ff-9e42-5ab977c95519\" (UID: \"9b7f5192-259d-44ff-9e42-5ab977c95519\") " Feb 19 21:31:19 crc kubenswrapper[4795]: I0219 21:31:19.945924 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9b7f5192-259d-44ff-9e42-5ab977c95519-kube-api-access\") pod \"9b7f5192-259d-44ff-9e42-5ab977c95519\" (UID: \"9b7f5192-259d-44ff-9e42-5ab977c95519\") " Feb 19 21:31:19 crc kubenswrapper[4795]: I0219 21:31:19.949280 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9b7f5192-259d-44ff-9e42-5ab977c95519-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "9b7f5192-259d-44ff-9e42-5ab977c95519" (UID: "9b7f5192-259d-44ff-9e42-5ab977c95519"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:31:19 crc kubenswrapper[4795]: I0219 21:31:19.951337 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b7f5192-259d-44ff-9e42-5ab977c95519-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "9b7f5192-259d-44ff-9e42-5ab977c95519" (UID: "9b7f5192-259d-44ff-9e42-5ab977c95519"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:31:20 crc kubenswrapper[4795]: I0219 21:31:20.046887 4795 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9b7f5192-259d-44ff-9e42-5ab977c95519-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 19 21:31:20 crc kubenswrapper[4795]: I0219 21:31:20.046916 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9b7f5192-259d-44ff-9e42-5ab977c95519-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 19 21:31:20 crc kubenswrapper[4795]: I0219 21:31:20.623276 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 19 21:31:20 crc kubenswrapper[4795]: I0219 21:31:20.623276 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"9b7f5192-259d-44ff-9e42-5ab977c95519","Type":"ContainerDied","Data":"080d0dae8e8a181a5f82b3abfaa74c44eff0ab9f2582c06c29aee612f9a99152"} Feb 19 21:31:20 crc kubenswrapper[4795]: I0219 21:31:20.623735 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="080d0dae8e8a181a5f82b3abfaa74c44eff0ab9f2582c06c29aee612f9a99152" Feb 19 21:31:20 crc kubenswrapper[4795]: I0219 21:31:20.625236 4795 generic.go:334] "Generic (PLEG): container finished" podID="8e47d73b-026b-4bc2-b1f0-c69efadc0ce7" containerID="13c02d833399591c84abe5da0c16dc2fb6486b3bcc08499be35a9d2502d4f09f" exitCode=0 Feb 19 21:31:20 crc kubenswrapper[4795]: I0219 21:31:20.625393 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fq62x" event={"ID":"8e47d73b-026b-4bc2-b1f0-c69efadc0ce7","Type":"ContainerDied","Data":"13c02d833399591c84abe5da0c16dc2fb6486b3bcc08499be35a9d2502d4f09f"} Feb 19 21:31:20 crc kubenswrapper[4795]: I0219 21:31:20.626230 4795 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-marketplace/community-operators-5j7b9" podUID="1fa17669-dc5e-46a8-a76d-befdbc69aeed" containerName="registry-server" containerID="cri-o://f41b709a11c74f3734b7f736873f5d029cecf42b67def1cf94218b316be19474" gracePeriod=2 Feb 19 21:31:20 crc kubenswrapper[4795]: I0219 21:31:20.661733 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5tclw" Feb 19 21:31:21 crc kubenswrapper[4795]: I0219 21:31:21.122906 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fq62x" Feb 19 21:31:21 crc kubenswrapper[4795]: I0219 21:31:21.260240 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jr9vx\" (UniqueName: \"kubernetes.io/projected/8e47d73b-026b-4bc2-b1f0-c69efadc0ce7-kube-api-access-jr9vx\") pod \"8e47d73b-026b-4bc2-b1f0-c69efadc0ce7\" (UID: \"8e47d73b-026b-4bc2-b1f0-c69efadc0ce7\") " Feb 19 21:31:21 crc kubenswrapper[4795]: I0219 21:31:21.260347 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e47d73b-026b-4bc2-b1f0-c69efadc0ce7-utilities\") pod \"8e47d73b-026b-4bc2-b1f0-c69efadc0ce7\" (UID: \"8e47d73b-026b-4bc2-b1f0-c69efadc0ce7\") " Feb 19 21:31:21 crc kubenswrapper[4795]: I0219 21:31:21.260373 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e47d73b-026b-4bc2-b1f0-c69efadc0ce7-catalog-content\") pod \"8e47d73b-026b-4bc2-b1f0-c69efadc0ce7\" (UID: \"8e47d73b-026b-4bc2-b1f0-c69efadc0ce7\") " Feb 19 21:31:21 crc kubenswrapper[4795]: I0219 21:31:21.261096 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e47d73b-026b-4bc2-b1f0-c69efadc0ce7-utilities" (OuterVolumeSpecName: "utilities") pod 
"8e47d73b-026b-4bc2-b1f0-c69efadc0ce7" (UID: "8e47d73b-026b-4bc2-b1f0-c69efadc0ce7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:31:21 crc kubenswrapper[4795]: I0219 21:31:21.265963 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e47d73b-026b-4bc2-b1f0-c69efadc0ce7-kube-api-access-jr9vx" (OuterVolumeSpecName: "kube-api-access-jr9vx") pod "8e47d73b-026b-4bc2-b1f0-c69efadc0ce7" (UID: "8e47d73b-026b-4bc2-b1f0-c69efadc0ce7"). InnerVolumeSpecName "kube-api-access-jr9vx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:31:21 crc kubenswrapper[4795]: I0219 21:31:21.308651 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e47d73b-026b-4bc2-b1f0-c69efadc0ce7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8e47d73b-026b-4bc2-b1f0-c69efadc0ce7" (UID: "8e47d73b-026b-4bc2-b1f0-c69efadc0ce7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:31:21 crc kubenswrapper[4795]: I0219 21:31:21.361240 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jr9vx\" (UniqueName: \"kubernetes.io/projected/8e47d73b-026b-4bc2-b1f0-c69efadc0ce7-kube-api-access-jr9vx\") on node \"crc\" DevicePath \"\"" Feb 19 21:31:21 crc kubenswrapper[4795]: I0219 21:31:21.361275 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e47d73b-026b-4bc2-b1f0-c69efadc0ce7-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 21:31:21 crc kubenswrapper[4795]: I0219 21:31:21.361288 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e47d73b-026b-4bc2-b1f0-c69efadc0ce7-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 21:31:21 crc kubenswrapper[4795]: I0219 21:31:21.633321 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fq62x" event={"ID":"8e47d73b-026b-4bc2-b1f0-c69efadc0ce7","Type":"ContainerDied","Data":"56f8a93ddff4618883796150de8b693b1c3a76f9b5f00a99b738a48400fed9ee"} Feb 19 21:31:21 crc kubenswrapper[4795]: I0219 21:31:21.633725 4795 scope.go:117] "RemoveContainer" containerID="13c02d833399591c84abe5da0c16dc2fb6486b3bcc08499be35a9d2502d4f09f" Feb 19 21:31:21 crc kubenswrapper[4795]: I0219 21:31:21.633372 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fq62x" Feb 19 21:31:21 crc kubenswrapper[4795]: I0219 21:31:21.635718 4795 generic.go:334] "Generic (PLEG): container finished" podID="1fa17669-dc5e-46a8-a76d-befdbc69aeed" containerID="f41b709a11c74f3734b7f736873f5d029cecf42b67def1cf94218b316be19474" exitCode=0 Feb 19 21:31:21 crc kubenswrapper[4795]: I0219 21:31:21.635979 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5j7b9" event={"ID":"1fa17669-dc5e-46a8-a76d-befdbc69aeed","Type":"ContainerDied","Data":"f41b709a11c74f3734b7f736873f5d029cecf42b67def1cf94218b316be19474"} Feb 19 21:31:21 crc kubenswrapper[4795]: I0219 21:31:21.678868 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5j7b9" Feb 19 21:31:21 crc kubenswrapper[4795]: I0219 21:31:21.687714 4795 scope.go:117] "RemoveContainer" containerID="b730d866cb5930ce6b3bd30068b74b30f568d9da17e9b40a916b2db10b9b7380" Feb 19 21:31:21 crc kubenswrapper[4795]: I0219 21:31:21.690439 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fq62x"] Feb 19 21:31:21 crc kubenswrapper[4795]: I0219 21:31:21.692904 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-fq62x"] Feb 19 21:31:21 crc kubenswrapper[4795]: I0219 21:31:21.706958 4795 scope.go:117] "RemoveContainer" containerID="7884e76708116ab40efc68a93802f7cd42d9acd2ddce9816192d25b3558b0941" Feb 19 21:31:21 crc kubenswrapper[4795]: I0219 21:31:21.825380 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-v698q"] Feb 19 21:31:21 crc kubenswrapper[4795]: I0219 21:31:21.825778 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-v698q" podUID="d858d3ea-6432-49a9-9b32-2e36b61c6e57" containerName="registry-server" 
containerID="cri-o://a7485d12d59f4055683cb6ae750e079b0729d5deb16ed6d10099b8d940c03a00" gracePeriod=2 Feb 19 21:31:21 crc kubenswrapper[4795]: I0219 21:31:21.866386 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fa17669-dc5e-46a8-a76d-befdbc69aeed-catalog-content\") pod \"1fa17669-dc5e-46a8-a76d-befdbc69aeed\" (UID: \"1fa17669-dc5e-46a8-a76d-befdbc69aeed\") " Feb 19 21:31:21 crc kubenswrapper[4795]: I0219 21:31:21.866432 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9dxw\" (UniqueName: \"kubernetes.io/projected/1fa17669-dc5e-46a8-a76d-befdbc69aeed-kube-api-access-l9dxw\") pod \"1fa17669-dc5e-46a8-a76d-befdbc69aeed\" (UID: \"1fa17669-dc5e-46a8-a76d-befdbc69aeed\") " Feb 19 21:31:21 crc kubenswrapper[4795]: I0219 21:31:21.866506 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fa17669-dc5e-46a8-a76d-befdbc69aeed-utilities\") pod \"1fa17669-dc5e-46a8-a76d-befdbc69aeed\" (UID: \"1fa17669-dc5e-46a8-a76d-befdbc69aeed\") " Feb 19 21:31:21 crc kubenswrapper[4795]: I0219 21:31:21.867344 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fa17669-dc5e-46a8-a76d-befdbc69aeed-utilities" (OuterVolumeSpecName: "utilities") pod "1fa17669-dc5e-46a8-a76d-befdbc69aeed" (UID: "1fa17669-dc5e-46a8-a76d-befdbc69aeed"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:31:21 crc kubenswrapper[4795]: I0219 21:31:21.870443 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fa17669-dc5e-46a8-a76d-befdbc69aeed-kube-api-access-l9dxw" (OuterVolumeSpecName: "kube-api-access-l9dxw") pod "1fa17669-dc5e-46a8-a76d-befdbc69aeed" (UID: "1fa17669-dc5e-46a8-a76d-befdbc69aeed"). 
InnerVolumeSpecName "kube-api-access-l9dxw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:31:21 crc kubenswrapper[4795]: I0219 21:31:21.967906 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9dxw\" (UniqueName: \"kubernetes.io/projected/1fa17669-dc5e-46a8-a76d-befdbc69aeed-kube-api-access-l9dxw\") on node \"crc\" DevicePath \"\"" Feb 19 21:31:21 crc kubenswrapper[4795]: I0219 21:31:21.967938 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fa17669-dc5e-46a8-a76d-befdbc69aeed-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 21:31:22 crc kubenswrapper[4795]: I0219 21:31:22.166910 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fa17669-dc5e-46a8-a76d-befdbc69aeed-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1fa17669-dc5e-46a8-a76d-befdbc69aeed" (UID: "1fa17669-dc5e-46a8-a76d-befdbc69aeed"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:31:22 crc kubenswrapper[4795]: I0219 21:31:22.169426 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fa17669-dc5e-46a8-a76d-befdbc69aeed-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 21:31:22 crc kubenswrapper[4795]: I0219 21:31:22.643815 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5j7b9" event={"ID":"1fa17669-dc5e-46a8-a76d-befdbc69aeed","Type":"ContainerDied","Data":"825dc9912efe54135ca23a728bf83c290bc47d9c0fb6c14bb961a4a89a196a6f"} Feb 19 21:31:22 crc kubenswrapper[4795]: I0219 21:31:22.643871 4795 scope.go:117] "RemoveContainer" containerID="f41b709a11c74f3734b7f736873f5d029cecf42b67def1cf94218b316be19474" Feb 19 21:31:22 crc kubenswrapper[4795]: I0219 21:31:22.643983 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5j7b9" Feb 19 21:31:22 crc kubenswrapper[4795]: I0219 21:31:22.653287 4795 generic.go:334] "Generic (PLEG): container finished" podID="d858d3ea-6432-49a9-9b32-2e36b61c6e57" containerID="a7485d12d59f4055683cb6ae750e079b0729d5deb16ed6d10099b8d940c03a00" exitCode=0 Feb 19 21:31:22 crc kubenswrapper[4795]: I0219 21:31:22.653327 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v698q" event={"ID":"d858d3ea-6432-49a9-9b32-2e36b61c6e57","Type":"ContainerDied","Data":"a7485d12d59f4055683cb6ae750e079b0729d5deb16ed6d10099b8d940c03a00"} Feb 19 21:31:22 crc kubenswrapper[4795]: I0219 21:31:22.668872 4795 scope.go:117] "RemoveContainer" containerID="90b9312e0f5408fc251999d131140fd68f045334ecdf8f5a7e0995574996fb05" Feb 19 21:31:22 crc kubenswrapper[4795]: I0219 21:31:22.680550 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5j7b9"] Feb 19 21:31:22 crc kubenswrapper[4795]: I0219 21:31:22.682874 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5j7b9"] Feb 19 21:31:22 crc kubenswrapper[4795]: I0219 21:31:22.710398 4795 scope.go:117] "RemoveContainer" containerID="ed0bfebb52fd6133757aa14996934704806932b85d024110638b142b237490a2" Feb 19 21:31:22 crc kubenswrapper[4795]: I0219 21:31:22.910588 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v698q" Feb 19 21:31:22 crc kubenswrapper[4795]: I0219 21:31:22.982348 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d858d3ea-6432-49a9-9b32-2e36b61c6e57-catalog-content\") pod \"d858d3ea-6432-49a9-9b32-2e36b61c6e57\" (UID: \"d858d3ea-6432-49a9-9b32-2e36b61c6e57\") " Feb 19 21:31:22 crc kubenswrapper[4795]: I0219 21:31:22.982422 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hnffn\" (UniqueName: \"kubernetes.io/projected/d858d3ea-6432-49a9-9b32-2e36b61c6e57-kube-api-access-hnffn\") pod \"d858d3ea-6432-49a9-9b32-2e36b61c6e57\" (UID: \"d858d3ea-6432-49a9-9b32-2e36b61c6e57\") " Feb 19 21:31:22 crc kubenswrapper[4795]: I0219 21:31:22.982455 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d858d3ea-6432-49a9-9b32-2e36b61c6e57-utilities\") pod \"d858d3ea-6432-49a9-9b32-2e36b61c6e57\" (UID: \"d858d3ea-6432-49a9-9b32-2e36b61c6e57\") " Feb 19 21:31:22 crc kubenswrapper[4795]: I0219 21:31:22.983388 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d858d3ea-6432-49a9-9b32-2e36b61c6e57-utilities" (OuterVolumeSpecName: "utilities") pod "d858d3ea-6432-49a9-9b32-2e36b61c6e57" (UID: "d858d3ea-6432-49a9-9b32-2e36b61c6e57"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:31:22 crc kubenswrapper[4795]: I0219 21:31:22.986896 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d858d3ea-6432-49a9-9b32-2e36b61c6e57-kube-api-access-hnffn" (OuterVolumeSpecName: "kube-api-access-hnffn") pod "d858d3ea-6432-49a9-9b32-2e36b61c6e57" (UID: "d858d3ea-6432-49a9-9b32-2e36b61c6e57"). InnerVolumeSpecName "kube-api-access-hnffn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:31:23 crc kubenswrapper[4795]: I0219 21:31:23.004507 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d858d3ea-6432-49a9-9b32-2e36b61c6e57-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d858d3ea-6432-49a9-9b32-2e36b61c6e57" (UID: "d858d3ea-6432-49a9-9b32-2e36b61c6e57"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:31:23 crc kubenswrapper[4795]: I0219 21:31:23.083307 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hnffn\" (UniqueName: \"kubernetes.io/projected/d858d3ea-6432-49a9-9b32-2e36b61c6e57-kube-api-access-hnffn\") on node \"crc\" DevicePath \"\"" Feb 19 21:31:23 crc kubenswrapper[4795]: I0219 21:31:23.083676 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d858d3ea-6432-49a9-9b32-2e36b61c6e57-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 21:31:23 crc kubenswrapper[4795]: I0219 21:31:23.083688 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d858d3ea-6432-49a9-9b32-2e36b61c6e57-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 21:31:23 crc kubenswrapper[4795]: I0219 21:31:23.517830 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fa17669-dc5e-46a8-a76d-befdbc69aeed" path="/var/lib/kubelet/pods/1fa17669-dc5e-46a8-a76d-befdbc69aeed/volumes" Feb 19 21:31:23 crc kubenswrapper[4795]: I0219 21:31:23.518421 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e47d73b-026b-4bc2-b1f0-c69efadc0ce7" path="/var/lib/kubelet/pods/8e47d73b-026b-4bc2-b1f0-c69efadc0ce7/volumes" Feb 19 21:31:23 crc kubenswrapper[4795]: I0219 21:31:23.673200 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v698q" 
event={"ID":"d858d3ea-6432-49a9-9b32-2e36b61c6e57","Type":"ContainerDied","Data":"28460660f6d692ba6e8f6b91806cd443070cee612e5427f6f9dabd127b7a144e"} Feb 19 21:31:23 crc kubenswrapper[4795]: I0219 21:31:23.673238 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v698q" Feb 19 21:31:23 crc kubenswrapper[4795]: I0219 21:31:23.673260 4795 scope.go:117] "RemoveContainer" containerID="a7485d12d59f4055683cb6ae750e079b0729d5deb16ed6d10099b8d940c03a00" Feb 19 21:31:23 crc kubenswrapper[4795]: I0219 21:31:23.689056 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-v698q"] Feb 19 21:31:23 crc kubenswrapper[4795]: I0219 21:31:23.689490 4795 scope.go:117] "RemoveContainer" containerID="76fd4bbcb0278e373cff16d83c7ceda3fc1a11cdd6076ee88c76e62b714b1bad" Feb 19 21:31:23 crc kubenswrapper[4795]: I0219 21:31:23.692391 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-v698q"] Feb 19 21:31:23 crc kubenswrapper[4795]: I0219 21:31:23.720484 4795 scope.go:117] "RemoveContainer" containerID="3f2cfda31513c4d8d601e01af9b97c3ba687e6c071cb1cb6f9eb5d2af4229073" Feb 19 21:31:24 crc kubenswrapper[4795]: I0219 21:31:24.267541 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 19 21:31:24 crc kubenswrapper[4795]: E0219 21:31:24.267728 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fa17669-dc5e-46a8-a76d-befdbc69aeed" containerName="extract-content" Feb 19 21:31:24 crc kubenswrapper[4795]: I0219 21:31:24.267739 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fa17669-dc5e-46a8-a76d-befdbc69aeed" containerName="extract-content" Feb 19 21:31:24 crc kubenswrapper[4795]: E0219 21:31:24.267748 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d858d3ea-6432-49a9-9b32-2e36b61c6e57" containerName="registry-server" Feb 19 
21:31:24 crc kubenswrapper[4795]: I0219 21:31:24.267755 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d858d3ea-6432-49a9-9b32-2e36b61c6e57" containerName="registry-server" Feb 19 21:31:24 crc kubenswrapper[4795]: E0219 21:31:24.267766 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fa17669-dc5e-46a8-a76d-befdbc69aeed" containerName="registry-server" Feb 19 21:31:24 crc kubenswrapper[4795]: I0219 21:31:24.267772 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fa17669-dc5e-46a8-a76d-befdbc69aeed" containerName="registry-server" Feb 19 21:31:24 crc kubenswrapper[4795]: E0219 21:31:24.267781 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fa17669-dc5e-46a8-a76d-befdbc69aeed" containerName="extract-utilities" Feb 19 21:31:24 crc kubenswrapper[4795]: I0219 21:31:24.267787 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fa17669-dc5e-46a8-a76d-befdbc69aeed" containerName="extract-utilities" Feb 19 21:31:24 crc kubenswrapper[4795]: E0219 21:31:24.267794 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d858d3ea-6432-49a9-9b32-2e36b61c6e57" containerName="extract-utilities" Feb 19 21:31:24 crc kubenswrapper[4795]: I0219 21:31:24.267800 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d858d3ea-6432-49a9-9b32-2e36b61c6e57" containerName="extract-utilities" Feb 19 21:31:24 crc kubenswrapper[4795]: E0219 21:31:24.267812 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b7f5192-259d-44ff-9e42-5ab977c95519" containerName="pruner" Feb 19 21:31:24 crc kubenswrapper[4795]: I0219 21:31:24.267819 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b7f5192-259d-44ff-9e42-5ab977c95519" containerName="pruner" Feb 19 21:31:24 crc kubenswrapper[4795]: E0219 21:31:24.267831 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e47d73b-026b-4bc2-b1f0-c69efadc0ce7" containerName="extract-utilities" Feb 19 21:31:24 crc 
kubenswrapper[4795]: I0219 21:31:24.267838 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e47d73b-026b-4bc2-b1f0-c69efadc0ce7" containerName="extract-utilities" Feb 19 21:31:24 crc kubenswrapper[4795]: E0219 21:31:24.267848 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d858d3ea-6432-49a9-9b32-2e36b61c6e57" containerName="extract-content" Feb 19 21:31:24 crc kubenswrapper[4795]: I0219 21:31:24.267855 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d858d3ea-6432-49a9-9b32-2e36b61c6e57" containerName="extract-content" Feb 19 21:31:24 crc kubenswrapper[4795]: E0219 21:31:24.267867 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e47d73b-026b-4bc2-b1f0-c69efadc0ce7" containerName="registry-server" Feb 19 21:31:24 crc kubenswrapper[4795]: I0219 21:31:24.267874 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e47d73b-026b-4bc2-b1f0-c69efadc0ce7" containerName="registry-server" Feb 19 21:31:24 crc kubenswrapper[4795]: E0219 21:31:24.267884 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e47d73b-026b-4bc2-b1f0-c69efadc0ce7" containerName="extract-content" Feb 19 21:31:24 crc kubenswrapper[4795]: I0219 21:31:24.267891 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e47d73b-026b-4bc2-b1f0-c69efadc0ce7" containerName="extract-content" Feb 19 21:31:24 crc kubenswrapper[4795]: I0219 21:31:24.268009 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b7f5192-259d-44ff-9e42-5ab977c95519" containerName="pruner" Feb 19 21:31:24 crc kubenswrapper[4795]: I0219 21:31:24.268021 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fa17669-dc5e-46a8-a76d-befdbc69aeed" containerName="registry-server" Feb 19 21:31:24 crc kubenswrapper[4795]: I0219 21:31:24.268035 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e47d73b-026b-4bc2-b1f0-c69efadc0ce7" containerName="registry-server" Feb 19 21:31:24 crc 
kubenswrapper[4795]: I0219 21:31:24.268045 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d858d3ea-6432-49a9-9b32-2e36b61c6e57" containerName="registry-server" Feb 19 21:31:24 crc kubenswrapper[4795]: I0219 21:31:24.268426 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 19 21:31:24 crc kubenswrapper[4795]: I0219 21:31:24.270091 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 19 21:31:24 crc kubenswrapper[4795]: I0219 21:31:24.270346 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 19 21:31:24 crc kubenswrapper[4795]: I0219 21:31:24.278596 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 19 21:31:24 crc kubenswrapper[4795]: I0219 21:31:24.397463 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6-kube-api-access\") pod \"installer-9-crc\" (UID: \"6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 19 21:31:24 crc kubenswrapper[4795]: I0219 21:31:24.397514 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6-kubelet-dir\") pod \"installer-9-crc\" (UID: \"6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 19 21:31:24 crc kubenswrapper[4795]: I0219 21:31:24.397549 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6-var-lock\") pod \"installer-9-crc\" (UID: 
\"6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 19 21:31:24 crc kubenswrapper[4795]: I0219 21:31:24.498609 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6-kube-api-access\") pod \"installer-9-crc\" (UID: \"6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 19 21:31:24 crc kubenswrapper[4795]: I0219 21:31:24.498659 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6-kubelet-dir\") pod \"installer-9-crc\" (UID: \"6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 19 21:31:24 crc kubenswrapper[4795]: I0219 21:31:24.498701 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6-var-lock\") pod \"installer-9-crc\" (UID: \"6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 19 21:31:24 crc kubenswrapper[4795]: I0219 21:31:24.498801 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6-kubelet-dir\") pod \"installer-9-crc\" (UID: \"6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 19 21:31:24 crc kubenswrapper[4795]: I0219 21:31:24.498817 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6-var-lock\") pod \"installer-9-crc\" (UID: \"6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 19 21:31:24 crc kubenswrapper[4795]: I0219 21:31:24.516322 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6-kube-api-access\") pod \"installer-9-crc\" (UID: \"6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 19 21:31:24 crc kubenswrapper[4795]: I0219 21:31:24.598554 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 19 21:31:25 crc kubenswrapper[4795]: I0219 21:31:25.021657 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 19 21:31:25 crc kubenswrapper[4795]: W0219 21:31:25.031350 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod6b9bcd07_a45a_4f32_9d56_9ebb0931b1a6.slice/crio-6561cdc38385fc6e6e9ed635e289b887be8b67167e1892ff8512a6be8137ce3f WatchSource:0}: Error finding container 6561cdc38385fc6e6e9ed635e289b887be8b67167e1892ff8512a6be8137ce3f: Status 404 returned error can't find the container with id 6561cdc38385fc6e6e9ed635e289b887be8b67167e1892ff8512a6be8137ce3f Feb 19 21:31:25 crc kubenswrapper[4795]: I0219 21:31:25.518315 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d858d3ea-6432-49a9-9b32-2e36b61c6e57" path="/var/lib/kubelet/pods/d858d3ea-6432-49a9-9b32-2e36b61c6e57/volumes" Feb 19 21:31:25 crc kubenswrapper[4795]: I0219 21:31:25.686651 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zpkx6" event={"ID":"ac83daf6-848e-4977-8bb9-a7b4db89618f","Type":"ContainerStarted","Data":"43f429377ebc8b5c3ab07e6e96409113b8aa9376fff7b173c7135fc5c7df0572"} Feb 19 21:31:25 crc kubenswrapper[4795]: I0219 21:31:25.688400 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" 
event={"ID":"6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6","Type":"ContainerStarted","Data":"d1854bd6a757d0896d79bed3a255459850433d4bfa0a2f7e67645e00599c61cf"} Feb 19 21:31:25 crc kubenswrapper[4795]: I0219 21:31:25.688461 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6","Type":"ContainerStarted","Data":"6561cdc38385fc6e6e9ed635e289b887be8b67167e1892ff8512a6be8137ce3f"} Feb 19 21:31:25 crc kubenswrapper[4795]: I0219 21:31:25.736770 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=1.736753427 podStartE2EDuration="1.736753427s" podCreationTimestamp="2026-02-19 21:31:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:31:25.733575359 +0000 UTC m=+196.926093243" watchObservedRunningTime="2026-02-19 21:31:25.736753427 +0000 UTC m=+196.929271291" Feb 19 21:31:26 crc kubenswrapper[4795]: I0219 21:31:26.694801 4795 generic.go:334] "Generic (PLEG): container finished" podID="ac83daf6-848e-4977-8bb9-a7b4db89618f" containerID="43f429377ebc8b5c3ab07e6e96409113b8aa9376fff7b173c7135fc5c7df0572" exitCode=0 Feb 19 21:31:26 crc kubenswrapper[4795]: I0219 21:31:26.694863 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zpkx6" event={"ID":"ac83daf6-848e-4977-8bb9-a7b4db89618f","Type":"ContainerDied","Data":"43f429377ebc8b5c3ab07e6e96409113b8aa9376fff7b173c7135fc5c7df0572"} Feb 19 21:31:28 crc kubenswrapper[4795]: I0219 21:31:28.428071 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 21:31:28 crc 
kubenswrapper[4795]: I0219 21:31:28.428504 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 21:31:30 crc kubenswrapper[4795]: I0219 21:31:30.726710 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zpkx6" event={"ID":"ac83daf6-848e-4977-8bb9-a7b4db89618f","Type":"ContainerStarted","Data":"f64c952287ce3317126edd7ba098c5fd097db5032a96dca6b373fe4f29a5389f"} Feb 19 21:31:40 crc kubenswrapper[4795]: I0219 21:31:40.237877 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zpkx6" Feb 19 21:31:40 crc kubenswrapper[4795]: I0219 21:31:40.238730 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zpkx6" Feb 19 21:31:40 crc kubenswrapper[4795]: I0219 21:31:40.313525 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zpkx6" Feb 19 21:31:40 crc kubenswrapper[4795]: I0219 21:31:40.338804 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zpkx6" podStartSLOduration=12.933403007999999 podStartE2EDuration="1m1.338789275s" podCreationTimestamp="2026-02-19 21:30:39 +0000 UTC" firstStartedPulling="2026-02-19 21:30:41.224110738 +0000 UTC m=+152.416628602" lastFinishedPulling="2026-02-19 21:31:29.629497005 +0000 UTC m=+200.822014869" observedRunningTime="2026-02-19 21:31:30.746277788 +0000 UTC m=+201.938795672" watchObservedRunningTime="2026-02-19 21:31:40.338789275 +0000 UTC m=+211.531307129" Feb 19 21:31:40 crc kubenswrapper[4795]: I0219 21:31:40.847975 4795 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zpkx6" Feb 19 21:31:40 crc kubenswrapper[4795]: I0219 21:31:40.912464 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zpkx6"] Feb 19 21:31:42 crc kubenswrapper[4795]: I0219 21:31:42.785966 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zpkx6" podUID="ac83daf6-848e-4977-8bb9-a7b4db89618f" containerName="registry-server" containerID="cri-o://f64c952287ce3317126edd7ba098c5fd097db5032a96dca6b373fe4f29a5389f" gracePeriod=2 Feb 19 21:31:43 crc kubenswrapper[4795]: I0219 21:31:43.162563 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zpkx6" Feb 19 21:31:43 crc kubenswrapper[4795]: I0219 21:31:43.243569 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac83daf6-848e-4977-8bb9-a7b4db89618f-utilities\") pod \"ac83daf6-848e-4977-8bb9-a7b4db89618f\" (UID: \"ac83daf6-848e-4977-8bb9-a7b4db89618f\") " Feb 19 21:31:43 crc kubenswrapper[4795]: I0219 21:31:43.243799 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jst7w\" (UniqueName: \"kubernetes.io/projected/ac83daf6-848e-4977-8bb9-a7b4db89618f-kube-api-access-jst7w\") pod \"ac83daf6-848e-4977-8bb9-a7b4db89618f\" (UID: \"ac83daf6-848e-4977-8bb9-a7b4db89618f\") " Feb 19 21:31:43 crc kubenswrapper[4795]: I0219 21:31:43.243851 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac83daf6-848e-4977-8bb9-a7b4db89618f-catalog-content\") pod \"ac83daf6-848e-4977-8bb9-a7b4db89618f\" (UID: \"ac83daf6-848e-4977-8bb9-a7b4db89618f\") " Feb 19 21:31:43 crc kubenswrapper[4795]: I0219 21:31:43.244486 4795 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac83daf6-848e-4977-8bb9-a7b4db89618f-utilities" (OuterVolumeSpecName: "utilities") pod "ac83daf6-848e-4977-8bb9-a7b4db89618f" (UID: "ac83daf6-848e-4977-8bb9-a7b4db89618f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:31:43 crc kubenswrapper[4795]: I0219 21:31:43.252566 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac83daf6-848e-4977-8bb9-a7b4db89618f-kube-api-access-jst7w" (OuterVolumeSpecName: "kube-api-access-jst7w") pod "ac83daf6-848e-4977-8bb9-a7b4db89618f" (UID: "ac83daf6-848e-4977-8bb9-a7b4db89618f"). InnerVolumeSpecName "kube-api-access-jst7w". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:31:43 crc kubenswrapper[4795]: I0219 21:31:43.346249 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac83daf6-848e-4977-8bb9-a7b4db89618f-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 21:31:43 crc kubenswrapper[4795]: I0219 21:31:43.346321 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jst7w\" (UniqueName: \"kubernetes.io/projected/ac83daf6-848e-4977-8bb9-a7b4db89618f-kube-api-access-jst7w\") on node \"crc\" DevicePath \"\"" Feb 19 21:31:43 crc kubenswrapper[4795]: I0219 21:31:43.405989 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac83daf6-848e-4977-8bb9-a7b4db89618f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ac83daf6-848e-4977-8bb9-a7b4db89618f" (UID: "ac83daf6-848e-4977-8bb9-a7b4db89618f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:31:43 crc kubenswrapper[4795]: I0219 21:31:43.448091 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac83daf6-848e-4977-8bb9-a7b4db89618f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 21:31:43 crc kubenswrapper[4795]: I0219 21:31:43.794542 4795 generic.go:334] "Generic (PLEG): container finished" podID="ac83daf6-848e-4977-8bb9-a7b4db89618f" containerID="f64c952287ce3317126edd7ba098c5fd097db5032a96dca6b373fe4f29a5389f" exitCode=0 Feb 19 21:31:43 crc kubenswrapper[4795]: I0219 21:31:43.794593 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zpkx6" event={"ID":"ac83daf6-848e-4977-8bb9-a7b4db89618f","Type":"ContainerDied","Data":"f64c952287ce3317126edd7ba098c5fd097db5032a96dca6b373fe4f29a5389f"} Feb 19 21:31:43 crc kubenswrapper[4795]: I0219 21:31:43.794628 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zpkx6" event={"ID":"ac83daf6-848e-4977-8bb9-a7b4db89618f","Type":"ContainerDied","Data":"5f603a895c6c8cc9d233f8386c8619bd99f45021b342fc0088eeb4edeed70ce6"} Feb 19 21:31:43 crc kubenswrapper[4795]: I0219 21:31:43.794655 4795 scope.go:117] "RemoveContainer" containerID="f64c952287ce3317126edd7ba098c5fd097db5032a96dca6b373fe4f29a5389f" Feb 19 21:31:43 crc kubenswrapper[4795]: I0219 21:31:43.794799 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zpkx6" Feb 19 21:31:43 crc kubenswrapper[4795]: I0219 21:31:43.826258 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zpkx6"] Feb 19 21:31:43 crc kubenswrapper[4795]: I0219 21:31:43.829023 4795 scope.go:117] "RemoveContainer" containerID="43f429377ebc8b5c3ab07e6e96409113b8aa9376fff7b173c7135fc5c7df0572" Feb 19 21:31:43 crc kubenswrapper[4795]: I0219 21:31:43.842693 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zpkx6"] Feb 19 21:31:43 crc kubenswrapper[4795]: I0219 21:31:43.856221 4795 scope.go:117] "RemoveContainer" containerID="b9d88a868f45e4d7fdf10f8b6b1fdb6c04b5f7137907622ee25cca321d0d04fd" Feb 19 21:31:43 crc kubenswrapper[4795]: I0219 21:31:43.881365 4795 scope.go:117] "RemoveContainer" containerID="f64c952287ce3317126edd7ba098c5fd097db5032a96dca6b373fe4f29a5389f" Feb 19 21:31:43 crc kubenswrapper[4795]: E0219 21:31:43.882057 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f64c952287ce3317126edd7ba098c5fd097db5032a96dca6b373fe4f29a5389f\": container with ID starting with f64c952287ce3317126edd7ba098c5fd097db5032a96dca6b373fe4f29a5389f not found: ID does not exist" containerID="f64c952287ce3317126edd7ba098c5fd097db5032a96dca6b373fe4f29a5389f" Feb 19 21:31:43 crc kubenswrapper[4795]: I0219 21:31:43.882109 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f64c952287ce3317126edd7ba098c5fd097db5032a96dca6b373fe4f29a5389f"} err="failed to get container status \"f64c952287ce3317126edd7ba098c5fd097db5032a96dca6b373fe4f29a5389f\": rpc error: code = NotFound desc = could not find container \"f64c952287ce3317126edd7ba098c5fd097db5032a96dca6b373fe4f29a5389f\": container with ID starting with f64c952287ce3317126edd7ba098c5fd097db5032a96dca6b373fe4f29a5389f not found: ID does 
not exist" Feb 19 21:31:43 crc kubenswrapper[4795]: I0219 21:31:43.882200 4795 scope.go:117] "RemoveContainer" containerID="43f429377ebc8b5c3ab07e6e96409113b8aa9376fff7b173c7135fc5c7df0572" Feb 19 21:31:43 crc kubenswrapper[4795]: E0219 21:31:43.882656 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43f429377ebc8b5c3ab07e6e96409113b8aa9376fff7b173c7135fc5c7df0572\": container with ID starting with 43f429377ebc8b5c3ab07e6e96409113b8aa9376fff7b173c7135fc5c7df0572 not found: ID does not exist" containerID="43f429377ebc8b5c3ab07e6e96409113b8aa9376fff7b173c7135fc5c7df0572" Feb 19 21:31:43 crc kubenswrapper[4795]: I0219 21:31:43.882704 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43f429377ebc8b5c3ab07e6e96409113b8aa9376fff7b173c7135fc5c7df0572"} err="failed to get container status \"43f429377ebc8b5c3ab07e6e96409113b8aa9376fff7b173c7135fc5c7df0572\": rpc error: code = NotFound desc = could not find container \"43f429377ebc8b5c3ab07e6e96409113b8aa9376fff7b173c7135fc5c7df0572\": container with ID starting with 43f429377ebc8b5c3ab07e6e96409113b8aa9376fff7b173c7135fc5c7df0572 not found: ID does not exist" Feb 19 21:31:43 crc kubenswrapper[4795]: I0219 21:31:43.882731 4795 scope.go:117] "RemoveContainer" containerID="b9d88a868f45e4d7fdf10f8b6b1fdb6c04b5f7137907622ee25cca321d0d04fd" Feb 19 21:31:43 crc kubenswrapper[4795]: E0219 21:31:43.883051 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9d88a868f45e4d7fdf10f8b6b1fdb6c04b5f7137907622ee25cca321d0d04fd\": container with ID starting with b9d88a868f45e4d7fdf10f8b6b1fdb6c04b5f7137907622ee25cca321d0d04fd not found: ID does not exist" containerID="b9d88a868f45e4d7fdf10f8b6b1fdb6c04b5f7137907622ee25cca321d0d04fd" Feb 19 21:31:43 crc kubenswrapper[4795]: I0219 21:31:43.883080 4795 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9d88a868f45e4d7fdf10f8b6b1fdb6c04b5f7137907622ee25cca321d0d04fd"} err="failed to get container status \"b9d88a868f45e4d7fdf10f8b6b1fdb6c04b5f7137907622ee25cca321d0d04fd\": rpc error: code = NotFound desc = could not find container \"b9d88a868f45e4d7fdf10f8b6b1fdb6c04b5f7137907622ee25cca321d0d04fd\": container with ID starting with b9d88a868f45e4d7fdf10f8b6b1fdb6c04b5f7137907622ee25cca321d0d04fd not found: ID does not exist" Feb 19 21:31:43 crc kubenswrapper[4795]: I0219 21:31:43.950726 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" podUID="9de314c5-1440-476b-b98b-7804f5d95145" containerName="oauth-openshift" containerID="cri-o://4877e8c5ce5d915ffce78d7fcaa41becc17c2fa95a99e16fe8f7fd820d5bc200" gracePeriod=15 Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.358643 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.462997 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-service-ca\") pod \"9de314c5-1440-476b-b98b-7804f5d95145\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.463043 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxfbg\" (UniqueName: \"kubernetes.io/projected/9de314c5-1440-476b-b98b-7804f5d95145-kube-api-access-cxfbg\") pod \"9de314c5-1440-476b-b98b-7804f5d95145\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.463112 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-user-template-login\") pod \"9de314c5-1440-476b-b98b-7804f5d95145\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.463135 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9de314c5-1440-476b-b98b-7804f5d95145-audit-dir\") pod \"9de314c5-1440-476b-b98b-7804f5d95145\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.463157 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-trusted-ca-bundle\") pod \"9de314c5-1440-476b-b98b-7804f5d95145\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.463209 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-user-idp-0-file-data\") pod \"9de314c5-1440-476b-b98b-7804f5d95145\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.463226 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-cliconfig\") pod \"9de314c5-1440-476b-b98b-7804f5d95145\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.463242 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/9de314c5-1440-476b-b98b-7804f5d95145-audit-policies\") pod \"9de314c5-1440-476b-b98b-7804f5d95145\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.463262 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-router-certs\") pod \"9de314c5-1440-476b-b98b-7804f5d95145\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.463284 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-ocp-branding-template\") pod \"9de314c5-1440-476b-b98b-7804f5d95145\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.463407 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-user-template-error\") pod \"9de314c5-1440-476b-b98b-7804f5d95145\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.463508 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-session\") pod \"9de314c5-1440-476b-b98b-7804f5d95145\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.463550 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-user-template-provider-selection\") pod \"9de314c5-1440-476b-b98b-7804f5d95145\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.463846 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "9de314c5-1440-476b-b98b-7804f5d95145" (UID: "9de314c5-1440-476b-b98b-7804f5d95145"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.463980 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9de314c5-1440-476b-b98b-7804f5d95145-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "9de314c5-1440-476b-b98b-7804f5d95145" (UID: "9de314c5-1440-476b-b98b-7804f5d95145"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.464149 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-serving-cert\") pod \"9de314c5-1440-476b-b98b-7804f5d95145\" (UID: \"9de314c5-1440-476b-b98b-7804f5d95145\") " Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.464299 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "9de314c5-1440-476b-b98b-7804f5d95145" (UID: "9de314c5-1440-476b-b98b-7804f5d95145"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.464314 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9de314c5-1440-476b-b98b-7804f5d95145-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "9de314c5-1440-476b-b98b-7804f5d95145" (UID: "9de314c5-1440-476b-b98b-7804f5d95145"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.464579 4795 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9de314c5-1440-476b-b98b-7804f5d95145-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.464597 4795 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9de314c5-1440-476b-b98b-7804f5d95145-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.464611 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.464623 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.465575 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "9de314c5-1440-476b-b98b-7804f5d95145" (UID: 
"9de314c5-1440-476b-b98b-7804f5d95145"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.468456 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "9de314c5-1440-476b-b98b-7804f5d95145" (UID: "9de314c5-1440-476b-b98b-7804f5d95145"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.468911 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "9de314c5-1440-476b-b98b-7804f5d95145" (UID: "9de314c5-1440-476b-b98b-7804f5d95145"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.469429 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "9de314c5-1440-476b-b98b-7804f5d95145" (UID: "9de314c5-1440-476b-b98b-7804f5d95145"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.470051 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "9de314c5-1440-476b-b98b-7804f5d95145" (UID: "9de314c5-1440-476b-b98b-7804f5d95145"). 
InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.470063 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "9de314c5-1440-476b-b98b-7804f5d95145" (UID: "9de314c5-1440-476b-b98b-7804f5d95145"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.470313 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9de314c5-1440-476b-b98b-7804f5d95145-kube-api-access-cxfbg" (OuterVolumeSpecName: "kube-api-access-cxfbg") pod "9de314c5-1440-476b-b98b-7804f5d95145" (UID: "9de314c5-1440-476b-b98b-7804f5d95145"). InnerVolumeSpecName "kube-api-access-cxfbg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.470327 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "9de314c5-1440-476b-b98b-7804f5d95145" (UID: "9de314c5-1440-476b-b98b-7804f5d95145"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.475507 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "9de314c5-1440-476b-b98b-7804f5d95145" (UID: "9de314c5-1440-476b-b98b-7804f5d95145"). 
InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.480432 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "9de314c5-1440-476b-b98b-7804f5d95145" (UID: "9de314c5-1440-476b-b98b-7804f5d95145"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.565820 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxfbg\" (UniqueName: \"kubernetes.io/projected/9de314c5-1440-476b-b98b-7804f5d95145-kube-api-access-cxfbg\") on node \"crc\" DevicePath \"\"" Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.565885 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.565898 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.565914 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.565925 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.565935 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.565945 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.565954 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.565964 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.565976 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9de314c5-1440-476b-b98b-7804f5d95145-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.805338 4795 generic.go:334] "Generic (PLEG): container finished" podID="9de314c5-1440-476b-b98b-7804f5d95145" containerID="4877e8c5ce5d915ffce78d7fcaa41becc17c2fa95a99e16fe8f7fd820d5bc200" exitCode=0 Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.805438 4795 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" event={"ID":"9de314c5-1440-476b-b98b-7804f5d95145","Type":"ContainerDied","Data":"4877e8c5ce5d915ffce78d7fcaa41becc17c2fa95a99e16fe8f7fd820d5bc200"} Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.805878 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" event={"ID":"9de314c5-1440-476b-b98b-7804f5d95145","Type":"ContainerDied","Data":"704e722c476db821d8e6f00d8c80db7e6888aef51f3367193fd7b4f2cac02bc3"} Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.805498 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-pb7s7" Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.805907 4795 scope.go:117] "RemoveContainer" containerID="4877e8c5ce5d915ffce78d7fcaa41becc17c2fa95a99e16fe8f7fd820d5bc200" Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.836720 4795 scope.go:117] "RemoveContainer" containerID="4877e8c5ce5d915ffce78d7fcaa41becc17c2fa95a99e16fe8f7fd820d5bc200" Feb 19 21:31:44 crc kubenswrapper[4795]: E0219 21:31:44.837148 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4877e8c5ce5d915ffce78d7fcaa41becc17c2fa95a99e16fe8f7fd820d5bc200\": container with ID starting with 4877e8c5ce5d915ffce78d7fcaa41becc17c2fa95a99e16fe8f7fd820d5bc200 not found: ID does not exist" containerID="4877e8c5ce5d915ffce78d7fcaa41becc17c2fa95a99e16fe8f7fd820d5bc200" Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.837202 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4877e8c5ce5d915ffce78d7fcaa41becc17c2fa95a99e16fe8f7fd820d5bc200"} err="failed to get container status \"4877e8c5ce5d915ffce78d7fcaa41becc17c2fa95a99e16fe8f7fd820d5bc200\": rpc error: code = NotFound desc = could not find 
container \"4877e8c5ce5d915ffce78d7fcaa41becc17c2fa95a99e16fe8f7fd820d5bc200\": container with ID starting with 4877e8c5ce5d915ffce78d7fcaa41becc17c2fa95a99e16fe8f7fd820d5bc200 not found: ID does not exist" Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.846301 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-pb7s7"] Feb 19 21:31:44 crc kubenswrapper[4795]: I0219 21:31:44.849340 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-pb7s7"] Feb 19 21:31:45 crc kubenswrapper[4795]: I0219 21:31:45.524205 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9de314c5-1440-476b-b98b-7804f5d95145" path="/var/lib/kubelet/pods/9de314c5-1440-476b-b98b-7804f5d95145/volumes" Feb 19 21:31:45 crc kubenswrapper[4795]: I0219 21:31:45.525232 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac83daf6-848e-4977-8bb9-a7b4db89618f" path="/var/lib/kubelet/pods/ac83daf6-848e-4977-8bb9-a7b4db89618f/volumes" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.528827 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5"] Feb 19 21:31:53 crc kubenswrapper[4795]: E0219 21:31:53.530638 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9de314c5-1440-476b-b98b-7804f5d95145" containerName="oauth-openshift" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.530709 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9de314c5-1440-476b-b98b-7804f5d95145" containerName="oauth-openshift" Feb 19 21:31:53 crc kubenswrapper[4795]: E0219 21:31:53.530769 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac83daf6-848e-4977-8bb9-a7b4db89618f" containerName="registry-server" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.530831 4795 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ac83daf6-848e-4977-8bb9-a7b4db89618f" containerName="registry-server" Feb 19 21:31:53 crc kubenswrapper[4795]: E0219 21:31:53.530897 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac83daf6-848e-4977-8bb9-a7b4db89618f" containerName="extract-utilities" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.530948 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac83daf6-848e-4977-8bb9-a7b4db89618f" containerName="extract-utilities" Feb 19 21:31:53 crc kubenswrapper[4795]: E0219 21:31:53.531008 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac83daf6-848e-4977-8bb9-a7b4db89618f" containerName="extract-content" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.531063 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac83daf6-848e-4977-8bb9-a7b4db89618f" containerName="extract-content" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.531230 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="9de314c5-1440-476b-b98b-7804f5d95145" containerName="oauth-openshift" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.531295 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac83daf6-848e-4977-8bb9-a7b4db89618f" containerName="registry-server" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.531733 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.534012 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.534151 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.534324 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.535516 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.535706 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.535823 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.535907 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.536813 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.537814 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.537943 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 19 21:31:53 
crc kubenswrapper[4795]: I0219 21:31:53.538081 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.538237 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.546547 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.548716 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.554152 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5"] Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.558016 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.689767 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4f54db4e-f039-4a8c-84c6-82b502e3c925-v4-0-config-system-service-ca\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.689808 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4f54db4e-f039-4a8c-84c6-82b502e3c925-audit-dir\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " 
pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.689852 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4f54db4e-f039-4a8c-84c6-82b502e3c925-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.689891 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4f54db4e-f039-4a8c-84c6-82b502e3c925-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.689910 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4f54db4e-f039-4a8c-84c6-82b502e3c925-v4-0-config-system-session\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.689942 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4f54db4e-f039-4a8c-84c6-82b502e3c925-v4-0-config-user-template-login\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.689965 4795 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4f54db4e-f039-4a8c-84c6-82b502e3c925-v4-0-config-system-cliconfig\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.689983 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4f54db4e-f039-4a8c-84c6-82b502e3c925-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.690002 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4f54db4e-f039-4a8c-84c6-82b502e3c925-audit-policies\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.690019 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4f54db4e-f039-4a8c-84c6-82b502e3c925-v4-0-config-system-serving-cert\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.690203 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/4f54db4e-f039-4a8c-84c6-82b502e3c925-v4-0-config-user-template-error\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.690271 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4f54db4e-f039-4a8c-84c6-82b502e3c925-v4-0-config-system-router-certs\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.690358 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fd82l\" (UniqueName: \"kubernetes.io/projected/4f54db4e-f039-4a8c-84c6-82b502e3c925-kube-api-access-fd82l\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.690445 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f54db4e-f039-4a8c-84c6-82b502e3c925-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.791583 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4f54db4e-f039-4a8c-84c6-82b502e3c925-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: 
\"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.791827 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4f54db4e-f039-4a8c-84c6-82b502e3c925-v4-0-config-system-session\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.791935 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4f54db4e-f039-4a8c-84c6-82b502e3c925-v4-0-config-user-template-login\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.792027 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4f54db4e-f039-4a8c-84c6-82b502e3c925-v4-0-config-system-cliconfig\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.792120 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4f54db4e-f039-4a8c-84c6-82b502e3c925-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.792332 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4f54db4e-f039-4a8c-84c6-82b502e3c925-audit-policies\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.792686 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4f54db4e-f039-4a8c-84c6-82b502e3c925-v4-0-config-system-serving-cert\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.792802 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4f54db4e-f039-4a8c-84c6-82b502e3c925-v4-0-config-user-template-error\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.793248 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4f54db4e-f039-4a8c-84c6-82b502e3c925-v4-0-config-system-router-certs\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.793350 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fd82l\" (UniqueName: \"kubernetes.io/projected/4f54db4e-f039-4a8c-84c6-82b502e3c925-kube-api-access-fd82l\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " 
pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.793447 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f54db4e-f039-4a8c-84c6-82b502e3c925-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.793594 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4f54db4e-f039-4a8c-84c6-82b502e3c925-v4-0-config-system-service-ca\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.793670 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4f54db4e-f039-4a8c-84c6-82b502e3c925-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.793775 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4f54db4e-f039-4a8c-84c6-82b502e3c925-audit-dir\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.793902 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/4f54db4e-f039-4a8c-84c6-82b502e3c925-audit-dir\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.792909 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4f54db4e-f039-4a8c-84c6-82b502e3c925-audit-policies\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.792703 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4f54db4e-f039-4a8c-84c6-82b502e3c925-v4-0-config-system-cliconfig\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.794583 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f54db4e-f039-4a8c-84c6-82b502e3c925-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.794769 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4f54db4e-f039-4a8c-84c6-82b502e3c925-v4-0-config-system-service-ca\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 
21:31:53.797510 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4f54db4e-f039-4a8c-84c6-82b502e3c925-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.797510 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4f54db4e-f039-4a8c-84c6-82b502e3c925-v4-0-config-system-serving-cert\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.805612 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4f54db4e-f039-4a8c-84c6-82b502e3c925-v4-0-config-user-template-error\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.805800 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4f54db4e-f039-4a8c-84c6-82b502e3c925-v4-0-config-system-router-certs\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.806641 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4f54db4e-f039-4a8c-84c6-82b502e3c925-v4-0-config-user-template-provider-selection\") pod 
\"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.810447 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4f54db4e-f039-4a8c-84c6-82b502e3c925-v4-0-config-user-template-login\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.810826 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4f54db4e-f039-4a8c-84c6-82b502e3c925-v4-0-config-system-session\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.810859 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4f54db4e-f039-4a8c-84c6-82b502e3c925-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.818337 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fd82l\" (UniqueName: \"kubernetes.io/projected/4f54db4e-f039-4a8c-84c6-82b502e3c925-kube-api-access-fd82l\") pod \"oauth-openshift-57bcd9fbb-h7mr5\" (UID: \"4f54db4e-f039-4a8c-84c6-82b502e3c925\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5" Feb 19 21:31:53 crc kubenswrapper[4795]: I0219 21:31:53.856385 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5" Feb 19 21:31:54 crc kubenswrapper[4795]: I0219 21:31:54.214388 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5"] Feb 19 21:31:54 crc kubenswrapper[4795]: I0219 21:31:54.879058 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5" event={"ID":"4f54db4e-f039-4a8c-84c6-82b502e3c925","Type":"ContainerStarted","Data":"4c202565e552b9755548ccb2c23f10760de1e618afd946344515df9bc31e4ad5"} Feb 19 21:31:54 crc kubenswrapper[4795]: I0219 21:31:54.880857 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5" event={"ID":"4f54db4e-f039-4a8c-84c6-82b502e3c925","Type":"ContainerStarted","Data":"a23a01be45afae3d3855a5117f260f203acfb7581d38469c4afc270a96d7c809"} Feb 19 21:31:54 crc kubenswrapper[4795]: I0219 21:31:54.880969 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5" Feb 19 21:31:54 crc kubenswrapper[4795]: I0219 21:31:54.885239 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5" Feb 19 21:31:54 crc kubenswrapper[4795]: I0219 21:31:54.901019 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-57bcd9fbb-h7mr5" podStartSLOduration=36.900999391 podStartE2EDuration="36.900999391s" podCreationTimestamp="2026-02-19 21:31:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:31:54.897103654 +0000 UTC m=+226.089621528" watchObservedRunningTime="2026-02-19 21:31:54.900999391 +0000 UTC m=+226.093517255" Feb 19 21:31:58 crc kubenswrapper[4795]: I0219 21:31:58.428087 4795 
patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 21:31:58 crc kubenswrapper[4795]: I0219 21:31:58.428495 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 21:31:58 crc kubenswrapper[4795]: I0219 21:31:58.428562 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" Feb 19 21:31:58 crc kubenswrapper[4795]: I0219 21:31:58.429415 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6bbc1f90994253aba517d1de97087dbd617934343fdb8389588863bf1305b72e"} pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 21:31:58 crc kubenswrapper[4795]: I0219 21:31:58.429532 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" containerID="cri-o://6bbc1f90994253aba517d1de97087dbd617934343fdb8389588863bf1305b72e" gracePeriod=600 Feb 19 21:31:58 crc kubenswrapper[4795]: I0219 21:31:58.904020 4795 generic.go:334] "Generic (PLEG): container finished" podID="7591bc58-96f5-486a-8653-0ad93938b019" containerID="6bbc1f90994253aba517d1de97087dbd617934343fdb8389588863bf1305b72e" exitCode=0 Feb 19 21:31:58 crc 
kubenswrapper[4795]: I0219 21:31:58.904090 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerDied","Data":"6bbc1f90994253aba517d1de97087dbd617934343fdb8389588863bf1305b72e"} Feb 19 21:31:58 crc kubenswrapper[4795]: I0219 21:31:58.904448 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerStarted","Data":"b70534b9f8ebcb8ef865c023146a47e5407cdcbaee4d6cb7a41e8ac0daedef4a"} Feb 19 21:32:02 crc kubenswrapper[4795]: I0219 21:32:02.946451 4795 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 19 21:32:02 crc kubenswrapper[4795]: I0219 21:32:02.947281 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce" gracePeriod=15 Feb 19 21:32:02 crc kubenswrapper[4795]: I0219 21:32:02.947355 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b" gracePeriod=15 Feb 19 21:32:02 crc kubenswrapper[4795]: I0219 21:32:02.947383 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c" gracePeriod=15 Feb 19 21:32:02 crc kubenswrapper[4795]: I0219 
21:32:02.947441 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0" gracePeriod=15 Feb 19 21:32:02 crc kubenswrapper[4795]: I0219 21:32:02.947379 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30" gracePeriod=15 Feb 19 21:32:02 crc kubenswrapper[4795]: I0219 21:32:02.950776 4795 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 19 21:32:02 crc kubenswrapper[4795]: E0219 21:32:02.951546 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 19 21:32:02 crc kubenswrapper[4795]: I0219 21:32:02.951587 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 19 21:32:02 crc kubenswrapper[4795]: E0219 21:32:02.951612 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 19 21:32:02 crc kubenswrapper[4795]: I0219 21:32:02.951626 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 19 21:32:02 crc kubenswrapper[4795]: E0219 21:32:02.951650 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 19 21:32:02 crc kubenswrapper[4795]: I0219 21:32:02.951663 4795 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 19 21:32:02 crc kubenswrapper[4795]: E0219 21:32:02.951678 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 19 21:32:02 crc kubenswrapper[4795]: I0219 21:32:02.951690 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 19 21:32:02 crc kubenswrapper[4795]: E0219 21:32:02.951705 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 19 21:32:02 crc kubenswrapper[4795]: I0219 21:32:02.951717 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 19 21:32:02 crc kubenswrapper[4795]: E0219 21:32:02.951743 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 19 21:32:02 crc kubenswrapper[4795]: I0219 21:32:02.951755 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 19 21:32:02 crc kubenswrapper[4795]: I0219 21:32:02.951962 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 19 21:32:02 crc kubenswrapper[4795]: I0219 21:32:02.951991 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 19 21:32:02 crc kubenswrapper[4795]: I0219 21:32:02.952015 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 19 21:32:02 crc 
kubenswrapper[4795]: I0219 21:32:02.952040 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 19 21:32:02 crc kubenswrapper[4795]: I0219 21:32:02.952063 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 19 21:32:02 crc kubenswrapper[4795]: I0219 21:32:02.952083 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 19 21:32:02 crc kubenswrapper[4795]: E0219 21:32:02.952303 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 19 21:32:02 crc kubenswrapper[4795]: I0219 21:32:02.952319 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 19 21:32:02 crc kubenswrapper[4795]: I0219 21:32:02.958357 4795 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 19 21:32:02 crc kubenswrapper[4795]: I0219 21:32:02.959905 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 21:32:02 crc kubenswrapper[4795]: I0219 21:32:02.964725 4795 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Feb 19 21:32:03 crc kubenswrapper[4795]: E0219 21:32:03.000103 4795 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.69:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 21:32:03 crc kubenswrapper[4795]: I0219 21:32:03.102465 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 21:32:03 crc kubenswrapper[4795]: I0219 21:32:03.102517 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 21:32:03 crc kubenswrapper[4795]: I0219 21:32:03.102542 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 21:32:03 crc 
kubenswrapper[4795]: I0219 21:32:03.102577 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 21:32:03 crc kubenswrapper[4795]: I0219 21:32:03.102684 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 21:32:03 crc kubenswrapper[4795]: I0219 21:32:03.102723 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 21:32:03 crc kubenswrapper[4795]: I0219 21:32:03.102755 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 21:32:03 crc kubenswrapper[4795]: I0219 21:32:03.102818 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 21:32:03 crc 
kubenswrapper[4795]: I0219 21:32:03.203653 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 21:32:03 crc kubenswrapper[4795]: I0219 21:32:03.203728 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 21:32:03 crc kubenswrapper[4795]: I0219 21:32:03.203750 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 21:32:03 crc kubenswrapper[4795]: I0219 21:32:03.203765 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 21:32:03 crc kubenswrapper[4795]: I0219 21:32:03.203790 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 21:32:03 crc kubenswrapper[4795]: I0219 21:32:03.203817 4795 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 21:32:03 crc kubenswrapper[4795]: I0219 21:32:03.203833 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 21:32:03 crc kubenswrapper[4795]: I0219 21:32:03.203777 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 21:32:03 crc kubenswrapper[4795]: I0219 21:32:03.203858 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 21:32:03 crc kubenswrapper[4795]: I0219 21:32:03.203846 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 21:32:03 crc kubenswrapper[4795]: I0219 21:32:03.203909 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 21:32:03 crc kubenswrapper[4795]: I0219 21:32:03.203964 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 21:32:03 crc kubenswrapper[4795]: I0219 21:32:03.203959 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 21:32:03 crc kubenswrapper[4795]: I0219 21:32:03.204031 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 21:32:03 crc kubenswrapper[4795]: I0219 21:32:03.204127 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 21:32:03 crc kubenswrapper[4795]: I0219 21:32:03.204162 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 21:32:03 crc kubenswrapper[4795]: I0219 21:32:03.301632 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 21:32:03 crc kubenswrapper[4795]: E0219 21:32:03.325591 4795 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.69:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1895c3438c5182ea openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-19 21:32:03.32453553 +0000 UTC m=+234.517053404,LastTimestamp:2026-02-19 21:32:03.32453553 +0000 UTC m=+234.517053404,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 19 21:32:03 crc kubenswrapper[4795]: I0219 21:32:03.931221 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"1104528b9e716ce5addd230534335514e5ee2436ce9ed811347dd6d853cd85d6"} Feb 19 21:32:03 crc 
kubenswrapper[4795]: I0219 21:32:03.931643 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"2bc8077dbe1f6e57ad6e4ac46bb06b6c63e727e2b8059b59cfdf30ce4e8238ff"} Feb 19 21:32:03 crc kubenswrapper[4795]: E0219 21:32:03.932528 4795 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.69:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 21:32:03 crc kubenswrapper[4795]: I0219 21:32:03.933386 4795 generic.go:334] "Generic (PLEG): container finished" podID="6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6" containerID="d1854bd6a757d0896d79bed3a255459850433d4bfa0a2f7e67645e00599c61cf" exitCode=0 Feb 19 21:32:03 crc kubenswrapper[4795]: I0219 21:32:03.933479 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6","Type":"ContainerDied","Data":"d1854bd6a757d0896d79bed3a255459850433d4bfa0a2f7e67645e00599c61cf"} Feb 19 21:32:03 crc kubenswrapper[4795]: I0219 21:32:03.934566 4795 status_manager.go:851] "Failed to get status for pod" podUID="6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.69:6443: connect: connection refused" Feb 19 21:32:03 crc kubenswrapper[4795]: I0219 21:32:03.935572 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 19 21:32:03 crc kubenswrapper[4795]: I0219 21:32:03.936690 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 19 21:32:03 crc kubenswrapper[4795]: I0219 21:32:03.937372 4795 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30" exitCode=0 Feb 19 21:32:03 crc kubenswrapper[4795]: I0219 21:32:03.937403 4795 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0" exitCode=0 Feb 19 21:32:03 crc kubenswrapper[4795]: I0219 21:32:03.937416 4795 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b" exitCode=0 Feb 19 21:32:03 crc kubenswrapper[4795]: I0219 21:32:03.937429 4795 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c" exitCode=2 Feb 19 21:32:03 crc kubenswrapper[4795]: I0219 21:32:03.937486 4795 scope.go:117] "RemoveContainer" containerID="d9c22dc6cf9d149a650e79644d06e8c5720e8f041207879526a9bc75ffcca650" Feb 19 21:32:04 crc kubenswrapper[4795]: I0219 21:32:04.950528 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 19 21:32:05 crc kubenswrapper[4795]: I0219 21:32:05.295382 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 19 21:32:05 crc kubenswrapper[4795]: I0219 21:32:05.296722 4795 status_manager.go:851] "Failed to get status for pod" podUID="6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.69:6443: connect: connection refused" Feb 19 21:32:05 crc kubenswrapper[4795]: I0219 21:32:05.301095 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 19 21:32:05 crc kubenswrapper[4795]: I0219 21:32:05.301884 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 21:32:05 crc kubenswrapper[4795]: I0219 21:32:05.302293 4795 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.69:6443: connect: connection refused" Feb 19 21:32:05 crc kubenswrapper[4795]: I0219 21:32:05.302745 4795 status_manager.go:851] "Failed to get status for pod" podUID="6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.69:6443: connect: connection refused" Feb 19 21:32:05 crc kubenswrapper[4795]: I0219 21:32:05.431239 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 19 
21:32:05 crc kubenswrapper[4795]: I0219 21:32:05.431308 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6-kube-api-access\") pod \"6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6\" (UID: \"6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6\") " Feb 19 21:32:05 crc kubenswrapper[4795]: I0219 21:32:05.431329 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6-var-lock\") pod \"6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6\" (UID: \"6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6\") " Feb 19 21:32:05 crc kubenswrapper[4795]: I0219 21:32:05.431374 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:32:05 crc kubenswrapper[4795]: I0219 21:32:05.431393 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6-kubelet-dir\") pod \"6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6\" (UID: \"6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6\") " Feb 19 21:32:05 crc kubenswrapper[4795]: I0219 21:32:05.431443 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6" (UID: "6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:32:05 crc kubenswrapper[4795]: I0219 21:32:05.431509 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6-var-lock" (OuterVolumeSpecName: "var-lock") pod "6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6" (UID: "6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:32:05 crc kubenswrapper[4795]: I0219 21:32:05.431543 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:32:05 crc kubenswrapper[4795]: I0219 21:32:05.431521 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 19 21:32:05 crc kubenswrapper[4795]: I0219 21:32:05.431575 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 19 21:32:05 crc kubenswrapper[4795]: I0219 21:32:05.431743 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:32:05 crc kubenswrapper[4795]: I0219 21:32:05.431998 4795 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6-var-lock\") on node \"crc\" DevicePath \"\"" Feb 19 21:32:05 crc kubenswrapper[4795]: I0219 21:32:05.432024 4795 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 19 21:32:05 crc kubenswrapper[4795]: I0219 21:32:05.432040 4795 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Feb 19 21:32:05 crc kubenswrapper[4795]: I0219 21:32:05.432054 4795 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 19 21:32:05 crc kubenswrapper[4795]: I0219 21:32:05.432066 4795 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 19 21:32:05 crc kubenswrapper[4795]: I0219 21:32:05.437861 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6" (UID: "6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:32:05 crc kubenswrapper[4795]: I0219 21:32:05.519316 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Feb 19 21:32:05 crc kubenswrapper[4795]: I0219 21:32:05.532756 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 19 21:32:05 crc kubenswrapper[4795]: I0219 21:32:05.961220 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 19 21:32:05 crc kubenswrapper[4795]: I0219 21:32:05.961223 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6","Type":"ContainerDied","Data":"6561cdc38385fc6e6e9ed635e289b887be8b67167e1892ff8512a6be8137ce3f"} Feb 19 21:32:05 crc kubenswrapper[4795]: I0219 21:32:05.961264 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6561cdc38385fc6e6e9ed635e289b887be8b67167e1892ff8512a6be8137ce3f" Feb 19 21:32:05 crc kubenswrapper[4795]: I0219 21:32:05.965330 4795 status_manager.go:851] "Failed to get status for pod" podUID="6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.69:6443: connect: connection refused" Feb 19 21:32:05 crc kubenswrapper[4795]: I0219 21:32:05.966283 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 19 21:32:05 crc kubenswrapper[4795]: I0219 21:32:05.967076 4795 generic.go:334] 
"Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce" exitCode=0 Feb 19 21:32:05 crc kubenswrapper[4795]: I0219 21:32:05.967129 4795 scope.go:117] "RemoveContainer" containerID="818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30" Feb 19 21:32:05 crc kubenswrapper[4795]: I0219 21:32:05.967260 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 21:32:05 crc kubenswrapper[4795]: I0219 21:32:05.968869 4795 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.69:6443: connect: connection refused" Feb 19 21:32:05 crc kubenswrapper[4795]: I0219 21:32:05.969320 4795 status_manager.go:851] "Failed to get status for pod" podUID="6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.69:6443: connect: connection refused" Feb 19 21:32:05 crc kubenswrapper[4795]: I0219 21:32:05.970233 4795 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.69:6443: connect: connection refused" Feb 19 21:32:05 crc kubenswrapper[4795]: I0219 21:32:05.970764 4795 status_manager.go:851] "Failed to get status for pod" podUID="6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.69:6443: connect: connection refused" Feb 19 21:32:05 crc kubenswrapper[4795]: I0219 21:32:05.993918 4795 scope.go:117] "RemoveContainer" containerID="353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0" Feb 19 21:32:06 crc kubenswrapper[4795]: I0219 21:32:06.019362 4795 scope.go:117] "RemoveContainer" containerID="21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b" Feb 19 21:32:06 crc kubenswrapper[4795]: I0219 21:32:06.037641 4795 scope.go:117] "RemoveContainer" containerID="7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c" Feb 19 21:32:06 crc kubenswrapper[4795]: I0219 21:32:06.066059 4795 scope.go:117] "RemoveContainer" containerID="ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce" Feb 19 21:32:06 crc kubenswrapper[4795]: I0219 21:32:06.092772 4795 scope.go:117] "RemoveContainer" containerID="6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a" Feb 19 21:32:06 crc kubenswrapper[4795]: I0219 21:32:06.130672 4795 scope.go:117] "RemoveContainer" containerID="818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30" Feb 19 21:32:06 crc kubenswrapper[4795]: E0219 21:32:06.131416 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\": container with ID starting with 818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30 not found: ID does not exist" containerID="818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30" Feb 19 21:32:06 crc kubenswrapper[4795]: I0219 21:32:06.131468 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30"} err="failed to get container status 
\"818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\": rpc error: code = NotFound desc = could not find container \"818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30\": container with ID starting with 818124d688299297f3fd8dbc07f9ff49619a306d92f27737b45cc01152c84d30 not found: ID does not exist" Feb 19 21:32:06 crc kubenswrapper[4795]: I0219 21:32:06.131506 4795 scope.go:117] "RemoveContainer" containerID="353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0" Feb 19 21:32:06 crc kubenswrapper[4795]: E0219 21:32:06.132394 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\": container with ID starting with 353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0 not found: ID does not exist" containerID="353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0" Feb 19 21:32:06 crc kubenswrapper[4795]: I0219 21:32:06.132429 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0"} err="failed to get container status \"353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\": rpc error: code = NotFound desc = could not find container \"353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0\": container with ID starting with 353bdcca8a18af740e2792d2b82f312d3a33c9f10f0b99e09109ea894bee00b0 not found: ID does not exist" Feb 19 21:32:06 crc kubenswrapper[4795]: I0219 21:32:06.132449 4795 scope.go:117] "RemoveContainer" containerID="21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b" Feb 19 21:32:06 crc kubenswrapper[4795]: E0219 21:32:06.133730 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\": container with ID starting with 21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b not found: ID does not exist" containerID="21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b" Feb 19 21:32:06 crc kubenswrapper[4795]: I0219 21:32:06.133763 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b"} err="failed to get container status \"21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\": rpc error: code = NotFound desc = could not find container \"21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b\": container with ID starting with 21dcb4760146b61a0c2d4644af8e234ef5448ec6910b496482e7aee551e09a2b not found: ID does not exist" Feb 19 21:32:06 crc kubenswrapper[4795]: I0219 21:32:06.133782 4795 scope.go:117] "RemoveContainer" containerID="7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c" Feb 19 21:32:06 crc kubenswrapper[4795]: E0219 21:32:06.134568 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\": container with ID starting with 7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c not found: ID does not exist" containerID="7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c" Feb 19 21:32:06 crc kubenswrapper[4795]: I0219 21:32:06.134595 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c"} err="failed to get container status \"7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\": rpc error: code = NotFound desc = could not find container \"7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c\": container with ID 
starting with 7c11580adb26ad26c1f4627bf53b8d87f2ae4504c7f9029316f2bdda051eca7c not found: ID does not exist" Feb 19 21:32:06 crc kubenswrapper[4795]: I0219 21:32:06.134613 4795 scope.go:117] "RemoveContainer" containerID="ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce" Feb 19 21:32:06 crc kubenswrapper[4795]: E0219 21:32:06.134998 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\": container with ID starting with ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce not found: ID does not exist" containerID="ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce" Feb 19 21:32:06 crc kubenswrapper[4795]: I0219 21:32:06.135031 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce"} err="failed to get container status \"ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\": rpc error: code = NotFound desc = could not find container \"ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce\": container with ID starting with ac9391ff6fac7bcc025cb8d4b0d9cda38f69ff28cb011ab9813d7f9718d93dce not found: ID does not exist" Feb 19 21:32:06 crc kubenswrapper[4795]: I0219 21:32:06.135056 4795 scope.go:117] "RemoveContainer" containerID="6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a" Feb 19 21:32:06 crc kubenswrapper[4795]: E0219 21:32:06.135811 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\": container with ID starting with 6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a not found: ID does not exist" containerID="6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a" Feb 19 
21:32:06 crc kubenswrapper[4795]: I0219 21:32:06.135886 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a"} err="failed to get container status \"6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\": rpc error: code = NotFound desc = could not find container \"6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a\": container with ID starting with 6859a03cc20d4dc289bbdb27ffff2924e901759e6a393b48d343175e1de7748a not found: ID does not exist" Feb 19 21:32:06 crc kubenswrapper[4795]: E0219 21:32:06.990716 4795 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.69:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1895c3438c5182ea openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-19 21:32:03.32453553 +0000 UTC m=+234.517053404,LastTimestamp:2026-02-19 21:32:03.32453553 +0000 UTC m=+234.517053404,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 19 21:32:08 crc kubenswrapper[4795]: E0219 21:32:08.180316 4795 controller.go:195] "Failed to update lease" err="Put 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.69:6443: connect: connection refused" Feb 19 21:32:08 crc kubenswrapper[4795]: E0219 21:32:08.181408 4795 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.69:6443: connect: connection refused" Feb 19 21:32:08 crc kubenswrapper[4795]: E0219 21:32:08.182018 4795 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.69:6443: connect: connection refused" Feb 19 21:32:08 crc kubenswrapper[4795]: E0219 21:32:08.182667 4795 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.69:6443: connect: connection refused" Feb 19 21:32:08 crc kubenswrapper[4795]: E0219 21:32:08.183391 4795 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.69:6443: connect: connection refused" Feb 19 21:32:08 crc kubenswrapper[4795]: I0219 21:32:08.183460 4795 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Feb 19 21:32:08 crc kubenswrapper[4795]: E0219 21:32:08.184029 4795 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.69:6443: connect: connection refused" interval="200ms" Feb 19 21:32:08 crc kubenswrapper[4795]: E0219 21:32:08.385283 4795 controller.go:145] "Failed to ensure lease exists, will 
retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.69:6443: connect: connection refused" interval="400ms" Feb 19 21:32:08 crc kubenswrapper[4795]: E0219 21:32:08.785932 4795 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.69:6443: connect: connection refused" interval="800ms" Feb 19 21:32:09 crc kubenswrapper[4795]: I0219 21:32:09.517266 4795 status_manager.go:851] "Failed to get status for pod" podUID="6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.69:6443: connect: connection refused" Feb 19 21:32:09 crc kubenswrapper[4795]: E0219 21:32:09.586707 4795 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.69:6443: connect: connection refused" interval="1.6s" Feb 19 21:32:11 crc kubenswrapper[4795]: E0219 21:32:11.187622 4795 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.69:6443: connect: connection refused" interval="3.2s" Feb 19 21:32:13 crc kubenswrapper[4795]: E0219 21:32:13.555962 4795 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.69:6443: connect: 
connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" volumeName="registry-storage" Feb 19 21:32:14 crc kubenswrapper[4795]: E0219 21:32:14.388627 4795 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.69:6443: connect: connection refused" interval="6.4s" Feb 19 21:32:14 crc kubenswrapper[4795]: I0219 21:32:14.511726 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 21:32:14 crc kubenswrapper[4795]: I0219 21:32:14.512778 4795 status_manager.go:851] "Failed to get status for pod" podUID="6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.69:6443: connect: connection refused" Feb 19 21:32:14 crc kubenswrapper[4795]: I0219 21:32:14.528632 4795 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="1c3c2292-d8c1-42cd-9a68-25e9dee8a334" Feb 19 21:32:14 crc kubenswrapper[4795]: I0219 21:32:14.528709 4795 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="1c3c2292-d8c1-42cd-9a68-25e9dee8a334" Feb 19 21:32:14 crc kubenswrapper[4795]: E0219 21:32:14.529405 4795 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.69:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 21:32:14 crc kubenswrapper[4795]: I0219 21:32:14.530394 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 21:32:14 crc kubenswrapper[4795]: W0219 21:32:14.551112 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-41655226d47bbaa32fdba77096cc0201e9e2f9d220772b2542c5846761ab512c WatchSource:0}: Error finding container 41655226d47bbaa32fdba77096cc0201e9e2f9d220772b2542c5846761ab512c: Status 404 returned error can't find the container with id 41655226d47bbaa32fdba77096cc0201e9e2f9d220772b2542c5846761ab512c Feb 19 21:32:15 crc kubenswrapper[4795]: I0219 21:32:15.030300 4795 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="d0854a09a82591d70c3728b2e7a60c1677d7f76e2fa7d0b45cc959f5a6b6fb19" exitCode=0 Feb 19 21:32:15 crc kubenswrapper[4795]: I0219 21:32:15.030429 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"d0854a09a82591d70c3728b2e7a60c1677d7f76e2fa7d0b45cc959f5a6b6fb19"} Feb 19 21:32:15 crc kubenswrapper[4795]: I0219 21:32:15.030612 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"41655226d47bbaa32fdba77096cc0201e9e2f9d220772b2542c5846761ab512c"} Feb 19 21:32:15 crc kubenswrapper[4795]: I0219 21:32:15.030895 4795 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="1c3c2292-d8c1-42cd-9a68-25e9dee8a334" Feb 19 21:32:15 crc kubenswrapper[4795]: I0219 21:32:15.030913 4795 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="1c3c2292-d8c1-42cd-9a68-25e9dee8a334" Feb 19 21:32:15 crc kubenswrapper[4795]: E0219 21:32:15.031364 4795 mirror_client.go:138] 
"Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.69:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 21:32:15 crc kubenswrapper[4795]: I0219 21:32:15.031577 4795 status_manager.go:851] "Failed to get status for pod" podUID="6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.69:6443: connect: connection refused" Feb 19 21:32:16 crc kubenswrapper[4795]: I0219 21:32:16.038707 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"448b13b1be0bd809e4ea29aef6a03c34c56c001d6ad87835a10a869a4dbfe46f"} Feb 19 21:32:16 crc kubenswrapper[4795]: I0219 21:32:16.039328 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"92f8ac516b57fb2f0845465ac2ad4422fde93efd295aedf8ebf83ebd04aea4a0"} Feb 19 21:32:16 crc kubenswrapper[4795]: I0219 21:32:16.039343 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"024c62e342b22d30a7eed51fb61ae33a7b8776fc7cdcb095e7c4f47b2fe5254a"} Feb 19 21:32:16 crc kubenswrapper[4795]: I0219 21:32:16.039355 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"9c333b5bb4256786b46783a605904ae495f24f99183f7aab2ac51b63d05cbf7f"} Feb 19 21:32:17 crc kubenswrapper[4795]: I0219 21:32:17.050892 4795 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"ac2cb93e122d3e40ddbf9f0c954eedc03efaea57abaaf21a1796d82c5ce36949"} Feb 19 21:32:17 crc kubenswrapper[4795]: I0219 21:32:17.051072 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 21:32:17 crc kubenswrapper[4795]: I0219 21:32:17.051283 4795 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="1c3c2292-d8c1-42cd-9a68-25e9dee8a334" Feb 19 21:32:17 crc kubenswrapper[4795]: I0219 21:32:17.051313 4795 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="1c3c2292-d8c1-42cd-9a68-25e9dee8a334" Feb 19 21:32:18 crc kubenswrapper[4795]: I0219 21:32:18.110242 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 19 21:32:18 crc kubenswrapper[4795]: I0219 21:32:18.111196 4795 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="8704324a3c5bb3f3023c0b000b233d650dede0825930494b987121784c094bdd" exitCode=1 Feb 19 21:32:18 crc kubenswrapper[4795]: I0219 21:32:18.111373 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"8704324a3c5bb3f3023c0b000b233d650dede0825930494b987121784c094bdd"} Feb 19 21:32:18 crc kubenswrapper[4795]: I0219 21:32:18.112117 4795 scope.go:117] "RemoveContainer" containerID="8704324a3c5bb3f3023c0b000b233d650dede0825930494b987121784c094bdd" Feb 19 21:32:18 crc kubenswrapper[4795]: I0219 21:32:18.578554 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 21:32:19 crc kubenswrapper[4795]: I0219 21:32:19.125238 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 19 21:32:19 crc kubenswrapper[4795]: I0219 21:32:19.125793 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"dc84cb82080f440e6c764fb2db5e202b4dce225d9e24ad46de4f51d7f0493019"} Feb 19 21:32:19 crc kubenswrapper[4795]: I0219 21:32:19.531862 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 21:32:19 crc kubenswrapper[4795]: I0219 21:32:19.531945 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 21:32:19 crc kubenswrapper[4795]: I0219 21:32:19.540732 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 21:32:21 crc kubenswrapper[4795]: I0219 21:32:21.246981 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 21:32:21 crc kubenswrapper[4795]: I0219 21:32:21.252986 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 21:32:21 crc kubenswrapper[4795]: I0219 21:32:21.848843 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 21:32:22 crc kubenswrapper[4795]: I0219 21:32:22.059100 4795 kubelet.go:1914] "Deleted mirror pod because it is outdated" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 21:32:22 crc kubenswrapper[4795]: I0219 21:32:22.099993 4795 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="794ade11-a364-41bd-85e7-564b6098af69" Feb 19 21:32:22 crc kubenswrapper[4795]: I0219 21:32:22.142461 4795 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="1c3c2292-d8c1-42cd-9a68-25e9dee8a334" Feb 19 21:32:22 crc kubenswrapper[4795]: I0219 21:32:22.142718 4795 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="1c3c2292-d8c1-42cd-9a68-25e9dee8a334" Feb 19 21:32:22 crc kubenswrapper[4795]: I0219 21:32:22.145311 4795 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="794ade11-a364-41bd-85e7-564b6098af69" Feb 19 21:32:22 crc kubenswrapper[4795]: I0219 21:32:22.145751 4795 status_manager.go:308] "Container readiness changed before pod has synced" pod="openshift-kube-apiserver/kube-apiserver-crc" containerID="cri-o://9c333b5bb4256786b46783a605904ae495f24f99183f7aab2ac51b63d05cbf7f" Feb 19 21:32:22 crc kubenswrapper[4795]: I0219 21:32:22.145783 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 21:32:23 crc kubenswrapper[4795]: I0219 21:32:23.146020 4795 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="1c3c2292-d8c1-42cd-9a68-25e9dee8a334" Feb 19 21:32:23 crc kubenswrapper[4795]: I0219 21:32:23.146397 4795 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="1c3c2292-d8c1-42cd-9a68-25e9dee8a334" Feb 19 21:32:23 crc kubenswrapper[4795]: I0219 21:32:23.148411 4795 
status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="794ade11-a364-41bd-85e7-564b6098af69" Feb 19 21:32:29 crc kubenswrapper[4795]: I0219 21:32:29.388553 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 19 21:32:31 crc kubenswrapper[4795]: I0219 21:32:31.854835 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 21:32:32 crc kubenswrapper[4795]: I0219 21:32:32.233550 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 19 21:32:32 crc kubenswrapper[4795]: I0219 21:32:32.588474 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 19 21:32:33 crc kubenswrapper[4795]: I0219 21:32:33.322739 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 19 21:32:33 crc kubenswrapper[4795]: I0219 21:32:33.769344 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 19 21:32:34 crc kubenswrapper[4795]: I0219 21:32:34.150225 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 19 21:32:34 crc kubenswrapper[4795]: I0219 21:32:34.189785 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 19 21:32:34 crc kubenswrapper[4795]: I0219 21:32:34.268320 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 19 21:32:34 crc kubenswrapper[4795]: I0219 21:32:34.358195 4795 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 19 21:32:34 crc kubenswrapper[4795]: I0219 21:32:34.466968 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 19 21:32:34 crc kubenswrapper[4795]: I0219 21:32:34.565676 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 19 21:32:34 crc kubenswrapper[4795]: I0219 21:32:34.640739 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 19 21:32:34 crc kubenswrapper[4795]: I0219 21:32:34.981344 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 19 21:32:35 crc kubenswrapper[4795]: I0219 21:32:35.058346 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 19 21:32:35 crc kubenswrapper[4795]: I0219 21:32:35.069259 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 19 21:32:35 crc kubenswrapper[4795]: I0219 21:32:35.317786 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 19 21:32:35 crc kubenswrapper[4795]: I0219 21:32:35.667765 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 19 21:32:35 crc kubenswrapper[4795]: I0219 21:32:35.688852 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 19 21:32:35 crc kubenswrapper[4795]: I0219 21:32:35.713948 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 19 21:32:35 crc kubenswrapper[4795]: I0219 21:32:35.905670 4795 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 19 21:32:36 crc kubenswrapper[4795]: I0219 21:32:36.068946 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 19 21:32:36 crc kubenswrapper[4795]: I0219 21:32:36.122677 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 19 21:32:36 crc kubenswrapper[4795]: I0219 21:32:36.145932 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 19 21:32:36 crc kubenswrapper[4795]: I0219 21:32:36.283506 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 19 21:32:36 crc kubenswrapper[4795]: I0219 21:32:36.354905 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 19 21:32:36 crc kubenswrapper[4795]: I0219 21:32:36.386759 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 19 21:32:36 crc kubenswrapper[4795]: I0219 21:32:36.539140 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 19 21:32:36 crc kubenswrapper[4795]: I0219 21:32:36.738135 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 19 21:32:36 crc kubenswrapper[4795]: I0219 21:32:36.758492 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 19 21:32:36 crc kubenswrapper[4795]: I0219 21:32:36.872646 4795 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 19 21:32:36 crc kubenswrapper[4795]: I0219 21:32:36.894122 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 19 21:32:36 crc kubenswrapper[4795]: I0219 21:32:36.942916 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 19 21:32:36 crc kubenswrapper[4795]: I0219 21:32:36.956239 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 19 21:32:37 crc kubenswrapper[4795]: I0219 21:32:37.060279 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 19 21:32:37 crc kubenswrapper[4795]: I0219 21:32:37.356471 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 19 21:32:37 crc kubenswrapper[4795]: I0219 21:32:37.473672 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 19 21:32:37 crc kubenswrapper[4795]: I0219 21:32:37.500313 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 19 21:32:37 crc kubenswrapper[4795]: I0219 21:32:37.541730 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 19 21:32:37 crc kubenswrapper[4795]: I0219 21:32:37.567160 4795 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 19 21:32:37 crc kubenswrapper[4795]: I0219 21:32:37.568859 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 19 21:32:37 crc 
kubenswrapper[4795]: I0219 21:32:37.661814 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 19 21:32:37 crc kubenswrapper[4795]: I0219 21:32:37.958120 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 19 21:32:37 crc kubenswrapper[4795]: I0219 21:32:37.989030 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 19 21:32:38 crc kubenswrapper[4795]: I0219 21:32:38.014564 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 19 21:32:38 crc kubenswrapper[4795]: I0219 21:32:38.015383 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 19 21:32:38 crc kubenswrapper[4795]: I0219 21:32:38.043927 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 19 21:32:38 crc kubenswrapper[4795]: I0219 21:32:38.082975 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 19 21:32:38 crc kubenswrapper[4795]: I0219 21:32:38.116459 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 19 21:32:38 crc kubenswrapper[4795]: I0219 21:32:38.354432 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 19 21:32:38 crc kubenswrapper[4795]: I0219 21:32:38.366999 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 19 21:32:38 crc kubenswrapper[4795]: I0219 21:32:38.381419 4795 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 19 21:32:38 crc kubenswrapper[4795]: I0219 21:32:38.460104 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 19 21:32:38 crc kubenswrapper[4795]: I0219 21:32:38.515441 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 19 21:32:38 crc kubenswrapper[4795]: I0219 21:32:38.575159 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 19 21:32:38 crc kubenswrapper[4795]: I0219 21:32:38.576373 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 19 21:32:38 crc kubenswrapper[4795]: I0219 21:32:38.633556 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 19 21:32:38 crc kubenswrapper[4795]: I0219 21:32:38.676836 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 19 21:32:38 crc kubenswrapper[4795]: I0219 21:32:38.726613 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 19 21:32:38 crc kubenswrapper[4795]: I0219 21:32:38.737071 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 19 21:32:38 crc kubenswrapper[4795]: I0219 21:32:38.788373 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 19 21:32:38 crc kubenswrapper[4795]: I0219 21:32:38.790685 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 19 21:32:38 crc 
kubenswrapper[4795]: I0219 21:32:38.817348 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 19 21:32:39 crc kubenswrapper[4795]: I0219 21:32:39.014818 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 19 21:32:39 crc kubenswrapper[4795]: I0219 21:32:39.088277 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 19 21:32:39 crc kubenswrapper[4795]: I0219 21:32:39.104042 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 19 21:32:39 crc kubenswrapper[4795]: I0219 21:32:39.243729 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 19 21:32:39 crc kubenswrapper[4795]: I0219 21:32:39.436503 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 19 21:32:39 crc kubenswrapper[4795]: I0219 21:32:39.511759 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 19 21:32:39 crc kubenswrapper[4795]: I0219 21:32:39.696378 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 19 21:32:39 crc kubenswrapper[4795]: I0219 21:32:39.711562 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 19 21:32:39 crc kubenswrapper[4795]: I0219 21:32:39.719001 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 19 21:32:39 crc kubenswrapper[4795]: I0219 21:32:39.793947 4795 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 19 21:32:39 crc kubenswrapper[4795]: I0219 21:32:39.805451 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 19 21:32:39 crc kubenswrapper[4795]: I0219 21:32:39.813296 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 19 21:32:39 crc kubenswrapper[4795]: I0219 21:32:39.890121 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 19 21:32:39 crc kubenswrapper[4795]: I0219 21:32:39.893284 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 19 21:32:39 crc kubenswrapper[4795]: I0219 21:32:39.903401 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 19 21:32:40 crc kubenswrapper[4795]: I0219 21:32:40.024226 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 19 21:32:40 crc kubenswrapper[4795]: I0219 21:32:40.116507 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 19 21:32:40 crc kubenswrapper[4795]: I0219 21:32:40.139761 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 19 21:32:40 crc kubenswrapper[4795]: I0219 21:32:40.161085 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 19 21:32:40 crc kubenswrapper[4795]: I0219 21:32:40.183411 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 19 21:32:40 crc kubenswrapper[4795]: I0219 21:32:40.209825 4795 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 19 21:32:40 crc kubenswrapper[4795]: I0219 21:32:40.263127 4795 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 19 21:32:40 crc kubenswrapper[4795]: I0219 21:32:40.281161 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 19 21:32:40 crc kubenswrapper[4795]: I0219 21:32:40.288807 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 19 21:32:40 crc kubenswrapper[4795]: I0219 21:32:40.325268 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 19 21:32:40 crc kubenswrapper[4795]: I0219 21:32:40.347421 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 19 21:32:40 crc kubenswrapper[4795]: I0219 21:32:40.455728 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 19 21:32:40 crc kubenswrapper[4795]: I0219 21:32:40.543890 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 19 21:32:40 crc kubenswrapper[4795]: I0219 21:32:40.557750 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 19 21:32:40 crc kubenswrapper[4795]: I0219 21:32:40.588914 4795 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 19 21:32:40 crc kubenswrapper[4795]: I0219 21:32:40.734719 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 19 21:32:40 crc kubenswrapper[4795]: I0219 21:32:40.892283 4795 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 19 21:32:41 crc kubenswrapper[4795]: I0219 21:32:41.136890 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 19 21:32:41 crc kubenswrapper[4795]: I0219 21:32:41.150953 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 19 21:32:41 crc kubenswrapper[4795]: I0219 21:32:41.157369 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 19 21:32:41 crc kubenswrapper[4795]: I0219 21:32:41.171790 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 19 21:32:41 crc kubenswrapper[4795]: I0219 21:32:41.205148 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 19 21:32:41 crc kubenswrapper[4795]: I0219 21:32:41.241391 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 19 21:32:41 crc kubenswrapper[4795]: I0219 21:32:41.376640 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 19 21:32:41 crc kubenswrapper[4795]: I0219 21:32:41.383620 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 19 21:32:41 crc kubenswrapper[4795]: I0219 21:32:41.417831 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 19 21:32:41 crc kubenswrapper[4795]: I0219 21:32:41.455029 4795 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 19 21:32:41 crc kubenswrapper[4795]: I0219 21:32:41.487882 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 19 21:32:41 crc kubenswrapper[4795]: I0219 21:32:41.523565 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 19 21:32:41 crc kubenswrapper[4795]: I0219 21:32:41.529299 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 19 21:32:41 crc kubenswrapper[4795]: I0219 21:32:41.648424 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 19 21:32:41 crc kubenswrapper[4795]: I0219 21:32:41.713851 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 19 21:32:41 crc kubenswrapper[4795]: I0219 21:32:41.748810 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 19 21:32:41 crc kubenswrapper[4795]: I0219 21:32:41.756141 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 19 21:32:41 crc kubenswrapper[4795]: I0219 21:32:41.802489 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 19 21:32:41 crc kubenswrapper[4795]: I0219 21:32:41.935533 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 19 21:32:41 crc kubenswrapper[4795]: I0219 21:32:41.972774 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 19 21:32:42 crc kubenswrapper[4795]: I0219 21:32:42.024466 4795 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 19 21:32:42 crc kubenswrapper[4795]: I0219 21:32:42.148163 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 19 21:32:42 crc kubenswrapper[4795]: I0219 21:32:42.148275 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 19 21:32:42 crc kubenswrapper[4795]: I0219 21:32:42.178158 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 19 21:32:42 crc kubenswrapper[4795]: I0219 21:32:42.287812 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 19 21:32:42 crc kubenswrapper[4795]: I0219 21:32:42.354046 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 19 21:32:42 crc kubenswrapper[4795]: I0219 21:32:42.465235 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 19 21:32:42 crc kubenswrapper[4795]: I0219 21:32:42.581691 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 19 21:32:42 crc kubenswrapper[4795]: I0219 21:32:42.584793 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 19 21:32:42 crc kubenswrapper[4795]: I0219 21:32:42.585523 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 19 21:32:42 crc kubenswrapper[4795]: I0219 21:32:42.656478 4795 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 19 21:32:42 crc kubenswrapper[4795]: I0219 21:32:42.698987 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 19 21:32:42 crc kubenswrapper[4795]: I0219 21:32:42.715550 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 19 21:32:42 crc kubenswrapper[4795]: I0219 21:32:42.716708 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 19 21:32:42 crc kubenswrapper[4795]: I0219 21:32:42.764155 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 19 21:32:42 crc kubenswrapper[4795]: I0219 21:32:42.816603 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 19 21:32:42 crc kubenswrapper[4795]: I0219 21:32:42.881140 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 19 21:32:42 crc kubenswrapper[4795]: I0219 21:32:42.925781 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 19 21:32:42 crc kubenswrapper[4795]: I0219 21:32:42.946387 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 19 21:32:42 crc kubenswrapper[4795]: I0219 21:32:42.984689 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 19 21:32:42 crc kubenswrapper[4795]: I0219 21:32:42.995363 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 19 21:32:43 crc 
kubenswrapper[4795]: I0219 21:32:43.108391 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 19 21:32:43 crc kubenswrapper[4795]: I0219 21:32:43.111969 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 19 21:32:43 crc kubenswrapper[4795]: I0219 21:32:43.136472 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 19 21:32:43 crc kubenswrapper[4795]: I0219 21:32:43.141949 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 19 21:32:43 crc kubenswrapper[4795]: I0219 21:32:43.246736 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 19 21:32:43 crc kubenswrapper[4795]: I0219 21:32:43.364794 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 19 21:32:43 crc kubenswrapper[4795]: I0219 21:32:43.546539 4795 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 19 21:32:43 crc kubenswrapper[4795]: I0219 21:32:43.556629 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 19 21:32:43 crc kubenswrapper[4795]: I0219 21:32:43.556685 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 19 21:32:43 crc kubenswrapper[4795]: I0219 21:32:43.561534 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 21:32:43 crc kubenswrapper[4795]: I0219 21:32:43.574039 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 19 21:32:43 crc kubenswrapper[4795]: 
I0219 21:32:43.576494 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=21.576478973 podStartE2EDuration="21.576478973s" podCreationTimestamp="2026-02-19 21:32:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:32:43.575630949 +0000 UTC m=+274.768148823" watchObservedRunningTime="2026-02-19 21:32:43.576478973 +0000 UTC m=+274.768996837" Feb 19 21:32:43 crc kubenswrapper[4795]: I0219 21:32:43.627640 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 19 21:32:43 crc kubenswrapper[4795]: I0219 21:32:43.643394 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 19 21:32:43 crc kubenswrapper[4795]: I0219 21:32:43.689053 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 19 21:32:43 crc kubenswrapper[4795]: I0219 21:32:43.692450 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 19 21:32:43 crc kubenswrapper[4795]: I0219 21:32:43.761198 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 19 21:32:43 crc kubenswrapper[4795]: I0219 21:32:43.842318 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 19 21:32:43 crc kubenswrapper[4795]: I0219 21:32:43.850503 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 19 21:32:43 crc kubenswrapper[4795]: I0219 21:32:43.933155 4795 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 19 21:32:44 crc kubenswrapper[4795]: I0219 21:32:44.116060 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 19 21:32:44 crc kubenswrapper[4795]: I0219 21:32:44.254790 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 19 21:32:44 crc kubenswrapper[4795]: I0219 21:32:44.270776 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 19 21:32:44 crc kubenswrapper[4795]: I0219 21:32:44.341241 4795 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 19 21:32:44 crc kubenswrapper[4795]: I0219 21:32:44.341487 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://1104528b9e716ce5addd230534335514e5ee2436ce9ed811347dd6d853cd85d6" gracePeriod=5 Feb 19 21:32:44 crc kubenswrapper[4795]: I0219 21:32:44.347703 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 19 21:32:44 crc kubenswrapper[4795]: I0219 21:32:44.410283 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 19 21:32:44 crc kubenswrapper[4795]: I0219 21:32:44.413265 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 19 21:32:44 crc kubenswrapper[4795]: I0219 21:32:44.476041 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 19 21:32:44 crc kubenswrapper[4795]: I0219 21:32:44.488866 4795 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 19 21:32:44 crc kubenswrapper[4795]: I0219 21:32:44.555729 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 19 21:32:44 crc kubenswrapper[4795]: I0219 21:32:44.565920 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 19 21:32:44 crc kubenswrapper[4795]: I0219 21:32:44.622311 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 19 21:32:44 crc kubenswrapper[4795]: I0219 21:32:44.685815 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 19 21:32:44 crc kubenswrapper[4795]: I0219 21:32:44.702318 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 19 21:32:44 crc kubenswrapper[4795]: I0219 21:32:44.729481 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 19 21:32:44 crc kubenswrapper[4795]: I0219 21:32:44.732265 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 19 21:32:44 crc kubenswrapper[4795]: I0219 21:32:44.889863 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 19 21:32:44 crc kubenswrapper[4795]: I0219 21:32:44.920139 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 19 21:32:44 crc kubenswrapper[4795]: I0219 21:32:44.939451 4795 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 19 21:32:45 crc kubenswrapper[4795]: I0219 21:32:45.063723 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 19 21:32:45 crc kubenswrapper[4795]: I0219 21:32:45.080827 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 19 21:32:45 crc kubenswrapper[4795]: I0219 21:32:45.157759 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 19 21:32:45 crc kubenswrapper[4795]: I0219 21:32:45.176056 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 19 21:32:45 crc kubenswrapper[4795]: I0219 21:32:45.203609 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 19 21:32:45 crc kubenswrapper[4795]: I0219 21:32:45.312637 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 19 21:32:45 crc kubenswrapper[4795]: I0219 21:32:45.347672 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 19 21:32:45 crc kubenswrapper[4795]: I0219 21:32:45.367741 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 19 21:32:45 crc kubenswrapper[4795]: I0219 21:32:45.439169 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 19 21:32:45 crc kubenswrapper[4795]: I0219 21:32:45.452742 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 19 21:32:45 crc kubenswrapper[4795]: I0219 21:32:45.524692 4795 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 19 21:32:45 crc kubenswrapper[4795]: I0219 21:32:45.561694 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 19 21:32:45 crc kubenswrapper[4795]: I0219 21:32:45.593907 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 19 21:32:45 crc kubenswrapper[4795]: I0219 21:32:45.631360 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 19 21:32:45 crc kubenswrapper[4795]: I0219 21:32:45.689109 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 19 21:32:45 crc kubenswrapper[4795]: I0219 21:32:45.724079 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 19 21:32:45 crc kubenswrapper[4795]: I0219 21:32:45.755727 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 19 21:32:45 crc kubenswrapper[4795]: I0219 21:32:45.756020 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 19 21:32:45 crc kubenswrapper[4795]: I0219 21:32:45.763431 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 19 21:32:45 crc kubenswrapper[4795]: I0219 21:32:45.832770 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 19 21:32:45 crc kubenswrapper[4795]: I0219 21:32:45.880294 4795 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 19 21:32:45 crc kubenswrapper[4795]: I0219 21:32:45.892149 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 19 21:32:45 crc kubenswrapper[4795]: I0219 21:32:45.904462 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 19 21:32:46 crc kubenswrapper[4795]: I0219 21:32:46.165204 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 19 21:32:46 crc kubenswrapper[4795]: I0219 21:32:46.306905 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 19 21:32:46 crc kubenswrapper[4795]: I0219 21:32:46.366006 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 19 21:32:46 crc kubenswrapper[4795]: I0219 21:32:46.459334 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 19 21:32:46 crc kubenswrapper[4795]: I0219 21:32:46.508913 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 19 21:32:46 crc kubenswrapper[4795]: I0219 21:32:46.519861 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 19 21:32:46 crc kubenswrapper[4795]: I0219 21:32:46.538465 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 19 21:32:46 crc kubenswrapper[4795]: I0219 21:32:46.545652 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 19 21:32:46 crc kubenswrapper[4795]: I0219 21:32:46.633735 4795 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 19 21:32:46 crc kubenswrapper[4795]: I0219 21:32:46.721679 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 19 21:32:46 crc kubenswrapper[4795]: I0219 21:32:46.743918 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 19 21:32:46 crc kubenswrapper[4795]: I0219 21:32:46.765099 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 19 21:32:46 crc kubenswrapper[4795]: I0219 21:32:46.865998 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 19 21:32:46 crc kubenswrapper[4795]: I0219 21:32:46.872198 4795 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 19 21:32:46 crc kubenswrapper[4795]: I0219 21:32:46.947035 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 19 21:32:47 crc kubenswrapper[4795]: I0219 21:32:47.097197 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 19 21:32:47 crc kubenswrapper[4795]: I0219 21:32:47.109395 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 19 21:32:47 crc kubenswrapper[4795]: I0219 21:32:47.131446 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 19 21:32:47 crc kubenswrapper[4795]: I0219 21:32:47.169869 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 19 21:32:47 crc kubenswrapper[4795]: I0219 
21:32:47.192763 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 19 21:32:47 crc kubenswrapper[4795]: I0219 21:32:47.197627 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 19 21:32:47 crc kubenswrapper[4795]: I0219 21:32:47.198756 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 19 21:32:47 crc kubenswrapper[4795]: I0219 21:32:47.276792 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 19 21:32:47 crc kubenswrapper[4795]: I0219 21:32:47.372375 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 19 21:32:47 crc kubenswrapper[4795]: I0219 21:32:47.623629 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 19 21:32:47 crc kubenswrapper[4795]: I0219 21:32:47.669799 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 19 21:32:47 crc kubenswrapper[4795]: I0219 21:32:47.681936 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 19 21:32:47 crc kubenswrapper[4795]: I0219 21:32:47.719230 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 19 21:32:47 crc kubenswrapper[4795]: I0219 21:32:47.731844 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 19 21:32:47 crc kubenswrapper[4795]: I0219 21:32:47.749330 4795 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 19 21:32:47 crc kubenswrapper[4795]: I0219 21:32:47.786461 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 19 21:32:47 crc kubenswrapper[4795]: I0219 21:32:47.792333 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 19 21:32:47 crc kubenswrapper[4795]: I0219 21:32:47.858966 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 19 21:32:47 crc kubenswrapper[4795]: I0219 21:32:47.962823 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 19 21:32:48 crc kubenswrapper[4795]: I0219 21:32:48.002727 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 19 21:32:48 crc kubenswrapper[4795]: I0219 21:32:48.075573 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 19 21:32:48 crc kubenswrapper[4795]: I0219 21:32:48.117475 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 19 21:32:48 crc kubenswrapper[4795]: I0219 21:32:48.152919 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 19 21:32:48 crc kubenswrapper[4795]: I0219 21:32:48.488243 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 19 21:32:48 crc kubenswrapper[4795]: I0219 21:32:48.555994 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 19 21:32:48 crc kubenswrapper[4795]: I0219 21:32:48.755583 4795 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 19 21:32:48 crc kubenswrapper[4795]: I0219 21:32:48.851149 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 19 21:32:48 crc kubenswrapper[4795]: I0219 21:32:48.922309 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 19 21:32:48 crc kubenswrapper[4795]: I0219 21:32:48.937807 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 19 21:32:49 crc kubenswrapper[4795]: I0219 21:32:49.023357 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 19 21:32:49 crc kubenswrapper[4795]: I0219 21:32:49.159123 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 19 21:32:49 crc kubenswrapper[4795]: I0219 21:32:49.207150 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 19 21:32:49 crc kubenswrapper[4795]: I0219 21:32:49.230748 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 19 21:32:49 crc kubenswrapper[4795]: I0219 21:32:49.293298 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 19 21:32:49 crc kubenswrapper[4795]: I0219 21:32:49.429679 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 19 21:32:49 crc kubenswrapper[4795]: I0219 21:32:49.437146 4795 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 19 21:32:49 crc kubenswrapper[4795]: I0219 21:32:49.472943 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 19 21:32:49 crc kubenswrapper[4795]: I0219 21:32:49.728728 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 19 21:32:49 crc kubenswrapper[4795]: I0219 21:32:49.902434 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 19 21:32:49 crc kubenswrapper[4795]: I0219 21:32:49.902498 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 21:32:50 crc kubenswrapper[4795]: I0219 21:32:50.030194 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 19 21:32:50 crc kubenswrapper[4795]: I0219 21:32:50.030344 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 19 21:32:50 crc kubenswrapper[4795]: I0219 21:32:50.030437 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 19 21:32:50 crc kubenswrapper[4795]: I0219 21:32:50.030492 4795 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 19 21:32:50 crc kubenswrapper[4795]: I0219 21:32:50.030527 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 19 21:32:50 crc kubenswrapper[4795]: I0219 21:32:50.030579 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:32:50 crc kubenswrapper[4795]: I0219 21:32:50.030664 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:32:50 crc kubenswrapper[4795]: I0219 21:32:50.030682 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:32:50 crc kubenswrapper[4795]: I0219 21:32:50.030692 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:32:50 crc kubenswrapper[4795]: I0219 21:32:50.030902 4795 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Feb 19 21:32:50 crc kubenswrapper[4795]: I0219 21:32:50.030931 4795 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Feb 19 21:32:50 crc kubenswrapper[4795]: I0219 21:32:50.030950 4795 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Feb 19 21:32:50 crc kubenswrapper[4795]: I0219 21:32:50.038605 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:32:50 crc kubenswrapper[4795]: I0219 21:32:50.104930 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 19 21:32:50 crc kubenswrapper[4795]: I0219 21:32:50.132494 4795 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 19 21:32:50 crc kubenswrapper[4795]: I0219 21:32:50.132625 4795 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 19 21:32:50 crc kubenswrapper[4795]: I0219 21:32:50.304246 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 19 21:32:50 crc kubenswrapper[4795]: I0219 21:32:50.304295 4795 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="1104528b9e716ce5addd230534335514e5ee2436ce9ed811347dd6d853cd85d6" exitCode=137 Feb 19 21:32:50 crc kubenswrapper[4795]: I0219 21:32:50.304342 4795 scope.go:117] "RemoveContainer" containerID="1104528b9e716ce5addd230534335514e5ee2436ce9ed811347dd6d853cd85d6" Feb 19 21:32:50 crc kubenswrapper[4795]: I0219 21:32:50.304423 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 21:32:50 crc kubenswrapper[4795]: I0219 21:32:50.337066 4795 scope.go:117] "RemoveContainer" containerID="1104528b9e716ce5addd230534335514e5ee2436ce9ed811347dd6d853cd85d6" Feb 19 21:32:50 crc kubenswrapper[4795]: E0219 21:32:50.337557 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1104528b9e716ce5addd230534335514e5ee2436ce9ed811347dd6d853cd85d6\": container with ID starting with 1104528b9e716ce5addd230534335514e5ee2436ce9ed811347dd6d853cd85d6 not found: ID does not exist" containerID="1104528b9e716ce5addd230534335514e5ee2436ce9ed811347dd6d853cd85d6" Feb 19 21:32:50 crc kubenswrapper[4795]: I0219 21:32:50.337589 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1104528b9e716ce5addd230534335514e5ee2436ce9ed811347dd6d853cd85d6"} err="failed to get container status \"1104528b9e716ce5addd230534335514e5ee2436ce9ed811347dd6d853cd85d6\": rpc error: code = NotFound desc = could not find container \"1104528b9e716ce5addd230534335514e5ee2436ce9ed811347dd6d853cd85d6\": container with ID starting with 1104528b9e716ce5addd230534335514e5ee2436ce9ed811347dd6d853cd85d6 not found: ID does not exist" Feb 19 21:32:50 crc kubenswrapper[4795]: I0219 21:32:50.858308 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 19 21:32:51 crc kubenswrapper[4795]: I0219 21:32:51.414889 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 19 21:32:51 crc kubenswrapper[4795]: I0219 21:32:51.524470 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Feb 19 21:32:51 crc kubenswrapper[4795]: I0219 
21:32:51.539951 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 19 21:32:51 crc kubenswrapper[4795]: I0219 21:32:51.923652 4795 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 19 21:32:52 crc kubenswrapper[4795]: I0219 21:32:52.105049 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 19 21:32:52 crc kubenswrapper[4795]: I0219 21:32:52.823399 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.030381 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zzmtm"] Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.031337 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zzmtm" podUID="e8c7f503-32c4-4ca2-8435-9918cae8d931" containerName="registry-server" containerID="cri-o://e2b3b578e0a130165bd461f501cf99b487cf4f990927f2541dd89aab2055e28d" gracePeriod=30 Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.052637 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-c9sh5"] Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.053572 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-c9sh5" podUID="7ae7ca82-f2b1-4fec-9f66-732017519586" containerName="registry-server" containerID="cri-o://e2d47055ec62ef6c48d9090ad6f5104fd8d79584ae42ab1689cf9d640447aeec" gracePeriod=30 Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.080951 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dfw9n"] Feb 19 21:33:03 crc 
kubenswrapper[4795]: I0219 21:33:03.081221 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-dfw9n" podUID="7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca" containerName="marketplace-operator" containerID="cri-o://79fa299dc315e3fe30e63332104d23b13faff115fe64d3c739841e3e2664edb0" gracePeriod=30 Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.090596 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bmzl7"] Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.091070 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bmzl7" podUID="12e5472a-2c4b-4b71-91fb-06c3d5fcca54" containerName="registry-server" containerID="cri-o://124253c24d0ba04a222871263e3eb4ccb45c5fd2f1555778cc72b79051abee81" gracePeriod=30 Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.096379 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5tclw"] Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.096719 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5tclw" podUID="dd50e9de-1cec-4f7c-8308-e0e5eb7961b9" containerName="registry-server" containerID="cri-o://7c6c77a2d2c99d511b04824d382fd22558c9e465bd1270597992cd286f73dd2f" gracePeriod=30 Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.102263 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-n9qlf"] Feb 19 21:33:03 crc kubenswrapper[4795]: E0219 21:33:03.102578 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.102597 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" 
containerName="startup-monitor" Feb 19 21:33:03 crc kubenswrapper[4795]: E0219 21:33:03.102609 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6" containerName="installer" Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.102618 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6" containerName="installer" Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.102757 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.102792 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b9bcd07-a45a-4f32-9d56-9ebb0931b1a6" containerName="installer" Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.103348 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-n9qlf" Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.115508 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-n9qlf"] Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.191489 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c91304a6-fa59-4df4-aa17-d7d2f73d9103-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-n9qlf\" (UID: \"c91304a6-fa59-4df4-aa17-d7d2f73d9103\") " pod="openshift-marketplace/marketplace-operator-79b997595-n9qlf" Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.191586 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7t9lt\" (UniqueName: \"kubernetes.io/projected/c91304a6-fa59-4df4-aa17-d7d2f73d9103-kube-api-access-7t9lt\") pod \"marketplace-operator-79b997595-n9qlf\" 
(UID: \"c91304a6-fa59-4df4-aa17-d7d2f73d9103\") " pod="openshift-marketplace/marketplace-operator-79b997595-n9qlf" Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.191630 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c91304a6-fa59-4df4-aa17-d7d2f73d9103-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-n9qlf\" (UID: \"c91304a6-fa59-4df4-aa17-d7d2f73d9103\") " pod="openshift-marketplace/marketplace-operator-79b997595-n9qlf" Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.292682 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7t9lt\" (UniqueName: \"kubernetes.io/projected/c91304a6-fa59-4df4-aa17-d7d2f73d9103-kube-api-access-7t9lt\") pod \"marketplace-operator-79b997595-n9qlf\" (UID: \"c91304a6-fa59-4df4-aa17-d7d2f73d9103\") " pod="openshift-marketplace/marketplace-operator-79b997595-n9qlf" Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.293029 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c91304a6-fa59-4df4-aa17-d7d2f73d9103-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-n9qlf\" (UID: \"c91304a6-fa59-4df4-aa17-d7d2f73d9103\") " pod="openshift-marketplace/marketplace-operator-79b997595-n9qlf" Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.293069 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c91304a6-fa59-4df4-aa17-d7d2f73d9103-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-n9qlf\" (UID: \"c91304a6-fa59-4df4-aa17-d7d2f73d9103\") " pod="openshift-marketplace/marketplace-operator-79b997595-n9qlf" Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.295291 4795 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c91304a6-fa59-4df4-aa17-d7d2f73d9103-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-n9qlf\" (UID: \"c91304a6-fa59-4df4-aa17-d7d2f73d9103\") " pod="openshift-marketplace/marketplace-operator-79b997595-n9qlf" Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.298724 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c91304a6-fa59-4df4-aa17-d7d2f73d9103-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-n9qlf\" (UID: \"c91304a6-fa59-4df4-aa17-d7d2f73d9103\") " pod="openshift-marketplace/marketplace-operator-79b997595-n9qlf" Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.314140 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7t9lt\" (UniqueName: \"kubernetes.io/projected/c91304a6-fa59-4df4-aa17-d7d2f73d9103-kube-api-access-7t9lt\") pod \"marketplace-operator-79b997595-n9qlf\" (UID: \"c91304a6-fa59-4df4-aa17-d7d2f73d9103\") " pod="openshift-marketplace/marketplace-operator-79b997595-n9qlf" Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.396718 4795 generic.go:334] "Generic (PLEG): container finished" podID="dd50e9de-1cec-4f7c-8308-e0e5eb7961b9" containerID="7c6c77a2d2c99d511b04824d382fd22558c9e465bd1270597992cd286f73dd2f" exitCode=0 Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.396756 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5tclw" event={"ID":"dd50e9de-1cec-4f7c-8308-e0e5eb7961b9","Type":"ContainerDied","Data":"7c6c77a2d2c99d511b04824d382fd22558c9e465bd1270597992cd286f73dd2f"} Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.398921 4795 generic.go:334] "Generic (PLEG): container finished" podID="e8c7f503-32c4-4ca2-8435-9918cae8d931" containerID="e2b3b578e0a130165bd461f501cf99b487cf4f990927f2541dd89aab2055e28d" exitCode=0 Feb 19 
21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.398975 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zzmtm" event={"ID":"e8c7f503-32c4-4ca2-8435-9918cae8d931","Type":"ContainerDied","Data":"e2b3b578e0a130165bd461f501cf99b487cf4f990927f2541dd89aab2055e28d"} Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.399000 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zzmtm" event={"ID":"e8c7f503-32c4-4ca2-8435-9918cae8d931","Type":"ContainerDied","Data":"1cc35dfba309639dbd91f218472ea2c5630e482b3631fbe826d36494a013aa5f"} Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.399014 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1cc35dfba309639dbd91f218472ea2c5630e482b3631fbe826d36494a013aa5f" Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.401108 4795 generic.go:334] "Generic (PLEG): container finished" podID="12e5472a-2c4b-4b71-91fb-06c3d5fcca54" containerID="124253c24d0ba04a222871263e3eb4ccb45c5fd2f1555778cc72b79051abee81" exitCode=0 Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.401183 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bmzl7" event={"ID":"12e5472a-2c4b-4b71-91fb-06c3d5fcca54","Type":"ContainerDied","Data":"124253c24d0ba04a222871263e3eb4ccb45c5fd2f1555778cc72b79051abee81"} Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.405367 4795 generic.go:334] "Generic (PLEG): container finished" podID="7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca" containerID="79fa299dc315e3fe30e63332104d23b13faff115fe64d3c739841e3e2664edb0" exitCode=0 Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.405449 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-dfw9n" 
event={"ID":"7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca","Type":"ContainerDied","Data":"79fa299dc315e3fe30e63332104d23b13faff115fe64d3c739841e3e2664edb0"} Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.411411 4795 generic.go:334] "Generic (PLEG): container finished" podID="7ae7ca82-f2b1-4fec-9f66-732017519586" containerID="e2d47055ec62ef6c48d9090ad6f5104fd8d79584ae42ab1689cf9d640447aeec" exitCode=0 Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.411450 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c9sh5" event={"ID":"7ae7ca82-f2b1-4fec-9f66-732017519586","Type":"ContainerDied","Data":"e2d47055ec62ef6c48d9090ad6f5104fd8d79584ae42ab1689cf9d640447aeec"} Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.411500 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c9sh5" event={"ID":"7ae7ca82-f2b1-4fec-9f66-732017519586","Type":"ContainerDied","Data":"28fe4330f368c52acd76e8871982506eeb01297bf98866cc2f51b2139ec4aa1c"} Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.411513 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="28fe4330f368c52acd76e8871982506eeb01297bf98866cc2f51b2139ec4aa1c" Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.491610 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-n9qlf" Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.493972 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zzmtm" Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.498136 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-c9sh5" Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.502119 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bmzl7" Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.508863 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-dfw9n" Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.522366 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5tclw" Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.596376 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lns2m\" (UniqueName: \"kubernetes.io/projected/7ae7ca82-f2b1-4fec-9f66-732017519586-kube-api-access-lns2m\") pod \"7ae7ca82-f2b1-4fec-9f66-732017519586\" (UID: \"7ae7ca82-f2b1-4fec-9f66-732017519586\") " Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.596474 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8bcw\" (UniqueName: \"kubernetes.io/projected/7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca-kube-api-access-t8bcw\") pod \"7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca\" (UID: \"7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca\") " Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.596532 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ae7ca82-f2b1-4fec-9f66-732017519586-utilities\") pod \"7ae7ca82-f2b1-4fec-9f66-732017519586\" (UID: \"7ae7ca82-f2b1-4fec-9f66-732017519586\") " Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.596626 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ae7ca82-f2b1-4fec-9f66-732017519586-catalog-content\") pod \"7ae7ca82-f2b1-4fec-9f66-732017519586\" (UID: \"7ae7ca82-f2b1-4fec-9f66-732017519586\") " Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.596656 4795 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12e5472a-2c4b-4b71-91fb-06c3d5fcca54-utilities\") pod \"12e5472a-2c4b-4b71-91fb-06c3d5fcca54\" (UID: \"12e5472a-2c4b-4b71-91fb-06c3d5fcca54\") " Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.596798 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca-marketplace-trusted-ca\") pod \"7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca\" (UID: \"7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca\") " Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.596861 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd50e9de-1cec-4f7c-8308-e0e5eb7961b9-utilities\") pod \"dd50e9de-1cec-4f7c-8308-e0e5eb7961b9\" (UID: \"dd50e9de-1cec-4f7c-8308-e0e5eb7961b9\") " Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.596894 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krx4q\" (UniqueName: \"kubernetes.io/projected/e8c7f503-32c4-4ca2-8435-9918cae8d931-kube-api-access-krx4q\") pod \"e8c7f503-32c4-4ca2-8435-9918cae8d931\" (UID: \"e8c7f503-32c4-4ca2-8435-9918cae8d931\") " Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.596951 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5vg2\" (UniqueName: \"kubernetes.io/projected/12e5472a-2c4b-4b71-91fb-06c3d5fcca54-kube-api-access-m5vg2\") pod \"12e5472a-2c4b-4b71-91fb-06c3d5fcca54\" (UID: \"12e5472a-2c4b-4b71-91fb-06c3d5fcca54\") " Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.596968 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8c7f503-32c4-4ca2-8435-9918cae8d931-utilities\") pod 
\"e8c7f503-32c4-4ca2-8435-9918cae8d931\" (UID: \"e8c7f503-32c4-4ca2-8435-9918cae8d931\") " Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.597037 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12e5472a-2c4b-4b71-91fb-06c3d5fcca54-catalog-content\") pod \"12e5472a-2c4b-4b71-91fb-06c3d5fcca54\" (UID: \"12e5472a-2c4b-4b71-91fb-06c3d5fcca54\") " Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.597120 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ddlx\" (UniqueName: \"kubernetes.io/projected/dd50e9de-1cec-4f7c-8308-e0e5eb7961b9-kube-api-access-5ddlx\") pod \"dd50e9de-1cec-4f7c-8308-e0e5eb7961b9\" (UID: \"dd50e9de-1cec-4f7c-8308-e0e5eb7961b9\") " Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.597154 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8c7f503-32c4-4ca2-8435-9918cae8d931-catalog-content\") pod \"e8c7f503-32c4-4ca2-8435-9918cae8d931\" (UID: \"e8c7f503-32c4-4ca2-8435-9918cae8d931\") " Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.597305 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd50e9de-1cec-4f7c-8308-e0e5eb7961b9-catalog-content\") pod \"dd50e9de-1cec-4f7c-8308-e0e5eb7961b9\" (UID: \"dd50e9de-1cec-4f7c-8308-e0e5eb7961b9\") " Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.597378 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca-marketplace-operator-metrics\") pod \"7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca\" (UID: \"7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca\") " Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.599273 4795 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca" (UID: "7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.599882 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8c7f503-32c4-4ca2-8435-9918cae8d931-utilities" (OuterVolumeSpecName: "utilities") pod "e8c7f503-32c4-4ca2-8435-9918cae8d931" (UID: "e8c7f503-32c4-4ca2-8435-9918cae8d931"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.600052 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ae7ca82-f2b1-4fec-9f66-732017519586-utilities" (OuterVolumeSpecName: "utilities") pod "7ae7ca82-f2b1-4fec-9f66-732017519586" (UID: "7ae7ca82-f2b1-4fec-9f66-732017519586"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.601566 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd50e9de-1cec-4f7c-8308-e0e5eb7961b9-utilities" (OuterVolumeSpecName: "utilities") pod "dd50e9de-1cec-4f7c-8308-e0e5eb7961b9" (UID: "dd50e9de-1cec-4f7c-8308-e0e5eb7961b9"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.603396 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ae7ca82-f2b1-4fec-9f66-732017519586-kube-api-access-lns2m" (OuterVolumeSpecName: "kube-api-access-lns2m") pod "7ae7ca82-f2b1-4fec-9f66-732017519586" (UID: "7ae7ca82-f2b1-4fec-9f66-732017519586"). InnerVolumeSpecName "kube-api-access-lns2m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.603959 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12e5472a-2c4b-4b71-91fb-06c3d5fcca54-kube-api-access-m5vg2" (OuterVolumeSpecName: "kube-api-access-m5vg2") pod "12e5472a-2c4b-4b71-91fb-06c3d5fcca54" (UID: "12e5472a-2c4b-4b71-91fb-06c3d5fcca54"). InnerVolumeSpecName "kube-api-access-m5vg2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.604548 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8c7f503-32c4-4ca2-8435-9918cae8d931-kube-api-access-krx4q" (OuterVolumeSpecName: "kube-api-access-krx4q") pod "e8c7f503-32c4-4ca2-8435-9918cae8d931" (UID: "e8c7f503-32c4-4ca2-8435-9918cae8d931"). InnerVolumeSpecName "kube-api-access-krx4q". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.604744 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd50e9de-1cec-4f7c-8308-e0e5eb7961b9-kube-api-access-5ddlx" (OuterVolumeSpecName: "kube-api-access-5ddlx") pod "dd50e9de-1cec-4f7c-8308-e0e5eb7961b9" (UID: "dd50e9de-1cec-4f7c-8308-e0e5eb7961b9"). InnerVolumeSpecName "kube-api-access-5ddlx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.605480 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca" (UID: "7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.606780 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12e5472a-2c4b-4b71-91fb-06c3d5fcca54-utilities" (OuterVolumeSpecName: "utilities") pod "12e5472a-2c4b-4b71-91fb-06c3d5fcca54" (UID: "12e5472a-2c4b-4b71-91fb-06c3d5fcca54"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.626810 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca-kube-api-access-t8bcw" (OuterVolumeSpecName: "kube-api-access-t8bcw") pod "7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca" (UID: "7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca"). InnerVolumeSpecName "kube-api-access-t8bcw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.659676 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ae7ca82-f2b1-4fec-9f66-732017519586-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7ae7ca82-f2b1-4fec-9f66-732017519586" (UID: "7ae7ca82-f2b1-4fec-9f66-732017519586"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.659802 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12e5472a-2c4b-4b71-91fb-06c3d5fcca54-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "12e5472a-2c4b-4b71-91fb-06c3d5fcca54" (UID: "12e5472a-2c4b-4b71-91fb-06c3d5fcca54"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.673001 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8c7f503-32c4-4ca2-8435-9918cae8d931-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e8c7f503-32c4-4ca2-8435-9918cae8d931" (UID: "e8c7f503-32c4-4ca2-8435-9918cae8d931"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.699951 4795 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.699984 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lns2m\" (UniqueName: \"kubernetes.io/projected/7ae7ca82-f2b1-4fec-9f66-732017519586-kube-api-access-lns2m\") on node \"crc\" DevicePath \"\"" Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.699993 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8bcw\" (UniqueName: \"kubernetes.io/projected/7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca-kube-api-access-t8bcw\") on node \"crc\" DevicePath \"\"" Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.700002 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/7ae7ca82-f2b1-4fec-9f66-732017519586-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.700012 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12e5472a-2c4b-4b71-91fb-06c3d5fcca54-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.700019 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ae7ca82-f2b1-4fec-9f66-732017519586-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.700027 4795 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.700034 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd50e9de-1cec-4f7c-8308-e0e5eb7961b9-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.700042 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8c7f503-32c4-4ca2-8435-9918cae8d931-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.700050 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krx4q\" (UniqueName: \"kubernetes.io/projected/e8c7f503-32c4-4ca2-8435-9918cae8d931-kube-api-access-krx4q\") on node \"crc\" DevicePath \"\"" Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.700058 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5vg2\" (UniqueName: \"kubernetes.io/projected/12e5472a-2c4b-4b71-91fb-06c3d5fcca54-kube-api-access-m5vg2\") on node \"crc\" DevicePath 
\"\"" Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.700066 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12e5472a-2c4b-4b71-91fb-06c3d5fcca54-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.700074 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5ddlx\" (UniqueName: \"kubernetes.io/projected/dd50e9de-1cec-4f7c-8308-e0e5eb7961b9-kube-api-access-5ddlx\") on node \"crc\" DevicePath \"\"" Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.700082 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8c7f503-32c4-4ca2-8435-9918cae8d931-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.783226 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd50e9de-1cec-4f7c-8308-e0e5eb7961b9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dd50e9de-1cec-4f7c-8308-e0e5eb7961b9" (UID: "dd50e9de-1cec-4f7c-8308-e0e5eb7961b9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.801567 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd50e9de-1cec-4f7c-8308-e0e5eb7961b9-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 21:33:03 crc kubenswrapper[4795]: I0219 21:33:03.906454 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-n9qlf"] Feb 19 21:33:04 crc kubenswrapper[4795]: I0219 21:33:04.418281 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bmzl7" Feb 19 21:33:04 crc kubenswrapper[4795]: I0219 21:33:04.418281 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bmzl7" event={"ID":"12e5472a-2c4b-4b71-91fb-06c3d5fcca54","Type":"ContainerDied","Data":"e5d0662258d341fed3d8a6fa8a495b8db4873d0c1396a7edd323344c9bdab748"} Feb 19 21:33:04 crc kubenswrapper[4795]: I0219 21:33:04.418788 4795 scope.go:117] "RemoveContainer" containerID="124253c24d0ba04a222871263e3eb4ccb45c5fd2f1555778cc72b79051abee81" Feb 19 21:33:04 crc kubenswrapper[4795]: I0219 21:33:04.420291 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-dfw9n" Feb 19 21:33:04 crc kubenswrapper[4795]: I0219 21:33:04.420294 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-dfw9n" event={"ID":"7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca","Type":"ContainerDied","Data":"365d5c2e07de412e6c9e8f0e65078f4ceb7110e13c2cb20266daf040eaf8acbd"} Feb 19 21:33:04 crc kubenswrapper[4795]: I0219 21:33:04.423615 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5tclw" event={"ID":"dd50e9de-1cec-4f7c-8308-e0e5eb7961b9","Type":"ContainerDied","Data":"d567b35b55d0a2cbb795271be9aef7ece38aac167cf48328a27f71f0b916ce76"} Feb 19 21:33:04 crc kubenswrapper[4795]: I0219 21:33:04.423660 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5tclw" Feb 19 21:33:04 crc kubenswrapper[4795]: I0219 21:33:04.426181 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-c9sh5" Feb 19 21:33:04 crc kubenswrapper[4795]: I0219 21:33:04.426197 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-n9qlf" event={"ID":"c91304a6-fa59-4df4-aa17-d7d2f73d9103","Type":"ContainerStarted","Data":"64e2127208ee2ac235c32326fdf8a8cd7616f06d53e17bfecc0502fae0f15b99"} Feb 19 21:33:04 crc kubenswrapper[4795]: I0219 21:33:04.426239 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-n9qlf" event={"ID":"c91304a6-fa59-4df4-aa17-d7d2f73d9103","Type":"ContainerStarted","Data":"63915007f0fc176b803854f4170846dae0c676210a4c7f26666d321c1ca4538c"} Feb 19 21:33:04 crc kubenswrapper[4795]: I0219 21:33:04.426479 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zzmtm" Feb 19 21:33:04 crc kubenswrapper[4795]: I0219 21:33:04.443502 4795 scope.go:117] "RemoveContainer" containerID="cb4588aca4785a571a304fb53b83b17a5d4fe720455aa15c94f9117f8cbe5514" Feb 19 21:33:04 crc kubenswrapper[4795]: I0219 21:33:04.446613 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-n9qlf" podStartSLOduration=1.446592772 podStartE2EDuration="1.446592772s" podCreationTimestamp="2026-02-19 21:33:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:33:04.445256785 +0000 UTC m=+295.637774649" watchObservedRunningTime="2026-02-19 21:33:04.446592772 +0000 UTC m=+295.639110636" Feb 19 21:33:04 crc kubenswrapper[4795]: I0219 21:33:04.467845 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bmzl7"] Feb 19 21:33:04 crc kubenswrapper[4795]: I0219 21:33:04.471341 4795 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openshift-marketplace/redhat-marketplace-bmzl7"] Feb 19 21:33:04 crc kubenswrapper[4795]: I0219 21:33:04.480090 4795 scope.go:117] "RemoveContainer" containerID="7d3a5224f2284bc0e0b27ce31d0a2ef7f65f8de8a465113fd61e0813421d7fde" Feb 19 21:33:04 crc kubenswrapper[4795]: I0219 21:33:04.496417 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-c9sh5"] Feb 19 21:33:04 crc kubenswrapper[4795]: I0219 21:33:04.504321 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-c9sh5"] Feb 19 21:33:04 crc kubenswrapper[4795]: I0219 21:33:04.511672 4795 scope.go:117] "RemoveContainer" containerID="79fa299dc315e3fe30e63332104d23b13faff115fe64d3c739841e3e2664edb0" Feb 19 21:33:04 crc kubenswrapper[4795]: I0219 21:33:04.521037 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5tclw"] Feb 19 21:33:04 crc kubenswrapper[4795]: I0219 21:33:04.524133 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5tclw"] Feb 19 21:33:04 crc kubenswrapper[4795]: I0219 21:33:04.530218 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zzmtm"] Feb 19 21:33:04 crc kubenswrapper[4795]: I0219 21:33:04.534756 4795 scope.go:117] "RemoveContainer" containerID="7c6c77a2d2c99d511b04824d382fd22558c9e465bd1270597992cd286f73dd2f" Feb 19 21:33:04 crc kubenswrapper[4795]: I0219 21:33:04.536073 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zzmtm"] Feb 19 21:33:04 crc kubenswrapper[4795]: I0219 21:33:04.543679 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dfw9n"] Feb 19 21:33:04 crc kubenswrapper[4795]: I0219 21:33:04.551722 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/marketplace-operator-79b997595-dfw9n"] Feb 19 21:33:04 crc kubenswrapper[4795]: I0219 21:33:04.564359 4795 scope.go:117] "RemoveContainer" containerID="1fec75a7b046544f542f990a4f1a07786af212b43341a6cab76f9622d84d42c4" Feb 19 21:33:04 crc kubenswrapper[4795]: I0219 21:33:04.584997 4795 scope.go:117] "RemoveContainer" containerID="e32b30648a4cba474dcfdca283954a8496e4f09ea18f9bc9f3563fdbbe7453f6" Feb 19 21:33:05 crc kubenswrapper[4795]: I0219 21:33:05.435903 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-n9qlf" Feb 19 21:33:05 crc kubenswrapper[4795]: I0219 21:33:05.439281 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-n9qlf" Feb 19 21:33:05 crc kubenswrapper[4795]: I0219 21:33:05.518338 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12e5472a-2c4b-4b71-91fb-06c3d5fcca54" path="/var/lib/kubelet/pods/12e5472a-2c4b-4b71-91fb-06c3d5fcca54/volumes" Feb 19 21:33:05 crc kubenswrapper[4795]: I0219 21:33:05.518938 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ae7ca82-f2b1-4fec-9f66-732017519586" path="/var/lib/kubelet/pods/7ae7ca82-f2b1-4fec-9f66-732017519586/volumes" Feb 19 21:33:05 crc kubenswrapper[4795]: I0219 21:33:05.519519 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca" path="/var/lib/kubelet/pods/7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca/volumes" Feb 19 21:33:05 crc kubenswrapper[4795]: I0219 21:33:05.520345 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd50e9de-1cec-4f7c-8308-e0e5eb7961b9" path="/var/lib/kubelet/pods/dd50e9de-1cec-4f7c-8308-e0e5eb7961b9/volumes" Feb 19 21:33:05 crc kubenswrapper[4795]: I0219 21:33:05.520868 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="e8c7f503-32c4-4ca2-8435-9918cae8d931" path="/var/lib/kubelet/pods/e8c7f503-32c4-4ca2-8435-9918cae8d931/volumes" Feb 19 21:33:09 crc kubenswrapper[4795]: I0219 21:33:09.299196 4795 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Feb 19 21:33:15 crc kubenswrapper[4795]: I0219 21:33:15.290215 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7ph8l"] Feb 19 21:33:15 crc kubenswrapper[4795]: I0219 21:33:15.290686 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-7ph8l" podUID="86ffb50f-47f6-47b2-9141-1de9999a13e0" containerName="controller-manager" containerID="cri-o://71e5ceb34779a2773b15e3bbb07890cf9d8cbc1dd13d422577cbf3017f33a99e" gracePeriod=30 Feb 19 21:33:15 crc kubenswrapper[4795]: I0219 21:33:15.420611 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-vtqjw"] Feb 19 21:33:15 crc kubenswrapper[4795]: I0219 21:33:15.421060 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vtqjw" podUID="102f7fb5-3031-4853-b112-2aa910aa63a7" containerName="route-controller-manager" containerID="cri-o://09bfebe5361b480de9eacdb44947d498846179bb75edaddc250a409a01766d3b" gracePeriod=30 Feb 19 21:33:15 crc kubenswrapper[4795]: I0219 21:33:15.493634 4795 generic.go:334] "Generic (PLEG): container finished" podID="86ffb50f-47f6-47b2-9141-1de9999a13e0" containerID="71e5ceb34779a2773b15e3bbb07890cf9d8cbc1dd13d422577cbf3017f33a99e" exitCode=0 Feb 19 21:33:15 crc kubenswrapper[4795]: I0219 21:33:15.493675 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-7ph8l" 
event={"ID":"86ffb50f-47f6-47b2-9141-1de9999a13e0","Type":"ContainerDied","Data":"71e5ceb34779a2773b15e3bbb07890cf9d8cbc1dd13d422577cbf3017f33a99e"} Feb 19 21:33:15 crc kubenswrapper[4795]: I0219 21:33:15.677514 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-7ph8l" Feb 19 21:33:15 crc kubenswrapper[4795]: I0219 21:33:15.747953 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rvcj\" (UniqueName: \"kubernetes.io/projected/86ffb50f-47f6-47b2-9141-1de9999a13e0-kube-api-access-4rvcj\") pod \"86ffb50f-47f6-47b2-9141-1de9999a13e0\" (UID: \"86ffb50f-47f6-47b2-9141-1de9999a13e0\") " Feb 19 21:33:15 crc kubenswrapper[4795]: I0219 21:33:15.747993 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/86ffb50f-47f6-47b2-9141-1de9999a13e0-proxy-ca-bundles\") pod \"86ffb50f-47f6-47b2-9141-1de9999a13e0\" (UID: \"86ffb50f-47f6-47b2-9141-1de9999a13e0\") " Feb 19 21:33:15 crc kubenswrapper[4795]: I0219 21:33:15.748055 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/86ffb50f-47f6-47b2-9141-1de9999a13e0-client-ca\") pod \"86ffb50f-47f6-47b2-9141-1de9999a13e0\" (UID: \"86ffb50f-47f6-47b2-9141-1de9999a13e0\") " Feb 19 21:33:15 crc kubenswrapper[4795]: I0219 21:33:15.748116 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86ffb50f-47f6-47b2-9141-1de9999a13e0-config\") pod \"86ffb50f-47f6-47b2-9141-1de9999a13e0\" (UID: \"86ffb50f-47f6-47b2-9141-1de9999a13e0\") " Feb 19 21:33:15 crc kubenswrapper[4795]: I0219 21:33:15.748146 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/86ffb50f-47f6-47b2-9141-1de9999a13e0-serving-cert\") pod \"86ffb50f-47f6-47b2-9141-1de9999a13e0\" (UID: \"86ffb50f-47f6-47b2-9141-1de9999a13e0\") " Feb 19 21:33:15 crc kubenswrapper[4795]: I0219 21:33:15.749013 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86ffb50f-47f6-47b2-9141-1de9999a13e0-config" (OuterVolumeSpecName: "config") pod "86ffb50f-47f6-47b2-9141-1de9999a13e0" (UID: "86ffb50f-47f6-47b2-9141-1de9999a13e0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:33:15 crc kubenswrapper[4795]: I0219 21:33:15.749083 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86ffb50f-47f6-47b2-9141-1de9999a13e0-client-ca" (OuterVolumeSpecName: "client-ca") pod "86ffb50f-47f6-47b2-9141-1de9999a13e0" (UID: "86ffb50f-47f6-47b2-9141-1de9999a13e0"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:33:15 crc kubenswrapper[4795]: I0219 21:33:15.749333 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86ffb50f-47f6-47b2-9141-1de9999a13e0-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "86ffb50f-47f6-47b2-9141-1de9999a13e0" (UID: "86ffb50f-47f6-47b2-9141-1de9999a13e0"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:33:15 crc kubenswrapper[4795]: I0219 21:33:15.753682 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86ffb50f-47f6-47b2-9141-1de9999a13e0-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "86ffb50f-47f6-47b2-9141-1de9999a13e0" (UID: "86ffb50f-47f6-47b2-9141-1de9999a13e0"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:33:15 crc kubenswrapper[4795]: I0219 21:33:15.754344 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86ffb50f-47f6-47b2-9141-1de9999a13e0-kube-api-access-4rvcj" (OuterVolumeSpecName: "kube-api-access-4rvcj") pod "86ffb50f-47f6-47b2-9141-1de9999a13e0" (UID: "86ffb50f-47f6-47b2-9141-1de9999a13e0"). InnerVolumeSpecName "kube-api-access-4rvcj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:33:15 crc kubenswrapper[4795]: I0219 21:33:15.772065 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vtqjw" Feb 19 21:33:15 crc kubenswrapper[4795]: I0219 21:33:15.848737 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/102f7fb5-3031-4853-b112-2aa910aa63a7-config\") pod \"102f7fb5-3031-4853-b112-2aa910aa63a7\" (UID: \"102f7fb5-3031-4853-b112-2aa910aa63a7\") " Feb 19 21:33:15 crc kubenswrapper[4795]: I0219 21:33:15.848805 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/102f7fb5-3031-4853-b112-2aa910aa63a7-serving-cert\") pod \"102f7fb5-3031-4853-b112-2aa910aa63a7\" (UID: \"102f7fb5-3031-4853-b112-2aa910aa63a7\") " Feb 19 21:33:15 crc kubenswrapper[4795]: I0219 21:33:15.848849 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/102f7fb5-3031-4853-b112-2aa910aa63a7-client-ca\") pod \"102f7fb5-3031-4853-b112-2aa910aa63a7\" (UID: \"102f7fb5-3031-4853-b112-2aa910aa63a7\") " Feb 19 21:33:15 crc kubenswrapper[4795]: I0219 21:33:15.848924 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-546h9\" (UniqueName: 
\"kubernetes.io/projected/102f7fb5-3031-4853-b112-2aa910aa63a7-kube-api-access-546h9\") pod \"102f7fb5-3031-4853-b112-2aa910aa63a7\" (UID: \"102f7fb5-3031-4853-b112-2aa910aa63a7\") " Feb 19 21:33:15 crc kubenswrapper[4795]: I0219 21:33:15.849629 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/102f7fb5-3031-4853-b112-2aa910aa63a7-client-ca" (OuterVolumeSpecName: "client-ca") pod "102f7fb5-3031-4853-b112-2aa910aa63a7" (UID: "102f7fb5-3031-4853-b112-2aa910aa63a7"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:33:15 crc kubenswrapper[4795]: I0219 21:33:15.849650 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/102f7fb5-3031-4853-b112-2aa910aa63a7-config" (OuterVolumeSpecName: "config") pod "102f7fb5-3031-4853-b112-2aa910aa63a7" (UID: "102f7fb5-3031-4853-b112-2aa910aa63a7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:33:15 crc kubenswrapper[4795]: I0219 21:33:15.849918 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86ffb50f-47f6-47b2-9141-1de9999a13e0-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:33:15 crc kubenswrapper[4795]: I0219 21:33:15.849943 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86ffb50f-47f6-47b2-9141-1de9999a13e0-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:33:15 crc kubenswrapper[4795]: I0219 21:33:15.849957 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4rvcj\" (UniqueName: \"kubernetes.io/projected/86ffb50f-47f6-47b2-9141-1de9999a13e0-kube-api-access-4rvcj\") on node \"crc\" DevicePath \"\"" Feb 19 21:33:15 crc kubenswrapper[4795]: I0219 21:33:15.849971 4795 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/86ffb50f-47f6-47b2-9141-1de9999a13e0-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 19 21:33:15 crc kubenswrapper[4795]: I0219 21:33:15.849982 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/102f7fb5-3031-4853-b112-2aa910aa63a7-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:33:15 crc kubenswrapper[4795]: I0219 21:33:15.849993 4795 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/86ffb50f-47f6-47b2-9141-1de9999a13e0-client-ca\") on node \"crc\" DevicePath \"\"" Feb 19 21:33:15 crc kubenswrapper[4795]: I0219 21:33:15.850004 4795 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/102f7fb5-3031-4853-b112-2aa910aa63a7-client-ca\") on node \"crc\" DevicePath \"\"" Feb 19 21:33:15 crc kubenswrapper[4795]: I0219 21:33:15.852112 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/102f7fb5-3031-4853-b112-2aa910aa63a7-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "102f7fb5-3031-4853-b112-2aa910aa63a7" (UID: "102f7fb5-3031-4853-b112-2aa910aa63a7"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:33:15 crc kubenswrapper[4795]: I0219 21:33:15.852413 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/102f7fb5-3031-4853-b112-2aa910aa63a7-kube-api-access-546h9" (OuterVolumeSpecName: "kube-api-access-546h9") pod "102f7fb5-3031-4853-b112-2aa910aa63a7" (UID: "102f7fb5-3031-4853-b112-2aa910aa63a7"). InnerVolumeSpecName "kube-api-access-546h9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:33:15 crc kubenswrapper[4795]: I0219 21:33:15.951489 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/102f7fb5-3031-4853-b112-2aa910aa63a7-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:33:15 crc kubenswrapper[4795]: I0219 21:33:15.951527 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-546h9\" (UniqueName: \"kubernetes.io/projected/102f7fb5-3031-4853-b112-2aa910aa63a7-kube-api-access-546h9\") on node \"crc\" DevicePath \"\"" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.499296 4795 generic.go:334] "Generic (PLEG): container finished" podID="102f7fb5-3031-4853-b112-2aa910aa63a7" containerID="09bfebe5361b480de9eacdb44947d498846179bb75edaddc250a409a01766d3b" exitCode=0 Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.499339 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vtqjw" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.499382 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vtqjw" event={"ID":"102f7fb5-3031-4853-b112-2aa910aa63a7","Type":"ContainerDied","Data":"09bfebe5361b480de9eacdb44947d498846179bb75edaddc250a409a01766d3b"} Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.499412 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-vtqjw" event={"ID":"102f7fb5-3031-4853-b112-2aa910aa63a7","Type":"ContainerDied","Data":"f4d807af544e927e81a81905631510e6f7454a6d612cc5078b9fdd6b9b356c32"} Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.499429 4795 scope.go:117] "RemoveContainer" containerID="09bfebe5361b480de9eacdb44947d498846179bb75edaddc250a409a01766d3b" Feb 19 21:33:16 crc 
kubenswrapper[4795]: I0219 21:33:16.500745 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-7ph8l" event={"ID":"86ffb50f-47f6-47b2-9141-1de9999a13e0","Type":"ContainerDied","Data":"f50c3553621e34238711ac41e2e592ef162e4af963002aedb152ce56da5992e5"} Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.500819 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-7ph8l" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.513819 4795 scope.go:117] "RemoveContainer" containerID="09bfebe5361b480de9eacdb44947d498846179bb75edaddc250a409a01766d3b" Feb 19 21:33:16 crc kubenswrapper[4795]: E0219 21:33:16.514435 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09bfebe5361b480de9eacdb44947d498846179bb75edaddc250a409a01766d3b\": container with ID starting with 09bfebe5361b480de9eacdb44947d498846179bb75edaddc250a409a01766d3b not found: ID does not exist" containerID="09bfebe5361b480de9eacdb44947d498846179bb75edaddc250a409a01766d3b" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.514515 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09bfebe5361b480de9eacdb44947d498846179bb75edaddc250a409a01766d3b"} err="failed to get container status \"09bfebe5361b480de9eacdb44947d498846179bb75edaddc250a409a01766d3b\": rpc error: code = NotFound desc = could not find container \"09bfebe5361b480de9eacdb44947d498846179bb75edaddc250a409a01766d3b\": container with ID starting with 09bfebe5361b480de9eacdb44947d498846179bb75edaddc250a409a01766d3b not found: ID does not exist" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.514553 4795 scope.go:117] "RemoveContainer" containerID="71e5ceb34779a2773b15e3bbb07890cf9d8cbc1dd13d422577cbf3017f33a99e" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 
21:33:16.534104 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7ph8l"] Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.540251 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7ph8l"] Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.543790 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-vtqjw"] Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.550734 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-vtqjw"] Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.590676 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-658fd5994d-4zhjv"] Feb 19 21:33:16 crc kubenswrapper[4795]: E0219 21:33:16.591107 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8c7f503-32c4-4ca2-8435-9918cae8d931" containerName="registry-server" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.591127 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8c7f503-32c4-4ca2-8435-9918cae8d931" containerName="registry-server" Feb 19 21:33:16 crc kubenswrapper[4795]: E0219 21:33:16.591138 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8c7f503-32c4-4ca2-8435-9918cae8d931" containerName="extract-content" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.591143 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8c7f503-32c4-4ca2-8435-9918cae8d931" containerName="extract-content" Feb 19 21:33:16 crc kubenswrapper[4795]: E0219 21:33:16.591155 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ae7ca82-f2b1-4fec-9f66-732017519586" containerName="extract-utilities" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.591164 4795 
state_mem.go:107] "Deleted CPUSet assignment" podUID="7ae7ca82-f2b1-4fec-9f66-732017519586" containerName="extract-utilities" Feb 19 21:33:16 crc kubenswrapper[4795]: E0219 21:33:16.591185 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ae7ca82-f2b1-4fec-9f66-732017519586" containerName="extract-content" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.591191 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ae7ca82-f2b1-4fec-9f66-732017519586" containerName="extract-content" Feb 19 21:33:16 crc kubenswrapper[4795]: E0219 21:33:16.591202 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca" containerName="marketplace-operator" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.591207 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca" containerName="marketplace-operator" Feb 19 21:33:16 crc kubenswrapper[4795]: E0219 21:33:16.591215 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd50e9de-1cec-4f7c-8308-e0e5eb7961b9" containerName="registry-server" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.591221 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd50e9de-1cec-4f7c-8308-e0e5eb7961b9" containerName="registry-server" Feb 19 21:33:16 crc kubenswrapper[4795]: E0219 21:33:16.591230 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ae7ca82-f2b1-4fec-9f66-732017519586" containerName="registry-server" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.591236 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ae7ca82-f2b1-4fec-9f66-732017519586" containerName="registry-server" Feb 19 21:33:16 crc kubenswrapper[4795]: E0219 21:33:16.591245 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="102f7fb5-3031-4853-b112-2aa910aa63a7" containerName="route-controller-manager" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 
21:33:16.591250 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="102f7fb5-3031-4853-b112-2aa910aa63a7" containerName="route-controller-manager" Feb 19 21:33:16 crc kubenswrapper[4795]: E0219 21:33:16.591282 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86ffb50f-47f6-47b2-9141-1de9999a13e0" containerName="controller-manager" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.591289 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="86ffb50f-47f6-47b2-9141-1de9999a13e0" containerName="controller-manager" Feb 19 21:33:16 crc kubenswrapper[4795]: E0219 21:33:16.591297 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8c7f503-32c4-4ca2-8435-9918cae8d931" containerName="extract-utilities" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.591303 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8c7f503-32c4-4ca2-8435-9918cae8d931" containerName="extract-utilities" Feb 19 21:33:16 crc kubenswrapper[4795]: E0219 21:33:16.591309 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd50e9de-1cec-4f7c-8308-e0e5eb7961b9" containerName="extract-content" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.591316 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd50e9de-1cec-4f7c-8308-e0e5eb7961b9" containerName="extract-content" Feb 19 21:33:16 crc kubenswrapper[4795]: E0219 21:33:16.591323 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12e5472a-2c4b-4b71-91fb-06c3d5fcca54" containerName="extract-content" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.591329 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="12e5472a-2c4b-4b71-91fb-06c3d5fcca54" containerName="extract-content" Feb 19 21:33:16 crc kubenswrapper[4795]: E0219 21:33:16.591337 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12e5472a-2c4b-4b71-91fb-06c3d5fcca54" containerName="registry-server" Feb 19 21:33:16 crc kubenswrapper[4795]: 
I0219 21:33:16.591343 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="12e5472a-2c4b-4b71-91fb-06c3d5fcca54" containerName="registry-server" Feb 19 21:33:16 crc kubenswrapper[4795]: E0219 21:33:16.591351 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd50e9de-1cec-4f7c-8308-e0e5eb7961b9" containerName="extract-utilities" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.591357 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd50e9de-1cec-4f7c-8308-e0e5eb7961b9" containerName="extract-utilities" Feb 19 21:33:16 crc kubenswrapper[4795]: E0219 21:33:16.591364 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12e5472a-2c4b-4b71-91fb-06c3d5fcca54" containerName="extract-utilities" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.591370 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="12e5472a-2c4b-4b71-91fb-06c3d5fcca54" containerName="extract-utilities" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.591442 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ae7ca82-f2b1-4fec-9f66-732017519586" containerName="registry-server" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.591452 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="12e5472a-2c4b-4b71-91fb-06c3d5fcca54" containerName="registry-server" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.591460 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f97dcad-baff-4ef7-a4c9-2443e7f6a8ca" containerName="marketplace-operator" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.591469 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd50e9de-1cec-4f7c-8308-e0e5eb7961b9" containerName="registry-server" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.591478 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="86ffb50f-47f6-47b2-9141-1de9999a13e0" containerName="controller-manager" Feb 19 21:33:16 crc 
kubenswrapper[4795]: I0219 21:33:16.591484 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8c7f503-32c4-4ca2-8435-9918cae8d931" containerName="registry-server" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.591492 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="102f7fb5-3031-4853-b112-2aa910aa63a7" containerName="route-controller-manager" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.591826 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-658fd5994d-4zhjv" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.595425 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.595567 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.595954 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.596243 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.596433 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.598404 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75664bd6d9-j5lrj"] Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.599209 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-75664bd6d9-j5lrj" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.599454 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.601773 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.602217 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.602230 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.602230 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.602244 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.602245 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.606325 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-658fd5994d-4zhjv"] Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.609978 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.614670 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-75664bd6d9-j5lrj"] Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.660861 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9bb64b6f-6a96-4fe9-9fda-67a962a579bb-serving-cert\") pod \"route-controller-manager-75664bd6d9-j5lrj\" (UID: \"9bb64b6f-6a96-4fe9-9fda-67a962a579bb\") " pod="openshift-route-controller-manager/route-controller-manager-75664bd6d9-j5lrj" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.660906 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/09f05756-63fb-4c1b-b763-065b4a66ceff-client-ca\") pod \"controller-manager-658fd5994d-4zhjv\" (UID: \"09f05756-63fb-4c1b-b763-065b4a66ceff\") " pod="openshift-controller-manager/controller-manager-658fd5994d-4zhjv" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.660937 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9bb64b6f-6a96-4fe9-9fda-67a962a579bb-client-ca\") pod \"route-controller-manager-75664bd6d9-j5lrj\" (UID: \"9bb64b6f-6a96-4fe9-9fda-67a962a579bb\") " pod="openshift-route-controller-manager/route-controller-manager-75664bd6d9-j5lrj" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.660987 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ngvm\" (UniqueName: \"kubernetes.io/projected/09f05756-63fb-4c1b-b763-065b4a66ceff-kube-api-access-7ngvm\") pod \"controller-manager-658fd5994d-4zhjv\" (UID: \"09f05756-63fb-4c1b-b763-065b4a66ceff\") " pod="openshift-controller-manager/controller-manager-658fd5994d-4zhjv" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.661011 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09f05756-63fb-4c1b-b763-065b4a66ceff-serving-cert\") pod \"controller-manager-658fd5994d-4zhjv\" (UID: \"09f05756-63fb-4c1b-b763-065b4a66ceff\") " pod="openshift-controller-manager/controller-manager-658fd5994d-4zhjv" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.661040 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/09f05756-63fb-4c1b-b763-065b4a66ceff-proxy-ca-bundles\") pod \"controller-manager-658fd5994d-4zhjv\" (UID: \"09f05756-63fb-4c1b-b763-065b4a66ceff\") " pod="openshift-controller-manager/controller-manager-658fd5994d-4zhjv" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.661065 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09f05756-63fb-4c1b-b763-065b4a66ceff-config\") pod \"controller-manager-658fd5994d-4zhjv\" (UID: \"09f05756-63fb-4c1b-b763-065b4a66ceff\") " pod="openshift-controller-manager/controller-manager-658fd5994d-4zhjv" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.661104 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9bb64b6f-6a96-4fe9-9fda-67a962a579bb-config\") pod \"route-controller-manager-75664bd6d9-j5lrj\" (UID: \"9bb64b6f-6a96-4fe9-9fda-67a962a579bb\") " pod="openshift-route-controller-manager/route-controller-manager-75664bd6d9-j5lrj" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.661134 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lk96h\" (UniqueName: \"kubernetes.io/projected/9bb64b6f-6a96-4fe9-9fda-67a962a579bb-kube-api-access-lk96h\") pod \"route-controller-manager-75664bd6d9-j5lrj\" (UID: 
\"9bb64b6f-6a96-4fe9-9fda-67a962a579bb\") " pod="openshift-route-controller-manager/route-controller-manager-75664bd6d9-j5lrj" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.761731 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ngvm\" (UniqueName: \"kubernetes.io/projected/09f05756-63fb-4c1b-b763-065b4a66ceff-kube-api-access-7ngvm\") pod \"controller-manager-658fd5994d-4zhjv\" (UID: \"09f05756-63fb-4c1b-b763-065b4a66ceff\") " pod="openshift-controller-manager/controller-manager-658fd5994d-4zhjv" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.762492 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09f05756-63fb-4c1b-b763-065b4a66ceff-serving-cert\") pod \"controller-manager-658fd5994d-4zhjv\" (UID: \"09f05756-63fb-4c1b-b763-065b4a66ceff\") " pod="openshift-controller-manager/controller-manager-658fd5994d-4zhjv" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.762526 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/09f05756-63fb-4c1b-b763-065b4a66ceff-proxy-ca-bundles\") pod \"controller-manager-658fd5994d-4zhjv\" (UID: \"09f05756-63fb-4c1b-b763-065b4a66ceff\") " pod="openshift-controller-manager/controller-manager-658fd5994d-4zhjv" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.762548 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09f05756-63fb-4c1b-b763-065b4a66ceff-config\") pod \"controller-manager-658fd5994d-4zhjv\" (UID: \"09f05756-63fb-4c1b-b763-065b4a66ceff\") " pod="openshift-controller-manager/controller-manager-658fd5994d-4zhjv" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.762576 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/9bb64b6f-6a96-4fe9-9fda-67a962a579bb-config\") pod \"route-controller-manager-75664bd6d9-j5lrj\" (UID: \"9bb64b6f-6a96-4fe9-9fda-67a962a579bb\") " pod="openshift-route-controller-manager/route-controller-manager-75664bd6d9-j5lrj" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.762597 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lk96h\" (UniqueName: \"kubernetes.io/projected/9bb64b6f-6a96-4fe9-9fda-67a962a579bb-kube-api-access-lk96h\") pod \"route-controller-manager-75664bd6d9-j5lrj\" (UID: \"9bb64b6f-6a96-4fe9-9fda-67a962a579bb\") " pod="openshift-route-controller-manager/route-controller-manager-75664bd6d9-j5lrj" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.762619 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9bb64b6f-6a96-4fe9-9fda-67a962a579bb-serving-cert\") pod \"route-controller-manager-75664bd6d9-j5lrj\" (UID: \"9bb64b6f-6a96-4fe9-9fda-67a962a579bb\") " pod="openshift-route-controller-manager/route-controller-manager-75664bd6d9-j5lrj" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.762633 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/09f05756-63fb-4c1b-b763-065b4a66ceff-client-ca\") pod \"controller-manager-658fd5994d-4zhjv\" (UID: \"09f05756-63fb-4c1b-b763-065b4a66ceff\") " pod="openshift-controller-manager/controller-manager-658fd5994d-4zhjv" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.762656 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9bb64b6f-6a96-4fe9-9fda-67a962a579bb-client-ca\") pod \"route-controller-manager-75664bd6d9-j5lrj\" (UID: \"9bb64b6f-6a96-4fe9-9fda-67a962a579bb\") " pod="openshift-route-controller-manager/route-controller-manager-75664bd6d9-j5lrj" Feb 19 
21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.763509 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9bb64b6f-6a96-4fe9-9fda-67a962a579bb-client-ca\") pod \"route-controller-manager-75664bd6d9-j5lrj\" (UID: \"9bb64b6f-6a96-4fe9-9fda-67a962a579bb\") " pod="openshift-route-controller-manager/route-controller-manager-75664bd6d9-j5lrj" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.765231 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9bb64b6f-6a96-4fe9-9fda-67a962a579bb-config\") pod \"route-controller-manager-75664bd6d9-j5lrj\" (UID: \"9bb64b6f-6a96-4fe9-9fda-67a962a579bb\") " pod="openshift-route-controller-manager/route-controller-manager-75664bd6d9-j5lrj" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.768324 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9bb64b6f-6a96-4fe9-9fda-67a962a579bb-serving-cert\") pod \"route-controller-manager-75664bd6d9-j5lrj\" (UID: \"9bb64b6f-6a96-4fe9-9fda-67a962a579bb\") " pod="openshift-route-controller-manager/route-controller-manager-75664bd6d9-j5lrj" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.768769 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/09f05756-63fb-4c1b-b763-065b4a66ceff-client-ca\") pod \"controller-manager-658fd5994d-4zhjv\" (UID: \"09f05756-63fb-4c1b-b763-065b4a66ceff\") " pod="openshift-controller-manager/controller-manager-658fd5994d-4zhjv" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.769099 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09f05756-63fb-4c1b-b763-065b4a66ceff-serving-cert\") pod \"controller-manager-658fd5994d-4zhjv\" (UID: \"09f05756-63fb-4c1b-b763-065b4a66ceff\") " 
pod="openshift-controller-manager/controller-manager-658fd5994d-4zhjv" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.769404 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09f05756-63fb-4c1b-b763-065b4a66ceff-config\") pod \"controller-manager-658fd5994d-4zhjv\" (UID: \"09f05756-63fb-4c1b-b763-065b4a66ceff\") " pod="openshift-controller-manager/controller-manager-658fd5994d-4zhjv" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.769591 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/09f05756-63fb-4c1b-b763-065b4a66ceff-proxy-ca-bundles\") pod \"controller-manager-658fd5994d-4zhjv\" (UID: \"09f05756-63fb-4c1b-b763-065b4a66ceff\") " pod="openshift-controller-manager/controller-manager-658fd5994d-4zhjv" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.785058 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ngvm\" (UniqueName: \"kubernetes.io/projected/09f05756-63fb-4c1b-b763-065b4a66ceff-kube-api-access-7ngvm\") pod \"controller-manager-658fd5994d-4zhjv\" (UID: \"09f05756-63fb-4c1b-b763-065b4a66ceff\") " pod="openshift-controller-manager/controller-manager-658fd5994d-4zhjv" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.786763 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lk96h\" (UniqueName: \"kubernetes.io/projected/9bb64b6f-6a96-4fe9-9fda-67a962a579bb-kube-api-access-lk96h\") pod \"route-controller-manager-75664bd6d9-j5lrj\" (UID: \"9bb64b6f-6a96-4fe9-9fda-67a962a579bb\") " pod="openshift-route-controller-manager/route-controller-manager-75664bd6d9-j5lrj" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.916419 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-658fd5994d-4zhjv" Feb 19 21:33:16 crc kubenswrapper[4795]: I0219 21:33:16.935800 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-75664bd6d9-j5lrj" Feb 19 21:33:17 crc kubenswrapper[4795]: I0219 21:33:17.380828 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-658fd5994d-4zhjv"] Feb 19 21:33:17 crc kubenswrapper[4795]: I0219 21:33:17.391124 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75664bd6d9-j5lrj"] Feb 19 21:33:17 crc kubenswrapper[4795]: W0219 21:33:17.394945 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9bb64b6f_6a96_4fe9_9fda_67a962a579bb.slice/crio-9c0f3d2b587ecd0963a52b1acde19cffd9f8d935d94f3195fd7e7d8ce920cee0 WatchSource:0}: Error finding container 9c0f3d2b587ecd0963a52b1acde19cffd9f8d935d94f3195fd7e7d8ce920cee0: Status 404 returned error can't find the container with id 9c0f3d2b587ecd0963a52b1acde19cffd9f8d935d94f3195fd7e7d8ce920cee0 Feb 19 21:33:17 crc kubenswrapper[4795]: I0219 21:33:17.508181 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-658fd5994d-4zhjv" event={"ID":"09f05756-63fb-4c1b-b763-065b4a66ceff","Type":"ContainerStarted","Data":"82e2340862c779da3d50db9a4a4c3d7bdeb32be3ace73f6532ae9bd0d9d449b3"} Feb 19 21:33:17 crc kubenswrapper[4795]: I0219 21:33:17.509448 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-75664bd6d9-j5lrj" event={"ID":"9bb64b6f-6a96-4fe9-9fda-67a962a579bb","Type":"ContainerStarted","Data":"9c0f3d2b587ecd0963a52b1acde19cffd9f8d935d94f3195fd7e7d8ce920cee0"} Feb 19 21:33:17 crc kubenswrapper[4795]: I0219 
21:33:17.524735 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="102f7fb5-3031-4853-b112-2aa910aa63a7" path="/var/lib/kubelet/pods/102f7fb5-3031-4853-b112-2aa910aa63a7/volumes" Feb 19 21:33:17 crc kubenswrapper[4795]: I0219 21:33:17.525588 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86ffb50f-47f6-47b2-9141-1de9999a13e0" path="/var/lib/kubelet/pods/86ffb50f-47f6-47b2-9141-1de9999a13e0/volumes" Feb 19 21:33:18 crc kubenswrapper[4795]: I0219 21:33:18.523274 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-75664bd6d9-j5lrj" event={"ID":"9bb64b6f-6a96-4fe9-9fda-67a962a579bb","Type":"ContainerStarted","Data":"7448f9d97513bea9320be3400b42d577c96f3d79a9463cc8c66045316a8904ba"} Feb 19 21:33:18 crc kubenswrapper[4795]: I0219 21:33:18.523740 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-75664bd6d9-j5lrj" Feb 19 21:33:18 crc kubenswrapper[4795]: I0219 21:33:18.524418 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-658fd5994d-4zhjv" event={"ID":"09f05756-63fb-4c1b-b763-065b4a66ceff","Type":"ContainerStarted","Data":"60f5807a8d146a89fab101c9137b8d07d3570d1b0c1032e786eccb821a4d18a5"} Feb 19 21:33:18 crc kubenswrapper[4795]: I0219 21:33:18.524668 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-658fd5994d-4zhjv" Feb 19 21:33:18 crc kubenswrapper[4795]: I0219 21:33:18.534754 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-658fd5994d-4zhjv" Feb 19 21:33:18 crc kubenswrapper[4795]: I0219 21:33:18.561884 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-75664bd6d9-j5lrj" 
podStartSLOduration=3.561870136 podStartE2EDuration="3.561870136s" podCreationTimestamp="2026-02-19 21:33:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:33:18.558776701 +0000 UTC m=+309.751294565" watchObservedRunningTime="2026-02-19 21:33:18.561870136 +0000 UTC m=+309.754388000" Feb 19 21:33:18 crc kubenswrapper[4795]: I0219 21:33:18.756459 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-75664bd6d9-j5lrj" Feb 19 21:33:18 crc kubenswrapper[4795]: I0219 21:33:18.776032 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-658fd5994d-4zhjv" podStartSLOduration=3.776016157 podStartE2EDuration="3.776016157s" podCreationTimestamp="2026-02-19 21:33:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:33:18.578073292 +0000 UTC m=+309.770591156" watchObservedRunningTime="2026-02-19 21:33:18.776016157 +0000 UTC m=+309.968534021" Feb 19 21:33:31 crc kubenswrapper[4795]: I0219 21:33:31.104856 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-fgxv6"] Feb 19 21:33:31 crc kubenswrapper[4795]: I0219 21:33:31.107376 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-fgxv6" Feb 19 21:33:31 crc kubenswrapper[4795]: I0219 21:33:31.124404 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-fgxv6"] Feb 19 21:33:31 crc kubenswrapper[4795]: I0219 21:33:31.213177 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c072476d-28b8-4ac2-9faa-2a8f54071b38-installation-pull-secrets\") pod \"image-registry-66df7c8f76-fgxv6\" (UID: \"c072476d-28b8-4ac2-9faa-2a8f54071b38\") " pod="openshift-image-registry/image-registry-66df7c8f76-fgxv6" Feb 19 21:33:31 crc kubenswrapper[4795]: I0219 21:33:31.213229 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcdds\" (UniqueName: \"kubernetes.io/projected/c072476d-28b8-4ac2-9faa-2a8f54071b38-kube-api-access-vcdds\") pod \"image-registry-66df7c8f76-fgxv6\" (UID: \"c072476d-28b8-4ac2-9faa-2a8f54071b38\") " pod="openshift-image-registry/image-registry-66df7c8f76-fgxv6" Feb 19 21:33:31 crc kubenswrapper[4795]: I0219 21:33:31.213256 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-fgxv6\" (UID: \"c072476d-28b8-4ac2-9faa-2a8f54071b38\") " pod="openshift-image-registry/image-registry-66df7c8f76-fgxv6" Feb 19 21:33:31 crc kubenswrapper[4795]: I0219 21:33:31.213287 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c072476d-28b8-4ac2-9faa-2a8f54071b38-registry-certificates\") pod \"image-registry-66df7c8f76-fgxv6\" (UID: \"c072476d-28b8-4ac2-9faa-2a8f54071b38\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-fgxv6" Feb 19 21:33:31 crc kubenswrapper[4795]: I0219 21:33:31.213501 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c072476d-28b8-4ac2-9faa-2a8f54071b38-trusted-ca\") pod \"image-registry-66df7c8f76-fgxv6\" (UID: \"c072476d-28b8-4ac2-9faa-2a8f54071b38\") " pod="openshift-image-registry/image-registry-66df7c8f76-fgxv6" Feb 19 21:33:31 crc kubenswrapper[4795]: I0219 21:33:31.213562 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c072476d-28b8-4ac2-9faa-2a8f54071b38-bound-sa-token\") pod \"image-registry-66df7c8f76-fgxv6\" (UID: \"c072476d-28b8-4ac2-9faa-2a8f54071b38\") " pod="openshift-image-registry/image-registry-66df7c8f76-fgxv6" Feb 19 21:33:31 crc kubenswrapper[4795]: I0219 21:33:31.213619 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c072476d-28b8-4ac2-9faa-2a8f54071b38-registry-tls\") pod \"image-registry-66df7c8f76-fgxv6\" (UID: \"c072476d-28b8-4ac2-9faa-2a8f54071b38\") " pod="openshift-image-registry/image-registry-66df7c8f76-fgxv6" Feb 19 21:33:31 crc kubenswrapper[4795]: I0219 21:33:31.213643 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c072476d-28b8-4ac2-9faa-2a8f54071b38-ca-trust-extracted\") pod \"image-registry-66df7c8f76-fgxv6\" (UID: \"c072476d-28b8-4ac2-9faa-2a8f54071b38\") " pod="openshift-image-registry/image-registry-66df7c8f76-fgxv6" Feb 19 21:33:31 crc kubenswrapper[4795]: I0219 21:33:31.237374 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-fgxv6\" (UID: \"c072476d-28b8-4ac2-9faa-2a8f54071b38\") " pod="openshift-image-registry/image-registry-66df7c8f76-fgxv6" Feb 19 21:33:31 crc kubenswrapper[4795]: I0219 21:33:31.314929 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c072476d-28b8-4ac2-9faa-2a8f54071b38-trusted-ca\") pod \"image-registry-66df7c8f76-fgxv6\" (UID: \"c072476d-28b8-4ac2-9faa-2a8f54071b38\") " pod="openshift-image-registry/image-registry-66df7c8f76-fgxv6" Feb 19 21:33:31 crc kubenswrapper[4795]: I0219 21:33:31.314992 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c072476d-28b8-4ac2-9faa-2a8f54071b38-bound-sa-token\") pod \"image-registry-66df7c8f76-fgxv6\" (UID: \"c072476d-28b8-4ac2-9faa-2a8f54071b38\") " pod="openshift-image-registry/image-registry-66df7c8f76-fgxv6" Feb 19 21:33:31 crc kubenswrapper[4795]: I0219 21:33:31.315015 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c072476d-28b8-4ac2-9faa-2a8f54071b38-registry-tls\") pod \"image-registry-66df7c8f76-fgxv6\" (UID: \"c072476d-28b8-4ac2-9faa-2a8f54071b38\") " pod="openshift-image-registry/image-registry-66df7c8f76-fgxv6" Feb 19 21:33:31 crc kubenswrapper[4795]: I0219 21:33:31.315032 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c072476d-28b8-4ac2-9faa-2a8f54071b38-ca-trust-extracted\") pod \"image-registry-66df7c8f76-fgxv6\" (UID: \"c072476d-28b8-4ac2-9faa-2a8f54071b38\") " pod="openshift-image-registry/image-registry-66df7c8f76-fgxv6" Feb 19 21:33:31 crc kubenswrapper[4795]: I0219 21:33:31.315059 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c072476d-28b8-4ac2-9faa-2a8f54071b38-installation-pull-secrets\") pod \"image-registry-66df7c8f76-fgxv6\" (UID: \"c072476d-28b8-4ac2-9faa-2a8f54071b38\") " pod="openshift-image-registry/image-registry-66df7c8f76-fgxv6" Feb 19 21:33:31 crc kubenswrapper[4795]: I0219 21:33:31.315080 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcdds\" (UniqueName: \"kubernetes.io/projected/c072476d-28b8-4ac2-9faa-2a8f54071b38-kube-api-access-vcdds\") pod \"image-registry-66df7c8f76-fgxv6\" (UID: \"c072476d-28b8-4ac2-9faa-2a8f54071b38\") " pod="openshift-image-registry/image-registry-66df7c8f76-fgxv6" Feb 19 21:33:31 crc kubenswrapper[4795]: I0219 21:33:31.315106 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c072476d-28b8-4ac2-9faa-2a8f54071b38-registry-certificates\") pod \"image-registry-66df7c8f76-fgxv6\" (UID: \"c072476d-28b8-4ac2-9faa-2a8f54071b38\") " pod="openshift-image-registry/image-registry-66df7c8f76-fgxv6" Feb 19 21:33:31 crc kubenswrapper[4795]: I0219 21:33:31.316369 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c072476d-28b8-4ac2-9faa-2a8f54071b38-registry-certificates\") pod \"image-registry-66df7c8f76-fgxv6\" (UID: \"c072476d-28b8-4ac2-9faa-2a8f54071b38\") " pod="openshift-image-registry/image-registry-66df7c8f76-fgxv6" Feb 19 21:33:31 crc kubenswrapper[4795]: I0219 21:33:31.316735 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c072476d-28b8-4ac2-9faa-2a8f54071b38-ca-trust-extracted\") pod \"image-registry-66df7c8f76-fgxv6\" (UID: \"c072476d-28b8-4ac2-9faa-2a8f54071b38\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-fgxv6" Feb 19 21:33:31 crc kubenswrapper[4795]: I0219 21:33:31.317074 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c072476d-28b8-4ac2-9faa-2a8f54071b38-trusted-ca\") pod \"image-registry-66df7c8f76-fgxv6\" (UID: \"c072476d-28b8-4ac2-9faa-2a8f54071b38\") " pod="openshift-image-registry/image-registry-66df7c8f76-fgxv6" Feb 19 21:33:31 crc kubenswrapper[4795]: I0219 21:33:31.322057 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c072476d-28b8-4ac2-9faa-2a8f54071b38-installation-pull-secrets\") pod \"image-registry-66df7c8f76-fgxv6\" (UID: \"c072476d-28b8-4ac2-9faa-2a8f54071b38\") " pod="openshift-image-registry/image-registry-66df7c8f76-fgxv6" Feb 19 21:33:31 crc kubenswrapper[4795]: I0219 21:33:31.324691 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c072476d-28b8-4ac2-9faa-2a8f54071b38-registry-tls\") pod \"image-registry-66df7c8f76-fgxv6\" (UID: \"c072476d-28b8-4ac2-9faa-2a8f54071b38\") " pod="openshift-image-registry/image-registry-66df7c8f76-fgxv6" Feb 19 21:33:31 crc kubenswrapper[4795]: I0219 21:33:31.343046 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c072476d-28b8-4ac2-9faa-2a8f54071b38-bound-sa-token\") pod \"image-registry-66df7c8f76-fgxv6\" (UID: \"c072476d-28b8-4ac2-9faa-2a8f54071b38\") " pod="openshift-image-registry/image-registry-66df7c8f76-fgxv6" Feb 19 21:33:31 crc kubenswrapper[4795]: I0219 21:33:31.343496 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcdds\" (UniqueName: \"kubernetes.io/projected/c072476d-28b8-4ac2-9faa-2a8f54071b38-kube-api-access-vcdds\") pod \"image-registry-66df7c8f76-fgxv6\" (UID: 
\"c072476d-28b8-4ac2-9faa-2a8f54071b38\") " pod="openshift-image-registry/image-registry-66df7c8f76-fgxv6" Feb 19 21:33:31 crc kubenswrapper[4795]: I0219 21:33:31.442942 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-fgxv6" Feb 19 21:33:31 crc kubenswrapper[4795]: I0219 21:33:31.826548 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-fgxv6"] Feb 19 21:33:31 crc kubenswrapper[4795]: W0219 21:33:31.832619 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc072476d_28b8_4ac2_9faa_2a8f54071b38.slice/crio-7e91a0576a8105ae06a3f44088f1585b6b67e0ec5d2db89d19e3476e1179878d WatchSource:0}: Error finding container 7e91a0576a8105ae06a3f44088f1585b6b67e0ec5d2db89d19e3476e1179878d: Status 404 returned error can't find the container with id 7e91a0576a8105ae06a3f44088f1585b6b67e0ec5d2db89d19e3476e1179878d Feb 19 21:33:32 crc kubenswrapper[4795]: I0219 21:33:32.589851 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-fgxv6" event={"ID":"c072476d-28b8-4ac2-9faa-2a8f54071b38","Type":"ContainerStarted","Data":"962c78e81bf040099743ca07798d02b09e7ff69a18e291185ff7292a84b3a4bd"} Feb 19 21:33:32 crc kubenswrapper[4795]: I0219 21:33:32.589937 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-fgxv6" Feb 19 21:33:32 crc kubenswrapper[4795]: I0219 21:33:32.589947 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-fgxv6" event={"ID":"c072476d-28b8-4ac2-9faa-2a8f54071b38","Type":"ContainerStarted","Data":"7e91a0576a8105ae06a3f44088f1585b6b67e0ec5d2db89d19e3476e1179878d"} Feb 19 21:33:32 crc kubenswrapper[4795]: I0219 21:33:32.611391 4795 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-fgxv6" podStartSLOduration=1.6113737399999999 podStartE2EDuration="1.61137374s" podCreationTimestamp="2026-02-19 21:33:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:33:32.608694386 +0000 UTC m=+323.801212250" watchObservedRunningTime="2026-02-19 21:33:32.61137374 +0000 UTC m=+323.803891604" Feb 19 21:33:35 crc kubenswrapper[4795]: I0219 21:33:35.287718 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75664bd6d9-j5lrj"] Feb 19 21:33:35 crc kubenswrapper[4795]: I0219 21:33:35.288582 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-75664bd6d9-j5lrj" podUID="9bb64b6f-6a96-4fe9-9fda-67a962a579bb" containerName="route-controller-manager" containerID="cri-o://7448f9d97513bea9320be3400b42d577c96f3d79a9463cc8c66045316a8904ba" gracePeriod=30 Feb 19 21:33:35 crc kubenswrapper[4795]: I0219 21:33:35.606363 4795 generic.go:334] "Generic (PLEG): container finished" podID="9bb64b6f-6a96-4fe9-9fda-67a962a579bb" containerID="7448f9d97513bea9320be3400b42d577c96f3d79a9463cc8c66045316a8904ba" exitCode=0 Feb 19 21:33:35 crc kubenswrapper[4795]: I0219 21:33:35.606403 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-75664bd6d9-j5lrj" event={"ID":"9bb64b6f-6a96-4fe9-9fda-67a962a579bb","Type":"ContainerDied","Data":"7448f9d97513bea9320be3400b42d577c96f3d79a9463cc8c66045316a8904ba"} Feb 19 21:33:35 crc kubenswrapper[4795]: I0219 21:33:35.741219 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-75664bd6d9-j5lrj" Feb 19 21:33:35 crc kubenswrapper[4795]: I0219 21:33:35.889315 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9bb64b6f-6a96-4fe9-9fda-67a962a579bb-serving-cert\") pod \"9bb64b6f-6a96-4fe9-9fda-67a962a579bb\" (UID: \"9bb64b6f-6a96-4fe9-9fda-67a962a579bb\") " Feb 19 21:33:35 crc kubenswrapper[4795]: I0219 21:33:35.889382 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9bb64b6f-6a96-4fe9-9fda-67a962a579bb-client-ca\") pod \"9bb64b6f-6a96-4fe9-9fda-67a962a579bb\" (UID: \"9bb64b6f-6a96-4fe9-9fda-67a962a579bb\") " Feb 19 21:33:35 crc kubenswrapper[4795]: I0219 21:33:35.889468 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lk96h\" (UniqueName: \"kubernetes.io/projected/9bb64b6f-6a96-4fe9-9fda-67a962a579bb-kube-api-access-lk96h\") pod \"9bb64b6f-6a96-4fe9-9fda-67a962a579bb\" (UID: \"9bb64b6f-6a96-4fe9-9fda-67a962a579bb\") " Feb 19 21:33:35 crc kubenswrapper[4795]: I0219 21:33:35.889518 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9bb64b6f-6a96-4fe9-9fda-67a962a579bb-config\") pod \"9bb64b6f-6a96-4fe9-9fda-67a962a579bb\" (UID: \"9bb64b6f-6a96-4fe9-9fda-67a962a579bb\") " Feb 19 21:33:35 crc kubenswrapper[4795]: I0219 21:33:35.890118 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9bb64b6f-6a96-4fe9-9fda-67a962a579bb-client-ca" (OuterVolumeSpecName: "client-ca") pod "9bb64b6f-6a96-4fe9-9fda-67a962a579bb" (UID: "9bb64b6f-6a96-4fe9-9fda-67a962a579bb"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:33:35 crc kubenswrapper[4795]: I0219 21:33:35.890145 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9bb64b6f-6a96-4fe9-9fda-67a962a579bb-config" (OuterVolumeSpecName: "config") pod "9bb64b6f-6a96-4fe9-9fda-67a962a579bb" (UID: "9bb64b6f-6a96-4fe9-9fda-67a962a579bb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:33:35 crc kubenswrapper[4795]: I0219 21:33:35.894448 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bb64b6f-6a96-4fe9-9fda-67a962a579bb-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9bb64b6f-6a96-4fe9-9fda-67a962a579bb" (UID: "9bb64b6f-6a96-4fe9-9fda-67a962a579bb"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:33:35 crc kubenswrapper[4795]: I0219 21:33:35.894605 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bb64b6f-6a96-4fe9-9fda-67a962a579bb-kube-api-access-lk96h" (OuterVolumeSpecName: "kube-api-access-lk96h") pod "9bb64b6f-6a96-4fe9-9fda-67a962a579bb" (UID: "9bb64b6f-6a96-4fe9-9fda-67a962a579bb"). InnerVolumeSpecName "kube-api-access-lk96h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:33:35 crc kubenswrapper[4795]: I0219 21:33:35.990992 4795 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9bb64b6f-6a96-4fe9-9fda-67a962a579bb-client-ca\") on node \"crc\" DevicePath \"\"" Feb 19 21:33:35 crc kubenswrapper[4795]: I0219 21:33:35.991023 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lk96h\" (UniqueName: \"kubernetes.io/projected/9bb64b6f-6a96-4fe9-9fda-67a962a579bb-kube-api-access-lk96h\") on node \"crc\" DevicePath \"\"" Feb 19 21:33:35 crc kubenswrapper[4795]: I0219 21:33:35.991035 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9bb64b6f-6a96-4fe9-9fda-67a962a579bb-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:33:35 crc kubenswrapper[4795]: I0219 21:33:35.991042 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9bb64b6f-6a96-4fe9-9fda-67a962a579bb-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:33:36 crc kubenswrapper[4795]: I0219 21:33:36.613307 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-75664bd6d9-j5lrj" event={"ID":"9bb64b6f-6a96-4fe9-9fda-67a962a579bb","Type":"ContainerDied","Data":"9c0f3d2b587ecd0963a52b1acde19cffd9f8d935d94f3195fd7e7d8ce920cee0"} Feb 19 21:33:36 crc kubenswrapper[4795]: I0219 21:33:36.613369 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-75664bd6d9-j5lrj" Feb 19 21:33:36 crc kubenswrapper[4795]: I0219 21:33:36.613374 4795 scope.go:117] "RemoveContainer" containerID="7448f9d97513bea9320be3400b42d577c96f3d79a9463cc8c66045316a8904ba" Feb 19 21:33:36 crc kubenswrapper[4795]: I0219 21:33:36.639944 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84d5f88f56-vjpj2"] Feb 19 21:33:36 crc kubenswrapper[4795]: E0219 21:33:36.640280 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bb64b6f-6a96-4fe9-9fda-67a962a579bb" containerName="route-controller-manager" Feb 19 21:33:36 crc kubenswrapper[4795]: I0219 21:33:36.640310 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bb64b6f-6a96-4fe9-9fda-67a962a579bb" containerName="route-controller-manager" Feb 19 21:33:36 crc kubenswrapper[4795]: I0219 21:33:36.640463 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bb64b6f-6a96-4fe9-9fda-67a962a579bb" containerName="route-controller-manager" Feb 19 21:33:36 crc kubenswrapper[4795]: I0219 21:33:36.640993 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-84d5f88f56-vjpj2" Feb 19 21:33:36 crc kubenswrapper[4795]: I0219 21:33:36.643493 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 19 21:33:36 crc kubenswrapper[4795]: I0219 21:33:36.643614 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 19 21:33:36 crc kubenswrapper[4795]: I0219 21:33:36.645359 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 19 21:33:36 crc kubenswrapper[4795]: I0219 21:33:36.645793 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 19 21:33:36 crc kubenswrapper[4795]: I0219 21:33:36.645922 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 19 21:33:36 crc kubenswrapper[4795]: I0219 21:33:36.648466 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 19 21:33:36 crc kubenswrapper[4795]: I0219 21:33:36.650546 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84d5f88f56-vjpj2"] Feb 19 21:33:36 crc kubenswrapper[4795]: I0219 21:33:36.657507 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75664bd6d9-j5lrj"] Feb 19 21:33:36 crc kubenswrapper[4795]: I0219 21:33:36.662413 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75664bd6d9-j5lrj"] Feb 19 21:33:36 crc kubenswrapper[4795]: I0219 21:33:36.801499 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-6n9kz\" (UniqueName: \"kubernetes.io/projected/8e80efae-f1ac-40f9-ad38-61dc2821499e-kube-api-access-6n9kz\") pod \"route-controller-manager-84d5f88f56-vjpj2\" (UID: \"8e80efae-f1ac-40f9-ad38-61dc2821499e\") " pod="openshift-route-controller-manager/route-controller-manager-84d5f88f56-vjpj2" Feb 19 21:33:36 crc kubenswrapper[4795]: I0219 21:33:36.801548 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8e80efae-f1ac-40f9-ad38-61dc2821499e-client-ca\") pod \"route-controller-manager-84d5f88f56-vjpj2\" (UID: \"8e80efae-f1ac-40f9-ad38-61dc2821499e\") " pod="openshift-route-controller-manager/route-controller-manager-84d5f88f56-vjpj2" Feb 19 21:33:36 crc kubenswrapper[4795]: I0219 21:33:36.801619 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e80efae-f1ac-40f9-ad38-61dc2821499e-config\") pod \"route-controller-manager-84d5f88f56-vjpj2\" (UID: \"8e80efae-f1ac-40f9-ad38-61dc2821499e\") " pod="openshift-route-controller-manager/route-controller-manager-84d5f88f56-vjpj2" Feb 19 21:33:36 crc kubenswrapper[4795]: I0219 21:33:36.801643 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e80efae-f1ac-40f9-ad38-61dc2821499e-serving-cert\") pod \"route-controller-manager-84d5f88f56-vjpj2\" (UID: \"8e80efae-f1ac-40f9-ad38-61dc2821499e\") " pod="openshift-route-controller-manager/route-controller-manager-84d5f88f56-vjpj2" Feb 19 21:33:36 crc kubenswrapper[4795]: I0219 21:33:36.902716 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8e80efae-f1ac-40f9-ad38-61dc2821499e-client-ca\") pod \"route-controller-manager-84d5f88f56-vjpj2\" (UID: 
\"8e80efae-f1ac-40f9-ad38-61dc2821499e\") " pod="openshift-route-controller-manager/route-controller-manager-84d5f88f56-vjpj2" Feb 19 21:33:36 crc kubenswrapper[4795]: I0219 21:33:36.902835 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e80efae-f1ac-40f9-ad38-61dc2821499e-config\") pod \"route-controller-manager-84d5f88f56-vjpj2\" (UID: \"8e80efae-f1ac-40f9-ad38-61dc2821499e\") " pod="openshift-route-controller-manager/route-controller-manager-84d5f88f56-vjpj2" Feb 19 21:33:36 crc kubenswrapper[4795]: I0219 21:33:36.902868 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e80efae-f1ac-40f9-ad38-61dc2821499e-serving-cert\") pod \"route-controller-manager-84d5f88f56-vjpj2\" (UID: \"8e80efae-f1ac-40f9-ad38-61dc2821499e\") " pod="openshift-route-controller-manager/route-controller-manager-84d5f88f56-vjpj2" Feb 19 21:33:36 crc kubenswrapper[4795]: I0219 21:33:36.902907 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6n9kz\" (UniqueName: \"kubernetes.io/projected/8e80efae-f1ac-40f9-ad38-61dc2821499e-kube-api-access-6n9kz\") pod \"route-controller-manager-84d5f88f56-vjpj2\" (UID: \"8e80efae-f1ac-40f9-ad38-61dc2821499e\") " pod="openshift-route-controller-manager/route-controller-manager-84d5f88f56-vjpj2" Feb 19 21:33:36 crc kubenswrapper[4795]: I0219 21:33:36.904139 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8e80efae-f1ac-40f9-ad38-61dc2821499e-client-ca\") pod \"route-controller-manager-84d5f88f56-vjpj2\" (UID: \"8e80efae-f1ac-40f9-ad38-61dc2821499e\") " pod="openshift-route-controller-manager/route-controller-manager-84d5f88f56-vjpj2" Feb 19 21:33:36 crc kubenswrapper[4795]: I0219 21:33:36.904335 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/8e80efae-f1ac-40f9-ad38-61dc2821499e-config\") pod \"route-controller-manager-84d5f88f56-vjpj2\" (UID: \"8e80efae-f1ac-40f9-ad38-61dc2821499e\") " pod="openshift-route-controller-manager/route-controller-manager-84d5f88f56-vjpj2" Feb 19 21:33:36 crc kubenswrapper[4795]: I0219 21:33:36.907395 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e80efae-f1ac-40f9-ad38-61dc2821499e-serving-cert\") pod \"route-controller-manager-84d5f88f56-vjpj2\" (UID: \"8e80efae-f1ac-40f9-ad38-61dc2821499e\") " pod="openshift-route-controller-manager/route-controller-manager-84d5f88f56-vjpj2" Feb 19 21:33:36 crc kubenswrapper[4795]: I0219 21:33:36.917816 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6n9kz\" (UniqueName: \"kubernetes.io/projected/8e80efae-f1ac-40f9-ad38-61dc2821499e-kube-api-access-6n9kz\") pod \"route-controller-manager-84d5f88f56-vjpj2\" (UID: \"8e80efae-f1ac-40f9-ad38-61dc2821499e\") " pod="openshift-route-controller-manager/route-controller-manager-84d5f88f56-vjpj2" Feb 19 21:33:36 crc kubenswrapper[4795]: I0219 21:33:36.956577 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-84d5f88f56-vjpj2" Feb 19 21:33:37 crc kubenswrapper[4795]: I0219 21:33:37.388655 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84d5f88f56-vjpj2"] Feb 19 21:33:37 crc kubenswrapper[4795]: I0219 21:33:37.520395 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9bb64b6f-6a96-4fe9-9fda-67a962a579bb" path="/var/lib/kubelet/pods/9bb64b6f-6a96-4fe9-9fda-67a962a579bb/volumes" Feb 19 21:33:37 crc kubenswrapper[4795]: I0219 21:33:37.619365 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-84d5f88f56-vjpj2" event={"ID":"8e80efae-f1ac-40f9-ad38-61dc2821499e","Type":"ContainerStarted","Data":"3db54fe9b4bcdd2dfcde25b1c782209dd46ca7eb8c297c01d7a470c0b950e023"} Feb 19 21:33:37 crc kubenswrapper[4795]: I0219 21:33:37.619486 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-84d5f88f56-vjpj2" event={"ID":"8e80efae-f1ac-40f9-ad38-61dc2821499e","Type":"ContainerStarted","Data":"ff2a96362a19fc9b02cca871a18d60ace8d1f71c1784478eca8f7671f28a2556"} Feb 19 21:33:37 crc kubenswrapper[4795]: I0219 21:33:37.620812 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-84d5f88f56-vjpj2" Feb 19 21:33:37 crc kubenswrapper[4795]: I0219 21:33:37.622349 4795 patch_prober.go:28] interesting pod/route-controller-manager-84d5f88f56-vjpj2 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.62:8443/healthz\": dial tcp 10.217.0.62:8443: connect: connection refused" start-of-body= Feb 19 21:33:37 crc kubenswrapper[4795]: I0219 21:33:37.622393 4795 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-route-controller-manager/route-controller-manager-84d5f88f56-vjpj2" podUID="8e80efae-f1ac-40f9-ad38-61dc2821499e" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.62:8443/healthz\": dial tcp 10.217.0.62:8443: connect: connection refused" Feb 19 21:33:37 crc kubenswrapper[4795]: I0219 21:33:37.638650 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-84d5f88f56-vjpj2" podStartSLOduration=2.638635696 podStartE2EDuration="2.638635696s" podCreationTimestamp="2026-02-19 21:33:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:33:37.63542012 +0000 UTC m=+328.827937984" watchObservedRunningTime="2026-02-19 21:33:37.638635696 +0000 UTC m=+328.831153560" Feb 19 21:33:38 crc kubenswrapper[4795]: I0219 21:33:38.630075 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-84d5f88f56-vjpj2" Feb 19 21:33:51 crc kubenswrapper[4795]: I0219 21:33:51.452537 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-fgxv6" Feb 19 21:33:51 crc kubenswrapper[4795]: I0219 21:33:51.511012 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4h49m"] Feb 19 21:33:58 crc kubenswrapper[4795]: I0219 21:33:58.427910 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 21:33:58 crc kubenswrapper[4795]: I0219 21:33:58.428553 4795 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 21:34:13 crc kubenswrapper[4795]: I0219 21:34:13.147566 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mdtnr"] Feb 19 21:34:13 crc kubenswrapper[4795]: I0219 21:34:13.149479 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mdtnr" Feb 19 21:34:13 crc kubenswrapper[4795]: I0219 21:34:13.151664 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 19 21:34:13 crc kubenswrapper[4795]: I0219 21:34:13.168067 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mdtnr"] Feb 19 21:34:13 crc kubenswrapper[4795]: I0219 21:34:13.276431 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vw6pd\" (UniqueName: \"kubernetes.io/projected/f457fe15-4099-4d77-8140-3297bee0a182-kube-api-access-vw6pd\") pod \"certified-operators-mdtnr\" (UID: \"f457fe15-4099-4d77-8140-3297bee0a182\") " pod="openshift-marketplace/certified-operators-mdtnr" Feb 19 21:34:13 crc kubenswrapper[4795]: I0219 21:34:13.276847 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f457fe15-4099-4d77-8140-3297bee0a182-catalog-content\") pod \"certified-operators-mdtnr\" (UID: \"f457fe15-4099-4d77-8140-3297bee0a182\") " pod="openshift-marketplace/certified-operators-mdtnr" Feb 19 21:34:13 crc kubenswrapper[4795]: I0219 21:34:13.277140 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f457fe15-4099-4d77-8140-3297bee0a182-utilities\") pod \"certified-operators-mdtnr\" (UID: \"f457fe15-4099-4d77-8140-3297bee0a182\") " pod="openshift-marketplace/certified-operators-mdtnr" Feb 19 21:34:13 crc kubenswrapper[4795]: I0219 21:34:13.343332 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-p266t"] Feb 19 21:34:13 crc kubenswrapper[4795]: I0219 21:34:13.344275 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p266t" Feb 19 21:34:13 crc kubenswrapper[4795]: I0219 21:34:13.346636 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 19 21:34:13 crc kubenswrapper[4795]: I0219 21:34:13.359237 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p266t"] Feb 19 21:34:13 crc kubenswrapper[4795]: I0219 21:34:13.395554 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f457fe15-4099-4d77-8140-3297bee0a182-catalog-content\") pod \"certified-operators-mdtnr\" (UID: \"f457fe15-4099-4d77-8140-3297bee0a182\") " pod="openshift-marketplace/certified-operators-mdtnr" Feb 19 21:34:13 crc kubenswrapper[4795]: I0219 21:34:13.395665 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f457fe15-4099-4d77-8140-3297bee0a182-utilities\") pod \"certified-operators-mdtnr\" (UID: \"f457fe15-4099-4d77-8140-3297bee0a182\") " pod="openshift-marketplace/certified-operators-mdtnr" Feb 19 21:34:13 crc kubenswrapper[4795]: I0219 21:34:13.395725 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vw6pd\" (UniqueName: 
\"kubernetes.io/projected/f457fe15-4099-4d77-8140-3297bee0a182-kube-api-access-vw6pd\") pod \"certified-operators-mdtnr\" (UID: \"f457fe15-4099-4d77-8140-3297bee0a182\") " pod="openshift-marketplace/certified-operators-mdtnr" Feb 19 21:34:13 crc kubenswrapper[4795]: I0219 21:34:13.396125 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f457fe15-4099-4d77-8140-3297bee0a182-utilities\") pod \"certified-operators-mdtnr\" (UID: \"f457fe15-4099-4d77-8140-3297bee0a182\") " pod="openshift-marketplace/certified-operators-mdtnr" Feb 19 21:34:13 crc kubenswrapper[4795]: I0219 21:34:13.396371 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f457fe15-4099-4d77-8140-3297bee0a182-catalog-content\") pod \"certified-operators-mdtnr\" (UID: \"f457fe15-4099-4d77-8140-3297bee0a182\") " pod="openshift-marketplace/certified-operators-mdtnr" Feb 19 21:34:13 crc kubenswrapper[4795]: I0219 21:34:13.425131 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vw6pd\" (UniqueName: \"kubernetes.io/projected/f457fe15-4099-4d77-8140-3297bee0a182-kube-api-access-vw6pd\") pod \"certified-operators-mdtnr\" (UID: \"f457fe15-4099-4d77-8140-3297bee0a182\") " pod="openshift-marketplace/certified-operators-mdtnr" Feb 19 21:34:13 crc kubenswrapper[4795]: I0219 21:34:13.469989 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mdtnr" Feb 19 21:34:13 crc kubenswrapper[4795]: I0219 21:34:13.496788 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqmwl\" (UniqueName: \"kubernetes.io/projected/432a371d-d143-4da7-9332-682f52b39381-kube-api-access-mqmwl\") pod \"community-operators-p266t\" (UID: \"432a371d-d143-4da7-9332-682f52b39381\") " pod="openshift-marketplace/community-operators-p266t" Feb 19 21:34:13 crc kubenswrapper[4795]: I0219 21:34:13.496943 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/432a371d-d143-4da7-9332-682f52b39381-utilities\") pod \"community-operators-p266t\" (UID: \"432a371d-d143-4da7-9332-682f52b39381\") " pod="openshift-marketplace/community-operators-p266t" Feb 19 21:34:13 crc kubenswrapper[4795]: I0219 21:34:13.497001 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/432a371d-d143-4da7-9332-682f52b39381-catalog-content\") pod \"community-operators-p266t\" (UID: \"432a371d-d143-4da7-9332-682f52b39381\") " pod="openshift-marketplace/community-operators-p266t" Feb 19 21:34:13 crc kubenswrapper[4795]: I0219 21:34:13.598200 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqmwl\" (UniqueName: \"kubernetes.io/projected/432a371d-d143-4da7-9332-682f52b39381-kube-api-access-mqmwl\") pod \"community-operators-p266t\" (UID: \"432a371d-d143-4da7-9332-682f52b39381\") " pod="openshift-marketplace/community-operators-p266t" Feb 19 21:34:13 crc kubenswrapper[4795]: I0219 21:34:13.598639 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/432a371d-d143-4da7-9332-682f52b39381-utilities\") pod 
\"community-operators-p266t\" (UID: \"432a371d-d143-4da7-9332-682f52b39381\") " pod="openshift-marketplace/community-operators-p266t" Feb 19 21:34:13 crc kubenswrapper[4795]: I0219 21:34:13.598672 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/432a371d-d143-4da7-9332-682f52b39381-catalog-content\") pod \"community-operators-p266t\" (UID: \"432a371d-d143-4da7-9332-682f52b39381\") " pod="openshift-marketplace/community-operators-p266t" Feb 19 21:34:13 crc kubenswrapper[4795]: I0219 21:34:13.599341 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/432a371d-d143-4da7-9332-682f52b39381-catalog-content\") pod \"community-operators-p266t\" (UID: \"432a371d-d143-4da7-9332-682f52b39381\") " pod="openshift-marketplace/community-operators-p266t" Feb 19 21:34:13 crc kubenswrapper[4795]: I0219 21:34:13.599350 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/432a371d-d143-4da7-9332-682f52b39381-utilities\") pod \"community-operators-p266t\" (UID: \"432a371d-d143-4da7-9332-682f52b39381\") " pod="openshift-marketplace/community-operators-p266t" Feb 19 21:34:13 crc kubenswrapper[4795]: I0219 21:34:13.619920 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqmwl\" (UniqueName: \"kubernetes.io/projected/432a371d-d143-4da7-9332-682f52b39381-kube-api-access-mqmwl\") pod \"community-operators-p266t\" (UID: \"432a371d-d143-4da7-9332-682f52b39381\") " pod="openshift-marketplace/community-operators-p266t" Feb 19 21:34:13 crc kubenswrapper[4795]: I0219 21:34:13.704521 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-p266t" Feb 19 21:34:13 crc kubenswrapper[4795]: I0219 21:34:13.876118 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mdtnr"] Feb 19 21:34:14 crc kubenswrapper[4795]: I0219 21:34:14.060755 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p266t"] Feb 19 21:34:14 crc kubenswrapper[4795]: W0219 21:34:14.096558 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod432a371d_d143_4da7_9332_682f52b39381.slice/crio-5cb7b1d8a97bf4b9862704ff25f694d6972eb35858aac2d0da3339e985d59a7e WatchSource:0}: Error finding container 5cb7b1d8a97bf4b9862704ff25f694d6972eb35858aac2d0da3339e985d59a7e: Status 404 returned error can't find the container with id 5cb7b1d8a97bf4b9862704ff25f694d6972eb35858aac2d0da3339e985d59a7e Feb 19 21:34:14 crc kubenswrapper[4795]: I0219 21:34:14.818719 4795 generic.go:334] "Generic (PLEG): container finished" podID="f457fe15-4099-4d77-8140-3297bee0a182" containerID="9ef1064bce8bd4a67adcc876213ec91f6b1c19e931d783289fdcbf08c748684f" exitCode=0 Feb 19 21:34:14 crc kubenswrapper[4795]: I0219 21:34:14.818821 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mdtnr" event={"ID":"f457fe15-4099-4d77-8140-3297bee0a182","Type":"ContainerDied","Data":"9ef1064bce8bd4a67adcc876213ec91f6b1c19e931d783289fdcbf08c748684f"} Feb 19 21:34:14 crc kubenswrapper[4795]: I0219 21:34:14.818908 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mdtnr" event={"ID":"f457fe15-4099-4d77-8140-3297bee0a182","Type":"ContainerStarted","Data":"4856b161db877886e34629abe64d1bcb0728155b7b8faf257539528a50573274"} Feb 19 21:34:14 crc kubenswrapper[4795]: I0219 21:34:14.821901 4795 generic.go:334] "Generic (PLEG): container finished" 
podID="432a371d-d143-4da7-9332-682f52b39381" containerID="daee02a63e745db155d7de8aac8f4ca682b9ef8ec8d76969ae8d79cf5644783d" exitCode=0 Feb 19 21:34:14 crc kubenswrapper[4795]: I0219 21:34:14.821934 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p266t" event={"ID":"432a371d-d143-4da7-9332-682f52b39381","Type":"ContainerDied","Data":"daee02a63e745db155d7de8aac8f4ca682b9ef8ec8d76969ae8d79cf5644783d"} Feb 19 21:34:14 crc kubenswrapper[4795]: I0219 21:34:14.821967 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p266t" event={"ID":"432a371d-d143-4da7-9332-682f52b39381","Type":"ContainerStarted","Data":"5cb7b1d8a97bf4b9862704ff25f694d6972eb35858aac2d0da3339e985d59a7e"} Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.291424 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-658fd5994d-4zhjv"] Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.291858 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-658fd5994d-4zhjv" podUID="09f05756-63fb-4c1b-b763-065b4a66ceff" containerName="controller-manager" containerID="cri-o://60f5807a8d146a89fab101c9137b8d07d3570d1b0c1032e786eccb821a4d18a5" gracePeriod=30 Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.341869 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-r8t7x"] Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.343051 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r8t7x" Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.344812 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.356274 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-r8t7x"] Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.430470 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9xpk\" (UniqueName: \"kubernetes.io/projected/4941d783-94cd-4a5c-a124-5c8751cc8494-kube-api-access-k9xpk\") pod \"redhat-marketplace-r8t7x\" (UID: \"4941d783-94cd-4a5c-a124-5c8751cc8494\") " pod="openshift-marketplace/redhat-marketplace-r8t7x" Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.430605 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4941d783-94cd-4a5c-a124-5c8751cc8494-catalog-content\") pod \"redhat-marketplace-r8t7x\" (UID: \"4941d783-94cd-4a5c-a124-5c8751cc8494\") " pod="openshift-marketplace/redhat-marketplace-r8t7x" Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.430798 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4941d783-94cd-4a5c-a124-5c8751cc8494-utilities\") pod \"redhat-marketplace-r8t7x\" (UID: \"4941d783-94cd-4a5c-a124-5c8751cc8494\") " pod="openshift-marketplace/redhat-marketplace-r8t7x" Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.531878 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4941d783-94cd-4a5c-a124-5c8751cc8494-utilities\") pod \"redhat-marketplace-r8t7x\" (UID: 
\"4941d783-94cd-4a5c-a124-5c8751cc8494\") " pod="openshift-marketplace/redhat-marketplace-r8t7x" Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.532320 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9xpk\" (UniqueName: \"kubernetes.io/projected/4941d783-94cd-4a5c-a124-5c8751cc8494-kube-api-access-k9xpk\") pod \"redhat-marketplace-r8t7x\" (UID: \"4941d783-94cd-4a5c-a124-5c8751cc8494\") " pod="openshift-marketplace/redhat-marketplace-r8t7x" Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.532358 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4941d783-94cd-4a5c-a124-5c8751cc8494-catalog-content\") pod \"redhat-marketplace-r8t7x\" (UID: \"4941d783-94cd-4a5c-a124-5c8751cc8494\") " pod="openshift-marketplace/redhat-marketplace-r8t7x" Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.532382 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4941d783-94cd-4a5c-a124-5c8751cc8494-utilities\") pod \"redhat-marketplace-r8t7x\" (UID: \"4941d783-94cd-4a5c-a124-5c8751cc8494\") " pod="openshift-marketplace/redhat-marketplace-r8t7x" Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.532787 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4941d783-94cd-4a5c-a124-5c8751cc8494-catalog-content\") pod \"redhat-marketplace-r8t7x\" (UID: \"4941d783-94cd-4a5c-a124-5c8751cc8494\") " pod="openshift-marketplace/redhat-marketplace-r8t7x" Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.557299 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9xpk\" (UniqueName: \"kubernetes.io/projected/4941d783-94cd-4a5c-a124-5c8751cc8494-kube-api-access-k9xpk\") pod \"redhat-marketplace-r8t7x\" (UID: 
\"4941d783-94cd-4a5c-a124-5c8751cc8494\") " pod="openshift-marketplace/redhat-marketplace-r8t7x" Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.653129 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-658fd5994d-4zhjv" Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.734847 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/09f05756-63fb-4c1b-b763-065b4a66ceff-proxy-ca-bundles\") pod \"09f05756-63fb-4c1b-b763-065b4a66ceff\" (UID: \"09f05756-63fb-4c1b-b763-065b4a66ceff\") " Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.734919 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09f05756-63fb-4c1b-b763-065b4a66ceff-config\") pod \"09f05756-63fb-4c1b-b763-065b4a66ceff\" (UID: \"09f05756-63fb-4c1b-b763-065b4a66ceff\") " Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.734965 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09f05756-63fb-4c1b-b763-065b4a66ceff-serving-cert\") pod \"09f05756-63fb-4c1b-b763-065b4a66ceff\" (UID: \"09f05756-63fb-4c1b-b763-065b4a66ceff\") " Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.734986 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/09f05756-63fb-4c1b-b763-065b4a66ceff-client-ca\") pod \"09f05756-63fb-4c1b-b763-065b4a66ceff\" (UID: \"09f05756-63fb-4c1b-b763-065b4a66ceff\") " Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.735022 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ngvm\" (UniqueName: \"kubernetes.io/projected/09f05756-63fb-4c1b-b763-065b4a66ceff-kube-api-access-7ngvm\") pod 
\"09f05756-63fb-4c1b-b763-065b4a66ceff\" (UID: \"09f05756-63fb-4c1b-b763-065b4a66ceff\") " Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.735809 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09f05756-63fb-4c1b-b763-065b4a66ceff-client-ca" (OuterVolumeSpecName: "client-ca") pod "09f05756-63fb-4c1b-b763-065b4a66ceff" (UID: "09f05756-63fb-4c1b-b763-065b4a66ceff"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.735928 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09f05756-63fb-4c1b-b763-065b4a66ceff-config" (OuterVolumeSpecName: "config") pod "09f05756-63fb-4c1b-b763-065b4a66ceff" (UID: "09f05756-63fb-4c1b-b763-065b4a66ceff"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.735857 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09f05756-63fb-4c1b-b763-065b4a66ceff-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "09f05756-63fb-4c1b-b763-065b4a66ceff" (UID: "09f05756-63fb-4c1b-b763-065b4a66ceff"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.739455 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r8t7x" Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.739922 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09f05756-63fb-4c1b-b763-065b4a66ceff-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09f05756-63fb-4c1b-b763-065b4a66ceff" (UID: "09f05756-63fb-4c1b-b763-065b4a66ceff"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.739921 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09f05756-63fb-4c1b-b763-065b4a66ceff-kube-api-access-7ngvm" (OuterVolumeSpecName: "kube-api-access-7ngvm") pod "09f05756-63fb-4c1b-b763-065b4a66ceff" (UID: "09f05756-63fb-4c1b-b763-065b4a66ceff"). InnerVolumeSpecName "kube-api-access-7ngvm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.832731 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p266t" event={"ID":"432a371d-d143-4da7-9332-682f52b39381","Type":"ContainerStarted","Data":"b9423a320aab48ad3d9b982af503bf9d93dbf652a3d81eb4d4c08e2b544a60d3"} Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.835807 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09f05756-63fb-4c1b-b763-065b4a66ceff-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.835831 4795 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/09f05756-63fb-4c1b-b763-065b4a66ceff-client-ca\") on node \"crc\" DevicePath \"\"" Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.835841 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ngvm\" (UniqueName: \"kubernetes.io/projected/09f05756-63fb-4c1b-b763-065b4a66ceff-kube-api-access-7ngvm\") on node \"crc\" DevicePath \"\"" Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.835849 4795 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/09f05756-63fb-4c1b-b763-065b4a66ceff-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.835858 4795 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09f05756-63fb-4c1b-b763-065b4a66ceff-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.836723 4795 generic.go:334] "Generic (PLEG): container finished" podID="09f05756-63fb-4c1b-b763-065b4a66ceff" containerID="60f5807a8d146a89fab101c9137b8d07d3570d1b0c1032e786eccb821a4d18a5" exitCode=0 Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.836801 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-658fd5994d-4zhjv" event={"ID":"09f05756-63fb-4c1b-b763-065b4a66ceff","Type":"ContainerDied","Data":"60f5807a8d146a89fab101c9137b8d07d3570d1b0c1032e786eccb821a4d18a5"} Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.836832 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-658fd5994d-4zhjv" event={"ID":"09f05756-63fb-4c1b-b763-065b4a66ceff","Type":"ContainerDied","Data":"82e2340862c779da3d50db9a4a4c3d7bdeb32be3ace73f6532ae9bd0d9d449b3"} Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.836853 4795 scope.go:117] "RemoveContainer" containerID="60f5807a8d146a89fab101c9137b8d07d3570d1b0c1032e786eccb821a4d18a5" Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.836928 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-658fd5994d-4zhjv" Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.843116 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mdtnr" event={"ID":"f457fe15-4099-4d77-8140-3297bee0a182","Type":"ContainerStarted","Data":"b2320d7447d9ae8b69ae359baad4c9c3633df43693ee76d985a2bbd216cfd589"} Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.873815 4795 scope.go:117] "RemoveContainer" containerID="60f5807a8d146a89fab101c9137b8d07d3570d1b0c1032e786eccb821a4d18a5" Feb 19 21:34:15 crc kubenswrapper[4795]: E0219 21:34:15.882870 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60f5807a8d146a89fab101c9137b8d07d3570d1b0c1032e786eccb821a4d18a5\": container with ID starting with 60f5807a8d146a89fab101c9137b8d07d3570d1b0c1032e786eccb821a4d18a5 not found: ID does not exist" containerID="60f5807a8d146a89fab101c9137b8d07d3570d1b0c1032e786eccb821a4d18a5" Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.882956 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60f5807a8d146a89fab101c9137b8d07d3570d1b0c1032e786eccb821a4d18a5"} err="failed to get container status \"60f5807a8d146a89fab101c9137b8d07d3570d1b0c1032e786eccb821a4d18a5\": rpc error: code = NotFound desc = could not find container \"60f5807a8d146a89fab101c9137b8d07d3570d1b0c1032e786eccb821a4d18a5\": container with ID starting with 60f5807a8d146a89fab101c9137b8d07d3570d1b0c1032e786eccb821a4d18a5 not found: ID does not exist" Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.885603 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-658fd5994d-4zhjv"] Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.888395 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-controller-manager/controller-manager-658fd5994d-4zhjv"] Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.908429 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-r8t7x"] Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.946491 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4v92x"] Feb 19 21:34:15 crc kubenswrapper[4795]: E0219 21:34:15.946801 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09f05756-63fb-4c1b-b763-065b4a66ceff" containerName="controller-manager" Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.946824 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="09f05756-63fb-4c1b-b763-065b4a66ceff" containerName="controller-manager" Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.947117 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="09f05756-63fb-4c1b-b763-065b4a66ceff" containerName="controller-manager" Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.954990 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4v92x" Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.961256 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 19 21:34:15 crc kubenswrapper[4795]: I0219 21:34:15.961481 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4v92x"] Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.039643 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfbhm\" (UniqueName: \"kubernetes.io/projected/4002b94b-8679-454c-a721-fa900f6cde3b-kube-api-access-vfbhm\") pod \"redhat-operators-4v92x\" (UID: \"4002b94b-8679-454c-a721-fa900f6cde3b\") " pod="openshift-marketplace/redhat-operators-4v92x" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.039685 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4002b94b-8679-454c-a721-fa900f6cde3b-catalog-content\") pod \"redhat-operators-4v92x\" (UID: \"4002b94b-8679-454c-a721-fa900f6cde3b\") " pod="openshift-marketplace/redhat-operators-4v92x" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.039717 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4002b94b-8679-454c-a721-fa900f6cde3b-utilities\") pod \"redhat-operators-4v92x\" (UID: \"4002b94b-8679-454c-a721-fa900f6cde3b\") " pod="openshift-marketplace/redhat-operators-4v92x" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.140875 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfbhm\" (UniqueName: \"kubernetes.io/projected/4002b94b-8679-454c-a721-fa900f6cde3b-kube-api-access-vfbhm\") pod \"redhat-operators-4v92x\" (UID: 
\"4002b94b-8679-454c-a721-fa900f6cde3b\") " pod="openshift-marketplace/redhat-operators-4v92x" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.140918 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4002b94b-8679-454c-a721-fa900f6cde3b-catalog-content\") pod \"redhat-operators-4v92x\" (UID: \"4002b94b-8679-454c-a721-fa900f6cde3b\") " pod="openshift-marketplace/redhat-operators-4v92x" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.140954 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4002b94b-8679-454c-a721-fa900f6cde3b-utilities\") pod \"redhat-operators-4v92x\" (UID: \"4002b94b-8679-454c-a721-fa900f6cde3b\") " pod="openshift-marketplace/redhat-operators-4v92x" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.141389 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4002b94b-8679-454c-a721-fa900f6cde3b-utilities\") pod \"redhat-operators-4v92x\" (UID: \"4002b94b-8679-454c-a721-fa900f6cde3b\") " pod="openshift-marketplace/redhat-operators-4v92x" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.141460 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4002b94b-8679-454c-a721-fa900f6cde3b-catalog-content\") pod \"redhat-operators-4v92x\" (UID: \"4002b94b-8679-454c-a721-fa900f6cde3b\") " pod="openshift-marketplace/redhat-operators-4v92x" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.157436 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfbhm\" (UniqueName: \"kubernetes.io/projected/4002b94b-8679-454c-a721-fa900f6cde3b-kube-api-access-vfbhm\") pod \"redhat-operators-4v92x\" (UID: \"4002b94b-8679-454c-a721-fa900f6cde3b\") " 
pod="openshift-marketplace/redhat-operators-4v92x" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.307031 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4v92x" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.519936 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4v92x"] Feb 19 21:34:16 crc kubenswrapper[4795]: W0219 21:34:16.530358 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4002b94b_8679_454c_a721_fa900f6cde3b.slice/crio-50fb6def2952fa87e8b69160dd8658191952e0924a74c50c7f54839084847fe3 WatchSource:0}: Error finding container 50fb6def2952fa87e8b69160dd8658191952e0924a74c50c7f54839084847fe3: Status 404 returned error can't find the container with id 50fb6def2952fa87e8b69160dd8658191952e0924a74c50c7f54839084847fe3 Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.559373 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" podUID="80407681-6091-46cc-836f-757ec4d16604" containerName="registry" containerID="cri-o://65bbbf8046489156a597ecad5046b6cb27a3c794bdbc97d6bd9820be41cf0ce9" gracePeriod=30 Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.663516 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-75dccc5c74-kxcfb"] Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.664096 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-75dccc5c74-kxcfb" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.667111 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.667256 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.667392 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.667517 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.667575 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.667701 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.673055 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.680359 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-75dccc5c74-kxcfb"] Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.757310 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f1eb72f6-8164-4492-886e-8a24ec7b56c3-client-ca\") pod \"controller-manager-75dccc5c74-kxcfb\" (UID: \"f1eb72f6-8164-4492-886e-8a24ec7b56c3\") " 
pod="openshift-controller-manager/controller-manager-75dccc5c74-kxcfb" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.757654 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f1eb72f6-8164-4492-886e-8a24ec7b56c3-proxy-ca-bundles\") pod \"controller-manager-75dccc5c74-kxcfb\" (UID: \"f1eb72f6-8164-4492-886e-8a24ec7b56c3\") " pod="openshift-controller-manager/controller-manager-75dccc5c74-kxcfb" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.757710 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmhsl\" (UniqueName: \"kubernetes.io/projected/f1eb72f6-8164-4492-886e-8a24ec7b56c3-kube-api-access-tmhsl\") pod \"controller-manager-75dccc5c74-kxcfb\" (UID: \"f1eb72f6-8164-4492-886e-8a24ec7b56c3\") " pod="openshift-controller-manager/controller-manager-75dccc5c74-kxcfb" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.757768 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1eb72f6-8164-4492-886e-8a24ec7b56c3-config\") pod \"controller-manager-75dccc5c74-kxcfb\" (UID: \"f1eb72f6-8164-4492-886e-8a24ec7b56c3\") " pod="openshift-controller-manager/controller-manager-75dccc5c74-kxcfb" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.757791 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f1eb72f6-8164-4492-886e-8a24ec7b56c3-serving-cert\") pod \"controller-manager-75dccc5c74-kxcfb\" (UID: \"f1eb72f6-8164-4492-886e-8a24ec7b56c3\") " pod="openshift-controller-manager/controller-manager-75dccc5c74-kxcfb" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.853003 4795 generic.go:334] "Generic (PLEG): container finished" podID="432a371d-d143-4da7-9332-682f52b39381" 
containerID="b9423a320aab48ad3d9b982af503bf9d93dbf652a3d81eb4d4c08e2b544a60d3" exitCode=0 Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.853069 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p266t" event={"ID":"432a371d-d143-4da7-9332-682f52b39381","Type":"ContainerDied","Data":"b9423a320aab48ad3d9b982af503bf9d93dbf652a3d81eb4d4c08e2b544a60d3"} Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.855721 4795 generic.go:334] "Generic (PLEG): container finished" podID="4002b94b-8679-454c-a721-fa900f6cde3b" containerID="ad79ddcd190a1eef44de3c7cc5a165efaf090d3940c4dfdd69cab1403f6866c2" exitCode=0 Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.856145 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4v92x" event={"ID":"4002b94b-8679-454c-a721-fa900f6cde3b","Type":"ContainerDied","Data":"ad79ddcd190a1eef44de3c7cc5a165efaf090d3940c4dfdd69cab1403f6866c2"} Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.856199 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4v92x" event={"ID":"4002b94b-8679-454c-a721-fa900f6cde3b","Type":"ContainerStarted","Data":"50fb6def2952fa87e8b69160dd8658191952e0924a74c50c7f54839084847fe3"} Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.858659 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmhsl\" (UniqueName: \"kubernetes.io/projected/f1eb72f6-8164-4492-886e-8a24ec7b56c3-kube-api-access-tmhsl\") pod \"controller-manager-75dccc5c74-kxcfb\" (UID: \"f1eb72f6-8164-4492-886e-8a24ec7b56c3\") " pod="openshift-controller-manager/controller-manager-75dccc5c74-kxcfb" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.858732 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1eb72f6-8164-4492-886e-8a24ec7b56c3-config\") pod 
\"controller-manager-75dccc5c74-kxcfb\" (UID: \"f1eb72f6-8164-4492-886e-8a24ec7b56c3\") " pod="openshift-controller-manager/controller-manager-75dccc5c74-kxcfb" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.858762 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f1eb72f6-8164-4492-886e-8a24ec7b56c3-serving-cert\") pod \"controller-manager-75dccc5c74-kxcfb\" (UID: \"f1eb72f6-8164-4492-886e-8a24ec7b56c3\") " pod="openshift-controller-manager/controller-manager-75dccc5c74-kxcfb" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.858798 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f1eb72f6-8164-4492-886e-8a24ec7b56c3-client-ca\") pod \"controller-manager-75dccc5c74-kxcfb\" (UID: \"f1eb72f6-8164-4492-886e-8a24ec7b56c3\") " pod="openshift-controller-manager/controller-manager-75dccc5c74-kxcfb" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.858835 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f1eb72f6-8164-4492-886e-8a24ec7b56c3-proxy-ca-bundles\") pod \"controller-manager-75dccc5c74-kxcfb\" (UID: \"f1eb72f6-8164-4492-886e-8a24ec7b56c3\") " pod="openshift-controller-manager/controller-manager-75dccc5c74-kxcfb" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.860469 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f1eb72f6-8164-4492-886e-8a24ec7b56c3-client-ca\") pod \"controller-manager-75dccc5c74-kxcfb\" (UID: \"f1eb72f6-8164-4492-886e-8a24ec7b56c3\") " pod="openshift-controller-manager/controller-manager-75dccc5c74-kxcfb" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.860728 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/f1eb72f6-8164-4492-886e-8a24ec7b56c3-proxy-ca-bundles\") pod \"controller-manager-75dccc5c74-kxcfb\" (UID: \"f1eb72f6-8164-4492-886e-8a24ec7b56c3\") " pod="openshift-controller-manager/controller-manager-75dccc5c74-kxcfb" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.861817 4795 generic.go:334] "Generic (PLEG): container finished" podID="4941d783-94cd-4a5c-a124-5c8751cc8494" containerID="2a13a0790dc3bb7cf68dab93c75f7b9ee68b6d88b5baf91479232ddf7a7e83a0" exitCode=0 Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.861881 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r8t7x" event={"ID":"4941d783-94cd-4a5c-a124-5c8751cc8494","Type":"ContainerDied","Data":"2a13a0790dc3bb7cf68dab93c75f7b9ee68b6d88b5baf91479232ddf7a7e83a0"} Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.861905 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r8t7x" event={"ID":"4941d783-94cd-4a5c-a124-5c8751cc8494","Type":"ContainerStarted","Data":"00bcdf3193d28659cbe747a71cf7bf86337c90e5f777d20f680713f4490a05c7"} Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.863109 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1eb72f6-8164-4492-886e-8a24ec7b56c3-config\") pod \"controller-manager-75dccc5c74-kxcfb\" (UID: \"f1eb72f6-8164-4492-886e-8a24ec7b56c3\") " pod="openshift-controller-manager/controller-manager-75dccc5c74-kxcfb" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.874369 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f1eb72f6-8164-4492-886e-8a24ec7b56c3-serving-cert\") pod \"controller-manager-75dccc5c74-kxcfb\" (UID: \"f1eb72f6-8164-4492-886e-8a24ec7b56c3\") " pod="openshift-controller-manager/controller-manager-75dccc5c74-kxcfb" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 
21:34:16.892390 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmhsl\" (UniqueName: \"kubernetes.io/projected/f1eb72f6-8164-4492-886e-8a24ec7b56c3-kube-api-access-tmhsl\") pod \"controller-manager-75dccc5c74-kxcfb\" (UID: \"f1eb72f6-8164-4492-886e-8a24ec7b56c3\") " pod="openshift-controller-manager/controller-manager-75dccc5c74-kxcfb" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.894525 4795 generic.go:334] "Generic (PLEG): container finished" podID="80407681-6091-46cc-836f-757ec4d16604" containerID="65bbbf8046489156a597ecad5046b6cb27a3c794bdbc97d6bd9820be41cf0ce9" exitCode=0 Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.894585 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" event={"ID":"80407681-6091-46cc-836f-757ec4d16604","Type":"ContainerDied","Data":"65bbbf8046489156a597ecad5046b6cb27a3c794bdbc97d6bd9820be41cf0ce9"} Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.907895 4795 generic.go:334] "Generic (PLEG): container finished" podID="f457fe15-4099-4d77-8140-3297bee0a182" containerID="b2320d7447d9ae8b69ae359baad4c9c3633df43693ee76d985a2bbd216cfd589" exitCode=0 Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.907938 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mdtnr" event={"ID":"f457fe15-4099-4d77-8140-3297bee0a182","Type":"ContainerDied","Data":"b2320d7447d9ae8b69ae359baad4c9c3633df43693ee76d985a2bbd216cfd589"} Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.907965 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mdtnr" event={"ID":"f457fe15-4099-4d77-8140-3297bee0a182","Type":"ContainerStarted","Data":"f938be8ab635ea8a8d765d8b19d4b4e613931d75d16f156e55c63f345b513f51"} Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.947606 4795 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-marketplace/certified-operators-mdtnr" podStartSLOduration=2.553936162 podStartE2EDuration="3.947591267s" podCreationTimestamp="2026-02-19 21:34:13 +0000 UTC" firstStartedPulling="2026-02-19 21:34:14.82135056 +0000 UTC m=+366.013868424" lastFinishedPulling="2026-02-19 21:34:16.215005665 +0000 UTC m=+367.407523529" observedRunningTime="2026-02-19 21:34:16.944706559 +0000 UTC m=+368.137224423" watchObservedRunningTime="2026-02-19 21:34:16.947591267 +0000 UTC m=+368.140109131" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.949143 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.959484 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b4kqs\" (UniqueName: \"kubernetes.io/projected/80407681-6091-46cc-836f-757ec4d16604-kube-api-access-b4kqs\") pod \"80407681-6091-46cc-836f-757ec4d16604\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.959542 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/80407681-6091-46cc-836f-757ec4d16604-installation-pull-secrets\") pod \"80407681-6091-46cc-836f-757ec4d16604\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.959577 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/80407681-6091-46cc-836f-757ec4d16604-ca-trust-extracted\") pod \"80407681-6091-46cc-836f-757ec4d16604\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.959638 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/80407681-6091-46cc-836f-757ec4d16604-trusted-ca\") pod \"80407681-6091-46cc-836f-757ec4d16604\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.959661 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/80407681-6091-46cc-836f-757ec4d16604-registry-tls\") pod \"80407681-6091-46cc-836f-757ec4d16604\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.959685 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/80407681-6091-46cc-836f-757ec4d16604-registry-certificates\") pod \"80407681-6091-46cc-836f-757ec4d16604\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.959813 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"80407681-6091-46cc-836f-757ec4d16604\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.959871 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/80407681-6091-46cc-836f-757ec4d16604-bound-sa-token\") pod \"80407681-6091-46cc-836f-757ec4d16604\" (UID: \"80407681-6091-46cc-836f-757ec4d16604\") " Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.960949 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80407681-6091-46cc-836f-757ec4d16604-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "80407681-6091-46cc-836f-757ec4d16604" (UID: "80407681-6091-46cc-836f-757ec4d16604"). 
InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.961409 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80407681-6091-46cc-836f-757ec4d16604-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "80407681-6091-46cc-836f-757ec4d16604" (UID: "80407681-6091-46cc-836f-757ec4d16604"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.963307 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80407681-6091-46cc-836f-757ec4d16604-kube-api-access-b4kqs" (OuterVolumeSpecName: "kube-api-access-b4kqs") pod "80407681-6091-46cc-836f-757ec4d16604" (UID: "80407681-6091-46cc-836f-757ec4d16604"). InnerVolumeSpecName "kube-api-access-b4kqs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.963629 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80407681-6091-46cc-836f-757ec4d16604-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "80407681-6091-46cc-836f-757ec4d16604" (UID: "80407681-6091-46cc-836f-757ec4d16604"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.964363 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80407681-6091-46cc-836f-757ec4d16604-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "80407681-6091-46cc-836f-757ec4d16604" (UID: "80407681-6091-46cc-836f-757ec4d16604"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.964666 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80407681-6091-46cc-836f-757ec4d16604-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "80407681-6091-46cc-836f-757ec4d16604" (UID: "80407681-6091-46cc-836f-757ec4d16604"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.977978 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "80407681-6091-46cc-836f-757ec4d16604" (UID: "80407681-6091-46cc-836f-757ec4d16604"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 19 21:34:16 crc kubenswrapper[4795]: I0219 21:34:16.978431 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80407681-6091-46cc-836f-757ec4d16604-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "80407681-6091-46cc-836f-757ec4d16604" (UID: "80407681-6091-46cc-836f-757ec4d16604"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:34:17 crc kubenswrapper[4795]: I0219 21:34:17.023810 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-75dccc5c74-kxcfb" Feb 19 21:34:17 crc kubenswrapper[4795]: I0219 21:34:17.061637 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b4kqs\" (UniqueName: \"kubernetes.io/projected/80407681-6091-46cc-836f-757ec4d16604-kube-api-access-b4kqs\") on node \"crc\" DevicePath \"\"" Feb 19 21:34:17 crc kubenswrapper[4795]: I0219 21:34:17.061686 4795 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/80407681-6091-46cc-836f-757ec4d16604-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 19 21:34:17 crc kubenswrapper[4795]: I0219 21:34:17.061698 4795 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/80407681-6091-46cc-836f-757ec4d16604-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 19 21:34:17 crc kubenswrapper[4795]: I0219 21:34:17.061707 4795 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/80407681-6091-46cc-836f-757ec4d16604-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 19 21:34:17 crc kubenswrapper[4795]: I0219 21:34:17.061716 4795 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/80407681-6091-46cc-836f-757ec4d16604-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 19 21:34:17 crc kubenswrapper[4795]: I0219 21:34:17.061724 4795 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/80407681-6091-46cc-836f-757ec4d16604-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 19 21:34:17 crc kubenswrapper[4795]: I0219 21:34:17.061732 4795 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/80407681-6091-46cc-836f-757ec4d16604-bound-sa-token\") on node \"crc\" 
DevicePath \"\"" Feb 19 21:34:17 crc kubenswrapper[4795]: I0219 21:34:17.217810 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-75dccc5c74-kxcfb"] Feb 19 21:34:17 crc kubenswrapper[4795]: I0219 21:34:17.520828 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09f05756-63fb-4c1b-b763-065b4a66ceff" path="/var/lib/kubelet/pods/09f05756-63fb-4c1b-b763-065b4a66ceff/volumes" Feb 19 21:34:17 crc kubenswrapper[4795]: I0219 21:34:17.918228 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p266t" event={"ID":"432a371d-d143-4da7-9332-682f52b39381","Type":"ContainerStarted","Data":"b211347bda79d6334829206bacffd181e7103ae2551df554c80058c66927561a"} Feb 19 21:34:17 crc kubenswrapper[4795]: I0219 21:34:17.919674 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4v92x" event={"ID":"4002b94b-8679-454c-a721-fa900f6cde3b","Type":"ContainerStarted","Data":"c68f15a983051f4f99ec5ef6d4ff27d4f6d9bb5925061af129100d919e54372c"} Feb 19 21:34:17 crc kubenswrapper[4795]: I0219 21:34:17.921861 4795 generic.go:334] "Generic (PLEG): container finished" podID="4941d783-94cd-4a5c-a124-5c8751cc8494" containerID="527326d97e5ae95c589541ded8c2267bf0cc084a5d6d6e6c0e689826ef089c37" exitCode=0 Feb 19 21:34:17 crc kubenswrapper[4795]: I0219 21:34:17.921913 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r8t7x" event={"ID":"4941d783-94cd-4a5c-a124-5c8751cc8494","Type":"ContainerDied","Data":"527326d97e5ae95c589541ded8c2267bf0cc084a5d6d6e6c0e689826ef089c37"} Feb 19 21:34:17 crc kubenswrapper[4795]: I0219 21:34:17.927407 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-75dccc5c74-kxcfb" 
event={"ID":"f1eb72f6-8164-4492-886e-8a24ec7b56c3","Type":"ContainerStarted","Data":"29d76f92b63ad5874a47fa0c6423d915bc59af679709ea0cc500dc141e825163"} Feb 19 21:34:17 crc kubenswrapper[4795]: I0219 21:34:17.927440 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-75dccc5c74-kxcfb" event={"ID":"f1eb72f6-8164-4492-886e-8a24ec7b56c3","Type":"ContainerStarted","Data":"a3658b2c4a0a12c86cabc8ed73e4998562ac8c4f155e33a413875640448d2b73"} Feb 19 21:34:17 crc kubenswrapper[4795]: I0219 21:34:17.927453 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-75dccc5c74-kxcfb" Feb 19 21:34:17 crc kubenswrapper[4795]: I0219 21:34:17.932661 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" event={"ID":"80407681-6091-46cc-836f-757ec4d16604","Type":"ContainerDied","Data":"f75c3a05080941863a756e5365daccf1f7896cc60502e7fce756c537524233eb"} Feb 19 21:34:17 crc kubenswrapper[4795]: I0219 21:34:17.932713 4795 scope.go:117] "RemoveContainer" containerID="65bbbf8046489156a597ecad5046b6cb27a3c794bdbc97d6bd9820be41cf0ce9" Feb 19 21:34:17 crc kubenswrapper[4795]: I0219 21:34:17.932825 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-4h49m" Feb 19 21:34:17 crc kubenswrapper[4795]: I0219 21:34:17.933129 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-75dccc5c74-kxcfb" Feb 19 21:34:17 crc kubenswrapper[4795]: I0219 21:34:17.940109 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-p266t" podStartSLOduration=2.497806347 podStartE2EDuration="4.940091867s" podCreationTimestamp="2026-02-19 21:34:13 +0000 UTC" firstStartedPulling="2026-02-19 21:34:14.82432737 +0000 UTC m=+366.016845234" lastFinishedPulling="2026-02-19 21:34:17.26661289 +0000 UTC m=+368.459130754" observedRunningTime="2026-02-19 21:34:17.936515321 +0000 UTC m=+369.129033185" watchObservedRunningTime="2026-02-19 21:34:17.940091867 +0000 UTC m=+369.132609731" Feb 19 21:34:17 crc kubenswrapper[4795]: I0219 21:34:17.995891 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-75dccc5c74-kxcfb" podStartSLOduration=2.995878373 podStartE2EDuration="2.995878373s" podCreationTimestamp="2026-02-19 21:34:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:34:17.995117153 +0000 UTC m=+369.187635017" watchObservedRunningTime="2026-02-19 21:34:17.995878373 +0000 UTC m=+369.188396237" Feb 19 21:34:18 crc kubenswrapper[4795]: I0219 21:34:18.044876 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4h49m"] Feb 19 21:34:18 crc kubenswrapper[4795]: I0219 21:34:18.051417 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4h49m"] Feb 19 21:34:18 crc kubenswrapper[4795]: I0219 21:34:18.938479 4795 generic.go:334] "Generic (PLEG): container 
finished" podID="4002b94b-8679-454c-a721-fa900f6cde3b" containerID="c68f15a983051f4f99ec5ef6d4ff27d4f6d9bb5925061af129100d919e54372c" exitCode=0 Feb 19 21:34:18 crc kubenswrapper[4795]: I0219 21:34:18.938555 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4v92x" event={"ID":"4002b94b-8679-454c-a721-fa900f6cde3b","Type":"ContainerDied","Data":"c68f15a983051f4f99ec5ef6d4ff27d4f6d9bb5925061af129100d919e54372c"} Feb 19 21:34:18 crc kubenswrapper[4795]: I0219 21:34:18.940354 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r8t7x" event={"ID":"4941d783-94cd-4a5c-a124-5c8751cc8494","Type":"ContainerStarted","Data":"cd0f881418c39c61d538a924534edeccfb4228f8b26bb2f170aeb97e73127009"} Feb 19 21:34:18 crc kubenswrapper[4795]: I0219 21:34:18.975754 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-r8t7x" podStartSLOduration=2.536868626 podStartE2EDuration="3.975733785s" podCreationTimestamp="2026-02-19 21:34:15 +0000 UTC" firstStartedPulling="2026-02-19 21:34:16.864248612 +0000 UTC m=+368.056766476" lastFinishedPulling="2026-02-19 21:34:18.303113771 +0000 UTC m=+369.495631635" observedRunningTime="2026-02-19 21:34:18.974554623 +0000 UTC m=+370.167072497" watchObservedRunningTime="2026-02-19 21:34:18.975733785 +0000 UTC m=+370.168251649" Feb 19 21:34:19 crc kubenswrapper[4795]: I0219 21:34:19.523132 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80407681-6091-46cc-836f-757ec4d16604" path="/var/lib/kubelet/pods/80407681-6091-46cc-836f-757ec4d16604/volumes" Feb 19 21:34:19 crc kubenswrapper[4795]: I0219 21:34:19.948681 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4v92x" event={"ID":"4002b94b-8679-454c-a721-fa900f6cde3b","Type":"ContainerStarted","Data":"cd522c11f1f693280a76f8e450998fe73c235352edcb3197b989bee12ba9092b"} Feb 19 
21:34:19 crc kubenswrapper[4795]: I0219 21:34:19.965617 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4v92x" podStartSLOduration=2.499528385 podStartE2EDuration="4.965592153s" podCreationTimestamp="2026-02-19 21:34:15 +0000 UTC" firstStartedPulling="2026-02-19 21:34:16.856746621 +0000 UTC m=+368.049264485" lastFinishedPulling="2026-02-19 21:34:19.322810349 +0000 UTC m=+370.515328253" observedRunningTime="2026-02-19 21:34:19.96398849 +0000 UTC m=+371.156506354" watchObservedRunningTime="2026-02-19 21:34:19.965592153 +0000 UTC m=+371.158110057" Feb 19 21:34:23 crc kubenswrapper[4795]: I0219 21:34:23.470580 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mdtnr" Feb 19 21:34:23 crc kubenswrapper[4795]: I0219 21:34:23.471105 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mdtnr" Feb 19 21:34:23 crc kubenswrapper[4795]: I0219 21:34:23.509397 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mdtnr" Feb 19 21:34:23 crc kubenswrapper[4795]: I0219 21:34:23.707299 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-p266t" Feb 19 21:34:23 crc kubenswrapper[4795]: I0219 21:34:23.707374 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-p266t" Feb 19 21:34:23 crc kubenswrapper[4795]: I0219 21:34:23.743261 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-p266t" Feb 19 21:34:24 crc kubenswrapper[4795]: I0219 21:34:24.007984 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mdtnr" Feb 19 21:34:24 crc kubenswrapper[4795]: I0219 
21:34:24.009944 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-p266t" Feb 19 21:34:25 crc kubenswrapper[4795]: I0219 21:34:25.740216 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-r8t7x" Feb 19 21:34:25 crc kubenswrapper[4795]: I0219 21:34:25.740547 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-r8t7x" Feb 19 21:34:25 crc kubenswrapper[4795]: I0219 21:34:25.801535 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-r8t7x" Feb 19 21:34:26 crc kubenswrapper[4795]: I0219 21:34:26.013159 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-r8t7x" Feb 19 21:34:26 crc kubenswrapper[4795]: I0219 21:34:26.308321 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4v92x" Feb 19 21:34:26 crc kubenswrapper[4795]: I0219 21:34:26.308420 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4v92x" Feb 19 21:34:26 crc kubenswrapper[4795]: I0219 21:34:26.345544 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4v92x" Feb 19 21:34:27 crc kubenswrapper[4795]: I0219 21:34:27.047614 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4v92x" Feb 19 21:34:28 crc kubenswrapper[4795]: I0219 21:34:28.427608 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Feb 19 21:34:28 crc kubenswrapper[4795]: I0219 21:34:28.427687 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 21:34:58 crc kubenswrapper[4795]: I0219 21:34:58.427568 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 21:34:58 crc kubenswrapper[4795]: I0219 21:34:58.428241 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 21:34:58 crc kubenswrapper[4795]: I0219 21:34:58.428294 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" Feb 19 21:34:58 crc kubenswrapper[4795]: I0219 21:34:58.429704 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b70534b9f8ebcb8ef865c023146a47e5407cdcbaee4d6cb7a41e8ac0daedef4a"} pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 21:34:58 crc kubenswrapper[4795]: I0219 21:34:58.429781 4795 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" containerID="cri-o://b70534b9f8ebcb8ef865c023146a47e5407cdcbaee4d6cb7a41e8ac0daedef4a" gracePeriod=600 Feb 19 21:34:59 crc kubenswrapper[4795]: I0219 21:34:59.177057 4795 generic.go:334] "Generic (PLEG): container finished" podID="7591bc58-96f5-486a-8653-0ad93938b019" containerID="b70534b9f8ebcb8ef865c023146a47e5407cdcbaee4d6cb7a41e8ac0daedef4a" exitCode=0 Feb 19 21:34:59 crc kubenswrapper[4795]: I0219 21:34:59.177128 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerDied","Data":"b70534b9f8ebcb8ef865c023146a47e5407cdcbaee4d6cb7a41e8ac0daedef4a"} Feb 19 21:34:59 crc kubenswrapper[4795]: I0219 21:34:59.177689 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerStarted","Data":"7b9a6596619327bd9c2f9e2c7e01d53fcd036e297997e2604d1242871dfda04a"} Feb 19 21:34:59 crc kubenswrapper[4795]: I0219 21:34:59.177714 4795 scope.go:117] "RemoveContainer" containerID="6bbc1f90994253aba517d1de97087dbd617934343fdb8389588863bf1305b72e" Feb 19 21:36:58 crc kubenswrapper[4795]: I0219 21:36:58.427990 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 21:36:58 crc kubenswrapper[4795]: I0219 21:36:58.429618 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 21:37:09 crc kubenswrapper[4795]: I0219 21:37:09.759274 4795 scope.go:117] "RemoveContainer" containerID="4275386b7725c8ec761b57381bafb0a0755373bf551591680c5fa169d19954f2" Feb 19 21:37:09 crc kubenswrapper[4795]: I0219 21:37:09.781032 4795 scope.go:117] "RemoveContainer" containerID="2bc16809b25a9de1b1eb527dd20e661703d80bd6e75ac128fd97a218812a8320" Feb 19 21:37:09 crc kubenswrapper[4795]: I0219 21:37:09.805694 4795 scope.go:117] "RemoveContainer" containerID="843a490c0d5b242211efc7cdd05db863b65b298c17a2a1bc1bcd8caa9584d3d3" Feb 19 21:37:09 crc kubenswrapper[4795]: I0219 21:37:09.828830 4795 scope.go:117] "RemoveContainer" containerID="099f13240cf3aea087d163d3e1052bf3c9c7278f2cd6182ccc06266ec67326ae" Feb 19 21:37:28 crc kubenswrapper[4795]: I0219 21:37:28.428346 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 21:37:28 crc kubenswrapper[4795]: I0219 21:37:28.429118 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 21:37:58 crc kubenswrapper[4795]: I0219 21:37:58.427588 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 21:37:58 crc kubenswrapper[4795]: I0219 
21:37:58.428303 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 21:37:58 crc kubenswrapper[4795]: I0219 21:37:58.428375 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" Feb 19 21:37:58 crc kubenswrapper[4795]: I0219 21:37:58.429596 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7b9a6596619327bd9c2f9e2c7e01d53fcd036e297997e2604d1242871dfda04a"} pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 21:37:58 crc kubenswrapper[4795]: I0219 21:37:58.429698 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" containerID="cri-o://7b9a6596619327bd9c2f9e2c7e01d53fcd036e297997e2604d1242871dfda04a" gracePeriod=600 Feb 19 21:37:59 crc kubenswrapper[4795]: I0219 21:37:59.345865 4795 generic.go:334] "Generic (PLEG): container finished" podID="7591bc58-96f5-486a-8653-0ad93938b019" containerID="7b9a6596619327bd9c2f9e2c7e01d53fcd036e297997e2604d1242871dfda04a" exitCode=0 Feb 19 21:37:59 crc kubenswrapper[4795]: I0219 21:37:59.345953 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerDied","Data":"7b9a6596619327bd9c2f9e2c7e01d53fcd036e297997e2604d1242871dfda04a"} Feb 19 21:37:59 crc 
kubenswrapper[4795]: I0219 21:37:59.346478 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerStarted","Data":"01e410588eeb6332f2520524efa20c5c33620bd277e99c49543b745d9bb370ca"} Feb 19 21:37:59 crc kubenswrapper[4795]: I0219 21:37:59.346523 4795 scope.go:117] "RemoveContainer" containerID="b70534b9f8ebcb8ef865c023146a47e5407cdcbaee4d6cb7a41e8ac0daedef4a" Feb 19 21:38:09 crc kubenswrapper[4795]: I0219 21:38:09.860664 4795 scope.go:117] "RemoveContainer" containerID="e2d47055ec62ef6c48d9090ad6f5104fd8d79584ae42ab1689cf9d640447aeec" Feb 19 21:38:09 crc kubenswrapper[4795]: I0219 21:38:09.879940 4795 scope.go:117] "RemoveContainer" containerID="e2b3b578e0a130165bd461f501cf99b487cf4f990927f2541dd89aab2055e28d" Feb 19 21:39:15 crc kubenswrapper[4795]: I0219 21:39:15.839725 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4qphl"] Feb 19 21:39:15 crc kubenswrapper[4795]: I0219 21:39:15.840741 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="ovn-controller" containerID="cri-o://83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb" gracePeriod=30 Feb 19 21:39:15 crc kubenswrapper[4795]: I0219 21:39:15.840785 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="nbdb" containerID="cri-o://b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab" gracePeriod=30 Feb 19 21:39:15 crc kubenswrapper[4795]: I0219 21:39:15.840866 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" 
containerName="northd" containerID="cri-o://c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29" gracePeriod=30 Feb 19 21:39:15 crc kubenswrapper[4795]: I0219 21:39:15.840916 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a" gracePeriod=30 Feb 19 21:39:15 crc kubenswrapper[4795]: I0219 21:39:15.840935 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="sbdb" containerID="cri-o://d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00" gracePeriod=30 Feb 19 21:39:15 crc kubenswrapper[4795]: I0219 21:39:15.840961 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="kube-rbac-proxy-node" containerID="cri-o://c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475" gracePeriod=30 Feb 19 21:39:15 crc kubenswrapper[4795]: I0219 21:39:15.841006 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="ovn-acl-logging" containerID="cri-o://10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9" gracePeriod=30 Feb 19 21:39:15 crc kubenswrapper[4795]: I0219 21:39:15.876558 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="ovnkube-controller" containerID="cri-o://3a55f14e05508c6035437477613aca541a57e877c0402dec1f23f8bee280ec81" gracePeriod=30 Feb 19 21:39:16 crc 
kubenswrapper[4795]: I0219 21:39:16.121255 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4qphl_adf5bd36-b46b-4a06-8291-cae9f3988330/ovnkube-controller/3.log" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.123440 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4qphl_adf5bd36-b46b-4a06-8291-cae9f3988330/ovn-acl-logging/0.log" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.123921 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4qphl_adf5bd36-b46b-4a06-8291-cae9f3988330/ovn-controller/0.log" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.124536 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.173046 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-pf8fb"] Feb 19 21:39:16 crc kubenswrapper[4795]: E0219 21:39:16.173309 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="ovn-controller" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.173327 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="ovn-controller" Feb 19 21:39:16 crc kubenswrapper[4795]: E0219 21:39:16.173344 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="ovnkube-controller" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.173352 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="ovnkube-controller" Feb 19 21:39:16 crc kubenswrapper[4795]: E0219 21:39:16.173365 4795 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="nbdb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.173373 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="nbdb" Feb 19 21:39:16 crc kubenswrapper[4795]: E0219 21:39:16.173386 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="kubecfg-setup" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.173393 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="kubecfg-setup" Feb 19 21:39:16 crc kubenswrapper[4795]: E0219 21:39:16.173401 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="ovnkube-controller" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.173408 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="ovnkube-controller" Feb 19 21:39:16 crc kubenswrapper[4795]: E0219 21:39:16.173417 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="kube-rbac-proxy-node" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.173424 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="kube-rbac-proxy-node" Feb 19 21:39:16 crc kubenswrapper[4795]: E0219 21:39:16.173433 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="ovnkube-controller" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.173441 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="ovnkube-controller" Feb 19 21:39:16 crc kubenswrapper[4795]: E0219 21:39:16.173450 4795 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="ovnkube-controller" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.173458 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="ovnkube-controller" Feb 19 21:39:16 crc kubenswrapper[4795]: E0219 21:39:16.173469 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80407681-6091-46cc-836f-757ec4d16604" containerName="registry" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.173475 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="80407681-6091-46cc-836f-757ec4d16604" containerName="registry" Feb 19 21:39:16 crc kubenswrapper[4795]: E0219 21:39:16.173485 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="ovnkube-controller" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.173491 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="ovnkube-controller" Feb 19 21:39:16 crc kubenswrapper[4795]: E0219 21:39:16.173501 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="sbdb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.173508 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="sbdb" Feb 19 21:39:16 crc kubenswrapper[4795]: E0219 21:39:16.173518 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="kube-rbac-proxy-ovn-metrics" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.173527 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="kube-rbac-proxy-ovn-metrics" Feb 19 21:39:16 crc kubenswrapper[4795]: E0219 21:39:16.173537 4795 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="ovn-acl-logging" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.173545 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="ovn-acl-logging" Feb 19 21:39:16 crc kubenswrapper[4795]: E0219 21:39:16.173555 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="northd" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.173562 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="northd" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.173664 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="ovnkube-controller" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.173675 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="80407681-6091-46cc-836f-757ec4d16604" containerName="registry" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.173685 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="ovnkube-controller" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.173692 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="ovnkube-controller" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.173703 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="northd" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.173713 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="kube-rbac-proxy-ovn-metrics" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.173724 4795 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="kube-rbac-proxy-node" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.173732 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="nbdb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.173741 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="ovn-acl-logging" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.173752 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="sbdb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.173762 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="ovn-controller" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.173950 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="ovnkube-controller" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.173960 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerName="ovnkube-controller" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.175735 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.239295 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-run-ovn-kubernetes\") pod \"adf5bd36-b46b-4a06-8291-cae9f3988330\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.239670 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6nrjb\" (UniqueName: \"kubernetes.io/projected/adf5bd36-b46b-4a06-8291-cae9f3988330-kube-api-access-6nrjb\") pod \"adf5bd36-b46b-4a06-8291-cae9f3988330\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.239691 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-slash\") pod \"adf5bd36-b46b-4a06-8291-cae9f3988330\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.239708 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-run-systemd\") pod \"adf5bd36-b46b-4a06-8291-cae9f3988330\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.239728 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-cni-bin\") pod \"adf5bd36-b46b-4a06-8291-cae9f3988330\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.239747 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-var-lib-openvswitch\") pod \"adf5bd36-b46b-4a06-8291-cae9f3988330\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.239404 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "adf5bd36-b46b-4a06-8291-cae9f3988330" (UID: "adf5bd36-b46b-4a06-8291-cae9f3988330"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.239774 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-slash" (OuterVolumeSpecName: "host-slash") pod "adf5bd36-b46b-4a06-8291-cae9f3988330" (UID: "adf5bd36-b46b-4a06-8291-cae9f3988330"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.239811 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "adf5bd36-b46b-4a06-8291-cae9f3988330" (UID: "adf5bd36-b46b-4a06-8291-cae9f3988330"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.239822 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "adf5bd36-b46b-4a06-8291-cae9f3988330" (UID: "adf5bd36-b46b-4a06-8291-cae9f3988330"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.239825 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "adf5bd36-b46b-4a06-8291-cae9f3988330" (UID: "adf5bd36-b46b-4a06-8291-cae9f3988330"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.239769 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-run-netns\") pod \"adf5bd36-b46b-4a06-8291-cae9f3988330\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.239945 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/adf5bd36-b46b-4a06-8291-cae9f3988330-env-overrides\") pod \"adf5bd36-b46b-4a06-8291-cae9f3988330\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.239972 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-run-ovn\") pod \"adf5bd36-b46b-4a06-8291-cae9f3988330\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.240000 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/adf5bd36-b46b-4a06-8291-cae9f3988330-ovnkube-config\") pod \"adf5bd36-b46b-4a06-8291-cae9f3988330\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.240033 4795 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-var-lib-cni-networks-ovn-kubernetes\") pod \"adf5bd36-b46b-4a06-8291-cae9f3988330\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.240059 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-log-socket\") pod \"adf5bd36-b46b-4a06-8291-cae9f3988330\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.240062 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "adf5bd36-b46b-4a06-8291-cae9f3988330" (UID: "adf5bd36-b46b-4a06-8291-cae9f3988330"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.240089 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-systemd-units\") pod \"adf5bd36-b46b-4a06-8291-cae9f3988330\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.240127 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/adf5bd36-b46b-4a06-8291-cae9f3988330-ovnkube-script-lib\") pod \"adf5bd36-b46b-4a06-8291-cae9f3988330\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.240148 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-etc-openvswitch\") pod \"adf5bd36-b46b-4a06-8291-cae9f3988330\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.240118 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "adf5bd36-b46b-4a06-8291-cae9f3988330" (UID: "adf5bd36-b46b-4a06-8291-cae9f3988330"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.240147 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "adf5bd36-b46b-4a06-8291-cae9f3988330" (UID: "adf5bd36-b46b-4a06-8291-cae9f3988330"). 
InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.240186 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-cni-netd\") pod \"adf5bd36-b46b-4a06-8291-cae9f3988330\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.240205 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "adf5bd36-b46b-4a06-8291-cae9f3988330" (UID: "adf5bd36-b46b-4a06-8291-cae9f3988330"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.240210 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/adf5bd36-b46b-4a06-8291-cae9f3988330-ovn-node-metrics-cert\") pod \"adf5bd36-b46b-4a06-8291-cae9f3988330\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.240226 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "adf5bd36-b46b-4a06-8291-cae9f3988330" (UID: "adf5bd36-b46b-4a06-8291-cae9f3988330"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.240232 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-run-openvswitch\") pod \"adf5bd36-b46b-4a06-8291-cae9f3988330\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.240257 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-node-log\") pod \"adf5bd36-b46b-4a06-8291-cae9f3988330\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.240280 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-kubelet\") pod \"adf5bd36-b46b-4a06-8291-cae9f3988330\" (UID: \"adf5bd36-b46b-4a06-8291-cae9f3988330\") " Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.240280 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-log-socket" (OuterVolumeSpecName: "log-socket") pod "adf5bd36-b46b-4a06-8291-cae9f3988330" (UID: "adf5bd36-b46b-4a06-8291-cae9f3988330"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.240336 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "adf5bd36-b46b-4a06-8291-cae9f3988330" (UID: "adf5bd36-b46b-4a06-8291-cae9f3988330"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.240365 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-node-log" (OuterVolumeSpecName: "node-log") pod "adf5bd36-b46b-4a06-8291-cae9f3988330" (UID: "adf5bd36-b46b-4a06-8291-cae9f3988330"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.240432 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-host-run-netns\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.240470 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-var-lib-openvswitch\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.240458 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "adf5bd36-b46b-4a06-8291-cae9f3988330" (UID: "adf5bd36-b46b-4a06-8291-cae9f3988330"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.240531 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-host-cni-bin\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.240564 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-host-kubelet\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.240593 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/24d72341-090a-4e01-bc3f-9e04becb3500-ovn-node-metrics-cert\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.240617 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-node-log\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.240659 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/24d72341-090a-4e01-bc3f-9e04becb3500-ovnkube-script-lib\") pod \"ovnkube-node-pf8fb\" (UID: 
\"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.240786 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/adf5bd36-b46b-4a06-8291-cae9f3988330-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "adf5bd36-b46b-4a06-8291-cae9f3988330" (UID: "adf5bd36-b46b-4a06-8291-cae9f3988330"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.240778 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-host-cni-netd\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.240866 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-host-slash\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.240904 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/24d72341-090a-4e01-bc3f-9e04becb3500-env-overrides\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.240908 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/adf5bd36-b46b-4a06-8291-cae9f3988330-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod 
"adf5bd36-b46b-4a06-8291-cae9f3988330" (UID: "adf5bd36-b46b-4a06-8291-cae9f3988330"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.240930 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/adf5bd36-b46b-4a06-8291-cae9f3988330-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "adf5bd36-b46b-4a06-8291-cae9f3988330" (UID: "adf5bd36-b46b-4a06-8291-cae9f3988330"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.240957 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zv45v\" (UniqueName: \"kubernetes.io/projected/24d72341-090a-4e01-bc3f-9e04becb3500-kube-api-access-zv45v\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.240980 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-host-run-ovn-kubernetes\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.240998 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-log-socket\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.241014 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/24d72341-090a-4e01-bc3f-9e04becb3500-ovnkube-config\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.241096 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-run-openvswitch\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.241148 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-systemd-units\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.241206 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-run-systemd\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.241303 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-run-ovn\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.241372 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.241493 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-etc-openvswitch\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.241647 4795 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/adf5bd36-b46b-4a06-8291-cae9f3988330-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.241678 4795 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.241697 4795 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-cni-netd\") on node \"crc\" DevicePath \"\"" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.241714 4795 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-run-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.241731 4795 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-node-log\") on node \"crc\" DevicePath \"\"" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.241748 4795 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-kubelet\") on node \"crc\" DevicePath \"\"" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.241766 4795 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.241786 4795 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-slash\") on node \"crc\" DevicePath \"\"" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.241803 4795 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-cni-bin\") on node \"crc\" DevicePath \"\"" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.241820 4795 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.241836 4795 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-run-netns\") on node \"crc\" DevicePath \"\"" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.241852 4795 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/adf5bd36-b46b-4a06-8291-cae9f3988330-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 
19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.241869 4795 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.241887 4795 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/adf5bd36-b46b-4a06-8291-cae9f3988330-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.241904 4795 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.241920 4795 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-log-socket\") on node \"crc\" DevicePath \"\"" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.241937 4795 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-systemd-units\") on node \"crc\" DevicePath \"\"" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.245704 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/adf5bd36-b46b-4a06-8291-cae9f3988330-kube-api-access-6nrjb" (OuterVolumeSpecName: "kube-api-access-6nrjb") pod "adf5bd36-b46b-4a06-8291-cae9f3988330" (UID: "adf5bd36-b46b-4a06-8291-cae9f3988330"). InnerVolumeSpecName "kube-api-access-6nrjb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.246284 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adf5bd36-b46b-4a06-8291-cae9f3988330-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "adf5bd36-b46b-4a06-8291-cae9f3988330" (UID: "adf5bd36-b46b-4a06-8291-cae9f3988330"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.255675 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "adf5bd36-b46b-4a06-8291-cae9f3988330" (UID: "adf5bd36-b46b-4a06-8291-cae9f3988330"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.343560 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-run-ovn\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.343641 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.343684 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-run-ovn\") pod 
\"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.343742 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-etc-openvswitch\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.343685 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-etc-openvswitch\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.343756 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.343792 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-host-run-netns\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.343821 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-var-lib-openvswitch\") pod \"ovnkube-node-pf8fb\" (UID: 
\"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.343840 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-host-cni-bin\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.343861 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-host-kubelet\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.343876 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/24d72341-090a-4e01-bc3f-9e04becb3500-ovn-node-metrics-cert\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.343891 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-node-log\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.343919 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/24d72341-090a-4e01-bc3f-9e04becb3500-ovnkube-script-lib\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.343937 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-host-cni-netd\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.343949 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-host-run-netns\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.343960 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-host-slash\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.343998 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-node-log\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.344034 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/24d72341-090a-4e01-bc3f-9e04becb3500-env-overrides\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.344080 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zv45v\" (UniqueName: \"kubernetes.io/projected/24d72341-090a-4e01-bc3f-9e04becb3500-kube-api-access-zv45v\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.344120 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-host-run-ovn-kubernetes\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.344152 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/24d72341-090a-4e01-bc3f-9e04becb3500-ovnkube-config\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.344242 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-log-socket\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.344283 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-run-openvswitch\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.344313 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-systemd-units\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.344339 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-run-systemd\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.344396 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6nrjb\" (UniqueName: \"kubernetes.io/projected/adf5bd36-b46b-4a06-8291-cae9f3988330-kube-api-access-6nrjb\") on node \"crc\" DevicePath \"\"" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.344416 4795 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/adf5bd36-b46b-4a06-8291-cae9f3988330-run-systemd\") on node \"crc\" DevicePath \"\"" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.344438 4795 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/adf5bd36-b46b-4a06-8291-cae9f3988330-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.344484 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-run-systemd\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.344527 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-host-cni-bin\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.344571 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-host-kubelet\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.344722 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/24d72341-090a-4e01-bc3f-9e04becb3500-ovnkube-script-lib\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.343917 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-var-lib-openvswitch\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.344770 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-host-cni-netd\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.344793 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-host-slash\") pod 
\"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.345194 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/24d72341-090a-4e01-bc3f-9e04becb3500-ovnkube-config\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.345347 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-run-openvswitch\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.345413 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-log-socket\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.345449 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-systemd-units\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.345472 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/24d72341-090a-4e01-bc3f-9e04becb3500-env-overrides\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 
21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.345502 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/24d72341-090a-4e01-bc3f-9e04becb3500-host-run-ovn-kubernetes\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.348555 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/24d72341-090a-4e01-bc3f-9e04becb3500-ovn-node-metrics-cert\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.362275 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zv45v\" (UniqueName: \"kubernetes.io/projected/24d72341-090a-4e01-bc3f-9e04becb3500-kube-api-access-zv45v\") pod \"ovnkube-node-pf8fb\" (UID: \"24d72341-090a-4e01-bc3f-9e04becb3500\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.490832 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.792779 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5p6d9_e967392b-9bd8-4111-b1b9-96d503a19668/kube-multus/2.log" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.793549 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5p6d9_e967392b-9bd8-4111-b1b9-96d503a19668/kube-multus/1.log" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.793638 4795 generic.go:334] "Generic (PLEG): container finished" podID="e967392b-9bd8-4111-b1b9-96d503a19668" containerID="2822f57eb80a87e2600f1bdd3a3189e4bded6fbc0e489210ee08ea133cbf2aa9" exitCode=2 Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.793725 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5p6d9" event={"ID":"e967392b-9bd8-4111-b1b9-96d503a19668","Type":"ContainerDied","Data":"2822f57eb80a87e2600f1bdd3a3189e4bded6fbc0e489210ee08ea133cbf2aa9"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.793806 4795 scope.go:117] "RemoveContainer" containerID="02e3717583e910ca7356ee58e22d63f40252488fb8540be7a30ab0fc918b908b" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.794328 4795 scope.go:117] "RemoveContainer" containerID="2822f57eb80a87e2600f1bdd3a3189e4bded6fbc0e489210ee08ea133cbf2aa9" Feb 19 21:39:16 crc kubenswrapper[4795]: E0219 21:39:16.794592 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-5p6d9_openshift-multus(e967392b-9bd8-4111-b1b9-96d503a19668)\"" pod="openshift-multus/multus-5p6d9" podUID="e967392b-9bd8-4111-b1b9-96d503a19668" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.799511 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4qphl_adf5bd36-b46b-4a06-8291-cae9f3988330/ovnkube-controller/3.log" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.801564 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4qphl_adf5bd36-b46b-4a06-8291-cae9f3988330/ovn-acl-logging/0.log" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.801972 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4qphl_adf5bd36-b46b-4a06-8291-cae9f3988330/ovn-controller/0.log" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.802520 4795 generic.go:334] "Generic (PLEG): container finished" podID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerID="3a55f14e05508c6035437477613aca541a57e877c0402dec1f23f8bee280ec81" exitCode=0 Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.802602 4795 generic.go:334] "Generic (PLEG): container finished" podID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerID="d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00" exitCode=0 Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.802658 4795 generic.go:334] "Generic (PLEG): container finished" podID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerID="b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab" exitCode=0 Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.802706 4795 generic.go:334] "Generic (PLEG): container finished" podID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerID="c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29" exitCode=0 Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.802774 4795 generic.go:334] "Generic (PLEG): container finished" podID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerID="05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a" exitCode=0 Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.802827 4795 generic.go:334] "Generic (PLEG): container finished" 
podID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerID="c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475" exitCode=0 Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.802882 4795 generic.go:334] "Generic (PLEG): container finished" podID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerID="10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9" exitCode=143 Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.802930 4795 generic.go:334] "Generic (PLEG): container finished" podID="adf5bd36-b46b-4a06-8291-cae9f3988330" containerID="83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb" exitCode=143 Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.802826 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" event={"ID":"adf5bd36-b46b-4a06-8291-cae9f3988330","Type":"ContainerDied","Data":"3a55f14e05508c6035437477613aca541a57e877c0402dec1f23f8bee280ec81"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803120 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" event={"ID":"adf5bd36-b46b-4a06-8291-cae9f3988330","Type":"ContainerDied","Data":"d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803186 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" event={"ID":"adf5bd36-b46b-4a06-8291-cae9f3988330","Type":"ContainerDied","Data":"b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803207 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" event={"ID":"adf5bd36-b46b-4a06-8291-cae9f3988330","Type":"ContainerDied","Data":"c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803237 4795 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" event={"ID":"adf5bd36-b46b-4a06-8291-cae9f3988330","Type":"ContainerDied","Data":"05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803260 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" event={"ID":"adf5bd36-b46b-4a06-8291-cae9f3988330","Type":"ContainerDied","Data":"c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803277 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3a55f14e05508c6035437477613aca541a57e877c0402dec1f23f8bee280ec81"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803295 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803304 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803314 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803324 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803334 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803344 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803354 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803364 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803373 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803388 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" event={"ID":"adf5bd36-b46b-4a06-8291-cae9f3988330","Type":"ContainerDied","Data":"10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803404 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3a55f14e05508c6035437477613aca541a57e877c0402dec1f23f8bee280ec81"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803416 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803425 4795 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803435 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803445 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803453 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803462 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803471 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803480 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803488 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803503 4795 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" event={"ID":"adf5bd36-b46b-4a06-8291-cae9f3988330","Type":"ContainerDied","Data":"83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803520 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3a55f14e05508c6035437477613aca541a57e877c0402dec1f23f8bee280ec81"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803532 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803541 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803548 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803555 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803565 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803571 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475"} Feb 19 
21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803578 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803585 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803591 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803601 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" event={"ID":"adf5bd36-b46b-4a06-8291-cae9f3988330","Type":"ContainerDied","Data":"740cf96667070bca195c02d1eaa75a11754b5d3aaf7c73fedd2e4e883f6a4193"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803614 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3a55f14e05508c6035437477613aca541a57e877c0402dec1f23f8bee280ec81"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803623 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803631 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803638 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803645 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803652 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803659 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803666 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803673 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.803680 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.802745 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4qphl" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.805277 4795 generic.go:334] "Generic (PLEG): container finished" podID="24d72341-090a-4e01-bc3f-9e04becb3500" containerID="193aac3fcec4da276a7aec22b1427ad4f66933bf38a55fc8aeeb45f9b5a11fd4" exitCode=0 Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.805369 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" event={"ID":"24d72341-090a-4e01-bc3f-9e04becb3500","Type":"ContainerDied","Data":"193aac3fcec4da276a7aec22b1427ad4f66933bf38a55fc8aeeb45f9b5a11fd4"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.805442 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" event={"ID":"24d72341-090a-4e01-bc3f-9e04becb3500","Type":"ContainerStarted","Data":"6c92282b9b9b80b829143d77ba70c574b5e19c9e3fe2b59c01addfa468227c9d"} Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.835181 4795 scope.go:117] "RemoveContainer" containerID="3a55f14e05508c6035437477613aca541a57e877c0402dec1f23f8bee280ec81" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.875177 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4qphl"] Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.879263 4795 scope.go:117] "RemoveContainer" containerID="b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.881674 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4qphl"] Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.917810 4795 scope.go:117] "RemoveContainer" containerID="d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.930131 4795 scope.go:117] "RemoveContainer" 
containerID="b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.943233 4795 scope.go:117] "RemoveContainer" containerID="c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.958404 4795 scope.go:117] "RemoveContainer" containerID="05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a" Feb 19 21:39:16 crc kubenswrapper[4795]: I0219 21:39:16.988400 4795 scope.go:117] "RemoveContainer" containerID="c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.012245 4795 scope.go:117] "RemoveContainer" containerID="10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.038804 4795 scope.go:117] "RemoveContainer" containerID="83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.054381 4795 scope.go:117] "RemoveContainer" containerID="cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.080714 4795 scope.go:117] "RemoveContainer" containerID="3a55f14e05508c6035437477613aca541a57e877c0402dec1f23f8bee280ec81" Feb 19 21:39:17 crc kubenswrapper[4795]: E0219 21:39:17.081104 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a55f14e05508c6035437477613aca541a57e877c0402dec1f23f8bee280ec81\": container with ID starting with 3a55f14e05508c6035437477613aca541a57e877c0402dec1f23f8bee280ec81 not found: ID does not exist" containerID="3a55f14e05508c6035437477613aca541a57e877c0402dec1f23f8bee280ec81" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.081136 4795 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3a55f14e05508c6035437477613aca541a57e877c0402dec1f23f8bee280ec81"} err="failed to get container status \"3a55f14e05508c6035437477613aca541a57e877c0402dec1f23f8bee280ec81\": rpc error: code = NotFound desc = could not find container \"3a55f14e05508c6035437477613aca541a57e877c0402dec1f23f8bee280ec81\": container with ID starting with 3a55f14e05508c6035437477613aca541a57e877c0402dec1f23f8bee280ec81 not found: ID does not exist" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.081159 4795 scope.go:117] "RemoveContainer" containerID="b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d" Feb 19 21:39:17 crc kubenswrapper[4795]: E0219 21:39:17.081682 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d\": container with ID starting with b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d not found: ID does not exist" containerID="b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.081702 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d"} err="failed to get container status \"b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d\": rpc error: code = NotFound desc = could not find container \"b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d\": container with ID starting with b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d not found: ID does not exist" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.081714 4795 scope.go:117] "RemoveContainer" containerID="d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00" Feb 19 21:39:17 crc kubenswrapper[4795]: E0219 21:39:17.081997 4795 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00\": container with ID starting with d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00 not found: ID does not exist" containerID="d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.082054 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00"} err="failed to get container status \"d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00\": rpc error: code = NotFound desc = could not find container \"d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00\": container with ID starting with d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00 not found: ID does not exist" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.082069 4795 scope.go:117] "RemoveContainer" containerID="b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab" Feb 19 21:39:17 crc kubenswrapper[4795]: E0219 21:39:17.082410 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab\": container with ID starting with b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab not found: ID does not exist" containerID="b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.082445 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab"} err="failed to get container status \"b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab\": rpc error: code = NotFound desc = could not find container 
\"b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab\": container with ID starting with b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab not found: ID does not exist" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.082463 4795 scope.go:117] "RemoveContainer" containerID="c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29" Feb 19 21:39:17 crc kubenswrapper[4795]: E0219 21:39:17.082760 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29\": container with ID starting with c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29 not found: ID does not exist" containerID="c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.082781 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29"} err="failed to get container status \"c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29\": rpc error: code = NotFound desc = could not find container \"c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29\": container with ID starting with c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29 not found: ID does not exist" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.082793 4795 scope.go:117] "RemoveContainer" containerID="05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a" Feb 19 21:39:17 crc kubenswrapper[4795]: E0219 21:39:17.083179 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a\": container with ID starting with 05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a not found: ID does not exist" 
containerID="05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.083208 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a"} err="failed to get container status \"05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a\": rpc error: code = NotFound desc = could not find container \"05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a\": container with ID starting with 05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a not found: ID does not exist" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.083228 4795 scope.go:117] "RemoveContainer" containerID="c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475" Feb 19 21:39:17 crc kubenswrapper[4795]: E0219 21:39:17.083472 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475\": container with ID starting with c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475 not found: ID does not exist" containerID="c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.083494 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475"} err="failed to get container status \"c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475\": rpc error: code = NotFound desc = could not find container \"c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475\": container with ID starting with c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475 not found: ID does not exist" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.083507 4795 scope.go:117] 
"RemoveContainer" containerID="10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9" Feb 19 21:39:17 crc kubenswrapper[4795]: E0219 21:39:17.083813 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9\": container with ID starting with 10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9 not found: ID does not exist" containerID="10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.083833 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9"} err="failed to get container status \"10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9\": rpc error: code = NotFound desc = could not find container \"10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9\": container with ID starting with 10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9 not found: ID does not exist" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.083845 4795 scope.go:117] "RemoveContainer" containerID="83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb" Feb 19 21:39:17 crc kubenswrapper[4795]: E0219 21:39:17.084052 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb\": container with ID starting with 83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb not found: ID does not exist" containerID="83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.084071 4795 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb"} err="failed to get container status \"83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb\": rpc error: code = NotFound desc = could not find container \"83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb\": container with ID starting with 83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb not found: ID does not exist" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.084083 4795 scope.go:117] "RemoveContainer" containerID="cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c" Feb 19 21:39:17 crc kubenswrapper[4795]: E0219 21:39:17.084351 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\": container with ID starting with cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c not found: ID does not exist" containerID="cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.084379 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c"} err="failed to get container status \"cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\": rpc error: code = NotFound desc = could not find container \"cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\": container with ID starting with cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c not found: ID does not exist" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.084397 4795 scope.go:117] "RemoveContainer" containerID="3a55f14e05508c6035437477613aca541a57e877c0402dec1f23f8bee280ec81" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.084826 4795 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"3a55f14e05508c6035437477613aca541a57e877c0402dec1f23f8bee280ec81"} err="failed to get container status \"3a55f14e05508c6035437477613aca541a57e877c0402dec1f23f8bee280ec81\": rpc error: code = NotFound desc = could not find container \"3a55f14e05508c6035437477613aca541a57e877c0402dec1f23f8bee280ec81\": container with ID starting with 3a55f14e05508c6035437477613aca541a57e877c0402dec1f23f8bee280ec81 not found: ID does not exist" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.084843 4795 scope.go:117] "RemoveContainer" containerID="b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.085048 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d"} err="failed to get container status \"b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d\": rpc error: code = NotFound desc = could not find container \"b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d\": container with ID starting with b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d not found: ID does not exist" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.085064 4795 scope.go:117] "RemoveContainer" containerID="d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.085746 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00"} err="failed to get container status \"d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00\": rpc error: code = NotFound desc = could not find container \"d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00\": container with ID starting with d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00 not 
found: ID does not exist" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.085770 4795 scope.go:117] "RemoveContainer" containerID="b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.086059 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab"} err="failed to get container status \"b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab\": rpc error: code = NotFound desc = could not find container \"b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab\": container with ID starting with b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab not found: ID does not exist" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.086086 4795 scope.go:117] "RemoveContainer" containerID="c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.086476 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29"} err="failed to get container status \"c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29\": rpc error: code = NotFound desc = could not find container \"c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29\": container with ID starting with c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29 not found: ID does not exist" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.086519 4795 scope.go:117] "RemoveContainer" containerID="05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.086768 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a"} err="failed to get 
container status \"05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a\": rpc error: code = NotFound desc = could not find container \"05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a\": container with ID starting with 05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a not found: ID does not exist" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.086811 4795 scope.go:117] "RemoveContainer" containerID="c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.087349 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475"} err="failed to get container status \"c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475\": rpc error: code = NotFound desc = could not find container \"c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475\": container with ID starting with c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475 not found: ID does not exist" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.087372 4795 scope.go:117] "RemoveContainer" containerID="10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.087710 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9"} err="failed to get container status \"10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9\": rpc error: code = NotFound desc = could not find container \"10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9\": container with ID starting with 10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9 not found: ID does not exist" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.087736 4795 scope.go:117] "RemoveContainer" 
containerID="83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.087964 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb"} err="failed to get container status \"83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb\": rpc error: code = NotFound desc = could not find container \"83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb\": container with ID starting with 83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb not found: ID does not exist" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.088010 4795 scope.go:117] "RemoveContainer" containerID="cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.088332 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c"} err="failed to get container status \"cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\": rpc error: code = NotFound desc = could not find container \"cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\": container with ID starting with cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c not found: ID does not exist" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.088374 4795 scope.go:117] "RemoveContainer" containerID="3a55f14e05508c6035437477613aca541a57e877c0402dec1f23f8bee280ec81" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.088746 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a55f14e05508c6035437477613aca541a57e877c0402dec1f23f8bee280ec81"} err="failed to get container status \"3a55f14e05508c6035437477613aca541a57e877c0402dec1f23f8bee280ec81\": rpc error: code = NotFound desc = could 
not find container \"3a55f14e05508c6035437477613aca541a57e877c0402dec1f23f8bee280ec81\": container with ID starting with 3a55f14e05508c6035437477613aca541a57e877c0402dec1f23f8bee280ec81 not found: ID does not exist" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.088794 4795 scope.go:117] "RemoveContainer" containerID="b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.089072 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d"} err="failed to get container status \"b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d\": rpc error: code = NotFound desc = could not find container \"b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d\": container with ID starting with b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d not found: ID does not exist" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.089093 4795 scope.go:117] "RemoveContainer" containerID="d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.089310 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00"} err="failed to get container status \"d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00\": rpc error: code = NotFound desc = could not find container \"d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00\": container with ID starting with d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00 not found: ID does not exist" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.089353 4795 scope.go:117] "RemoveContainer" containerID="b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 
21:39:17.089597 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab"} err="failed to get container status \"b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab\": rpc error: code = NotFound desc = could not find container \"b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab\": container with ID starting with b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab not found: ID does not exist" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.089624 4795 scope.go:117] "RemoveContainer" containerID="c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.089842 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29"} err="failed to get container status \"c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29\": rpc error: code = NotFound desc = could not find container \"c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29\": container with ID starting with c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29 not found: ID does not exist" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.089868 4795 scope.go:117] "RemoveContainer" containerID="05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.090070 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a"} err="failed to get container status \"05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a\": rpc error: code = NotFound desc = could not find container \"05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a\": container with ID starting with 
05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a not found: ID does not exist" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.090097 4795 scope.go:117] "RemoveContainer" containerID="c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.090349 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475"} err="failed to get container status \"c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475\": rpc error: code = NotFound desc = could not find container \"c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475\": container with ID starting with c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475 not found: ID does not exist" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.090368 4795 scope.go:117] "RemoveContainer" containerID="10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.090569 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9"} err="failed to get container status \"10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9\": rpc error: code = NotFound desc = could not find container \"10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9\": container with ID starting with 10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9 not found: ID does not exist" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.090586 4795 scope.go:117] "RemoveContainer" containerID="83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.090774 4795 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb"} err="failed to get container status \"83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb\": rpc error: code = NotFound desc = could not find container \"83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb\": container with ID starting with 83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb not found: ID does not exist" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.090791 4795 scope.go:117] "RemoveContainer" containerID="cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.090994 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c"} err="failed to get container status \"cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\": rpc error: code = NotFound desc = could not find container \"cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\": container with ID starting with cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c not found: ID does not exist" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.091012 4795 scope.go:117] "RemoveContainer" containerID="3a55f14e05508c6035437477613aca541a57e877c0402dec1f23f8bee280ec81" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.091499 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a55f14e05508c6035437477613aca541a57e877c0402dec1f23f8bee280ec81"} err="failed to get container status \"3a55f14e05508c6035437477613aca541a57e877c0402dec1f23f8bee280ec81\": rpc error: code = NotFound desc = could not find container \"3a55f14e05508c6035437477613aca541a57e877c0402dec1f23f8bee280ec81\": container with ID starting with 3a55f14e05508c6035437477613aca541a57e877c0402dec1f23f8bee280ec81 not found: ID does not 
exist" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.091584 4795 scope.go:117] "RemoveContainer" containerID="b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.091978 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d"} err="failed to get container status \"b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d\": rpc error: code = NotFound desc = could not find container \"b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d\": container with ID starting with b9402ff4c6ab0b8badfb1225bc300bb72fd6d970a104a3fe4487ad7c152caf8d not found: ID does not exist" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.092002 4795 scope.go:117] "RemoveContainer" containerID="d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.092307 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00"} err="failed to get container status \"d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00\": rpc error: code = NotFound desc = could not find container \"d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00\": container with ID starting with d0583a3c7afb6220ac3ad5793fa89fee022c30bdedb32b22e07c119838cdfe00 not found: ID does not exist" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.092332 4795 scope.go:117] "RemoveContainer" containerID="b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.092609 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab"} err="failed to get container status 
\"b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab\": rpc error: code = NotFound desc = could not find container \"b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab\": container with ID starting with b061fc9e94d9575b87c7ad323ee680c9ff09f18c8bc88678d05c866d18cb5fab not found: ID does not exist" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.092628 4795 scope.go:117] "RemoveContainer" containerID="c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.092980 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29"} err="failed to get container status \"c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29\": rpc error: code = NotFound desc = could not find container \"c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29\": container with ID starting with c60236a21e4b811dc1400ac6f7ab47eddef73b8368cbd1be6a33c01c5afbaf29 not found: ID does not exist" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.093002 4795 scope.go:117] "RemoveContainer" containerID="05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.093251 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a"} err="failed to get container status \"05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a\": rpc error: code = NotFound desc = could not find container \"05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a\": container with ID starting with 05c75520b389b387065dd5eb31e06dcfe9541fd9922193ce1722b24ac36e7c3a not found: ID does not exist" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.093275 4795 scope.go:117] "RemoveContainer" 
containerID="c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.093472 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475"} err="failed to get container status \"c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475\": rpc error: code = NotFound desc = could not find container \"c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475\": container with ID starting with c0e3d48fe104a2a5a0887fb9a69110b1ed9d60b7318f143ce46e78d43cd34475 not found: ID does not exist" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.093490 4795 scope.go:117] "RemoveContainer" containerID="10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.093686 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9"} err="failed to get container status \"10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9\": rpc error: code = NotFound desc = could not find container \"10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9\": container with ID starting with 10824606305ffe613fbc0ffa0d80620145339c7c1c48f9d6f1a488179c94e3d9 not found: ID does not exist" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.093705 4795 scope.go:117] "RemoveContainer" containerID="83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.093900 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb"} err="failed to get container status \"83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb\": rpc error: code = NotFound desc = could 
not find container \"83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb\": container with ID starting with 83708fa337e6318a1d59fd42901ac1f709e0e5fae8bae951e3638d30decd0ddb not found: ID does not exist" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.093923 4795 scope.go:117] "RemoveContainer" containerID="cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.094201 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c"} err="failed to get container status \"cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\": rpc error: code = NotFound desc = could not find container \"cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c\": container with ID starting with cdba0068125036886dc7c9f0783e2ec7e266b61952a375db86e2354cccab8b7c not found: ID does not exist" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.518455 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="adf5bd36-b46b-4a06-8291-cae9f3988330" path="/var/lib/kubelet/pods/adf5bd36-b46b-4a06-8291-cae9f3988330/volumes" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.814321 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5p6d9_e967392b-9bd8-4111-b1b9-96d503a19668/kube-multus/2.log" Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.821487 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" event={"ID":"24d72341-090a-4e01-bc3f-9e04becb3500","Type":"ContainerStarted","Data":"0c08565dc9b29a05ad8bf00031d18acac856a1b316d912f8504345e535fd4b40"} Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.821537 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" 
event={"ID":"24d72341-090a-4e01-bc3f-9e04becb3500","Type":"ContainerStarted","Data":"647b3f01d647f3ad581c625aa58960eb56e018d582fa5085d823401c1a126110"} Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.821560 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" event={"ID":"24d72341-090a-4e01-bc3f-9e04becb3500","Type":"ContainerStarted","Data":"6fc57746e9dcda6b57e64bceab831932e5c5f45fc6254378e692ece47caec394"} Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.821578 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" event={"ID":"24d72341-090a-4e01-bc3f-9e04becb3500","Type":"ContainerStarted","Data":"6a46a7718bb4b86f2153d5d76e61c96d00420c2e80be4c85737decd2345ccf16"} Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.821594 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" event={"ID":"24d72341-090a-4e01-bc3f-9e04becb3500","Type":"ContainerStarted","Data":"4ebfdca56cc19c6ad360bb7600c9b108012f9686b333722fa40779e9317f58d3"} Feb 19 21:39:17 crc kubenswrapper[4795]: I0219 21:39:17.821611 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" event={"ID":"24d72341-090a-4e01-bc3f-9e04becb3500","Type":"ContainerStarted","Data":"12622a6e9a1cb4219ad74d3abc47d2af2d83ea744ded01c72cf1ba7a281221be"} Feb 19 21:39:19 crc kubenswrapper[4795]: I0219 21:39:19.849102 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" event={"ID":"24d72341-090a-4e01-bc3f-9e04becb3500","Type":"ContainerStarted","Data":"ffb96d524a15ba82de35e22ce6eae23556b8100e8a34fea8ecf5781711675253"} Feb 19 21:39:21 crc kubenswrapper[4795]: I0219 21:39:21.246024 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-6tsln"] Feb 19 21:39:21 crc kubenswrapper[4795]: I0219 21:39:21.247202 4795 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-6tsln" Feb 19 21:39:21 crc kubenswrapper[4795]: I0219 21:39:21.250277 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Feb 19 21:39:21 crc kubenswrapper[4795]: I0219 21:39:21.250383 4795 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-4rmf9" Feb 19 21:39:21 crc kubenswrapper[4795]: I0219 21:39:21.252996 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Feb 19 21:39:21 crc kubenswrapper[4795]: I0219 21:39:21.253636 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Feb 19 21:39:21 crc kubenswrapper[4795]: I0219 21:39:21.304592 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/dc847694-39ea-4c3c-bb58-0f920e59ac62-node-mnt\") pod \"crc-storage-crc-6tsln\" (UID: \"dc847694-39ea-4c3c-bb58-0f920e59ac62\") " pod="crc-storage/crc-storage-crc-6tsln" Feb 19 21:39:21 crc kubenswrapper[4795]: I0219 21:39:21.304737 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67zfj\" (UniqueName: \"kubernetes.io/projected/dc847694-39ea-4c3c-bb58-0f920e59ac62-kube-api-access-67zfj\") pod \"crc-storage-crc-6tsln\" (UID: \"dc847694-39ea-4c3c-bb58-0f920e59ac62\") " pod="crc-storage/crc-storage-crc-6tsln" Feb 19 21:39:21 crc kubenswrapper[4795]: I0219 21:39:21.304785 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/dc847694-39ea-4c3c-bb58-0f920e59ac62-crc-storage\") pod \"crc-storage-crc-6tsln\" (UID: \"dc847694-39ea-4c3c-bb58-0f920e59ac62\") " pod="crc-storage/crc-storage-crc-6tsln" Feb 19 21:39:21 crc kubenswrapper[4795]: I0219 
21:39:21.406485 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67zfj\" (UniqueName: \"kubernetes.io/projected/dc847694-39ea-4c3c-bb58-0f920e59ac62-kube-api-access-67zfj\") pod \"crc-storage-crc-6tsln\" (UID: \"dc847694-39ea-4c3c-bb58-0f920e59ac62\") " pod="crc-storage/crc-storage-crc-6tsln" Feb 19 21:39:21 crc kubenswrapper[4795]: I0219 21:39:21.406559 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/dc847694-39ea-4c3c-bb58-0f920e59ac62-crc-storage\") pod \"crc-storage-crc-6tsln\" (UID: \"dc847694-39ea-4c3c-bb58-0f920e59ac62\") " pod="crc-storage/crc-storage-crc-6tsln" Feb 19 21:39:21 crc kubenswrapper[4795]: I0219 21:39:21.406640 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/dc847694-39ea-4c3c-bb58-0f920e59ac62-node-mnt\") pod \"crc-storage-crc-6tsln\" (UID: \"dc847694-39ea-4c3c-bb58-0f920e59ac62\") " pod="crc-storage/crc-storage-crc-6tsln" Feb 19 21:39:21 crc kubenswrapper[4795]: I0219 21:39:21.407081 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/dc847694-39ea-4c3c-bb58-0f920e59ac62-node-mnt\") pod \"crc-storage-crc-6tsln\" (UID: \"dc847694-39ea-4c3c-bb58-0f920e59ac62\") " pod="crc-storage/crc-storage-crc-6tsln" Feb 19 21:39:21 crc kubenswrapper[4795]: I0219 21:39:21.408153 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/dc847694-39ea-4c3c-bb58-0f920e59ac62-crc-storage\") pod \"crc-storage-crc-6tsln\" (UID: \"dc847694-39ea-4c3c-bb58-0f920e59ac62\") " pod="crc-storage/crc-storage-crc-6tsln" Feb 19 21:39:21 crc kubenswrapper[4795]: I0219 21:39:21.440720 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67zfj\" (UniqueName: 
\"kubernetes.io/projected/dc847694-39ea-4c3c-bb58-0f920e59ac62-kube-api-access-67zfj\") pod \"crc-storage-crc-6tsln\" (UID: \"dc847694-39ea-4c3c-bb58-0f920e59ac62\") " pod="crc-storage/crc-storage-crc-6tsln" Feb 19 21:39:21 crc kubenswrapper[4795]: I0219 21:39:21.572444 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-6tsln" Feb 19 21:39:21 crc kubenswrapper[4795]: E0219 21:39:21.614805 4795 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-6tsln_crc-storage_dc847694-39ea-4c3c-bb58-0f920e59ac62_0(2d9a848d2fad25e3057d21c616fc09057534181c5aaede0764f37d7275d818b4): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 19 21:39:21 crc kubenswrapper[4795]: E0219 21:39:21.615136 4795 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-6tsln_crc-storage_dc847694-39ea-4c3c-bb58-0f920e59ac62_0(2d9a848d2fad25e3057d21c616fc09057534181c5aaede0764f37d7275d818b4): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-6tsln" Feb 19 21:39:21 crc kubenswrapper[4795]: E0219 21:39:21.615200 4795 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-6tsln_crc-storage_dc847694-39ea-4c3c-bb58-0f920e59ac62_0(2d9a848d2fad25e3057d21c616fc09057534181c5aaede0764f37d7275d818b4): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="crc-storage/crc-storage-crc-6tsln" Feb 19 21:39:21 crc kubenswrapper[4795]: E0219 21:39:21.615272 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-6tsln_crc-storage(dc847694-39ea-4c3c-bb58-0f920e59ac62)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-6tsln_crc-storage(dc847694-39ea-4c3c-bb58-0f920e59ac62)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-6tsln_crc-storage_dc847694-39ea-4c3c-bb58-0f920e59ac62_0(2d9a848d2fad25e3057d21c616fc09057534181c5aaede0764f37d7275d818b4): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-6tsln" podUID="dc847694-39ea-4c3c-bb58-0f920e59ac62" Feb 19 21:39:21 crc kubenswrapper[4795]: I0219 21:39:21.870026 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" event={"ID":"24d72341-090a-4e01-bc3f-9e04becb3500","Type":"ContainerStarted","Data":"b21df4c128974f22a00aac324263ae0ce26cad96082a2cfc1a7978c4b8725776"} Feb 19 21:39:21 crc kubenswrapper[4795]: I0219 21:39:21.910735 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" podStartSLOduration=5.9107118629999995 podStartE2EDuration="5.910711863s" podCreationTimestamp="2026-02-19 21:39:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:39:21.906381725 +0000 UTC m=+673.098899589" watchObservedRunningTime="2026-02-19 21:39:21.910711863 +0000 UTC m=+673.103229767" Feb 19 21:39:22 crc kubenswrapper[4795]: I0219 21:39:22.536723 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-6tsln"] Feb 19 21:39:22 crc kubenswrapper[4795]: I0219 21:39:22.536855 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-6tsln" Feb 19 21:39:22 crc kubenswrapper[4795]: I0219 21:39:22.537259 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-6tsln" Feb 19 21:39:22 crc kubenswrapper[4795]: E0219 21:39:22.564207 4795 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-6tsln_crc-storage_dc847694-39ea-4c3c-bb58-0f920e59ac62_0(165272ba5a3d1fdfec0a57a6458884ba2762924d409002cf611a6478b4e752a3): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 19 21:39:22 crc kubenswrapper[4795]: E0219 21:39:22.564528 4795 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-6tsln_crc-storage_dc847694-39ea-4c3c-bb58-0f920e59ac62_0(165272ba5a3d1fdfec0a57a6458884ba2762924d409002cf611a6478b4e752a3): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-6tsln" Feb 19 21:39:22 crc kubenswrapper[4795]: E0219 21:39:22.564549 4795 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-6tsln_crc-storage_dc847694-39ea-4c3c-bb58-0f920e59ac62_0(165272ba5a3d1fdfec0a57a6458884ba2762924d409002cf611a6478b4e752a3): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="crc-storage/crc-storage-crc-6tsln" Feb 19 21:39:22 crc kubenswrapper[4795]: E0219 21:39:22.564594 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-6tsln_crc-storage(dc847694-39ea-4c3c-bb58-0f920e59ac62)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-6tsln_crc-storage(dc847694-39ea-4c3c-bb58-0f920e59ac62)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-6tsln_crc-storage_dc847694-39ea-4c3c-bb58-0f920e59ac62_0(165272ba5a3d1fdfec0a57a6458884ba2762924d409002cf611a6478b4e752a3): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-6tsln" podUID="dc847694-39ea-4c3c-bb58-0f920e59ac62" Feb 19 21:39:22 crc kubenswrapper[4795]: I0219 21:39:22.874637 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:22 crc kubenswrapper[4795]: I0219 21:39:22.874688 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:22 crc kubenswrapper[4795]: I0219 21:39:22.874702 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:22 crc kubenswrapper[4795]: I0219 21:39:22.898000 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:22 crc kubenswrapper[4795]: I0219 21:39:22.899794 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:29 crc kubenswrapper[4795]: I0219 21:39:29.514479 4795 scope.go:117] "RemoveContainer" containerID="2822f57eb80a87e2600f1bdd3a3189e4bded6fbc0e489210ee08ea133cbf2aa9" Feb 19 21:39:29 crc kubenswrapper[4795]: E0219 21:39:29.515039 4795 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-5p6d9_openshift-multus(e967392b-9bd8-4111-b1b9-96d503a19668)\"" pod="openshift-multus/multus-5p6d9" podUID="e967392b-9bd8-4111-b1b9-96d503a19668" Feb 19 21:39:35 crc kubenswrapper[4795]: I0219 21:39:35.511424 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-6tsln" Feb 19 21:39:35 crc kubenswrapper[4795]: I0219 21:39:35.512464 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-6tsln" Feb 19 21:39:35 crc kubenswrapper[4795]: E0219 21:39:35.564839 4795 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-6tsln_crc-storage_dc847694-39ea-4c3c-bb58-0f920e59ac62_0(2cac28f05bb02e95f07312d123438159c25eddb8af19222fb8034a1595da2469): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 19 21:39:35 crc kubenswrapper[4795]: E0219 21:39:35.564948 4795 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-6tsln_crc-storage_dc847694-39ea-4c3c-bb58-0f920e59ac62_0(2cac28f05bb02e95f07312d123438159c25eddb8af19222fb8034a1595da2469): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="crc-storage/crc-storage-crc-6tsln" Feb 19 21:39:35 crc kubenswrapper[4795]: E0219 21:39:35.564997 4795 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-6tsln_crc-storage_dc847694-39ea-4c3c-bb58-0f920e59ac62_0(2cac28f05bb02e95f07312d123438159c25eddb8af19222fb8034a1595da2469): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-6tsln" Feb 19 21:39:35 crc kubenswrapper[4795]: E0219 21:39:35.565091 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-6tsln_crc-storage(dc847694-39ea-4c3c-bb58-0f920e59ac62)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-6tsln_crc-storage(dc847694-39ea-4c3c-bb58-0f920e59ac62)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-6tsln_crc-storage_dc847694-39ea-4c3c-bb58-0f920e59ac62_0(2cac28f05bb02e95f07312d123438159c25eddb8af19222fb8034a1595da2469): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="crc-storage/crc-storage-crc-6tsln" podUID="dc847694-39ea-4c3c-bb58-0f920e59ac62" Feb 19 21:39:44 crc kubenswrapper[4795]: I0219 21:39:44.511671 4795 scope.go:117] "RemoveContainer" containerID="2822f57eb80a87e2600f1bdd3a3189e4bded6fbc0e489210ee08ea133cbf2aa9" Feb 19 21:39:45 crc kubenswrapper[4795]: I0219 21:39:45.010779 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5p6d9_e967392b-9bd8-4111-b1b9-96d503a19668/kube-multus/2.log" Feb 19 21:39:45 crc kubenswrapper[4795]: I0219 21:39:45.011151 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5p6d9" event={"ID":"e967392b-9bd8-4111-b1b9-96d503a19668","Type":"ContainerStarted","Data":"4a1caddc7b9e55db86fe435872d1d75dbec41873652387af8ca72cbba985cceb"} Feb 19 21:39:46 crc kubenswrapper[4795]: I0219 21:39:46.510970 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-6tsln" Feb 19 21:39:46 crc kubenswrapper[4795]: I0219 21:39:46.511825 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-6tsln" Feb 19 21:39:46 crc kubenswrapper[4795]: I0219 21:39:46.580050 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pf8fb" Feb 19 21:39:46 crc kubenswrapper[4795]: I0219 21:39:46.806520 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-6tsln"] Feb 19 21:39:46 crc kubenswrapper[4795]: W0219 21:39:46.818821 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc847694_39ea_4c3c_bb58_0f920e59ac62.slice/crio-dc4caf179b98ffbf04243aa4ec489831d188da92f55af147cff3804c3f377731 WatchSource:0}: Error finding container dc4caf179b98ffbf04243aa4ec489831d188da92f55af147cff3804c3f377731: Status 404 returned error can't find the container with id dc4caf179b98ffbf04243aa4ec489831d188da92f55af147cff3804c3f377731 Feb 19 21:39:46 crc kubenswrapper[4795]: I0219 21:39:46.821617 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 21:39:47 crc kubenswrapper[4795]: I0219 21:39:47.021996 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-6tsln" event={"ID":"dc847694-39ea-4c3c-bb58-0f920e59ac62","Type":"ContainerStarted","Data":"dc4caf179b98ffbf04243aa4ec489831d188da92f55af147cff3804c3f377731"} Feb 19 21:39:49 crc kubenswrapper[4795]: I0219 21:39:49.034224 4795 generic.go:334] "Generic (PLEG): container finished" podID="dc847694-39ea-4c3c-bb58-0f920e59ac62" containerID="1e723054d3a11aa58d4ff2996b03eae32a1cee8cb004d6442cd324f9c14218e2" exitCode=0 Feb 19 21:39:49 crc kubenswrapper[4795]: I0219 21:39:49.034319 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-6tsln" event={"ID":"dc847694-39ea-4c3c-bb58-0f920e59ac62","Type":"ContainerDied","Data":"1e723054d3a11aa58d4ff2996b03eae32a1cee8cb004d6442cd324f9c14218e2"} Feb 19 
21:39:50 crc kubenswrapper[4795]: I0219 21:39:50.970425 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-6tsln" Feb 19 21:39:51 crc kubenswrapper[4795]: I0219 21:39:51.028698 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/dc847694-39ea-4c3c-bb58-0f920e59ac62-node-mnt\") pod \"dc847694-39ea-4c3c-bb58-0f920e59ac62\" (UID: \"dc847694-39ea-4c3c-bb58-0f920e59ac62\") " Feb 19 21:39:51 crc kubenswrapper[4795]: I0219 21:39:51.028770 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67zfj\" (UniqueName: \"kubernetes.io/projected/dc847694-39ea-4c3c-bb58-0f920e59ac62-kube-api-access-67zfj\") pod \"dc847694-39ea-4c3c-bb58-0f920e59ac62\" (UID: \"dc847694-39ea-4c3c-bb58-0f920e59ac62\") " Feb 19 21:39:51 crc kubenswrapper[4795]: I0219 21:39:51.028898 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/dc847694-39ea-4c3c-bb58-0f920e59ac62-crc-storage\") pod \"dc847694-39ea-4c3c-bb58-0f920e59ac62\" (UID: \"dc847694-39ea-4c3c-bb58-0f920e59ac62\") " Feb 19 21:39:51 crc kubenswrapper[4795]: I0219 21:39:51.028924 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dc847694-39ea-4c3c-bb58-0f920e59ac62-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "dc847694-39ea-4c3c-bb58-0f920e59ac62" (UID: "dc847694-39ea-4c3c-bb58-0f920e59ac62"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:39:51 crc kubenswrapper[4795]: I0219 21:39:51.029282 4795 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/dc847694-39ea-4c3c-bb58-0f920e59ac62-node-mnt\") on node \"crc\" DevicePath \"\"" Feb 19 21:39:51 crc kubenswrapper[4795]: I0219 21:39:51.033899 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc847694-39ea-4c3c-bb58-0f920e59ac62-kube-api-access-67zfj" (OuterVolumeSpecName: "kube-api-access-67zfj") pod "dc847694-39ea-4c3c-bb58-0f920e59ac62" (UID: "dc847694-39ea-4c3c-bb58-0f920e59ac62"). InnerVolumeSpecName "kube-api-access-67zfj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:39:51 crc kubenswrapper[4795]: I0219 21:39:51.042256 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc847694-39ea-4c3c-bb58-0f920e59ac62-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "dc847694-39ea-4c3c-bb58-0f920e59ac62" (UID: "dc847694-39ea-4c3c-bb58-0f920e59ac62"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:39:51 crc kubenswrapper[4795]: I0219 21:39:51.046616 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-6tsln" event={"ID":"dc847694-39ea-4c3c-bb58-0f920e59ac62","Type":"ContainerDied","Data":"dc4caf179b98ffbf04243aa4ec489831d188da92f55af147cff3804c3f377731"} Feb 19 21:39:51 crc kubenswrapper[4795]: I0219 21:39:51.046650 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc4caf179b98ffbf04243aa4ec489831d188da92f55af147cff3804c3f377731" Feb 19 21:39:51 crc kubenswrapper[4795]: I0219 21:39:51.046699 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-6tsln" Feb 19 21:39:51 crc kubenswrapper[4795]: I0219 21:39:51.129683 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67zfj\" (UniqueName: \"kubernetes.io/projected/dc847694-39ea-4c3c-bb58-0f920e59ac62-kube-api-access-67zfj\") on node \"crc\" DevicePath \"\"" Feb 19 21:39:51 crc kubenswrapper[4795]: I0219 21:39:51.129712 4795 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/dc847694-39ea-4c3c-bb58-0f920e59ac62-crc-storage\") on node \"crc\" DevicePath \"\"" Feb 19 21:39:58 crc kubenswrapper[4795]: I0219 21:39:58.427785 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 21:39:58 crc kubenswrapper[4795]: I0219 21:39:58.428358 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 21:39:58 crc kubenswrapper[4795]: I0219 21:39:58.861729 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab2g72"] Feb 19 21:39:58 crc kubenswrapper[4795]: E0219 21:39:58.861954 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc847694-39ea-4c3c-bb58-0f920e59ac62" containerName="storage" Feb 19 21:39:58 crc kubenswrapper[4795]: I0219 21:39:58.861972 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc847694-39ea-4c3c-bb58-0f920e59ac62" containerName="storage" Feb 19 21:39:58 crc kubenswrapper[4795]: I0219 
21:39:58.862090 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc847694-39ea-4c3c-bb58-0f920e59ac62" containerName="storage" Feb 19 21:39:58 crc kubenswrapper[4795]: I0219 21:39:58.862754 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab2g72" Feb 19 21:39:58 crc kubenswrapper[4795]: I0219 21:39:58.864831 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 19 21:39:58 crc kubenswrapper[4795]: I0219 21:39:58.869826 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab2g72"] Feb 19 21:39:58 crc kubenswrapper[4795]: I0219 21:39:58.923481 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab2g72\" (UID: \"bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab2g72" Feb 19 21:39:58 crc kubenswrapper[4795]: I0219 21:39:58.923528 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9mhs\" (UniqueName: \"kubernetes.io/projected/bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a-kube-api-access-v9mhs\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab2g72\" (UID: \"bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab2g72" Feb 19 21:39:58 crc kubenswrapper[4795]: I0219 21:39:58.923551 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab2g72\" (UID: \"bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab2g72" Feb 19 21:39:59 crc kubenswrapper[4795]: I0219 21:39:59.024073 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab2g72\" (UID: \"bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab2g72" Feb 19 21:39:59 crc kubenswrapper[4795]: I0219 21:39:59.024129 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9mhs\" (UniqueName: \"kubernetes.io/projected/bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a-kube-api-access-v9mhs\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab2g72\" (UID: \"bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab2g72" Feb 19 21:39:59 crc kubenswrapper[4795]: I0219 21:39:59.024160 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab2g72\" (UID: \"bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab2g72" Feb 19 21:39:59 crc kubenswrapper[4795]: I0219 21:39:59.024671 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab2g72\" (UID: 
\"bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab2g72" Feb 19 21:39:59 crc kubenswrapper[4795]: I0219 21:39:59.024696 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab2g72\" (UID: \"bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab2g72" Feb 19 21:39:59 crc kubenswrapper[4795]: I0219 21:39:59.041496 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9mhs\" (UniqueName: \"kubernetes.io/projected/bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a-kube-api-access-v9mhs\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab2g72\" (UID: \"bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab2g72" Feb 19 21:39:59 crc kubenswrapper[4795]: I0219 21:39:59.178818 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab2g72" Feb 19 21:39:59 crc kubenswrapper[4795]: I0219 21:39:59.335270 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab2g72"] Feb 19 21:40:00 crc kubenswrapper[4795]: I0219 21:40:00.095722 4795 generic.go:334] "Generic (PLEG): container finished" podID="bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a" containerID="8f8f73f759d7873ea56e90568186bc9854f950930e7d81d6f7da9012a8ba5393" exitCode=0 Feb 19 21:40:00 crc kubenswrapper[4795]: I0219 21:40:00.095770 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab2g72" event={"ID":"bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a","Type":"ContainerDied","Data":"8f8f73f759d7873ea56e90568186bc9854f950930e7d81d6f7da9012a8ba5393"} Feb 19 21:40:00 crc kubenswrapper[4795]: I0219 21:40:00.095828 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab2g72" event={"ID":"bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a","Type":"ContainerStarted","Data":"164513c4517cddc9aa0b1b5c63c226485f295baefd630938b695cfd17f59c08a"} Feb 19 21:40:02 crc kubenswrapper[4795]: I0219 21:40:02.105950 4795 generic.go:334] "Generic (PLEG): container finished" podID="bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a" containerID="cd7a2b9ac2ff5a35837c99c829afcd08ccf0c370a64c862942b5af3fe6d51acd" exitCode=0 Feb 19 21:40:02 crc kubenswrapper[4795]: I0219 21:40:02.106047 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab2g72" event={"ID":"bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a","Type":"ContainerDied","Data":"cd7a2b9ac2ff5a35837c99c829afcd08ccf0c370a64c862942b5af3fe6d51acd"} Feb 19 21:40:03 crc kubenswrapper[4795]: I0219 21:40:03.118730 4795 
generic.go:334] "Generic (PLEG): container finished" podID="bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a" containerID="593a16368bd401d1c61c0f86210f64969596b90ed4236c8a754ef8fe915a191a" exitCode=0 Feb 19 21:40:03 crc kubenswrapper[4795]: I0219 21:40:03.118813 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab2g72" event={"ID":"bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a","Type":"ContainerDied","Data":"593a16368bd401d1c61c0f86210f64969596b90ed4236c8a754ef8fe915a191a"} Feb 19 21:40:04 crc kubenswrapper[4795]: I0219 21:40:04.318780 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab2g72" Feb 19 21:40:04 crc kubenswrapper[4795]: I0219 21:40:04.388738 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9mhs\" (UniqueName: \"kubernetes.io/projected/bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a-kube-api-access-v9mhs\") pod \"bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a\" (UID: \"bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a\") " Feb 19 21:40:04 crc kubenswrapper[4795]: I0219 21:40:04.388855 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a-util\") pod \"bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a\" (UID: \"bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a\") " Feb 19 21:40:04 crc kubenswrapper[4795]: I0219 21:40:04.388923 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a-bundle\") pod \"bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a\" (UID: \"bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a\") " Feb 19 21:40:04 crc kubenswrapper[4795]: I0219 21:40:04.389676 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a-bundle" (OuterVolumeSpecName: "bundle") pod "bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a" (UID: "bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:40:04 crc kubenswrapper[4795]: I0219 21:40:04.394589 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a-kube-api-access-v9mhs" (OuterVolumeSpecName: "kube-api-access-v9mhs") pod "bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a" (UID: "bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a"). InnerVolumeSpecName "kube-api-access-v9mhs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:40:04 crc kubenswrapper[4795]: I0219 21:40:04.408185 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a-util" (OuterVolumeSpecName: "util") pod "bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a" (UID: "bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:40:04 crc kubenswrapper[4795]: I0219 21:40:04.489874 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9mhs\" (UniqueName: \"kubernetes.io/projected/bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a-kube-api-access-v9mhs\") on node \"crc\" DevicePath \"\"" Feb 19 21:40:04 crc kubenswrapper[4795]: I0219 21:40:04.489904 4795 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a-util\") on node \"crc\" DevicePath \"\"" Feb 19 21:40:04 crc kubenswrapper[4795]: I0219 21:40:04.489917 4795 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:40:05 crc kubenswrapper[4795]: I0219 21:40:05.131280 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab2g72" event={"ID":"bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a","Type":"ContainerDied","Data":"164513c4517cddc9aa0b1b5c63c226485f295baefd630938b695cfd17f59c08a"} Feb 19 21:40:05 crc kubenswrapper[4795]: I0219 21:40:05.131328 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab2g72" Feb 19 21:40:05 crc kubenswrapper[4795]: I0219 21:40:05.131322 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="164513c4517cddc9aa0b1b5c63c226485f295baefd630938b695cfd17f59c08a" Feb 19 21:40:07 crc kubenswrapper[4795]: I0219 21:40:07.586708 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-rzkrx"] Feb 19 21:40:07 crc kubenswrapper[4795]: E0219 21:40:07.587178 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a" containerName="extract" Feb 19 21:40:07 crc kubenswrapper[4795]: I0219 21:40:07.587190 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a" containerName="extract" Feb 19 21:40:07 crc kubenswrapper[4795]: E0219 21:40:07.587202 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a" containerName="util" Feb 19 21:40:07 crc kubenswrapper[4795]: I0219 21:40:07.587208 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a" containerName="util" Feb 19 21:40:07 crc kubenswrapper[4795]: E0219 21:40:07.587217 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a" containerName="pull" Feb 19 21:40:07 crc kubenswrapper[4795]: I0219 21:40:07.587223 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a" containerName="pull" Feb 19 21:40:07 crc kubenswrapper[4795]: I0219 21:40:07.587318 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a" containerName="extract" Feb 19 21:40:07 crc kubenswrapper[4795]: I0219 21:40:07.587709 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-694c9596b7-rzkrx" Feb 19 21:40:07 crc kubenswrapper[4795]: I0219 21:40:07.591493 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Feb 19 21:40:07 crc kubenswrapper[4795]: I0219 21:40:07.591787 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Feb 19 21:40:07 crc kubenswrapper[4795]: I0219 21:40:07.592017 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-47nj5" Feb 19 21:40:07 crc kubenswrapper[4795]: I0219 21:40:07.601872 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-rzkrx"] Feb 19 21:40:07 crc kubenswrapper[4795]: I0219 21:40:07.632755 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqgzl\" (UniqueName: \"kubernetes.io/projected/6b614198-6804-46a3-bb1e-d8495c0d53d6-kube-api-access-mqgzl\") pod \"nmstate-operator-694c9596b7-rzkrx\" (UID: \"6b614198-6804-46a3-bb1e-d8495c0d53d6\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-rzkrx" Feb 19 21:40:07 crc kubenswrapper[4795]: I0219 21:40:07.733927 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqgzl\" (UniqueName: \"kubernetes.io/projected/6b614198-6804-46a3-bb1e-d8495c0d53d6-kube-api-access-mqgzl\") pod \"nmstate-operator-694c9596b7-rzkrx\" (UID: \"6b614198-6804-46a3-bb1e-d8495c0d53d6\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-rzkrx" Feb 19 21:40:07 crc kubenswrapper[4795]: I0219 21:40:07.751126 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqgzl\" (UniqueName: \"kubernetes.io/projected/6b614198-6804-46a3-bb1e-d8495c0d53d6-kube-api-access-mqgzl\") pod \"nmstate-operator-694c9596b7-rzkrx\" (UID: 
\"6b614198-6804-46a3-bb1e-d8495c0d53d6\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-rzkrx" Feb 19 21:40:07 crc kubenswrapper[4795]: I0219 21:40:07.901836 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-694c9596b7-rzkrx" Feb 19 21:40:08 crc kubenswrapper[4795]: I0219 21:40:08.302316 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-rzkrx"] Feb 19 21:40:08 crc kubenswrapper[4795]: W0219 21:40:08.307155 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b614198_6804_46a3_bb1e_d8495c0d53d6.slice/crio-54395a4aa981739ddd50eb4d29527f4dc4b645d1e35236d8afd23037dbc3e977 WatchSource:0}: Error finding container 54395a4aa981739ddd50eb4d29527f4dc4b645d1e35236d8afd23037dbc3e977: Status 404 returned error can't find the container with id 54395a4aa981739ddd50eb4d29527f4dc4b645d1e35236d8afd23037dbc3e977 Feb 19 21:40:09 crc kubenswrapper[4795]: I0219 21:40:09.150023 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-694c9596b7-rzkrx" event={"ID":"6b614198-6804-46a3-bb1e-d8495c0d53d6","Type":"ContainerStarted","Data":"54395a4aa981739ddd50eb4d29527f4dc4b645d1e35236d8afd23037dbc3e977"} Feb 19 21:40:11 crc kubenswrapper[4795]: I0219 21:40:11.162365 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-694c9596b7-rzkrx" event={"ID":"6b614198-6804-46a3-bb1e-d8495c0d53d6","Type":"ContainerStarted","Data":"f7595c627af0267910b3cc1c0692dd57c951900bf3d553d3838e63683c58ab68"} Feb 19 21:40:11 crc kubenswrapper[4795]: I0219 21:40:11.179635 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-694c9596b7-rzkrx" podStartSLOduration=2.330833938 podStartE2EDuration="4.179621043s" podCreationTimestamp="2026-02-19 21:40:07 +0000 UTC" 
firstStartedPulling="2026-02-19 21:40:08.309504595 +0000 UTC m=+719.502022459" lastFinishedPulling="2026-02-19 21:40:10.1582917 +0000 UTC m=+721.350809564" observedRunningTime="2026-02-19 21:40:11.17807398 +0000 UTC m=+722.370591854" watchObservedRunningTime="2026-02-19 21:40:11.179621043 +0000 UTC m=+722.372138907" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.091949 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-kgnfd"] Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.093063 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58c85c668d-kgnfd" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.103525 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-dpglh" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.109946 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-kgnfd"] Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.144968 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-nk7zb"] Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.145805 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-nk7zb" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.147305 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.149508 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-nk7zb"] Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.155047 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-zqk47"] Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.155694 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-zqk47" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.184132 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/06d09723-c7bd-422c-b447-70dee244cc05-ovs-socket\") pod \"nmstate-handler-zqk47\" (UID: \"06d09723-c7bd-422c-b447-70dee244cc05\") " pod="openshift-nmstate/nmstate-handler-zqk47" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.184171 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tfl8\" (UniqueName: \"kubernetes.io/projected/1dfc7b5c-9302-4774-a6c8-e76ff4d60385-kube-api-access-4tfl8\") pod \"nmstate-metrics-58c85c668d-kgnfd\" (UID: \"1dfc7b5c-9302-4774-a6c8-e76ff4d60385\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-kgnfd" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.184229 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/06d09723-c7bd-422c-b447-70dee244cc05-nmstate-lock\") pod \"nmstate-handler-zqk47\" (UID: \"06d09723-c7bd-422c-b447-70dee244cc05\") " 
pod="openshift-nmstate/nmstate-handler-zqk47" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.184248 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/6c89273b-007f-44e6-88da-f48de3a5f03b-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-nk7zb\" (UID: \"6c89273b-007f-44e6-88da-f48de3a5f03b\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-nk7zb" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.184287 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxs5c\" (UniqueName: \"kubernetes.io/projected/6c89273b-007f-44e6-88da-f48de3a5f03b-kube-api-access-lxs5c\") pod \"nmstate-webhook-866bcb46dc-nk7zb\" (UID: \"6c89273b-007f-44e6-88da-f48de3a5f03b\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-nk7zb" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.184383 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/06d09723-c7bd-422c-b447-70dee244cc05-dbus-socket\") pod \"nmstate-handler-zqk47\" (UID: \"06d09723-c7bd-422c-b447-70dee244cc05\") " pod="openshift-nmstate/nmstate-handler-zqk47" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.184487 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsgtd\" (UniqueName: \"kubernetes.io/projected/06d09723-c7bd-422c-b447-70dee244cc05-kube-api-access-hsgtd\") pod \"nmstate-handler-zqk47\" (UID: \"06d09723-c7bd-422c-b447-70dee244cc05\") " pod="openshift-nmstate/nmstate-handler-zqk47" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.253920 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-gp2td"] Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.254698 4795 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-gp2td" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.257178 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.262389 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.264159 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-6rsrp" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.285177 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxs5c\" (UniqueName: \"kubernetes.io/projected/6c89273b-007f-44e6-88da-f48de3a5f03b-kube-api-access-lxs5c\") pod \"nmstate-webhook-866bcb46dc-nk7zb\" (UID: \"6c89273b-007f-44e6-88da-f48de3a5f03b\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-nk7zb" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.285246 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/06d09723-c7bd-422c-b447-70dee244cc05-dbus-socket\") pod \"nmstate-handler-zqk47\" (UID: \"06d09723-c7bd-422c-b447-70dee244cc05\") " pod="openshift-nmstate/nmstate-handler-zqk47" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.285264 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsgtd\" (UniqueName: \"kubernetes.io/projected/06d09723-c7bd-422c-b447-70dee244cc05-kube-api-access-hsgtd\") pod \"nmstate-handler-zqk47\" (UID: \"06d09723-c7bd-422c-b447-70dee244cc05\") " pod="openshift-nmstate/nmstate-handler-zqk47" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.285301 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/f991b7a1-8af0-4ba7-aa0a-a5e08bdc11d8-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-gp2td\" (UID: \"f991b7a1-8af0-4ba7-aa0a-a5e08bdc11d8\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-gp2td" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.285327 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwgjv\" (UniqueName: \"kubernetes.io/projected/f991b7a1-8af0-4ba7-aa0a-a5e08bdc11d8-kube-api-access-vwgjv\") pod \"nmstate-console-plugin-5c78fc5d65-gp2td\" (UID: \"f991b7a1-8af0-4ba7-aa0a-a5e08bdc11d8\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-gp2td" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.285345 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/f991b7a1-8af0-4ba7-aa0a-a5e08bdc11d8-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-gp2td\" (UID: \"f991b7a1-8af0-4ba7-aa0a-a5e08bdc11d8\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-gp2td" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.285369 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/06d09723-c7bd-422c-b447-70dee244cc05-ovs-socket\") pod \"nmstate-handler-zqk47\" (UID: \"06d09723-c7bd-422c-b447-70dee244cc05\") " pod="openshift-nmstate/nmstate-handler-zqk47" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.285389 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tfl8\" (UniqueName: \"kubernetes.io/projected/1dfc7b5c-9302-4774-a6c8-e76ff4d60385-kube-api-access-4tfl8\") pod \"nmstate-metrics-58c85c668d-kgnfd\" (UID: \"1dfc7b5c-9302-4774-a6c8-e76ff4d60385\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-kgnfd" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.285431 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/06d09723-c7bd-422c-b447-70dee244cc05-nmstate-lock\") pod \"nmstate-handler-zqk47\" (UID: \"06d09723-c7bd-422c-b447-70dee244cc05\") " pod="openshift-nmstate/nmstate-handler-zqk47" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.285449 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/6c89273b-007f-44e6-88da-f48de3a5f03b-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-nk7zb\" (UID: \"6c89273b-007f-44e6-88da-f48de3a5f03b\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-nk7zb" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.285676 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/06d09723-c7bd-422c-b447-70dee244cc05-ovs-socket\") pod \"nmstate-handler-zqk47\" (UID: \"06d09723-c7bd-422c-b447-70dee244cc05\") " pod="openshift-nmstate/nmstate-handler-zqk47" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.285763 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/06d09723-c7bd-422c-b447-70dee244cc05-dbus-socket\") pod \"nmstate-handler-zqk47\" (UID: \"06d09723-c7bd-422c-b447-70dee244cc05\") " pod="openshift-nmstate/nmstate-handler-zqk47" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.285842 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/06d09723-c7bd-422c-b447-70dee244cc05-nmstate-lock\") pod \"nmstate-handler-zqk47\" (UID: \"06d09723-c7bd-422c-b447-70dee244cc05\") " pod="openshift-nmstate/nmstate-handler-zqk47" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.293601 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: 
\"kubernetes.io/secret/6c89273b-007f-44e6-88da-f48de3a5f03b-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-nk7zb\" (UID: \"6c89273b-007f-44e6-88da-f48de3a5f03b\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-nk7zb" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.300911 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-gp2td"] Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.304721 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsgtd\" (UniqueName: \"kubernetes.io/projected/06d09723-c7bd-422c-b447-70dee244cc05-kube-api-access-hsgtd\") pod \"nmstate-handler-zqk47\" (UID: \"06d09723-c7bd-422c-b447-70dee244cc05\") " pod="openshift-nmstate/nmstate-handler-zqk47" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.305413 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxs5c\" (UniqueName: \"kubernetes.io/projected/6c89273b-007f-44e6-88da-f48de3a5f03b-kube-api-access-lxs5c\") pod \"nmstate-webhook-866bcb46dc-nk7zb\" (UID: \"6c89273b-007f-44e6-88da-f48de3a5f03b\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-nk7zb" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.315997 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tfl8\" (UniqueName: \"kubernetes.io/projected/1dfc7b5c-9302-4774-a6c8-e76ff4d60385-kube-api-access-4tfl8\") pod \"nmstate-metrics-58c85c668d-kgnfd\" (UID: \"1dfc7b5c-9302-4774-a6c8-e76ff4d60385\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-kgnfd" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.386147 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwgjv\" (UniqueName: \"kubernetes.io/projected/f991b7a1-8af0-4ba7-aa0a-a5e08bdc11d8-kube-api-access-vwgjv\") pod \"nmstate-console-plugin-5c78fc5d65-gp2td\" (UID: \"f991b7a1-8af0-4ba7-aa0a-a5e08bdc11d8\") " 
pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-gp2td" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.386205 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/f991b7a1-8af0-4ba7-aa0a-a5e08bdc11d8-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-gp2td\" (UID: \"f991b7a1-8af0-4ba7-aa0a-a5e08bdc11d8\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-gp2td" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.386272 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/f991b7a1-8af0-4ba7-aa0a-a5e08bdc11d8-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-gp2td\" (UID: \"f991b7a1-8af0-4ba7-aa0a-a5e08bdc11d8\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-gp2td" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.387035 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/f991b7a1-8af0-4ba7-aa0a-a5e08bdc11d8-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-gp2td\" (UID: \"f991b7a1-8af0-4ba7-aa0a-a5e08bdc11d8\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-gp2td" Feb 19 21:40:12 crc kubenswrapper[4795]: E0219 21:40:12.387109 4795 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Feb 19 21:40:12 crc kubenswrapper[4795]: E0219 21:40:12.387153 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f991b7a1-8af0-4ba7-aa0a-a5e08bdc11d8-plugin-serving-cert podName:f991b7a1-8af0-4ba7-aa0a-a5e08bdc11d8 nodeName:}" failed. No retries permitted until 2026-02-19 21:40:12.887138833 +0000 UTC m=+724.079656697 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/f991b7a1-8af0-4ba7-aa0a-a5e08bdc11d8-plugin-serving-cert") pod "nmstate-console-plugin-5c78fc5d65-gp2td" (UID: "f991b7a1-8af0-4ba7-aa0a-a5e08bdc11d8") : secret "plugin-serving-cert" not found Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.402343 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwgjv\" (UniqueName: \"kubernetes.io/projected/f991b7a1-8af0-4ba7-aa0a-a5e08bdc11d8-kube-api-access-vwgjv\") pod \"nmstate-console-plugin-5c78fc5d65-gp2td\" (UID: \"f991b7a1-8af0-4ba7-aa0a-a5e08bdc11d8\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-gp2td" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.446638 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58c85c668d-kgnfd" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.453730 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-6699bdbc6b-7fbd7"] Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.454341 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6699bdbc6b-7fbd7" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.483382 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-nk7zb" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.484128 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6699bdbc6b-7fbd7"] Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.496514 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-zqk47" Feb 19 21:40:12 crc kubenswrapper[4795]: W0219 21:40:12.558115 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06d09723_c7bd_422c_b447_70dee244cc05.slice/crio-72c964e7fb5502abf39fea516705b245fe7b7c2087a640f5fdde882b2b1bb29b WatchSource:0}: Error finding container 72c964e7fb5502abf39fea516705b245fe7b7c2087a640f5fdde882b2b1bb29b: Status 404 returned error can't find the container with id 72c964e7fb5502abf39fea516705b245fe7b7c2087a640f5fdde882b2b1bb29b Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.589663 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/55f88401-be34-4486-8e8c-4c3be51ab251-service-ca\") pod \"console-6699bdbc6b-7fbd7\" (UID: \"55f88401-be34-4486-8e8c-4c3be51ab251\") " pod="openshift-console/console-6699bdbc6b-7fbd7" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.589700 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/55f88401-be34-4486-8e8c-4c3be51ab251-console-serving-cert\") pod \"console-6699bdbc6b-7fbd7\" (UID: \"55f88401-be34-4486-8e8c-4c3be51ab251\") " pod="openshift-console/console-6699bdbc6b-7fbd7" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.589737 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/55f88401-be34-4486-8e8c-4c3be51ab251-trusted-ca-bundle\") pod \"console-6699bdbc6b-7fbd7\" (UID: \"55f88401-be34-4486-8e8c-4c3be51ab251\") " pod="openshift-console/console-6699bdbc6b-7fbd7" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.589762 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/55f88401-be34-4486-8e8c-4c3be51ab251-console-config\") pod \"console-6699bdbc6b-7fbd7\" (UID: \"55f88401-be34-4486-8e8c-4c3be51ab251\") " pod="openshift-console/console-6699bdbc6b-7fbd7" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.589784 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5l2q\" (UniqueName: \"kubernetes.io/projected/55f88401-be34-4486-8e8c-4c3be51ab251-kube-api-access-n5l2q\") pod \"console-6699bdbc6b-7fbd7\" (UID: \"55f88401-be34-4486-8e8c-4c3be51ab251\") " pod="openshift-console/console-6699bdbc6b-7fbd7" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.589807 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/55f88401-be34-4486-8e8c-4c3be51ab251-console-oauth-config\") pod \"console-6699bdbc6b-7fbd7\" (UID: \"55f88401-be34-4486-8e8c-4c3be51ab251\") " pod="openshift-console/console-6699bdbc6b-7fbd7" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.589822 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/55f88401-be34-4486-8e8c-4c3be51ab251-oauth-serving-cert\") pod \"console-6699bdbc6b-7fbd7\" (UID: \"55f88401-be34-4486-8e8c-4c3be51ab251\") " pod="openshift-console/console-6699bdbc6b-7fbd7" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.665235 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-kgnfd"] Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.690900 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/55f88401-be34-4486-8e8c-4c3be51ab251-service-ca\") pod \"console-6699bdbc6b-7fbd7\" (UID: 
\"55f88401-be34-4486-8e8c-4c3be51ab251\") " pod="openshift-console/console-6699bdbc6b-7fbd7" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.691270 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/55f88401-be34-4486-8e8c-4c3be51ab251-console-serving-cert\") pod \"console-6699bdbc6b-7fbd7\" (UID: \"55f88401-be34-4486-8e8c-4c3be51ab251\") " pod="openshift-console/console-6699bdbc6b-7fbd7" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.691305 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/55f88401-be34-4486-8e8c-4c3be51ab251-trusted-ca-bundle\") pod \"console-6699bdbc6b-7fbd7\" (UID: \"55f88401-be34-4486-8e8c-4c3be51ab251\") " pod="openshift-console/console-6699bdbc6b-7fbd7" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.691330 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/55f88401-be34-4486-8e8c-4c3be51ab251-console-config\") pod \"console-6699bdbc6b-7fbd7\" (UID: \"55f88401-be34-4486-8e8c-4c3be51ab251\") " pod="openshift-console/console-6699bdbc6b-7fbd7" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.691352 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5l2q\" (UniqueName: \"kubernetes.io/projected/55f88401-be34-4486-8e8c-4c3be51ab251-kube-api-access-n5l2q\") pod \"console-6699bdbc6b-7fbd7\" (UID: \"55f88401-be34-4486-8e8c-4c3be51ab251\") " pod="openshift-console/console-6699bdbc6b-7fbd7" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.691378 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/55f88401-be34-4486-8e8c-4c3be51ab251-console-oauth-config\") pod \"console-6699bdbc6b-7fbd7\" (UID: 
\"55f88401-be34-4486-8e8c-4c3be51ab251\") " pod="openshift-console/console-6699bdbc6b-7fbd7" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.691405 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/55f88401-be34-4486-8e8c-4c3be51ab251-oauth-serving-cert\") pod \"console-6699bdbc6b-7fbd7\" (UID: \"55f88401-be34-4486-8e8c-4c3be51ab251\") " pod="openshift-console/console-6699bdbc6b-7fbd7" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.691744 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/55f88401-be34-4486-8e8c-4c3be51ab251-service-ca\") pod \"console-6699bdbc6b-7fbd7\" (UID: \"55f88401-be34-4486-8e8c-4c3be51ab251\") " pod="openshift-console/console-6699bdbc6b-7fbd7" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.692109 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/55f88401-be34-4486-8e8c-4c3be51ab251-oauth-serving-cert\") pod \"console-6699bdbc6b-7fbd7\" (UID: \"55f88401-be34-4486-8e8c-4c3be51ab251\") " pod="openshift-console/console-6699bdbc6b-7fbd7" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.692620 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/55f88401-be34-4486-8e8c-4c3be51ab251-console-config\") pod \"console-6699bdbc6b-7fbd7\" (UID: \"55f88401-be34-4486-8e8c-4c3be51ab251\") " pod="openshift-console/console-6699bdbc6b-7fbd7" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.693115 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/55f88401-be34-4486-8e8c-4c3be51ab251-trusted-ca-bundle\") pod \"console-6699bdbc6b-7fbd7\" (UID: \"55f88401-be34-4486-8e8c-4c3be51ab251\") " 
pod="openshift-console/console-6699bdbc6b-7fbd7" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.696423 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/55f88401-be34-4486-8e8c-4c3be51ab251-console-oauth-config\") pod \"console-6699bdbc6b-7fbd7\" (UID: \"55f88401-be34-4486-8e8c-4c3be51ab251\") " pod="openshift-console/console-6699bdbc6b-7fbd7" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.700743 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/55f88401-be34-4486-8e8c-4c3be51ab251-console-serving-cert\") pod \"console-6699bdbc6b-7fbd7\" (UID: \"55f88401-be34-4486-8e8c-4c3be51ab251\") " pod="openshift-console/console-6699bdbc6b-7fbd7" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.705939 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5l2q\" (UniqueName: \"kubernetes.io/projected/55f88401-be34-4486-8e8c-4c3be51ab251-kube-api-access-n5l2q\") pod \"console-6699bdbc6b-7fbd7\" (UID: \"55f88401-be34-4486-8e8c-4c3be51ab251\") " pod="openshift-console/console-6699bdbc6b-7fbd7" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.802942 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6699bdbc6b-7fbd7" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.893331 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/f991b7a1-8af0-4ba7-aa0a-a5e08bdc11d8-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-gp2td\" (UID: \"f991b7a1-8af0-4ba7-aa0a-a5e08bdc11d8\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-gp2td" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.897929 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/f991b7a1-8af0-4ba7-aa0a-a5e08bdc11d8-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-gp2td\" (UID: \"f991b7a1-8af0-4ba7-aa0a-a5e08bdc11d8\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-gp2td" Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.927929 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-nk7zb"] Feb 19 21:40:12 crc kubenswrapper[4795]: W0219 21:40:12.944979 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c89273b_007f_44e6_88da_f48de3a5f03b.slice/crio-5613366bc7262aa9b87e14088524bc21ac6efc15f0a0bba4fefbf1e99942e393 WatchSource:0}: Error finding container 5613366bc7262aa9b87e14088524bc21ac6efc15f0a0bba4fefbf1e99942e393: Status 404 returned error can't find the container with id 5613366bc7262aa9b87e14088524bc21ac6efc15f0a0bba4fefbf1e99942e393 Feb 19 21:40:12 crc kubenswrapper[4795]: I0219 21:40:12.984992 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6699bdbc6b-7fbd7"] Feb 19 21:40:12 crc kubenswrapper[4795]: W0219 21:40:12.990416 4795 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55f88401_be34_4486_8e8c_4c3be51ab251.slice/crio-060a5e633139d5351f0cbb0878548a867828a9586a7491786a3d1147e8abd5b4 WatchSource:0}: Error finding container 060a5e633139d5351f0cbb0878548a867828a9586a7491786a3d1147e8abd5b4: Status 404 returned error can't find the container with id 060a5e633139d5351f0cbb0878548a867828a9586a7491786a3d1147e8abd5b4 Feb 19 21:40:13 crc kubenswrapper[4795]: I0219 21:40:13.166660 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-gp2td" Feb 19 21:40:13 crc kubenswrapper[4795]: I0219 21:40:13.172705 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-nk7zb" event={"ID":"6c89273b-007f-44e6-88da-f48de3a5f03b","Type":"ContainerStarted","Data":"5613366bc7262aa9b87e14088524bc21ac6efc15f0a0bba4fefbf1e99942e393"} Feb 19 21:40:13 crc kubenswrapper[4795]: I0219 21:40:13.174468 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-zqk47" event={"ID":"06d09723-c7bd-422c-b447-70dee244cc05","Type":"ContainerStarted","Data":"72c964e7fb5502abf39fea516705b245fe7b7c2087a640f5fdde882b2b1bb29b"} Feb 19 21:40:13 crc kubenswrapper[4795]: I0219 21:40:13.177213 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6699bdbc6b-7fbd7" event={"ID":"55f88401-be34-4486-8e8c-4c3be51ab251","Type":"ContainerStarted","Data":"0782a73adbea456f2d44b5878f555fc1836a9ec6d638da94a76199740bf9627d"} Feb 19 21:40:13 crc kubenswrapper[4795]: I0219 21:40:13.177252 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6699bdbc6b-7fbd7" event={"ID":"55f88401-be34-4486-8e8c-4c3be51ab251","Type":"ContainerStarted","Data":"060a5e633139d5351f0cbb0878548a867828a9586a7491786a3d1147e8abd5b4"} Feb 19 21:40:13 crc kubenswrapper[4795]: I0219 21:40:13.180131 4795 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-kgnfd" event={"ID":"1dfc7b5c-9302-4774-a6c8-e76ff4d60385","Type":"ContainerStarted","Data":"d669bb293befeeb26b605590e515a05af3aaf925a4aa01dfd16bce44738deef4"} Feb 19 21:40:13 crc kubenswrapper[4795]: I0219 21:40:13.193239 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6699bdbc6b-7fbd7" podStartSLOduration=1.193219206 podStartE2EDuration="1.193219206s" podCreationTimestamp="2026-02-19 21:40:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:40:13.192558537 +0000 UTC m=+724.385076401" watchObservedRunningTime="2026-02-19 21:40:13.193219206 +0000 UTC m=+724.385737060" Feb 19 21:40:13 crc kubenswrapper[4795]: I0219 21:40:13.364644 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-gp2td"] Feb 19 21:40:14 crc kubenswrapper[4795]: I0219 21:40:14.195035 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-gp2td" event={"ID":"f991b7a1-8af0-4ba7-aa0a-a5e08bdc11d8","Type":"ContainerStarted","Data":"fe32af036e454241c567cc8e0f61e0866559d9a7d3a4502e65dcb281e729761d"} Feb 19 21:40:15 crc kubenswrapper[4795]: I0219 21:40:15.206909 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-kgnfd" event={"ID":"1dfc7b5c-9302-4774-a6c8-e76ff4d60385","Type":"ContainerStarted","Data":"61115fc7fcf512e5a26159f193db4580d5a6a7507403b0bc4169c12cc951ca11"} Feb 19 21:40:15 crc kubenswrapper[4795]: I0219 21:40:15.208971 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-nk7zb" event={"ID":"6c89273b-007f-44e6-88da-f48de3a5f03b","Type":"ContainerStarted","Data":"42355c608a117e5b0c1f3f0b421dd26ea3d75e379af3aeb7703f92cd5be3c557"} Feb 19 21:40:15 crc 
kubenswrapper[4795]: I0219 21:40:15.209123 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-nk7zb" Feb 19 21:40:15 crc kubenswrapper[4795]: I0219 21:40:15.210698 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-zqk47" event={"ID":"06d09723-c7bd-422c-b447-70dee244cc05","Type":"ContainerStarted","Data":"d3f3ab5956da1b5c9ba5705b7eb80eda83b416a4d638ed678f93df9fc6895974"} Feb 19 21:40:15 crc kubenswrapper[4795]: I0219 21:40:15.210846 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-zqk47" Feb 19 21:40:15 crc kubenswrapper[4795]: I0219 21:40:15.225943 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-nk7zb" podStartSLOduration=1.2471275259999999 podStartE2EDuration="3.225925735s" podCreationTimestamp="2026-02-19 21:40:12 +0000 UTC" firstStartedPulling="2026-02-19 21:40:12.948430229 +0000 UTC m=+724.140948093" lastFinishedPulling="2026-02-19 21:40:14.927228428 +0000 UTC m=+726.119746302" observedRunningTime="2026-02-19 21:40:15.22465545 +0000 UTC m=+726.417173314" watchObservedRunningTime="2026-02-19 21:40:15.225925735 +0000 UTC m=+726.418443599" Feb 19 21:40:15 crc kubenswrapper[4795]: I0219 21:40:15.250253 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-zqk47" podStartSLOduration=0.92566856 podStartE2EDuration="3.250230097s" podCreationTimestamp="2026-02-19 21:40:12 +0000 UTC" firstStartedPulling="2026-02-19 21:40:12.559764275 +0000 UTC m=+723.752282139" lastFinishedPulling="2026-02-19 21:40:14.884325812 +0000 UTC m=+726.076843676" observedRunningTime="2026-02-19 21:40:15.243333596 +0000 UTC m=+726.435851470" watchObservedRunningTime="2026-02-19 21:40:15.250230097 +0000 UTC m=+726.442747961" Feb 19 21:40:16 crc kubenswrapper[4795]: I0219 21:40:16.221663 4795 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-gp2td" event={"ID":"f991b7a1-8af0-4ba7-aa0a-a5e08bdc11d8","Type":"ContainerStarted","Data":"b11adb14302c0d4d392ad94899855264089be2fef7f39e171d5e437b5cbf641d"} Feb 19 21:40:16 crc kubenswrapper[4795]: I0219 21:40:16.244407 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-gp2td" podStartSLOduration=1.845439686 podStartE2EDuration="4.244382259s" podCreationTimestamp="2026-02-19 21:40:12 +0000 UTC" firstStartedPulling="2026-02-19 21:40:13.370729533 +0000 UTC m=+724.563247397" lastFinishedPulling="2026-02-19 21:40:15.769672096 +0000 UTC m=+726.962189970" observedRunningTime="2026-02-19 21:40:16.235543724 +0000 UTC m=+727.428061608" watchObservedRunningTime="2026-02-19 21:40:16.244382259 +0000 UTC m=+727.436900123" Feb 19 21:40:17 crc kubenswrapper[4795]: I0219 21:40:17.226899 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-kgnfd" event={"ID":"1dfc7b5c-9302-4774-a6c8-e76ff4d60385","Type":"ContainerStarted","Data":"21cf6b304361dbb86a91906186b1f42903ce3f0af0e47747fd83a53d2746c9b0"} Feb 19 21:40:17 crc kubenswrapper[4795]: I0219 21:40:17.244548 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-58c85c668d-kgnfd" podStartSLOduration=1.193796351 podStartE2EDuration="5.244533455s" podCreationTimestamp="2026-02-19 21:40:12 +0000 UTC" firstStartedPulling="2026-02-19 21:40:12.675914956 +0000 UTC m=+723.868432820" lastFinishedPulling="2026-02-19 21:40:16.72665207 +0000 UTC m=+727.919169924" observedRunningTime="2026-02-19 21:40:17.240612437 +0000 UTC m=+728.433130321" watchObservedRunningTime="2026-02-19 21:40:17.244533455 +0000 UTC m=+728.437051319" Feb 19 21:40:22 crc kubenswrapper[4795]: I0219 21:40:22.529637 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-nmstate/nmstate-handler-zqk47" Feb 19 21:40:22 crc kubenswrapper[4795]: I0219 21:40:22.803649 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-6699bdbc6b-7fbd7" Feb 19 21:40:22 crc kubenswrapper[4795]: I0219 21:40:22.804240 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6699bdbc6b-7fbd7" Feb 19 21:40:22 crc kubenswrapper[4795]: I0219 21:40:22.811255 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6699bdbc6b-7fbd7" Feb 19 21:40:23 crc kubenswrapper[4795]: I0219 21:40:23.273588 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6699bdbc6b-7fbd7" Feb 19 21:40:23 crc kubenswrapper[4795]: I0219 21:40:23.336893 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-rvkhj"] Feb 19 21:40:28 crc kubenswrapper[4795]: I0219 21:40:28.427277 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 21:40:28 crc kubenswrapper[4795]: I0219 21:40:28.427651 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 21:40:32 crc kubenswrapper[4795]: I0219 21:40:32.490308 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-nk7zb" Feb 19 21:40:42 crc kubenswrapper[4795]: I0219 21:40:42.249212 4795 
dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 19 21:40:46 crc kubenswrapper[4795]: I0219 21:40:46.342143 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b9gzj"] Feb 19 21:40:46 crc kubenswrapper[4795]: I0219 21:40:46.343862 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b9gzj" Feb 19 21:40:46 crc kubenswrapper[4795]: I0219 21:40:46.346927 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 19 21:40:46 crc kubenswrapper[4795]: I0219 21:40:46.358015 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b9gzj"] Feb 19 21:40:46 crc kubenswrapper[4795]: I0219 21:40:46.445807 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8f281e72-3a5e-4abb-bbcb-7555808866be-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b9gzj\" (UID: \"8f281e72-3a5e-4abb-bbcb-7555808866be\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b9gzj" Feb 19 21:40:46 crc kubenswrapper[4795]: I0219 21:40:46.445888 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8f281e72-3a5e-4abb-bbcb-7555808866be-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b9gzj\" (UID: \"8f281e72-3a5e-4abb-bbcb-7555808866be\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b9gzj" Feb 19 21:40:46 crc kubenswrapper[4795]: I0219 21:40:46.446033 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7zj2\" (UniqueName: \"kubernetes.io/projected/8f281e72-3a5e-4abb-bbcb-7555808866be-kube-api-access-s7zj2\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b9gzj\" (UID: \"8f281e72-3a5e-4abb-bbcb-7555808866be\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b9gzj" Feb 19 21:40:46 crc kubenswrapper[4795]: I0219 21:40:46.547475 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8f281e72-3a5e-4abb-bbcb-7555808866be-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b9gzj\" (UID: \"8f281e72-3a5e-4abb-bbcb-7555808866be\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b9gzj" Feb 19 21:40:46 crc kubenswrapper[4795]: I0219 21:40:46.547530 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8f281e72-3a5e-4abb-bbcb-7555808866be-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b9gzj\" (UID: \"8f281e72-3a5e-4abb-bbcb-7555808866be\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b9gzj" Feb 19 21:40:46 crc kubenswrapper[4795]: I0219 21:40:46.547576 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7zj2\" (UniqueName: \"kubernetes.io/projected/8f281e72-3a5e-4abb-bbcb-7555808866be-kube-api-access-s7zj2\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b9gzj\" (UID: \"8f281e72-3a5e-4abb-bbcb-7555808866be\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b9gzj" Feb 19 21:40:46 crc kubenswrapper[4795]: I0219 21:40:46.547944 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/8f281e72-3a5e-4abb-bbcb-7555808866be-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b9gzj\" (UID: \"8f281e72-3a5e-4abb-bbcb-7555808866be\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b9gzj" Feb 19 21:40:46 crc kubenswrapper[4795]: I0219 21:40:46.548142 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8f281e72-3a5e-4abb-bbcb-7555808866be-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b9gzj\" (UID: \"8f281e72-3a5e-4abb-bbcb-7555808866be\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b9gzj" Feb 19 21:40:46 crc kubenswrapper[4795]: I0219 21:40:46.566475 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7zj2\" (UniqueName: \"kubernetes.io/projected/8f281e72-3a5e-4abb-bbcb-7555808866be-kube-api-access-s7zj2\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b9gzj\" (UID: \"8f281e72-3a5e-4abb-bbcb-7555808866be\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b9gzj" Feb 19 21:40:46 crc kubenswrapper[4795]: I0219 21:40:46.661214 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b9gzj" Feb 19 21:40:46 crc kubenswrapper[4795]: I0219 21:40:46.831653 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b9gzj"] Feb 19 21:40:47 crc kubenswrapper[4795]: I0219 21:40:47.405901 4795 generic.go:334] "Generic (PLEG): container finished" podID="8f281e72-3a5e-4abb-bbcb-7555808866be" containerID="8e8e52ed11c438f4bee44742883e14e18265ed4eb79cf0379b3f52cfd53cb52b" exitCode=0 Feb 19 21:40:47 crc kubenswrapper[4795]: I0219 21:40:47.405993 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b9gzj" event={"ID":"8f281e72-3a5e-4abb-bbcb-7555808866be","Type":"ContainerDied","Data":"8e8e52ed11c438f4bee44742883e14e18265ed4eb79cf0379b3f52cfd53cb52b"} Feb 19 21:40:47 crc kubenswrapper[4795]: I0219 21:40:47.406374 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b9gzj" event={"ID":"8f281e72-3a5e-4abb-bbcb-7555808866be","Type":"ContainerStarted","Data":"d5f862a402ff333110036da6599ba414ec2036ba5942154c2f4bc5a556c0dcb8"} Feb 19 21:40:48 crc kubenswrapper[4795]: I0219 21:40:48.398238 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-rvkhj" podUID="ec60d287-0f21-467c-8030-84b8726af567" containerName="console" containerID="cri-o://4035d8859f231da873e32f045fcb444bd306ddab0862e7e765fb5b6fd021c463" gracePeriod=15 Feb 19 21:40:48 crc kubenswrapper[4795]: I0219 21:40:48.704444 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zmvnt"] Feb 19 21:40:48 crc kubenswrapper[4795]: I0219 21:40:48.706065 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zmvnt" Feb 19 21:40:48 crc kubenswrapper[4795]: I0219 21:40:48.711318 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zmvnt"] Feb 19 21:40:48 crc kubenswrapper[4795]: I0219 21:40:48.773686 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-rvkhj_ec60d287-0f21-467c-8030-84b8726af567/console/0.log" Feb 19 21:40:48 crc kubenswrapper[4795]: I0219 21:40:48.773768 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-rvkhj" Feb 19 21:40:48 crc kubenswrapper[4795]: I0219 21:40:48.800794 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4118207c-5a68-4979-b9b6-eb22b17052b5-utilities\") pod \"redhat-operators-zmvnt\" (UID: \"4118207c-5a68-4979-b9b6-eb22b17052b5\") " pod="openshift-marketplace/redhat-operators-zmvnt" Feb 19 21:40:48 crc kubenswrapper[4795]: I0219 21:40:48.800856 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4118207c-5a68-4979-b9b6-eb22b17052b5-catalog-content\") pod \"redhat-operators-zmvnt\" (UID: \"4118207c-5a68-4979-b9b6-eb22b17052b5\") " pod="openshift-marketplace/redhat-operators-zmvnt" Feb 19 21:40:48 crc kubenswrapper[4795]: I0219 21:40:48.800888 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsv5f\" (UniqueName: \"kubernetes.io/projected/4118207c-5a68-4979-b9b6-eb22b17052b5-kube-api-access-rsv5f\") pod \"redhat-operators-zmvnt\" (UID: \"4118207c-5a68-4979-b9b6-eb22b17052b5\") " pod="openshift-marketplace/redhat-operators-zmvnt" Feb 19 21:40:48 crc kubenswrapper[4795]: I0219 21:40:48.902013 4795 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ec60d287-0f21-467c-8030-84b8726af567-console-config\") pod \"ec60d287-0f21-467c-8030-84b8726af567\" (UID: \"ec60d287-0f21-467c-8030-84b8726af567\") " Feb 19 21:40:48 crc kubenswrapper[4795]: I0219 21:40:48.902352 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec60d287-0f21-467c-8030-84b8726af567-trusted-ca-bundle\") pod \"ec60d287-0f21-467c-8030-84b8726af567\" (UID: \"ec60d287-0f21-467c-8030-84b8726af567\") " Feb 19 21:40:48 crc kubenswrapper[4795]: I0219 21:40:48.902451 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ec60d287-0f21-467c-8030-84b8726af567-service-ca\") pod \"ec60d287-0f21-467c-8030-84b8726af567\" (UID: \"ec60d287-0f21-467c-8030-84b8726af567\") " Feb 19 21:40:48 crc kubenswrapper[4795]: I0219 21:40:48.902483 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ec60d287-0f21-467c-8030-84b8726af567-oauth-serving-cert\") pod \"ec60d287-0f21-467c-8030-84b8726af567\" (UID: \"ec60d287-0f21-467c-8030-84b8726af567\") " Feb 19 21:40:48 crc kubenswrapper[4795]: I0219 21:40:48.902508 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ec60d287-0f21-467c-8030-84b8726af567-console-oauth-config\") pod \"ec60d287-0f21-467c-8030-84b8726af567\" (UID: \"ec60d287-0f21-467c-8030-84b8726af567\") " Feb 19 21:40:48 crc kubenswrapper[4795]: I0219 21:40:48.902542 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8p5td\" (UniqueName: \"kubernetes.io/projected/ec60d287-0f21-467c-8030-84b8726af567-kube-api-access-8p5td\") pod 
\"ec60d287-0f21-467c-8030-84b8726af567\" (UID: \"ec60d287-0f21-467c-8030-84b8726af567\") " Feb 19 21:40:48 crc kubenswrapper[4795]: I0219 21:40:48.902582 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ec60d287-0f21-467c-8030-84b8726af567-console-serving-cert\") pod \"ec60d287-0f21-467c-8030-84b8726af567\" (UID: \"ec60d287-0f21-467c-8030-84b8726af567\") " Feb 19 21:40:48 crc kubenswrapper[4795]: I0219 21:40:48.902752 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4118207c-5a68-4979-b9b6-eb22b17052b5-utilities\") pod \"redhat-operators-zmvnt\" (UID: \"4118207c-5a68-4979-b9b6-eb22b17052b5\") " pod="openshift-marketplace/redhat-operators-zmvnt" Feb 19 21:40:48 crc kubenswrapper[4795]: I0219 21:40:48.902765 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec60d287-0f21-467c-8030-84b8726af567-console-config" (OuterVolumeSpecName: "console-config") pod "ec60d287-0f21-467c-8030-84b8726af567" (UID: "ec60d287-0f21-467c-8030-84b8726af567"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:40:48 crc kubenswrapper[4795]: I0219 21:40:48.902881 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4118207c-5a68-4979-b9b6-eb22b17052b5-catalog-content\") pod \"redhat-operators-zmvnt\" (UID: \"4118207c-5a68-4979-b9b6-eb22b17052b5\") " pod="openshift-marketplace/redhat-operators-zmvnt" Feb 19 21:40:48 crc kubenswrapper[4795]: I0219 21:40:48.902972 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsv5f\" (UniqueName: \"kubernetes.io/projected/4118207c-5a68-4979-b9b6-eb22b17052b5-kube-api-access-rsv5f\") pod \"redhat-operators-zmvnt\" (UID: \"4118207c-5a68-4979-b9b6-eb22b17052b5\") " pod="openshift-marketplace/redhat-operators-zmvnt" Feb 19 21:40:48 crc kubenswrapper[4795]: I0219 21:40:48.903180 4795 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ec60d287-0f21-467c-8030-84b8726af567-console-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:40:48 crc kubenswrapper[4795]: I0219 21:40:48.903293 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec60d287-0f21-467c-8030-84b8726af567-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "ec60d287-0f21-467c-8030-84b8726af567" (UID: "ec60d287-0f21-467c-8030-84b8726af567"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:40:48 crc kubenswrapper[4795]: I0219 21:40:48.903318 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4118207c-5a68-4979-b9b6-eb22b17052b5-utilities\") pod \"redhat-operators-zmvnt\" (UID: \"4118207c-5a68-4979-b9b6-eb22b17052b5\") " pod="openshift-marketplace/redhat-operators-zmvnt" Feb 19 21:40:48 crc kubenswrapper[4795]: I0219 21:40:48.903310 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec60d287-0f21-467c-8030-84b8726af567-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "ec60d287-0f21-467c-8030-84b8726af567" (UID: "ec60d287-0f21-467c-8030-84b8726af567"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:40:48 crc kubenswrapper[4795]: I0219 21:40:48.903331 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec60d287-0f21-467c-8030-84b8726af567-service-ca" (OuterVolumeSpecName: "service-ca") pod "ec60d287-0f21-467c-8030-84b8726af567" (UID: "ec60d287-0f21-467c-8030-84b8726af567"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:40:48 crc kubenswrapper[4795]: I0219 21:40:48.903426 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4118207c-5a68-4979-b9b6-eb22b17052b5-catalog-content\") pod \"redhat-operators-zmvnt\" (UID: \"4118207c-5a68-4979-b9b6-eb22b17052b5\") " pod="openshift-marketplace/redhat-operators-zmvnt" Feb 19 21:40:48 crc kubenswrapper[4795]: I0219 21:40:48.908045 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec60d287-0f21-467c-8030-84b8726af567-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "ec60d287-0f21-467c-8030-84b8726af567" (UID: "ec60d287-0f21-467c-8030-84b8726af567"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:40:48 crc kubenswrapper[4795]: I0219 21:40:48.908129 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec60d287-0f21-467c-8030-84b8726af567-kube-api-access-8p5td" (OuterVolumeSpecName: "kube-api-access-8p5td") pod "ec60d287-0f21-467c-8030-84b8726af567" (UID: "ec60d287-0f21-467c-8030-84b8726af567"). InnerVolumeSpecName "kube-api-access-8p5td". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:40:48 crc kubenswrapper[4795]: I0219 21:40:48.910250 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec60d287-0f21-467c-8030-84b8726af567-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "ec60d287-0f21-467c-8030-84b8726af567" (UID: "ec60d287-0f21-467c-8030-84b8726af567"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:40:48 crc kubenswrapper[4795]: I0219 21:40:48.918453 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsv5f\" (UniqueName: \"kubernetes.io/projected/4118207c-5a68-4979-b9b6-eb22b17052b5-kube-api-access-rsv5f\") pod \"redhat-operators-zmvnt\" (UID: \"4118207c-5a68-4979-b9b6-eb22b17052b5\") " pod="openshift-marketplace/redhat-operators-zmvnt" Feb 19 21:40:49 crc kubenswrapper[4795]: I0219 21:40:49.004523 4795 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ec60d287-0f21-467c-8030-84b8726af567-service-ca\") on node \"crc\" DevicePath \"\"" Feb 19 21:40:49 crc kubenswrapper[4795]: I0219 21:40:49.004561 4795 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ec60d287-0f21-467c-8030-84b8726af567-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:40:49 crc kubenswrapper[4795]: I0219 21:40:49.004575 4795 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ec60d287-0f21-467c-8030-84b8726af567-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:40:49 crc kubenswrapper[4795]: I0219 21:40:49.004585 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8p5td\" (UniqueName: \"kubernetes.io/projected/ec60d287-0f21-467c-8030-84b8726af567-kube-api-access-8p5td\") on node \"crc\" DevicePath \"\"" Feb 19 21:40:49 crc kubenswrapper[4795]: I0219 21:40:49.004594 4795 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ec60d287-0f21-467c-8030-84b8726af567-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 21:40:49 crc kubenswrapper[4795]: I0219 21:40:49.004601 4795 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/ec60d287-0f21-467c-8030-84b8726af567-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:40:49 crc kubenswrapper[4795]: I0219 21:40:49.070774 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zmvnt" Feb 19 21:40:49 crc kubenswrapper[4795]: I0219 21:40:49.256878 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zmvnt"] Feb 19 21:40:49 crc kubenswrapper[4795]: W0219 21:40:49.264353 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4118207c_5a68_4979_b9b6_eb22b17052b5.slice/crio-c3720e6494d7da9d6757edc6343c06ba78fe938b5670ae300d8ec58fba2ab684 WatchSource:0}: Error finding container c3720e6494d7da9d6757edc6343c06ba78fe938b5670ae300d8ec58fba2ab684: Status 404 returned error can't find the container with id c3720e6494d7da9d6757edc6343c06ba78fe938b5670ae300d8ec58fba2ab684 Feb 19 21:40:49 crc kubenswrapper[4795]: I0219 21:40:49.417509 4795 generic.go:334] "Generic (PLEG): container finished" podID="8f281e72-3a5e-4abb-bbcb-7555808866be" containerID="282254e9f9d949e94e17360f3140e13e0f05261314c0d710cebe2dcd8af745c8" exitCode=0 Feb 19 21:40:49 crc kubenswrapper[4795]: I0219 21:40:49.417678 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b9gzj" event={"ID":"8f281e72-3a5e-4abb-bbcb-7555808866be","Type":"ContainerDied","Data":"282254e9f9d949e94e17360f3140e13e0f05261314c0d710cebe2dcd8af745c8"} Feb 19 21:40:49 crc kubenswrapper[4795]: I0219 21:40:49.419229 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-rvkhj_ec60d287-0f21-467c-8030-84b8726af567/console/0.log" Feb 19 21:40:49 crc kubenswrapper[4795]: I0219 21:40:49.419267 4795 generic.go:334] "Generic (PLEG): container finished" 
podID="ec60d287-0f21-467c-8030-84b8726af567" containerID="4035d8859f231da873e32f045fcb444bd306ddab0862e7e765fb5b6fd021c463" exitCode=2 Feb 19 21:40:49 crc kubenswrapper[4795]: I0219 21:40:49.419334 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-rvkhj" Feb 19 21:40:49 crc kubenswrapper[4795]: I0219 21:40:49.419359 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-rvkhj" event={"ID":"ec60d287-0f21-467c-8030-84b8726af567","Type":"ContainerDied","Data":"4035d8859f231da873e32f045fcb444bd306ddab0862e7e765fb5b6fd021c463"} Feb 19 21:40:49 crc kubenswrapper[4795]: I0219 21:40:49.419398 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-rvkhj" event={"ID":"ec60d287-0f21-467c-8030-84b8726af567","Type":"ContainerDied","Data":"01605015262ec0d283a1299b16fa7df4e9785d87441123f54856fb5a6f2abf61"} Feb 19 21:40:49 crc kubenswrapper[4795]: I0219 21:40:49.419421 4795 scope.go:117] "RemoveContainer" containerID="4035d8859f231da873e32f045fcb444bd306ddab0862e7e765fb5b6fd021c463" Feb 19 21:40:49 crc kubenswrapper[4795]: I0219 21:40:49.422216 4795 generic.go:334] "Generic (PLEG): container finished" podID="4118207c-5a68-4979-b9b6-eb22b17052b5" containerID="c429bcd1ab09eff2de0ad552c87f970faf827c32a10744ab563813583f821bcd" exitCode=0 Feb 19 21:40:49 crc kubenswrapper[4795]: I0219 21:40:49.422258 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zmvnt" event={"ID":"4118207c-5a68-4979-b9b6-eb22b17052b5","Type":"ContainerDied","Data":"c429bcd1ab09eff2de0ad552c87f970faf827c32a10744ab563813583f821bcd"} Feb 19 21:40:49 crc kubenswrapper[4795]: I0219 21:40:49.422272 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zmvnt" 
event={"ID":"4118207c-5a68-4979-b9b6-eb22b17052b5","Type":"ContainerStarted","Data":"c3720e6494d7da9d6757edc6343c06ba78fe938b5670ae300d8ec58fba2ab684"} Feb 19 21:40:49 crc kubenswrapper[4795]: I0219 21:40:49.469701 4795 scope.go:117] "RemoveContainer" containerID="4035d8859f231da873e32f045fcb444bd306ddab0862e7e765fb5b6fd021c463" Feb 19 21:40:49 crc kubenswrapper[4795]: E0219 21:40:49.470054 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4035d8859f231da873e32f045fcb444bd306ddab0862e7e765fb5b6fd021c463\": container with ID starting with 4035d8859f231da873e32f045fcb444bd306ddab0862e7e765fb5b6fd021c463 not found: ID does not exist" containerID="4035d8859f231da873e32f045fcb444bd306ddab0862e7e765fb5b6fd021c463" Feb 19 21:40:49 crc kubenswrapper[4795]: I0219 21:40:49.470083 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4035d8859f231da873e32f045fcb444bd306ddab0862e7e765fb5b6fd021c463"} err="failed to get container status \"4035d8859f231da873e32f045fcb444bd306ddab0862e7e765fb5b6fd021c463\": rpc error: code = NotFound desc = could not find container \"4035d8859f231da873e32f045fcb444bd306ddab0862e7e765fb5b6fd021c463\": container with ID starting with 4035d8859f231da873e32f045fcb444bd306ddab0862e7e765fb5b6fd021c463 not found: ID does not exist" Feb 19 21:40:49 crc kubenswrapper[4795]: I0219 21:40:49.522976 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-rvkhj"] Feb 19 21:40:49 crc kubenswrapper[4795]: I0219 21:40:49.525630 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-rvkhj"] Feb 19 21:40:50 crc kubenswrapper[4795]: I0219 21:40:50.430134 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zmvnt" 
event={"ID":"4118207c-5a68-4979-b9b6-eb22b17052b5","Type":"ContainerStarted","Data":"0ad256ff7ca3904d283dba61f3dcaf8929555e0b40d744d5939cd6ca169a84e6"} Feb 19 21:40:50 crc kubenswrapper[4795]: I0219 21:40:50.437998 4795 generic.go:334] "Generic (PLEG): container finished" podID="8f281e72-3a5e-4abb-bbcb-7555808866be" containerID="fc35abed06fc900449f1e23dd9b72c2de42312527d8bedd50f603c4383888de5" exitCode=0 Feb 19 21:40:50 crc kubenswrapper[4795]: I0219 21:40:50.438037 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b9gzj" event={"ID":"8f281e72-3a5e-4abb-bbcb-7555808866be","Type":"ContainerDied","Data":"fc35abed06fc900449f1e23dd9b72c2de42312527d8bedd50f603c4383888de5"} Feb 19 21:40:51 crc kubenswrapper[4795]: I0219 21:40:51.449585 4795 generic.go:334] "Generic (PLEG): container finished" podID="4118207c-5a68-4979-b9b6-eb22b17052b5" containerID="0ad256ff7ca3904d283dba61f3dcaf8929555e0b40d744d5939cd6ca169a84e6" exitCode=0 Feb 19 21:40:51 crc kubenswrapper[4795]: I0219 21:40:51.449678 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zmvnt" event={"ID":"4118207c-5a68-4979-b9b6-eb22b17052b5","Type":"ContainerDied","Data":"0ad256ff7ca3904d283dba61f3dcaf8929555e0b40d744d5939cd6ca169a84e6"} Feb 19 21:40:51 crc kubenswrapper[4795]: I0219 21:40:51.524976 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec60d287-0f21-467c-8030-84b8726af567" path="/var/lib/kubelet/pods/ec60d287-0f21-467c-8030-84b8726af567/volumes" Feb 19 21:40:51 crc kubenswrapper[4795]: I0219 21:40:51.693917 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b9gzj" Feb 19 21:40:51 crc kubenswrapper[4795]: I0219 21:40:51.841144 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8f281e72-3a5e-4abb-bbcb-7555808866be-bundle\") pod \"8f281e72-3a5e-4abb-bbcb-7555808866be\" (UID: \"8f281e72-3a5e-4abb-bbcb-7555808866be\") " Feb 19 21:40:51 crc kubenswrapper[4795]: I0219 21:40:51.841194 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8f281e72-3a5e-4abb-bbcb-7555808866be-util\") pod \"8f281e72-3a5e-4abb-bbcb-7555808866be\" (UID: \"8f281e72-3a5e-4abb-bbcb-7555808866be\") " Feb 19 21:40:51 crc kubenswrapper[4795]: I0219 21:40:51.841264 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7zj2\" (UniqueName: \"kubernetes.io/projected/8f281e72-3a5e-4abb-bbcb-7555808866be-kube-api-access-s7zj2\") pod \"8f281e72-3a5e-4abb-bbcb-7555808866be\" (UID: \"8f281e72-3a5e-4abb-bbcb-7555808866be\") " Feb 19 21:40:51 crc kubenswrapper[4795]: I0219 21:40:51.843316 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f281e72-3a5e-4abb-bbcb-7555808866be-bundle" (OuterVolumeSpecName: "bundle") pod "8f281e72-3a5e-4abb-bbcb-7555808866be" (UID: "8f281e72-3a5e-4abb-bbcb-7555808866be"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:40:51 crc kubenswrapper[4795]: I0219 21:40:51.847596 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f281e72-3a5e-4abb-bbcb-7555808866be-kube-api-access-s7zj2" (OuterVolumeSpecName: "kube-api-access-s7zj2") pod "8f281e72-3a5e-4abb-bbcb-7555808866be" (UID: "8f281e72-3a5e-4abb-bbcb-7555808866be"). InnerVolumeSpecName "kube-api-access-s7zj2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:40:51 crc kubenswrapper[4795]: I0219 21:40:51.864047 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f281e72-3a5e-4abb-bbcb-7555808866be-util" (OuterVolumeSpecName: "util") pod "8f281e72-3a5e-4abb-bbcb-7555808866be" (UID: "8f281e72-3a5e-4abb-bbcb-7555808866be"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:40:51 crc kubenswrapper[4795]: I0219 21:40:51.942325 4795 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8f281e72-3a5e-4abb-bbcb-7555808866be-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:40:51 crc kubenswrapper[4795]: I0219 21:40:51.942819 4795 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8f281e72-3a5e-4abb-bbcb-7555808866be-util\") on node \"crc\" DevicePath \"\"" Feb 19 21:40:51 crc kubenswrapper[4795]: I0219 21:40:51.942876 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7zj2\" (UniqueName: \"kubernetes.io/projected/8f281e72-3a5e-4abb-bbcb-7555808866be-kube-api-access-s7zj2\") on node \"crc\" DevicePath \"\"" Feb 19 21:40:52 crc kubenswrapper[4795]: I0219 21:40:52.458793 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zmvnt" event={"ID":"4118207c-5a68-4979-b9b6-eb22b17052b5","Type":"ContainerStarted","Data":"cc7dddef9a63f7dd6f05656e3a45747c50259dd5340d82e13ccdd95b6ceaa8c9"} Feb 19 21:40:52 crc kubenswrapper[4795]: I0219 21:40:52.460778 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b9gzj" event={"ID":"8f281e72-3a5e-4abb-bbcb-7555808866be","Type":"ContainerDied","Data":"d5f862a402ff333110036da6599ba414ec2036ba5942154c2f4bc5a556c0dcb8"} Feb 19 21:40:52 crc kubenswrapper[4795]: I0219 21:40:52.460830 4795 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d5f862a402ff333110036da6599ba414ec2036ba5942154c2f4bc5a556c0dcb8" Feb 19 21:40:52 crc kubenswrapper[4795]: I0219 21:40:52.460933 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b9gzj" Feb 19 21:40:52 crc kubenswrapper[4795]: I0219 21:40:52.526186 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zmvnt" podStartSLOduration=2.130334313 podStartE2EDuration="4.526150821s" podCreationTimestamp="2026-02-19 21:40:48 +0000 UTC" firstStartedPulling="2026-02-19 21:40:49.429647264 +0000 UTC m=+760.622165128" lastFinishedPulling="2026-02-19 21:40:51.825463762 +0000 UTC m=+763.017981636" observedRunningTime="2026-02-19 21:40:52.519311172 +0000 UTC m=+763.711829086" watchObservedRunningTime="2026-02-19 21:40:52.526150821 +0000 UTC m=+763.718668685" Feb 19 21:40:58 crc kubenswrapper[4795]: I0219 21:40:58.427190 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 21:40:58 crc kubenswrapper[4795]: I0219 21:40:58.427619 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 21:40:58 crc kubenswrapper[4795]: I0219 21:40:58.427661 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" Feb 19 21:40:58 crc 
kubenswrapper[4795]: I0219 21:40:58.428210 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"01e410588eeb6332f2520524efa20c5c33620bd277e99c49543b745d9bb370ca"} pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 21:40:58 crc kubenswrapper[4795]: I0219 21:40:58.428272 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" containerID="cri-o://01e410588eeb6332f2520524efa20c5c33620bd277e99c49543b745d9bb370ca" gracePeriod=600 Feb 19 21:40:59 crc kubenswrapper[4795]: I0219 21:40:59.071404 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zmvnt" Feb 19 21:40:59 crc kubenswrapper[4795]: I0219 21:40:59.071833 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zmvnt" Feb 19 21:40:59 crc kubenswrapper[4795]: I0219 21:40:59.111845 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zmvnt" Feb 19 21:40:59 crc kubenswrapper[4795]: I0219 21:40:59.498651 4795 generic.go:334] "Generic (PLEG): container finished" podID="7591bc58-96f5-486a-8653-0ad93938b019" containerID="01e410588eeb6332f2520524efa20c5c33620bd277e99c49543b745d9bb370ca" exitCode=0 Feb 19 21:40:59 crc kubenswrapper[4795]: I0219 21:40:59.498710 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerDied","Data":"01e410588eeb6332f2520524efa20c5c33620bd277e99c49543b745d9bb370ca"} Feb 19 21:40:59 crc kubenswrapper[4795]: 
I0219 21:40:59.498760 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerStarted","Data":"baca31a8ff8f8b420ab1c2fee031dade1a9efccb0543c74090535aa06f41da2f"} Feb 19 21:40:59 crc kubenswrapper[4795]: I0219 21:40:59.498782 4795 scope.go:117] "RemoveContainer" containerID="7b9a6596619327bd9c2f9e2c7e01d53fcd036e297997e2604d1242871dfda04a" Feb 19 21:40:59 crc kubenswrapper[4795]: I0219 21:40:59.543095 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zmvnt" Feb 19 21:41:01 crc kubenswrapper[4795]: I0219 21:41:01.499292 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zmvnt"] Feb 19 21:41:01 crc kubenswrapper[4795]: I0219 21:41:01.510229 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zmvnt" podUID="4118207c-5a68-4979-b9b6-eb22b17052b5" containerName="registry-server" containerID="cri-o://cc7dddef9a63f7dd6f05656e3a45747c50259dd5340d82e13ccdd95b6ceaa8c9" gracePeriod=2 Feb 19 21:41:01 crc kubenswrapper[4795]: I0219 21:41:01.613340 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-549cf7d797-tscrj"] Feb 19 21:41:01 crc kubenswrapper[4795]: E0219 21:41:01.613876 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f281e72-3a5e-4abb-bbcb-7555808866be" containerName="pull" Feb 19 21:41:01 crc kubenswrapper[4795]: I0219 21:41:01.613888 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f281e72-3a5e-4abb-bbcb-7555808866be" containerName="pull" Feb 19 21:41:01 crc kubenswrapper[4795]: E0219 21:41:01.613904 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f281e72-3a5e-4abb-bbcb-7555808866be" containerName="extract" Feb 19 21:41:01 crc 
kubenswrapper[4795]: I0219 21:41:01.613910 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f281e72-3a5e-4abb-bbcb-7555808866be" containerName="extract" Feb 19 21:41:01 crc kubenswrapper[4795]: E0219 21:41:01.613923 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f281e72-3a5e-4abb-bbcb-7555808866be" containerName="util" Feb 19 21:41:01 crc kubenswrapper[4795]: I0219 21:41:01.613929 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f281e72-3a5e-4abb-bbcb-7555808866be" containerName="util" Feb 19 21:41:01 crc kubenswrapper[4795]: E0219 21:41:01.613939 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec60d287-0f21-467c-8030-84b8726af567" containerName="console" Feb 19 21:41:01 crc kubenswrapper[4795]: I0219 21:41:01.613945 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec60d287-0f21-467c-8030-84b8726af567" containerName="console" Feb 19 21:41:01 crc kubenswrapper[4795]: I0219 21:41:01.614038 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f281e72-3a5e-4abb-bbcb-7555808866be" containerName="extract" Feb 19 21:41:01 crc kubenswrapper[4795]: I0219 21:41:01.614053 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec60d287-0f21-467c-8030-84b8726af567" containerName="console" Feb 19 21:41:01 crc kubenswrapper[4795]: I0219 21:41:01.614414 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-549cf7d797-tscrj" Feb 19 21:41:01 crc kubenswrapper[4795]: I0219 21:41:01.616208 4795 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Feb 19 21:41:01 crc kubenswrapper[4795]: I0219 21:41:01.616211 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Feb 19 21:41:01 crc kubenswrapper[4795]: I0219 21:41:01.616841 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Feb 19 21:41:01 crc kubenswrapper[4795]: I0219 21:41:01.632264 4795 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Feb 19 21:41:01 crc kubenswrapper[4795]: I0219 21:41:01.632583 4795 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-d4wz6" Feb 19 21:41:01 crc kubenswrapper[4795]: I0219 21:41:01.636367 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-549cf7d797-tscrj"] Feb 19 21:41:01 crc kubenswrapper[4795]: I0219 21:41:01.768976 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2eb889b2-1f23-4497-a779-5312fcd470b1-webhook-cert\") pod \"metallb-operator-controller-manager-549cf7d797-tscrj\" (UID: \"2eb889b2-1f23-4497-a779-5312fcd470b1\") " pod="metallb-system/metallb-operator-controller-manager-549cf7d797-tscrj" Feb 19 21:41:01 crc kubenswrapper[4795]: I0219 21:41:01.769020 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2eb889b2-1f23-4497-a779-5312fcd470b1-apiservice-cert\") pod \"metallb-operator-controller-manager-549cf7d797-tscrj\" (UID: 
\"2eb889b2-1f23-4497-a779-5312fcd470b1\") " pod="metallb-system/metallb-operator-controller-manager-549cf7d797-tscrj" Feb 19 21:41:01 crc kubenswrapper[4795]: I0219 21:41:01.769051 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcbx4\" (UniqueName: \"kubernetes.io/projected/2eb889b2-1f23-4497-a779-5312fcd470b1-kube-api-access-zcbx4\") pod \"metallb-operator-controller-manager-549cf7d797-tscrj\" (UID: \"2eb889b2-1f23-4497-a779-5312fcd470b1\") " pod="metallb-system/metallb-operator-controller-manager-549cf7d797-tscrj" Feb 19 21:41:01 crc kubenswrapper[4795]: I0219 21:41:01.869788 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcbx4\" (UniqueName: \"kubernetes.io/projected/2eb889b2-1f23-4497-a779-5312fcd470b1-kube-api-access-zcbx4\") pod \"metallb-operator-controller-manager-549cf7d797-tscrj\" (UID: \"2eb889b2-1f23-4497-a779-5312fcd470b1\") " pod="metallb-system/metallb-operator-controller-manager-549cf7d797-tscrj" Feb 19 21:41:01 crc kubenswrapper[4795]: I0219 21:41:01.869906 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2eb889b2-1f23-4497-a779-5312fcd470b1-webhook-cert\") pod \"metallb-operator-controller-manager-549cf7d797-tscrj\" (UID: \"2eb889b2-1f23-4497-a779-5312fcd470b1\") " pod="metallb-system/metallb-operator-controller-manager-549cf7d797-tscrj" Feb 19 21:41:01 crc kubenswrapper[4795]: I0219 21:41:01.869923 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2eb889b2-1f23-4497-a779-5312fcd470b1-apiservice-cert\") pod \"metallb-operator-controller-manager-549cf7d797-tscrj\" (UID: \"2eb889b2-1f23-4497-a779-5312fcd470b1\") " pod="metallb-system/metallb-operator-controller-manager-549cf7d797-tscrj" Feb 19 21:41:01 crc kubenswrapper[4795]: I0219 21:41:01.875393 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2eb889b2-1f23-4497-a779-5312fcd470b1-webhook-cert\") pod \"metallb-operator-controller-manager-549cf7d797-tscrj\" (UID: \"2eb889b2-1f23-4497-a779-5312fcd470b1\") " pod="metallb-system/metallb-operator-controller-manager-549cf7d797-tscrj" Feb 19 21:41:01 crc kubenswrapper[4795]: I0219 21:41:01.888933 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcbx4\" (UniqueName: \"kubernetes.io/projected/2eb889b2-1f23-4497-a779-5312fcd470b1-kube-api-access-zcbx4\") pod \"metallb-operator-controller-manager-549cf7d797-tscrj\" (UID: \"2eb889b2-1f23-4497-a779-5312fcd470b1\") " pod="metallb-system/metallb-operator-controller-manager-549cf7d797-tscrj" Feb 19 21:41:01 crc kubenswrapper[4795]: I0219 21:41:01.893920 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2eb889b2-1f23-4497-a779-5312fcd470b1-apiservice-cert\") pod \"metallb-operator-controller-manager-549cf7d797-tscrj\" (UID: \"2eb889b2-1f23-4497-a779-5312fcd470b1\") " pod="metallb-system/metallb-operator-controller-manager-549cf7d797-tscrj" Feb 19 21:41:01 crc kubenswrapper[4795]: I0219 21:41:01.965476 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zmvnt" Feb 19 21:41:01 crc kubenswrapper[4795]: I0219 21:41:01.966021 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-549cf7d797-tscrj" Feb 19 21:41:01 crc kubenswrapper[4795]: I0219 21:41:01.980423 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-7d8d766c-z8q6x"] Feb 19 21:41:01 crc kubenswrapper[4795]: E0219 21:41:01.980915 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4118207c-5a68-4979-b9b6-eb22b17052b5" containerName="extract-content" Feb 19 21:41:01 crc kubenswrapper[4795]: I0219 21:41:01.980930 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="4118207c-5a68-4979-b9b6-eb22b17052b5" containerName="extract-content" Feb 19 21:41:01 crc kubenswrapper[4795]: E0219 21:41:01.980944 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4118207c-5a68-4979-b9b6-eb22b17052b5" containerName="extract-utilities" Feb 19 21:41:01 crc kubenswrapper[4795]: I0219 21:41:01.980951 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="4118207c-5a68-4979-b9b6-eb22b17052b5" containerName="extract-utilities" Feb 19 21:41:01 crc kubenswrapper[4795]: E0219 21:41:01.980959 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4118207c-5a68-4979-b9b6-eb22b17052b5" containerName="registry-server" Feb 19 21:41:01 crc kubenswrapper[4795]: I0219 21:41:01.980965 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="4118207c-5a68-4979-b9b6-eb22b17052b5" containerName="registry-server" Feb 19 21:41:01 crc kubenswrapper[4795]: I0219 21:41:01.981052 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="4118207c-5a68-4979-b9b6-eb22b17052b5" containerName="registry-server" Feb 19 21:41:01 crc kubenswrapper[4795]: I0219 21:41:01.981391 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7d8d766c-z8q6x" Feb 19 21:41:01 crc kubenswrapper[4795]: I0219 21:41:01.983140 4795 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 19 21:41:01 crc kubenswrapper[4795]: I0219 21:41:01.983294 4795 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Feb 19 21:41:01 crc kubenswrapper[4795]: I0219 21:41:01.983698 4795 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-q6xth" Feb 19 21:41:01 crc kubenswrapper[4795]: I0219 21:41:01.993652 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7d8d766c-z8q6x"] Feb 19 21:41:02 crc kubenswrapper[4795]: I0219 21:41:02.128774 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4118207c-5a68-4979-b9b6-eb22b17052b5-utilities\") pod \"4118207c-5a68-4979-b9b6-eb22b17052b5\" (UID: \"4118207c-5a68-4979-b9b6-eb22b17052b5\") " Feb 19 21:41:02 crc kubenswrapper[4795]: I0219 21:41:02.129321 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4118207c-5a68-4979-b9b6-eb22b17052b5-catalog-content\") pod \"4118207c-5a68-4979-b9b6-eb22b17052b5\" (UID: \"4118207c-5a68-4979-b9b6-eb22b17052b5\") " Feb 19 21:41:02 crc kubenswrapper[4795]: I0219 21:41:02.129450 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rsv5f\" (UniqueName: \"kubernetes.io/projected/4118207c-5a68-4979-b9b6-eb22b17052b5-kube-api-access-rsv5f\") pod \"4118207c-5a68-4979-b9b6-eb22b17052b5\" (UID: \"4118207c-5a68-4979-b9b6-eb22b17052b5\") " Feb 19 21:41:02 crc kubenswrapper[4795]: I0219 21:41:02.129628 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/94a7e477-a2bd-4c46-8eb0-084260fade4a-webhook-cert\") pod \"metallb-operator-webhook-server-7d8d766c-z8q6x\" (UID: \"94a7e477-a2bd-4c46-8eb0-084260fade4a\") " pod="metallb-system/metallb-operator-webhook-server-7d8d766c-z8q6x" Feb 19 21:41:02 crc kubenswrapper[4795]: I0219 21:41:02.129705 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/94a7e477-a2bd-4c46-8eb0-084260fade4a-apiservice-cert\") pod \"metallb-operator-webhook-server-7d8d766c-z8q6x\" (UID: \"94a7e477-a2bd-4c46-8eb0-084260fade4a\") " pod="metallb-system/metallb-operator-webhook-server-7d8d766c-z8q6x" Feb 19 21:41:02 crc kubenswrapper[4795]: I0219 21:41:02.129869 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g457x\" (UniqueName: \"kubernetes.io/projected/94a7e477-a2bd-4c46-8eb0-084260fade4a-kube-api-access-g457x\") pod \"metallb-operator-webhook-server-7d8d766c-z8q6x\" (UID: \"94a7e477-a2bd-4c46-8eb0-084260fade4a\") " pod="metallb-system/metallb-operator-webhook-server-7d8d766c-z8q6x" Feb 19 21:41:02 crc kubenswrapper[4795]: I0219 21:41:02.129938 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4118207c-5a68-4979-b9b6-eb22b17052b5-utilities" (OuterVolumeSpecName: "utilities") pod "4118207c-5a68-4979-b9b6-eb22b17052b5" (UID: "4118207c-5a68-4979-b9b6-eb22b17052b5"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:41:02 crc kubenswrapper[4795]: I0219 21:41:02.136425 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4118207c-5a68-4979-b9b6-eb22b17052b5-kube-api-access-rsv5f" (OuterVolumeSpecName: "kube-api-access-rsv5f") pod "4118207c-5a68-4979-b9b6-eb22b17052b5" (UID: "4118207c-5a68-4979-b9b6-eb22b17052b5"). InnerVolumeSpecName "kube-api-access-rsv5f". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:41:02 crc kubenswrapper[4795]: I0219 21:41:02.230285 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g457x\" (UniqueName: \"kubernetes.io/projected/94a7e477-a2bd-4c46-8eb0-084260fade4a-kube-api-access-g457x\") pod \"metallb-operator-webhook-server-7d8d766c-z8q6x\" (UID: \"94a7e477-a2bd-4c46-8eb0-084260fade4a\") " pod="metallb-system/metallb-operator-webhook-server-7d8d766c-z8q6x" Feb 19 21:41:02 crc kubenswrapper[4795]: I0219 21:41:02.230328 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/94a7e477-a2bd-4c46-8eb0-084260fade4a-webhook-cert\") pod \"metallb-operator-webhook-server-7d8d766c-z8q6x\" (UID: \"94a7e477-a2bd-4c46-8eb0-084260fade4a\") " pod="metallb-system/metallb-operator-webhook-server-7d8d766c-z8q6x" Feb 19 21:41:02 crc kubenswrapper[4795]: I0219 21:41:02.230359 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/94a7e477-a2bd-4c46-8eb0-084260fade4a-apiservice-cert\") pod \"metallb-operator-webhook-server-7d8d766c-z8q6x\" (UID: \"94a7e477-a2bd-4c46-8eb0-084260fade4a\") " pod="metallb-system/metallb-operator-webhook-server-7d8d766c-z8q6x" Feb 19 21:41:02 crc kubenswrapper[4795]: I0219 21:41:02.230414 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rsv5f\" (UniqueName: 
\"kubernetes.io/projected/4118207c-5a68-4979-b9b6-eb22b17052b5-kube-api-access-rsv5f\") on node \"crc\" DevicePath \"\"" Feb 19 21:41:02 crc kubenswrapper[4795]: I0219 21:41:02.230424 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4118207c-5a68-4979-b9b6-eb22b17052b5-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 21:41:02 crc kubenswrapper[4795]: I0219 21:41:02.234967 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/94a7e477-a2bd-4c46-8eb0-084260fade4a-apiservice-cert\") pod \"metallb-operator-webhook-server-7d8d766c-z8q6x\" (UID: \"94a7e477-a2bd-4c46-8eb0-084260fade4a\") " pod="metallb-system/metallb-operator-webhook-server-7d8d766c-z8q6x" Feb 19 21:41:02 crc kubenswrapper[4795]: I0219 21:41:02.234982 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/94a7e477-a2bd-4c46-8eb0-084260fade4a-webhook-cert\") pod \"metallb-operator-webhook-server-7d8d766c-z8q6x\" (UID: \"94a7e477-a2bd-4c46-8eb0-084260fade4a\") " pod="metallb-system/metallb-operator-webhook-server-7d8d766c-z8q6x" Feb 19 21:41:02 crc kubenswrapper[4795]: I0219 21:41:02.250726 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g457x\" (UniqueName: \"kubernetes.io/projected/94a7e477-a2bd-4c46-8eb0-084260fade4a-kube-api-access-g457x\") pod \"metallb-operator-webhook-server-7d8d766c-z8q6x\" (UID: \"94a7e477-a2bd-4c46-8eb0-084260fade4a\") " pod="metallb-system/metallb-operator-webhook-server-7d8d766c-z8q6x" Feb 19 21:41:02 crc kubenswrapper[4795]: I0219 21:41:02.292986 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4118207c-5a68-4979-b9b6-eb22b17052b5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4118207c-5a68-4979-b9b6-eb22b17052b5" (UID: 
"4118207c-5a68-4979-b9b6-eb22b17052b5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:41:02 crc kubenswrapper[4795]: I0219 21:41:02.319646 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7d8d766c-z8q6x" Feb 19 21:41:02 crc kubenswrapper[4795]: I0219 21:41:02.332106 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4118207c-5a68-4979-b9b6-eb22b17052b5-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 21:41:02 crc kubenswrapper[4795]: I0219 21:41:02.481390 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-549cf7d797-tscrj"] Feb 19 21:41:02 crc kubenswrapper[4795]: W0219 21:41:02.488010 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2eb889b2_1f23_4497_a779_5312fcd470b1.slice/crio-8fc354a2031a50acbd85479a8cf74b120b4e788591fcc72267bcca2224a37c12 WatchSource:0}: Error finding container 8fc354a2031a50acbd85479a8cf74b120b4e788591fcc72267bcca2224a37c12: Status 404 returned error can't find the container with id 8fc354a2031a50acbd85479a8cf74b120b4e788591fcc72267bcca2224a37c12 Feb 19 21:41:02 crc kubenswrapper[4795]: I0219 21:41:02.523725 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-549cf7d797-tscrj" event={"ID":"2eb889b2-1f23-4497-a779-5312fcd470b1","Type":"ContainerStarted","Data":"8fc354a2031a50acbd85479a8cf74b120b4e788591fcc72267bcca2224a37c12"} Feb 19 21:41:02 crc kubenswrapper[4795]: I0219 21:41:02.533354 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7d8d766c-z8q6x"] Feb 19 21:41:02 crc kubenswrapper[4795]: I0219 21:41:02.535041 4795 generic.go:334] "Generic (PLEG): container finished" 
podID="4118207c-5a68-4979-b9b6-eb22b17052b5" containerID="cc7dddef9a63f7dd6f05656e3a45747c50259dd5340d82e13ccdd95b6ceaa8c9" exitCode=0 Feb 19 21:41:02 crc kubenswrapper[4795]: I0219 21:41:02.535071 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zmvnt" event={"ID":"4118207c-5a68-4979-b9b6-eb22b17052b5","Type":"ContainerDied","Data":"cc7dddef9a63f7dd6f05656e3a45747c50259dd5340d82e13ccdd95b6ceaa8c9"} Feb 19 21:41:02 crc kubenswrapper[4795]: I0219 21:41:02.535094 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zmvnt" event={"ID":"4118207c-5a68-4979-b9b6-eb22b17052b5","Type":"ContainerDied","Data":"c3720e6494d7da9d6757edc6343c06ba78fe938b5670ae300d8ec58fba2ab684"} Feb 19 21:41:02 crc kubenswrapper[4795]: I0219 21:41:02.535116 4795 scope.go:117] "RemoveContainer" containerID="cc7dddef9a63f7dd6f05656e3a45747c50259dd5340d82e13ccdd95b6ceaa8c9" Feb 19 21:41:02 crc kubenswrapper[4795]: I0219 21:41:02.535253 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zmvnt" Feb 19 21:41:02 crc kubenswrapper[4795]: I0219 21:41:02.555988 4795 scope.go:117] "RemoveContainer" containerID="0ad256ff7ca3904d283dba61f3dcaf8929555e0b40d744d5939cd6ca169a84e6" Feb 19 21:41:02 crc kubenswrapper[4795]: I0219 21:41:02.562876 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zmvnt"] Feb 19 21:41:02 crc kubenswrapper[4795]: I0219 21:41:02.576874 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zmvnt"] Feb 19 21:41:02 crc kubenswrapper[4795]: I0219 21:41:02.588951 4795 scope.go:117] "RemoveContainer" containerID="c429bcd1ab09eff2de0ad552c87f970faf827c32a10744ab563813583f821bcd" Feb 19 21:41:02 crc kubenswrapper[4795]: I0219 21:41:02.609374 4795 scope.go:117] "RemoveContainer" containerID="cc7dddef9a63f7dd6f05656e3a45747c50259dd5340d82e13ccdd95b6ceaa8c9" Feb 19 21:41:02 crc kubenswrapper[4795]: E0219 21:41:02.611366 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc7dddef9a63f7dd6f05656e3a45747c50259dd5340d82e13ccdd95b6ceaa8c9\": container with ID starting with cc7dddef9a63f7dd6f05656e3a45747c50259dd5340d82e13ccdd95b6ceaa8c9 not found: ID does not exist" containerID="cc7dddef9a63f7dd6f05656e3a45747c50259dd5340d82e13ccdd95b6ceaa8c9" Feb 19 21:41:02 crc kubenswrapper[4795]: I0219 21:41:02.611406 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc7dddef9a63f7dd6f05656e3a45747c50259dd5340d82e13ccdd95b6ceaa8c9"} err="failed to get container status \"cc7dddef9a63f7dd6f05656e3a45747c50259dd5340d82e13ccdd95b6ceaa8c9\": rpc error: code = NotFound desc = could not find container \"cc7dddef9a63f7dd6f05656e3a45747c50259dd5340d82e13ccdd95b6ceaa8c9\": container with ID starting with cc7dddef9a63f7dd6f05656e3a45747c50259dd5340d82e13ccdd95b6ceaa8c9 not found: ID does 
not exist" Feb 19 21:41:02 crc kubenswrapper[4795]: I0219 21:41:02.611435 4795 scope.go:117] "RemoveContainer" containerID="0ad256ff7ca3904d283dba61f3dcaf8929555e0b40d744d5939cd6ca169a84e6" Feb 19 21:41:02 crc kubenswrapper[4795]: E0219 21:41:02.621287 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ad256ff7ca3904d283dba61f3dcaf8929555e0b40d744d5939cd6ca169a84e6\": container with ID starting with 0ad256ff7ca3904d283dba61f3dcaf8929555e0b40d744d5939cd6ca169a84e6 not found: ID does not exist" containerID="0ad256ff7ca3904d283dba61f3dcaf8929555e0b40d744d5939cd6ca169a84e6" Feb 19 21:41:02 crc kubenswrapper[4795]: I0219 21:41:02.621333 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ad256ff7ca3904d283dba61f3dcaf8929555e0b40d744d5939cd6ca169a84e6"} err="failed to get container status \"0ad256ff7ca3904d283dba61f3dcaf8929555e0b40d744d5939cd6ca169a84e6\": rpc error: code = NotFound desc = could not find container \"0ad256ff7ca3904d283dba61f3dcaf8929555e0b40d744d5939cd6ca169a84e6\": container with ID starting with 0ad256ff7ca3904d283dba61f3dcaf8929555e0b40d744d5939cd6ca169a84e6 not found: ID does not exist" Feb 19 21:41:02 crc kubenswrapper[4795]: I0219 21:41:02.621359 4795 scope.go:117] "RemoveContainer" containerID="c429bcd1ab09eff2de0ad552c87f970faf827c32a10744ab563813583f821bcd" Feb 19 21:41:02 crc kubenswrapper[4795]: E0219 21:41:02.625260 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c429bcd1ab09eff2de0ad552c87f970faf827c32a10744ab563813583f821bcd\": container with ID starting with c429bcd1ab09eff2de0ad552c87f970faf827c32a10744ab563813583f821bcd not found: ID does not exist" containerID="c429bcd1ab09eff2de0ad552c87f970faf827c32a10744ab563813583f821bcd" Feb 19 21:41:02 crc kubenswrapper[4795]: I0219 21:41:02.625304 4795 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c429bcd1ab09eff2de0ad552c87f970faf827c32a10744ab563813583f821bcd"} err="failed to get container status \"c429bcd1ab09eff2de0ad552c87f970faf827c32a10744ab563813583f821bcd\": rpc error: code = NotFound desc = could not find container \"c429bcd1ab09eff2de0ad552c87f970faf827c32a10744ab563813583f821bcd\": container with ID starting with c429bcd1ab09eff2de0ad552c87f970faf827c32a10744ab563813583f821bcd not found: ID does not exist" Feb 19 21:41:03 crc kubenswrapper[4795]: I0219 21:41:03.520940 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4118207c-5a68-4979-b9b6-eb22b17052b5" path="/var/lib/kubelet/pods/4118207c-5a68-4979-b9b6-eb22b17052b5/volumes" Feb 19 21:41:03 crc kubenswrapper[4795]: I0219 21:41:03.545508 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7d8d766c-z8q6x" event={"ID":"94a7e477-a2bd-4c46-8eb0-084260fade4a","Type":"ContainerStarted","Data":"7ef25b170c2459d47bd81b8c4b15a2f3dcf9f6b31f799843a81af7a10f7025a1"} Feb 19 21:41:07 crc kubenswrapper[4795]: I0219 21:41:07.565338 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-549cf7d797-tscrj" event={"ID":"2eb889b2-1f23-4497-a779-5312fcd470b1","Type":"ContainerStarted","Data":"fd20c7d88200f4018d0640ab4e5f5ed4790e0d8e988c9bba85b4d3915a758139"} Feb 19 21:41:07 crc kubenswrapper[4795]: I0219 21:41:07.565864 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-549cf7d797-tscrj" Feb 19 21:41:07 crc kubenswrapper[4795]: I0219 21:41:07.566932 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7d8d766c-z8q6x" event={"ID":"94a7e477-a2bd-4c46-8eb0-084260fade4a","Type":"ContainerStarted","Data":"70a76006e68dfb5df6fddc68cf0e339cb3af018b44bfda881f8ff3833dbac99a"} Feb 19 21:41:07 crc 
kubenswrapper[4795]: I0219 21:41:07.567152 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-7d8d766c-z8q6x" Feb 19 21:41:07 crc kubenswrapper[4795]: I0219 21:41:07.588602 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-549cf7d797-tscrj" podStartSLOduration=2.482430395 podStartE2EDuration="6.588584061s" podCreationTimestamp="2026-02-19 21:41:01 +0000 UTC" firstStartedPulling="2026-02-19 21:41:02.491262396 +0000 UTC m=+773.683780260" lastFinishedPulling="2026-02-19 21:41:06.597416062 +0000 UTC m=+777.789933926" observedRunningTime="2026-02-19 21:41:07.585872152 +0000 UTC m=+778.778390056" watchObservedRunningTime="2026-02-19 21:41:07.588584061 +0000 UTC m=+778.781101945" Feb 19 21:41:22 crc kubenswrapper[4795]: I0219 21:41:22.323995 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-7d8d766c-z8q6x" Feb 19 21:41:22 crc kubenswrapper[4795]: I0219 21:41:22.353253 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-7d8d766c-z8q6x" podStartSLOduration=17.290933457 podStartE2EDuration="21.35322858s" podCreationTimestamp="2026-02-19 21:41:01 +0000 UTC" firstStartedPulling="2026-02-19 21:41:02.547577263 +0000 UTC m=+773.740095127" lastFinishedPulling="2026-02-19 21:41:06.609872386 +0000 UTC m=+777.802390250" observedRunningTime="2026-02-19 21:41:07.608249049 +0000 UTC m=+778.800766913" watchObservedRunningTime="2026-02-19 21:41:22.35322858 +0000 UTC m=+793.545746474" Feb 19 21:41:41 crc kubenswrapper[4795]: I0219 21:41:41.969533 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-549cf7d797-tscrj" Feb 19 21:41:42 crc kubenswrapper[4795]: I0219 21:41:42.755197 4795 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-5qtpm"] Feb 19 21:41:42 crc kubenswrapper[4795]: I0219 21:41:42.755866 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-5qtpm" Feb 19 21:41:42 crc kubenswrapper[4795]: I0219 21:41:42.759382 4795 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-n92rw" Feb 19 21:41:42 crc kubenswrapper[4795]: I0219 21:41:42.759536 4795 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Feb 19 21:41:42 crc kubenswrapper[4795]: I0219 21:41:42.769312 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-b7csh"] Feb 19 21:41:42 crc kubenswrapper[4795]: I0219 21:41:42.772862 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-b7csh" Feb 19 21:41:42 crc kubenswrapper[4795]: I0219 21:41:42.775456 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Feb 19 21:41:42 crc kubenswrapper[4795]: I0219 21:41:42.775501 4795 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Feb 19 21:41:42 crc kubenswrapper[4795]: I0219 21:41:42.817381 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-5qtpm"] Feb 19 21:41:42 crc kubenswrapper[4795]: I0219 21:41:42.840886 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-kmbww"] Feb 19 21:41:42 crc kubenswrapper[4795]: I0219 21:41:42.841970 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-kmbww" Feb 19 21:41:42 crc kubenswrapper[4795]: I0219 21:41:42.844475 4795 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Feb 19 21:41:42 crc kubenswrapper[4795]: I0219 21:41:42.844669 4795 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Feb 19 21:41:42 crc kubenswrapper[4795]: I0219 21:41:42.844842 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Feb 19 21:41:42 crc kubenswrapper[4795]: I0219 21:41:42.845234 4795 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-vfjqv" Feb 19 21:41:42 crc kubenswrapper[4795]: I0219 21:41:42.873914 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-69bbfbf88f-xrsfh"] Feb 19 21:41:42 crc kubenswrapper[4795]: I0219 21:41:42.874770 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-69bbfbf88f-xrsfh" Feb 19 21:41:42 crc kubenswrapper[4795]: I0219 21:41:42.876575 4795 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Feb 19 21:41:42 crc kubenswrapper[4795]: I0219 21:41:42.889412 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-69bbfbf88f-xrsfh"] Feb 19 21:41:42 crc kubenswrapper[4795]: I0219 21:41:42.952646 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/3c79ff86-c25b-45b2-9f84-d33c6264cc0a-reloader\") pod \"frr-k8s-b7csh\" (UID: \"3c79ff86-c25b-45b2-9f84-d33c6264cc0a\") " pod="metallb-system/frr-k8s-b7csh" Feb 19 21:41:42 crc kubenswrapper[4795]: I0219 21:41:42.952688 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/32ed0d55-a2df-4643-9283-e5bc8d1c993e-metrics-certs\") pod \"speaker-kmbww\" (UID: \"32ed0d55-a2df-4643-9283-e5bc8d1c993e\") " pod="metallb-system/speaker-kmbww" Feb 19 21:41:42 crc kubenswrapper[4795]: I0219 21:41:42.952981 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e32c1521-9c29-4d70-b4bb-54af4127daaf-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-5qtpm\" (UID: \"e32c1521-9c29-4d70-b4bb-54af4127daaf\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-5qtpm" Feb 19 21:41:42 crc kubenswrapper[4795]: I0219 21:41:42.953042 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/3c79ff86-c25b-45b2-9f84-d33c6264cc0a-frr-conf\") pod \"frr-k8s-b7csh\" (UID: \"3c79ff86-c25b-45b2-9f84-d33c6264cc0a\") " pod="metallb-system/frr-k8s-b7csh" Feb 19 21:41:42 crc kubenswrapper[4795]: I0219 
21:41:42.953092 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3c79ff86-c25b-45b2-9f84-d33c6264cc0a-metrics-certs\") pod \"frr-k8s-b7csh\" (UID: \"3c79ff86-c25b-45b2-9f84-d33c6264cc0a\") " pod="metallb-system/frr-k8s-b7csh" Feb 19 21:41:42 crc kubenswrapper[4795]: I0219 21:41:42.953123 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/32ed0d55-a2df-4643-9283-e5bc8d1c993e-memberlist\") pod \"speaker-kmbww\" (UID: \"32ed0d55-a2df-4643-9283-e5bc8d1c993e\") " pod="metallb-system/speaker-kmbww" Feb 19 21:41:42 crc kubenswrapper[4795]: I0219 21:41:42.953182 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/3c79ff86-c25b-45b2-9f84-d33c6264cc0a-frr-startup\") pod \"frr-k8s-b7csh\" (UID: \"3c79ff86-c25b-45b2-9f84-d33c6264cc0a\") " pod="metallb-system/frr-k8s-b7csh" Feb 19 21:41:42 crc kubenswrapper[4795]: I0219 21:41:42.953209 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/3c79ff86-c25b-45b2-9f84-d33c6264cc0a-frr-sockets\") pod \"frr-k8s-b7csh\" (UID: \"3c79ff86-c25b-45b2-9f84-d33c6264cc0a\") " pod="metallb-system/frr-k8s-b7csh" Feb 19 21:41:42 crc kubenswrapper[4795]: I0219 21:41:42.953232 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/3c79ff86-c25b-45b2-9f84-d33c6264cc0a-metrics\") pod \"frr-k8s-b7csh\" (UID: \"3c79ff86-c25b-45b2-9f84-d33c6264cc0a\") " pod="metallb-system/frr-k8s-b7csh" Feb 19 21:41:42 crc kubenswrapper[4795]: I0219 21:41:42.953254 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-pj6jp\" (UniqueName: \"kubernetes.io/projected/3c79ff86-c25b-45b2-9f84-d33c6264cc0a-kube-api-access-pj6jp\") pod \"frr-k8s-b7csh\" (UID: \"3c79ff86-c25b-45b2-9f84-d33c6264cc0a\") " pod="metallb-system/frr-k8s-b7csh" Feb 19 21:41:42 crc kubenswrapper[4795]: I0219 21:41:42.953283 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bq89b\" (UniqueName: \"kubernetes.io/projected/32ed0d55-a2df-4643-9283-e5bc8d1c993e-kube-api-access-bq89b\") pod \"speaker-kmbww\" (UID: \"32ed0d55-a2df-4643-9283-e5bc8d1c993e\") " pod="metallb-system/speaker-kmbww" Feb 19 21:41:42 crc kubenswrapper[4795]: I0219 21:41:42.953309 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chzkw\" (UniqueName: \"kubernetes.io/projected/e32c1521-9c29-4d70-b4bb-54af4127daaf-kube-api-access-chzkw\") pod \"frr-k8s-webhook-server-78b44bf5bb-5qtpm\" (UID: \"e32c1521-9c29-4d70-b4bb-54af4127daaf\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-5qtpm" Feb 19 21:41:42 crc kubenswrapper[4795]: I0219 21:41:42.953338 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/32ed0d55-a2df-4643-9283-e5bc8d1c993e-metallb-excludel2\") pod \"speaker-kmbww\" (UID: \"32ed0d55-a2df-4643-9283-e5bc8d1c993e\") " pod="metallb-system/speaker-kmbww" Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.054017 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/3c79ff86-c25b-45b2-9f84-d33c6264cc0a-metrics\") pod \"frr-k8s-b7csh\" (UID: \"3c79ff86-c25b-45b2-9f84-d33c6264cc0a\") " pod="metallb-system/frr-k8s-b7csh" Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.054061 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pj6jp\" 
(UniqueName: \"kubernetes.io/projected/3c79ff86-c25b-45b2-9f84-d33c6264cc0a-kube-api-access-pj6jp\") pod \"frr-k8s-b7csh\" (UID: \"3c79ff86-c25b-45b2-9f84-d33c6264cc0a\") " pod="metallb-system/frr-k8s-b7csh" Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.054096 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bq89b\" (UniqueName: \"kubernetes.io/projected/32ed0d55-a2df-4643-9283-e5bc8d1c993e-kube-api-access-bq89b\") pod \"speaker-kmbww\" (UID: \"32ed0d55-a2df-4643-9283-e5bc8d1c993e\") " pod="metallb-system/speaker-kmbww" Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.054115 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chzkw\" (UniqueName: \"kubernetes.io/projected/e32c1521-9c29-4d70-b4bb-54af4127daaf-kube-api-access-chzkw\") pod \"frr-k8s-webhook-server-78b44bf5bb-5qtpm\" (UID: \"e32c1521-9c29-4d70-b4bb-54af4127daaf\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-5qtpm" Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.054146 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/32ed0d55-a2df-4643-9283-e5bc8d1c993e-metallb-excludel2\") pod \"speaker-kmbww\" (UID: \"32ed0d55-a2df-4643-9283-e5bc8d1c993e\") " pod="metallb-system/speaker-kmbww" Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.054204 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/3c79ff86-c25b-45b2-9f84-d33c6264cc0a-reloader\") pod \"frr-k8s-b7csh\" (UID: \"3c79ff86-c25b-45b2-9f84-d33c6264cc0a\") " pod="metallb-system/frr-k8s-b7csh" Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.054226 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/1de9d50a-905a-43f1-bf1b-94d9f8ee7cc4-metrics-certs\") pod \"controller-69bbfbf88f-xrsfh\" (UID: \"1de9d50a-905a-43f1-bf1b-94d9f8ee7cc4\") " pod="metallb-system/controller-69bbfbf88f-xrsfh" Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.054246 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/32ed0d55-a2df-4643-9283-e5bc8d1c993e-metrics-certs\") pod \"speaker-kmbww\" (UID: \"32ed0d55-a2df-4643-9283-e5bc8d1c993e\") " pod="metallb-system/speaker-kmbww" Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.054264 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1de9d50a-905a-43f1-bf1b-94d9f8ee7cc4-cert\") pod \"controller-69bbfbf88f-xrsfh\" (UID: \"1de9d50a-905a-43f1-bf1b-94d9f8ee7cc4\") " pod="metallb-system/controller-69bbfbf88f-xrsfh" Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.054282 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e32c1521-9c29-4d70-b4bb-54af4127daaf-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-5qtpm\" (UID: \"e32c1521-9c29-4d70-b4bb-54af4127daaf\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-5qtpm" Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.054544 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/3c79ff86-c25b-45b2-9f84-d33c6264cc0a-frr-conf\") pod \"frr-k8s-b7csh\" (UID: \"3c79ff86-c25b-45b2-9f84-d33c6264cc0a\") " pod="metallb-system/frr-k8s-b7csh" Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.054566 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3c79ff86-c25b-45b2-9f84-d33c6264cc0a-metrics-certs\") pod \"frr-k8s-b7csh\" (UID: 
\"3c79ff86-c25b-45b2-9f84-d33c6264cc0a\") " pod="metallb-system/frr-k8s-b7csh" Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.054622 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/32ed0d55-a2df-4643-9283-e5bc8d1c993e-memberlist\") pod \"speaker-kmbww\" (UID: \"32ed0d55-a2df-4643-9283-e5bc8d1c993e\") " pod="metallb-system/speaker-kmbww" Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.054653 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2l5g\" (UniqueName: \"kubernetes.io/projected/1de9d50a-905a-43f1-bf1b-94d9f8ee7cc4-kube-api-access-h2l5g\") pod \"controller-69bbfbf88f-xrsfh\" (UID: \"1de9d50a-905a-43f1-bf1b-94d9f8ee7cc4\") " pod="metallb-system/controller-69bbfbf88f-xrsfh" Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.054668 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/3c79ff86-c25b-45b2-9f84-d33c6264cc0a-metrics\") pod \"frr-k8s-b7csh\" (UID: \"3c79ff86-c25b-45b2-9f84-d33c6264cc0a\") " pod="metallb-system/frr-k8s-b7csh" Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.054698 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/3c79ff86-c25b-45b2-9f84-d33c6264cc0a-frr-startup\") pod \"frr-k8s-b7csh\" (UID: \"3c79ff86-c25b-45b2-9f84-d33c6264cc0a\") " pod="metallb-system/frr-k8s-b7csh" Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.054751 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/3c79ff86-c25b-45b2-9f84-d33c6264cc0a-frr-sockets\") pod \"frr-k8s-b7csh\" (UID: \"3c79ff86-c25b-45b2-9f84-d33c6264cc0a\") " pod="metallb-system/frr-k8s-b7csh" Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.054798 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/3c79ff86-c25b-45b2-9f84-d33c6264cc0a-reloader\") pod \"frr-k8s-b7csh\" (UID: \"3c79ff86-c25b-45b2-9f84-d33c6264cc0a\") " pod="metallb-system/frr-k8s-b7csh" Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.054925 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/32ed0d55-a2df-4643-9283-e5bc8d1c993e-metallb-excludel2\") pod \"speaker-kmbww\" (UID: \"32ed0d55-a2df-4643-9283-e5bc8d1c993e\") " pod="metallb-system/speaker-kmbww" Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.055094 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/3c79ff86-c25b-45b2-9f84-d33c6264cc0a-frr-sockets\") pod \"frr-k8s-b7csh\" (UID: \"3c79ff86-c25b-45b2-9f84-d33c6264cc0a\") " pod="metallb-system/frr-k8s-b7csh" Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.055177 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/3c79ff86-c25b-45b2-9f84-d33c6264cc0a-frr-conf\") pod \"frr-k8s-b7csh\" (UID: \"3c79ff86-c25b-45b2-9f84-d33c6264cc0a\") " pod="metallb-system/frr-k8s-b7csh" Feb 19 21:41:43 crc kubenswrapper[4795]: E0219 21:41:43.055190 4795 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 19 21:41:43 crc kubenswrapper[4795]: E0219 21:41:43.055266 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32ed0d55-a2df-4643-9283-e5bc8d1c993e-memberlist podName:32ed0d55-a2df-4643-9283-e5bc8d1c993e nodeName:}" failed. No retries permitted until 2026-02-19 21:41:43.55524524 +0000 UTC m=+814.747763104 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/32ed0d55-a2df-4643-9283-e5bc8d1c993e-memberlist") pod "speaker-kmbww" (UID: "32ed0d55-a2df-4643-9283-e5bc8d1c993e") : secret "metallb-memberlist" not found Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.055413 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/3c79ff86-c25b-45b2-9f84-d33c6264cc0a-frr-startup\") pod \"frr-k8s-b7csh\" (UID: \"3c79ff86-c25b-45b2-9f84-d33c6264cc0a\") " pod="metallb-system/frr-k8s-b7csh" Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.063949 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e32c1521-9c29-4d70-b4bb-54af4127daaf-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-5qtpm\" (UID: \"e32c1521-9c29-4d70-b4bb-54af4127daaf\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-5qtpm" Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.064404 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/32ed0d55-a2df-4643-9283-e5bc8d1c993e-metrics-certs\") pod \"speaker-kmbww\" (UID: \"32ed0d55-a2df-4643-9283-e5bc8d1c993e\") " pod="metallb-system/speaker-kmbww" Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.065616 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3c79ff86-c25b-45b2-9f84-d33c6264cc0a-metrics-certs\") pod \"frr-k8s-b7csh\" (UID: \"3c79ff86-c25b-45b2-9f84-d33c6264cc0a\") " pod="metallb-system/frr-k8s-b7csh" Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.073228 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pj6jp\" (UniqueName: \"kubernetes.io/projected/3c79ff86-c25b-45b2-9f84-d33c6264cc0a-kube-api-access-pj6jp\") pod \"frr-k8s-b7csh\" (UID: 
\"3c79ff86-c25b-45b2-9f84-d33c6264cc0a\") " pod="metallb-system/frr-k8s-b7csh" Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.077896 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chzkw\" (UniqueName: \"kubernetes.io/projected/e32c1521-9c29-4d70-b4bb-54af4127daaf-kube-api-access-chzkw\") pod \"frr-k8s-webhook-server-78b44bf5bb-5qtpm\" (UID: \"e32c1521-9c29-4d70-b4bb-54af4127daaf\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-5qtpm" Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.089052 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-5qtpm" Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.089727 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bq89b\" (UniqueName: \"kubernetes.io/projected/32ed0d55-a2df-4643-9283-e5bc8d1c993e-kube-api-access-bq89b\") pod \"speaker-kmbww\" (UID: \"32ed0d55-a2df-4643-9283-e5bc8d1c993e\") " pod="metallb-system/speaker-kmbww" Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.098089 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-b7csh" Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.155496 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1de9d50a-905a-43f1-bf1b-94d9f8ee7cc4-metrics-certs\") pod \"controller-69bbfbf88f-xrsfh\" (UID: \"1de9d50a-905a-43f1-bf1b-94d9f8ee7cc4\") " pod="metallb-system/controller-69bbfbf88f-xrsfh" Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.155548 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1de9d50a-905a-43f1-bf1b-94d9f8ee7cc4-cert\") pod \"controller-69bbfbf88f-xrsfh\" (UID: \"1de9d50a-905a-43f1-bf1b-94d9f8ee7cc4\") " pod="metallb-system/controller-69bbfbf88f-xrsfh" Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.155613 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2l5g\" (UniqueName: \"kubernetes.io/projected/1de9d50a-905a-43f1-bf1b-94d9f8ee7cc4-kube-api-access-h2l5g\") pod \"controller-69bbfbf88f-xrsfh\" (UID: \"1de9d50a-905a-43f1-bf1b-94d9f8ee7cc4\") " pod="metallb-system/controller-69bbfbf88f-xrsfh" Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.159128 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1de9d50a-905a-43f1-bf1b-94d9f8ee7cc4-metrics-certs\") pod \"controller-69bbfbf88f-xrsfh\" (UID: \"1de9d50a-905a-43f1-bf1b-94d9f8ee7cc4\") " pod="metallb-system/controller-69bbfbf88f-xrsfh" Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.159493 4795 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.172758 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1de9d50a-905a-43f1-bf1b-94d9f8ee7cc4-cert\") 
pod \"controller-69bbfbf88f-xrsfh\" (UID: \"1de9d50a-905a-43f1-bf1b-94d9f8ee7cc4\") " pod="metallb-system/controller-69bbfbf88f-xrsfh" Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.181136 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2l5g\" (UniqueName: \"kubernetes.io/projected/1de9d50a-905a-43f1-bf1b-94d9f8ee7cc4-kube-api-access-h2l5g\") pod \"controller-69bbfbf88f-xrsfh\" (UID: \"1de9d50a-905a-43f1-bf1b-94d9f8ee7cc4\") " pod="metallb-system/controller-69bbfbf88f-xrsfh" Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.190886 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-69bbfbf88f-xrsfh" Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.336439 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-5qtpm"] Feb 19 21:41:43 crc kubenswrapper[4795]: W0219 21:41:43.341460 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode32c1521_9c29_4d70_b4bb_54af4127daaf.slice/crio-9047f712377ca3901ebb8e873e69efa04063d1504d4029f5ef3cb0edde012d3a WatchSource:0}: Error finding container 9047f712377ca3901ebb8e873e69efa04063d1504d4029f5ef3cb0edde012d3a: Status 404 returned error can't find the container with id 9047f712377ca3901ebb8e873e69efa04063d1504d4029f5ef3cb0edde012d3a Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.393593 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-69bbfbf88f-xrsfh"] Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.562604 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/32ed0d55-a2df-4643-9283-e5bc8d1c993e-memberlist\") pod \"speaker-kmbww\" (UID: \"32ed0d55-a2df-4643-9283-e5bc8d1c993e\") " pod="metallb-system/speaker-kmbww" Feb 19 21:41:43 crc 
kubenswrapper[4795]: I0219 21:41:43.567591 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/32ed0d55-a2df-4643-9283-e5bc8d1c993e-memberlist\") pod \"speaker-kmbww\" (UID: \"32ed0d55-a2df-4643-9283-e5bc8d1c993e\") " pod="metallb-system/speaker-kmbww" Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.763718 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-kmbww" Feb 19 21:41:43 crc kubenswrapper[4795]: W0219 21:41:43.781535 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32ed0d55_a2df_4643_9283_e5bc8d1c993e.slice/crio-671bd2731e10ef1d917a4416efcf739bbb775ba07351f4f4e615f454b1d45adb WatchSource:0}: Error finding container 671bd2731e10ef1d917a4416efcf739bbb775ba07351f4f4e615f454b1d45adb: Status 404 returned error can't find the container with id 671bd2731e10ef1d917a4416efcf739bbb775ba07351f4f4e615f454b1d45adb Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.793985 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-b7csh" event={"ID":"3c79ff86-c25b-45b2-9f84-d33c6264cc0a","Type":"ContainerStarted","Data":"7d3f22e5a3be1edd5e4cce1e6f3e38eda25c44cfa45aa75be523de02578b39be"} Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.795547 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-xrsfh" event={"ID":"1de9d50a-905a-43f1-bf1b-94d9f8ee7cc4","Type":"ContainerStarted","Data":"4f41af9995dddda2fcfffe35d4232bad62a2c5f7794bfdbeeccc1aacf7e5ada9"} Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.795571 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-xrsfh" event={"ID":"1de9d50a-905a-43f1-bf1b-94d9f8ee7cc4","Type":"ContainerStarted","Data":"e0583de77f00d9cff1a8dcebb7c4662615467eeed68c003aba755f57c3480df7"} Feb 19 21:41:43 
crc kubenswrapper[4795]: I0219 21:41:43.795581 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-xrsfh" event={"ID":"1de9d50a-905a-43f1-bf1b-94d9f8ee7cc4","Type":"ContainerStarted","Data":"1d8085a3baa07142c8904cbb51aa56d861810608e02e76f8ef21d6469ddf0f48"} Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.795949 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-69bbfbf88f-xrsfh" Feb 19 21:41:43 crc kubenswrapper[4795]: I0219 21:41:43.796747 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-5qtpm" event={"ID":"e32c1521-9c29-4d70-b4bb-54af4127daaf","Type":"ContainerStarted","Data":"9047f712377ca3901ebb8e873e69efa04063d1504d4029f5ef3cb0edde012d3a"} Feb 19 21:41:44 crc kubenswrapper[4795]: I0219 21:41:44.820093 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-kmbww" event={"ID":"32ed0d55-a2df-4643-9283-e5bc8d1c993e","Type":"ContainerStarted","Data":"0670aa1ec4bddfa996238282d946eb3787421f8b28dd329074767c8497d54509"} Feb 19 21:41:44 crc kubenswrapper[4795]: I0219 21:41:44.820143 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-kmbww" event={"ID":"32ed0d55-a2df-4643-9283-e5bc8d1c993e","Type":"ContainerStarted","Data":"791ad1189990959af46b227040aa14ad78fed65e30d9ade491c0f1c1235f68ca"} Feb 19 21:41:44 crc kubenswrapper[4795]: I0219 21:41:44.820158 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-kmbww" event={"ID":"32ed0d55-a2df-4643-9283-e5bc8d1c993e","Type":"ContainerStarted","Data":"671bd2731e10ef1d917a4416efcf739bbb775ba07351f4f4e615f454b1d45adb"} Feb 19 21:41:44 crc kubenswrapper[4795]: I0219 21:41:44.820416 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-kmbww" Feb 19 21:41:44 crc kubenswrapper[4795]: I0219 21:41:44.836414 4795 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-69bbfbf88f-xrsfh" podStartSLOduration=2.836394956 podStartE2EDuration="2.836394956s" podCreationTimestamp="2026-02-19 21:41:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:41:43.820429138 +0000 UTC m=+815.012947072" watchObservedRunningTime="2026-02-19 21:41:44.836394956 +0000 UTC m=+816.028912820" Feb 19 21:41:44 crc kubenswrapper[4795]: I0219 21:41:44.836854 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-kmbww" podStartSLOduration=2.836847339 podStartE2EDuration="2.836847339s" podCreationTimestamp="2026-02-19 21:41:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:41:44.834013259 +0000 UTC m=+816.026531133" watchObservedRunningTime="2026-02-19 21:41:44.836847339 +0000 UTC m=+816.029365193" Feb 19 21:41:49 crc kubenswrapper[4795]: I0219 21:41:49.858487 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-b7csh" event={"ID":"3c79ff86-c25b-45b2-9f84-d33c6264cc0a","Type":"ContainerStarted","Data":"d73d829479256056c62fe2ac838de7fd175399aa9f5a3f8e7b6ce485e7ac517c"} Feb 19 21:41:50 crc kubenswrapper[4795]: I0219 21:41:50.866876 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-5qtpm" event={"ID":"e32c1521-9c29-4d70-b4bb-54af4127daaf","Type":"ContainerStarted","Data":"d04c81a2b1179a537195634390212769771129f49fec48d33bca709345343559"} Feb 19 21:41:50 crc kubenswrapper[4795]: I0219 21:41:50.866975 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-5qtpm" Feb 19 21:41:50 crc kubenswrapper[4795]: I0219 21:41:50.869049 4795 generic.go:334] "Generic (PLEG): 
container finished" podID="3c79ff86-c25b-45b2-9f84-d33c6264cc0a" containerID="d73d829479256056c62fe2ac838de7fd175399aa9f5a3f8e7b6ce485e7ac517c" exitCode=0 Feb 19 21:41:50 crc kubenswrapper[4795]: I0219 21:41:50.869105 4795 generic.go:334] "Generic (PLEG): container finished" podID="3c79ff86-c25b-45b2-9f84-d33c6264cc0a" containerID="72490e311d8a8f57f4ff25e7a8c8b4561c1442d47f363f83bb4f177d5c9b8a35" exitCode=0 Feb 19 21:41:50 crc kubenswrapper[4795]: I0219 21:41:50.869143 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-b7csh" event={"ID":"3c79ff86-c25b-45b2-9f84-d33c6264cc0a","Type":"ContainerDied","Data":"d73d829479256056c62fe2ac838de7fd175399aa9f5a3f8e7b6ce485e7ac517c"} Feb 19 21:41:50 crc kubenswrapper[4795]: I0219 21:41:50.869213 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-b7csh" event={"ID":"3c79ff86-c25b-45b2-9f84-d33c6264cc0a","Type":"ContainerDied","Data":"72490e311d8a8f57f4ff25e7a8c8b4561c1442d47f363f83bb4f177d5c9b8a35"} Feb 19 21:41:50 crc kubenswrapper[4795]: I0219 21:41:50.937752 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-5qtpm" podStartSLOduration=2.590851862 podStartE2EDuration="8.937729746s" podCreationTimestamp="2026-02-19 21:41:42 +0000 UTC" firstStartedPulling="2026-02-19 21:41:43.3435216 +0000 UTC m=+814.536039464" lastFinishedPulling="2026-02-19 21:41:49.690399474 +0000 UTC m=+820.882917348" observedRunningTime="2026-02-19 21:41:50.898394104 +0000 UTC m=+822.090911978" watchObservedRunningTime="2026-02-19 21:41:50.937729746 +0000 UTC m=+822.130247640" Feb 19 21:41:51 crc kubenswrapper[4795]: I0219 21:41:51.879995 4795 generic.go:334] "Generic (PLEG): container finished" podID="3c79ff86-c25b-45b2-9f84-d33c6264cc0a" containerID="3b7cad17e35684cde452421d7ca4e7d24377b7d18f62c3a556b036e8654918a2" exitCode=0 Feb 19 21:41:51 crc kubenswrapper[4795]: I0219 21:41:51.880094 4795 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-b7csh" event={"ID":"3c79ff86-c25b-45b2-9f84-d33c6264cc0a","Type":"ContainerDied","Data":"3b7cad17e35684cde452421d7ca4e7d24377b7d18f62c3a556b036e8654918a2"} Feb 19 21:41:52 crc kubenswrapper[4795]: I0219 21:41:52.892386 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-b7csh" event={"ID":"3c79ff86-c25b-45b2-9f84-d33c6264cc0a","Type":"ContainerStarted","Data":"ef29568524bb229975514ef0788a7abc9a9bde52e2f211668185fbab6225b011"} Feb 19 21:41:52 crc kubenswrapper[4795]: I0219 21:41:52.892689 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-b7csh" event={"ID":"3c79ff86-c25b-45b2-9f84-d33c6264cc0a","Type":"ContainerStarted","Data":"63c8caad8691f8ca95be952545c23d51ef9eb1af270f0d0f7f162802ed5cd007"} Feb 19 21:41:52 crc kubenswrapper[4795]: I0219 21:41:52.892700 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-b7csh" event={"ID":"3c79ff86-c25b-45b2-9f84-d33c6264cc0a","Type":"ContainerStarted","Data":"deeafe2b27fe4d72012e098dbfc492988ac9174c03f3cf05903e8c59f99be818"} Feb 19 21:41:52 crc kubenswrapper[4795]: I0219 21:41:52.892713 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-b7csh" event={"ID":"3c79ff86-c25b-45b2-9f84-d33c6264cc0a","Type":"ContainerStarted","Data":"83b40f6c5992832cf7358444e78ce7f4c611860f9d7a246847e859075b139159"} Feb 19 21:41:52 crc kubenswrapper[4795]: I0219 21:41:52.892721 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-b7csh" event={"ID":"3c79ff86-c25b-45b2-9f84-d33c6264cc0a","Type":"ContainerStarted","Data":"e7f43fbada166b61bac7c815dce8910ab401ca78b6f4dc559075e40b9f92f0f5"} Feb 19 21:41:53 crc kubenswrapper[4795]: I0219 21:41:53.195229 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-69bbfbf88f-xrsfh" Feb 19 21:41:53 crc kubenswrapper[4795]: I0219 21:41:53.767310 4795 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-kmbww" Feb 19 21:41:53 crc kubenswrapper[4795]: I0219 21:41:53.902473 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-b7csh" event={"ID":"3c79ff86-c25b-45b2-9f84-d33c6264cc0a","Type":"ContainerStarted","Data":"543e34a58bf0cba90d368232b34f75d0ecc48f7e93a76053516cff14262fb13d"} Feb 19 21:41:53 crc kubenswrapper[4795]: I0219 21:41:53.902659 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-b7csh" Feb 19 21:41:53 crc kubenswrapper[4795]: I0219 21:41:53.925982 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-b7csh" podStartSLOduration=5.521059606 podStartE2EDuration="11.925964847s" podCreationTimestamp="2026-02-19 21:41:42 +0000 UTC" firstStartedPulling="2026-02-19 21:41:43.268985011 +0000 UTC m=+814.461502875" lastFinishedPulling="2026-02-19 21:41:49.673890252 +0000 UTC m=+820.866408116" observedRunningTime="2026-02-19 21:41:53.923029535 +0000 UTC m=+825.115547419" watchObservedRunningTime="2026-02-19 21:41:53.925964847 +0000 UTC m=+825.118482721" Feb 19 21:41:55 crc kubenswrapper[4795]: I0219 21:41:55.148022 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twkn6"] Feb 19 21:41:55 crc kubenswrapper[4795]: I0219 21:41:55.149685 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twkn6" Feb 19 21:41:55 crc kubenswrapper[4795]: I0219 21:41:55.152297 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 19 21:41:55 crc kubenswrapper[4795]: I0219 21:41:55.167374 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twkn6"] Feb 19 21:41:55 crc kubenswrapper[4795]: I0219 21:41:55.199869 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pj24f\" (UniqueName: \"kubernetes.io/projected/5aba51a3-e783-497f-b58e-dcd4e631b0e9-kube-api-access-pj24f\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twkn6\" (UID: \"5aba51a3-e783-497f-b58e-dcd4e631b0e9\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twkn6" Feb 19 21:41:55 crc kubenswrapper[4795]: I0219 21:41:55.199986 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5aba51a3-e783-497f-b58e-dcd4e631b0e9-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twkn6\" (UID: \"5aba51a3-e783-497f-b58e-dcd4e631b0e9\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twkn6" Feb 19 21:41:55 crc kubenswrapper[4795]: I0219 21:41:55.200019 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5aba51a3-e783-497f-b58e-dcd4e631b0e9-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twkn6\" (UID: \"5aba51a3-e783-497f-b58e-dcd4e631b0e9\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twkn6" Feb 19 21:41:55 crc kubenswrapper[4795]: 
I0219 21:41:55.301548 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pj24f\" (UniqueName: \"kubernetes.io/projected/5aba51a3-e783-497f-b58e-dcd4e631b0e9-kube-api-access-pj24f\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twkn6\" (UID: \"5aba51a3-e783-497f-b58e-dcd4e631b0e9\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twkn6" Feb 19 21:41:55 crc kubenswrapper[4795]: I0219 21:41:55.301641 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5aba51a3-e783-497f-b58e-dcd4e631b0e9-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twkn6\" (UID: \"5aba51a3-e783-497f-b58e-dcd4e631b0e9\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twkn6" Feb 19 21:41:55 crc kubenswrapper[4795]: I0219 21:41:55.301697 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5aba51a3-e783-497f-b58e-dcd4e631b0e9-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twkn6\" (UID: \"5aba51a3-e783-497f-b58e-dcd4e631b0e9\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twkn6" Feb 19 21:41:55 crc kubenswrapper[4795]: I0219 21:41:55.302342 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5aba51a3-e783-497f-b58e-dcd4e631b0e9-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twkn6\" (UID: \"5aba51a3-e783-497f-b58e-dcd4e631b0e9\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twkn6" Feb 19 21:41:55 crc kubenswrapper[4795]: I0219 21:41:55.302943 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/5aba51a3-e783-497f-b58e-dcd4e631b0e9-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twkn6\" (UID: \"5aba51a3-e783-497f-b58e-dcd4e631b0e9\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twkn6" Feb 19 21:41:55 crc kubenswrapper[4795]: I0219 21:41:55.329055 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pj24f\" (UniqueName: \"kubernetes.io/projected/5aba51a3-e783-497f-b58e-dcd4e631b0e9-kube-api-access-pj24f\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twkn6\" (UID: \"5aba51a3-e783-497f-b58e-dcd4e631b0e9\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twkn6" Feb 19 21:41:55 crc kubenswrapper[4795]: I0219 21:41:55.470140 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twkn6" Feb 19 21:41:55 crc kubenswrapper[4795]: I0219 21:41:55.671492 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twkn6"] Feb 19 21:41:55 crc kubenswrapper[4795]: I0219 21:41:55.914802 4795 generic.go:334] "Generic (PLEG): container finished" podID="5aba51a3-e783-497f-b58e-dcd4e631b0e9" containerID="28f884510467f3d0ccbca8dbc1657f80878d0035b8902e2a73fc0ce4be4db940" exitCode=0 Feb 19 21:41:55 crc kubenswrapper[4795]: I0219 21:41:55.914841 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twkn6" event={"ID":"5aba51a3-e783-497f-b58e-dcd4e631b0e9","Type":"ContainerDied","Data":"28f884510467f3d0ccbca8dbc1657f80878d0035b8902e2a73fc0ce4be4db940"} Feb 19 21:41:55 crc kubenswrapper[4795]: I0219 21:41:55.914870 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twkn6" event={"ID":"5aba51a3-e783-497f-b58e-dcd4e631b0e9","Type":"ContainerStarted","Data":"1fc9a3d11f64d5cef1ac53a01fa7a06ddaf1046b3d7b78ab66ef1091c7d79560"} Feb 19 21:41:58 crc kubenswrapper[4795]: I0219 21:41:58.099091 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-b7csh" Feb 19 21:41:58 crc kubenswrapper[4795]: I0219 21:41:58.149640 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-b7csh" Feb 19 21:41:59 crc kubenswrapper[4795]: I0219 21:41:59.938976 4795 generic.go:334] "Generic (PLEG): container finished" podID="5aba51a3-e783-497f-b58e-dcd4e631b0e9" containerID="648f73eb6a7a5b928f64dce45c3c576ce679739a989f57b6d95eaedcd3692b3e" exitCode=0 Feb 19 21:41:59 crc kubenswrapper[4795]: I0219 21:41:59.939035 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twkn6" event={"ID":"5aba51a3-e783-497f-b58e-dcd4e631b0e9","Type":"ContainerDied","Data":"648f73eb6a7a5b928f64dce45c3c576ce679739a989f57b6d95eaedcd3692b3e"} Feb 19 21:42:00 crc kubenswrapper[4795]: I0219 21:42:00.951304 4795 generic.go:334] "Generic (PLEG): container finished" podID="5aba51a3-e783-497f-b58e-dcd4e631b0e9" containerID="e3730bdcb19372ac8c0b9369d1d0af34244b2d30d4da4dfcb34d21e8405a0a67" exitCode=0 Feb 19 21:42:00 crc kubenswrapper[4795]: I0219 21:42:00.951581 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twkn6" event={"ID":"5aba51a3-e783-497f-b58e-dcd4e631b0e9","Type":"ContainerDied","Data":"e3730bdcb19372ac8c0b9369d1d0af34244b2d30d4da4dfcb34d21e8405a0a67"} Feb 19 21:42:02 crc kubenswrapper[4795]: I0219 21:42:02.210635 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twkn6" Feb 19 21:42:02 crc kubenswrapper[4795]: I0219 21:42:02.392860 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5aba51a3-e783-497f-b58e-dcd4e631b0e9-util\") pod \"5aba51a3-e783-497f-b58e-dcd4e631b0e9\" (UID: \"5aba51a3-e783-497f-b58e-dcd4e631b0e9\") " Feb 19 21:42:02 crc kubenswrapper[4795]: I0219 21:42:02.392988 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj24f\" (UniqueName: \"kubernetes.io/projected/5aba51a3-e783-497f-b58e-dcd4e631b0e9-kube-api-access-pj24f\") pod \"5aba51a3-e783-497f-b58e-dcd4e631b0e9\" (UID: \"5aba51a3-e783-497f-b58e-dcd4e631b0e9\") " Feb 19 21:42:02 crc kubenswrapper[4795]: I0219 21:42:02.393087 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5aba51a3-e783-497f-b58e-dcd4e631b0e9-bundle\") pod \"5aba51a3-e783-497f-b58e-dcd4e631b0e9\" (UID: \"5aba51a3-e783-497f-b58e-dcd4e631b0e9\") " Feb 19 21:42:02 crc kubenswrapper[4795]: I0219 21:42:02.394395 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5aba51a3-e783-497f-b58e-dcd4e631b0e9-bundle" (OuterVolumeSpecName: "bundle") pod "5aba51a3-e783-497f-b58e-dcd4e631b0e9" (UID: "5aba51a3-e783-497f-b58e-dcd4e631b0e9"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:42:02 crc kubenswrapper[4795]: I0219 21:42:02.403464 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5aba51a3-e783-497f-b58e-dcd4e631b0e9-kube-api-access-pj24f" (OuterVolumeSpecName: "kube-api-access-pj24f") pod "5aba51a3-e783-497f-b58e-dcd4e631b0e9" (UID: "5aba51a3-e783-497f-b58e-dcd4e631b0e9"). InnerVolumeSpecName "kube-api-access-pj24f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:42:02 crc kubenswrapper[4795]: I0219 21:42:02.406030 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5aba51a3-e783-497f-b58e-dcd4e631b0e9-util" (OuterVolumeSpecName: "util") pod "5aba51a3-e783-497f-b58e-dcd4e631b0e9" (UID: "5aba51a3-e783-497f-b58e-dcd4e631b0e9"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:42:02 crc kubenswrapper[4795]: I0219 21:42:02.494742 4795 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5aba51a3-e783-497f-b58e-dcd4e631b0e9-util\") on node \"crc\" DevicePath \"\"" Feb 19 21:42:02 crc kubenswrapper[4795]: I0219 21:42:02.494799 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj24f\" (UniqueName: \"kubernetes.io/projected/5aba51a3-e783-497f-b58e-dcd4e631b0e9-kube-api-access-pj24f\") on node \"crc\" DevicePath \"\"" Feb 19 21:42:02 crc kubenswrapper[4795]: I0219 21:42:02.494826 4795 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5aba51a3-e783-497f-b58e-dcd4e631b0e9-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:42:02 crc kubenswrapper[4795]: I0219 21:42:02.965146 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twkn6" event={"ID":"5aba51a3-e783-497f-b58e-dcd4e631b0e9","Type":"ContainerDied","Data":"1fc9a3d11f64d5cef1ac53a01fa7a06ddaf1046b3d7b78ab66ef1091c7d79560"} Feb 19 21:42:02 crc kubenswrapper[4795]: I0219 21:42:02.965214 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1fc9a3d11f64d5cef1ac53a01fa7a06ddaf1046b3d7b78ab66ef1091c7d79560" Feb 19 21:42:02 crc kubenswrapper[4795]: I0219 21:42:02.965274 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twkn6" Feb 19 21:42:03 crc kubenswrapper[4795]: I0219 21:42:03.094849 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-5qtpm" Feb 19 21:42:03 crc kubenswrapper[4795]: I0219 21:42:03.101347 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-b7csh" Feb 19 21:42:07 crc kubenswrapper[4795]: I0219 21:42:07.113102 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-cmmch"] Feb 19 21:42:07 crc kubenswrapper[4795]: E0219 21:42:07.114214 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5aba51a3-e783-497f-b58e-dcd4e631b0e9" containerName="pull" Feb 19 21:42:07 crc kubenswrapper[4795]: I0219 21:42:07.114240 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5aba51a3-e783-497f-b58e-dcd4e631b0e9" containerName="pull" Feb 19 21:42:07 crc kubenswrapper[4795]: E0219 21:42:07.114269 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5aba51a3-e783-497f-b58e-dcd4e631b0e9" containerName="util" Feb 19 21:42:07 crc kubenswrapper[4795]: I0219 21:42:07.114282 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5aba51a3-e783-497f-b58e-dcd4e631b0e9" containerName="util" Feb 19 21:42:07 crc kubenswrapper[4795]: E0219 21:42:07.114305 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5aba51a3-e783-497f-b58e-dcd4e631b0e9" containerName="extract" Feb 19 21:42:07 crc kubenswrapper[4795]: I0219 21:42:07.114320 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5aba51a3-e783-497f-b58e-dcd4e631b0e9" containerName="extract" Feb 19 21:42:07 crc kubenswrapper[4795]: I0219 21:42:07.114526 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="5aba51a3-e783-497f-b58e-dcd4e631b0e9" 
containerName="extract" Feb 19 21:42:07 crc kubenswrapper[4795]: I0219 21:42:07.116990 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-cmmch" Feb 19 21:42:07 crc kubenswrapper[4795]: I0219 21:42:07.121720 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Feb 19 21:42:07 crc kubenswrapper[4795]: I0219 21:42:07.121871 4795 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-wgj97" Feb 19 21:42:07 crc kubenswrapper[4795]: I0219 21:42:07.122951 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Feb 19 21:42:07 crc kubenswrapper[4795]: I0219 21:42:07.126215 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-cmmch"] Feb 19 21:42:07 crc kubenswrapper[4795]: I0219 21:42:07.260999 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8a132514-cc3c-49a6-9a36-812490cf7ada-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-cmmch\" (UID: \"8a132514-cc3c-49a6-9a36-812490cf7ada\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-cmmch" Feb 19 21:42:07 crc kubenswrapper[4795]: I0219 21:42:07.261076 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsm4x\" (UniqueName: \"kubernetes.io/projected/8a132514-cc3c-49a6-9a36-812490cf7ada-kube-api-access-gsm4x\") pod \"cert-manager-operator-controller-manager-66c8bdd694-cmmch\" (UID: \"8a132514-cc3c-49a6-9a36-812490cf7ada\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-cmmch" Feb 19 21:42:07 crc 
kubenswrapper[4795]: I0219 21:42:07.362504 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8a132514-cc3c-49a6-9a36-812490cf7ada-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-cmmch\" (UID: \"8a132514-cc3c-49a6-9a36-812490cf7ada\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-cmmch" Feb 19 21:42:07 crc kubenswrapper[4795]: I0219 21:42:07.362621 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsm4x\" (UniqueName: \"kubernetes.io/projected/8a132514-cc3c-49a6-9a36-812490cf7ada-kube-api-access-gsm4x\") pod \"cert-manager-operator-controller-manager-66c8bdd694-cmmch\" (UID: \"8a132514-cc3c-49a6-9a36-812490cf7ada\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-cmmch" Feb 19 21:42:07 crc kubenswrapper[4795]: I0219 21:42:07.363158 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8a132514-cc3c-49a6-9a36-812490cf7ada-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-cmmch\" (UID: \"8a132514-cc3c-49a6-9a36-812490cf7ada\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-cmmch" Feb 19 21:42:07 crc kubenswrapper[4795]: I0219 21:42:07.399348 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsm4x\" (UniqueName: \"kubernetes.io/projected/8a132514-cc3c-49a6-9a36-812490cf7ada-kube-api-access-gsm4x\") pod \"cert-manager-operator-controller-manager-66c8bdd694-cmmch\" (UID: \"8a132514-cc3c-49a6-9a36-812490cf7ada\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-cmmch" Feb 19 21:42:07 crc kubenswrapper[4795]: I0219 21:42:07.435661 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-cmmch" Feb 19 21:42:07 crc kubenswrapper[4795]: I0219 21:42:07.761410 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-cmmch"] Feb 19 21:42:07 crc kubenswrapper[4795]: W0219 21:42:07.771992 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a132514_cc3c_49a6_9a36_812490cf7ada.slice/crio-905dd52b70620838ccf60fe3a72b39be64ba3db24af3a2f032b6174af096f37b WatchSource:0}: Error finding container 905dd52b70620838ccf60fe3a72b39be64ba3db24af3a2f032b6174af096f37b: Status 404 returned error can't find the container with id 905dd52b70620838ccf60fe3a72b39be64ba3db24af3a2f032b6174af096f37b Feb 19 21:42:07 crc kubenswrapper[4795]: I0219 21:42:07.998791 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-cmmch" event={"ID":"8a132514-cc3c-49a6-9a36-812490cf7ada","Type":"ContainerStarted","Data":"905dd52b70620838ccf60fe3a72b39be64ba3db24af3a2f032b6174af096f37b"} Feb 19 21:42:13 crc kubenswrapper[4795]: I0219 21:42:13.031646 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-cmmch" event={"ID":"8a132514-cc3c-49a6-9a36-812490cf7ada","Type":"ContainerStarted","Data":"bef1c724315170992f3b10dbf8664ef1160980726f3f778c97e5120732fbade1"} Feb 19 21:42:13 crc kubenswrapper[4795]: I0219 21:42:13.061197 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-cmmch" podStartSLOduration=1.757070524 podStartE2EDuration="6.061132057s" podCreationTimestamp="2026-02-19 21:42:07 +0000 UTC" firstStartedPulling="2026-02-19 21:42:07.776801557 +0000 UTC m=+838.969319411" 
lastFinishedPulling="2026-02-19 21:42:12.08086308 +0000 UTC m=+843.273380944" observedRunningTime="2026-02-19 21:42:13.058718199 +0000 UTC m=+844.251236083" watchObservedRunningTime="2026-02-19 21:42:13.061132057 +0000 UTC m=+844.253649951" Feb 19 21:42:16 crc kubenswrapper[4795]: I0219 21:42:16.422919 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-bjb5c"] Feb 19 21:42:16 crc kubenswrapper[4795]: I0219 21:42:16.424298 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-bjb5c" Feb 19 21:42:16 crc kubenswrapper[4795]: I0219 21:42:16.426549 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Feb 19 21:42:16 crc kubenswrapper[4795]: I0219 21:42:16.426701 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Feb 19 21:42:16 crc kubenswrapper[4795]: I0219 21:42:16.426824 4795 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-m2pm6" Feb 19 21:42:16 crc kubenswrapper[4795]: I0219 21:42:16.430019 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-bjb5c"] Feb 19 21:42:16 crc kubenswrapper[4795]: I0219 21:42:16.593779 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7b645fbc-f078-4ebe-958d-cd7a8f8ba1ee-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-bjb5c\" (UID: \"7b645fbc-f078-4ebe-958d-cd7a8f8ba1ee\") " pod="cert-manager/cert-manager-webhook-6888856db4-bjb5c" Feb 19 21:42:16 crc kubenswrapper[4795]: I0219 21:42:16.593862 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6gwv\" (UniqueName: 
\"kubernetes.io/projected/7b645fbc-f078-4ebe-958d-cd7a8f8ba1ee-kube-api-access-f6gwv\") pod \"cert-manager-webhook-6888856db4-bjb5c\" (UID: \"7b645fbc-f078-4ebe-958d-cd7a8f8ba1ee\") " pod="cert-manager/cert-manager-webhook-6888856db4-bjb5c" Feb 19 21:42:16 crc kubenswrapper[4795]: I0219 21:42:16.695650 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7b645fbc-f078-4ebe-958d-cd7a8f8ba1ee-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-bjb5c\" (UID: \"7b645fbc-f078-4ebe-958d-cd7a8f8ba1ee\") " pod="cert-manager/cert-manager-webhook-6888856db4-bjb5c" Feb 19 21:42:16 crc kubenswrapper[4795]: I0219 21:42:16.695714 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6gwv\" (UniqueName: \"kubernetes.io/projected/7b645fbc-f078-4ebe-958d-cd7a8f8ba1ee-kube-api-access-f6gwv\") pod \"cert-manager-webhook-6888856db4-bjb5c\" (UID: \"7b645fbc-f078-4ebe-958d-cd7a8f8ba1ee\") " pod="cert-manager/cert-manager-webhook-6888856db4-bjb5c" Feb 19 21:42:16 crc kubenswrapper[4795]: I0219 21:42:16.712747 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7b645fbc-f078-4ebe-958d-cd7a8f8ba1ee-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-bjb5c\" (UID: \"7b645fbc-f078-4ebe-958d-cd7a8f8ba1ee\") " pod="cert-manager/cert-manager-webhook-6888856db4-bjb5c" Feb 19 21:42:16 crc kubenswrapper[4795]: I0219 21:42:16.712937 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6gwv\" (UniqueName: \"kubernetes.io/projected/7b645fbc-f078-4ebe-958d-cd7a8f8ba1ee-kube-api-access-f6gwv\") pod \"cert-manager-webhook-6888856db4-bjb5c\" (UID: \"7b645fbc-f078-4ebe-958d-cd7a8f8ba1ee\") " pod="cert-manager/cert-manager-webhook-6888856db4-bjb5c" Feb 19 21:42:16 crc kubenswrapper[4795]: I0219 21:42:16.742078 4795 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-bjb5c" Feb 19 21:42:17 crc kubenswrapper[4795]: I0219 21:42:17.151617 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-n8skq"] Feb 19 21:42:17 crc kubenswrapper[4795]: I0219 21:42:17.152952 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-n8skq" Feb 19 21:42:17 crc kubenswrapper[4795]: I0219 21:42:17.154844 4795 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-qsv95" Feb 19 21:42:17 crc kubenswrapper[4795]: I0219 21:42:17.200328 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-n8skq"] Feb 19 21:42:17 crc kubenswrapper[4795]: I0219 21:42:17.282077 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-bjb5c"] Feb 19 21:42:17 crc kubenswrapper[4795]: I0219 21:42:17.303906 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/35b44919-239d-4fe8-8c53-a3698e24f753-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-n8skq\" (UID: \"35b44919-239d-4fe8-8c53-a3698e24f753\") " pod="cert-manager/cert-manager-cainjector-5545bd876-n8skq" Feb 19 21:42:17 crc kubenswrapper[4795]: I0219 21:42:17.304014 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcwwh\" (UniqueName: \"kubernetes.io/projected/35b44919-239d-4fe8-8c53-a3698e24f753-kube-api-access-qcwwh\") pod \"cert-manager-cainjector-5545bd876-n8skq\" (UID: \"35b44919-239d-4fe8-8c53-a3698e24f753\") " pod="cert-manager/cert-manager-cainjector-5545bd876-n8skq" Feb 19 21:42:17 crc kubenswrapper[4795]: I0219 21:42:17.405820 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-qcwwh\" (UniqueName: \"kubernetes.io/projected/35b44919-239d-4fe8-8c53-a3698e24f753-kube-api-access-qcwwh\") pod \"cert-manager-cainjector-5545bd876-n8skq\" (UID: \"35b44919-239d-4fe8-8c53-a3698e24f753\") " pod="cert-manager/cert-manager-cainjector-5545bd876-n8skq" Feb 19 21:42:17 crc kubenswrapper[4795]: I0219 21:42:17.406191 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/35b44919-239d-4fe8-8c53-a3698e24f753-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-n8skq\" (UID: \"35b44919-239d-4fe8-8c53-a3698e24f753\") " pod="cert-manager/cert-manager-cainjector-5545bd876-n8skq" Feb 19 21:42:17 crc kubenswrapper[4795]: I0219 21:42:17.425202 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/35b44919-239d-4fe8-8c53-a3698e24f753-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-n8skq\" (UID: \"35b44919-239d-4fe8-8c53-a3698e24f753\") " pod="cert-manager/cert-manager-cainjector-5545bd876-n8skq" Feb 19 21:42:17 crc kubenswrapper[4795]: I0219 21:42:17.425288 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcwwh\" (UniqueName: \"kubernetes.io/projected/35b44919-239d-4fe8-8c53-a3698e24f753-kube-api-access-qcwwh\") pod \"cert-manager-cainjector-5545bd876-n8skq\" (UID: \"35b44919-239d-4fe8-8c53-a3698e24f753\") " pod="cert-manager/cert-manager-cainjector-5545bd876-n8skq" Feb 19 21:42:17 crc kubenswrapper[4795]: I0219 21:42:17.506977 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-n8skq" Feb 19 21:42:17 crc kubenswrapper[4795]: I0219 21:42:17.910569 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-n8skq"] Feb 19 21:42:17 crc kubenswrapper[4795]: W0219 21:42:17.918411 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod35b44919_239d_4fe8_8c53_a3698e24f753.slice/crio-29586266cd775b987356a94c8eff4700b242d0b9f8d081d48ec24544abdd234a WatchSource:0}: Error finding container 29586266cd775b987356a94c8eff4700b242d0b9f8d081d48ec24544abdd234a: Status 404 returned error can't find the container with id 29586266cd775b987356a94c8eff4700b242d0b9f8d081d48ec24544abdd234a Feb 19 21:42:18 crc kubenswrapper[4795]: I0219 21:42:18.067997 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-n8skq" event={"ID":"35b44919-239d-4fe8-8c53-a3698e24f753","Type":"ContainerStarted","Data":"29586266cd775b987356a94c8eff4700b242d0b9f8d081d48ec24544abdd234a"} Feb 19 21:42:18 crc kubenswrapper[4795]: I0219 21:42:18.069036 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-bjb5c" event={"ID":"7b645fbc-f078-4ebe-958d-cd7a8f8ba1ee","Type":"ContainerStarted","Data":"2d2c5567b726c1865f57e06bc4c3955b23f9bc923bee9aa2e2c7784c5df2a70a"} Feb 19 21:42:23 crc kubenswrapper[4795]: I0219 21:42:23.101936 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-n8skq" event={"ID":"35b44919-239d-4fe8-8c53-a3698e24f753","Type":"ContainerStarted","Data":"a6a2727ab8180f0e056de89056ed5844515facd30e362c513559fa8ebd109e6b"} Feb 19 21:42:23 crc kubenswrapper[4795]: I0219 21:42:23.103842 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-bjb5c" 
event={"ID":"7b645fbc-f078-4ebe-958d-cd7a8f8ba1ee","Type":"ContainerStarted","Data":"7282ad756d00d0ecffbc963eefb288f679126aadd6560469d94b13ab709abf92"} Feb 19 21:42:23 crc kubenswrapper[4795]: I0219 21:42:23.103960 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-6888856db4-bjb5c" Feb 19 21:42:23 crc kubenswrapper[4795]: I0219 21:42:23.116620 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-5545bd876-n8skq" podStartSLOduration=2.0653202840000002 podStartE2EDuration="6.116604622s" podCreationTimestamp="2026-02-19 21:42:17 +0000 UTC" firstStartedPulling="2026-02-19 21:42:17.919866508 +0000 UTC m=+849.112384372" lastFinishedPulling="2026-02-19 21:42:21.971150846 +0000 UTC m=+853.163668710" observedRunningTime="2026-02-19 21:42:23.116588862 +0000 UTC m=+854.309106726" watchObservedRunningTime="2026-02-19 21:42:23.116604622 +0000 UTC m=+854.309122486" Feb 19 21:42:23 crc kubenswrapper[4795]: I0219 21:42:23.135738 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-6888856db4-bjb5c" podStartSLOduration=2.451333844 podStartE2EDuration="7.135723168s" podCreationTimestamp="2026-02-19 21:42:16 +0000 UTC" firstStartedPulling="2026-02-19 21:42:17.30025164 +0000 UTC m=+848.492769504" lastFinishedPulling="2026-02-19 21:42:21.984640964 +0000 UTC m=+853.177158828" observedRunningTime="2026-02-19 21:42:23.134111923 +0000 UTC m=+854.326629807" watchObservedRunningTime="2026-02-19 21:42:23.135723168 +0000 UTC m=+854.328241032" Feb 19 21:42:26 crc kubenswrapper[4795]: I0219 21:42:26.101260 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-545d4d4674-jdbs7"] Feb 19 21:42:26 crc kubenswrapper[4795]: I0219 21:42:26.104824 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-jdbs7" Feb 19 21:42:26 crc kubenswrapper[4795]: I0219 21:42:26.106634 4795 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-2zdjc" Feb 19 21:42:26 crc kubenswrapper[4795]: I0219 21:42:26.106831 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-jdbs7"] Feb 19 21:42:26 crc kubenswrapper[4795]: I0219 21:42:26.236893 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c1df7da5-3926-430a-8085-202bccbc4d73-bound-sa-token\") pod \"cert-manager-545d4d4674-jdbs7\" (UID: \"c1df7da5-3926-430a-8085-202bccbc4d73\") " pod="cert-manager/cert-manager-545d4d4674-jdbs7" Feb 19 21:42:26 crc kubenswrapper[4795]: I0219 21:42:26.237009 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2pbw\" (UniqueName: \"kubernetes.io/projected/c1df7da5-3926-430a-8085-202bccbc4d73-kube-api-access-j2pbw\") pod \"cert-manager-545d4d4674-jdbs7\" (UID: \"c1df7da5-3926-430a-8085-202bccbc4d73\") " pod="cert-manager/cert-manager-545d4d4674-jdbs7" Feb 19 21:42:26 crc kubenswrapper[4795]: I0219 21:42:26.337993 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c1df7da5-3926-430a-8085-202bccbc4d73-bound-sa-token\") pod \"cert-manager-545d4d4674-jdbs7\" (UID: \"c1df7da5-3926-430a-8085-202bccbc4d73\") " pod="cert-manager/cert-manager-545d4d4674-jdbs7" Feb 19 21:42:26 crc kubenswrapper[4795]: I0219 21:42:26.338065 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2pbw\" (UniqueName: \"kubernetes.io/projected/c1df7da5-3926-430a-8085-202bccbc4d73-kube-api-access-j2pbw\") pod \"cert-manager-545d4d4674-jdbs7\" (UID: 
\"c1df7da5-3926-430a-8085-202bccbc4d73\") " pod="cert-manager/cert-manager-545d4d4674-jdbs7" Feb 19 21:42:26 crc kubenswrapper[4795]: I0219 21:42:26.357874 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c1df7da5-3926-430a-8085-202bccbc4d73-bound-sa-token\") pod \"cert-manager-545d4d4674-jdbs7\" (UID: \"c1df7da5-3926-430a-8085-202bccbc4d73\") " pod="cert-manager/cert-manager-545d4d4674-jdbs7" Feb 19 21:42:26 crc kubenswrapper[4795]: I0219 21:42:26.362261 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2pbw\" (UniqueName: \"kubernetes.io/projected/c1df7da5-3926-430a-8085-202bccbc4d73-kube-api-access-j2pbw\") pod \"cert-manager-545d4d4674-jdbs7\" (UID: \"c1df7da5-3926-430a-8085-202bccbc4d73\") " pod="cert-manager/cert-manager-545d4d4674-jdbs7" Feb 19 21:42:26 crc kubenswrapper[4795]: I0219 21:42:26.428260 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-jdbs7" Feb 19 21:42:26 crc kubenswrapper[4795]: W0219 21:42:26.872463 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1df7da5_3926_430a_8085_202bccbc4d73.slice/crio-702f343b7bad680159b3a0adda297d9671b00065b65f3e76618d7dfcdf9c518c WatchSource:0}: Error finding container 702f343b7bad680159b3a0adda297d9671b00065b65f3e76618d7dfcdf9c518c: Status 404 returned error can't find the container with id 702f343b7bad680159b3a0adda297d9671b00065b65f3e76618d7dfcdf9c518c Feb 19 21:42:26 crc kubenswrapper[4795]: I0219 21:42:26.874249 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-jdbs7"] Feb 19 21:42:27 crc kubenswrapper[4795]: I0219 21:42:27.125975 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-jdbs7" 
event={"ID":"c1df7da5-3926-430a-8085-202bccbc4d73","Type":"ContainerStarted","Data":"eb53a395a459fb62ae7aa8811ebccc7350481995d3e45eeb9820a24fb4916f59"} Feb 19 21:42:27 crc kubenswrapper[4795]: I0219 21:42:27.126334 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-jdbs7" event={"ID":"c1df7da5-3926-430a-8085-202bccbc4d73","Type":"ContainerStarted","Data":"702f343b7bad680159b3a0adda297d9671b00065b65f3e76618d7dfcdf9c518c"} Feb 19 21:42:27 crc kubenswrapper[4795]: I0219 21:42:27.144801 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-545d4d4674-jdbs7" podStartSLOduration=1.144784013 podStartE2EDuration="1.144784013s" podCreationTimestamp="2026-02-19 21:42:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:42:27.140656747 +0000 UTC m=+858.333174611" watchObservedRunningTime="2026-02-19 21:42:27.144784013 +0000 UTC m=+858.337301877" Feb 19 21:42:31 crc kubenswrapper[4795]: I0219 21:42:31.745780 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-6888856db4-bjb5c" Feb 19 21:42:35 crc kubenswrapper[4795]: I0219 21:42:35.262345 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-z5tz7"] Feb 19 21:42:35 crc kubenswrapper[4795]: I0219 21:42:35.263348 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-z5tz7" Feb 19 21:42:35 crc kubenswrapper[4795]: I0219 21:42:35.265758 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Feb 19 21:42:35 crc kubenswrapper[4795]: I0219 21:42:35.265994 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-tr24d" Feb 19 21:42:35 crc kubenswrapper[4795]: I0219 21:42:35.266203 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Feb 19 21:42:35 crc kubenswrapper[4795]: I0219 21:42:35.283747 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-z5tz7"] Feb 19 21:42:35 crc kubenswrapper[4795]: I0219 21:42:35.361120 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8pjb\" (UniqueName: \"kubernetes.io/projected/4eeae4db-e5c7-4179-9040-91ebfbc5d48a-kube-api-access-q8pjb\") pod \"openstack-operator-index-z5tz7\" (UID: \"4eeae4db-e5c7-4179-9040-91ebfbc5d48a\") " pod="openstack-operators/openstack-operator-index-z5tz7" Feb 19 21:42:35 crc kubenswrapper[4795]: I0219 21:42:35.462676 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8pjb\" (UniqueName: \"kubernetes.io/projected/4eeae4db-e5c7-4179-9040-91ebfbc5d48a-kube-api-access-q8pjb\") pod \"openstack-operator-index-z5tz7\" (UID: \"4eeae4db-e5c7-4179-9040-91ebfbc5d48a\") " pod="openstack-operators/openstack-operator-index-z5tz7" Feb 19 21:42:35 crc kubenswrapper[4795]: I0219 21:42:35.497908 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8pjb\" (UniqueName: \"kubernetes.io/projected/4eeae4db-e5c7-4179-9040-91ebfbc5d48a-kube-api-access-q8pjb\") pod \"openstack-operator-index-z5tz7\" (UID: 
\"4eeae4db-e5c7-4179-9040-91ebfbc5d48a\") " pod="openstack-operators/openstack-operator-index-z5tz7" Feb 19 21:42:35 crc kubenswrapper[4795]: I0219 21:42:35.598669 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-z5tz7" Feb 19 21:42:36 crc kubenswrapper[4795]: I0219 21:42:36.039995 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-z5tz7"] Feb 19 21:42:36 crc kubenswrapper[4795]: W0219 21:42:36.043876 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4eeae4db_e5c7_4179_9040_91ebfbc5d48a.slice/crio-4fa36661f9191c592cdf1314b296aecb5ba6000cbbc3fd46efdaf12a69e8dc11 WatchSource:0}: Error finding container 4fa36661f9191c592cdf1314b296aecb5ba6000cbbc3fd46efdaf12a69e8dc11: Status 404 returned error can't find the container with id 4fa36661f9191c592cdf1314b296aecb5ba6000cbbc3fd46efdaf12a69e8dc11 Feb 19 21:42:36 crc kubenswrapper[4795]: I0219 21:42:36.180111 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-z5tz7" event={"ID":"4eeae4db-e5c7-4179-9040-91ebfbc5d48a","Type":"ContainerStarted","Data":"4fa36661f9191c592cdf1314b296aecb5ba6000cbbc3fd46efdaf12a69e8dc11"} Feb 19 21:42:37 crc kubenswrapper[4795]: I0219 21:42:37.188557 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-z5tz7" event={"ID":"4eeae4db-e5c7-4179-9040-91ebfbc5d48a","Type":"ContainerStarted","Data":"0f8c0ef1817047e299fdb098c5364254a2f8c054905906ee44025dbb49f6a8ab"} Feb 19 21:42:37 crc kubenswrapper[4795]: I0219 21:42:37.209428 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-z5tz7" podStartSLOduration=1.4712887559999999 podStartE2EDuration="2.209396814s" podCreationTimestamp="2026-02-19 21:42:35 +0000 UTC" 
firstStartedPulling="2026-02-19 21:42:36.046234902 +0000 UTC m=+867.238752766" lastFinishedPulling="2026-02-19 21:42:36.78434296 +0000 UTC m=+867.976860824" observedRunningTime="2026-02-19 21:42:37.202778849 +0000 UTC m=+868.395296743" watchObservedRunningTime="2026-02-19 21:42:37.209396814 +0000 UTC m=+868.401914718" Feb 19 21:42:38 crc kubenswrapper[4795]: I0219 21:42:38.437613 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-z5tz7"] Feb 19 21:42:39 crc kubenswrapper[4795]: I0219 21:42:39.046143 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-tf75g"] Feb 19 21:42:39 crc kubenswrapper[4795]: I0219 21:42:39.046987 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-tf75g" Feb 19 21:42:39 crc kubenswrapper[4795]: I0219 21:42:39.058924 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-tf75g"] Feb 19 21:42:39 crc kubenswrapper[4795]: I0219 21:42:39.200190 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-z5tz7" podUID="4eeae4db-e5c7-4179-9040-91ebfbc5d48a" containerName="registry-server" containerID="cri-o://0f8c0ef1817047e299fdb098c5364254a2f8c054905906ee44025dbb49f6a8ab" gracePeriod=2 Feb 19 21:42:39 crc kubenswrapper[4795]: I0219 21:42:39.212244 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfhts\" (UniqueName: \"kubernetes.io/projected/91c93ffc-fbe2-486e-92a9-ca5737dc7875-kube-api-access-gfhts\") pod \"openstack-operator-index-tf75g\" (UID: \"91c93ffc-fbe2-486e-92a9-ca5737dc7875\") " pod="openstack-operators/openstack-operator-index-tf75g" Feb 19 21:42:39 crc kubenswrapper[4795]: I0219 21:42:39.313745 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-gfhts\" (UniqueName: \"kubernetes.io/projected/91c93ffc-fbe2-486e-92a9-ca5737dc7875-kube-api-access-gfhts\") pod \"openstack-operator-index-tf75g\" (UID: \"91c93ffc-fbe2-486e-92a9-ca5737dc7875\") " pod="openstack-operators/openstack-operator-index-tf75g" Feb 19 21:42:39 crc kubenswrapper[4795]: I0219 21:42:39.334308 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfhts\" (UniqueName: \"kubernetes.io/projected/91c93ffc-fbe2-486e-92a9-ca5737dc7875-kube-api-access-gfhts\") pod \"openstack-operator-index-tf75g\" (UID: \"91c93ffc-fbe2-486e-92a9-ca5737dc7875\") " pod="openstack-operators/openstack-operator-index-tf75g" Feb 19 21:42:39 crc kubenswrapper[4795]: I0219 21:42:39.367602 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-tf75g" Feb 19 21:42:39 crc kubenswrapper[4795]: I0219 21:42:39.565966 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-z5tz7" Feb 19 21:42:39 crc kubenswrapper[4795]: I0219 21:42:39.598601 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-tf75g"] Feb 19 21:42:39 crc kubenswrapper[4795]: W0219 21:42:39.606305 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91c93ffc_fbe2_486e_92a9_ca5737dc7875.slice/crio-9a27a550b86db7057ede89e67f90298f9c043451704d647ab661cf98355e4a0c WatchSource:0}: Error finding container 9a27a550b86db7057ede89e67f90298f9c043451704d647ab661cf98355e4a0c: Status 404 returned error can't find the container with id 9a27a550b86db7057ede89e67f90298f9c043451704d647ab661cf98355e4a0c Feb 19 21:42:39 crc kubenswrapper[4795]: I0219 21:42:39.725007 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8pjb\" (UniqueName: 
\"kubernetes.io/projected/4eeae4db-e5c7-4179-9040-91ebfbc5d48a-kube-api-access-q8pjb\") pod \"4eeae4db-e5c7-4179-9040-91ebfbc5d48a\" (UID: \"4eeae4db-e5c7-4179-9040-91ebfbc5d48a\") " Feb 19 21:42:39 crc kubenswrapper[4795]: I0219 21:42:39.729451 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4eeae4db-e5c7-4179-9040-91ebfbc5d48a-kube-api-access-q8pjb" (OuterVolumeSpecName: "kube-api-access-q8pjb") pod "4eeae4db-e5c7-4179-9040-91ebfbc5d48a" (UID: "4eeae4db-e5c7-4179-9040-91ebfbc5d48a"). InnerVolumeSpecName "kube-api-access-q8pjb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:42:39 crc kubenswrapper[4795]: I0219 21:42:39.826785 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8pjb\" (UniqueName: \"kubernetes.io/projected/4eeae4db-e5c7-4179-9040-91ebfbc5d48a-kube-api-access-q8pjb\") on node \"crc\" DevicePath \"\"" Feb 19 21:42:40 crc kubenswrapper[4795]: I0219 21:42:40.206229 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-tf75g" event={"ID":"91c93ffc-fbe2-486e-92a9-ca5737dc7875","Type":"ContainerStarted","Data":"90e6c28c48f3f75fa44a5da70c3c91644365c580e2b9079af48fa261adedb09b"} Feb 19 21:42:40 crc kubenswrapper[4795]: I0219 21:42:40.206651 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-tf75g" event={"ID":"91c93ffc-fbe2-486e-92a9-ca5737dc7875","Type":"ContainerStarted","Data":"9a27a550b86db7057ede89e67f90298f9c043451704d647ab661cf98355e4a0c"} Feb 19 21:42:40 crc kubenswrapper[4795]: I0219 21:42:40.207612 4795 generic.go:334] "Generic (PLEG): container finished" podID="4eeae4db-e5c7-4179-9040-91ebfbc5d48a" containerID="0f8c0ef1817047e299fdb098c5364254a2f8c054905906ee44025dbb49f6a8ab" exitCode=0 Feb 19 21:42:40 crc kubenswrapper[4795]: I0219 21:42:40.207647 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-operator-index-z5tz7" event={"ID":"4eeae4db-e5c7-4179-9040-91ebfbc5d48a","Type":"ContainerDied","Data":"0f8c0ef1817047e299fdb098c5364254a2f8c054905906ee44025dbb49f6a8ab"} Feb 19 21:42:40 crc kubenswrapper[4795]: I0219 21:42:40.207669 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-z5tz7" event={"ID":"4eeae4db-e5c7-4179-9040-91ebfbc5d48a","Type":"ContainerDied","Data":"4fa36661f9191c592cdf1314b296aecb5ba6000cbbc3fd46efdaf12a69e8dc11"} Feb 19 21:42:40 crc kubenswrapper[4795]: I0219 21:42:40.207683 4795 scope.go:117] "RemoveContainer" containerID="0f8c0ef1817047e299fdb098c5364254a2f8c054905906ee44025dbb49f6a8ab" Feb 19 21:42:40 crc kubenswrapper[4795]: I0219 21:42:40.207770 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-z5tz7" Feb 19 21:42:40 crc kubenswrapper[4795]: I0219 21:42:40.227097 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-tf75g" podStartSLOduration=0.835997599 podStartE2EDuration="1.227073731s" podCreationTimestamp="2026-02-19 21:42:39 +0000 UTC" firstStartedPulling="2026-02-19 21:42:39.610388625 +0000 UTC m=+870.802906489" lastFinishedPulling="2026-02-19 21:42:40.001464757 +0000 UTC m=+871.193982621" observedRunningTime="2026-02-19 21:42:40.219044706 +0000 UTC m=+871.411562570" watchObservedRunningTime="2026-02-19 21:42:40.227073731 +0000 UTC m=+871.419591595" Feb 19 21:42:40 crc kubenswrapper[4795]: I0219 21:42:40.239828 4795 scope.go:117] "RemoveContainer" containerID="0f8c0ef1817047e299fdb098c5364254a2f8c054905906ee44025dbb49f6a8ab" Feb 19 21:42:40 crc kubenswrapper[4795]: E0219 21:42:40.244138 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f8c0ef1817047e299fdb098c5364254a2f8c054905906ee44025dbb49f6a8ab\": container with 
ID starting with 0f8c0ef1817047e299fdb098c5364254a2f8c054905906ee44025dbb49f6a8ab not found: ID does not exist" containerID="0f8c0ef1817047e299fdb098c5364254a2f8c054905906ee44025dbb49f6a8ab" Feb 19 21:42:40 crc kubenswrapper[4795]: I0219 21:42:40.244222 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f8c0ef1817047e299fdb098c5364254a2f8c054905906ee44025dbb49f6a8ab"} err="failed to get container status \"0f8c0ef1817047e299fdb098c5364254a2f8c054905906ee44025dbb49f6a8ab\": rpc error: code = NotFound desc = could not find container \"0f8c0ef1817047e299fdb098c5364254a2f8c054905906ee44025dbb49f6a8ab\": container with ID starting with 0f8c0ef1817047e299fdb098c5364254a2f8c054905906ee44025dbb49f6a8ab not found: ID does not exist" Feb 19 21:42:40 crc kubenswrapper[4795]: I0219 21:42:40.251549 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-z5tz7"] Feb 19 21:42:40 crc kubenswrapper[4795]: I0219 21:42:40.257069 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-z5tz7"] Feb 19 21:42:41 crc kubenswrapper[4795]: I0219 21:42:41.518917 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4eeae4db-e5c7-4179-9040-91ebfbc5d48a" path="/var/lib/kubelet/pods/4eeae4db-e5c7-4179-9040-91ebfbc5d48a/volumes" Feb 19 21:42:49 crc kubenswrapper[4795]: I0219 21:42:49.368633 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-tf75g" Feb 19 21:42:49 crc kubenswrapper[4795]: I0219 21:42:49.369419 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-tf75g" Feb 19 21:42:49 crc kubenswrapper[4795]: I0219 21:42:49.405678 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-tf75g" Feb 19 21:42:50 crc 
kubenswrapper[4795]: I0219 21:42:50.294985 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-tf75g" Feb 19 21:42:56 crc kubenswrapper[4795]: I0219 21:42:56.303598 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967zj6x4"] Feb 19 21:42:56 crc kubenswrapper[4795]: E0219 21:42:56.304449 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4eeae4db-e5c7-4179-9040-91ebfbc5d48a" containerName="registry-server" Feb 19 21:42:56 crc kubenswrapper[4795]: I0219 21:42:56.304464 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="4eeae4db-e5c7-4179-9040-91ebfbc5d48a" containerName="registry-server" Feb 19 21:42:56 crc kubenswrapper[4795]: I0219 21:42:56.304632 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="4eeae4db-e5c7-4179-9040-91ebfbc5d48a" containerName="registry-server" Feb 19 21:42:56 crc kubenswrapper[4795]: I0219 21:42:56.305619 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967zj6x4" Feb 19 21:42:56 crc kubenswrapper[4795]: I0219 21:42:56.319843 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-ldffb" Feb 19 21:42:56 crc kubenswrapper[4795]: I0219 21:42:56.328168 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967zj6x4"] Feb 19 21:42:56 crc kubenswrapper[4795]: I0219 21:42:56.463145 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6afab948-ae77-464b-aa33-b8d45ddc01ff-util\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967zj6x4\" (UID: \"6afab948-ae77-464b-aa33-b8d45ddc01ff\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967zj6x4" Feb 19 21:42:56 crc kubenswrapper[4795]: I0219 21:42:56.463210 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6afab948-ae77-464b-aa33-b8d45ddc01ff-bundle\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967zj6x4\" (UID: \"6afab948-ae77-464b-aa33-b8d45ddc01ff\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967zj6x4" Feb 19 21:42:56 crc kubenswrapper[4795]: I0219 21:42:56.463236 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdd2d\" (UniqueName: \"kubernetes.io/projected/6afab948-ae77-464b-aa33-b8d45ddc01ff-kube-api-access-mdd2d\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967zj6x4\" (UID: \"6afab948-ae77-464b-aa33-b8d45ddc01ff\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967zj6x4" Feb 19 21:42:56 crc kubenswrapper[4795]: I0219 
21:42:56.563991 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6afab948-ae77-464b-aa33-b8d45ddc01ff-util\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967zj6x4\" (UID: \"6afab948-ae77-464b-aa33-b8d45ddc01ff\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967zj6x4" Feb 19 21:42:56 crc kubenswrapper[4795]: I0219 21:42:56.564327 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6afab948-ae77-464b-aa33-b8d45ddc01ff-bundle\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967zj6x4\" (UID: \"6afab948-ae77-464b-aa33-b8d45ddc01ff\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967zj6x4" Feb 19 21:42:56 crc kubenswrapper[4795]: I0219 21:42:56.564453 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdd2d\" (UniqueName: \"kubernetes.io/projected/6afab948-ae77-464b-aa33-b8d45ddc01ff-kube-api-access-mdd2d\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967zj6x4\" (UID: \"6afab948-ae77-464b-aa33-b8d45ddc01ff\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967zj6x4" Feb 19 21:42:56 crc kubenswrapper[4795]: I0219 21:42:56.564721 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6afab948-ae77-464b-aa33-b8d45ddc01ff-util\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967zj6x4\" (UID: \"6afab948-ae77-464b-aa33-b8d45ddc01ff\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967zj6x4" Feb 19 21:42:56 crc kubenswrapper[4795]: I0219 21:42:56.564763 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/6afab948-ae77-464b-aa33-b8d45ddc01ff-bundle\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967zj6x4\" (UID: \"6afab948-ae77-464b-aa33-b8d45ddc01ff\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967zj6x4" Feb 19 21:42:56 crc kubenswrapper[4795]: I0219 21:42:56.586496 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdd2d\" (UniqueName: \"kubernetes.io/projected/6afab948-ae77-464b-aa33-b8d45ddc01ff-kube-api-access-mdd2d\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967zj6x4\" (UID: \"6afab948-ae77-464b-aa33-b8d45ddc01ff\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967zj6x4" Feb 19 21:42:56 crc kubenswrapper[4795]: I0219 21:42:56.690072 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967zj6x4" Feb 19 21:42:56 crc kubenswrapper[4795]: I0219 21:42:56.856865 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967zj6x4"] Feb 19 21:42:56 crc kubenswrapper[4795]: W0219 21:42:56.864875 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6afab948_ae77_464b_aa33_b8d45ddc01ff.slice/crio-124421aeefdb63a302c2bb5e887c611741297446fc6e7c08469c3faa3b2571c7 WatchSource:0}: Error finding container 124421aeefdb63a302c2bb5e887c611741297446fc6e7c08469c3faa3b2571c7: Status 404 returned error can't find the container with id 124421aeefdb63a302c2bb5e887c611741297446fc6e7c08469c3faa3b2571c7 Feb 19 21:42:57 crc kubenswrapper[4795]: I0219 21:42:57.317806 4795 generic.go:334] "Generic (PLEG): container finished" podID="6afab948-ae77-464b-aa33-b8d45ddc01ff" containerID="560fa89ed64d4fb8a3f7191bc9e26dfc07616a4bde2cd7a679b6b0b56a68cd4f" exitCode=0 Feb 19 
21:42:57 crc kubenswrapper[4795]: I0219 21:42:57.317897 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967zj6x4" event={"ID":"6afab948-ae77-464b-aa33-b8d45ddc01ff","Type":"ContainerDied","Data":"560fa89ed64d4fb8a3f7191bc9e26dfc07616a4bde2cd7a679b6b0b56a68cd4f"} Feb 19 21:42:57 crc kubenswrapper[4795]: I0219 21:42:57.318055 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967zj6x4" event={"ID":"6afab948-ae77-464b-aa33-b8d45ddc01ff","Type":"ContainerStarted","Data":"124421aeefdb63a302c2bb5e887c611741297446fc6e7c08469c3faa3b2571c7"} Feb 19 21:42:58 crc kubenswrapper[4795]: I0219 21:42:58.336199 4795 generic.go:334] "Generic (PLEG): container finished" podID="6afab948-ae77-464b-aa33-b8d45ddc01ff" containerID="bd54c9335ce2f7069bfa894ae6efb0cce1568572f89ec25fa875d2cd76665ac2" exitCode=0 Feb 19 21:42:58 crc kubenswrapper[4795]: I0219 21:42:58.336298 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967zj6x4" event={"ID":"6afab948-ae77-464b-aa33-b8d45ddc01ff","Type":"ContainerDied","Data":"bd54c9335ce2f7069bfa894ae6efb0cce1568572f89ec25fa875d2cd76665ac2"} Feb 19 21:42:58 crc kubenswrapper[4795]: I0219 21:42:58.428631 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 21:42:58 crc kubenswrapper[4795]: I0219 21:42:58.428699 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 21:42:59 crc kubenswrapper[4795]: I0219 21:42:59.345683 4795 generic.go:334] "Generic (PLEG): container finished" podID="6afab948-ae77-464b-aa33-b8d45ddc01ff" containerID="5a8e78242dbb62fb0a41bc624cecf9454d7d079f6f6576ad6349df0d475f901d" exitCode=0 Feb 19 21:42:59 crc kubenswrapper[4795]: I0219 21:42:59.345728 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967zj6x4" event={"ID":"6afab948-ae77-464b-aa33-b8d45ddc01ff","Type":"ContainerDied","Data":"5a8e78242dbb62fb0a41bc624cecf9454d7d079f6f6576ad6349df0d475f901d"} Feb 19 21:43:00 crc kubenswrapper[4795]: I0219 21:43:00.720244 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967zj6x4" Feb 19 21:43:00 crc kubenswrapper[4795]: I0219 21:43:00.822069 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6afab948-ae77-464b-aa33-b8d45ddc01ff-util\") pod \"6afab948-ae77-464b-aa33-b8d45ddc01ff\" (UID: \"6afab948-ae77-464b-aa33-b8d45ddc01ff\") " Feb 19 21:43:00 crc kubenswrapper[4795]: I0219 21:43:00.822132 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6afab948-ae77-464b-aa33-b8d45ddc01ff-bundle\") pod \"6afab948-ae77-464b-aa33-b8d45ddc01ff\" (UID: \"6afab948-ae77-464b-aa33-b8d45ddc01ff\") " Feb 19 21:43:00 crc kubenswrapper[4795]: I0219 21:43:00.822214 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdd2d\" (UniqueName: \"kubernetes.io/projected/6afab948-ae77-464b-aa33-b8d45ddc01ff-kube-api-access-mdd2d\") pod \"6afab948-ae77-464b-aa33-b8d45ddc01ff\" (UID: \"6afab948-ae77-464b-aa33-b8d45ddc01ff\") " Feb 19 21:43:00 crc 
kubenswrapper[4795]: I0219 21:43:00.823478 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6afab948-ae77-464b-aa33-b8d45ddc01ff-bundle" (OuterVolumeSpecName: "bundle") pod "6afab948-ae77-464b-aa33-b8d45ddc01ff" (UID: "6afab948-ae77-464b-aa33-b8d45ddc01ff"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:43:00 crc kubenswrapper[4795]: I0219 21:43:00.826976 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6afab948-ae77-464b-aa33-b8d45ddc01ff-kube-api-access-mdd2d" (OuterVolumeSpecName: "kube-api-access-mdd2d") pod "6afab948-ae77-464b-aa33-b8d45ddc01ff" (UID: "6afab948-ae77-464b-aa33-b8d45ddc01ff"). InnerVolumeSpecName "kube-api-access-mdd2d". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:43:00 crc kubenswrapper[4795]: I0219 21:43:00.840462 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6afab948-ae77-464b-aa33-b8d45ddc01ff-util" (OuterVolumeSpecName: "util") pod "6afab948-ae77-464b-aa33-b8d45ddc01ff" (UID: "6afab948-ae77-464b-aa33-b8d45ddc01ff"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:43:00 crc kubenswrapper[4795]: I0219 21:43:00.924057 4795 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6afab948-ae77-464b-aa33-b8d45ddc01ff-util\") on node \"crc\" DevicePath \"\"" Feb 19 21:43:00 crc kubenswrapper[4795]: I0219 21:43:00.924366 4795 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6afab948-ae77-464b-aa33-b8d45ddc01ff-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:43:00 crc kubenswrapper[4795]: I0219 21:43:00.924452 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdd2d\" (UniqueName: \"kubernetes.io/projected/6afab948-ae77-464b-aa33-b8d45ddc01ff-kube-api-access-mdd2d\") on node \"crc\" DevicePath \"\"" Feb 19 21:43:01 crc kubenswrapper[4795]: I0219 21:43:01.367379 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967zj6x4" event={"ID":"6afab948-ae77-464b-aa33-b8d45ddc01ff","Type":"ContainerDied","Data":"124421aeefdb63a302c2bb5e887c611741297446fc6e7c08469c3faa3b2571c7"} Feb 19 21:43:01 crc kubenswrapper[4795]: I0219 21:43:01.367417 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="124421aeefdb63a302c2bb5e887c611741297446fc6e7c08469c3faa3b2571c7" Feb 19 21:43:01 crc kubenswrapper[4795]: I0219 21:43:01.368141 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967zj6x4" Feb 19 21:43:04 crc kubenswrapper[4795]: I0219 21:43:04.001062 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-6679bf9b57-vbjcq"] Feb 19 21:43:04 crc kubenswrapper[4795]: E0219 21:43:04.001681 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6afab948-ae77-464b-aa33-b8d45ddc01ff" containerName="extract" Feb 19 21:43:04 crc kubenswrapper[4795]: I0219 21:43:04.001697 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6afab948-ae77-464b-aa33-b8d45ddc01ff" containerName="extract" Feb 19 21:43:04 crc kubenswrapper[4795]: E0219 21:43:04.001720 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6afab948-ae77-464b-aa33-b8d45ddc01ff" containerName="pull" Feb 19 21:43:04 crc kubenswrapper[4795]: I0219 21:43:04.001727 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6afab948-ae77-464b-aa33-b8d45ddc01ff" containerName="pull" Feb 19 21:43:04 crc kubenswrapper[4795]: E0219 21:43:04.001746 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6afab948-ae77-464b-aa33-b8d45ddc01ff" containerName="util" Feb 19 21:43:04 crc kubenswrapper[4795]: I0219 21:43:04.001754 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6afab948-ae77-464b-aa33-b8d45ddc01ff" containerName="util" Feb 19 21:43:04 crc kubenswrapper[4795]: I0219 21:43:04.001884 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6afab948-ae77-464b-aa33-b8d45ddc01ff" containerName="extract" Feb 19 21:43:04 crc kubenswrapper[4795]: I0219 21:43:04.002411 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-vbjcq" Feb 19 21:43:04 crc kubenswrapper[4795]: I0219 21:43:04.004645 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-l2pgb" Feb 19 21:43:04 crc kubenswrapper[4795]: I0219 21:43:04.021630 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6679bf9b57-vbjcq"] Feb 19 21:43:04 crc kubenswrapper[4795]: I0219 21:43:04.063103 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sr9mb\" (UniqueName: \"kubernetes.io/projected/c6c44d2f-3e8f-42de-babe-85a8fc1a97ec-kube-api-access-sr9mb\") pod \"openstack-operator-controller-init-6679bf9b57-vbjcq\" (UID: \"c6c44d2f-3e8f-42de-babe-85a8fc1a97ec\") " pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-vbjcq" Feb 19 21:43:04 crc kubenswrapper[4795]: I0219 21:43:04.164708 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sr9mb\" (UniqueName: \"kubernetes.io/projected/c6c44d2f-3e8f-42de-babe-85a8fc1a97ec-kube-api-access-sr9mb\") pod \"openstack-operator-controller-init-6679bf9b57-vbjcq\" (UID: \"c6c44d2f-3e8f-42de-babe-85a8fc1a97ec\") " pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-vbjcq" Feb 19 21:43:04 crc kubenswrapper[4795]: I0219 21:43:04.181735 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sr9mb\" (UniqueName: \"kubernetes.io/projected/c6c44d2f-3e8f-42de-babe-85a8fc1a97ec-kube-api-access-sr9mb\") pod \"openstack-operator-controller-init-6679bf9b57-vbjcq\" (UID: \"c6c44d2f-3e8f-42de-babe-85a8fc1a97ec\") " pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-vbjcq" Feb 19 21:43:04 crc kubenswrapper[4795]: I0219 21:43:04.317378 4795 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-vbjcq" Feb 19 21:43:04 crc kubenswrapper[4795]: I0219 21:43:04.565093 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6679bf9b57-vbjcq"] Feb 19 21:43:05 crc kubenswrapper[4795]: I0219 21:43:05.394571 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-vbjcq" event={"ID":"c6c44d2f-3e8f-42de-babe-85a8fc1a97ec","Type":"ContainerStarted","Data":"975d1aa22e4e6bad437d6e96741fac9388db1f4391542e732131ee7f49cfc185"} Feb 19 21:43:09 crc kubenswrapper[4795]: I0219 21:43:09.430902 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-vbjcq" event={"ID":"c6c44d2f-3e8f-42de-babe-85a8fc1a97ec","Type":"ContainerStarted","Data":"91ce1c70e01e501ae90013294415d0b3a65ffe27bcbf989054ebea39c295ed52"} Feb 19 21:43:09 crc kubenswrapper[4795]: I0219 21:43:09.431531 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-vbjcq" Feb 19 21:43:09 crc kubenswrapper[4795]: I0219 21:43:09.469208 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-vbjcq" podStartSLOduration=2.638696849 podStartE2EDuration="6.469188668s" podCreationTimestamp="2026-02-19 21:43:03 +0000 UTC" firstStartedPulling="2026-02-19 21:43:04.575232891 +0000 UTC m=+895.767750755" lastFinishedPulling="2026-02-19 21:43:08.40572471 +0000 UTC m=+899.598242574" observedRunningTime="2026-02-19 21:43:09.464647161 +0000 UTC m=+900.657165055" watchObservedRunningTime="2026-02-19 21:43:09.469188668 +0000 UTC m=+900.661706542" Feb 19 21:43:13 crc kubenswrapper[4795]: I0219 21:43:13.528477 4795 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/community-operators-84vzh"] Feb 19 21:43:13 crc kubenswrapper[4795]: I0219 21:43:13.530056 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-84vzh" Feb 19 21:43:13 crc kubenswrapper[4795]: I0219 21:43:13.555336 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-84vzh"] Feb 19 21:43:13 crc kubenswrapper[4795]: I0219 21:43:13.585797 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ceeabb16-8075-4c75-8d79-b49b92451b81-catalog-content\") pod \"community-operators-84vzh\" (UID: \"ceeabb16-8075-4c75-8d79-b49b92451b81\") " pod="openshift-marketplace/community-operators-84vzh" Feb 19 21:43:13 crc kubenswrapper[4795]: I0219 21:43:13.586203 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ceeabb16-8075-4c75-8d79-b49b92451b81-utilities\") pod \"community-operators-84vzh\" (UID: \"ceeabb16-8075-4c75-8d79-b49b92451b81\") " pod="openshift-marketplace/community-operators-84vzh" Feb 19 21:43:13 crc kubenswrapper[4795]: I0219 21:43:13.586300 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6595\" (UniqueName: \"kubernetes.io/projected/ceeabb16-8075-4c75-8d79-b49b92451b81-kube-api-access-d6595\") pod \"community-operators-84vzh\" (UID: \"ceeabb16-8075-4c75-8d79-b49b92451b81\") " pod="openshift-marketplace/community-operators-84vzh" Feb 19 21:43:13 crc kubenswrapper[4795]: I0219 21:43:13.687816 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ceeabb16-8075-4c75-8d79-b49b92451b81-utilities\") pod \"community-operators-84vzh\" (UID: \"ceeabb16-8075-4c75-8d79-b49b92451b81\") 
" pod="openshift-marketplace/community-operators-84vzh" Feb 19 21:43:13 crc kubenswrapper[4795]: I0219 21:43:13.688079 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6595\" (UniqueName: \"kubernetes.io/projected/ceeabb16-8075-4c75-8d79-b49b92451b81-kube-api-access-d6595\") pod \"community-operators-84vzh\" (UID: \"ceeabb16-8075-4c75-8d79-b49b92451b81\") " pod="openshift-marketplace/community-operators-84vzh" Feb 19 21:43:13 crc kubenswrapper[4795]: I0219 21:43:13.688245 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ceeabb16-8075-4c75-8d79-b49b92451b81-catalog-content\") pod \"community-operators-84vzh\" (UID: \"ceeabb16-8075-4c75-8d79-b49b92451b81\") " pod="openshift-marketplace/community-operators-84vzh" Feb 19 21:43:13 crc kubenswrapper[4795]: I0219 21:43:13.688407 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ceeabb16-8075-4c75-8d79-b49b92451b81-utilities\") pod \"community-operators-84vzh\" (UID: \"ceeabb16-8075-4c75-8d79-b49b92451b81\") " pod="openshift-marketplace/community-operators-84vzh" Feb 19 21:43:13 crc kubenswrapper[4795]: I0219 21:43:13.688574 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ceeabb16-8075-4c75-8d79-b49b92451b81-catalog-content\") pod \"community-operators-84vzh\" (UID: \"ceeabb16-8075-4c75-8d79-b49b92451b81\") " pod="openshift-marketplace/community-operators-84vzh" Feb 19 21:43:13 crc kubenswrapper[4795]: I0219 21:43:13.708322 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6595\" (UniqueName: \"kubernetes.io/projected/ceeabb16-8075-4c75-8d79-b49b92451b81-kube-api-access-d6595\") pod \"community-operators-84vzh\" (UID: \"ceeabb16-8075-4c75-8d79-b49b92451b81\") " 
pod="openshift-marketplace/community-operators-84vzh" Feb 19 21:43:13 crc kubenswrapper[4795]: I0219 21:43:13.864639 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-84vzh" Feb 19 21:43:14 crc kubenswrapper[4795]: I0219 21:43:14.102465 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-84vzh"] Feb 19 21:43:14 crc kubenswrapper[4795]: I0219 21:43:14.319487 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-vbjcq" Feb 19 21:43:14 crc kubenswrapper[4795]: I0219 21:43:14.464302 4795 generic.go:334] "Generic (PLEG): container finished" podID="ceeabb16-8075-4c75-8d79-b49b92451b81" containerID="b84390b5874e7664aed5e9062db897ad8d52dd572518db2386862aac9d669737" exitCode=0 Feb 19 21:43:14 crc kubenswrapper[4795]: I0219 21:43:14.464356 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-84vzh" event={"ID":"ceeabb16-8075-4c75-8d79-b49b92451b81","Type":"ContainerDied","Data":"b84390b5874e7664aed5e9062db897ad8d52dd572518db2386862aac9d669737"} Feb 19 21:43:14 crc kubenswrapper[4795]: I0219 21:43:14.464408 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-84vzh" event={"ID":"ceeabb16-8075-4c75-8d79-b49b92451b81","Type":"ContainerStarted","Data":"5e0099795ccfe2d93b8fa7fc956a19af072f61fe6c2e7aef96555ee34be37e44"} Feb 19 21:43:15 crc kubenswrapper[4795]: I0219 21:43:15.472691 4795 generic.go:334] "Generic (PLEG): container finished" podID="ceeabb16-8075-4c75-8d79-b49b92451b81" containerID="55121af7a869535898dd3f8c014efea12f70c43c4fa01f772dee8b347918943f" exitCode=0 Feb 19 21:43:15 crc kubenswrapper[4795]: I0219 21:43:15.472774 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-84vzh" 
event={"ID":"ceeabb16-8075-4c75-8d79-b49b92451b81","Type":"ContainerDied","Data":"55121af7a869535898dd3f8c014efea12f70c43c4fa01f772dee8b347918943f"} Feb 19 21:43:16 crc kubenswrapper[4795]: I0219 21:43:16.480521 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-84vzh" event={"ID":"ceeabb16-8075-4c75-8d79-b49b92451b81","Type":"ContainerStarted","Data":"5e2716d65439411b2778543fac0f0156337db9e7c7c3213d7217d4f0ef1d5487"} Feb 19 21:43:16 crc kubenswrapper[4795]: I0219 21:43:16.508273 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-84vzh" podStartSLOduration=2.153926838 podStartE2EDuration="3.50824638s" podCreationTimestamp="2026-02-19 21:43:13 +0000 UTC" firstStartedPulling="2026-02-19 21:43:14.465640369 +0000 UTC m=+905.658158233" lastFinishedPulling="2026-02-19 21:43:15.819959911 +0000 UTC m=+907.012477775" observedRunningTime="2026-02-19 21:43:16.50081186 +0000 UTC m=+907.693329734" watchObservedRunningTime="2026-02-19 21:43:16.50824638 +0000 UTC m=+907.700764284" Feb 19 21:43:19 crc kubenswrapper[4795]: I0219 21:43:19.342921 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-df67c"] Feb 19 21:43:19 crc kubenswrapper[4795]: I0219 21:43:19.345055 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-df67c" Feb 19 21:43:19 crc kubenswrapper[4795]: I0219 21:43:19.373110 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-df67c"] Feb 19 21:43:19 crc kubenswrapper[4795]: I0219 21:43:19.464309 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfg7g\" (UniqueName: \"kubernetes.io/projected/26c9b930-41fe-4332-9fde-8c9d4cb304bd-kube-api-access-hfg7g\") pod \"certified-operators-df67c\" (UID: \"26c9b930-41fe-4332-9fde-8c9d4cb304bd\") " pod="openshift-marketplace/certified-operators-df67c" Feb 19 21:43:19 crc kubenswrapper[4795]: I0219 21:43:19.464361 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26c9b930-41fe-4332-9fde-8c9d4cb304bd-catalog-content\") pod \"certified-operators-df67c\" (UID: \"26c9b930-41fe-4332-9fde-8c9d4cb304bd\") " pod="openshift-marketplace/certified-operators-df67c" Feb 19 21:43:19 crc kubenswrapper[4795]: I0219 21:43:19.464382 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26c9b930-41fe-4332-9fde-8c9d4cb304bd-utilities\") pod \"certified-operators-df67c\" (UID: \"26c9b930-41fe-4332-9fde-8c9d4cb304bd\") " pod="openshift-marketplace/certified-operators-df67c" Feb 19 21:43:19 crc kubenswrapper[4795]: I0219 21:43:19.565404 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfg7g\" (UniqueName: \"kubernetes.io/projected/26c9b930-41fe-4332-9fde-8c9d4cb304bd-kube-api-access-hfg7g\") pod \"certified-operators-df67c\" (UID: \"26c9b930-41fe-4332-9fde-8c9d4cb304bd\") " pod="openshift-marketplace/certified-operators-df67c" Feb 19 21:43:19 crc kubenswrapper[4795]: I0219 21:43:19.565468 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26c9b930-41fe-4332-9fde-8c9d4cb304bd-catalog-content\") pod \"certified-operators-df67c\" (UID: \"26c9b930-41fe-4332-9fde-8c9d4cb304bd\") " pod="openshift-marketplace/certified-operators-df67c" Feb 19 21:43:19 crc kubenswrapper[4795]: I0219 21:43:19.565492 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26c9b930-41fe-4332-9fde-8c9d4cb304bd-utilities\") pod \"certified-operators-df67c\" (UID: \"26c9b930-41fe-4332-9fde-8c9d4cb304bd\") " pod="openshift-marketplace/certified-operators-df67c" Feb 19 21:43:19 crc kubenswrapper[4795]: I0219 21:43:19.567151 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26c9b930-41fe-4332-9fde-8c9d4cb304bd-catalog-content\") pod \"certified-operators-df67c\" (UID: \"26c9b930-41fe-4332-9fde-8c9d4cb304bd\") " pod="openshift-marketplace/certified-operators-df67c" Feb 19 21:43:19 crc kubenswrapper[4795]: I0219 21:43:19.567390 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26c9b930-41fe-4332-9fde-8c9d4cb304bd-utilities\") pod \"certified-operators-df67c\" (UID: \"26c9b930-41fe-4332-9fde-8c9d4cb304bd\") " pod="openshift-marketplace/certified-operators-df67c" Feb 19 21:43:19 crc kubenswrapper[4795]: I0219 21:43:19.583751 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfg7g\" (UniqueName: \"kubernetes.io/projected/26c9b930-41fe-4332-9fde-8c9d4cb304bd-kube-api-access-hfg7g\") pod \"certified-operators-df67c\" (UID: \"26c9b930-41fe-4332-9fde-8c9d4cb304bd\") " pod="openshift-marketplace/certified-operators-df67c" Feb 19 21:43:19 crc kubenswrapper[4795]: I0219 21:43:19.668004 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-df67c" Feb 19 21:43:20 crc kubenswrapper[4795]: I0219 21:43:20.082536 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-df67c"] Feb 19 21:43:20 crc kubenswrapper[4795]: E0219 21:43:20.332574 4795 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26c9b930_41fe_4332_9fde_8c9d4cb304bd.slice/crio-aad3657be5b442a652805a5a7969b9cf191ab890426ca287320693f93c38eb61.scope\": RecentStats: unable to find data in memory cache]" Feb 19 21:43:20 crc kubenswrapper[4795]: I0219 21:43:20.506688 4795 generic.go:334] "Generic (PLEG): container finished" podID="26c9b930-41fe-4332-9fde-8c9d4cb304bd" containerID="aad3657be5b442a652805a5a7969b9cf191ab890426ca287320693f93c38eb61" exitCode=0 Feb 19 21:43:20 crc kubenswrapper[4795]: I0219 21:43:20.506749 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-df67c" event={"ID":"26c9b930-41fe-4332-9fde-8c9d4cb304bd","Type":"ContainerDied","Data":"aad3657be5b442a652805a5a7969b9cf191ab890426ca287320693f93c38eb61"} Feb 19 21:43:20 crc kubenswrapper[4795]: I0219 21:43:20.506815 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-df67c" event={"ID":"26c9b930-41fe-4332-9fde-8c9d4cb304bd","Type":"ContainerStarted","Data":"8e2862dfbf3b5bd22726e0781fe62e4bc152e6dd0506beef8a94a47634a3f65a"} Feb 19 21:43:21 crc kubenswrapper[4795]: I0219 21:43:21.521064 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-df67c" event={"ID":"26c9b930-41fe-4332-9fde-8c9d4cb304bd","Type":"ContainerStarted","Data":"673e5888e18f68030d508c2fc17e9b749479219a1048baa8b20751a224c8aac8"} Feb 19 21:43:22 crc kubenswrapper[4795]: I0219 21:43:22.523817 4795 generic.go:334] "Generic (PLEG): 
container finished" podID="26c9b930-41fe-4332-9fde-8c9d4cb304bd" containerID="673e5888e18f68030d508c2fc17e9b749479219a1048baa8b20751a224c8aac8" exitCode=0 Feb 19 21:43:22 crc kubenswrapper[4795]: I0219 21:43:22.523871 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-df67c" event={"ID":"26c9b930-41fe-4332-9fde-8c9d4cb304bd","Type":"ContainerDied","Data":"673e5888e18f68030d508c2fc17e9b749479219a1048baa8b20751a224c8aac8"} Feb 19 21:43:23 crc kubenswrapper[4795]: I0219 21:43:23.530767 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-df67c" event={"ID":"26c9b930-41fe-4332-9fde-8c9d4cb304bd","Type":"ContainerStarted","Data":"3a19332ea30d3f30bc9024c2c11ccf1aebf8403a82db1f78a705c81e94520d3d"} Feb 19 21:43:23 crc kubenswrapper[4795]: I0219 21:43:23.866004 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-84vzh" Feb 19 21:43:23 crc kubenswrapper[4795]: I0219 21:43:23.866069 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-84vzh" Feb 19 21:43:23 crc kubenswrapper[4795]: I0219 21:43:23.915701 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-84vzh" Feb 19 21:43:23 crc kubenswrapper[4795]: I0219 21:43:23.932978 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-df67c" podStartSLOduration=2.4934207 podStartE2EDuration="4.932964063s" podCreationTimestamp="2026-02-19 21:43:19 +0000 UTC" firstStartedPulling="2026-02-19 21:43:20.508081929 +0000 UTC m=+911.700599793" lastFinishedPulling="2026-02-19 21:43:22.947625292 +0000 UTC m=+914.140143156" observedRunningTime="2026-02-19 21:43:23.554506089 +0000 UTC m=+914.747023963" watchObservedRunningTime="2026-02-19 21:43:23.932964063 +0000 UTC m=+915.125481927" 
Feb 19 21:43:24 crc kubenswrapper[4795]: I0219 21:43:24.616050 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-84vzh" Feb 19 21:43:26 crc kubenswrapper[4795]: I0219 21:43:26.329352 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-84vzh"] Feb 19 21:43:26 crc kubenswrapper[4795]: I0219 21:43:26.547981 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-84vzh" podUID="ceeabb16-8075-4c75-8d79-b49b92451b81" containerName="registry-server" containerID="cri-o://5e2716d65439411b2778543fac0f0156337db9e7c7c3213d7217d4f0ef1d5487" gracePeriod=2 Feb 19 21:43:27 crc kubenswrapper[4795]: I0219 21:43:27.490889 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-84vzh" Feb 19 21:43:27 crc kubenswrapper[4795]: I0219 21:43:27.554834 4795 generic.go:334] "Generic (PLEG): container finished" podID="ceeabb16-8075-4c75-8d79-b49b92451b81" containerID="5e2716d65439411b2778543fac0f0156337db9e7c7c3213d7217d4f0ef1d5487" exitCode=0 Feb 19 21:43:27 crc kubenswrapper[4795]: I0219 21:43:27.554873 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-84vzh" event={"ID":"ceeabb16-8075-4c75-8d79-b49b92451b81","Type":"ContainerDied","Data":"5e2716d65439411b2778543fac0f0156337db9e7c7c3213d7217d4f0ef1d5487"} Feb 19 21:43:27 crc kubenswrapper[4795]: I0219 21:43:27.554894 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-84vzh" Feb 19 21:43:27 crc kubenswrapper[4795]: I0219 21:43:27.554908 4795 scope.go:117] "RemoveContainer" containerID="5e2716d65439411b2778543fac0f0156337db9e7c7c3213d7217d4f0ef1d5487" Feb 19 21:43:27 crc kubenswrapper[4795]: I0219 21:43:27.554897 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-84vzh" event={"ID":"ceeabb16-8075-4c75-8d79-b49b92451b81","Type":"ContainerDied","Data":"5e0099795ccfe2d93b8fa7fc956a19af072f61fe6c2e7aef96555ee34be37e44"} Feb 19 21:43:27 crc kubenswrapper[4795]: I0219 21:43:27.566321 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ceeabb16-8075-4c75-8d79-b49b92451b81-utilities\") pod \"ceeabb16-8075-4c75-8d79-b49b92451b81\" (UID: \"ceeabb16-8075-4c75-8d79-b49b92451b81\") " Feb 19 21:43:27 crc kubenswrapper[4795]: I0219 21:43:27.566412 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ceeabb16-8075-4c75-8d79-b49b92451b81-catalog-content\") pod \"ceeabb16-8075-4c75-8d79-b49b92451b81\" (UID: \"ceeabb16-8075-4c75-8d79-b49b92451b81\") " Feb 19 21:43:27 crc kubenswrapper[4795]: I0219 21:43:27.566464 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6595\" (UniqueName: \"kubernetes.io/projected/ceeabb16-8075-4c75-8d79-b49b92451b81-kube-api-access-d6595\") pod \"ceeabb16-8075-4c75-8d79-b49b92451b81\" (UID: \"ceeabb16-8075-4c75-8d79-b49b92451b81\") " Feb 19 21:43:27 crc kubenswrapper[4795]: I0219 21:43:27.567850 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ceeabb16-8075-4c75-8d79-b49b92451b81-utilities" (OuterVolumeSpecName: "utilities") pod "ceeabb16-8075-4c75-8d79-b49b92451b81" (UID: "ceeabb16-8075-4c75-8d79-b49b92451b81"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:43:27 crc kubenswrapper[4795]: I0219 21:43:27.568972 4795 scope.go:117] "RemoveContainer" containerID="55121af7a869535898dd3f8c014efea12f70c43c4fa01f772dee8b347918943f" Feb 19 21:43:27 crc kubenswrapper[4795]: I0219 21:43:27.584531 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ceeabb16-8075-4c75-8d79-b49b92451b81-kube-api-access-d6595" (OuterVolumeSpecName: "kube-api-access-d6595") pod "ceeabb16-8075-4c75-8d79-b49b92451b81" (UID: "ceeabb16-8075-4c75-8d79-b49b92451b81"). InnerVolumeSpecName "kube-api-access-d6595". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:43:27 crc kubenswrapper[4795]: I0219 21:43:27.603024 4795 scope.go:117] "RemoveContainer" containerID="b84390b5874e7664aed5e9062db897ad8d52dd572518db2386862aac9d669737" Feb 19 21:43:27 crc kubenswrapper[4795]: I0219 21:43:27.620986 4795 scope.go:117] "RemoveContainer" containerID="5e2716d65439411b2778543fac0f0156337db9e7c7c3213d7217d4f0ef1d5487" Feb 19 21:43:27 crc kubenswrapper[4795]: E0219 21:43:27.621439 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e2716d65439411b2778543fac0f0156337db9e7c7c3213d7217d4f0ef1d5487\": container with ID starting with 5e2716d65439411b2778543fac0f0156337db9e7c7c3213d7217d4f0ef1d5487 not found: ID does not exist" containerID="5e2716d65439411b2778543fac0f0156337db9e7c7c3213d7217d4f0ef1d5487" Feb 19 21:43:27 crc kubenswrapper[4795]: I0219 21:43:27.621477 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e2716d65439411b2778543fac0f0156337db9e7c7c3213d7217d4f0ef1d5487"} err="failed to get container status \"5e2716d65439411b2778543fac0f0156337db9e7c7c3213d7217d4f0ef1d5487\": rpc error: code = NotFound desc = could not find container 
\"5e2716d65439411b2778543fac0f0156337db9e7c7c3213d7217d4f0ef1d5487\": container with ID starting with 5e2716d65439411b2778543fac0f0156337db9e7c7c3213d7217d4f0ef1d5487 not found: ID does not exist" Feb 19 21:43:27 crc kubenswrapper[4795]: I0219 21:43:27.621499 4795 scope.go:117] "RemoveContainer" containerID="55121af7a869535898dd3f8c014efea12f70c43c4fa01f772dee8b347918943f" Feb 19 21:43:27 crc kubenswrapper[4795]: E0219 21:43:27.621897 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55121af7a869535898dd3f8c014efea12f70c43c4fa01f772dee8b347918943f\": container with ID starting with 55121af7a869535898dd3f8c014efea12f70c43c4fa01f772dee8b347918943f not found: ID does not exist" containerID="55121af7a869535898dd3f8c014efea12f70c43c4fa01f772dee8b347918943f" Feb 19 21:43:27 crc kubenswrapper[4795]: I0219 21:43:27.621938 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55121af7a869535898dd3f8c014efea12f70c43c4fa01f772dee8b347918943f"} err="failed to get container status \"55121af7a869535898dd3f8c014efea12f70c43c4fa01f772dee8b347918943f\": rpc error: code = NotFound desc = could not find container \"55121af7a869535898dd3f8c014efea12f70c43c4fa01f772dee8b347918943f\": container with ID starting with 55121af7a869535898dd3f8c014efea12f70c43c4fa01f772dee8b347918943f not found: ID does not exist" Feb 19 21:43:27 crc kubenswrapper[4795]: I0219 21:43:27.621966 4795 scope.go:117] "RemoveContainer" containerID="b84390b5874e7664aed5e9062db897ad8d52dd572518db2386862aac9d669737" Feb 19 21:43:27 crc kubenswrapper[4795]: E0219 21:43:27.622323 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b84390b5874e7664aed5e9062db897ad8d52dd572518db2386862aac9d669737\": container with ID starting with b84390b5874e7664aed5e9062db897ad8d52dd572518db2386862aac9d669737 not found: ID does not exist" 
containerID="b84390b5874e7664aed5e9062db897ad8d52dd572518db2386862aac9d669737" Feb 19 21:43:27 crc kubenswrapper[4795]: I0219 21:43:27.622369 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b84390b5874e7664aed5e9062db897ad8d52dd572518db2386862aac9d669737"} err="failed to get container status \"b84390b5874e7664aed5e9062db897ad8d52dd572518db2386862aac9d669737\": rpc error: code = NotFound desc = could not find container \"b84390b5874e7664aed5e9062db897ad8d52dd572518db2386862aac9d669737\": container with ID starting with b84390b5874e7664aed5e9062db897ad8d52dd572518db2386862aac9d669737 not found: ID does not exist" Feb 19 21:43:27 crc kubenswrapper[4795]: I0219 21:43:27.623732 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ceeabb16-8075-4c75-8d79-b49b92451b81-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ceeabb16-8075-4c75-8d79-b49b92451b81" (UID: "ceeabb16-8075-4c75-8d79-b49b92451b81"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:43:27 crc kubenswrapper[4795]: I0219 21:43:27.668230 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6595\" (UniqueName: \"kubernetes.io/projected/ceeabb16-8075-4c75-8d79-b49b92451b81-kube-api-access-d6595\") on node \"crc\" DevicePath \"\"" Feb 19 21:43:27 crc kubenswrapper[4795]: I0219 21:43:27.668262 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ceeabb16-8075-4c75-8d79-b49b92451b81-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 21:43:27 crc kubenswrapper[4795]: I0219 21:43:27.668271 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ceeabb16-8075-4c75-8d79-b49b92451b81-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 21:43:27 crc kubenswrapper[4795]: I0219 21:43:27.877949 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-84vzh"] Feb 19 21:43:27 crc kubenswrapper[4795]: I0219 21:43:27.882228 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-84vzh"] Feb 19 21:43:28 crc kubenswrapper[4795]: I0219 21:43:28.427512 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 21:43:28 crc kubenswrapper[4795]: I0219 21:43:28.427570 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 21:43:29 crc kubenswrapper[4795]: 
I0219 21:43:29.519035 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ceeabb16-8075-4c75-8d79-b49b92451b81" path="/var/lib/kubelet/pods/ceeabb16-8075-4c75-8d79-b49b92451b81/volumes" Feb 19 21:43:29 crc kubenswrapper[4795]: I0219 21:43:29.668298 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-df67c" Feb 19 21:43:29 crc kubenswrapper[4795]: I0219 21:43:29.668416 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-df67c" Feb 19 21:43:29 crc kubenswrapper[4795]: I0219 21:43:29.711689 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-df67c" Feb 19 21:43:30 crc kubenswrapper[4795]: I0219 21:43:30.637073 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-df67c" Feb 19 21:43:31 crc kubenswrapper[4795]: I0219 21:43:31.732487 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-df67c"] Feb 19 21:43:33 crc kubenswrapper[4795]: I0219 21:43:33.586875 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-df67c" podUID="26c9b930-41fe-4332-9fde-8c9d4cb304bd" containerName="registry-server" containerID="cri-o://3a19332ea30d3f30bc9024c2c11ccf1aebf8403a82db1f78a705c81e94520d3d" gracePeriod=2 Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.000067 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-df67c" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.051278 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26c9b930-41fe-4332-9fde-8c9d4cb304bd-catalog-content\") pod \"26c9b930-41fe-4332-9fde-8c9d4cb304bd\" (UID: \"26c9b930-41fe-4332-9fde-8c9d4cb304bd\") " Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.051471 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26c9b930-41fe-4332-9fde-8c9d4cb304bd-utilities\") pod \"26c9b930-41fe-4332-9fde-8c9d4cb304bd\" (UID: \"26c9b930-41fe-4332-9fde-8c9d4cb304bd\") " Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.051520 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hfg7g\" (UniqueName: \"kubernetes.io/projected/26c9b930-41fe-4332-9fde-8c9d4cb304bd-kube-api-access-hfg7g\") pod \"26c9b930-41fe-4332-9fde-8c9d4cb304bd\" (UID: \"26c9b930-41fe-4332-9fde-8c9d4cb304bd\") " Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.052758 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26c9b930-41fe-4332-9fde-8c9d4cb304bd-utilities" (OuterVolumeSpecName: "utilities") pod "26c9b930-41fe-4332-9fde-8c9d4cb304bd" (UID: "26c9b930-41fe-4332-9fde-8c9d4cb304bd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.056794 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26c9b930-41fe-4332-9fde-8c9d4cb304bd-kube-api-access-hfg7g" (OuterVolumeSpecName: "kube-api-access-hfg7g") pod "26c9b930-41fe-4332-9fde-8c9d4cb304bd" (UID: "26c9b930-41fe-4332-9fde-8c9d4cb304bd"). InnerVolumeSpecName "kube-api-access-hfg7g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.124757 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-vwgdm"] Feb 19 21:43:34 crc kubenswrapper[4795]: E0219 21:43:34.125120 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26c9b930-41fe-4332-9fde-8c9d4cb304bd" containerName="extract-content" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.125143 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="26c9b930-41fe-4332-9fde-8c9d4cb304bd" containerName="extract-content" Feb 19 21:43:34 crc kubenswrapper[4795]: E0219 21:43:34.125158 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ceeabb16-8075-4c75-8d79-b49b92451b81" containerName="extract-utilities" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.125183 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceeabb16-8075-4c75-8d79-b49b92451b81" containerName="extract-utilities" Feb 19 21:43:34 crc kubenswrapper[4795]: E0219 21:43:34.125206 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ceeabb16-8075-4c75-8d79-b49b92451b81" containerName="extract-content" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.125214 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceeabb16-8075-4c75-8d79-b49b92451b81" containerName="extract-content" Feb 19 21:43:34 crc kubenswrapper[4795]: E0219 21:43:34.125225 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26c9b930-41fe-4332-9fde-8c9d4cb304bd" containerName="registry-server" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.125232 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="26c9b930-41fe-4332-9fde-8c9d4cb304bd" containerName="registry-server" Feb 19 21:43:34 crc kubenswrapper[4795]: E0219 21:43:34.125245 4795 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ceeabb16-8075-4c75-8d79-b49b92451b81" containerName="registry-server" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.125254 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceeabb16-8075-4c75-8d79-b49b92451b81" containerName="registry-server" Feb 19 21:43:34 crc kubenswrapper[4795]: E0219 21:43:34.125269 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26c9b930-41fe-4332-9fde-8c9d4cb304bd" containerName="extract-utilities" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.125278 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="26c9b930-41fe-4332-9fde-8c9d4cb304bd" containerName="extract-utilities" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.125415 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="26c9b930-41fe-4332-9fde-8c9d4cb304bd" containerName="registry-server" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.125441 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="ceeabb16-8075-4c75-8d79-b49b92451b81" containerName="registry-server" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.126049 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-vwgdm" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.128705 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-45p4m" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.131511 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-z7hnk"] Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.134462 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-z7hnk" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.136545 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-2t8tm" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.137575 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-vwgdm"] Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.147496 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-d8wqs"] Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.148488 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-d8wqs" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.150231 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26c9b930-41fe-4332-9fde-8c9d4cb304bd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "26c9b930-41fe-4332-9fde-8c9d4cb304bd" (UID: "26c9b930-41fe-4332-9fde-8c9d4cb304bd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.151683 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-jns27" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.152407 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-z7hnk"] Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.153308 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djxhq\" (UniqueName: \"kubernetes.io/projected/4cc5be3d-87d8-46a4-ba7d-d95143c11857-kube-api-access-djxhq\") pod \"barbican-operator-controller-manager-868647ff47-z7hnk\" (UID: \"4cc5be3d-87d8-46a4-ba7d-d95143c11857\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-z7hnk" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.153387 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dk5sp\" (UniqueName: \"kubernetes.io/projected/54a55994-69ff-48f1-8d75-24b2a828cdc9-kube-api-access-dk5sp\") pod \"cinder-operator-controller-manager-5d946d989d-vwgdm\" (UID: \"54a55994-69ff-48f1-8d75-24b2a828cdc9\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-vwgdm" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.153482 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26c9b930-41fe-4332-9fde-8c9d4cb304bd-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.153498 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hfg7g\" (UniqueName: \"kubernetes.io/projected/26c9b930-41fe-4332-9fde-8c9d4cb304bd-kube-api-access-hfg7g\") on node \"crc\" DevicePath \"\"" Feb 19 21:43:34 crc 
kubenswrapper[4795]: I0219 21:43:34.153509 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26c9b930-41fe-4332-9fde-8c9d4cb304bd-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.161588 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-d8wqs"] Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.166719 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-5cnjr"] Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.167396 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987464f4-5cnjr" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.169401 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-zz2j7" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.199305 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-shb4d"] Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.200708 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-shb4d" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.202443 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-xjfrp" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.209284 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-5cnjr"] Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.222077 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-shb4d"] Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.264440 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dk5sp\" (UniqueName: \"kubernetes.io/projected/54a55994-69ff-48f1-8d75-24b2a828cdc9-kube-api-access-dk5sp\") pod \"cinder-operator-controller-manager-5d946d989d-vwgdm\" (UID: \"54a55994-69ff-48f1-8d75-24b2a828cdc9\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-vwgdm" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.264529 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqf27\" (UniqueName: \"kubernetes.io/projected/268c2664-09cc-4616-9280-0dd6ae4159dc-kube-api-access-nqf27\") pod \"heat-operator-controller-manager-69f49c598c-shb4d\" (UID: \"268c2664-09cc-4616-9280-0dd6ae4159dc\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-shb4d" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.265505 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpt88\" (UniqueName: \"kubernetes.io/projected/d19ed31e-e599-40ec-935d-d1d404e4c7a5-kube-api-access-qpt88\") pod \"glance-operator-controller-manager-77987464f4-5cnjr\" (UID: 
\"d19ed31e-e599-40ec-935d-d1d404e4c7a5\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-5cnjr" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.265572 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvbwc\" (UniqueName: \"kubernetes.io/projected/5c867f91-2ab2-43ce-8291-6d01825610d1-kube-api-access-kvbwc\") pod \"designate-operator-controller-manager-6d8bf5c495-d8wqs\" (UID: \"5c867f91-2ab2-43ce-8291-6d01825610d1\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-d8wqs" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.265648 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djxhq\" (UniqueName: \"kubernetes.io/projected/4cc5be3d-87d8-46a4-ba7d-d95143c11857-kube-api-access-djxhq\") pod \"barbican-operator-controller-manager-868647ff47-z7hnk\" (UID: \"4cc5be3d-87d8-46a4-ba7d-d95143c11857\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-z7hnk" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.285951 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-fdd85"] Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.287149 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-fdd85" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.289154 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-4lm4w" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.291941 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djxhq\" (UniqueName: \"kubernetes.io/projected/4cc5be3d-87d8-46a4-ba7d-d95143c11857-kube-api-access-djxhq\") pod \"barbican-operator-controller-manager-868647ff47-z7hnk\" (UID: \"4cc5be3d-87d8-46a4-ba7d-d95143c11857\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-z7hnk" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.297026 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dk5sp\" (UniqueName: \"kubernetes.io/projected/54a55994-69ff-48f1-8d75-24b2a828cdc9-kube-api-access-dk5sp\") pod \"cinder-operator-controller-manager-5d946d989d-vwgdm\" (UID: \"54a55994-69ff-48f1-8d75-24b2a828cdc9\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-vwgdm" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.299556 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-qjgvw"] Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.300433 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79d975b745-qjgvw" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.305548 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.305896 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-qzkcm" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.313224 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-fdd85"] Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.343897 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-7n98g"] Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.345250 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-7n98g" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.347091 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-mwbs5" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.366836 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqf27\" (UniqueName: \"kubernetes.io/projected/268c2664-09cc-4616-9280-0dd6ae4159dc-kube-api-access-nqf27\") pod \"heat-operator-controller-manager-69f49c598c-shb4d\" (UID: \"268c2664-09cc-4616-9280-0dd6ae4159dc\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-shb4d" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.366916 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpt88\" (UniqueName: 
\"kubernetes.io/projected/d19ed31e-e599-40ec-935d-d1d404e4c7a5-kube-api-access-qpt88\") pod \"glance-operator-controller-manager-77987464f4-5cnjr\" (UID: \"d19ed31e-e599-40ec-935d-d1d404e4c7a5\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-5cnjr" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.366956 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvbwc\" (UniqueName: \"kubernetes.io/projected/5c867f91-2ab2-43ce-8291-6d01825610d1-kube-api-access-kvbwc\") pod \"designate-operator-controller-manager-6d8bf5c495-d8wqs\" (UID: \"5c867f91-2ab2-43ce-8291-6d01825610d1\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-d8wqs" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.366998 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2qwr\" (UniqueName: \"kubernetes.io/projected/e37494c1-8780-4612-8569-fada28f0e772-kube-api-access-m2qwr\") pod \"horizon-operator-controller-manager-5b9b8895d5-fdd85\" (UID: \"e37494c1-8780-4612-8569-fada28f0e772\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-fdd85" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.367027 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e80963b-888b-4bb9-9259-864e38dd10ed-cert\") pod \"infra-operator-controller-manager-79d975b745-qjgvw\" (UID: \"2e80963b-888b-4bb9-9259-864e38dd10ed\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-qjgvw" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.367058 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvtht\" (UniqueName: \"kubernetes.io/projected/1d6085d5-f9db-4129-8662-b3ae045decfc-kube-api-access-vvtht\") pod 
\"manila-operator-controller-manager-54f6768c69-7n98g\" (UID: \"1d6085d5-f9db-4129-8662-b3ae045decfc\") " pod="openstack-operators/manila-operator-controller-manager-54f6768c69-7n98g" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.367089 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgwmv\" (UniqueName: \"kubernetes.io/projected/2e80963b-888b-4bb9-9259-864e38dd10ed-kube-api-access-qgwmv\") pod \"infra-operator-controller-manager-79d975b745-qjgvw\" (UID: \"2e80963b-888b-4bb9-9259-864e38dd10ed\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-qjgvw" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.367736 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-t87bb"] Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.371192 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-t87bb" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.375802 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-tv49m" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.396585 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-qjgvw"] Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.399776 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpt88\" (UniqueName: \"kubernetes.io/projected/d19ed31e-e599-40ec-935d-d1d404e4c7a5-kube-api-access-qpt88\") pod \"glance-operator-controller-manager-77987464f4-5cnjr\" (UID: \"d19ed31e-e599-40ec-935d-d1d404e4c7a5\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-5cnjr" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.399925 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvbwc\" (UniqueName: \"kubernetes.io/projected/5c867f91-2ab2-43ce-8291-6d01825610d1-kube-api-access-kvbwc\") pod \"designate-operator-controller-manager-6d8bf5c495-d8wqs\" (UID: \"5c867f91-2ab2-43ce-8291-6d01825610d1\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-d8wqs" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.406554 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqf27\" (UniqueName: \"kubernetes.io/projected/268c2664-09cc-4616-9280-0dd6ae4159dc-kube-api-access-nqf27\") pod \"heat-operator-controller-manager-69f49c598c-shb4d\" (UID: \"268c2664-09cc-4616-9280-0dd6ae4159dc\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-shb4d" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.414778 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-t6hpt"] Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.415686 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-t6hpt" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.420859 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-gdbsq" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.431002 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-t87bb"] Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.443253 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-7n98g"] Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.449751 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-vwgdm" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.459413 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-z7hnk" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.467834 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-t6hpt"] Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.468137 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-d8wqs" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.468987 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2qwr\" (UniqueName: \"kubernetes.io/projected/e37494c1-8780-4612-8569-fada28f0e772-kube-api-access-m2qwr\") pod \"horizon-operator-controller-manager-5b9b8895d5-fdd85\" (UID: \"e37494c1-8780-4612-8569-fada28f0e772\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-fdd85" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.469027 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e80963b-888b-4bb9-9259-864e38dd10ed-cert\") pod \"infra-operator-controller-manager-79d975b745-qjgvw\" (UID: \"2e80963b-888b-4bb9-9259-864e38dd10ed\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-qjgvw" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.469061 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvtht\" (UniqueName: \"kubernetes.io/projected/1d6085d5-f9db-4129-8662-b3ae045decfc-kube-api-access-vvtht\") pod \"manila-operator-controller-manager-54f6768c69-7n98g\" (UID: \"1d6085d5-f9db-4129-8662-b3ae045decfc\") 
" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-7n98g" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.469086 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgwmv\" (UniqueName: \"kubernetes.io/projected/2e80963b-888b-4bb9-9259-864e38dd10ed-kube-api-access-qgwmv\") pod \"infra-operator-controller-manager-79d975b745-qjgvw\" (UID: \"2e80963b-888b-4bb9-9259-864e38dd10ed\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-qjgvw" Feb 19 21:43:34 crc kubenswrapper[4795]: E0219 21:43:34.469382 4795 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 19 21:43:34 crc kubenswrapper[4795]: E0219 21:43:34.469469 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e80963b-888b-4bb9-9259-864e38dd10ed-cert podName:2e80963b-888b-4bb9-9259-864e38dd10ed nodeName:}" failed. No retries permitted until 2026-02-19 21:43:34.969446932 +0000 UTC m=+926.161964796 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2e80963b-888b-4bb9-9259-864e38dd10ed-cert") pod "infra-operator-controller-manager-79d975b745-qjgvw" (UID: "2e80963b-888b-4bb9-9259-864e38dd10ed") : secret "infra-operator-webhook-server-cert" not found Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.482519 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-mr8mh"] Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.482847 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987464f4-5cnjr" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.483350 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-mr8mh" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.487159 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-4cf2p"] Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.488376 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-4cf2p" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.488388 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-s48hw" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.488098 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2qwr\" (UniqueName: \"kubernetes.io/projected/e37494c1-8780-4612-8569-fada28f0e772-kube-api-access-m2qwr\") pod \"horizon-operator-controller-manager-5b9b8895d5-fdd85\" (UID: \"e37494c1-8780-4612-8569-fada28f0e772\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-fdd85" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.490070 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-qzpvp" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.491544 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-mr8mh"] Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.493726 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvtht\" (UniqueName: \"kubernetes.io/projected/1d6085d5-f9db-4129-8662-b3ae045decfc-kube-api-access-vvtht\") pod \"manila-operator-controller-manager-54f6768c69-7n98g\" (UID: \"1d6085d5-f9db-4129-8662-b3ae045decfc\") " 
pod="openstack-operators/manila-operator-controller-manager-54f6768c69-7n98g" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.494043 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgwmv\" (UniqueName: \"kubernetes.io/projected/2e80963b-888b-4bb9-9259-864e38dd10ed-kube-api-access-qgwmv\") pod \"infra-operator-controller-manager-79d975b745-qjgvw\" (UID: \"2e80963b-888b-4bb9-9259-864e38dd10ed\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-qjgvw" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.495745 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-5b89b"] Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.498589 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-5b89b" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.500714 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-gvclq" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.503599 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-4cf2p"] Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.513338 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-ckxlw"] Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.514110 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ckxlw" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.514570 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-shb4d" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.517826 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-jbnsb" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.534292 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-ckxlw"] Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.538746 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-5b89b"] Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.553899 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p"] Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.557057 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.558970 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.559157 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-5mql4" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.570549 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztntl\" (UniqueName: \"kubernetes.io/projected/02592cbe-e1d4-4b62-8795-a204d5335594-kube-api-access-ztntl\") pod \"ironic-operator-controller-manager-554564d7fc-t87bb\" (UID: \"02592cbe-e1d4-4b62-8795-a204d5335594\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-t87bb" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.570985 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zthv5\" (UniqueName: \"kubernetes.io/projected/c7e19956-a3fb-4ed2-bc2a-72084ed62ac2-kube-api-access-zthv5\") pod \"keystone-operator-controller-manager-b4d948c87-t6hpt\" (UID: \"c7e19956-a3fb-4ed2-bc2a-72084ed62ac2\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-t6hpt" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.576734 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-slqxz"] Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.577471 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-slqxz" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.580778 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-lg9gz" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.587721 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-bwsj2"] Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.588602 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-bwsj2" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.592207 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-thwpb" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.609906 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-slqxz"] Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.613562 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-dqpjx"] Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.614311 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-dqpjx" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.628030 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p"] Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.628373 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-t4w6j" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.628850 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-fdd85" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.649811 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-dqpjx"] Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.651563 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-bwsj2"] Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.679952 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/26db9cb2-1ed4-44e4-afac-404ce0f7d445-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p\" (UID: \"26db9cb2-1ed4-44e4-afac-404ce0f7d445\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.680046 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-727pb\" (UniqueName: \"kubernetes.io/projected/0bdb1789-27ad-4535-86d3-fd2fb7cebba2-kube-api-access-727pb\") pod \"mariadb-operator-controller-manager-6994f66f48-mr8mh\" (UID: \"0bdb1789-27ad-4535-86d3-fd2fb7cebba2\") " 
pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-mr8mh" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.680089 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zthv5\" (UniqueName: \"kubernetes.io/projected/c7e19956-a3fb-4ed2-bc2a-72084ed62ac2-kube-api-access-zthv5\") pod \"keystone-operator-controller-manager-b4d948c87-t6hpt\" (UID: \"c7e19956-a3fb-4ed2-bc2a-72084ed62ac2\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-t6hpt" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.680139 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5gh7\" (UniqueName: \"kubernetes.io/projected/c2c4435e-a135-4c1f-bad4-121458c09bc3-kube-api-access-b5gh7\") pod \"neutron-operator-controller-manager-64ddbf8bb-4cf2p\" (UID: \"c2c4435e-a135-4c1f-bad4-121458c09bc3\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-4cf2p" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.681841 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-7n98g" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.682387 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztntl\" (UniqueName: \"kubernetes.io/projected/02592cbe-e1d4-4b62-8795-a204d5335594-kube-api-access-ztntl\") pod \"ironic-operator-controller-manager-554564d7fc-t87bb\" (UID: \"02592cbe-e1d4-4b62-8795-a204d5335594\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-t87bb" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.682456 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f88l7\" (UniqueName: \"kubernetes.io/projected/26db9cb2-1ed4-44e4-afac-404ce0f7d445-kube-api-access-f88l7\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p\" (UID: \"26db9cb2-1ed4-44e4-afac-404ce0f7d445\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.682495 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ls2qm\" (UniqueName: \"kubernetes.io/projected/b22b5096-41cf-40c9-94f6-8e546ca96a96-kube-api-access-ls2qm\") pod \"octavia-operator-controller-manager-69f8888797-ckxlw\" (UID: \"b22b5096-41cf-40c9-94f6-8e546ca96a96\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ckxlw" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.682539 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kh2xr\" (UniqueName: \"kubernetes.io/projected/5a6d3cc3-7e00-4013-b568-c2b835d8e2b9-kube-api-access-kh2xr\") pod \"nova-operator-controller-manager-567668f5cf-5b89b\" (UID: \"5a6d3cc3-7e00-4013-b568-c2b835d8e2b9\") " 
pod="openstack-operators/nova-operator-controller-manager-567668f5cf-5b89b" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.688380 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-slj65"] Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.693298 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-slj65" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.695552 4795 generic.go:334] "Generic (PLEG): container finished" podID="26c9b930-41fe-4332-9fde-8c9d4cb304bd" containerID="3a19332ea30d3f30bc9024c2c11ccf1aebf8403a82db1f78a705c81e94520d3d" exitCode=0 Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.695599 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-df67c" event={"ID":"26c9b930-41fe-4332-9fde-8c9d4cb304bd","Type":"ContainerDied","Data":"3a19332ea30d3f30bc9024c2c11ccf1aebf8403a82db1f78a705c81e94520d3d"} Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.695627 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-df67c" event={"ID":"26c9b930-41fe-4332-9fde-8c9d4cb304bd","Type":"ContainerDied","Data":"8e2862dfbf3b5bd22726e0781fe62e4bc152e6dd0506beef8a94a47634a3f65a"} Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.695648 4795 scope.go:117] "RemoveContainer" containerID="3a19332ea30d3f30bc9024c2c11ccf1aebf8403a82db1f78a705c81e94520d3d" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.695767 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-df67c" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.697949 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-29gdh" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.724272 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-slj65"] Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.724327 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztntl\" (UniqueName: \"kubernetes.io/projected/02592cbe-e1d4-4b62-8795-a204d5335594-kube-api-access-ztntl\") pod \"ironic-operator-controller-manager-554564d7fc-t87bb\" (UID: \"02592cbe-e1d4-4b62-8795-a204d5335594\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-t87bb" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.739407 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zthv5\" (UniqueName: \"kubernetes.io/projected/c7e19956-a3fb-4ed2-bc2a-72084ed62ac2-kube-api-access-zthv5\") pod \"keystone-operator-controller-manager-b4d948c87-t6hpt\" (UID: \"c7e19956-a3fb-4ed2-bc2a-72084ed62ac2\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-t6hpt" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.783408 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9dzq\" (UniqueName: \"kubernetes.io/projected/98979ac7-9fb1-49f8-8022-562082fc76f7-kube-api-access-s9dzq\") pod \"swift-operator-controller-manager-68f46476f-dqpjx\" (UID: \"98979ac7-9fb1-49f8-8022-562082fc76f7\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-dqpjx" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.783615 4795 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-f88l7\" (UniqueName: \"kubernetes.io/projected/26db9cb2-1ed4-44e4-afac-404ce0f7d445-kube-api-access-f88l7\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p\" (UID: \"26db9cb2-1ed4-44e4-afac-404ce0f7d445\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.783700 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ls2qm\" (UniqueName: \"kubernetes.io/projected/b22b5096-41cf-40c9-94f6-8e546ca96a96-kube-api-access-ls2qm\") pod \"octavia-operator-controller-manager-69f8888797-ckxlw\" (UID: \"b22b5096-41cf-40c9-94f6-8e546ca96a96\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ckxlw" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.784326 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kh2xr\" (UniqueName: \"kubernetes.io/projected/5a6d3cc3-7e00-4013-b568-c2b835d8e2b9-kube-api-access-kh2xr\") pod \"nova-operator-controller-manager-567668f5cf-5b89b\" (UID: \"5a6d3cc3-7e00-4013-b568-c2b835d8e2b9\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-5b89b" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.784432 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/26db9cb2-1ed4-44e4-afac-404ce0f7d445-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p\" (UID: \"26db9cb2-1ed4-44e4-afac-404ce0f7d445\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.784515 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-727pb\" (UniqueName: \"kubernetes.io/projected/0bdb1789-27ad-4535-86d3-fd2fb7cebba2-kube-api-access-727pb\") pod 
\"mariadb-operator-controller-manager-6994f66f48-mr8mh\" (UID: \"0bdb1789-27ad-4535-86d3-fd2fb7cebba2\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-mr8mh" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.784612 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mzr5\" (UniqueName: \"kubernetes.io/projected/7b637620-f307-4e2b-b92d-f1e0d50b0071-kube-api-access-2mzr5\") pod \"placement-operator-controller-manager-8497b45c89-bwsj2\" (UID: \"7b637620-f307-4e2b-b92d-f1e0d50b0071\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-bwsj2" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.784686 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwm2j\" (UniqueName: \"kubernetes.io/projected/e0cad59b-249e-446f-b3fa-6be8aac2a858-kube-api-access-jwm2j\") pod \"ovn-operator-controller-manager-d44cf6b75-slqxz\" (UID: \"e0cad59b-249e-446f-b3fa-6be8aac2a858\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-slqxz" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.790586 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5gh7\" (UniqueName: \"kubernetes.io/projected/c2c4435e-a135-4c1f-bad4-121458c09bc3-kube-api-access-b5gh7\") pod \"neutron-operator-controller-manager-64ddbf8bb-4cf2p\" (UID: \"c2c4435e-a135-4c1f-bad4-121458c09bc3\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-4cf2p" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.786897 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-rcjgz"] Feb 19 21:43:34 crc kubenswrapper[4795]: E0219 21:43:34.785105 4795 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret 
"openstack-baremetal-operator-webhook-server-cert" not found Feb 19 21:43:34 crc kubenswrapper[4795]: E0219 21:43:34.791089 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26db9cb2-1ed4-44e4-afac-404ce0f7d445-cert podName:26db9cb2-1ed4-44e4-afac-404ce0f7d445 nodeName:}" failed. No retries permitted until 2026-02-19 21:43:35.291066299 +0000 UTC m=+926.483584163 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/26db9cb2-1ed4-44e4-afac-404ce0f7d445-cert") pod "openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p" (UID: "26db9cb2-1ed4-44e4-afac-404ce0f7d445") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.791898 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-rcjgz"] Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.792025 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-7866795846-rcjgz" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.816775 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5gh7\" (UniqueName: \"kubernetes.io/projected/c2c4435e-a135-4c1f-bad4-121458c09bc3-kube-api-access-b5gh7\") pod \"neutron-operator-controller-manager-64ddbf8bb-4cf2p\" (UID: \"c2c4435e-a135-4c1f-bad4-121458c09bc3\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-4cf2p" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.817185 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ls2qm\" (UniqueName: \"kubernetes.io/projected/b22b5096-41cf-40c9-94f6-8e546ca96a96-kube-api-access-ls2qm\") pod \"octavia-operator-controller-manager-69f8888797-ckxlw\" (UID: \"b22b5096-41cf-40c9-94f6-8e546ca96a96\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ckxlw" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.817219 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f88l7\" (UniqueName: \"kubernetes.io/projected/26db9cb2-1ed4-44e4-afac-404ce0f7d445-kube-api-access-f88l7\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p\" (UID: \"26db9cb2-1ed4-44e4-afac-404ce0f7d445\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.817908 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kh2xr\" (UniqueName: \"kubernetes.io/projected/5a6d3cc3-7e00-4013-b568-c2b835d8e2b9-kube-api-access-kh2xr\") pod \"nova-operator-controller-manager-567668f5cf-5b89b\" (UID: \"5a6d3cc3-7e00-4013-b568-c2b835d8e2b9\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-5b89b" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.820786 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-727pb\" (UniqueName: \"kubernetes.io/projected/0bdb1789-27ad-4535-86d3-fd2fb7cebba2-kube-api-access-727pb\") pod \"mariadb-operator-controller-manager-6994f66f48-mr8mh\" (UID: \"0bdb1789-27ad-4535-86d3-fd2fb7cebba2\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-mr8mh" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.827185 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-pwwfk" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.841284 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5db88f68c-bbtgm"] Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.842412 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-bbtgm" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.842633 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-4cf2p" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.850255 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-5mjnc" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.864770 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5db88f68c-bbtgm"] Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.867551 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-5b89b" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.873359 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ckxlw" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.891298 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7d95\" (UniqueName: \"kubernetes.io/projected/5f4d8698-27a0-44a4-87f6-c75d4c3407bc-kube-api-access-t7d95\") pod \"telemetry-operator-controller-manager-7f45b4ff68-slj65\" (UID: \"5f4d8698-27a0-44a4-87f6-c75d4c3407bc\") " pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-slj65" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.891508 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddk6m\" (UniqueName: \"kubernetes.io/projected/6bdc9c62-d8c1-42d5-8696-324fdc7abc2f-kube-api-access-ddk6m\") pod \"test-operator-controller-manager-7866795846-rcjgz\" (UID: \"6bdc9c62-d8c1-42d5-8696-324fdc7abc2f\") " pod="openstack-operators/test-operator-controller-manager-7866795846-rcjgz" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.891630 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9dzq\" (UniqueName: \"kubernetes.io/projected/98979ac7-9fb1-49f8-8022-562082fc76f7-kube-api-access-s9dzq\") pod \"swift-operator-controller-manager-68f46476f-dqpjx\" (UID: \"98979ac7-9fb1-49f8-8022-562082fc76f7\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-dqpjx" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.891817 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wt99q\" (UniqueName: \"kubernetes.io/projected/09ce2dcf-0fb0-4180-a019-09d1abfec00e-kube-api-access-wt99q\") pod \"watcher-operator-controller-manager-5db88f68c-bbtgm\" (UID: \"09ce2dcf-0fb0-4180-a019-09d1abfec00e\") " pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-bbtgm" Feb 
19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.891978 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mzr5\" (UniqueName: \"kubernetes.io/projected/7b637620-f307-4e2b-b92d-f1e0d50b0071-kube-api-access-2mzr5\") pod \"placement-operator-controller-manager-8497b45c89-bwsj2\" (UID: \"7b637620-f307-4e2b-b92d-f1e0d50b0071\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-bwsj2" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.892137 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwm2j\" (UniqueName: \"kubernetes.io/projected/e0cad59b-249e-446f-b3fa-6be8aac2a858-kube-api-access-jwm2j\") pod \"ovn-operator-controller-manager-d44cf6b75-slqxz\" (UID: \"e0cad59b-249e-446f-b3fa-6be8aac2a858\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-slqxz" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.923215 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-69ff7bc449-62xdp"] Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.924208 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-62xdp" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.928787 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-df67c"] Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.932219 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mzr5\" (UniqueName: \"kubernetes.io/projected/7b637620-f307-4e2b-b92d-f1e0d50b0071-kube-api-access-2mzr5\") pod \"placement-operator-controller-manager-8497b45c89-bwsj2\" (UID: \"7b637620-f307-4e2b-b92d-f1e0d50b0071\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-bwsj2" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.950590 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-klpqt" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.950770 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.950863 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.951416 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwm2j\" (UniqueName: \"kubernetes.io/projected/e0cad59b-249e-446f-b3fa-6be8aac2a858-kube-api-access-jwm2j\") pod \"ovn-operator-controller-manager-d44cf6b75-slqxz\" (UID: \"e0cad59b-249e-446f-b3fa-6be8aac2a858\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-slqxz" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.960272 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-df67c"] Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.960546 4795 scope.go:117] 
"RemoveContainer" containerID="673e5888e18f68030d508c2fc17e9b749479219a1048baa8b20751a224c8aac8" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.976594 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-69ff7bc449-62xdp"] Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.981026 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9dzq\" (UniqueName: \"kubernetes.io/projected/98979ac7-9fb1-49f8-8022-562082fc76f7-kube-api-access-s9dzq\") pod \"swift-operator-controller-manager-68f46476f-dqpjx\" (UID: \"98979ac7-9fb1-49f8-8022-562082fc76f7\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-dqpjx" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.992504 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-t87bb" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.993433 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7d95\" (UniqueName: \"kubernetes.io/projected/5f4d8698-27a0-44a4-87f6-c75d4c3407bc-kube-api-access-t7d95\") pod \"telemetry-operator-controller-manager-7f45b4ff68-slj65\" (UID: \"5f4d8698-27a0-44a4-87f6-c75d4c3407bc\") " pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-slj65" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.994692 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-62xdp\" (UID: \"9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-62xdp" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.994783 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-62xdp\" (UID: \"9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-62xdp" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.994868 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e80963b-888b-4bb9-9259-864e38dd10ed-cert\") pod \"infra-operator-controller-manager-79d975b745-qjgvw\" (UID: \"2e80963b-888b-4bb9-9259-864e38dd10ed\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-qjgvw" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.994944 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqq8d\" (UniqueName: \"kubernetes.io/projected/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-kube-api-access-pqq8d\") pod \"openstack-operator-controller-manager-69ff7bc449-62xdp\" (UID: \"9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-62xdp" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.995038 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddk6m\" (UniqueName: \"kubernetes.io/projected/6bdc9c62-d8c1-42d5-8696-324fdc7abc2f-kube-api-access-ddk6m\") pod \"test-operator-controller-manager-7866795846-rcjgz\" (UID: \"6bdc9c62-d8c1-42d5-8696-324fdc7abc2f\") " pod="openstack-operators/test-operator-controller-manager-7866795846-rcjgz" Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.995136 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wt99q\" (UniqueName: 
\"kubernetes.io/projected/09ce2dcf-0fb0-4180-a019-09d1abfec00e-kube-api-access-wt99q\") pod \"watcher-operator-controller-manager-5db88f68c-bbtgm\" (UID: \"09ce2dcf-0fb0-4180-a019-09d1abfec00e\") " pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-bbtgm" Feb 19 21:43:34 crc kubenswrapper[4795]: E0219 21:43:34.995616 4795 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 19 21:43:34 crc kubenswrapper[4795]: E0219 21:43:34.996619 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e80963b-888b-4bb9-9259-864e38dd10ed-cert podName:2e80963b-888b-4bb9-9259-864e38dd10ed nodeName:}" failed. No retries permitted until 2026-02-19 21:43:35.996604267 +0000 UTC m=+927.189122131 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2e80963b-888b-4bb9-9259-864e38dd10ed-cert") pod "infra-operator-controller-manager-79d975b745-qjgvw" (UID: "2e80963b-888b-4bb9-9259-864e38dd10ed") : secret "infra-operator-webhook-server-cert" not found Feb 19 21:43:34 crc kubenswrapper[4795]: I0219 21:43:34.996891 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-bwsj2" Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.034298 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-t6hpt" Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.035379 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vv89z"] Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.036248 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vv89z" Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.043036 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-phdgp" Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.047964 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wt99q\" (UniqueName: \"kubernetes.io/projected/09ce2dcf-0fb0-4180-a019-09d1abfec00e-kube-api-access-wt99q\") pod \"watcher-operator-controller-manager-5db88f68c-bbtgm\" (UID: \"09ce2dcf-0fb0-4180-a019-09d1abfec00e\") " pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-bbtgm" Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.051807 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7d95\" (UniqueName: \"kubernetes.io/projected/5f4d8698-27a0-44a4-87f6-c75d4c3407bc-kube-api-access-t7d95\") pod \"telemetry-operator-controller-manager-7f45b4ff68-slj65\" (UID: \"5f4d8698-27a0-44a4-87f6-c75d4c3407bc\") " pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-slj65" Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.065313 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vv89z"] Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.085804 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddk6m\" (UniqueName: \"kubernetes.io/projected/6bdc9c62-d8c1-42d5-8696-324fdc7abc2f-kube-api-access-ddk6m\") pod \"test-operator-controller-manager-7866795846-rcjgz\" (UID: \"6bdc9c62-d8c1-42d5-8696-324fdc7abc2f\") " pod="openstack-operators/test-operator-controller-manager-7866795846-rcjgz" Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.096367 4795 scope.go:117] "RemoveContainer" 
containerID="aad3657be5b442a652805a5a7969b9cf191ab890426ca287320693f93c38eb61" Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.114602 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-mr8mh" Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.115093 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-62xdp\" (UID: \"9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-62xdp" Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.115134 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-62xdp\" (UID: \"9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-62xdp" Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.115183 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqq8d\" (UniqueName: \"kubernetes.io/projected/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-kube-api-access-pqq8d\") pod \"openstack-operator-controller-manager-69ff7bc449-62xdp\" (UID: \"9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-62xdp" Feb 19 21:43:35 crc kubenswrapper[4795]: E0219 21:43:35.115201 4795 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.115212 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-v4z44\" (UniqueName: \"kubernetes.io/projected/80ce3bc1-0926-47a3-acc2-6f2d8be4089c-kube-api-access-v4z44\") pod \"rabbitmq-cluster-operator-manager-668c99d594-vv89z\" (UID: \"80ce3bc1-0926-47a3-acc2-6f2d8be4089c\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vv89z" Feb 19 21:43:35 crc kubenswrapper[4795]: E0219 21:43:35.115374 4795 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 19 21:43:35 crc kubenswrapper[4795]: E0219 21:43:35.126192 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-metrics-certs podName:9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4 nodeName:}" failed. No retries permitted until 2026-02-19 21:43:35.615227419 +0000 UTC m=+926.807745283 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-metrics-certs") pod "openstack-operator-controller-manager-69ff7bc449-62xdp" (UID: "9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4") : secret "metrics-server-cert" not found Feb 19 21:43:35 crc kubenswrapper[4795]: E0219 21:43:35.126268 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-webhook-certs podName:9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4 nodeName:}" failed. No retries permitted until 2026-02-19 21:43:35.62624118 +0000 UTC m=+926.818759044 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-webhook-certs") pod "openstack-operator-controller-manager-69ff7bc449-62xdp" (UID: "9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4") : secret "webhook-server-cert" not found Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.142224 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqq8d\" (UniqueName: \"kubernetes.io/projected/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-kube-api-access-pqq8d\") pod \"openstack-operator-controller-manager-69ff7bc449-62xdp\" (UID: \"9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-62xdp" Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.183132 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-dqpjx" Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.193350 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-z7hnk"] Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.199418 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-slj65" Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.216917 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4z44\" (UniqueName: \"kubernetes.io/projected/80ce3bc1-0926-47a3-acc2-6f2d8be4089c-kube-api-access-v4z44\") pod \"rabbitmq-cluster-operator-manager-668c99d594-vv89z\" (UID: \"80ce3bc1-0926-47a3-acc2-6f2d8be4089c\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vv89z" Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.226680 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-7866795846-rcjgz" Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.243191 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4z44\" (UniqueName: \"kubernetes.io/projected/80ce3bc1-0926-47a3-acc2-6f2d8be4089c-kube-api-access-v4z44\") pod \"rabbitmq-cluster-operator-manager-668c99d594-vv89z\" (UID: \"80ce3bc1-0926-47a3-acc2-6f2d8be4089c\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vv89z" Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.243708 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-slqxz" Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.251357 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-bbtgm" Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.310246 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-d8wqs"] Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.317733 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/26db9cb2-1ed4-44e4-afac-404ce0f7d445-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p\" (UID: \"26db9cb2-1ed4-44e4-afac-404ce0f7d445\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p" Feb 19 21:43:35 crc kubenswrapper[4795]: E0219 21:43:35.318016 4795 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 21:43:35 crc kubenswrapper[4795]: E0219 21:43:35.318073 4795 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/26db9cb2-1ed4-44e4-afac-404ce0f7d445-cert podName:26db9cb2-1ed4-44e4-afac-404ce0f7d445 nodeName:}" failed. No retries permitted until 2026-02-19 21:43:36.31805389 +0000 UTC m=+927.510571754 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/26db9cb2-1ed4-44e4-afac-404ce0f7d445-cert") pod "openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p" (UID: "26db9cb2-1ed4-44e4-afac-404ce0f7d445") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.322530 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-vwgdm"] Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.370253 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vv89z" Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.371392 4795 scope.go:117] "RemoveContainer" containerID="3a19332ea30d3f30bc9024c2c11ccf1aebf8403a82db1f78a705c81e94520d3d" Feb 19 21:43:35 crc kubenswrapper[4795]: E0219 21:43:35.378945 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a19332ea30d3f30bc9024c2c11ccf1aebf8403a82db1f78a705c81e94520d3d\": container with ID starting with 3a19332ea30d3f30bc9024c2c11ccf1aebf8403a82db1f78a705c81e94520d3d not found: ID does not exist" containerID="3a19332ea30d3f30bc9024c2c11ccf1aebf8403a82db1f78a705c81e94520d3d" Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.378982 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a19332ea30d3f30bc9024c2c11ccf1aebf8403a82db1f78a705c81e94520d3d"} err="failed to get container status \"3a19332ea30d3f30bc9024c2c11ccf1aebf8403a82db1f78a705c81e94520d3d\": rpc error: code = NotFound desc = could 
not find container \"3a19332ea30d3f30bc9024c2c11ccf1aebf8403a82db1f78a705c81e94520d3d\": container with ID starting with 3a19332ea30d3f30bc9024c2c11ccf1aebf8403a82db1f78a705c81e94520d3d not found: ID does not exist" Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.379008 4795 scope.go:117] "RemoveContainer" containerID="673e5888e18f68030d508c2fc17e9b749479219a1048baa8b20751a224c8aac8" Feb 19 21:43:35 crc kubenswrapper[4795]: E0219 21:43:35.379354 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"673e5888e18f68030d508c2fc17e9b749479219a1048baa8b20751a224c8aac8\": container with ID starting with 673e5888e18f68030d508c2fc17e9b749479219a1048baa8b20751a224c8aac8 not found: ID does not exist" containerID="673e5888e18f68030d508c2fc17e9b749479219a1048baa8b20751a224c8aac8" Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.379384 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"673e5888e18f68030d508c2fc17e9b749479219a1048baa8b20751a224c8aac8"} err="failed to get container status \"673e5888e18f68030d508c2fc17e9b749479219a1048baa8b20751a224c8aac8\": rpc error: code = NotFound desc = could not find container \"673e5888e18f68030d508c2fc17e9b749479219a1048baa8b20751a224c8aac8\": container with ID starting with 673e5888e18f68030d508c2fc17e9b749479219a1048baa8b20751a224c8aac8 not found: ID does not exist" Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.379405 4795 scope.go:117] "RemoveContainer" containerID="aad3657be5b442a652805a5a7969b9cf191ab890426ca287320693f93c38eb61" Feb 19 21:43:35 crc kubenswrapper[4795]: E0219 21:43:35.383557 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aad3657be5b442a652805a5a7969b9cf191ab890426ca287320693f93c38eb61\": container with ID starting with aad3657be5b442a652805a5a7969b9cf191ab890426ca287320693f93c38eb61 not found: 
ID does not exist" containerID="aad3657be5b442a652805a5a7969b9cf191ab890426ca287320693f93c38eb61" Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.383584 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aad3657be5b442a652805a5a7969b9cf191ab890426ca287320693f93c38eb61"} err="failed to get container status \"aad3657be5b442a652805a5a7969b9cf191ab890426ca287320693f93c38eb61\": rpc error: code = NotFound desc = could not find container \"aad3657be5b442a652805a5a7969b9cf191ab890426ca287320693f93c38eb61\": container with ID starting with aad3657be5b442a652805a5a7969b9cf191ab890426ca287320693f93c38eb61 not found: ID does not exist" Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.407701 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-shb4d"] Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.477977 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-5cnjr"] Feb 19 21:43:35 crc kubenswrapper[4795]: W0219 21:43:35.494890 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd19ed31e_e599_40ec_935d_d1d404e4c7a5.slice/crio-d2c575c923d5e6900d2cf0a9f79882dcba70089d5ac5991a65d2fca9e8238658 WatchSource:0}: Error finding container d2c575c923d5e6900d2cf0a9f79882dcba70089d5ac5991a65d2fca9e8238658: Status 404 returned error can't find the container with id d2c575c923d5e6900d2cf0a9f79882dcba70089d5ac5991a65d2fca9e8238658 Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.557899 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26c9b930-41fe-4332-9fde-8c9d4cb304bd" path="/var/lib/kubelet/pods/26c9b930-41fe-4332-9fde-8c9d4cb304bd/volumes" Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.621830 4795 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-62xdp\" (UID: \"9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-62xdp" Feb 19 21:43:35 crc kubenswrapper[4795]: E0219 21:43:35.622485 4795 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 19 21:43:35 crc kubenswrapper[4795]: E0219 21:43:35.622532 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-metrics-certs podName:9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4 nodeName:}" failed. No retries permitted until 2026-02-19 21:43:36.622517033 +0000 UTC m=+927.815034897 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-metrics-certs") pod "openstack-operator-controller-manager-69ff7bc449-62xdp" (UID: "9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4") : secret "metrics-server-cert" not found Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.684403 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-5b89b"] Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.692345 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-7n98g"] Feb 19 21:43:35 crc kubenswrapper[4795]: W0219 21:43:35.694816 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d6085d5_f9db_4129_8662_b3ae045decfc.slice/crio-bd157e371be79bf57fb80026e876d9c8dfdaefde324a2140c8aa008288f61573 WatchSource:0}: Error finding container bd157e371be79bf57fb80026e876d9c8dfdaefde324a2140c8aa008288f61573: Status 404 returned 
error can't find the container with id bd157e371be79bf57fb80026e876d9c8dfdaefde324a2140c8aa008288f61573 Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.695938 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-fdd85"] Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.714692 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-4cf2p"] Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.720237 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-7n98g" event={"ID":"1d6085d5-f9db-4129-8662-b3ae045decfc","Type":"ContainerStarted","Data":"bd157e371be79bf57fb80026e876d9c8dfdaefde324a2140c8aa008288f61573"} Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.723596 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-62xdp\" (UID: \"9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-62xdp" Feb 19 21:43:35 crc kubenswrapper[4795]: E0219 21:43:35.723801 4795 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.723800 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-shb4d" event={"ID":"268c2664-09cc-4616-9280-0dd6ae4159dc","Type":"ContainerStarted","Data":"dbab9616bea5a84cb7d46cc57b72722a137e31777aad2b53dbae04e18f1628d5"} Feb 19 21:43:35 crc kubenswrapper[4795]: E0219 21:43:35.723870 4795 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-webhook-certs podName:9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4 nodeName:}" failed. No retries permitted until 2026-02-19 21:43:36.723851296 +0000 UTC m=+927.916369160 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-webhook-certs") pod "openstack-operator-controller-manager-69ff7bc449-62xdp" (UID: "9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4") : secret "webhook-server-cert" not found Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.739614 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987464f4-5cnjr" event={"ID":"d19ed31e-e599-40ec-935d-d1d404e4c7a5","Type":"ContainerStarted","Data":"d2c575c923d5e6900d2cf0a9f79882dcba70089d5ac5991a65d2fca9e8238658"} Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.753360 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-5b89b" event={"ID":"5a6d3cc3-7e00-4013-b568-c2b835d8e2b9","Type":"ContainerStarted","Data":"c442b135c36e227a697a05c9c223e8910e7845324206053b69553f920d6dae2f"} Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.757999 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-vwgdm" event={"ID":"54a55994-69ff-48f1-8d75-24b2a828cdc9","Type":"ContainerStarted","Data":"ec41ac1e15fc06b07b215c33a62156df782759530151296fe337b5ee1b4cbc9e"} Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.763224 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-fdd85" event={"ID":"e37494c1-8780-4612-8569-fada28f0e772","Type":"ContainerStarted","Data":"bdcdbd7c44d8abf1e51f5ed3ab80789b921f7f8b2debfa911d219347f82974a8"} Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 
21:43:35.763956 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-z7hnk" event={"ID":"4cc5be3d-87d8-46a4-ba7d-d95143c11857","Type":"ContainerStarted","Data":"4e05aebfdda0466144dc837fafb427636ab38e5d9694f67c8f037795a95cd510"} Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.765357 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-d8wqs" event={"ID":"5c867f91-2ab2-43ce-8291-6d01825610d1","Type":"ContainerStarted","Data":"61d46d7d487efb406fc03aa243b02a9fbd8b5ab16421bcd7249fc64a21e01325"} Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.915087 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-ckxlw"] Feb 19 21:43:35 crc kubenswrapper[4795]: I0219 21:43:35.920546 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-t87bb"] Feb 19 21:43:36 crc kubenswrapper[4795]: I0219 21:43:36.028922 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e80963b-888b-4bb9-9259-864e38dd10ed-cert\") pod \"infra-operator-controller-manager-79d975b745-qjgvw\" (UID: \"2e80963b-888b-4bb9-9259-864e38dd10ed\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-qjgvw" Feb 19 21:43:36 crc kubenswrapper[4795]: E0219 21:43:36.029091 4795 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 19 21:43:36 crc kubenswrapper[4795]: E0219 21:43:36.029146 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e80963b-888b-4bb9-9259-864e38dd10ed-cert podName:2e80963b-888b-4bb9-9259-864e38dd10ed nodeName:}" failed. 
No retries permitted until 2026-02-19 21:43:38.029125802 +0000 UTC m=+929.221643666 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2e80963b-888b-4bb9-9259-864e38dd10ed-cert") pod "infra-operator-controller-manager-79d975b745-qjgvw" (UID: "2e80963b-888b-4bb9-9259-864e38dd10ed") : secret "infra-operator-webhook-server-cert" not found Feb 19 21:43:36 crc kubenswrapper[4795]: I0219 21:43:36.032797 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-mr8mh"] Feb 19 21:43:36 crc kubenswrapper[4795]: I0219 21:43:36.040611 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-t6hpt"] Feb 19 21:43:36 crc kubenswrapper[4795]: W0219 21:43:36.042757 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0bdb1789_27ad_4535_86d3_fd2fb7cebba2.slice/crio-fb14ff320b3a3aaf3a51fb49d16ce7882de2fab247ed18a55df683e0eaa7c00c WatchSource:0}: Error finding container fb14ff320b3a3aaf3a51fb49d16ce7882de2fab247ed18a55df683e0eaa7c00c: Status 404 returned error can't find the container with id fb14ff320b3a3aaf3a51fb49d16ce7882de2fab247ed18a55df683e0eaa7c00c Feb 19 21:43:36 crc kubenswrapper[4795]: I0219 21:43:36.063157 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-bwsj2"] Feb 19 21:43:36 crc kubenswrapper[4795]: I0219 21:43:36.083882 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-slj65"] Feb 19 21:43:36 crc kubenswrapper[4795]: W0219 21:43:36.102207 4795 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f4d8698_27a0_44a4_87f6_c75d4c3407bc.slice/crio-d3ae5512b6d771f06c9ea5013c41ee2499e5ba701bf11db4e1010a2c1069e58c WatchSource:0}: Error finding container d3ae5512b6d771f06c9ea5013c41ee2499e5ba701bf11db4e1010a2c1069e58c: Status 404 returned error can't find the container with id d3ae5512b6d771f06c9ea5013c41ee2499e5ba701bf11db4e1010a2c1069e58c Feb 19 21:43:36 crc kubenswrapper[4795]: E0219 21:43:36.108912 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:66a4b9322ebb573313178ea88e31026d4532f461592b9fae2dff71efd9256d99,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-t7d95,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-7f45b4ff68-slj65_openstack-operators(5f4d8698-27a0-44a4-87f6-c75d4c3407bc): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 19 21:43:36 crc kubenswrapper[4795]: E0219 21:43:36.110112 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-slj65" podUID="5f4d8698-27a0-44a4-87f6-c75d4c3407bc" Feb 19 21:43:36 crc kubenswrapper[4795]: I0219 21:43:36.157266 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-slqxz"] Feb 19 21:43:36 crc kubenswrapper[4795]: I0219 21:43:36.164271 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-dqpjx"] Feb 19 21:43:36 crc kubenswrapper[4795]: I0219 21:43:36.173443 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5db88f68c-bbtgm"] Feb 19 21:43:36 crc kubenswrapper[4795]: I0219 21:43:36.184246 4795 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vv89z"] Feb 19 21:43:36 crc kubenswrapper[4795]: E0219 21:43:36.190378 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-s9dzq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-68f46476f-dqpjx_openstack-operators(98979ac7-9fb1-49f8-8022-562082fc76f7): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 19 21:43:36 crc kubenswrapper[4795]: E0219 21:43:36.191548 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-dqpjx" podUID="98979ac7-9fb1-49f8-8022-562082fc76f7" Feb 19 21:43:36 crc kubenswrapper[4795]: I0219 21:43:36.194138 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-rcjgz"] Feb 19 21:43:36 crc kubenswrapper[4795]: W0219 21:43:36.204064 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09ce2dcf_0fb0_4180_a019_09d1abfec00e.slice/crio-5824c77eb3b45d38fec95d4bcff01b43ae75b9cc50186a78ae84a019839beb21 WatchSource:0}: Error finding container 5824c77eb3b45d38fec95d4bcff01b43ae75b9cc50186a78ae84a019839beb21: Status 404 returned error can't find the container with id 
5824c77eb3b45d38fec95d4bcff01b43ae75b9cc50186a78ae84a019839beb21 Feb 19 21:43:36 crc kubenswrapper[4795]: E0219 21:43:36.204093 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:543c103838f3e6ef48755665a7695dfa3ed84753c557560257d265db31f92759,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jwm2j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-d44cf6b75-slqxz_openstack-operators(e0cad59b-249e-446f-b3fa-6be8aac2a858): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 19 21:43:36 crc kubenswrapper[4795]: E0219 21:43:36.205337 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-slqxz" podUID="e0cad59b-249e-446f-b3fa-6be8aac2a858" Feb 19 21:43:36 crc kubenswrapper[4795]: E0219 21:43:36.208109 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:f0fabdf79095def0f8b1c0442925548a94ca94bed4de2d3b171277129f8079e6,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ddk6m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-7866795846-rcjgz_openstack-operators(6bdc9c62-d8c1-42d5-8696-324fdc7abc2f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 19 21:43:36 crc kubenswrapper[4795]: E0219 21:43:36.209521 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-7866795846-rcjgz" podUID="6bdc9c62-d8c1-42d5-8696-324fdc7abc2f" Feb 19 21:43:36 crc kubenswrapper[4795]: E0219 21:43:36.209980 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:d01ae848290e880c09127d5297418dea40fc7f090fdab9bf2c578c7e7f53aec0,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wt99q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-5db88f68c-bbtgm_openstack-operators(09ce2dcf-0fb0-4180-a019-09d1abfec00e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 19 21:43:36 crc kubenswrapper[4795]: E0219 21:43:36.211481 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-bbtgm" podUID="09ce2dcf-0fb0-4180-a019-09d1abfec00e" Feb 19 21:43:36 crc kubenswrapper[4795]: E0219 21:43:36.216131 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-v4z44,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-vv89z_openstack-operators(80ce3bc1-0926-47a3-acc2-6f2d8be4089c): 
ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 19 21:43:36 crc kubenswrapper[4795]: E0219 21:43:36.217404 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vv89z" podUID="80ce3bc1-0926-47a3-acc2-6f2d8be4089c" Feb 19 21:43:36 crc kubenswrapper[4795]: I0219 21:43:36.333981 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/26db9cb2-1ed4-44e4-afac-404ce0f7d445-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p\" (UID: \"26db9cb2-1ed4-44e4-afac-404ce0f7d445\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p" Feb 19 21:43:36 crc kubenswrapper[4795]: E0219 21:43:36.334213 4795 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 21:43:36 crc kubenswrapper[4795]: E0219 21:43:36.334263 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26db9cb2-1ed4-44e4-afac-404ce0f7d445-cert podName:26db9cb2-1ed4-44e4-afac-404ce0f7d445 nodeName:}" failed. No retries permitted until 2026-02-19 21:43:38.334249264 +0000 UTC m=+929.526767128 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/26db9cb2-1ed4-44e4-afac-404ce0f7d445-cert") pod "openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p" (UID: "26db9cb2-1ed4-44e4-afac-404ce0f7d445") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 21:43:36 crc kubenswrapper[4795]: I0219 21:43:36.639901 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-62xdp\" (UID: \"9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-62xdp" Feb 19 21:43:36 crc kubenswrapper[4795]: E0219 21:43:36.640554 4795 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 19 21:43:36 crc kubenswrapper[4795]: E0219 21:43:36.640651 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-metrics-certs podName:9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4 nodeName:}" failed. No retries permitted until 2026-02-19 21:43:38.640627921 +0000 UTC m=+929.833145865 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-metrics-certs") pod "openstack-operator-controller-manager-69ff7bc449-62xdp" (UID: "9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4") : secret "metrics-server-cert" not found Feb 19 21:43:36 crc kubenswrapper[4795]: I0219 21:43:36.746644 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-62xdp\" (UID: \"9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-62xdp" Feb 19 21:43:36 crc kubenswrapper[4795]: E0219 21:43:36.746850 4795 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 19 21:43:36 crc kubenswrapper[4795]: E0219 21:43:36.746894 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-webhook-certs podName:9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4 nodeName:}" failed. No retries permitted until 2026-02-19 21:43:38.746881373 +0000 UTC m=+929.939399227 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-webhook-certs") pod "openstack-operator-controller-manager-69ff7bc449-62xdp" (UID: "9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4") : secret "webhook-server-cert" not found Feb 19 21:43:36 crc kubenswrapper[4795]: I0219 21:43:36.773736 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-t6hpt" event={"ID":"c7e19956-a3fb-4ed2-bc2a-72084ed62ac2","Type":"ContainerStarted","Data":"5d353edfcb8c604eb1cfa8409cd495d1d491bf5065c9abafd939379719b91d6a"} Feb 19 21:43:36 crc kubenswrapper[4795]: I0219 21:43:36.802527 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68f46476f-dqpjx" event={"ID":"98979ac7-9fb1-49f8-8022-562082fc76f7","Type":"ContainerStarted","Data":"02754dbae3ac80efd2c379255bf0881f0a30ed60550f13423a78abe03bad1451"} Feb 19 21:43:36 crc kubenswrapper[4795]: I0219 21:43:36.805354 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vv89z" event={"ID":"80ce3bc1-0926-47a3-acc2-6f2d8be4089c","Type":"ContainerStarted","Data":"e55431213b3390e8940335b813260dfdcb7498e87332c33f7dfb63eed87e2ca8"} Feb 19 21:43:36 crc kubenswrapper[4795]: E0219 21:43:36.806179 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04\\\"\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-dqpjx" podUID="98979ac7-9fb1-49f8-8022-562082fc76f7" Feb 19 21:43:36 crc kubenswrapper[4795]: E0219 21:43:36.807623 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: 
\"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vv89z" podUID="80ce3bc1-0926-47a3-acc2-6f2d8be4089c" Feb 19 21:43:36 crc kubenswrapper[4795]: I0219 21:43:36.807754 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-t87bb" event={"ID":"02592cbe-e1d4-4b62-8795-a204d5335594","Type":"ContainerStarted","Data":"f57671b80c6123511d81b1228f9e92b126eb6c03e07be07f6e9ba376214d0eec"} Feb 19 21:43:36 crc kubenswrapper[4795]: I0219 21:43:36.810132 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ckxlw" event={"ID":"b22b5096-41cf-40c9-94f6-8e546ca96a96","Type":"ContainerStarted","Data":"1988afd4158cccf6c904b3abc3ba0130736308a4a12616377d663f3c7f44174b"} Feb 19 21:43:36 crc kubenswrapper[4795]: I0219 21:43:36.814279 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-slj65" event={"ID":"5f4d8698-27a0-44a4-87f6-c75d4c3407bc","Type":"ContainerStarted","Data":"d3ae5512b6d771f06c9ea5013c41ee2499e5ba701bf11db4e1010a2c1069e58c"} Feb 19 21:43:36 crc kubenswrapper[4795]: E0219 21:43:36.817297 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:66a4b9322ebb573313178ea88e31026d4532f461592b9fae2dff71efd9256d99\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-slj65" podUID="5f4d8698-27a0-44a4-87f6-c75d4c3407bc" Feb 19 21:43:36 crc kubenswrapper[4795]: I0219 21:43:36.817408 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-slqxz" event={"ID":"e0cad59b-249e-446f-b3fa-6be8aac2a858","Type":"ContainerStarted","Data":"b42734073de670a5b09af026b396e64c976c50a1d7d4cfd4cbd1ee0433a4c647"} Feb 19 21:43:36 crc kubenswrapper[4795]: I0219 21:43:36.829237 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-bbtgm" event={"ID":"09ce2dcf-0fb0-4180-a019-09d1abfec00e","Type":"ContainerStarted","Data":"5824c77eb3b45d38fec95d4bcff01b43ae75b9cc50186a78ae84a019839beb21"} Feb 19 21:43:36 crc kubenswrapper[4795]: E0219 21:43:36.829265 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:543c103838f3e6ef48755665a7695dfa3ed84753c557560257d265db31f92759\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-slqxz" podUID="e0cad59b-249e-446f-b3fa-6be8aac2a858" Feb 19 21:43:36 crc kubenswrapper[4795]: E0219 21:43:36.832499 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d01ae848290e880c09127d5297418dea40fc7f090fdab9bf2c578c7e7f53aec0\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-bbtgm" podUID="09ce2dcf-0fb0-4180-a019-09d1abfec00e" Feb 19 21:43:36 crc kubenswrapper[4795]: I0219 21:43:36.835315 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-bwsj2" event={"ID":"7b637620-f307-4e2b-b92d-f1e0d50b0071","Type":"ContainerStarted","Data":"5215322d452d12698484c8773ec7ee6c3b7d76bf9e77dde4163888da83ce5cbb"} Feb 19 21:43:36 crc kubenswrapper[4795]: I0219 21:43:36.838540 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-4cf2p" event={"ID":"c2c4435e-a135-4c1f-bad4-121458c09bc3","Type":"ContainerStarted","Data":"a1db2a335a9378e0436bd591daea8d8a60409947f80d08a27f62a766ff23baf4"} Feb 19 21:43:36 crc kubenswrapper[4795]: I0219 21:43:36.841905 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-7866795846-rcjgz" event={"ID":"6bdc9c62-d8c1-42d5-8696-324fdc7abc2f","Type":"ContainerStarted","Data":"f219ba887bb2a5471b369e9ac7aab55f74556020daa5688afc8d7591e3f3d55a"} Feb 19 21:43:36 crc kubenswrapper[4795]: E0219 21:43:36.842961 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:f0fabdf79095def0f8b1c0442925548a94ca94bed4de2d3b171277129f8079e6\\\"\"" pod="openstack-operators/test-operator-controller-manager-7866795846-rcjgz" podUID="6bdc9c62-d8c1-42d5-8696-324fdc7abc2f" Feb 19 21:43:36 crc kubenswrapper[4795]: I0219 21:43:36.844736 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-mr8mh" event={"ID":"0bdb1789-27ad-4535-86d3-fd2fb7cebba2","Type":"ContainerStarted","Data":"fb14ff320b3a3aaf3a51fb49d16ce7882de2fab247ed18a55df683e0eaa7c00c"} Feb 19 21:43:37 crc kubenswrapper[4795]: I0219 21:43:37.488401 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-k4g6z"] Feb 19 21:43:37 crc kubenswrapper[4795]: I0219 21:43:37.492330 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k4g6z" Feb 19 21:43:37 crc kubenswrapper[4795]: I0219 21:43:37.509611 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k4g6z"] Feb 19 21:43:37 crc kubenswrapper[4795]: I0219 21:43:37.560879 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vm6kz\" (UniqueName: \"kubernetes.io/projected/487e5483-759b-49c8-a347-f9a3ecd255ff-kube-api-access-vm6kz\") pod \"redhat-marketplace-k4g6z\" (UID: \"487e5483-759b-49c8-a347-f9a3ecd255ff\") " pod="openshift-marketplace/redhat-marketplace-k4g6z" Feb 19 21:43:37 crc kubenswrapper[4795]: I0219 21:43:37.560987 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/487e5483-759b-49c8-a347-f9a3ecd255ff-catalog-content\") pod \"redhat-marketplace-k4g6z\" (UID: \"487e5483-759b-49c8-a347-f9a3ecd255ff\") " pod="openshift-marketplace/redhat-marketplace-k4g6z" Feb 19 21:43:37 crc kubenswrapper[4795]: I0219 21:43:37.561073 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/487e5483-759b-49c8-a347-f9a3ecd255ff-utilities\") pod \"redhat-marketplace-k4g6z\" (UID: \"487e5483-759b-49c8-a347-f9a3ecd255ff\") " pod="openshift-marketplace/redhat-marketplace-k4g6z" Feb 19 21:43:37 crc kubenswrapper[4795]: I0219 21:43:37.662782 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/487e5483-759b-49c8-a347-f9a3ecd255ff-utilities\") pod \"redhat-marketplace-k4g6z\" (UID: \"487e5483-759b-49c8-a347-f9a3ecd255ff\") " pod="openshift-marketplace/redhat-marketplace-k4g6z" Feb 19 21:43:37 crc kubenswrapper[4795]: I0219 21:43:37.662900 4795 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-vm6kz\" (UniqueName: \"kubernetes.io/projected/487e5483-759b-49c8-a347-f9a3ecd255ff-kube-api-access-vm6kz\") pod \"redhat-marketplace-k4g6z\" (UID: \"487e5483-759b-49c8-a347-f9a3ecd255ff\") " pod="openshift-marketplace/redhat-marketplace-k4g6z" Feb 19 21:43:37 crc kubenswrapper[4795]: I0219 21:43:37.662939 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/487e5483-759b-49c8-a347-f9a3ecd255ff-catalog-content\") pod \"redhat-marketplace-k4g6z\" (UID: \"487e5483-759b-49c8-a347-f9a3ecd255ff\") " pod="openshift-marketplace/redhat-marketplace-k4g6z" Feb 19 21:43:37 crc kubenswrapper[4795]: I0219 21:43:37.663376 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/487e5483-759b-49c8-a347-f9a3ecd255ff-catalog-content\") pod \"redhat-marketplace-k4g6z\" (UID: \"487e5483-759b-49c8-a347-f9a3ecd255ff\") " pod="openshift-marketplace/redhat-marketplace-k4g6z" Feb 19 21:43:37 crc kubenswrapper[4795]: I0219 21:43:37.663612 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/487e5483-759b-49c8-a347-f9a3ecd255ff-utilities\") pod \"redhat-marketplace-k4g6z\" (UID: \"487e5483-759b-49c8-a347-f9a3ecd255ff\") " pod="openshift-marketplace/redhat-marketplace-k4g6z" Feb 19 21:43:37 crc kubenswrapper[4795]: I0219 21:43:37.684182 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vm6kz\" (UniqueName: \"kubernetes.io/projected/487e5483-759b-49c8-a347-f9a3ecd255ff-kube-api-access-vm6kz\") pod \"redhat-marketplace-k4g6z\" (UID: \"487e5483-759b-49c8-a347-f9a3ecd255ff\") " pod="openshift-marketplace/redhat-marketplace-k4g6z" Feb 19 21:43:37 crc kubenswrapper[4795]: I0219 21:43:37.832582 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k4g6z" Feb 19 21:43:37 crc kubenswrapper[4795]: E0219 21:43:37.852739 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04\\\"\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-dqpjx" podUID="98979ac7-9fb1-49f8-8022-562082fc76f7" Feb 19 21:43:37 crc kubenswrapper[4795]: E0219 21:43:37.853622 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:f0fabdf79095def0f8b1c0442925548a94ca94bed4de2d3b171277129f8079e6\\\"\"" pod="openstack-operators/test-operator-controller-manager-7866795846-rcjgz" podUID="6bdc9c62-d8c1-42d5-8696-324fdc7abc2f" Feb 19 21:43:37 crc kubenswrapper[4795]: E0219 21:43:37.853960 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:543c103838f3e6ef48755665a7695dfa3ed84753c557560257d265db31f92759\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-slqxz" podUID="e0cad59b-249e-446f-b3fa-6be8aac2a858" Feb 19 21:43:37 crc kubenswrapper[4795]: E0219 21:43:37.854044 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d01ae848290e880c09127d5297418dea40fc7f090fdab9bf2c578c7e7f53aec0\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-bbtgm" podUID="09ce2dcf-0fb0-4180-a019-09d1abfec00e" Feb 19 21:43:37 crc kubenswrapper[4795]: E0219 
21:43:37.854127 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:66a4b9322ebb573313178ea88e31026d4532f461592b9fae2dff71efd9256d99\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-slj65" podUID="5f4d8698-27a0-44a4-87f6-c75d4c3407bc" Feb 19 21:43:37 crc kubenswrapper[4795]: E0219 21:43:37.861058 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vv89z" podUID="80ce3bc1-0926-47a3-acc2-6f2d8be4089c" Feb 19 21:43:38 crc kubenswrapper[4795]: I0219 21:43:38.068922 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e80963b-888b-4bb9-9259-864e38dd10ed-cert\") pod \"infra-operator-controller-manager-79d975b745-qjgvw\" (UID: \"2e80963b-888b-4bb9-9259-864e38dd10ed\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-qjgvw" Feb 19 21:43:38 crc kubenswrapper[4795]: E0219 21:43:38.069405 4795 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 19 21:43:38 crc kubenswrapper[4795]: E0219 21:43:38.069454 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e80963b-888b-4bb9-9259-864e38dd10ed-cert podName:2e80963b-888b-4bb9-9259-864e38dd10ed nodeName:}" failed. No retries permitted until 2026-02-19 21:43:42.069437833 +0000 UTC m=+933.261955697 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2e80963b-888b-4bb9-9259-864e38dd10ed-cert") pod "infra-operator-controller-manager-79d975b745-qjgvw" (UID: "2e80963b-888b-4bb9-9259-864e38dd10ed") : secret "infra-operator-webhook-server-cert" not found Feb 19 21:43:38 crc kubenswrapper[4795]: I0219 21:43:38.373414 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/26db9cb2-1ed4-44e4-afac-404ce0f7d445-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p\" (UID: \"26db9cb2-1ed4-44e4-afac-404ce0f7d445\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p" Feb 19 21:43:38 crc kubenswrapper[4795]: E0219 21:43:38.373589 4795 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 21:43:38 crc kubenswrapper[4795]: E0219 21:43:38.373661 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26db9cb2-1ed4-44e4-afac-404ce0f7d445-cert podName:26db9cb2-1ed4-44e4-afac-404ce0f7d445 nodeName:}" failed. No retries permitted until 2026-02-19 21:43:42.373637138 +0000 UTC m=+933.566155002 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/26db9cb2-1ed4-44e4-afac-404ce0f7d445-cert") pod "openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p" (UID: "26db9cb2-1ed4-44e4-afac-404ce0f7d445") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 21:43:38 crc kubenswrapper[4795]: I0219 21:43:38.678322 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-62xdp\" (UID: \"9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-62xdp" Feb 19 21:43:38 crc kubenswrapper[4795]: E0219 21:43:38.678542 4795 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 19 21:43:38 crc kubenswrapper[4795]: E0219 21:43:38.678590 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-metrics-certs podName:9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4 nodeName:}" failed. No retries permitted until 2026-02-19 21:43:42.678576575 +0000 UTC m=+933.871094439 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-metrics-certs") pod "openstack-operator-controller-manager-69ff7bc449-62xdp" (UID: "9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4") : secret "metrics-server-cert" not found Feb 19 21:43:38 crc kubenswrapper[4795]: I0219 21:43:38.779493 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-62xdp\" (UID: \"9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-62xdp" Feb 19 21:43:38 crc kubenswrapper[4795]: E0219 21:43:38.779691 4795 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 19 21:43:38 crc kubenswrapper[4795]: E0219 21:43:38.779743 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-webhook-certs podName:9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4 nodeName:}" failed. No retries permitted until 2026-02-19 21:43:42.779729263 +0000 UTC m=+933.972247117 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-webhook-certs") pod "openstack-operator-controller-manager-69ff7bc449-62xdp" (UID: "9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4") : secret "webhook-server-cert" not found Feb 19 21:43:42 crc kubenswrapper[4795]: I0219 21:43:42.142516 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e80963b-888b-4bb9-9259-864e38dd10ed-cert\") pod \"infra-operator-controller-manager-79d975b745-qjgvw\" (UID: \"2e80963b-888b-4bb9-9259-864e38dd10ed\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-qjgvw" Feb 19 21:43:42 crc kubenswrapper[4795]: E0219 21:43:42.142771 4795 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 19 21:43:42 crc kubenswrapper[4795]: E0219 21:43:42.143505 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e80963b-888b-4bb9-9259-864e38dd10ed-cert podName:2e80963b-888b-4bb9-9259-864e38dd10ed nodeName:}" failed. No retries permitted until 2026-02-19 21:43:50.143437088 +0000 UTC m=+941.335954992 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2e80963b-888b-4bb9-9259-864e38dd10ed-cert") pod "infra-operator-controller-manager-79d975b745-qjgvw" (UID: "2e80963b-888b-4bb9-9259-864e38dd10ed") : secret "infra-operator-webhook-server-cert" not found Feb 19 21:43:42 crc kubenswrapper[4795]: I0219 21:43:42.450613 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/26db9cb2-1ed4-44e4-afac-404ce0f7d445-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p\" (UID: \"26db9cb2-1ed4-44e4-afac-404ce0f7d445\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p" Feb 19 21:43:42 crc kubenswrapper[4795]: E0219 21:43:42.450830 4795 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 21:43:42 crc kubenswrapper[4795]: E0219 21:43:42.450883 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26db9cb2-1ed4-44e4-afac-404ce0f7d445-cert podName:26db9cb2-1ed4-44e4-afac-404ce0f7d445 nodeName:}" failed. No retries permitted until 2026-02-19 21:43:50.450867495 +0000 UTC m=+941.643385369 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/26db9cb2-1ed4-44e4-afac-404ce0f7d445-cert") pod "openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p" (UID: "26db9cb2-1ed4-44e4-afac-404ce0f7d445") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 21:43:42 crc kubenswrapper[4795]: I0219 21:43:42.755019 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-62xdp\" (UID: \"9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-62xdp" Feb 19 21:43:42 crc kubenswrapper[4795]: E0219 21:43:42.755256 4795 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 19 21:43:42 crc kubenswrapper[4795]: E0219 21:43:42.755335 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-metrics-certs podName:9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4 nodeName:}" failed. No retries permitted until 2026-02-19 21:43:50.755314647 +0000 UTC m=+941.947832511 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-metrics-certs") pod "openstack-operator-controller-manager-69ff7bc449-62xdp" (UID: "9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4") : secret "metrics-server-cert" not found Feb 19 21:43:42 crc kubenswrapper[4795]: I0219 21:43:42.856540 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-62xdp\" (UID: \"9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-62xdp" Feb 19 21:43:42 crc kubenswrapper[4795]: E0219 21:43:42.856692 4795 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 19 21:43:42 crc kubenswrapper[4795]: E0219 21:43:42.856744 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-webhook-certs podName:9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4 nodeName:}" failed. No retries permitted until 2026-02-19 21:43:50.856730233 +0000 UTC m=+942.049248097 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-webhook-certs") pod "openstack-operator-controller-manager-69ff7bc449-62xdp" (UID: "9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4") : secret "webhook-server-cert" not found Feb 19 21:43:48 crc kubenswrapper[4795]: I0219 21:43:48.316637 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k4g6z"] Feb 19 21:43:48 crc kubenswrapper[4795]: I0219 21:43:48.933145 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-7n98g" event={"ID":"1d6085d5-f9db-4129-8662-b3ae045decfc","Type":"ContainerStarted","Data":"e07793e8fd9993a5fe24a5198c679587d953637162c90794139ae2c623166c0e"} Feb 19 21:43:48 crc kubenswrapper[4795]: I0219 21:43:48.933619 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-7n98g" Feb 19 21:43:48 crc kubenswrapper[4795]: I0219 21:43:48.934924 4795 generic.go:334] "Generic (PLEG): container finished" podID="487e5483-759b-49c8-a347-f9a3ecd255ff" containerID="1899fef31d4fa94097bca46bd05c65b6b55ad2778a2534f09b28eef318d77d2d" exitCode=0 Feb 19 21:43:48 crc kubenswrapper[4795]: I0219 21:43:48.935291 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k4g6z" event={"ID":"487e5483-759b-49c8-a347-f9a3ecd255ff","Type":"ContainerDied","Data":"1899fef31d4fa94097bca46bd05c65b6b55ad2778a2534f09b28eef318d77d2d"} Feb 19 21:43:48 crc kubenswrapper[4795]: I0219 21:43:48.935407 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k4g6z" event={"ID":"487e5483-759b-49c8-a347-f9a3ecd255ff","Type":"ContainerStarted","Data":"bd29711f0be87b24f29b013ce08c362b1111de6a1cccb18d373555addaf3b5a6"} Feb 19 21:43:48 crc kubenswrapper[4795]: I0219 21:43:48.939543 4795 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-bwsj2" event={"ID":"7b637620-f307-4e2b-b92d-f1e0d50b0071","Type":"ContainerStarted","Data":"f83a4cba0834bf1e9732d279c5146b84203dda4a49f7281d60b184f37eb9c589"} Feb 19 21:43:48 crc kubenswrapper[4795]: I0219 21:43:48.939869 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-bwsj2" Feb 19 21:43:48 crc kubenswrapper[4795]: I0219 21:43:48.941102 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-5b89b" event={"ID":"5a6d3cc3-7e00-4013-b568-c2b835d8e2b9","Type":"ContainerStarted","Data":"0e508f6ef4573dd484b05d855a79b9423abb7f3ebeeaff3ae0bf8590d64b1fd0"} Feb 19 21:43:48 crc kubenswrapper[4795]: I0219 21:43:48.941229 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-5b89b" Feb 19 21:43:48 crc kubenswrapper[4795]: I0219 21:43:48.943937 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-t6hpt" event={"ID":"c7e19956-a3fb-4ed2-bc2a-72084ed62ac2","Type":"ContainerStarted","Data":"8e5498bd191e5e827ce0256522cb08fecd273043a3be7d1ec3efef0a0e5657cc"} Feb 19 21:43:48 crc kubenswrapper[4795]: I0219 21:43:48.945635 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-t6hpt" Feb 19 21:43:48 crc kubenswrapper[4795]: I0219 21:43:48.947790 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ckxlw" event={"ID":"b22b5096-41cf-40c9-94f6-8e546ca96a96","Type":"ContainerStarted","Data":"eab31bd5755bccd99d04d6cb6514e2450dcf22deb0d927a56a33ef797b39f7e1"} Feb 19 21:43:48 crc kubenswrapper[4795]: 
I0219 21:43:48.948090 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ckxlw" Feb 19 21:43:48 crc kubenswrapper[4795]: I0219 21:43:48.949287 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-z7hnk" event={"ID":"4cc5be3d-87d8-46a4-ba7d-d95143c11857","Type":"ContainerStarted","Data":"8281c9187b09a72d046cf8480d4a52b47744eaf9e077470d7bc2223b97c45f12"} Feb 19 21:43:48 crc kubenswrapper[4795]: I0219 21:43:48.950023 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-z7hnk" Feb 19 21:43:48 crc kubenswrapper[4795]: I0219 21:43:48.952721 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987464f4-5cnjr" event={"ID":"d19ed31e-e599-40ec-935d-d1d404e4c7a5","Type":"ContainerStarted","Data":"1a9e5d6728b789d5336943f354758c7acc3fb68672cc966429421a5137abaefb"} Feb 19 21:43:48 crc kubenswrapper[4795]: I0219 21:43:48.952946 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-77987464f4-5cnjr" Feb 19 21:43:48 crc kubenswrapper[4795]: I0219 21:43:48.954376 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-mr8mh" event={"ID":"0bdb1789-27ad-4535-86d3-fd2fb7cebba2","Type":"ContainerStarted","Data":"69b78e9b774021411809dd96659898820a7a2f10da65bbbd0b7757dcbed0404d"} Feb 19 21:43:48 crc kubenswrapper[4795]: I0219 21:43:48.954642 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-mr8mh" Feb 19 21:43:48 crc kubenswrapper[4795]: I0219 21:43:48.955414 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/manila-operator-controller-manager-54f6768c69-7n98g" podStartSLOduration=2.684259462 podStartE2EDuration="14.955400336s" podCreationTimestamp="2026-02-19 21:43:34 +0000 UTC" firstStartedPulling="2026-02-19 21:43:35.697408809 +0000 UTC m=+926.889926673" lastFinishedPulling="2026-02-19 21:43:47.968549643 +0000 UTC m=+939.161067547" observedRunningTime="2026-02-19 21:43:48.953699998 +0000 UTC m=+940.146217882" watchObservedRunningTime="2026-02-19 21:43:48.955400336 +0000 UTC m=+940.147918200" Feb 19 21:43:48 crc kubenswrapper[4795]: I0219 21:43:48.956329 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-t87bb" event={"ID":"02592cbe-e1d4-4b62-8795-a204d5335594","Type":"ContainerStarted","Data":"e395a5835e9712e6734ed3c2badde6994d4618b6d22c115cf5f06d12296d9660"} Feb 19 21:43:48 crc kubenswrapper[4795]: I0219 21:43:48.957056 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-t87bb" Feb 19 21:43:48 crc kubenswrapper[4795]: I0219 21:43:48.962343 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-vwgdm" event={"ID":"54a55994-69ff-48f1-8d75-24b2a828cdc9","Type":"ContainerStarted","Data":"2fece62e050d1ad6233bd56c613c1aba27b5a19766058e19acd0b5031dd3f1e0"} Feb 19 21:43:48 crc kubenswrapper[4795]: I0219 21:43:48.963051 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-vwgdm" Feb 19 21:43:48 crc kubenswrapper[4795]: I0219 21:43:48.964425 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-fdd85" event={"ID":"e37494c1-8780-4612-8569-fada28f0e772","Type":"ContainerStarted","Data":"625a35d65b18d26d297f052bef896978805ae6b3620dc92bc4437700ff7578b5"} Feb 19 
21:43:48 crc kubenswrapper[4795]: I0219 21:43:48.964763 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-fdd85" Feb 19 21:43:48 crc kubenswrapper[4795]: I0219 21:43:48.965783 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-shb4d" event={"ID":"268c2664-09cc-4616-9280-0dd6ae4159dc","Type":"ContainerStarted","Data":"c66691bffe40d94b042e37112c9f486a4c6411996b806e5be0ef638c78a6f404"} Feb 19 21:43:48 crc kubenswrapper[4795]: I0219 21:43:48.966098 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-shb4d" Feb 19 21:43:48 crc kubenswrapper[4795]: I0219 21:43:48.967240 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-4cf2p" event={"ID":"c2c4435e-a135-4c1f-bad4-121458c09bc3","Type":"ContainerStarted","Data":"5956cb22ea93c0bc00613a41d173d7120d95d8d7521d324d485b64492bbe63a5"} Feb 19 21:43:48 crc kubenswrapper[4795]: I0219 21:43:48.967581 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-4cf2p" Feb 19 21:43:48 crc kubenswrapper[4795]: I0219 21:43:48.971867 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-d8wqs" event={"ID":"5c867f91-2ab2-43ce-8291-6d01825610d1","Type":"ContainerStarted","Data":"5daeceb4704e6cb53982dcd2636fc941537acca135b4dcb774dbaeb82f6debb9"} Feb 19 21:43:48 crc kubenswrapper[4795]: I0219 21:43:48.972061 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-d8wqs" Feb 19 21:43:49 crc kubenswrapper[4795]: I0219 21:43:49.001244 4795 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack-operators/glance-operator-controller-manager-77987464f4-5cnjr" podStartSLOduration=2.40987124 podStartE2EDuration="15.001221701s" podCreationTimestamp="2026-02-19 21:43:34 +0000 UTC" firstStartedPulling="2026-02-19 21:43:35.498987683 +0000 UTC m=+926.691505547" lastFinishedPulling="2026-02-19 21:43:48.090338134 +0000 UTC m=+939.282856008" observedRunningTime="2026-02-19 21:43:49.000631494 +0000 UTC m=+940.193149358" watchObservedRunningTime="2026-02-19 21:43:49.001221701 +0000 UTC m=+940.193739575" Feb 19 21:43:49 crc kubenswrapper[4795]: I0219 21:43:49.019765 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-5b89b" podStartSLOduration=2.622914529 podStartE2EDuration="15.019749124s" podCreationTimestamp="2026-02-19 21:43:34 +0000 UTC" firstStartedPulling="2026-02-19 21:43:35.694917829 +0000 UTC m=+926.887435693" lastFinishedPulling="2026-02-19 21:43:48.091752414 +0000 UTC m=+939.284270288" observedRunningTime="2026-02-19 21:43:49.018782497 +0000 UTC m=+940.211300361" watchObservedRunningTime="2026-02-19 21:43:49.019749124 +0000 UTC m=+940.212266988" Feb 19 21:43:49 crc kubenswrapper[4795]: I0219 21:43:49.039635 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ckxlw" podStartSLOduration=2.9260576560000002 podStartE2EDuration="15.039617146s" podCreationTimestamp="2026-02-19 21:43:34 +0000 UTC" firstStartedPulling="2026-02-19 21:43:35.932621976 +0000 UTC m=+927.125139840" lastFinishedPulling="2026-02-19 21:43:48.046181446 +0000 UTC m=+939.238699330" observedRunningTime="2026-02-19 21:43:49.036800026 +0000 UTC m=+940.229317890" watchObservedRunningTime="2026-02-19 21:43:49.039617146 +0000 UTC m=+940.232135010" Feb 19 21:43:49 crc kubenswrapper[4795]: I0219 21:43:49.085748 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/placement-operator-controller-manager-8497b45c89-bwsj2" podStartSLOduration=3.123042732 podStartE2EDuration="15.085730989s" podCreationTimestamp="2026-02-19 21:43:34 +0000 UTC" firstStartedPulling="2026-02-19 21:43:36.103218136 +0000 UTC m=+927.295736000" lastFinishedPulling="2026-02-19 21:43:48.065906383 +0000 UTC m=+939.258424257" observedRunningTime="2026-02-19 21:43:49.083927468 +0000 UTC m=+940.276445332" watchObservedRunningTime="2026-02-19 21:43:49.085730989 +0000 UTC m=+940.278248843" Feb 19 21:43:49 crc kubenswrapper[4795]: I0219 21:43:49.086465 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-z7hnk" podStartSLOduration=2.391954403 podStartE2EDuration="15.086459649s" podCreationTimestamp="2026-02-19 21:43:34 +0000 UTC" firstStartedPulling="2026-02-19 21:43:35.273965644 +0000 UTC m=+926.466483508" lastFinishedPulling="2026-02-19 21:43:47.96847088 +0000 UTC m=+939.160988754" observedRunningTime="2026-02-19 21:43:49.06630081 +0000 UTC m=+940.258818664" watchObservedRunningTime="2026-02-19 21:43:49.086459649 +0000 UTC m=+940.278977513" Feb 19 21:43:49 crc kubenswrapper[4795]: I0219 21:43:49.133639 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-t6hpt" podStartSLOduration=3.117958178 podStartE2EDuration="15.133619272s" podCreationTimestamp="2026-02-19 21:43:34 +0000 UTC" firstStartedPulling="2026-02-19 21:43:36.077205921 +0000 UTC m=+927.269723785" lastFinishedPulling="2026-02-19 21:43:48.092866995 +0000 UTC m=+939.285384879" observedRunningTime="2026-02-19 21:43:49.129947608 +0000 UTC m=+940.322465472" watchObservedRunningTime="2026-02-19 21:43:49.133619272 +0000 UTC m=+940.326137136" Feb 19 21:43:49 crc kubenswrapper[4795]: I0219 21:43:49.159803 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-mr8mh" podStartSLOduration=3.158585045 podStartE2EDuration="15.159786091s" podCreationTimestamp="2026-02-19 21:43:34 +0000 UTC" firstStartedPulling="2026-02-19 21:43:36.045205267 +0000 UTC m=+927.237723131" lastFinishedPulling="2026-02-19 21:43:48.046406303 +0000 UTC m=+939.238924177" observedRunningTime="2026-02-19 21:43:49.155778238 +0000 UTC m=+940.348296102" watchObservedRunningTime="2026-02-19 21:43:49.159786091 +0000 UTC m=+940.352303955" Feb 19 21:43:49 crc kubenswrapper[4795]: I0219 21:43:49.206140 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-t87bb" podStartSLOduration=3.068283165 podStartE2EDuration="15.206124181s" podCreationTimestamp="2026-02-19 21:43:34 +0000 UTC" firstStartedPulling="2026-02-19 21:43:35.926508673 +0000 UTC m=+927.119026537" lastFinishedPulling="2026-02-19 21:43:48.064349669 +0000 UTC m=+939.256867553" observedRunningTime="2026-02-19 21:43:49.200472361 +0000 UTC m=+940.392990225" watchObservedRunningTime="2026-02-19 21:43:49.206124181 +0000 UTC m=+940.398642045" Feb 19 21:43:49 crc kubenswrapper[4795]: I0219 21:43:49.216877 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-vwgdm" podStartSLOduration=2.696880649 podStartE2EDuration="15.216864714s" podCreationTimestamp="2026-02-19 21:43:34 +0000 UTC" firstStartedPulling="2026-02-19 21:43:35.448603239 +0000 UTC m=+926.641121103" lastFinishedPulling="2026-02-19 21:43:47.968587294 +0000 UTC m=+939.161105168" observedRunningTime="2026-02-19 21:43:49.216046181 +0000 UTC m=+940.408564045" watchObservedRunningTime="2026-02-19 21:43:49.216864714 +0000 UTC m=+940.409382578" Feb 19 21:43:49 crc kubenswrapper[4795]: I0219 21:43:49.238806 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-4cf2p" podStartSLOduration=3.016600454 podStartE2EDuration="15.238790354s" podCreationTimestamp="2026-02-19 21:43:34 +0000 UTC" firstStartedPulling="2026-02-19 21:43:35.746377803 +0000 UTC m=+926.938895667" lastFinishedPulling="2026-02-19 21:43:47.968567683 +0000 UTC m=+939.161085567" observedRunningTime="2026-02-19 21:43:49.235717437 +0000 UTC m=+940.428235301" watchObservedRunningTime="2026-02-19 21:43:49.238790354 +0000 UTC m=+940.431308228" Feb 19 21:43:49 crc kubenswrapper[4795]: I0219 21:43:49.281400 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-d8wqs" podStartSLOduration=2.683868941 podStartE2EDuration="15.281387307s" podCreationTimestamp="2026-02-19 21:43:34 +0000 UTC" firstStartedPulling="2026-02-19 21:43:35.448431114 +0000 UTC m=+926.640948978" lastFinishedPulling="2026-02-19 21:43:48.04594948 +0000 UTC m=+939.238467344" observedRunningTime="2026-02-19 21:43:49.277800326 +0000 UTC m=+940.470318190" watchObservedRunningTime="2026-02-19 21:43:49.281387307 +0000 UTC m=+940.473905171" Feb 19 21:43:49 crc kubenswrapper[4795]: I0219 21:43:49.297829 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-shb4d" podStartSLOduration=2.718454608 podStartE2EDuration="15.297812801s" podCreationTimestamp="2026-02-19 21:43:34 +0000 UTC" firstStartedPulling="2026-02-19 21:43:35.484968076 +0000 UTC m=+926.677485940" lastFinishedPulling="2026-02-19 21:43:48.064326259 +0000 UTC m=+939.256844133" observedRunningTime="2026-02-19 21:43:49.296141544 +0000 UTC m=+940.488659408" watchObservedRunningTime="2026-02-19 21:43:49.297812801 +0000 UTC m=+940.490330665" Feb 19 21:43:49 crc kubenswrapper[4795]: I0219 21:43:49.319577 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-fdd85" podStartSLOduration=2.970778369 podStartE2EDuration="15.319562846s" podCreationTimestamp="2026-02-19 21:43:34 +0000 UTC" firstStartedPulling="2026-02-19 21:43:35.697404979 +0000 UTC m=+926.889922843" lastFinishedPulling="2026-02-19 21:43:48.046189416 +0000 UTC m=+939.238707320" observedRunningTime="2026-02-19 21:43:49.316629253 +0000 UTC m=+940.509147117" watchObservedRunningTime="2026-02-19 21:43:49.319562846 +0000 UTC m=+940.512080710" Feb 19 21:43:49 crc kubenswrapper[4795]: I0219 21:43:49.979345 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k4g6z" event={"ID":"487e5483-759b-49c8-a347-f9a3ecd255ff","Type":"ContainerStarted","Data":"1dc2b16630ee0b1fd4308fd6f7f8f7e036a08702c8a58c1dc3063c3b886c95cb"} Feb 19 21:43:50 crc kubenswrapper[4795]: I0219 21:43:50.178278 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e80963b-888b-4bb9-9259-864e38dd10ed-cert\") pod \"infra-operator-controller-manager-79d975b745-qjgvw\" (UID: \"2e80963b-888b-4bb9-9259-864e38dd10ed\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-qjgvw" Feb 19 21:43:50 crc kubenswrapper[4795]: I0219 21:43:50.183982 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e80963b-888b-4bb9-9259-864e38dd10ed-cert\") pod \"infra-operator-controller-manager-79d975b745-qjgvw\" (UID: \"2e80963b-888b-4bb9-9259-864e38dd10ed\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-qjgvw" Feb 19 21:43:50 crc kubenswrapper[4795]: I0219 21:43:50.270479 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79d975b745-qjgvw" Feb 19 21:43:50 crc kubenswrapper[4795]: I0219 21:43:50.484422 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/26db9cb2-1ed4-44e4-afac-404ce0f7d445-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p\" (UID: \"26db9cb2-1ed4-44e4-afac-404ce0f7d445\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p" Feb 19 21:43:50 crc kubenswrapper[4795]: E0219 21:43:50.484606 4795 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 21:43:50 crc kubenswrapper[4795]: E0219 21:43:50.484800 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26db9cb2-1ed4-44e4-afac-404ce0f7d445-cert podName:26db9cb2-1ed4-44e4-afac-404ce0f7d445 nodeName:}" failed. No retries permitted until 2026-02-19 21:44:06.484784641 +0000 UTC m=+957.677302505 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/26db9cb2-1ed4-44e4-afac-404ce0f7d445-cert") pod "openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p" (UID: "26db9cb2-1ed4-44e4-afac-404ce0f7d445") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 21:43:50 crc kubenswrapper[4795]: I0219 21:43:50.616462 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-qjgvw"] Feb 19 21:43:50 crc kubenswrapper[4795]: I0219 21:43:50.787845 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-62xdp\" (UID: \"9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-62xdp" Feb 19 21:43:50 crc kubenswrapper[4795]: E0219 21:43:50.788005 4795 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 19 21:43:50 crc kubenswrapper[4795]: E0219 21:43:50.788053 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-metrics-certs podName:9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4 nodeName:}" failed. No retries permitted until 2026-02-19 21:44:06.78803922 +0000 UTC m=+957.980557084 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-metrics-certs") pod "openstack-operator-controller-manager-69ff7bc449-62xdp" (UID: "9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4") : secret "metrics-server-cert" not found Feb 19 21:43:50 crc kubenswrapper[4795]: I0219 21:43:50.889069 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-62xdp\" (UID: \"9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-62xdp" Feb 19 21:43:50 crc kubenswrapper[4795]: E0219 21:43:50.889305 4795 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 19 21:43:50 crc kubenswrapper[4795]: E0219 21:43:50.889692 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-webhook-certs podName:9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4 nodeName:}" failed. No retries permitted until 2026-02-19 21:44:06.889596819 +0000 UTC m=+958.082114693 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-webhook-certs") pod "openstack-operator-controller-manager-69ff7bc449-62xdp" (UID: "9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4") : secret "webhook-server-cert" not found Feb 19 21:43:50 crc kubenswrapper[4795]: I0219 21:43:50.988218 4795 generic.go:334] "Generic (PLEG): container finished" podID="487e5483-759b-49c8-a347-f9a3ecd255ff" containerID="1dc2b16630ee0b1fd4308fd6f7f8f7e036a08702c8a58c1dc3063c3b886c95cb" exitCode=0 Feb 19 21:43:50 crc kubenswrapper[4795]: I0219 21:43:50.988335 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k4g6z" event={"ID":"487e5483-759b-49c8-a347-f9a3ecd255ff","Type":"ContainerDied","Data":"1dc2b16630ee0b1fd4308fd6f7f8f7e036a08702c8a58c1dc3063c3b886c95cb"} Feb 19 21:43:50 crc kubenswrapper[4795]: I0219 21:43:50.990159 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79d975b745-qjgvw" event={"ID":"2e80963b-888b-4bb9-9259-864e38dd10ed","Type":"ContainerStarted","Data":"a1e7328f20cbd1507b935f73a351364a0e5bb3a019e4d75306f8575fdfea39ab"} Feb 19 21:43:54 crc kubenswrapper[4795]: I0219 21:43:54.454698 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-vwgdm" Feb 19 21:43:54 crc kubenswrapper[4795]: I0219 21:43:54.461530 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-z7hnk" Feb 19 21:43:54 crc kubenswrapper[4795]: I0219 21:43:54.472329 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-d8wqs" Feb 19 21:43:54 crc kubenswrapper[4795]: I0219 21:43:54.486595 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack-operators/glance-operator-controller-manager-77987464f4-5cnjr" Feb 19 21:43:54 crc kubenswrapper[4795]: I0219 21:43:54.518109 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-shb4d" Feb 19 21:43:54 crc kubenswrapper[4795]: I0219 21:43:54.630938 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-fdd85" Feb 19 21:43:54 crc kubenswrapper[4795]: I0219 21:43:54.684862 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-7n98g" Feb 19 21:43:54 crc kubenswrapper[4795]: I0219 21:43:54.845865 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-4cf2p" Feb 19 21:43:54 crc kubenswrapper[4795]: I0219 21:43:54.870702 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-5b89b" Feb 19 21:43:54 crc kubenswrapper[4795]: I0219 21:43:54.882584 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ckxlw" Feb 19 21:43:54 crc kubenswrapper[4795]: I0219 21:43:54.994705 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-t87bb" Feb 19 21:43:54 crc kubenswrapper[4795]: I0219 21:43:54.999725 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-bwsj2" Feb 19 21:43:55 crc kubenswrapper[4795]: I0219 21:43:55.038122 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-t6hpt" Feb 19 21:43:55 crc kubenswrapper[4795]: I0219 21:43:55.117715 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-mr8mh" Feb 19 21:43:58 crc kubenswrapper[4795]: I0219 21:43:58.427403 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 21:43:58 crc kubenswrapper[4795]: I0219 21:43:58.427815 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 21:43:58 crc kubenswrapper[4795]: I0219 21:43:58.427869 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" Feb 19 21:43:58 crc kubenswrapper[4795]: I0219 21:43:58.428571 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"baca31a8ff8f8b420ab1c2fee031dade1a9efccb0543c74090535aa06f41da2f"} pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 21:43:58 crc kubenswrapper[4795]: I0219 21:43:58.428630 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" 
containerID="cri-o://baca31a8ff8f8b420ab1c2fee031dade1a9efccb0543c74090535aa06f41da2f" gracePeriod=600 Feb 19 21:43:59 crc kubenswrapper[4795]: I0219 21:43:59.051968 4795 generic.go:334] "Generic (PLEG): container finished" podID="7591bc58-96f5-486a-8653-0ad93938b019" containerID="baca31a8ff8f8b420ab1c2fee031dade1a9efccb0543c74090535aa06f41da2f" exitCode=0 Feb 19 21:43:59 crc kubenswrapper[4795]: I0219 21:43:59.052011 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerDied","Data":"baca31a8ff8f8b420ab1c2fee031dade1a9efccb0543c74090535aa06f41da2f"} Feb 19 21:43:59 crc kubenswrapper[4795]: I0219 21:43:59.052078 4795 scope.go:117] "RemoveContainer" containerID="01e410588eeb6332f2520524efa20c5c33620bd277e99c49543b745d9bb370ca" Feb 19 21:44:02 crc kubenswrapper[4795]: I0219 21:44:02.073543 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerStarted","Data":"26a524895f2f97a5543d7713a3a6fad00cc54e588f3f27507aef436c0255d593"} Feb 19 21:44:02 crc kubenswrapper[4795]: I0219 21:44:02.074764 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68f46476f-dqpjx" event={"ID":"98979ac7-9fb1-49f8-8022-562082fc76f7","Type":"ContainerStarted","Data":"e4bca9f921c13624f7c2aba14785389b4a14b501a78bfad5b5f4bb7fecf2b466"} Feb 19 21:44:02 crc kubenswrapper[4795]: I0219 21:44:02.074983 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-68f46476f-dqpjx" Feb 19 21:44:02 crc kubenswrapper[4795]: I0219 21:44:02.077682 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k4g6z" 
event={"ID":"487e5483-759b-49c8-a347-f9a3ecd255ff","Type":"ContainerStarted","Data":"68160f45c546f829b363d820c59fae78a8f1b1106eedaa437c9437857e2b4cab"} Feb 19 21:44:02 crc kubenswrapper[4795]: I0219 21:44:02.078819 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-slqxz" event={"ID":"e0cad59b-249e-446f-b3fa-6be8aac2a858","Type":"ContainerStarted","Data":"3b4157cb2d08c22628714a0483afb6095bb656e52ad9b2d2eb7a3cbb90e1c43b"} Feb 19 21:44:02 crc kubenswrapper[4795]: I0219 21:44:02.079002 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-slqxz" Feb 19 21:44:02 crc kubenswrapper[4795]: I0219 21:44:02.080113 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-bbtgm" event={"ID":"09ce2dcf-0fb0-4180-a019-09d1abfec00e","Type":"ContainerStarted","Data":"6cf8278b4ab796089e65ffdca64dd3ecc12d07ec0237c81f61fe05b3ac9f0c5b"} Feb 19 21:44:02 crc kubenswrapper[4795]: I0219 21:44:02.080281 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-bbtgm" Feb 19 21:44:02 crc kubenswrapper[4795]: I0219 21:44:02.081513 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-7866795846-rcjgz" event={"ID":"6bdc9c62-d8c1-42d5-8696-324fdc7abc2f","Type":"ContainerStarted","Data":"d6f9cbbb1dc0897152ab85873bc4df76698a9562a8824a4b9423d119871fc1d4"} Feb 19 21:44:02 crc kubenswrapper[4795]: I0219 21:44:02.081680 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-7866795846-rcjgz" Feb 19 21:44:02 crc kubenswrapper[4795]: I0219 21:44:02.082995 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vv89z" event={"ID":"80ce3bc1-0926-47a3-acc2-6f2d8be4089c","Type":"ContainerStarted","Data":"baf284142362a8de2fdf82336a2b8a11e5b36283b841faa03f93e1ece324cda5"} Feb 19 21:44:02 crc kubenswrapper[4795]: I0219 21:44:02.087776 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-slj65" event={"ID":"5f4d8698-27a0-44a4-87f6-c75d4c3407bc","Type":"ContainerStarted","Data":"b7d243a92321064168826ed57751ca34bd39704c96d5da3b04fc409d20ff5abf"} Feb 19 21:44:02 crc kubenswrapper[4795]: I0219 21:44:02.088020 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-slj65" Feb 19 21:44:02 crc kubenswrapper[4795]: I0219 21:44:02.090206 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79d975b745-qjgvw" event={"ID":"2e80963b-888b-4bb9-9259-864e38dd10ed","Type":"ContainerStarted","Data":"006c821325cef15c64e5965179b893fd15a5c4dfc495c8c3b5b6d2c356c673bc"} Feb 19 21:44:02 crc kubenswrapper[4795]: I0219 21:44:02.090364 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-79d975b745-qjgvw" Feb 19 21:44:02 crc kubenswrapper[4795]: I0219 21:44:02.109978 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-slqxz" podStartSLOduration=3.988492526 podStartE2EDuration="28.109960892s" podCreationTimestamp="2026-02-19 21:43:34 +0000 UTC" firstStartedPulling="2026-02-19 21:43:36.20388255 +0000 UTC m=+927.396400424" lastFinishedPulling="2026-02-19 21:44:00.325350926 +0000 UTC m=+951.517868790" observedRunningTime="2026-02-19 21:44:02.104411045 +0000 UTC m=+953.296928909" watchObservedRunningTime="2026-02-19 21:44:02.109960892 +0000 UTC m=+953.302478766" 
Feb 19 21:44:02 crc kubenswrapper[4795]: I0219 21:44:02.120098 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vv89z" podStartSLOduration=1.941594075 podStartE2EDuration="27.120081468s" podCreationTimestamp="2026-02-19 21:43:35 +0000 UTC" firstStartedPulling="2026-02-19 21:43:36.216018663 +0000 UTC m=+927.408536527" lastFinishedPulling="2026-02-19 21:44:01.394506016 +0000 UTC m=+952.587023920" observedRunningTime="2026-02-19 21:44:02.11697688 +0000 UTC m=+953.309494744" watchObservedRunningTime="2026-02-19 21:44:02.120081468 +0000 UTC m=+953.312599332" Feb 19 21:44:02 crc kubenswrapper[4795]: I0219 21:44:02.147272 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-7866795846-rcjgz" podStartSLOduration=2.783707705 podStartE2EDuration="28.147254806s" podCreationTimestamp="2026-02-19 21:43:34 +0000 UTC" firstStartedPulling="2026-02-19 21:43:36.207857743 +0000 UTC m=+927.400375607" lastFinishedPulling="2026-02-19 21:44:01.571404844 +0000 UTC m=+952.763922708" observedRunningTime="2026-02-19 21:44:02.140492775 +0000 UTC m=+953.333010649" watchObservedRunningTime="2026-02-19 21:44:02.147254806 +0000 UTC m=+953.339772670" Feb 19 21:44:02 crc kubenswrapper[4795]: I0219 21:44:02.163261 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-k4g6z" podStartSLOduration=12.530440925 podStartE2EDuration="25.163247658s" podCreationTimestamp="2026-02-19 21:43:37 +0000 UTC" firstStartedPulling="2026-02-19 21:43:48.936874643 +0000 UTC m=+940.129392507" lastFinishedPulling="2026-02-19 21:44:01.569681376 +0000 UTC m=+952.762199240" observedRunningTime="2026-02-19 21:44:02.159344307 +0000 UTC m=+953.351862171" watchObservedRunningTime="2026-02-19 21:44:02.163247658 +0000 UTC m=+953.355765522" Feb 19 21:44:02 crc kubenswrapper[4795]: I0219 
21:44:02.190784 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-bbtgm" podStartSLOduration=3.006072707 podStartE2EDuration="28.190767795s" podCreationTimestamp="2026-02-19 21:43:34 +0000 UTC" firstStartedPulling="2026-02-19 21:43:36.20987874 +0000 UTC m=+927.402396604" lastFinishedPulling="2026-02-19 21:44:01.394573838 +0000 UTC m=+952.587091692" observedRunningTime="2026-02-19 21:44:02.188558543 +0000 UTC m=+953.381076407" watchObservedRunningTime="2026-02-19 21:44:02.190767795 +0000 UTC m=+953.383285659" Feb 19 21:44:02 crc kubenswrapper[4795]: I0219 21:44:02.238272 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-68f46476f-dqpjx" podStartSLOduration=2.907060849 podStartE2EDuration="28.238258017s" podCreationTimestamp="2026-02-19 21:43:34 +0000 UTC" firstStartedPulling="2026-02-19 21:43:36.190246195 +0000 UTC m=+927.382764059" lastFinishedPulling="2026-02-19 21:44:01.521443363 +0000 UTC m=+952.713961227" observedRunningTime="2026-02-19 21:44:02.218701514 +0000 UTC m=+953.411219378" watchObservedRunningTime="2026-02-19 21:44:02.238258017 +0000 UTC m=+953.430775871" Feb 19 21:44:02 crc kubenswrapper[4795]: I0219 21:44:02.241029 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-79d975b745-qjgvw" podStartSLOduration=17.408283966 podStartE2EDuration="28.241019925s" podCreationTimestamp="2026-02-19 21:43:34 +0000 UTC" firstStartedPulling="2026-02-19 21:43:50.630761286 +0000 UTC m=+941.823279150" lastFinishedPulling="2026-02-19 21:44:01.463497245 +0000 UTC m=+952.656015109" observedRunningTime="2026-02-19 21:44:02.235283483 +0000 UTC m=+953.427801347" watchObservedRunningTime="2026-02-19 21:44:02.241019925 +0000 UTC m=+953.433537789" Feb 19 21:44:06 crc kubenswrapper[4795]: I0219 21:44:06.553131 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/26db9cb2-1ed4-44e4-afac-404ce0f7d445-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p\" (UID: \"26db9cb2-1ed4-44e4-afac-404ce0f7d445\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p" Feb 19 21:44:06 crc kubenswrapper[4795]: I0219 21:44:06.564870 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/26db9cb2-1ed4-44e4-afac-404ce0f7d445-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p\" (UID: \"26db9cb2-1ed4-44e4-afac-404ce0f7d445\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p" Feb 19 21:44:06 crc kubenswrapper[4795]: I0219 21:44:06.693394 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p" Feb 19 21:44:06 crc kubenswrapper[4795]: I0219 21:44:06.858156 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-62xdp\" (UID: \"9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-62xdp" Feb 19 21:44:06 crc kubenswrapper[4795]: I0219 21:44:06.864369 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-62xdp\" (UID: \"9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-62xdp" Feb 19 21:44:06 crc kubenswrapper[4795]: I0219 21:44:06.963716 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-62xdp\" (UID: \"9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-62xdp" Feb 19 21:44:06 crc kubenswrapper[4795]: I0219 21:44:06.969335 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-62xdp\" (UID: \"9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-62xdp" Feb 19 21:44:07 crc kubenswrapper[4795]: I0219 21:44:07.032400 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-slj65" podStartSLOduration=7.652555337 podStartE2EDuration="33.032382009s" podCreationTimestamp="2026-02-19 21:43:34 +0000 UTC" firstStartedPulling="2026-02-19 21:43:36.108745572 +0000 UTC m=+927.301263436" lastFinishedPulling="2026-02-19 21:44:01.488572244 +0000 UTC m=+952.681090108" observedRunningTime="2026-02-19 21:44:02.252036226 +0000 UTC m=+953.444554090" watchObservedRunningTime="2026-02-19 21:44:07.032382009 +0000 UTC m=+958.224899873" Feb 19 21:44:07 crc kubenswrapper[4795]: I0219 21:44:07.037393 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p"] Feb 19 21:44:07 crc kubenswrapper[4795]: I0219 21:44:07.076396 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-62xdp" Feb 19 21:44:07 crc kubenswrapper[4795]: I0219 21:44:07.132221 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p" event={"ID":"26db9cb2-1ed4-44e4-afac-404ce0f7d445","Type":"ContainerStarted","Data":"cc7f0da59fb39ccd0431184ae2e66e0a24fcec4aec550587684ec43bebe26d89"} Feb 19 21:44:07 crc kubenswrapper[4795]: I0219 21:44:07.273696 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-69ff7bc449-62xdp"] Feb 19 21:44:07 crc kubenswrapper[4795]: W0219 21:44:07.281870 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f7c4d4b_cf83_47e2_a75f_e3a2c9658bb4.slice/crio-50a7097d3930b132c8d79e0393eacea33918b0628bd251c142fb763dcc3fb49e WatchSource:0}: Error finding container 50a7097d3930b132c8d79e0393eacea33918b0628bd251c142fb763dcc3fb49e: Status 404 returned error can't find the container with id 50a7097d3930b132c8d79e0393eacea33918b0628bd251c142fb763dcc3fb49e Feb 19 21:44:07 crc kubenswrapper[4795]: I0219 21:44:07.833238 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-k4g6z" Feb 19 21:44:07 crc kubenswrapper[4795]: I0219 21:44:07.833619 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-k4g6z" Feb 19 21:44:07 crc kubenswrapper[4795]: I0219 21:44:07.876241 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-k4g6z" Feb 19 21:44:08 crc kubenswrapper[4795]: I0219 21:44:08.144748 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-62xdp" 
event={"ID":"9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4","Type":"ContainerStarted","Data":"9d22271ac238e5923b60f1ca27d59155cb263d3bc580bed48248b3df363381b5"} Feb 19 21:44:08 crc kubenswrapper[4795]: I0219 21:44:08.144798 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-62xdp" event={"ID":"9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4","Type":"ContainerStarted","Data":"50a7097d3930b132c8d79e0393eacea33918b0628bd251c142fb763dcc3fb49e"} Feb 19 21:44:08 crc kubenswrapper[4795]: I0219 21:44:08.176799 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-62xdp" podStartSLOduration=34.176776806 podStartE2EDuration="34.176776806s" podCreationTimestamp="2026-02-19 21:43:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:44:08.168277515 +0000 UTC m=+959.360795389" watchObservedRunningTime="2026-02-19 21:44:08.176776806 +0000 UTC m=+959.369294670" Feb 19 21:44:08 crc kubenswrapper[4795]: I0219 21:44:08.188518 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-k4g6z" Feb 19 21:44:08 crc kubenswrapper[4795]: I0219 21:44:08.240848 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-k4g6z"] Feb 19 21:44:09 crc kubenswrapper[4795]: I0219 21:44:09.152965 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p" event={"ID":"26db9cb2-1ed4-44e4-afac-404ce0f7d445","Type":"ContainerStarted","Data":"ad9d2127a6df34b718fac8fe15e8d0f55a9788d56a57d4eaa932cc52e7b52096"} Feb 19 21:44:09 crc kubenswrapper[4795]: I0219 21:44:09.153053 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p" Feb 19 21:44:09 crc kubenswrapper[4795]: I0219 21:44:09.154799 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-62xdp" Feb 19 21:44:09 crc kubenswrapper[4795]: I0219 21:44:09.188059 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p" podStartSLOduration=33.524170855 podStartE2EDuration="35.18803245s" podCreationTimestamp="2026-02-19 21:43:34 +0000 UTC" firstStartedPulling="2026-02-19 21:44:07.044132991 +0000 UTC m=+958.236650855" lastFinishedPulling="2026-02-19 21:44:08.707994546 +0000 UTC m=+959.900512450" observedRunningTime="2026-02-19 21:44:09.180048514 +0000 UTC m=+960.372566408" watchObservedRunningTime="2026-02-19 21:44:09.18803245 +0000 UTC m=+960.380550314" Feb 19 21:44:10 crc kubenswrapper[4795]: I0219 21:44:10.159258 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-k4g6z" podUID="487e5483-759b-49c8-a347-f9a3ecd255ff" containerName="registry-server" containerID="cri-o://68160f45c546f829b363d820c59fae78a8f1b1106eedaa437c9437857e2b4cab" gracePeriod=2 Feb 19 21:44:10 crc kubenswrapper[4795]: I0219 21:44:10.279958 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-79d975b745-qjgvw" Feb 19 21:44:10 crc kubenswrapper[4795]: I0219 21:44:10.570449 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k4g6z" Feb 19 21:44:10 crc kubenswrapper[4795]: I0219 21:44:10.624052 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/487e5483-759b-49c8-a347-f9a3ecd255ff-utilities\") pod \"487e5483-759b-49c8-a347-f9a3ecd255ff\" (UID: \"487e5483-759b-49c8-a347-f9a3ecd255ff\") " Feb 19 21:44:10 crc kubenswrapper[4795]: I0219 21:44:10.624777 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vm6kz\" (UniqueName: \"kubernetes.io/projected/487e5483-759b-49c8-a347-f9a3ecd255ff-kube-api-access-vm6kz\") pod \"487e5483-759b-49c8-a347-f9a3ecd255ff\" (UID: \"487e5483-759b-49c8-a347-f9a3ecd255ff\") " Feb 19 21:44:10 crc kubenswrapper[4795]: I0219 21:44:10.624813 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/487e5483-759b-49c8-a347-f9a3ecd255ff-catalog-content\") pod \"487e5483-759b-49c8-a347-f9a3ecd255ff\" (UID: \"487e5483-759b-49c8-a347-f9a3ecd255ff\") " Feb 19 21:44:10 crc kubenswrapper[4795]: I0219 21:44:10.625016 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/487e5483-759b-49c8-a347-f9a3ecd255ff-utilities" (OuterVolumeSpecName: "utilities") pod "487e5483-759b-49c8-a347-f9a3ecd255ff" (UID: "487e5483-759b-49c8-a347-f9a3ecd255ff"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:44:10 crc kubenswrapper[4795]: I0219 21:44:10.625371 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/487e5483-759b-49c8-a347-f9a3ecd255ff-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 21:44:10 crc kubenswrapper[4795]: I0219 21:44:10.638467 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/487e5483-759b-49c8-a347-f9a3ecd255ff-kube-api-access-vm6kz" (OuterVolumeSpecName: "kube-api-access-vm6kz") pod "487e5483-759b-49c8-a347-f9a3ecd255ff" (UID: "487e5483-759b-49c8-a347-f9a3ecd255ff"). InnerVolumeSpecName "kube-api-access-vm6kz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:44:10 crc kubenswrapper[4795]: I0219 21:44:10.654101 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/487e5483-759b-49c8-a347-f9a3ecd255ff-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "487e5483-759b-49c8-a347-f9a3ecd255ff" (UID: "487e5483-759b-49c8-a347-f9a3ecd255ff"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:44:10 crc kubenswrapper[4795]: I0219 21:44:10.726655 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vm6kz\" (UniqueName: \"kubernetes.io/projected/487e5483-759b-49c8-a347-f9a3ecd255ff-kube-api-access-vm6kz\") on node \"crc\" DevicePath \"\"" Feb 19 21:44:10 crc kubenswrapper[4795]: I0219 21:44:10.726696 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/487e5483-759b-49c8-a347-f9a3ecd255ff-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 21:44:11 crc kubenswrapper[4795]: I0219 21:44:11.166314 4795 generic.go:334] "Generic (PLEG): container finished" podID="487e5483-759b-49c8-a347-f9a3ecd255ff" containerID="68160f45c546f829b363d820c59fae78a8f1b1106eedaa437c9437857e2b4cab" exitCode=0 Feb 19 21:44:11 crc kubenswrapper[4795]: I0219 21:44:11.166356 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k4g6z" event={"ID":"487e5483-759b-49c8-a347-f9a3ecd255ff","Type":"ContainerDied","Data":"68160f45c546f829b363d820c59fae78a8f1b1106eedaa437c9437857e2b4cab"} Feb 19 21:44:11 crc kubenswrapper[4795]: I0219 21:44:11.166382 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k4g6z" event={"ID":"487e5483-759b-49c8-a347-f9a3ecd255ff","Type":"ContainerDied","Data":"bd29711f0be87b24f29b013ce08c362b1111de6a1cccb18d373555addaf3b5a6"} Feb 19 21:44:11 crc kubenswrapper[4795]: I0219 21:44:11.166399 4795 scope.go:117] "RemoveContainer" containerID="68160f45c546f829b363d820c59fae78a8f1b1106eedaa437c9437857e2b4cab" Feb 19 21:44:11 crc kubenswrapper[4795]: I0219 21:44:11.166456 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k4g6z" Feb 19 21:44:11 crc kubenswrapper[4795]: I0219 21:44:11.189017 4795 scope.go:117] "RemoveContainer" containerID="1dc2b16630ee0b1fd4308fd6f7f8f7e036a08702c8a58c1dc3063c3b886c95cb" Feb 19 21:44:11 crc kubenswrapper[4795]: I0219 21:44:11.204158 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-k4g6z"] Feb 19 21:44:11 crc kubenswrapper[4795]: I0219 21:44:11.212198 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-k4g6z"] Feb 19 21:44:11 crc kubenswrapper[4795]: I0219 21:44:11.232480 4795 scope.go:117] "RemoveContainer" containerID="1899fef31d4fa94097bca46bd05c65b6b55ad2778a2534f09b28eef318d77d2d" Feb 19 21:44:11 crc kubenswrapper[4795]: I0219 21:44:11.252501 4795 scope.go:117] "RemoveContainer" containerID="68160f45c546f829b363d820c59fae78a8f1b1106eedaa437c9437857e2b4cab" Feb 19 21:44:11 crc kubenswrapper[4795]: E0219 21:44:11.252996 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68160f45c546f829b363d820c59fae78a8f1b1106eedaa437c9437857e2b4cab\": container with ID starting with 68160f45c546f829b363d820c59fae78a8f1b1106eedaa437c9437857e2b4cab not found: ID does not exist" containerID="68160f45c546f829b363d820c59fae78a8f1b1106eedaa437c9437857e2b4cab" Feb 19 21:44:11 crc kubenswrapper[4795]: I0219 21:44:11.253068 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68160f45c546f829b363d820c59fae78a8f1b1106eedaa437c9437857e2b4cab"} err="failed to get container status \"68160f45c546f829b363d820c59fae78a8f1b1106eedaa437c9437857e2b4cab\": rpc error: code = NotFound desc = could not find container \"68160f45c546f829b363d820c59fae78a8f1b1106eedaa437c9437857e2b4cab\": container with ID starting with 68160f45c546f829b363d820c59fae78a8f1b1106eedaa437c9437857e2b4cab not found: 
ID does not exist" Feb 19 21:44:11 crc kubenswrapper[4795]: I0219 21:44:11.253093 4795 scope.go:117] "RemoveContainer" containerID="1dc2b16630ee0b1fd4308fd6f7f8f7e036a08702c8a58c1dc3063c3b886c95cb" Feb 19 21:44:11 crc kubenswrapper[4795]: E0219 21:44:11.253456 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1dc2b16630ee0b1fd4308fd6f7f8f7e036a08702c8a58c1dc3063c3b886c95cb\": container with ID starting with 1dc2b16630ee0b1fd4308fd6f7f8f7e036a08702c8a58c1dc3063c3b886c95cb not found: ID does not exist" containerID="1dc2b16630ee0b1fd4308fd6f7f8f7e036a08702c8a58c1dc3063c3b886c95cb" Feb 19 21:44:11 crc kubenswrapper[4795]: I0219 21:44:11.253517 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1dc2b16630ee0b1fd4308fd6f7f8f7e036a08702c8a58c1dc3063c3b886c95cb"} err="failed to get container status \"1dc2b16630ee0b1fd4308fd6f7f8f7e036a08702c8a58c1dc3063c3b886c95cb\": rpc error: code = NotFound desc = could not find container \"1dc2b16630ee0b1fd4308fd6f7f8f7e036a08702c8a58c1dc3063c3b886c95cb\": container with ID starting with 1dc2b16630ee0b1fd4308fd6f7f8f7e036a08702c8a58c1dc3063c3b886c95cb not found: ID does not exist" Feb 19 21:44:11 crc kubenswrapper[4795]: I0219 21:44:11.253559 4795 scope.go:117] "RemoveContainer" containerID="1899fef31d4fa94097bca46bd05c65b6b55ad2778a2534f09b28eef318d77d2d" Feb 19 21:44:11 crc kubenswrapper[4795]: E0219 21:44:11.253973 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1899fef31d4fa94097bca46bd05c65b6b55ad2778a2534f09b28eef318d77d2d\": container with ID starting with 1899fef31d4fa94097bca46bd05c65b6b55ad2778a2534f09b28eef318d77d2d not found: ID does not exist" containerID="1899fef31d4fa94097bca46bd05c65b6b55ad2778a2534f09b28eef318d77d2d" Feb 19 21:44:11 crc kubenswrapper[4795]: I0219 21:44:11.254027 4795 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1899fef31d4fa94097bca46bd05c65b6b55ad2778a2534f09b28eef318d77d2d"} err="failed to get container status \"1899fef31d4fa94097bca46bd05c65b6b55ad2778a2534f09b28eef318d77d2d\": rpc error: code = NotFound desc = could not find container \"1899fef31d4fa94097bca46bd05c65b6b55ad2778a2534f09b28eef318d77d2d\": container with ID starting with 1899fef31d4fa94097bca46bd05c65b6b55ad2778a2534f09b28eef318d77d2d not found: ID does not exist" Feb 19 21:44:11 crc kubenswrapper[4795]: I0219 21:44:11.519724 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="487e5483-759b-49c8-a347-f9a3ecd255ff" path="/var/lib/kubelet/pods/487e5483-759b-49c8-a347-f9a3ecd255ff/volumes" Feb 19 21:44:15 crc kubenswrapper[4795]: I0219 21:44:15.187320 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-68f46476f-dqpjx" Feb 19 21:44:15 crc kubenswrapper[4795]: I0219 21:44:15.203701 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-slj65" Feb 19 21:44:15 crc kubenswrapper[4795]: I0219 21:44:15.230843 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-7866795846-rcjgz" Feb 19 21:44:15 crc kubenswrapper[4795]: I0219 21:44:15.249292 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-slqxz" Feb 19 21:44:15 crc kubenswrapper[4795]: I0219 21:44:15.261986 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-bbtgm" Feb 19 21:44:16 crc kubenswrapper[4795]: I0219 21:44:16.699343 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p" Feb 19 21:44:17 crc kubenswrapper[4795]: I0219 21:44:17.085203 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-62xdp" Feb 19 21:44:33 crc kubenswrapper[4795]: I0219 21:44:33.597156 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-855cbc58c5-xj6c9"] Feb 19 21:44:33 crc kubenswrapper[4795]: E0219 21:44:33.598067 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="487e5483-759b-49c8-a347-f9a3ecd255ff" containerName="extract-utilities" Feb 19 21:44:33 crc kubenswrapper[4795]: I0219 21:44:33.598085 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="487e5483-759b-49c8-a347-f9a3ecd255ff" containerName="extract-utilities" Feb 19 21:44:33 crc kubenswrapper[4795]: E0219 21:44:33.598101 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="487e5483-759b-49c8-a347-f9a3ecd255ff" containerName="registry-server" Feb 19 21:44:33 crc kubenswrapper[4795]: I0219 21:44:33.598111 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="487e5483-759b-49c8-a347-f9a3ecd255ff" containerName="registry-server" Feb 19 21:44:33 crc kubenswrapper[4795]: E0219 21:44:33.598131 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="487e5483-759b-49c8-a347-f9a3ecd255ff" containerName="extract-content" Feb 19 21:44:33 crc kubenswrapper[4795]: I0219 21:44:33.598140 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="487e5483-759b-49c8-a347-f9a3ecd255ff" containerName="extract-content" Feb 19 21:44:33 crc kubenswrapper[4795]: I0219 21:44:33.598331 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="487e5483-759b-49c8-a347-f9a3ecd255ff" containerName="registry-server" Feb 19 21:44:33 crc kubenswrapper[4795]: I0219 21:44:33.599330 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-855cbc58c5-xj6c9" Feb 19 21:44:33 crc kubenswrapper[4795]: I0219 21:44:33.604142 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Feb 19 21:44:33 crc kubenswrapper[4795]: I0219 21:44:33.604485 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-8lfjr" Feb 19 21:44:33 crc kubenswrapper[4795]: I0219 21:44:33.604734 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Feb 19 21:44:33 crc kubenswrapper[4795]: I0219 21:44:33.604976 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Feb 19 21:44:33 crc kubenswrapper[4795]: I0219 21:44:33.609725 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-855cbc58c5-xj6c9"] Feb 19 21:44:33 crc kubenswrapper[4795]: I0219 21:44:33.648691 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40671af3-115e-495a-bdf5-34580fffdc69-config\") pod \"dnsmasq-dns-855cbc58c5-xj6c9\" (UID: \"40671af3-115e-495a-bdf5-34580fffdc69\") " pod="openstack/dnsmasq-dns-855cbc58c5-xj6c9" Feb 19 21:44:33 crc kubenswrapper[4795]: I0219 21:44:33.648735 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pvqh\" (UniqueName: \"kubernetes.io/projected/40671af3-115e-495a-bdf5-34580fffdc69-kube-api-access-2pvqh\") pod \"dnsmasq-dns-855cbc58c5-xj6c9\" (UID: \"40671af3-115e-495a-bdf5-34580fffdc69\") " pod="openstack/dnsmasq-dns-855cbc58c5-xj6c9" Feb 19 21:44:33 crc kubenswrapper[4795]: I0219 21:44:33.694049 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6fcf94d689-ppz5s"] Feb 19 21:44:33 crc kubenswrapper[4795]: I0219 21:44:33.695760 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6fcf94d689-ppz5s" Feb 19 21:44:33 crc kubenswrapper[4795]: I0219 21:44:33.704321 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Feb 19 21:44:33 crc kubenswrapper[4795]: I0219 21:44:33.712493 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6fcf94d689-ppz5s"] Feb 19 21:44:33 crc kubenswrapper[4795]: I0219 21:44:33.749985 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40671af3-115e-495a-bdf5-34580fffdc69-config\") pod \"dnsmasq-dns-855cbc58c5-xj6c9\" (UID: \"40671af3-115e-495a-bdf5-34580fffdc69\") " pod="openstack/dnsmasq-dns-855cbc58c5-xj6c9" Feb 19 21:44:33 crc kubenswrapper[4795]: I0219 21:44:33.750030 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pvqh\" (UniqueName: \"kubernetes.io/projected/40671af3-115e-495a-bdf5-34580fffdc69-kube-api-access-2pvqh\") pod \"dnsmasq-dns-855cbc58c5-xj6c9\" (UID: \"40671af3-115e-495a-bdf5-34580fffdc69\") " pod="openstack/dnsmasq-dns-855cbc58c5-xj6c9" Feb 19 21:44:33 crc kubenswrapper[4795]: I0219 21:44:33.750063 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8f68f295-3b94-4e1f-8e9d-ba49ffe5198e-dns-svc\") pod \"dnsmasq-dns-6fcf94d689-ppz5s\" (UID: \"8f68f295-3b94-4e1f-8e9d-ba49ffe5198e\") " pod="openstack/dnsmasq-dns-6fcf94d689-ppz5s" Feb 19 21:44:33 crc kubenswrapper[4795]: I0219 21:44:33.750079 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnfdk\" (UniqueName: \"kubernetes.io/projected/8f68f295-3b94-4e1f-8e9d-ba49ffe5198e-kube-api-access-tnfdk\") pod \"dnsmasq-dns-6fcf94d689-ppz5s\" (UID: \"8f68f295-3b94-4e1f-8e9d-ba49ffe5198e\") " pod="openstack/dnsmasq-dns-6fcf94d689-ppz5s" Feb 19 21:44:33 
crc kubenswrapper[4795]: I0219 21:44:33.750123 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f68f295-3b94-4e1f-8e9d-ba49ffe5198e-config\") pod \"dnsmasq-dns-6fcf94d689-ppz5s\" (UID: \"8f68f295-3b94-4e1f-8e9d-ba49ffe5198e\") " pod="openstack/dnsmasq-dns-6fcf94d689-ppz5s" Feb 19 21:44:33 crc kubenswrapper[4795]: I0219 21:44:33.750921 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40671af3-115e-495a-bdf5-34580fffdc69-config\") pod \"dnsmasq-dns-855cbc58c5-xj6c9\" (UID: \"40671af3-115e-495a-bdf5-34580fffdc69\") " pod="openstack/dnsmasq-dns-855cbc58c5-xj6c9" Feb 19 21:44:33 crc kubenswrapper[4795]: I0219 21:44:33.782944 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pvqh\" (UniqueName: \"kubernetes.io/projected/40671af3-115e-495a-bdf5-34580fffdc69-kube-api-access-2pvqh\") pod \"dnsmasq-dns-855cbc58c5-xj6c9\" (UID: \"40671af3-115e-495a-bdf5-34580fffdc69\") " pod="openstack/dnsmasq-dns-855cbc58c5-xj6c9" Feb 19 21:44:33 crc kubenswrapper[4795]: I0219 21:44:33.851661 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f68f295-3b94-4e1f-8e9d-ba49ffe5198e-config\") pod \"dnsmasq-dns-6fcf94d689-ppz5s\" (UID: \"8f68f295-3b94-4e1f-8e9d-ba49ffe5198e\") " pod="openstack/dnsmasq-dns-6fcf94d689-ppz5s" Feb 19 21:44:33 crc kubenswrapper[4795]: I0219 21:44:33.851769 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8f68f295-3b94-4e1f-8e9d-ba49ffe5198e-dns-svc\") pod \"dnsmasq-dns-6fcf94d689-ppz5s\" (UID: \"8f68f295-3b94-4e1f-8e9d-ba49ffe5198e\") " pod="openstack/dnsmasq-dns-6fcf94d689-ppz5s" Feb 19 21:44:33 crc kubenswrapper[4795]: I0219 21:44:33.851789 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-tnfdk\" (UniqueName: \"kubernetes.io/projected/8f68f295-3b94-4e1f-8e9d-ba49ffe5198e-kube-api-access-tnfdk\") pod \"dnsmasq-dns-6fcf94d689-ppz5s\" (UID: \"8f68f295-3b94-4e1f-8e9d-ba49ffe5198e\") " pod="openstack/dnsmasq-dns-6fcf94d689-ppz5s" Feb 19 21:44:33 crc kubenswrapper[4795]: I0219 21:44:33.852913 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8f68f295-3b94-4e1f-8e9d-ba49ffe5198e-dns-svc\") pod \"dnsmasq-dns-6fcf94d689-ppz5s\" (UID: \"8f68f295-3b94-4e1f-8e9d-ba49ffe5198e\") " pod="openstack/dnsmasq-dns-6fcf94d689-ppz5s" Feb 19 21:44:33 crc kubenswrapper[4795]: I0219 21:44:33.853811 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f68f295-3b94-4e1f-8e9d-ba49ffe5198e-config\") pod \"dnsmasq-dns-6fcf94d689-ppz5s\" (UID: \"8f68f295-3b94-4e1f-8e9d-ba49ffe5198e\") " pod="openstack/dnsmasq-dns-6fcf94d689-ppz5s" Feb 19 21:44:33 crc kubenswrapper[4795]: I0219 21:44:33.870418 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnfdk\" (UniqueName: \"kubernetes.io/projected/8f68f295-3b94-4e1f-8e9d-ba49ffe5198e-kube-api-access-tnfdk\") pod \"dnsmasq-dns-6fcf94d689-ppz5s\" (UID: \"8f68f295-3b94-4e1f-8e9d-ba49ffe5198e\") " pod="openstack/dnsmasq-dns-6fcf94d689-ppz5s" Feb 19 21:44:33 crc kubenswrapper[4795]: I0219 21:44:33.916104 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-855cbc58c5-xj6c9" Feb 19 21:44:34 crc kubenswrapper[4795]: I0219 21:44:34.019597 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6fcf94d689-ppz5s" Feb 19 21:44:34 crc kubenswrapper[4795]: I0219 21:44:34.276315 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6fcf94d689-ppz5s"] Feb 19 21:44:34 crc kubenswrapper[4795]: I0219 21:44:34.333394 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fcf94d689-ppz5s" event={"ID":"8f68f295-3b94-4e1f-8e9d-ba49ffe5198e","Type":"ContainerStarted","Data":"d707f3ffd06ea35995047e01b9f2f18e202cdb98a74e19c73116f0e7ffea06b9"} Feb 19 21:44:34 crc kubenswrapper[4795]: I0219 21:44:34.355715 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-855cbc58c5-xj6c9"] Feb 19 21:44:34 crc kubenswrapper[4795]: W0219 21:44:34.362553 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod40671af3_115e_495a_bdf5_34580fffdc69.slice/crio-d65b0d756b7de08be8b702a4b9a1293091aada93e842a06930928d0056a27a42 WatchSource:0}: Error finding container d65b0d756b7de08be8b702a4b9a1293091aada93e842a06930928d0056a27a42: Status 404 returned error can't find the container with id d65b0d756b7de08be8b702a4b9a1293091aada93e842a06930928d0056a27a42 Feb 19 21:44:35 crc kubenswrapper[4795]: I0219 21:44:35.343730 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-855cbc58c5-xj6c9" event={"ID":"40671af3-115e-495a-bdf5-34580fffdc69","Type":"ContainerStarted","Data":"d65b0d756b7de08be8b702a4b9a1293091aada93e842a06930928d0056a27a42"} Feb 19 21:44:35 crc kubenswrapper[4795]: I0219 21:44:35.906070 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6fcf94d689-ppz5s"] Feb 19 21:44:35 crc kubenswrapper[4795]: I0219 21:44:35.929052 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-f54874ffc-9rmlj"] Feb 19 21:44:35 crc kubenswrapper[4795]: I0219 21:44:35.968511 4795 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/dnsmasq-dns-f54874ffc-9rmlj"] Feb 19 21:44:35 crc kubenswrapper[4795]: I0219 21:44:35.968651 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f54874ffc-9rmlj" Feb 19 21:44:35 crc kubenswrapper[4795]: I0219 21:44:35.980485 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zdpq\" (UniqueName: \"kubernetes.io/projected/880246d9-9662-47e8-a0ff-5d2aca6de029-kube-api-access-2zdpq\") pod \"dnsmasq-dns-f54874ffc-9rmlj\" (UID: \"880246d9-9662-47e8-a0ff-5d2aca6de029\") " pod="openstack/dnsmasq-dns-f54874ffc-9rmlj" Feb 19 21:44:35 crc kubenswrapper[4795]: I0219 21:44:35.980625 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/880246d9-9662-47e8-a0ff-5d2aca6de029-config\") pod \"dnsmasq-dns-f54874ffc-9rmlj\" (UID: \"880246d9-9662-47e8-a0ff-5d2aca6de029\") " pod="openstack/dnsmasq-dns-f54874ffc-9rmlj" Feb 19 21:44:35 crc kubenswrapper[4795]: I0219 21:44:35.980676 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/880246d9-9662-47e8-a0ff-5d2aca6de029-dns-svc\") pod \"dnsmasq-dns-f54874ffc-9rmlj\" (UID: \"880246d9-9662-47e8-a0ff-5d2aca6de029\") " pod="openstack/dnsmasq-dns-f54874ffc-9rmlj" Feb 19 21:44:36 crc kubenswrapper[4795]: I0219 21:44:36.081694 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/880246d9-9662-47e8-a0ff-5d2aca6de029-config\") pod \"dnsmasq-dns-f54874ffc-9rmlj\" (UID: \"880246d9-9662-47e8-a0ff-5d2aca6de029\") " pod="openstack/dnsmasq-dns-f54874ffc-9rmlj" Feb 19 21:44:36 crc kubenswrapper[4795]: I0219 21:44:36.081779 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/880246d9-9662-47e8-a0ff-5d2aca6de029-dns-svc\") pod \"dnsmasq-dns-f54874ffc-9rmlj\" (UID: \"880246d9-9662-47e8-a0ff-5d2aca6de029\") " pod="openstack/dnsmasq-dns-f54874ffc-9rmlj" Feb 19 21:44:36 crc kubenswrapper[4795]: I0219 21:44:36.081841 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zdpq\" (UniqueName: \"kubernetes.io/projected/880246d9-9662-47e8-a0ff-5d2aca6de029-kube-api-access-2zdpq\") pod \"dnsmasq-dns-f54874ffc-9rmlj\" (UID: \"880246d9-9662-47e8-a0ff-5d2aca6de029\") " pod="openstack/dnsmasq-dns-f54874ffc-9rmlj" Feb 19 21:44:36 crc kubenswrapper[4795]: I0219 21:44:36.083300 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/880246d9-9662-47e8-a0ff-5d2aca6de029-config\") pod \"dnsmasq-dns-f54874ffc-9rmlj\" (UID: \"880246d9-9662-47e8-a0ff-5d2aca6de029\") " pod="openstack/dnsmasq-dns-f54874ffc-9rmlj" Feb 19 21:44:36 crc kubenswrapper[4795]: I0219 21:44:36.083342 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/880246d9-9662-47e8-a0ff-5d2aca6de029-dns-svc\") pod \"dnsmasq-dns-f54874ffc-9rmlj\" (UID: \"880246d9-9662-47e8-a0ff-5d2aca6de029\") " pod="openstack/dnsmasq-dns-f54874ffc-9rmlj" Feb 19 21:44:36 crc kubenswrapper[4795]: I0219 21:44:36.107958 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zdpq\" (UniqueName: \"kubernetes.io/projected/880246d9-9662-47e8-a0ff-5d2aca6de029-kube-api-access-2zdpq\") pod \"dnsmasq-dns-f54874ffc-9rmlj\" (UID: \"880246d9-9662-47e8-a0ff-5d2aca6de029\") " pod="openstack/dnsmasq-dns-f54874ffc-9rmlj" Feb 19 21:44:36 crc kubenswrapper[4795]: I0219 21:44:36.293044 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f54874ffc-9rmlj" Feb 19 21:44:36 crc kubenswrapper[4795]: I0219 21:44:36.455142 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-855cbc58c5-xj6c9"] Feb 19 21:44:36 crc kubenswrapper[4795]: I0219 21:44:36.490614 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67ff45466c-74c7x"] Feb 19 21:44:36 crc kubenswrapper[4795]: I0219 21:44:36.493621 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67ff45466c-74c7x" Feb 19 21:44:36 crc kubenswrapper[4795]: I0219 21:44:36.495153 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67ff45466c-74c7x"] Feb 19 21:44:36 crc kubenswrapper[4795]: I0219 21:44:36.592979 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ss4wh\" (UniqueName: \"kubernetes.io/projected/5cb7920e-685c-4bb7-b276-3bf902251bd7-kube-api-access-ss4wh\") pod \"dnsmasq-dns-67ff45466c-74c7x\" (UID: \"5cb7920e-685c-4bb7-b276-3bf902251bd7\") " pod="openstack/dnsmasq-dns-67ff45466c-74c7x" Feb 19 21:44:36 crc kubenswrapper[4795]: I0219 21:44:36.593374 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5cb7920e-685c-4bb7-b276-3bf902251bd7-config\") pod \"dnsmasq-dns-67ff45466c-74c7x\" (UID: \"5cb7920e-685c-4bb7-b276-3bf902251bd7\") " pod="openstack/dnsmasq-dns-67ff45466c-74c7x" Feb 19 21:44:36 crc kubenswrapper[4795]: I0219 21:44:36.593447 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5cb7920e-685c-4bb7-b276-3bf902251bd7-dns-svc\") pod \"dnsmasq-dns-67ff45466c-74c7x\" (UID: \"5cb7920e-685c-4bb7-b276-3bf902251bd7\") " pod="openstack/dnsmasq-dns-67ff45466c-74c7x" Feb 19 21:44:36 crc kubenswrapper[4795]: I0219 
21:44:36.694390 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5cb7920e-685c-4bb7-b276-3bf902251bd7-dns-svc\") pod \"dnsmasq-dns-67ff45466c-74c7x\" (UID: \"5cb7920e-685c-4bb7-b276-3bf902251bd7\") " pod="openstack/dnsmasq-dns-67ff45466c-74c7x" Feb 19 21:44:36 crc kubenswrapper[4795]: I0219 21:44:36.694512 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ss4wh\" (UniqueName: \"kubernetes.io/projected/5cb7920e-685c-4bb7-b276-3bf902251bd7-kube-api-access-ss4wh\") pod \"dnsmasq-dns-67ff45466c-74c7x\" (UID: \"5cb7920e-685c-4bb7-b276-3bf902251bd7\") " pod="openstack/dnsmasq-dns-67ff45466c-74c7x" Feb 19 21:44:36 crc kubenswrapper[4795]: I0219 21:44:36.694623 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5cb7920e-685c-4bb7-b276-3bf902251bd7-config\") pod \"dnsmasq-dns-67ff45466c-74c7x\" (UID: \"5cb7920e-685c-4bb7-b276-3bf902251bd7\") " pod="openstack/dnsmasq-dns-67ff45466c-74c7x" Feb 19 21:44:36 crc kubenswrapper[4795]: I0219 21:44:36.695388 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5cb7920e-685c-4bb7-b276-3bf902251bd7-config\") pod \"dnsmasq-dns-67ff45466c-74c7x\" (UID: \"5cb7920e-685c-4bb7-b276-3bf902251bd7\") " pod="openstack/dnsmasq-dns-67ff45466c-74c7x" Feb 19 21:44:36 crc kubenswrapper[4795]: I0219 21:44:36.695863 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5cb7920e-685c-4bb7-b276-3bf902251bd7-dns-svc\") pod \"dnsmasq-dns-67ff45466c-74c7x\" (UID: \"5cb7920e-685c-4bb7-b276-3bf902251bd7\") " pod="openstack/dnsmasq-dns-67ff45466c-74c7x" Feb 19 21:44:36 crc kubenswrapper[4795]: I0219 21:44:36.722099 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ss4wh\" 
(UniqueName: \"kubernetes.io/projected/5cb7920e-685c-4bb7-b276-3bf902251bd7-kube-api-access-ss4wh\") pod \"dnsmasq-dns-67ff45466c-74c7x\" (UID: \"5cb7920e-685c-4bb7-b276-3bf902251bd7\") " pod="openstack/dnsmasq-dns-67ff45466c-74c7x" Feb 19 21:44:36 crc kubenswrapper[4795]: I0219 21:44:36.808848 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f54874ffc-9rmlj"] Feb 19 21:44:36 crc kubenswrapper[4795]: I0219 21:44:36.823380 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67ff45466c-74c7x" Feb 19 21:44:36 crc kubenswrapper[4795]: W0219 21:44:36.853864 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod880246d9_9662_47e8_a0ff_5d2aca6de029.slice/crio-236c5910103a0e9c6c7b85c7e035781b7dc5db0e7378799d12ebd2dd79bce1b2 WatchSource:0}: Error finding container 236c5910103a0e9c6c7b85c7e035781b7dc5db0e7378799d12ebd2dd79bce1b2: Status 404 returned error can't find the container with id 236c5910103a0e9c6c7b85c7e035781b7dc5db0e7378799d12ebd2dd79bce1b2 Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.068607 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.071825 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.074377 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.074509 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.084851 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.084893 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-pkz5l" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.085080 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.086390 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.087524 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.094297 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.206137 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7b096325-542d-4ac6-8d16-8aa0937013b2-pod-info\") pod \"rabbitmq-server-0\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " pod="openstack/rabbitmq-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.206247 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/7b096325-542d-4ac6-8d16-8aa0937013b2-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " pod="openstack/rabbitmq-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.206306 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7b096325-542d-4ac6-8d16-8aa0937013b2-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " pod="openstack/rabbitmq-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.206353 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7b096325-542d-4ac6-8d16-8aa0937013b2-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " pod="openstack/rabbitmq-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.206369 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cwvp\" (UniqueName: \"kubernetes.io/projected/7b096325-542d-4ac6-8d16-8aa0937013b2-kube-api-access-7cwvp\") pod \"rabbitmq-server-0\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " pod="openstack/rabbitmq-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.206408 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7b096325-542d-4ac6-8d16-8aa0937013b2-config-data\") pod \"rabbitmq-server-0\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " pod="openstack/rabbitmq-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.206440 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/7b096325-542d-4ac6-8d16-8aa0937013b2-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " pod="openstack/rabbitmq-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.206456 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7b096325-542d-4ac6-8d16-8aa0937013b2-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " pod="openstack/rabbitmq-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.206472 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7b096325-542d-4ac6-8d16-8aa0937013b2-server-conf\") pod \"rabbitmq-server-0\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " pod="openstack/rabbitmq-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.206488 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7b096325-542d-4ac6-8d16-8aa0937013b2-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " pod="openstack/rabbitmq-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.206521 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " pod="openstack/rabbitmq-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.307602 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7b096325-542d-4ac6-8d16-8aa0937013b2-erlang-cookie-secret\") pod 
\"rabbitmq-server-0\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " pod="openstack/rabbitmq-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.307647 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7b096325-542d-4ac6-8d16-8aa0937013b2-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " pod="openstack/rabbitmq-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.307668 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7b096325-542d-4ac6-8d16-8aa0937013b2-server-conf\") pod \"rabbitmq-server-0\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " pod="openstack/rabbitmq-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.307688 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7b096325-542d-4ac6-8d16-8aa0937013b2-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " pod="openstack/rabbitmq-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.307758 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " pod="openstack/rabbitmq-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.307832 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7b096325-542d-4ac6-8d16-8aa0937013b2-pod-info\") pod \"rabbitmq-server-0\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " pod="openstack/rabbitmq-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.307868 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7b096325-542d-4ac6-8d16-8aa0937013b2-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " pod="openstack/rabbitmq-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.307902 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7b096325-542d-4ac6-8d16-8aa0937013b2-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " pod="openstack/rabbitmq-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.307941 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7b096325-542d-4ac6-8d16-8aa0937013b2-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " pod="openstack/rabbitmq-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.307960 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cwvp\" (UniqueName: \"kubernetes.io/projected/7b096325-542d-4ac6-8d16-8aa0937013b2-kube-api-access-7cwvp\") pod \"rabbitmq-server-0\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " pod="openstack/rabbitmq-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.307998 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7b096325-542d-4ac6-8d16-8aa0937013b2-config-data\") pod \"rabbitmq-server-0\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " pod="openstack/rabbitmq-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.308280 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.308887 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7b096325-542d-4ac6-8d16-8aa0937013b2-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " pod="openstack/rabbitmq-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.309561 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7b096325-542d-4ac6-8d16-8aa0937013b2-config-data\") pod \"rabbitmq-server-0\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " pod="openstack/rabbitmq-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.309750 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7b096325-542d-4ac6-8d16-8aa0937013b2-server-conf\") pod \"rabbitmq-server-0\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " pod="openstack/rabbitmq-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.310054 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7b096325-542d-4ac6-8d16-8aa0937013b2-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " pod="openstack/rabbitmq-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.310145 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7b096325-542d-4ac6-8d16-8aa0937013b2-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " 
pod="openstack/rabbitmq-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.311734 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7b096325-542d-4ac6-8d16-8aa0937013b2-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " pod="openstack/rabbitmq-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.314347 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7b096325-542d-4ac6-8d16-8aa0937013b2-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " pod="openstack/rabbitmq-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.314593 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7b096325-542d-4ac6-8d16-8aa0937013b2-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " pod="openstack/rabbitmq-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.320520 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7b096325-542d-4ac6-8d16-8aa0937013b2-pod-info\") pod \"rabbitmq-server-0\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " pod="openstack/rabbitmq-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.325447 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cwvp\" (UniqueName: \"kubernetes.io/projected/7b096325-542d-4ac6-8d16-8aa0937013b2-kube-api-access-7cwvp\") pod \"rabbitmq-server-0\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " pod="openstack/rabbitmq-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.332019 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " pod="openstack/rabbitmq-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.342274 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67ff45466c-74c7x"] Feb 19 21:44:37 crc kubenswrapper[4795]: W0219 21:44:37.353692 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5cb7920e_685c_4bb7_b276_3bf902251bd7.slice/crio-d22d6d60f193bcb6a68565e5f85094654e0386ed18eb92971e096bc7c18b8ee4 WatchSource:0}: Error finding container d22d6d60f193bcb6a68565e5f85094654e0386ed18eb92971e096bc7c18b8ee4: Status 404 returned error can't find the container with id d22d6d60f193bcb6a68565e5f85094654e0386ed18eb92971e096bc7c18b8ee4 Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.381593 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f54874ffc-9rmlj" event={"ID":"880246d9-9662-47e8-a0ff-5d2aca6de029","Type":"ContainerStarted","Data":"236c5910103a0e9c6c7b85c7e035781b7dc5db0e7378799d12ebd2dd79bce1b2"} Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.383676 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67ff45466c-74c7x" event={"ID":"5cb7920e-685c-4bb7-b276-3bf902251bd7","Type":"ContainerStarted","Data":"d22d6d60f193bcb6a68565e5f85094654e0386ed18eb92971e096bc7c18b8ee4"} Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.440197 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.604385 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.605721 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.607267 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.610036 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.610046 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.610103 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.610984 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.611125 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.611932 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-hqncs" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.630552 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.713587 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.713648 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.713696 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.713723 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.713881 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.713922 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.713941 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.714003 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.714115 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.714193 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pjq4\" (UniqueName: \"kubernetes.io/projected/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-kube-api-access-7pjq4\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.714238 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.815675 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.815749 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.815787 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.815844 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.815881 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pjq4\" (UniqueName: \"kubernetes.io/projected/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-kube-api-access-7pjq4\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.815950 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.815983 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.816036 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.816066 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.816105 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.816141 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.816145 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.816426 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.816962 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.817048 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.817473 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:44:37 crc 
kubenswrapper[4795]: I0219 21:44:37.818010 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.822802 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.822851 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.823251 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.823498 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.832742 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pjq4\" (UniqueName: 
\"kubernetes.io/projected/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-kube-api-access-7pjq4\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.840578 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.933784 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:44:37 crc kubenswrapper[4795]: I0219 21:44:37.944674 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 19 21:44:38 crc kubenswrapper[4795]: I0219 21:44:38.399782 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7b096325-542d-4ac6-8d16-8aa0937013b2","Type":"ContainerStarted","Data":"43ff46f740a6f7a342639c9893e1a10e76310ef799a0ad928eb028dabd7dd840"} Feb 19 21:44:38 crc kubenswrapper[4795]: I0219 21:44:38.447970 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 21:44:39 crc kubenswrapper[4795]: I0219 21:44:39.002246 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Feb 19 21:44:39 crc kubenswrapper[4795]: I0219 21:44:39.003721 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Feb 19 21:44:39 crc kubenswrapper[4795]: I0219 21:44:39.009102 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Feb 19 21:44:39 crc kubenswrapper[4795]: I0219 21:44:39.009766 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-snswc" Feb 19 21:44:39 crc kubenswrapper[4795]: I0219 21:44:39.010008 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Feb 19 21:44:39 crc kubenswrapper[4795]: I0219 21:44:39.010187 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Feb 19 21:44:39 crc kubenswrapper[4795]: I0219 21:44:39.012299 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Feb 19 21:44:39 crc kubenswrapper[4795]: I0219 21:44:39.032246 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 19 21:44:39 crc kubenswrapper[4795]: I0219 21:44:39.135892 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\") " pod="openstack/openstack-galera-0" Feb 19 21:44:39 crc kubenswrapper[4795]: I0219 21:44:39.135941 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\") " pod="openstack/openstack-galera-0" Feb 19 21:44:39 crc kubenswrapper[4795]: I0219 21:44:39.135975 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-gtmqc\" (UniqueName: \"kubernetes.io/projected/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-kube-api-access-gtmqc\") pod \"openstack-galera-0\" (UID: \"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\") " pod="openstack/openstack-galera-0" Feb 19 21:44:39 crc kubenswrapper[4795]: I0219 21:44:39.135992 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-config-data-default\") pod \"openstack-galera-0\" (UID: \"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\") " pod="openstack/openstack-galera-0" Feb 19 21:44:39 crc kubenswrapper[4795]: I0219 21:44:39.136016 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-operator-scripts\") pod \"openstack-galera-0\" (UID: \"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\") " pod="openstack/openstack-galera-0" Feb 19 21:44:39 crc kubenswrapper[4795]: I0219 21:44:39.136133 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-config-data-generated\") pod \"openstack-galera-0\" (UID: \"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\") " pod="openstack/openstack-galera-0" Feb 19 21:44:39 crc kubenswrapper[4795]: I0219 21:44:39.136214 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-kolla-config\") pod \"openstack-galera-0\" (UID: \"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\") " pod="openstack/openstack-galera-0" Feb 19 21:44:39 crc kubenswrapper[4795]: I0219 21:44:39.136401 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\") " pod="openstack/openstack-galera-0" Feb 19 21:44:39 crc kubenswrapper[4795]: I0219 21:44:39.237810 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\") " pod="openstack/openstack-galera-0" Feb 19 21:44:39 crc kubenswrapper[4795]: I0219 21:44:39.237871 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtmqc\" (UniqueName: \"kubernetes.io/projected/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-kube-api-access-gtmqc\") pod \"openstack-galera-0\" (UID: \"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\") " pod="openstack/openstack-galera-0" Feb 19 21:44:39 crc kubenswrapper[4795]: I0219 21:44:39.237887 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-config-data-default\") pod \"openstack-galera-0\" (UID: \"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\") " pod="openstack/openstack-galera-0" Feb 19 21:44:39 crc kubenswrapper[4795]: I0219 21:44:39.237909 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-operator-scripts\") pod \"openstack-galera-0\" (UID: \"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\") " pod="openstack/openstack-galera-0" Feb 19 21:44:39 crc kubenswrapper[4795]: I0219 21:44:39.237931 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-config-data-generated\") pod 
\"openstack-galera-0\" (UID: \"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\") " pod="openstack/openstack-galera-0" Feb 19 21:44:39 crc kubenswrapper[4795]: I0219 21:44:39.237950 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-kolla-config\") pod \"openstack-galera-0\" (UID: \"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\") " pod="openstack/openstack-galera-0" Feb 19 21:44:39 crc kubenswrapper[4795]: I0219 21:44:39.238070 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\") " pod="openstack/openstack-galera-0" Feb 19 21:44:39 crc kubenswrapper[4795]: I0219 21:44:39.238152 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\") " pod="openstack/openstack-galera-0" Feb 19 21:44:39 crc kubenswrapper[4795]: I0219 21:44:39.238725 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-kolla-config\") pod \"openstack-galera-0\" (UID: \"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\") " pod="openstack/openstack-galera-0" Feb 19 21:44:39 crc kubenswrapper[4795]: I0219 21:44:39.238935 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/openstack-galera-0" Feb 19 21:44:39 crc kubenswrapper[4795]: I0219 21:44:39.239024 
4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-config-data-default\") pod \"openstack-galera-0\" (UID: \"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\") " pod="openstack/openstack-galera-0" Feb 19 21:44:39 crc kubenswrapper[4795]: I0219 21:44:39.239101 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-config-data-generated\") pod \"openstack-galera-0\" (UID: \"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\") " pod="openstack/openstack-galera-0" Feb 19 21:44:39 crc kubenswrapper[4795]: I0219 21:44:39.239524 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-operator-scripts\") pod \"openstack-galera-0\" (UID: \"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\") " pod="openstack/openstack-galera-0" Feb 19 21:44:39 crc kubenswrapper[4795]: I0219 21:44:39.244253 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\") " pod="openstack/openstack-galera-0" Feb 19 21:44:39 crc kubenswrapper[4795]: I0219 21:44:39.253260 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\") " pod="openstack/openstack-galera-0" Feb 19 21:44:39 crc kubenswrapper[4795]: I0219 21:44:39.254560 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtmqc\" (UniqueName: 
\"kubernetes.io/projected/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-kube-api-access-gtmqc\") pod \"openstack-galera-0\" (UID: \"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\") " pod="openstack/openstack-galera-0" Feb 19 21:44:39 crc kubenswrapper[4795]: I0219 21:44:39.287235 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\") " pod="openstack/openstack-galera-0" Feb 19 21:44:39 crc kubenswrapper[4795]: I0219 21:44:39.329606 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 19 21:44:39 crc kubenswrapper[4795]: I0219 21:44:39.470602 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d","Type":"ContainerStarted","Data":"12b91da897daae78f76b09af510ceca04ac8909ff3967813b7e6274bf414c6a5"} Feb 19 21:44:39 crc kubenswrapper[4795]: I0219 21:44:39.987899 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.475646 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.477979 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.481883 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-pwwm8" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.482101 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.482366 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.482669 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.483566 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.492267 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99","Type":"ContainerStarted","Data":"050b9a153d584bbd1ba63be9e7a93c951075127827418493ce2ba5e1d8a7ed20"} Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.557233 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\") " pod="openstack/openstack-cell1-galera-0" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.557288 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: 
\"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\") " pod="openstack/openstack-cell1-galera-0" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.557317 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\") " pod="openstack/openstack-cell1-galera-0" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.557346 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\") " pod="openstack/openstack-cell1-galera-0" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.557419 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\") " pod="openstack/openstack-cell1-galera-0" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.557452 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\") " pod="openstack/openstack-cell1-galera-0" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.557476 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwqnm\" (UniqueName: \"kubernetes.io/projected/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-kube-api-access-kwqnm\") pod \"openstack-cell1-galera-0\" 
(UID: \"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\") " pod="openstack/openstack-cell1-galera-0" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.557548 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\") " pod="openstack/openstack-cell1-galera-0" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.659737 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\") " pod="openstack/openstack-cell1-galera-0" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.659783 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\") " pod="openstack/openstack-cell1-galera-0" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.659802 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\") " pod="openstack/openstack-cell1-galera-0" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.659826 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: 
\"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\") " pod="openstack/openstack-cell1-galera-0" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.659846 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\") " pod="openstack/openstack-cell1-galera-0" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.659894 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\") " pod="openstack/openstack-cell1-galera-0" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.659924 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\") " pod="openstack/openstack-cell1-galera-0" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.659945 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwqnm\" (UniqueName: \"kubernetes.io/projected/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-kube-api-access-kwqnm\") pod \"openstack-cell1-galera-0\" (UID: \"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\") " pod="openstack/openstack-cell1-galera-0" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.660842 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\") " pod="openstack/openstack-cell1-galera-0" Feb 19 21:44:40 
crc kubenswrapper[4795]: I0219 21:44:40.661099 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/openstack-cell1-galera-0" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.661475 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\") " pod="openstack/openstack-cell1-galera-0" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.661631 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\") " pod="openstack/openstack-cell1-galera-0" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.661752 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\") " pod="openstack/openstack-cell1-galera-0" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.669310 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\") " pod="openstack/openstack-cell1-galera-0" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.678447 4795 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-kwqnm\" (UniqueName: \"kubernetes.io/projected/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-kube-api-access-kwqnm\") pod \"openstack-cell1-galera-0\" (UID: \"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\") " pod="openstack/openstack-cell1-galera-0" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.683057 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\") " pod="openstack/openstack-cell1-galera-0" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.698034 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\") " pod="openstack/openstack-cell1-galera-0" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.783440 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.784693 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.787353 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.787619 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-hptrp" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.787897 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.790941 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.844351 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.867496 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/42e81a78-17fd-4ed3-b072-dd6a20cbe5d3-config-data\") pod \"memcached-0\" (UID: \"42e81a78-17fd-4ed3-b072-dd6a20cbe5d3\") " pod="openstack/memcached-0" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.867566 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/42e81a78-17fd-4ed3-b072-dd6a20cbe5d3-memcached-tls-certs\") pod \"memcached-0\" (UID: \"42e81a78-17fd-4ed3-b072-dd6a20cbe5d3\") " pod="openstack/memcached-0" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.867586 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/42e81a78-17fd-4ed3-b072-dd6a20cbe5d3-kolla-config\") pod \"memcached-0\" (UID: \"42e81a78-17fd-4ed3-b072-dd6a20cbe5d3\") " 
pod="openstack/memcached-0" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.867610 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csp2v\" (UniqueName: \"kubernetes.io/projected/42e81a78-17fd-4ed3-b072-dd6a20cbe5d3-kube-api-access-csp2v\") pod \"memcached-0\" (UID: \"42e81a78-17fd-4ed3-b072-dd6a20cbe5d3\") " pod="openstack/memcached-0" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.867626 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42e81a78-17fd-4ed3-b072-dd6a20cbe5d3-combined-ca-bundle\") pod \"memcached-0\" (UID: \"42e81a78-17fd-4ed3-b072-dd6a20cbe5d3\") " pod="openstack/memcached-0" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.969292 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/42e81a78-17fd-4ed3-b072-dd6a20cbe5d3-memcached-tls-certs\") pod \"memcached-0\" (UID: \"42e81a78-17fd-4ed3-b072-dd6a20cbe5d3\") " pod="openstack/memcached-0" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.969328 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/42e81a78-17fd-4ed3-b072-dd6a20cbe5d3-kolla-config\") pod \"memcached-0\" (UID: \"42e81a78-17fd-4ed3-b072-dd6a20cbe5d3\") " pod="openstack/memcached-0" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.969353 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csp2v\" (UniqueName: \"kubernetes.io/projected/42e81a78-17fd-4ed3-b072-dd6a20cbe5d3-kube-api-access-csp2v\") pod \"memcached-0\" (UID: \"42e81a78-17fd-4ed3-b072-dd6a20cbe5d3\") " pod="openstack/memcached-0" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.969369 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42e81a78-17fd-4ed3-b072-dd6a20cbe5d3-combined-ca-bundle\") pod \"memcached-0\" (UID: \"42e81a78-17fd-4ed3-b072-dd6a20cbe5d3\") " pod="openstack/memcached-0" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.969444 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/42e81a78-17fd-4ed3-b072-dd6a20cbe5d3-config-data\") pod \"memcached-0\" (UID: \"42e81a78-17fd-4ed3-b072-dd6a20cbe5d3\") " pod="openstack/memcached-0" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.970028 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/42e81a78-17fd-4ed3-b072-dd6a20cbe5d3-config-data\") pod \"memcached-0\" (UID: \"42e81a78-17fd-4ed3-b072-dd6a20cbe5d3\") " pod="openstack/memcached-0" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.973966 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/42e81a78-17fd-4ed3-b072-dd6a20cbe5d3-memcached-tls-certs\") pod \"memcached-0\" (UID: \"42e81a78-17fd-4ed3-b072-dd6a20cbe5d3\") " pod="openstack/memcached-0" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.974553 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/42e81a78-17fd-4ed3-b072-dd6a20cbe5d3-kolla-config\") pod \"memcached-0\" (UID: \"42e81a78-17fd-4ed3-b072-dd6a20cbe5d3\") " pod="openstack/memcached-0" Feb 19 21:44:40 crc kubenswrapper[4795]: I0219 21:44:40.979128 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42e81a78-17fd-4ed3-b072-dd6a20cbe5d3-combined-ca-bundle\") pod \"memcached-0\" (UID: \"42e81a78-17fd-4ed3-b072-dd6a20cbe5d3\") " 
pod="openstack/memcached-0" Feb 19 21:44:41 crc kubenswrapper[4795]: I0219 21:44:41.002357 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csp2v\" (UniqueName: \"kubernetes.io/projected/42e81a78-17fd-4ed3-b072-dd6a20cbe5d3-kube-api-access-csp2v\") pod \"memcached-0\" (UID: \"42e81a78-17fd-4ed3-b072-dd6a20cbe5d3\") " pod="openstack/memcached-0" Feb 19 21:44:41 crc kubenswrapper[4795]: I0219 21:44:41.100145 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 19 21:44:42 crc kubenswrapper[4795]: I0219 21:44:42.863734 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 21:44:42 crc kubenswrapper[4795]: I0219 21:44:42.864597 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 19 21:44:42 crc kubenswrapper[4795]: I0219 21:44:42.867749 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-2gbt9" Feb 19 21:44:42 crc kubenswrapper[4795]: I0219 21:44:42.874759 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 21:44:42 crc kubenswrapper[4795]: I0219 21:44:42.896535 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qr8kk\" (UniqueName: \"kubernetes.io/projected/bb3b374e-f01b-4997-9ecf-fbeeb384cc2c-kube-api-access-qr8kk\") pod \"kube-state-metrics-0\" (UID: \"bb3b374e-f01b-4997-9ecf-fbeeb384cc2c\") " pod="openstack/kube-state-metrics-0" Feb 19 21:44:43 crc kubenswrapper[4795]: I0219 21:44:42.998353 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qr8kk\" (UniqueName: \"kubernetes.io/projected/bb3b374e-f01b-4997-9ecf-fbeeb384cc2c-kube-api-access-qr8kk\") pod \"kube-state-metrics-0\" (UID: \"bb3b374e-f01b-4997-9ecf-fbeeb384cc2c\") " 
pod="openstack/kube-state-metrics-0" Feb 19 21:44:43 crc kubenswrapper[4795]: I0219 21:44:43.020642 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qr8kk\" (UniqueName: \"kubernetes.io/projected/bb3b374e-f01b-4997-9ecf-fbeeb384cc2c-kube-api-access-qr8kk\") pod \"kube-state-metrics-0\" (UID: \"bb3b374e-f01b-4997-9ecf-fbeeb384cc2c\") " pod="openstack/kube-state-metrics-0" Feb 19 21:44:43 crc kubenswrapper[4795]: I0219 21:44:43.202213 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.159655 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.162463 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.167082 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.167099 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.167474 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.167581 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-pkmhz" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.172394 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.183919 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.264038 4795 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czm58\" (UniqueName: \"kubernetes.io/projected/3c5a8678-8ce2-4bee-9160-37b1dea9f897-kube-api-access-czm58\") pod \"ovsdbserver-nb-0\" (UID: \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\") " pod="openstack/ovsdbserver-nb-0" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.264118 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c5a8678-8ce2-4bee-9160-37b1dea9f897-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\") " pod="openstack/ovsdbserver-nb-0" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.264280 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c5a8678-8ce2-4bee-9160-37b1dea9f897-config\") pod \"ovsdbserver-nb-0\" (UID: \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\") " pod="openstack/ovsdbserver-nb-0" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.264412 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\") " pod="openstack/ovsdbserver-nb-0" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.264459 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3c5a8678-8ce2-4bee-9160-37b1dea9f897-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\") " pod="openstack/ovsdbserver-nb-0" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.264577 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" 
(UniqueName: \"kubernetes.io/empty-dir/3c5a8678-8ce2-4bee-9160-37b1dea9f897-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\") " pod="openstack/ovsdbserver-nb-0" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.264743 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c5a8678-8ce2-4bee-9160-37b1dea9f897-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\") " pod="openstack/ovsdbserver-nb-0" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.264788 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c5a8678-8ce2-4bee-9160-37b1dea9f897-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\") " pod="openstack/ovsdbserver-nb-0" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.276886 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-w9fbs"] Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.281751 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-w9fbs" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.285830 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.286155 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-877p5" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.286454 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.288974 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-tl5hf"] Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.290857 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-tl5hf" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.303342 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-w9fbs"] Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.308865 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-tl5hf"] Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.367216 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czm58\" (UniqueName: \"kubernetes.io/projected/3c5a8678-8ce2-4bee-9160-37b1dea9f897-kube-api-access-czm58\") pod \"ovsdbserver-nb-0\" (UID: \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\") " pod="openstack/ovsdbserver-nb-0" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.367388 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/9a19676c-9314-43a3-a2f8-bcf56d6b5ce3-var-log\") pod \"ovn-controller-ovs-tl5hf\" (UID: \"9a19676c-9314-43a3-a2f8-bcf56d6b5ce3\") " 
pod="openstack/ovn-controller-ovs-tl5hf" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.367454 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9a19676c-9314-43a3-a2f8-bcf56d6b5ce3-var-run\") pod \"ovn-controller-ovs-tl5hf\" (UID: \"9a19676c-9314-43a3-a2f8-bcf56d6b5ce3\") " pod="openstack/ovn-controller-ovs-tl5hf" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.367482 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\") " pod="openstack/ovsdbserver-nb-0" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.367514 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3c5a8678-8ce2-4bee-9160-37b1dea9f897-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\") " pod="openstack/ovsdbserver-nb-0" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.368376 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3c5a8678-8ce2-4bee-9160-37b1dea9f897-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\") " pod="openstack/ovsdbserver-nb-0" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.368685 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/ovsdbserver-nb-0" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.368898 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c5a8678-8ce2-4bee-9160-37b1dea9f897-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\") " pod="openstack/ovsdbserver-nb-0" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.369027 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-scripts\") pod \"ovn-controller-w9fbs\" (UID: \"c30e8522-2d7f-4f10-a0b4-a7cfc351d093\") " pod="openstack/ovn-controller-w9fbs" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.369097 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkzbj\" (UniqueName: \"kubernetes.io/projected/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-kube-api-access-bkzbj\") pod \"ovn-controller-w9fbs\" (UID: \"c30e8522-2d7f-4f10-a0b4-a7cfc351d093\") " pod="openstack/ovn-controller-w9fbs" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.369150 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-combined-ca-bundle\") pod \"ovn-controller-w9fbs\" (UID: \"c30e8522-2d7f-4f10-a0b4-a7cfc351d093\") " pod="openstack/ovn-controller-w9fbs" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.369214 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-var-run\") pod \"ovn-controller-w9fbs\" (UID: \"c30e8522-2d7f-4f10-a0b4-a7cfc351d093\") " pod="openstack/ovn-controller-w9fbs" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.369273 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3c5a8678-8ce2-4bee-9160-37b1dea9f897-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\") " pod="openstack/ovsdbserver-nb-0" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.369303 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c5a8678-8ce2-4bee-9160-37b1dea9f897-config\") pod \"ovsdbserver-nb-0\" (UID: \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\") " pod="openstack/ovsdbserver-nb-0" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.369332 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-var-log-ovn\") pod \"ovn-controller-w9fbs\" (UID: \"c30e8522-2d7f-4f10-a0b4-a7cfc351d093\") " pod="openstack/ovn-controller-w9fbs" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.369365 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3c5a8678-8ce2-4bee-9160-37b1dea9f897-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\") " pod="openstack/ovsdbserver-nb-0" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.369394 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rp572\" (UniqueName: \"kubernetes.io/projected/9a19676c-9314-43a3-a2f8-bcf56d6b5ce3-kube-api-access-rp572\") pod \"ovn-controller-ovs-tl5hf\" (UID: \"9a19676c-9314-43a3-a2f8-bcf56d6b5ce3\") " pod="openstack/ovn-controller-ovs-tl5hf" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.369423 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/9a19676c-9314-43a3-a2f8-bcf56d6b5ce3-var-lib\") pod \"ovn-controller-ovs-tl5hf\" (UID: 
\"9a19676c-9314-43a3-a2f8-bcf56d6b5ce3\") " pod="openstack/ovn-controller-ovs-tl5hf" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.369760 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c5a8678-8ce2-4bee-9160-37b1dea9f897-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\") " pod="openstack/ovsdbserver-nb-0" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.369804 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9a19676c-9314-43a3-a2f8-bcf56d6b5ce3-scripts\") pod \"ovn-controller-ovs-tl5hf\" (UID: \"9a19676c-9314-43a3-a2f8-bcf56d6b5ce3\") " pod="openstack/ovn-controller-ovs-tl5hf" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.369841 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-ovn-controller-tls-certs\") pod \"ovn-controller-w9fbs\" (UID: \"c30e8522-2d7f-4f10-a0b4-a7cfc351d093\") " pod="openstack/ovn-controller-w9fbs" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.369884 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-var-run-ovn\") pod \"ovn-controller-w9fbs\" (UID: \"c30e8522-2d7f-4f10-a0b4-a7cfc351d093\") " pod="openstack/ovn-controller-w9fbs" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.369949 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/9a19676c-9314-43a3-a2f8-bcf56d6b5ce3-etc-ovs\") pod \"ovn-controller-ovs-tl5hf\" (UID: \"9a19676c-9314-43a3-a2f8-bcf56d6b5ce3\") " 
pod="openstack/ovn-controller-ovs-tl5hf" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.371122 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c5a8678-8ce2-4bee-9160-37b1dea9f897-config\") pod \"ovsdbserver-nb-0\" (UID: \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\") " pod="openstack/ovsdbserver-nb-0" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.371846 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3c5a8678-8ce2-4bee-9160-37b1dea9f897-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\") " pod="openstack/ovsdbserver-nb-0" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.379692 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c5a8678-8ce2-4bee-9160-37b1dea9f897-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\") " pod="openstack/ovsdbserver-nb-0" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.379692 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c5a8678-8ce2-4bee-9160-37b1dea9f897-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\") " pod="openstack/ovsdbserver-nb-0" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.387522 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czm58\" (UniqueName: \"kubernetes.io/projected/3c5a8678-8ce2-4bee-9160-37b1dea9f897-kube-api-access-czm58\") pod \"ovsdbserver-nb-0\" (UID: \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\") " pod="openstack/ovsdbserver-nb-0" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.388187 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c5a8678-8ce2-4bee-9160-37b1dea9f897-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\") " pod="openstack/ovsdbserver-nb-0" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.402899 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\") " pod="openstack/ovsdbserver-nb-0" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.471094 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9a19676c-9314-43a3-a2f8-bcf56d6b5ce3-var-run\") pod \"ovn-controller-ovs-tl5hf\" (UID: \"9a19676c-9314-43a3-a2f8-bcf56d6b5ce3\") " pod="openstack/ovn-controller-ovs-tl5hf" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.471154 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-scripts\") pod \"ovn-controller-w9fbs\" (UID: \"c30e8522-2d7f-4f10-a0b4-a7cfc351d093\") " pod="openstack/ovn-controller-w9fbs" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.471201 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkzbj\" (UniqueName: \"kubernetes.io/projected/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-kube-api-access-bkzbj\") pod \"ovn-controller-w9fbs\" (UID: \"c30e8522-2d7f-4f10-a0b4-a7cfc351d093\") " pod="openstack/ovn-controller-w9fbs" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.471220 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-combined-ca-bundle\") pod \"ovn-controller-w9fbs\" (UID: 
\"c30e8522-2d7f-4f10-a0b4-a7cfc351d093\") " pod="openstack/ovn-controller-w9fbs" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.471250 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-var-run\") pod \"ovn-controller-w9fbs\" (UID: \"c30e8522-2d7f-4f10-a0b4-a7cfc351d093\") " pod="openstack/ovn-controller-w9fbs" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.471279 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-var-log-ovn\") pod \"ovn-controller-w9fbs\" (UID: \"c30e8522-2d7f-4f10-a0b4-a7cfc351d093\") " pod="openstack/ovn-controller-w9fbs" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.471303 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/9a19676c-9314-43a3-a2f8-bcf56d6b5ce3-var-lib\") pod \"ovn-controller-ovs-tl5hf\" (UID: \"9a19676c-9314-43a3-a2f8-bcf56d6b5ce3\") " pod="openstack/ovn-controller-ovs-tl5hf" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.471319 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rp572\" (UniqueName: \"kubernetes.io/projected/9a19676c-9314-43a3-a2f8-bcf56d6b5ce3-kube-api-access-rp572\") pod \"ovn-controller-ovs-tl5hf\" (UID: \"9a19676c-9314-43a3-a2f8-bcf56d6b5ce3\") " pod="openstack/ovn-controller-ovs-tl5hf" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.471342 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9a19676c-9314-43a3-a2f8-bcf56d6b5ce3-scripts\") pod \"ovn-controller-ovs-tl5hf\" (UID: \"9a19676c-9314-43a3-a2f8-bcf56d6b5ce3\") " pod="openstack/ovn-controller-ovs-tl5hf" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.471357 
4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-ovn-controller-tls-certs\") pod \"ovn-controller-w9fbs\" (UID: \"c30e8522-2d7f-4f10-a0b4-a7cfc351d093\") " pod="openstack/ovn-controller-w9fbs" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.471388 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-var-run-ovn\") pod \"ovn-controller-w9fbs\" (UID: \"c30e8522-2d7f-4f10-a0b4-a7cfc351d093\") " pod="openstack/ovn-controller-w9fbs" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.471403 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/9a19676c-9314-43a3-a2f8-bcf56d6b5ce3-etc-ovs\") pod \"ovn-controller-ovs-tl5hf\" (UID: \"9a19676c-9314-43a3-a2f8-bcf56d6b5ce3\") " pod="openstack/ovn-controller-ovs-tl5hf" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.471440 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/9a19676c-9314-43a3-a2f8-bcf56d6b5ce3-var-log\") pod \"ovn-controller-ovs-tl5hf\" (UID: \"9a19676c-9314-43a3-a2f8-bcf56d6b5ce3\") " pod="openstack/ovn-controller-ovs-tl5hf" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.472199 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/9a19676c-9314-43a3-a2f8-bcf56d6b5ce3-var-log\") pod \"ovn-controller-ovs-tl5hf\" (UID: \"9a19676c-9314-43a3-a2f8-bcf56d6b5ce3\") " pod="openstack/ovn-controller-ovs-tl5hf" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.472385 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: 
\"kubernetes.io/host-path/9a19676c-9314-43a3-a2f8-bcf56d6b5ce3-var-lib\") pod \"ovn-controller-ovs-tl5hf\" (UID: \"9a19676c-9314-43a3-a2f8-bcf56d6b5ce3\") " pod="openstack/ovn-controller-ovs-tl5hf" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.472502 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-var-log-ovn\") pod \"ovn-controller-w9fbs\" (UID: \"c30e8522-2d7f-4f10-a0b4-a7cfc351d093\") " pod="openstack/ovn-controller-w9fbs" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.472776 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9a19676c-9314-43a3-a2f8-bcf56d6b5ce3-var-run\") pod \"ovn-controller-ovs-tl5hf\" (UID: \"9a19676c-9314-43a3-a2f8-bcf56d6b5ce3\") " pod="openstack/ovn-controller-ovs-tl5hf" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.473331 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-var-run\") pod \"ovn-controller-w9fbs\" (UID: \"c30e8522-2d7f-4f10-a0b4-a7cfc351d093\") " pod="openstack/ovn-controller-w9fbs" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.473495 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-var-run-ovn\") pod \"ovn-controller-w9fbs\" (UID: \"c30e8522-2d7f-4f10-a0b4-a7cfc351d093\") " pod="openstack/ovn-controller-w9fbs" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.474709 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9a19676c-9314-43a3-a2f8-bcf56d6b5ce3-scripts\") pod \"ovn-controller-ovs-tl5hf\" (UID: \"9a19676c-9314-43a3-a2f8-bcf56d6b5ce3\") " pod="openstack/ovn-controller-ovs-tl5hf" Feb 19 
21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.474913 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/9a19676c-9314-43a3-a2f8-bcf56d6b5ce3-etc-ovs\") pod \"ovn-controller-ovs-tl5hf\" (UID: \"9a19676c-9314-43a3-a2f8-bcf56d6b5ce3\") " pod="openstack/ovn-controller-ovs-tl5hf" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.475942 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-scripts\") pod \"ovn-controller-w9fbs\" (UID: \"c30e8522-2d7f-4f10-a0b4-a7cfc351d093\") " pod="openstack/ovn-controller-w9fbs" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.478005 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-ovn-controller-tls-certs\") pod \"ovn-controller-w9fbs\" (UID: \"c30e8522-2d7f-4f10-a0b4-a7cfc351d093\") " pod="openstack/ovn-controller-w9fbs" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.478183 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-combined-ca-bundle\") pod \"ovn-controller-w9fbs\" (UID: \"c30e8522-2d7f-4f10-a0b4-a7cfc351d093\") " pod="openstack/ovn-controller-w9fbs" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.487184 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.490807 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkzbj\" (UniqueName: \"kubernetes.io/projected/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-kube-api-access-bkzbj\") pod \"ovn-controller-w9fbs\" (UID: \"c30e8522-2d7f-4f10-a0b4-a7cfc351d093\") " pod="openstack/ovn-controller-w9fbs" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.496848 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rp572\" (UniqueName: \"kubernetes.io/projected/9a19676c-9314-43a3-a2f8-bcf56d6b5ce3-kube-api-access-rp572\") pod \"ovn-controller-ovs-tl5hf\" (UID: \"9a19676c-9314-43a3-a2f8-bcf56d6b5ce3\") " pod="openstack/ovn-controller-ovs-tl5hf" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.608917 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-w9fbs" Feb 19 21:44:47 crc kubenswrapper[4795]: I0219 21:44:47.626481 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-tl5hf" Feb 19 21:44:49 crc kubenswrapper[4795]: I0219 21:44:49.046883 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 19 21:44:49 crc kubenswrapper[4795]: I0219 21:44:49.048042 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 19 21:44:49 crc kubenswrapper[4795]: I0219 21:44:49.053393 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Feb 19 21:44:49 crc kubenswrapper[4795]: I0219 21:44:49.053758 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Feb 19 21:44:49 crc kubenswrapper[4795]: I0219 21:44:49.054766 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-p6ptc" Feb 19 21:44:49 crc kubenswrapper[4795]: I0219 21:44:49.055403 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Feb 19 21:44:49 crc kubenswrapper[4795]: I0219 21:44:49.069719 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 19 21:44:49 crc kubenswrapper[4795]: I0219 21:44:49.198756 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c2bcb9c-07d3-4d71-924b-aacd537e3430-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3c2bcb9c-07d3-4d71-924b-aacd537e3430\") " pod="openstack/ovsdbserver-sb-0" Feb 19 21:44:49 crc kubenswrapper[4795]: I0219 21:44:49.198809 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c2bcb9c-07d3-4d71-924b-aacd537e3430-config\") pod \"ovsdbserver-sb-0\" (UID: \"3c2bcb9c-07d3-4d71-924b-aacd537e3430\") " pod="openstack/ovsdbserver-sb-0" Feb 19 21:44:49 crc kubenswrapper[4795]: I0219 21:44:49.198840 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwcsv\" (UniqueName: \"kubernetes.io/projected/3c2bcb9c-07d3-4d71-924b-aacd537e3430-kube-api-access-fwcsv\") pod \"ovsdbserver-sb-0\" (UID: 
\"3c2bcb9c-07d3-4d71-924b-aacd537e3430\") " pod="openstack/ovsdbserver-sb-0" Feb 19 21:44:49 crc kubenswrapper[4795]: I0219 21:44:49.198859 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c2bcb9c-07d3-4d71-924b-aacd537e3430-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3c2bcb9c-07d3-4d71-924b-aacd537e3430\") " pod="openstack/ovsdbserver-sb-0" Feb 19 21:44:49 crc kubenswrapper[4795]: I0219 21:44:49.198878 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c2bcb9c-07d3-4d71-924b-aacd537e3430-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"3c2bcb9c-07d3-4d71-924b-aacd537e3430\") " pod="openstack/ovsdbserver-sb-0" Feb 19 21:44:49 crc kubenswrapper[4795]: I0219 21:44:49.199057 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"3c2bcb9c-07d3-4d71-924b-aacd537e3430\") " pod="openstack/ovsdbserver-sb-0" Feb 19 21:44:49 crc kubenswrapper[4795]: I0219 21:44:49.199271 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3c2bcb9c-07d3-4d71-924b-aacd537e3430-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"3c2bcb9c-07d3-4d71-924b-aacd537e3430\") " pod="openstack/ovsdbserver-sb-0" Feb 19 21:44:49 crc kubenswrapper[4795]: I0219 21:44:49.199333 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3c2bcb9c-07d3-4d71-924b-aacd537e3430-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"3c2bcb9c-07d3-4d71-924b-aacd537e3430\") " pod="openstack/ovsdbserver-sb-0" Feb 19 21:44:49 crc 
kubenswrapper[4795]: I0219 21:44:49.300481 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c2bcb9c-07d3-4d71-924b-aacd537e3430-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3c2bcb9c-07d3-4d71-924b-aacd537e3430\") " pod="openstack/ovsdbserver-sb-0" Feb 19 21:44:49 crc kubenswrapper[4795]: I0219 21:44:49.300535 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c2bcb9c-07d3-4d71-924b-aacd537e3430-config\") pod \"ovsdbserver-sb-0\" (UID: \"3c2bcb9c-07d3-4d71-924b-aacd537e3430\") " pod="openstack/ovsdbserver-sb-0" Feb 19 21:44:49 crc kubenswrapper[4795]: I0219 21:44:49.300570 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwcsv\" (UniqueName: \"kubernetes.io/projected/3c2bcb9c-07d3-4d71-924b-aacd537e3430-kube-api-access-fwcsv\") pod \"ovsdbserver-sb-0\" (UID: \"3c2bcb9c-07d3-4d71-924b-aacd537e3430\") " pod="openstack/ovsdbserver-sb-0" Feb 19 21:44:49 crc kubenswrapper[4795]: I0219 21:44:49.300589 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c2bcb9c-07d3-4d71-924b-aacd537e3430-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3c2bcb9c-07d3-4d71-924b-aacd537e3430\") " pod="openstack/ovsdbserver-sb-0" Feb 19 21:44:49 crc kubenswrapper[4795]: I0219 21:44:49.300612 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c2bcb9c-07d3-4d71-924b-aacd537e3430-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"3c2bcb9c-07d3-4d71-924b-aacd537e3430\") " pod="openstack/ovsdbserver-sb-0" Feb 19 21:44:49 crc kubenswrapper[4795]: I0219 21:44:49.300639 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"3c2bcb9c-07d3-4d71-924b-aacd537e3430\") " pod="openstack/ovsdbserver-sb-0" Feb 19 21:44:49 crc kubenswrapper[4795]: I0219 21:44:49.300689 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3c2bcb9c-07d3-4d71-924b-aacd537e3430-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"3c2bcb9c-07d3-4d71-924b-aacd537e3430\") " pod="openstack/ovsdbserver-sb-0" Feb 19 21:44:49 crc kubenswrapper[4795]: I0219 21:44:49.300723 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3c2bcb9c-07d3-4d71-924b-aacd537e3430-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"3c2bcb9c-07d3-4d71-924b-aacd537e3430\") " pod="openstack/ovsdbserver-sb-0" Feb 19 21:44:49 crc kubenswrapper[4795]: I0219 21:44:49.300967 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"3c2bcb9c-07d3-4d71-924b-aacd537e3430\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/ovsdbserver-sb-0" Feb 19 21:44:49 crc kubenswrapper[4795]: I0219 21:44:49.301145 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3c2bcb9c-07d3-4d71-924b-aacd537e3430-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"3c2bcb9c-07d3-4d71-924b-aacd537e3430\") " pod="openstack/ovsdbserver-sb-0" Feb 19 21:44:49 crc kubenswrapper[4795]: I0219 21:44:49.301776 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c2bcb9c-07d3-4d71-924b-aacd537e3430-config\") pod \"ovsdbserver-sb-0\" (UID: \"3c2bcb9c-07d3-4d71-924b-aacd537e3430\") " pod="openstack/ovsdbserver-sb-0" Feb 19 21:44:49 
crc kubenswrapper[4795]: I0219 21:44:49.301967 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3c2bcb9c-07d3-4d71-924b-aacd537e3430-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"3c2bcb9c-07d3-4d71-924b-aacd537e3430\") " pod="openstack/ovsdbserver-sb-0" Feb 19 21:44:49 crc kubenswrapper[4795]: I0219 21:44:49.310812 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c2bcb9c-07d3-4d71-924b-aacd537e3430-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3c2bcb9c-07d3-4d71-924b-aacd537e3430\") " pod="openstack/ovsdbserver-sb-0" Feb 19 21:44:49 crc kubenswrapper[4795]: I0219 21:44:49.310827 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c2bcb9c-07d3-4d71-924b-aacd537e3430-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3c2bcb9c-07d3-4d71-924b-aacd537e3430\") " pod="openstack/ovsdbserver-sb-0" Feb 19 21:44:49 crc kubenswrapper[4795]: I0219 21:44:49.310923 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c2bcb9c-07d3-4d71-924b-aacd537e3430-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"3c2bcb9c-07d3-4d71-924b-aacd537e3430\") " pod="openstack/ovsdbserver-sb-0" Feb 19 21:44:49 crc kubenswrapper[4795]: I0219 21:44:49.321083 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"3c2bcb9c-07d3-4d71-924b-aacd537e3430\") " pod="openstack/ovsdbserver-sb-0" Feb 19 21:44:49 crc kubenswrapper[4795]: I0219 21:44:49.322221 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwcsv\" (UniqueName: 
\"kubernetes.io/projected/3c2bcb9c-07d3-4d71-924b-aacd537e3430-kube-api-access-fwcsv\") pod \"ovsdbserver-sb-0\" (UID: \"3c2bcb9c-07d3-4d71-924b-aacd537e3430\") " pod="openstack/ovsdbserver-sb-0" Feb 19 21:44:49 crc kubenswrapper[4795]: I0219 21:44:49.374313 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 19 21:44:53 crc kubenswrapper[4795]: E0219 21:44:53.027691 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:221c84e162c46ac7454de6fb84343d0a605f2ea1d7d5647a34a66569e0a8fd76" Feb 19 21:44:53 crc kubenswrapper[4795]: E0219 21:44:53.028180 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:221c84e162c46ac7454de6fb84343d0a605f2ea1d7d5647a34a66569e0a8fd76,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7cwvp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(7b096325-542d-4ac6-8d16-8aa0937013b2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 21:44:53 crc 
kubenswrapper[4795]: E0219 21:44:53.029395 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="7b096325-542d-4ac6-8d16-8aa0937013b2" Feb 19 21:44:53 crc kubenswrapper[4795]: E0219 21:44:53.595056 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:221c84e162c46ac7454de6fb84343d0a605f2ea1d7d5647a34a66569e0a8fd76\\\"\"" pod="openstack/rabbitmq-server-0" podUID="7b096325-542d-4ac6-8d16-8aa0937013b2" Feb 19 21:45:00 crc kubenswrapper[4795]: I0219 21:45:00.157982 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525625-kzr2v"] Feb 19 21:45:00 crc kubenswrapper[4795]: I0219 21:45:00.160083 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525625-kzr2v" Feb 19 21:45:00 crc kubenswrapper[4795]: I0219 21:45:00.163241 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 19 21:45:00 crc kubenswrapper[4795]: I0219 21:45:00.163949 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 19 21:45:00 crc kubenswrapper[4795]: I0219 21:45:00.164321 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525625-kzr2v"] Feb 19 21:45:00 crc kubenswrapper[4795]: I0219 21:45:00.202287 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1627c007-5a7c-4fa5-a15f-0da43560c849-secret-volume\") pod \"collect-profiles-29525625-kzr2v\" (UID: \"1627c007-5a7c-4fa5-a15f-0da43560c849\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525625-kzr2v" Feb 19 21:45:00 crc kubenswrapper[4795]: I0219 21:45:00.202352 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frbhn\" (UniqueName: \"kubernetes.io/projected/1627c007-5a7c-4fa5-a15f-0da43560c849-kube-api-access-frbhn\") pod \"collect-profiles-29525625-kzr2v\" (UID: \"1627c007-5a7c-4fa5-a15f-0da43560c849\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525625-kzr2v" Feb 19 21:45:00 crc kubenswrapper[4795]: I0219 21:45:00.202857 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1627c007-5a7c-4fa5-a15f-0da43560c849-config-volume\") pod \"collect-profiles-29525625-kzr2v\" (UID: \"1627c007-5a7c-4fa5-a15f-0da43560c849\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29525625-kzr2v" Feb 19 21:45:00 crc kubenswrapper[4795]: I0219 21:45:00.304858 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1627c007-5a7c-4fa5-a15f-0da43560c849-secret-volume\") pod \"collect-profiles-29525625-kzr2v\" (UID: \"1627c007-5a7c-4fa5-a15f-0da43560c849\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525625-kzr2v" Feb 19 21:45:00 crc kubenswrapper[4795]: I0219 21:45:00.304954 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frbhn\" (UniqueName: \"kubernetes.io/projected/1627c007-5a7c-4fa5-a15f-0da43560c849-kube-api-access-frbhn\") pod \"collect-profiles-29525625-kzr2v\" (UID: \"1627c007-5a7c-4fa5-a15f-0da43560c849\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525625-kzr2v" Feb 19 21:45:00 crc kubenswrapper[4795]: I0219 21:45:00.305015 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1627c007-5a7c-4fa5-a15f-0da43560c849-config-volume\") pod \"collect-profiles-29525625-kzr2v\" (UID: \"1627c007-5a7c-4fa5-a15f-0da43560c849\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525625-kzr2v" Feb 19 21:45:00 crc kubenswrapper[4795]: I0219 21:45:00.306281 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1627c007-5a7c-4fa5-a15f-0da43560c849-config-volume\") pod \"collect-profiles-29525625-kzr2v\" (UID: \"1627c007-5a7c-4fa5-a15f-0da43560c849\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525625-kzr2v" Feb 19 21:45:00 crc kubenswrapper[4795]: I0219 21:45:00.314385 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/1627c007-5a7c-4fa5-a15f-0da43560c849-secret-volume\") pod \"collect-profiles-29525625-kzr2v\" (UID: \"1627c007-5a7c-4fa5-a15f-0da43560c849\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525625-kzr2v" Feb 19 21:45:00 crc kubenswrapper[4795]: I0219 21:45:00.322263 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frbhn\" (UniqueName: \"kubernetes.io/projected/1627c007-5a7c-4fa5-a15f-0da43560c849-kube-api-access-frbhn\") pod \"collect-profiles-29525625-kzr2v\" (UID: \"1627c007-5a7c-4fa5-a15f-0da43560c849\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525625-kzr2v" Feb 19 21:45:00 crc kubenswrapper[4795]: I0219 21:45:00.494094 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525625-kzr2v" Feb 19 21:45:00 crc kubenswrapper[4795]: E0219 21:45:00.616477 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb@sha256:0f7943e02fbdd3daec1d3db72fa9396bf37ad3fdd6b0f3119c90e29629e095ed" Feb 19 21:45:00 crc kubenswrapper[4795]: E0219 21:45:00.616666 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:0f7943e02fbdd3daec1d3db72fa9396bf37ad3fdd6b0f3119c90e29629e095ed,Command:[bash 
/var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gtmqc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-galera-0_openstack(0bbc6c00-2fc9-42cb-9c5a-9a160903ae99): 
ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 21:45:00 crc kubenswrapper[4795]: E0219 21:45:00.617897 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-galera-0" podUID="0bbc6c00-2fc9-42cb-9c5a-9a160903ae99" Feb 19 21:45:00 crc kubenswrapper[4795]: E0219 21:45:00.659418 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb@sha256:0f7943e02fbdd3daec1d3db72fa9396bf37ad3fdd6b0f3119c90e29629e095ed\\\"\"" pod="openstack/openstack-galera-0" podUID="0bbc6c00-2fc9-42cb-9c5a-9a160903ae99" Feb 19 21:45:01 crc kubenswrapper[4795]: E0219 21:45:01.510217 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2" Feb 19 21:45:01 crc kubenswrapper[4795]: E0219 21:45:01.510365 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tnfdk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-6fcf94d689-ppz5s_openstack(8f68f295-3b94-4e1f-8e9d-ba49ffe5198e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 21:45:01 crc kubenswrapper[4795]: E0219 21:45:01.511660 4795 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-6fcf94d689-ppz5s" podUID="8f68f295-3b94-4e1f-8e9d-ba49ffe5198e" Feb 19 21:45:01 crc kubenswrapper[4795]: E0219 21:45:01.515664 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2" Feb 19 21:45:01 crc kubenswrapper[4795]: E0219 21:45:01.515868 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ss4wh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-67ff45466c-74c7x_openstack(5cb7920e-685c-4bb7-b276-3bf902251bd7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 21:45:01 crc kubenswrapper[4795]: E0219 21:45:01.517031 4795 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-67ff45466c-74c7x" podUID="5cb7920e-685c-4bb7-b276-3bf902251bd7" Feb 19 21:45:01 crc kubenswrapper[4795]: E0219 21:45:01.547762 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2" Feb 19 21:45:01 crc kubenswrapper[4795]: E0219 21:45:01.547927 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2pvqh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-855cbc58c5-xj6c9_openstack(40671af3-115e-495a-bdf5-34580fffdc69): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 21:45:01 crc kubenswrapper[4795]: E0219 21:45:01.550340 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-855cbc58c5-xj6c9" podUID="40671af3-115e-495a-bdf5-34580fffdc69" Feb 19 21:45:01 crc kubenswrapper[4795]: E0219 21:45:01.562350 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2" Feb 19 21:45:01 crc kubenswrapper[4795]: E0219 21:45:01.562529 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2zdpq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil
,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-f54874ffc-9rmlj_openstack(880246d9-9662-47e8-a0ff-5d2aca6de029): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 21:45:01 crc kubenswrapper[4795]: E0219 21:45:01.564413 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-f54874ffc-9rmlj" podUID="880246d9-9662-47e8-a0ff-5d2aca6de029" Feb 19 21:45:01 crc kubenswrapper[4795]: E0219 21:45:01.668127 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2\\\"\"" pod="openstack/dnsmasq-dns-f54874ffc-9rmlj" podUID="880246d9-9662-47e8-a0ff-5d2aca6de029" Feb 19 21:45:01 crc kubenswrapper[4795]: E0219 21:45:01.668606 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2\\\"\"" pod="openstack/dnsmasq-dns-67ff45466c-74c7x" podUID="5cb7920e-685c-4bb7-b276-3bf902251bd7" Feb 19 21:45:02 crc kubenswrapper[4795]: I0219 21:45:02.093143 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6fcf94d689-ppz5s" Feb 19 21:45:02 crc kubenswrapper[4795]: I0219 21:45:02.099578 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-855cbc58c5-xj6c9" Feb 19 21:45:02 crc kubenswrapper[4795]: I0219 21:45:02.133872 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40671af3-115e-495a-bdf5-34580fffdc69-config\") pod \"40671af3-115e-495a-bdf5-34580fffdc69\" (UID: \"40671af3-115e-495a-bdf5-34580fffdc69\") " Feb 19 21:45:02 crc kubenswrapper[4795]: I0219 21:45:02.133969 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8f68f295-3b94-4e1f-8e9d-ba49ffe5198e-dns-svc\") pod \"8f68f295-3b94-4e1f-8e9d-ba49ffe5198e\" (UID: \"8f68f295-3b94-4e1f-8e9d-ba49ffe5198e\") " Feb 19 21:45:02 crc kubenswrapper[4795]: I0219 21:45:02.134006 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnfdk\" (UniqueName: \"kubernetes.io/projected/8f68f295-3b94-4e1f-8e9d-ba49ffe5198e-kube-api-access-tnfdk\") pod \"8f68f295-3b94-4e1f-8e9d-ba49ffe5198e\" (UID: \"8f68f295-3b94-4e1f-8e9d-ba49ffe5198e\") " Feb 19 21:45:02 crc kubenswrapper[4795]: I0219 21:45:02.134031 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2pvqh\" (UniqueName: \"kubernetes.io/projected/40671af3-115e-495a-bdf5-34580fffdc69-kube-api-access-2pvqh\") pod \"40671af3-115e-495a-bdf5-34580fffdc69\" (UID: 
\"40671af3-115e-495a-bdf5-34580fffdc69\") " Feb 19 21:45:02 crc kubenswrapper[4795]: I0219 21:45:02.134074 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f68f295-3b94-4e1f-8e9d-ba49ffe5198e-config\") pod \"8f68f295-3b94-4e1f-8e9d-ba49ffe5198e\" (UID: \"8f68f295-3b94-4e1f-8e9d-ba49ffe5198e\") " Feb 19 21:45:02 crc kubenswrapper[4795]: I0219 21:45:02.134884 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f68f295-3b94-4e1f-8e9d-ba49ffe5198e-config" (OuterVolumeSpecName: "config") pod "8f68f295-3b94-4e1f-8e9d-ba49ffe5198e" (UID: "8f68f295-3b94-4e1f-8e9d-ba49ffe5198e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:45:02 crc kubenswrapper[4795]: I0219 21:45:02.135017 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-w9fbs"] Feb 19 21:45:02 crc kubenswrapper[4795]: I0219 21:45:02.135322 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40671af3-115e-495a-bdf5-34580fffdc69-config" (OuterVolumeSpecName: "config") pod "40671af3-115e-495a-bdf5-34580fffdc69" (UID: "40671af3-115e-495a-bdf5-34580fffdc69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:45:02 crc kubenswrapper[4795]: I0219 21:45:02.137624 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f68f295-3b94-4e1f-8e9d-ba49ffe5198e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8f68f295-3b94-4e1f-8e9d-ba49ffe5198e" (UID: "8f68f295-3b94-4e1f-8e9d-ba49ffe5198e"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:45:02 crc kubenswrapper[4795]: I0219 21:45:02.138591 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 21:45:02 crc kubenswrapper[4795]: I0219 21:45:02.140625 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f68f295-3b94-4e1f-8e9d-ba49ffe5198e-kube-api-access-tnfdk" (OuterVolumeSpecName: "kube-api-access-tnfdk") pod "8f68f295-3b94-4e1f-8e9d-ba49ffe5198e" (UID: "8f68f295-3b94-4e1f-8e9d-ba49ffe5198e"). InnerVolumeSpecName "kube-api-access-tnfdk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:45:02 crc kubenswrapper[4795]: I0219 21:45:02.144816 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40671af3-115e-495a-bdf5-34580fffdc69-kube-api-access-2pvqh" (OuterVolumeSpecName: "kube-api-access-2pvqh") pod "40671af3-115e-495a-bdf5-34580fffdc69" (UID: "40671af3-115e-495a-bdf5-34580fffdc69"). InnerVolumeSpecName "kube-api-access-2pvqh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:45:02 crc kubenswrapper[4795]: I0219 21:45:02.236293 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40671af3-115e-495a-bdf5-34580fffdc69-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:02 crc kubenswrapper[4795]: I0219 21:45:02.236322 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8f68f295-3b94-4e1f-8e9d-ba49ffe5198e-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:02 crc kubenswrapper[4795]: I0219 21:45:02.236332 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnfdk\" (UniqueName: \"kubernetes.io/projected/8f68f295-3b94-4e1f-8e9d-ba49ffe5198e-kube-api-access-tnfdk\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:02 crc kubenswrapper[4795]: I0219 21:45:02.236341 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2pvqh\" (UniqueName: \"kubernetes.io/projected/40671af3-115e-495a-bdf5-34580fffdc69-kube-api-access-2pvqh\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:02 crc kubenswrapper[4795]: I0219 21:45:02.236352 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f68f295-3b94-4e1f-8e9d-ba49ffe5198e-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:02 crc kubenswrapper[4795]: I0219 21:45:02.285335 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525625-kzr2v"] Feb 19 21:45:02 crc kubenswrapper[4795]: W0219 21:45:02.297360 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1627c007_5a7c_4fa5_a15f_0da43560c849.slice/crio-a0fb4a5ee44a5dda402292aff76ee26a3be9ac93d140292f388dd07787d2ee32 WatchSource:0}: Error finding container a0fb4a5ee44a5dda402292aff76ee26a3be9ac93d140292f388dd07787d2ee32: Status 404 
returned error can't find the container with id a0fb4a5ee44a5dda402292aff76ee26a3be9ac93d140292f388dd07787d2ee32 Feb 19 21:45:02 crc kubenswrapper[4795]: I0219 21:45:02.322757 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 21:45:02 crc kubenswrapper[4795]: W0219 21:45:02.335558 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb3b374e_f01b_4997_9ecf_fbeeb384cc2c.slice/crio-1c52434379a6a736b3ede83a285c9c03d25be4bf31ea4977955cffc137750ca3 WatchSource:0}: Error finding container 1c52434379a6a736b3ede83a285c9c03d25be4bf31ea4977955cffc137750ca3: Status 404 returned error can't find the container with id 1c52434379a6a736b3ede83a285c9c03d25be4bf31ea4977955cffc137750ca3 Feb 19 21:45:02 crc kubenswrapper[4795]: I0219 21:45:02.343070 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 19 21:45:02 crc kubenswrapper[4795]: I0219 21:45:02.400833 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-tl5hf"] Feb 19 21:45:02 crc kubenswrapper[4795]: W0219 21:45:02.407899 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9a19676c_9314_43a3_a2f8_bcf56d6b5ce3.slice/crio-79efa7732f14d953bfab30c856bec07a21e8fbe6e42ebb6b0b9f4b604f334bb3 WatchSource:0}: Error finding container 79efa7732f14d953bfab30c856bec07a21e8fbe6e42ebb6b0b9f4b604f334bb3: Status 404 returned error can't find the container with id 79efa7732f14d953bfab30c856bec07a21e8fbe6e42ebb6b0b9f4b604f334bb3 Feb 19 21:45:02 crc kubenswrapper[4795]: I0219 21:45:02.489556 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 19 21:45:02 crc kubenswrapper[4795]: I0219 21:45:02.609722 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 19 21:45:02 crc 
kubenswrapper[4795]: I0219 21:45:02.675510 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fcf94d689-ppz5s" event={"ID":"8f68f295-3b94-4e1f-8e9d-ba49ffe5198e","Type":"ContainerDied","Data":"d707f3ffd06ea35995047e01b9f2f18e202cdb98a74e19c73116f0e7ffea06b9"} Feb 19 21:45:02 crc kubenswrapper[4795]: I0219 21:45:02.675541 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6fcf94d689-ppz5s" Feb 19 21:45:02 crc kubenswrapper[4795]: I0219 21:45:02.676931 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"3c2bcb9c-07d3-4d71-924b-aacd537e3430","Type":"ContainerStarted","Data":"0753bcc18c087ec61d4625b239ed921fd6b476f148310ba726f57a4cfa8d345c"} Feb 19 21:45:02 crc kubenswrapper[4795]: I0219 21:45:02.677920 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tl5hf" event={"ID":"9a19676c-9314-43a3-a2f8-bcf56d6b5ce3","Type":"ContainerStarted","Data":"79efa7732f14d953bfab30c856bec07a21e8fbe6e42ebb6b0b9f4b604f334bb3"} Feb 19 21:45:02 crc kubenswrapper[4795]: I0219 21:45:02.679002 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-855cbc58c5-xj6c9" event={"ID":"40671af3-115e-495a-bdf5-34580fffdc69","Type":"ContainerDied","Data":"d65b0d756b7de08be8b702a4b9a1293091aada93e842a06930928d0056a27a42"} Feb 19 21:45:02 crc kubenswrapper[4795]: I0219 21:45:02.679043 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-855cbc58c5-xj6c9" Feb 19 21:45:02 crc kubenswrapper[4795]: I0219 21:45:02.680425 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525625-kzr2v" event={"ID":"1627c007-5a7c-4fa5-a15f-0da43560c849","Type":"ContainerStarted","Data":"6d51fcadf72cdef4da5baae1dcfd6cb93884c814c13eaf31041e3c1336f34a93"} Feb 19 21:45:02 crc kubenswrapper[4795]: I0219 21:45:02.680454 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525625-kzr2v" event={"ID":"1627c007-5a7c-4fa5-a15f-0da43560c849","Type":"ContainerStarted","Data":"a0fb4a5ee44a5dda402292aff76ee26a3be9ac93d140292f388dd07787d2ee32"} Feb 19 21:45:02 crc kubenswrapper[4795]: I0219 21:45:02.681505 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d","Type":"ContainerStarted","Data":"fa781215a57c5a384dc9196151cb9d88b19a59e6ec4219a4b6443b0c5d96ab8f"} Feb 19 21:45:02 crc kubenswrapper[4795]: I0219 21:45:02.682407 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"42e81a78-17fd-4ed3-b072-dd6a20cbe5d3","Type":"ContainerStarted","Data":"aada954b6c8106a5c25613b1c4b96d76ce41049aa7128aa357d9511f84c5abf0"} Feb 19 21:45:02 crc kubenswrapper[4795]: I0219 21:45:02.683583 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"bb3b374e-f01b-4997-9ecf-fbeeb384cc2c","Type":"ContainerStarted","Data":"1c52434379a6a736b3ede83a285c9c03d25be4bf31ea4977955cffc137750ca3"} Feb 19 21:45:02 crc kubenswrapper[4795]: I0219 21:45:02.685266 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-w9fbs" event={"ID":"c30e8522-2d7f-4f10-a0b4-a7cfc351d093","Type":"ContainerStarted","Data":"dc18420d588bd541d274269ae096f1224bb6a914c81107d9d0d3602a4e7a25d2"} Feb 19 21:45:02 
crc kubenswrapper[4795]: I0219 21:45:02.702434 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29525625-kzr2v" podStartSLOduration=2.702415031 podStartE2EDuration="2.702415031s" podCreationTimestamp="2026-02-19 21:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:45:02.699881309 +0000 UTC m=+1013.892399173" watchObservedRunningTime="2026-02-19 21:45:02.702415031 +0000 UTC m=+1013.894932895" Feb 19 21:45:02 crc kubenswrapper[4795]: I0219 21:45:02.746563 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-855cbc58c5-xj6c9"] Feb 19 21:45:02 crc kubenswrapper[4795]: I0219 21:45:02.746607 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-855cbc58c5-xj6c9"] Feb 19 21:45:02 crc kubenswrapper[4795]: I0219 21:45:02.802680 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6fcf94d689-ppz5s"] Feb 19 21:45:02 crc kubenswrapper[4795]: I0219 21:45:02.810084 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6fcf94d689-ppz5s"] Feb 19 21:45:03 crc kubenswrapper[4795]: I0219 21:45:03.157326 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 19 21:45:03 crc kubenswrapper[4795]: W0219 21:45:03.439101 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c5a8678_8ce2_4bee_9160_37b1dea9f897.slice/crio-f3cf06abd80bf98f00707db38827bef51395b42a731dbd8362aaf2be900fcb70 WatchSource:0}: Error finding container f3cf06abd80bf98f00707db38827bef51395b42a731dbd8362aaf2be900fcb70: Status 404 returned error can't find the container with id f3cf06abd80bf98f00707db38827bef51395b42a731dbd8362aaf2be900fcb70 Feb 19 21:45:03 crc kubenswrapper[4795]: I0219 21:45:03.524380 
4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40671af3-115e-495a-bdf5-34580fffdc69" path="/var/lib/kubelet/pods/40671af3-115e-495a-bdf5-34580fffdc69/volumes" Feb 19 21:45:03 crc kubenswrapper[4795]: I0219 21:45:03.524826 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f68f295-3b94-4e1f-8e9d-ba49ffe5198e" path="/var/lib/kubelet/pods/8f68f295-3b94-4e1f-8e9d-ba49ffe5198e/volumes" Feb 19 21:45:03 crc kubenswrapper[4795]: I0219 21:45:03.697299 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"3c5a8678-8ce2-4bee-9160-37b1dea9f897","Type":"ContainerStarted","Data":"f3cf06abd80bf98f00707db38827bef51395b42a731dbd8362aaf2be900fcb70"} Feb 19 21:45:03 crc kubenswrapper[4795]: I0219 21:45:03.698978 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d","Type":"ContainerStarted","Data":"026030d64ab176967289d43803e975c62df885e093dcaebf727c13a2d67da5ff"} Feb 19 21:45:03 crc kubenswrapper[4795]: I0219 21:45:03.702072 4795 generic.go:334] "Generic (PLEG): container finished" podID="1627c007-5a7c-4fa5-a15f-0da43560c849" containerID="6d51fcadf72cdef4da5baae1dcfd6cb93884c814c13eaf31041e3c1336f34a93" exitCode=0 Feb 19 21:45:03 crc kubenswrapper[4795]: I0219 21:45:03.702247 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525625-kzr2v" event={"ID":"1627c007-5a7c-4fa5-a15f-0da43560c849","Type":"ContainerDied","Data":"6d51fcadf72cdef4da5baae1dcfd6cb93884c814c13eaf31041e3c1336f34a93"} Feb 19 21:45:03 crc kubenswrapper[4795]: I0219 21:45:03.704679 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d","Type":"ContainerStarted","Data":"f5dc53fcd687359370a9224413921410e03027d27bb1e741143948af4422db6c"} Feb 19 21:45:06 crc kubenswrapper[4795]: 
I0219 21:45:06.594762 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525625-kzr2v" Feb 19 21:45:06 crc kubenswrapper[4795]: I0219 21:45:06.730315 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1627c007-5a7c-4fa5-a15f-0da43560c849-config-volume\") pod \"1627c007-5a7c-4fa5-a15f-0da43560c849\" (UID: \"1627c007-5a7c-4fa5-a15f-0da43560c849\") " Feb 19 21:45:06 crc kubenswrapper[4795]: I0219 21:45:06.730375 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frbhn\" (UniqueName: \"kubernetes.io/projected/1627c007-5a7c-4fa5-a15f-0da43560c849-kube-api-access-frbhn\") pod \"1627c007-5a7c-4fa5-a15f-0da43560c849\" (UID: \"1627c007-5a7c-4fa5-a15f-0da43560c849\") " Feb 19 21:45:06 crc kubenswrapper[4795]: I0219 21:45:06.730596 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1627c007-5a7c-4fa5-a15f-0da43560c849-secret-volume\") pod \"1627c007-5a7c-4fa5-a15f-0da43560c849\" (UID: \"1627c007-5a7c-4fa5-a15f-0da43560c849\") " Feb 19 21:45:06 crc kubenswrapper[4795]: I0219 21:45:06.731708 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1627c007-5a7c-4fa5-a15f-0da43560c849-config-volume" (OuterVolumeSpecName: "config-volume") pod "1627c007-5a7c-4fa5-a15f-0da43560c849" (UID: "1627c007-5a7c-4fa5-a15f-0da43560c849"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:45:06 crc kubenswrapper[4795]: I0219 21:45:06.735314 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1627c007-5a7c-4fa5-a15f-0da43560c849-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1627c007-5a7c-4fa5-a15f-0da43560c849" (UID: "1627c007-5a7c-4fa5-a15f-0da43560c849"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:45:06 crc kubenswrapper[4795]: I0219 21:45:06.736927 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525625-kzr2v" event={"ID":"1627c007-5a7c-4fa5-a15f-0da43560c849","Type":"ContainerDied","Data":"a0fb4a5ee44a5dda402292aff76ee26a3be9ac93d140292f388dd07787d2ee32"} Feb 19 21:45:06 crc kubenswrapper[4795]: I0219 21:45:06.736965 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0fb4a5ee44a5dda402292aff76ee26a3be9ac93d140292f388dd07787d2ee32" Feb 19 21:45:06 crc kubenswrapper[4795]: I0219 21:45:06.737022 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525625-kzr2v" Feb 19 21:45:06 crc kubenswrapper[4795]: I0219 21:45:06.737315 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1627c007-5a7c-4fa5-a15f-0da43560c849-kube-api-access-frbhn" (OuterVolumeSpecName: "kube-api-access-frbhn") pod "1627c007-5a7c-4fa5-a15f-0da43560c849" (UID: "1627c007-5a7c-4fa5-a15f-0da43560c849"). InnerVolumeSpecName "kube-api-access-frbhn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:45:06 crc kubenswrapper[4795]: I0219 21:45:06.739207 4795 generic.go:334] "Generic (PLEG): container finished" podID="2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d" containerID="026030d64ab176967289d43803e975c62df885e093dcaebf727c13a2d67da5ff" exitCode=0 Feb 19 21:45:06 crc kubenswrapper[4795]: I0219 21:45:06.739238 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d","Type":"ContainerDied","Data":"026030d64ab176967289d43803e975c62df885e093dcaebf727c13a2d67da5ff"} Feb 19 21:45:06 crc kubenswrapper[4795]: I0219 21:45:06.832897 4795 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1627c007-5a7c-4fa5-a15f-0da43560c849-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:06 crc kubenswrapper[4795]: I0219 21:45:06.832935 4795 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1627c007-5a7c-4fa5-a15f-0da43560c849-config-volume\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:06 crc kubenswrapper[4795]: I0219 21:45:06.832949 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frbhn\" (UniqueName: \"kubernetes.io/projected/1627c007-5a7c-4fa5-a15f-0da43560c849-kube-api-access-frbhn\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:07 crc kubenswrapper[4795]: I0219 21:45:07.756708 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d","Type":"ContainerStarted","Data":"ae770d1785f3d5deaa5b1b98adf315fd8a61a2cbce434ecb3970ab496579b196"} Feb 19 21:45:07 crc kubenswrapper[4795]: I0219 21:45:07.759313 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" 
event={"ID":"3c2bcb9c-07d3-4d71-924b-aacd537e3430","Type":"ContainerStarted","Data":"fc2aa4e6ca186f0a3259cef3b73108248f8b30394d6e7f6ee3356a21235bd96b"} Feb 19 21:45:07 crc kubenswrapper[4795]: I0219 21:45:07.762788 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"42e81a78-17fd-4ed3-b072-dd6a20cbe5d3","Type":"ContainerStarted","Data":"db08b177837bd80146274836c0977b72219e93219854a2c6c5e81a24922a33fe"} Feb 19 21:45:07 crc kubenswrapper[4795]: I0219 21:45:07.764016 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Feb 19 21:45:07 crc kubenswrapper[4795]: I0219 21:45:07.766781 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 19 21:45:07 crc kubenswrapper[4795]: I0219 21:45:07.768155 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"3c5a8678-8ce2-4bee-9160-37b1dea9f897","Type":"ContainerStarted","Data":"146b7776ad30bb62917c430a4cb976669fc9f5db740f7f370033e5be6e16f033"} Feb 19 21:45:07 crc kubenswrapper[4795]: I0219 21:45:07.782929 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=23.202192063 podStartE2EDuration="27.782870794s" podCreationTimestamp="2026-02-19 21:44:40 +0000 UTC" firstStartedPulling="2026-02-19 21:45:02.502001688 +0000 UTC m=+1013.694519552" lastFinishedPulling="2026-02-19 21:45:07.082680419 +0000 UTC m=+1018.275198283" observedRunningTime="2026-02-19 21:45:07.779422016 +0000 UTC m=+1018.971939880" watchObservedRunningTime="2026-02-19 21:45:07.782870794 +0000 UTC m=+1018.975388658" Feb 19 21:45:07 crc kubenswrapper[4795]: I0219 21:45:07.801986 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=20.692956864 podStartE2EDuration="25.801958023s" podCreationTimestamp="2026-02-19 21:44:42 +0000 UTC" 
firstStartedPulling="2026-02-19 21:45:02.339129786 +0000 UTC m=+1013.531647650" lastFinishedPulling="2026-02-19 21:45:07.448130955 +0000 UTC m=+1018.640648809" observedRunningTime="2026-02-19 21:45:07.794574714 +0000 UTC m=+1018.987092578" watchObservedRunningTime="2026-02-19 21:45:07.801958023 +0000 UTC m=+1018.994475887" Feb 19 21:45:08 crc kubenswrapper[4795]: I0219 21:45:08.778137 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-w9fbs" event={"ID":"c30e8522-2d7f-4f10-a0b4-a7cfc351d093","Type":"ContainerStarted","Data":"e6a343acb1888ef7b6cc47d70fee6b276aa66bddee8c2a595c2dde665a4a1a27"} Feb 19 21:45:08 crc kubenswrapper[4795]: I0219 21:45:08.778542 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-w9fbs" Feb 19 21:45:08 crc kubenswrapper[4795]: I0219 21:45:08.781996 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7b096325-542d-4ac6-8d16-8aa0937013b2","Type":"ContainerStarted","Data":"90e58bd420672619d861e1841541403fdbfcac1af5a896cc071e58d6af65cf55"} Feb 19 21:45:08 crc kubenswrapper[4795]: I0219 21:45:08.785971 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"bb3b374e-f01b-4997-9ecf-fbeeb384cc2c","Type":"ContainerStarted","Data":"0c46b510d414f62d57f7afe61292d3a54a60ed7655ea208a609c0d72a9940824"} Feb 19 21:45:08 crc kubenswrapper[4795]: I0219 21:45:08.788314 4795 generic.go:334] "Generic (PLEG): container finished" podID="9a19676c-9314-43a3-a2f8-bcf56d6b5ce3" containerID="000996048fc152f2f7ff89617b797c3b6c97a3bedd94f9b3f95b461725418fde" exitCode=0 Feb 19 21:45:08 crc kubenswrapper[4795]: I0219 21:45:08.788404 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tl5hf" event={"ID":"9a19676c-9314-43a3-a2f8-bcf56d6b5ce3","Type":"ContainerDied","Data":"000996048fc152f2f7ff89617b797c3b6c97a3bedd94f9b3f95b461725418fde"} Feb 19 21:45:08 crc 
kubenswrapper[4795]: I0219 21:45:08.795662 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-w9fbs" podStartSLOduration=16.550970748 podStartE2EDuration="21.795646581s" podCreationTimestamp="2026-02-19 21:44:47 +0000 UTC" firstStartedPulling="2026-02-19 21:45:02.138302511 +0000 UTC m=+1013.330820405" lastFinishedPulling="2026-02-19 21:45:07.382978374 +0000 UTC m=+1018.575496238" observedRunningTime="2026-02-19 21:45:08.795398864 +0000 UTC m=+1019.987916728" watchObservedRunningTime="2026-02-19 21:45:08.795646581 +0000 UTC m=+1019.988164445" Feb 19 21:45:08 crc kubenswrapper[4795]: I0219 21:45:08.818835 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=29.320914657 podStartE2EDuration="29.818820815s" podCreationTimestamp="2026-02-19 21:44:39 +0000 UTC" firstStartedPulling="2026-02-19 21:45:02.340820664 +0000 UTC m=+1013.533338528" lastFinishedPulling="2026-02-19 21:45:02.838726822 +0000 UTC m=+1014.031244686" observedRunningTime="2026-02-19 21:45:08.813259618 +0000 UTC m=+1020.005777492" watchObservedRunningTime="2026-02-19 21:45:08.818820815 +0000 UTC m=+1020.011338679" Feb 19 21:45:09 crc kubenswrapper[4795]: I0219 21:45:09.814510 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"3c2bcb9c-07d3-4d71-924b-aacd537e3430","Type":"ContainerStarted","Data":"e932ce1114c0c3a49c8f6332f06a4a9aadb5f1382200346f37f0e3e9fe2d3373"} Feb 19 21:45:09 crc kubenswrapper[4795]: I0219 21:45:09.841877 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-p9cs4"] Feb 19 21:45:09 crc kubenswrapper[4795]: E0219 21:45:09.842322 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1627c007-5a7c-4fa5-a15f-0da43560c849" containerName="collect-profiles" Feb 19 21:45:09 crc kubenswrapper[4795]: I0219 21:45:09.842342 4795 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="1627c007-5a7c-4fa5-a15f-0da43560c849" containerName="collect-profiles" Feb 19 21:45:09 crc kubenswrapper[4795]: I0219 21:45:09.842564 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="1627c007-5a7c-4fa5-a15f-0da43560c849" containerName="collect-profiles" Feb 19 21:45:09 crc kubenswrapper[4795]: I0219 21:45:09.843230 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-p9cs4" Feb 19 21:45:09 crc kubenswrapper[4795]: I0219 21:45:09.845661 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Feb 19 21:45:09 crc kubenswrapper[4795]: I0219 21:45:09.848514 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-p9cs4"] Feb 19 21:45:09 crc kubenswrapper[4795]: I0219 21:45:09.851574 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=14.927591584 podStartE2EDuration="21.851556277s" podCreationTimestamp="2026-02-19 21:44:48 +0000 UTC" firstStartedPulling="2026-02-19 21:45:02.597668751 +0000 UTC m=+1013.790186615" lastFinishedPulling="2026-02-19 21:45:09.521633404 +0000 UTC m=+1020.714151308" observedRunningTime="2026-02-19 21:45:09.844795526 +0000 UTC m=+1021.037313390" watchObservedRunningTime="2026-02-19 21:45:09.851556277 +0000 UTC m=+1021.044074131" Feb 19 21:45:09 crc kubenswrapper[4795]: I0219 21:45:09.854127 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tl5hf" event={"ID":"9a19676c-9314-43a3-a2f8-bcf56d6b5ce3","Type":"ContainerStarted","Data":"e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23"} Feb 19 21:45:09 crc kubenswrapper[4795]: I0219 21:45:09.862351 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" 
event={"ID":"3c5a8678-8ce2-4bee-9160-37b1dea9f897","Type":"ContainerStarted","Data":"9afa6d2d9c1c3abf2795f79e56d0f2c85700d5b1b8b011dec52725e8bbc63d6b"} Feb 19 21:45:09 crc kubenswrapper[4795]: I0219 21:45:09.899736 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wdlj\" (UniqueName: \"kubernetes.io/projected/eee0ea5d-4b43-4421-b23e-555c5eac3564-kube-api-access-6wdlj\") pod \"ovn-controller-metrics-p9cs4\" (UID: \"eee0ea5d-4b43-4421-b23e-555c5eac3564\") " pod="openstack/ovn-controller-metrics-p9cs4" Feb 19 21:45:09 crc kubenswrapper[4795]: I0219 21:45:09.900498 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/eee0ea5d-4b43-4421-b23e-555c5eac3564-ovs-rundir\") pod \"ovn-controller-metrics-p9cs4\" (UID: \"eee0ea5d-4b43-4421-b23e-555c5eac3564\") " pod="openstack/ovn-controller-metrics-p9cs4" Feb 19 21:45:09 crc kubenswrapper[4795]: I0219 21:45:09.900547 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/eee0ea5d-4b43-4421-b23e-555c5eac3564-ovn-rundir\") pod \"ovn-controller-metrics-p9cs4\" (UID: \"eee0ea5d-4b43-4421-b23e-555c5eac3564\") " pod="openstack/ovn-controller-metrics-p9cs4" Feb 19 21:45:09 crc kubenswrapper[4795]: I0219 21:45:09.900588 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/eee0ea5d-4b43-4421-b23e-555c5eac3564-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-p9cs4\" (UID: \"eee0ea5d-4b43-4421-b23e-555c5eac3564\") " pod="openstack/ovn-controller-metrics-p9cs4" Feb 19 21:45:09 crc kubenswrapper[4795]: I0219 21:45:09.900742 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/eee0ea5d-4b43-4421-b23e-555c5eac3564-combined-ca-bundle\") pod \"ovn-controller-metrics-p9cs4\" (UID: \"eee0ea5d-4b43-4421-b23e-555c5eac3564\") " pod="openstack/ovn-controller-metrics-p9cs4" Feb 19 21:45:09 crc kubenswrapper[4795]: I0219 21:45:09.901751 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eee0ea5d-4b43-4421-b23e-555c5eac3564-config\") pod \"ovn-controller-metrics-p9cs4\" (UID: \"eee0ea5d-4b43-4421-b23e-555c5eac3564\") " pod="openstack/ovn-controller-metrics-p9cs4" Feb 19 21:45:09 crc kubenswrapper[4795]: I0219 21:45:09.991077 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=17.919675076 podStartE2EDuration="23.991057378s" podCreationTimestamp="2026-02-19 21:44:46 +0000 UTC" firstStartedPulling="2026-02-19 21:45:03.442290407 +0000 UTC m=+1014.634808271" lastFinishedPulling="2026-02-19 21:45:09.513672699 +0000 UTC m=+1020.706190573" observedRunningTime="2026-02-19 21:45:09.912302293 +0000 UTC m=+1021.104820157" watchObservedRunningTime="2026-02-19 21:45:09.991057378 +0000 UTC m=+1021.183575242" Feb 19 21:45:09 crc kubenswrapper[4795]: I0219 21:45:09.995707 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f54874ffc-9rmlj"] Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.004353 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wdlj\" (UniqueName: \"kubernetes.io/projected/eee0ea5d-4b43-4421-b23e-555c5eac3564-kube-api-access-6wdlj\") pod \"ovn-controller-metrics-p9cs4\" (UID: \"eee0ea5d-4b43-4421-b23e-555c5eac3564\") " pod="openstack/ovn-controller-metrics-p9cs4" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.004395 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: 
\"kubernetes.io/host-path/eee0ea5d-4b43-4421-b23e-555c5eac3564-ovs-rundir\") pod \"ovn-controller-metrics-p9cs4\" (UID: \"eee0ea5d-4b43-4421-b23e-555c5eac3564\") " pod="openstack/ovn-controller-metrics-p9cs4" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.004418 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/eee0ea5d-4b43-4421-b23e-555c5eac3564-ovn-rundir\") pod \"ovn-controller-metrics-p9cs4\" (UID: \"eee0ea5d-4b43-4421-b23e-555c5eac3564\") " pod="openstack/ovn-controller-metrics-p9cs4" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.004445 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/eee0ea5d-4b43-4421-b23e-555c5eac3564-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-p9cs4\" (UID: \"eee0ea5d-4b43-4421-b23e-555c5eac3564\") " pod="openstack/ovn-controller-metrics-p9cs4" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.004497 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eee0ea5d-4b43-4421-b23e-555c5eac3564-combined-ca-bundle\") pod \"ovn-controller-metrics-p9cs4\" (UID: \"eee0ea5d-4b43-4421-b23e-555c5eac3564\") " pod="openstack/ovn-controller-metrics-p9cs4" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.004512 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eee0ea5d-4b43-4421-b23e-555c5eac3564-config\") pod \"ovn-controller-metrics-p9cs4\" (UID: \"eee0ea5d-4b43-4421-b23e-555c5eac3564\") " pod="openstack/ovn-controller-metrics-p9cs4" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.005514 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eee0ea5d-4b43-4421-b23e-555c5eac3564-config\") pod 
\"ovn-controller-metrics-p9cs4\" (UID: \"eee0ea5d-4b43-4421-b23e-555c5eac3564\") " pod="openstack/ovn-controller-metrics-p9cs4" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.005555 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/eee0ea5d-4b43-4421-b23e-555c5eac3564-ovn-rundir\") pod \"ovn-controller-metrics-p9cs4\" (UID: \"eee0ea5d-4b43-4421-b23e-555c5eac3564\") " pod="openstack/ovn-controller-metrics-p9cs4" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.005885 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/eee0ea5d-4b43-4421-b23e-555c5eac3564-ovs-rundir\") pod \"ovn-controller-metrics-p9cs4\" (UID: \"eee0ea5d-4b43-4421-b23e-555c5eac3564\") " pod="openstack/ovn-controller-metrics-p9cs4" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.018416 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/eee0ea5d-4b43-4421-b23e-555c5eac3564-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-p9cs4\" (UID: \"eee0ea5d-4b43-4421-b23e-555c5eac3564\") " pod="openstack/ovn-controller-metrics-p9cs4" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.018866 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eee0ea5d-4b43-4421-b23e-555c5eac3564-combined-ca-bundle\") pod \"ovn-controller-metrics-p9cs4\" (UID: \"eee0ea5d-4b43-4421-b23e-555c5eac3564\") " pod="openstack/ovn-controller-metrics-p9cs4" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.035311 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57bdd75c-sdlxp"] Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.037303 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57bdd75c-sdlxp" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.039204 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.041265 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wdlj\" (UniqueName: \"kubernetes.io/projected/eee0ea5d-4b43-4421-b23e-555c5eac3564-kube-api-access-6wdlj\") pod \"ovn-controller-metrics-p9cs4\" (UID: \"eee0ea5d-4b43-4421-b23e-555c5eac3564\") " pod="openstack/ovn-controller-metrics-p9cs4" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.063972 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57bdd75c-sdlxp"] Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.112964 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/267a3a62-4f3e-43c8-a1a8-8b47e9d17e80-ovsdbserver-nb\") pod \"dnsmasq-dns-57bdd75c-sdlxp\" (UID: \"267a3a62-4f3e-43c8-a1a8-8b47e9d17e80\") " pod="openstack/dnsmasq-dns-57bdd75c-sdlxp" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.113254 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2p7v\" (UniqueName: \"kubernetes.io/projected/267a3a62-4f3e-43c8-a1a8-8b47e9d17e80-kube-api-access-n2p7v\") pod \"dnsmasq-dns-57bdd75c-sdlxp\" (UID: \"267a3a62-4f3e-43c8-a1a8-8b47e9d17e80\") " pod="openstack/dnsmasq-dns-57bdd75c-sdlxp" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.113364 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/267a3a62-4f3e-43c8-a1a8-8b47e9d17e80-config\") pod \"dnsmasq-dns-57bdd75c-sdlxp\" (UID: \"267a3a62-4f3e-43c8-a1a8-8b47e9d17e80\") " pod="openstack/dnsmasq-dns-57bdd75c-sdlxp" 
Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.113440 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/267a3a62-4f3e-43c8-a1a8-8b47e9d17e80-dns-svc\") pod \"dnsmasq-dns-57bdd75c-sdlxp\" (UID: \"267a3a62-4f3e-43c8-a1a8-8b47e9d17e80\") " pod="openstack/dnsmasq-dns-57bdd75c-sdlxp" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.143141 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67ff45466c-74c7x"] Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.172053 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-p9cs4" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.175805 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75b7bcc64f-hk588"] Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.188917 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75b7bcc64f-hk588" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.190903 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.215863 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24-ovsdbserver-sb\") pod \"dnsmasq-dns-75b7bcc64f-hk588\" (UID: \"9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24\") " pod="openstack/dnsmasq-dns-75b7bcc64f-hk588" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.215905 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24-dns-svc\") pod \"dnsmasq-dns-75b7bcc64f-hk588\" (UID: \"9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24\") " pod="openstack/dnsmasq-dns-75b7bcc64f-hk588" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.215955 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2p7v\" (UniqueName: \"kubernetes.io/projected/267a3a62-4f3e-43c8-a1a8-8b47e9d17e80-kube-api-access-n2p7v\") pod \"dnsmasq-dns-57bdd75c-sdlxp\" (UID: \"267a3a62-4f3e-43c8-a1a8-8b47e9d17e80\") " pod="openstack/dnsmasq-dns-57bdd75c-sdlxp" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.215990 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xddmw\" (UniqueName: \"kubernetes.io/projected/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24-kube-api-access-xddmw\") pod \"dnsmasq-dns-75b7bcc64f-hk588\" (UID: \"9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24\") " pod="openstack/dnsmasq-dns-75b7bcc64f-hk588" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.216018 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/267a3a62-4f3e-43c8-a1a8-8b47e9d17e80-config\") pod \"dnsmasq-dns-57bdd75c-sdlxp\" (UID: \"267a3a62-4f3e-43c8-a1a8-8b47e9d17e80\") " pod="openstack/dnsmasq-dns-57bdd75c-sdlxp" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.216051 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24-ovsdbserver-nb\") pod \"dnsmasq-dns-75b7bcc64f-hk588\" (UID: \"9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24\") " pod="openstack/dnsmasq-dns-75b7bcc64f-hk588" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.216076 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/267a3a62-4f3e-43c8-a1a8-8b47e9d17e80-dns-svc\") pod \"dnsmasq-dns-57bdd75c-sdlxp\" (UID: \"267a3a62-4f3e-43c8-a1a8-8b47e9d17e80\") " pod="openstack/dnsmasq-dns-57bdd75c-sdlxp" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.216108 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/267a3a62-4f3e-43c8-a1a8-8b47e9d17e80-ovsdbserver-nb\") pod \"dnsmasq-dns-57bdd75c-sdlxp\" (UID: \"267a3a62-4f3e-43c8-a1a8-8b47e9d17e80\") " pod="openstack/dnsmasq-dns-57bdd75c-sdlxp" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.216139 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24-config\") pod \"dnsmasq-dns-75b7bcc64f-hk588\" (UID: \"9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24\") " pod="openstack/dnsmasq-dns-75b7bcc64f-hk588" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.220645 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/267a3a62-4f3e-43c8-a1a8-8b47e9d17e80-config\") pod \"dnsmasq-dns-57bdd75c-sdlxp\" (UID: \"267a3a62-4f3e-43c8-a1a8-8b47e9d17e80\") " pod="openstack/dnsmasq-dns-57bdd75c-sdlxp" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.221132 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/267a3a62-4f3e-43c8-a1a8-8b47e9d17e80-dns-svc\") pod \"dnsmasq-dns-57bdd75c-sdlxp\" (UID: \"267a3a62-4f3e-43c8-a1a8-8b47e9d17e80\") " pod="openstack/dnsmasq-dns-57bdd75c-sdlxp" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.222228 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/267a3a62-4f3e-43c8-a1a8-8b47e9d17e80-ovsdbserver-nb\") pod \"dnsmasq-dns-57bdd75c-sdlxp\" (UID: \"267a3a62-4f3e-43c8-a1a8-8b47e9d17e80\") " pod="openstack/dnsmasq-dns-57bdd75c-sdlxp" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.239579 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75b7bcc64f-hk588"] Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.242855 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2p7v\" (UniqueName: \"kubernetes.io/projected/267a3a62-4f3e-43c8-a1a8-8b47e9d17e80-kube-api-access-n2p7v\") pod \"dnsmasq-dns-57bdd75c-sdlxp\" (UID: \"267a3a62-4f3e-43c8-a1a8-8b47e9d17e80\") " pod="openstack/dnsmasq-dns-57bdd75c-sdlxp" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.329007 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24-config\") pod \"dnsmasq-dns-75b7bcc64f-hk588\" (UID: \"9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24\") " pod="openstack/dnsmasq-dns-75b7bcc64f-hk588" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.329118 4795 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24-ovsdbserver-sb\") pod \"dnsmasq-dns-75b7bcc64f-hk588\" (UID: \"9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24\") " pod="openstack/dnsmasq-dns-75b7bcc64f-hk588" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.329144 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24-dns-svc\") pod \"dnsmasq-dns-75b7bcc64f-hk588\" (UID: \"9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24\") " pod="openstack/dnsmasq-dns-75b7bcc64f-hk588" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.329244 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xddmw\" (UniqueName: \"kubernetes.io/projected/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24-kube-api-access-xddmw\") pod \"dnsmasq-dns-75b7bcc64f-hk588\" (UID: \"9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24\") " pod="openstack/dnsmasq-dns-75b7bcc64f-hk588" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.329319 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24-ovsdbserver-nb\") pod \"dnsmasq-dns-75b7bcc64f-hk588\" (UID: \"9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24\") " pod="openstack/dnsmasq-dns-75b7bcc64f-hk588" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.330553 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24-ovsdbserver-nb\") pod \"dnsmasq-dns-75b7bcc64f-hk588\" (UID: \"9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24\") " pod="openstack/dnsmasq-dns-75b7bcc64f-hk588" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.331183 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24-dns-svc\") pod \"dnsmasq-dns-75b7bcc64f-hk588\" (UID: \"9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24\") " pod="openstack/dnsmasq-dns-75b7bcc64f-hk588" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.331594 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24-config\") pod \"dnsmasq-dns-75b7bcc64f-hk588\" (UID: \"9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24\") " pod="openstack/dnsmasq-dns-75b7bcc64f-hk588" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.333741 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24-ovsdbserver-sb\") pod \"dnsmasq-dns-75b7bcc64f-hk588\" (UID: \"9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24\") " pod="openstack/dnsmasq-dns-75b7bcc64f-hk588" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.348101 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f54874ffc-9rmlj" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.353016 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xddmw\" (UniqueName: \"kubernetes.io/projected/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24-kube-api-access-xddmw\") pod \"dnsmasq-dns-75b7bcc64f-hk588\" (UID: \"9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24\") " pod="openstack/dnsmasq-dns-75b7bcc64f-hk588" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.363033 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57bdd75c-sdlxp" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.374999 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.423482 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.430808 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/880246d9-9662-47e8-a0ff-5d2aca6de029-dns-svc\") pod \"880246d9-9662-47e8-a0ff-5d2aca6de029\" (UID: \"880246d9-9662-47e8-a0ff-5d2aca6de029\") " Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.430978 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2zdpq\" (UniqueName: \"kubernetes.io/projected/880246d9-9662-47e8-a0ff-5d2aca6de029-kube-api-access-2zdpq\") pod \"880246d9-9662-47e8-a0ff-5d2aca6de029\" (UID: \"880246d9-9662-47e8-a0ff-5d2aca6de029\") " Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.431007 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/880246d9-9662-47e8-a0ff-5d2aca6de029-config\") pod \"880246d9-9662-47e8-a0ff-5d2aca6de029\" (UID: \"880246d9-9662-47e8-a0ff-5d2aca6de029\") " Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.433140 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/880246d9-9662-47e8-a0ff-5d2aca6de029-config" (OuterVolumeSpecName: "config") pod "880246d9-9662-47e8-a0ff-5d2aca6de029" (UID: "880246d9-9662-47e8-a0ff-5d2aca6de029"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.433642 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/880246d9-9662-47e8-a0ff-5d2aca6de029-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "880246d9-9662-47e8-a0ff-5d2aca6de029" (UID: "880246d9-9662-47e8-a0ff-5d2aca6de029"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.444719 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/880246d9-9662-47e8-a0ff-5d2aca6de029-kube-api-access-2zdpq" (OuterVolumeSpecName: "kube-api-access-2zdpq") pod "880246d9-9662-47e8-a0ff-5d2aca6de029" (UID: "880246d9-9662-47e8-a0ff-5d2aca6de029"). InnerVolumeSpecName "kube-api-access-2zdpq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.468821 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67ff45466c-74c7x" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.512389 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75b7bcc64f-hk588" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.532671 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5cb7920e-685c-4bb7-b276-3bf902251bd7-config\") pod \"5cb7920e-685c-4bb7-b276-3bf902251bd7\" (UID: \"5cb7920e-685c-4bb7-b276-3bf902251bd7\") " Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.532833 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ss4wh\" (UniqueName: \"kubernetes.io/projected/5cb7920e-685c-4bb7-b276-3bf902251bd7-kube-api-access-ss4wh\") pod \"5cb7920e-685c-4bb7-b276-3bf902251bd7\" (UID: \"5cb7920e-685c-4bb7-b276-3bf902251bd7\") " Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.532880 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5cb7920e-685c-4bb7-b276-3bf902251bd7-dns-svc\") pod \"5cb7920e-685c-4bb7-b276-3bf902251bd7\" (UID: \"5cb7920e-685c-4bb7-b276-3bf902251bd7\") " Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.533231 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2zdpq\" (UniqueName: \"kubernetes.io/projected/880246d9-9662-47e8-a0ff-5d2aca6de029-kube-api-access-2zdpq\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.533252 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/880246d9-9662-47e8-a0ff-5d2aca6de029-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.533262 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/880246d9-9662-47e8-a0ff-5d2aca6de029-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.533309 4795 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5cb7920e-685c-4bb7-b276-3bf902251bd7-config" (OuterVolumeSpecName: "config") pod "5cb7920e-685c-4bb7-b276-3bf902251bd7" (UID: "5cb7920e-685c-4bb7-b276-3bf902251bd7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.534975 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5cb7920e-685c-4bb7-b276-3bf902251bd7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5cb7920e-685c-4bb7-b276-3bf902251bd7" (UID: "5cb7920e-685c-4bb7-b276-3bf902251bd7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.544362 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5cb7920e-685c-4bb7-b276-3bf902251bd7-kube-api-access-ss4wh" (OuterVolumeSpecName: "kube-api-access-ss4wh") pod "5cb7920e-685c-4bb7-b276-3bf902251bd7" (UID: "5cb7920e-685c-4bb7-b276-3bf902251bd7"). InnerVolumeSpecName "kube-api-access-ss4wh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.635142 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ss4wh\" (UniqueName: \"kubernetes.io/projected/5cb7920e-685c-4bb7-b276-3bf902251bd7-kube-api-access-ss4wh\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.635980 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5cb7920e-685c-4bb7-b276-3bf902251bd7-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.635994 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5cb7920e-685c-4bb7-b276-3bf902251bd7-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.713006 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-p9cs4"] Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.852305 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.852359 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.856326 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57bdd75c-sdlxp"] Feb 19 21:45:10 crc kubenswrapper[4795]: W0219 21:45:10.867612 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod267a3a62_4f3e_43c8_a1a8_8b47e9d17e80.slice/crio-f96b347d54ce26b56709e3b669b660d82336d8ce5d8f3ba525a2a9bc9a62104b WatchSource:0}: Error finding container f96b347d54ce26b56709e3b669b660d82336d8ce5d8f3ba525a2a9bc9a62104b: Status 404 returned error can't find the container with id 
f96b347d54ce26b56709e3b669b660d82336d8ce5d8f3ba525a2a9bc9a62104b Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.871946 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67ff45466c-74c7x" event={"ID":"5cb7920e-685c-4bb7-b276-3bf902251bd7","Type":"ContainerDied","Data":"d22d6d60f193bcb6a68565e5f85094654e0386ed18eb92971e096bc7c18b8ee4"} Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.872065 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67ff45466c-74c7x" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.876868 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tl5hf" event={"ID":"9a19676c-9314-43a3-a2f8-bcf56d6b5ce3","Type":"ContainerStarted","Data":"ebc73fca22ccd7aa64a424b1eb2e6d8556b809c91bf36bb6c2bbe679d1c5f3bf"} Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.876927 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-tl5hf" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.876952 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-tl5hf" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.879531 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f54874ffc-9rmlj" event={"ID":"880246d9-9662-47e8-a0ff-5d2aca6de029","Type":"ContainerDied","Data":"236c5910103a0e9c6c7b85c7e035781b7dc5db0e7378799d12ebd2dd79bce1b2"} Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.879596 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f54874ffc-9rmlj" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.882721 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-p9cs4" event={"ID":"eee0ea5d-4b43-4421-b23e-555c5eac3564","Type":"ContainerStarted","Data":"eaba90113d6ff0b858d733af82b8a4a862659df0d41e63fdc645db66d9298341"} Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.883326 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.909194 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-tl5hf" podStartSLOduration=18.932413798 podStartE2EDuration="23.90915543s" podCreationTimestamp="2026-02-19 21:44:47 +0000 UTC" firstStartedPulling="2026-02-19 21:45:02.411065698 +0000 UTC m=+1013.603583562" lastFinishedPulling="2026-02-19 21:45:07.38780733 +0000 UTC m=+1018.580325194" observedRunningTime="2026-02-19 21:45:10.89571431 +0000 UTC m=+1022.088232174" watchObservedRunningTime="2026-02-19 21:45:10.90915543 +0000 UTC m=+1022.101673314" Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.934252 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67ff45466c-74c7x"] Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.948260 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67ff45466c-74c7x"] Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.961557 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75b7bcc64f-hk588"] Feb 19 21:45:10 crc kubenswrapper[4795]: W0219 21:45:10.973475 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9aa18e6a_5d0f_4f6e_b36c_3a2b9e2d0d24.slice/crio-30d488a86fb4ac5c67d31e2d6e5fe67d75ebbbcc1b14297f26623cb276be726d WatchSource:0}: Error 
finding container 30d488a86fb4ac5c67d31e2d6e5fe67d75ebbbcc1b14297f26623cb276be726d: Status 404 returned error can't find the container with id 30d488a86fb4ac5c67d31e2d6e5fe67d75ebbbcc1b14297f26623cb276be726d Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.977385 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f54874ffc-9rmlj"] Feb 19 21:45:10 crc kubenswrapper[4795]: I0219 21:45:10.981578 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-f54874ffc-9rmlj"] Feb 19 21:45:11 crc kubenswrapper[4795]: I0219 21:45:11.488385 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Feb 19 21:45:11 crc kubenswrapper[4795]: I0219 21:45:11.536043 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5cb7920e-685c-4bb7-b276-3bf902251bd7" path="/var/lib/kubelet/pods/5cb7920e-685c-4bb7-b276-3bf902251bd7/volumes" Feb 19 21:45:11 crc kubenswrapper[4795]: I0219 21:45:11.536846 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="880246d9-9662-47e8-a0ff-5d2aca6de029" path="/var/lib/kubelet/pods/880246d9-9662-47e8-a0ff-5d2aca6de029/volumes" Feb 19 21:45:11 crc kubenswrapper[4795]: I0219 21:45:11.544581 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Feb 19 21:45:11 crc kubenswrapper[4795]: I0219 21:45:11.890665 4795 generic.go:334] "Generic (PLEG): container finished" podID="9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24" containerID="7dadad1b3dff5c98d556b48e60505152135bae2b96f577c3d9bca2b9ab991bb6" exitCode=0 Feb 19 21:45:11 crc kubenswrapper[4795]: I0219 21:45:11.890742 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75b7bcc64f-hk588" event={"ID":"9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24","Type":"ContainerDied","Data":"7dadad1b3dff5c98d556b48e60505152135bae2b96f577c3d9bca2b9ab991bb6"} Feb 19 21:45:11 crc kubenswrapper[4795]: I0219 
21:45:11.890772 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75b7bcc64f-hk588" event={"ID":"9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24","Type":"ContainerStarted","Data":"30d488a86fb4ac5c67d31e2d6e5fe67d75ebbbcc1b14297f26623cb276be726d"} Feb 19 21:45:11 crc kubenswrapper[4795]: I0219 21:45:11.892719 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-p9cs4" event={"ID":"eee0ea5d-4b43-4421-b23e-555c5eac3564","Type":"ContainerStarted","Data":"3d2f2fb944954b1447f44f88d21764d52d00b1e2faecd4fc556d6cdb1c4d63f4"} Feb 19 21:45:11 crc kubenswrapper[4795]: I0219 21:45:11.898705 4795 generic.go:334] "Generic (PLEG): container finished" podID="267a3a62-4f3e-43c8-a1a8-8b47e9d17e80" containerID="7eff9819df5a88a947e58d27d572069ed1a99c521c3bc92a815ae47f9f49adad" exitCode=0 Feb 19 21:45:11 crc kubenswrapper[4795]: I0219 21:45:11.900918 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57bdd75c-sdlxp" event={"ID":"267a3a62-4f3e-43c8-a1a8-8b47e9d17e80","Type":"ContainerDied","Data":"7eff9819df5a88a947e58d27d572069ed1a99c521c3bc92a815ae47f9f49adad"} Feb 19 21:45:11 crc kubenswrapper[4795]: I0219 21:45:11.901001 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57bdd75c-sdlxp" event={"ID":"267a3a62-4f3e-43c8-a1a8-8b47e9d17e80","Type":"ContainerStarted","Data":"f96b347d54ce26b56709e3b669b660d82336d8ce5d8f3ba525a2a9bc9a62104b"} Feb 19 21:45:11 crc kubenswrapper[4795]: I0219 21:45:11.902675 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Feb 19 21:45:11 crc kubenswrapper[4795]: I0219 21:45:11.977502 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-p9cs4" podStartSLOduration=2.977486667 podStartE2EDuration="2.977486667s" podCreationTimestamp="2026-02-19 21:45:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:45:11.959859549 +0000 UTC m=+1023.152377433" watchObservedRunningTime="2026-02-19 21:45:11.977486667 +0000 UTC m=+1023.170004531" Feb 19 21:45:12 crc kubenswrapper[4795]: I0219 21:45:12.524185 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Feb 19 21:45:12 crc kubenswrapper[4795]: I0219 21:45:12.910068 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75b7bcc64f-hk588" event={"ID":"9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24","Type":"ContainerStarted","Data":"05ca792cce2c1842654ca399f54a44b1f3a2e077f6b7464bb9380e6c32ea3603"} Feb 19 21:45:12 crc kubenswrapper[4795]: I0219 21:45:12.910319 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-75b7bcc64f-hk588" Feb 19 21:45:12 crc kubenswrapper[4795]: I0219 21:45:12.912461 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57bdd75c-sdlxp" event={"ID":"267a3a62-4f3e-43c8-a1a8-8b47e9d17e80","Type":"ContainerStarted","Data":"66417e96a24bb0cf529cc2cfe3ecba9c66996b31063cea3b0971bb91b892c130"} Feb 19 21:45:12 crc kubenswrapper[4795]: I0219 21:45:12.936439 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-75b7bcc64f-hk588" podStartSLOduration=2.498844959 podStartE2EDuration="2.936411042s" podCreationTimestamp="2026-02-19 21:45:10 +0000 UTC" firstStartedPulling="2026-02-19 21:45:10.985344013 +0000 UTC m=+1022.177861877" lastFinishedPulling="2026-02-19 21:45:11.422910096 +0000 UTC m=+1022.615427960" observedRunningTime="2026-02-19 21:45:12.931513674 +0000 UTC m=+1024.124031548" watchObservedRunningTime="2026-02-19 21:45:12.936411042 +0000 UTC m=+1024.128928946" Feb 19 21:45:12 crc kubenswrapper[4795]: I0219 21:45:12.954177 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57bdd75c-sdlxp" 
podStartSLOduration=2.481070436 podStartE2EDuration="2.954144704s" podCreationTimestamp="2026-02-19 21:45:10 +0000 UTC" firstStartedPulling="2026-02-19 21:45:10.871346361 +0000 UTC m=+1022.063864225" lastFinishedPulling="2026-02-19 21:45:11.344420619 +0000 UTC m=+1022.536938493" observedRunningTime="2026-02-19 21:45:12.950723067 +0000 UTC m=+1024.143240941" watchObservedRunningTime="2026-02-19 21:45:12.954144704 +0000 UTC m=+1024.146662568" Feb 19 21:45:13 crc kubenswrapper[4795]: I0219 21:45:13.206878 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 19 21:45:13 crc kubenswrapper[4795]: I0219 21:45:13.378817 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Feb 19 21:45:13 crc kubenswrapper[4795]: I0219 21:45:13.455371 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Feb 19 21:45:13 crc kubenswrapper[4795]: I0219 21:45:13.917893 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57bdd75c-sdlxp" Feb 19 21:45:14 crc kubenswrapper[4795]: I0219 21:45:14.417204 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Feb 19 21:45:14 crc kubenswrapper[4795]: I0219 21:45:14.607153 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Feb 19 21:45:14 crc kubenswrapper[4795]: I0219 21:45:14.608333 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Feb 19 21:45:14 crc kubenswrapper[4795]: I0219 21:45:14.610379 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-x48sf" Feb 19 21:45:14 crc kubenswrapper[4795]: I0219 21:45:14.610959 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Feb 19 21:45:14 crc kubenswrapper[4795]: I0219 21:45:14.611068 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Feb 19 21:45:14 crc kubenswrapper[4795]: I0219 21:45:14.611093 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Feb 19 21:45:14 crc kubenswrapper[4795]: I0219 21:45:14.631051 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 19 21:45:14 crc kubenswrapper[4795]: I0219 21:45:14.697100 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-scripts\") pod \"ovn-northd-0\" (UID: \"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c\") " pod="openstack/ovn-northd-0" Feb 19 21:45:14 crc kubenswrapper[4795]: I0219 21:45:14.697156 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c\") " pod="openstack/ovn-northd-0" Feb 19 21:45:14 crc kubenswrapper[4795]: I0219 21:45:14.697296 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxwxt\" (UniqueName: \"kubernetes.io/projected/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-kube-api-access-xxwxt\") pod \"ovn-northd-0\" (UID: \"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c\") " 
pod="openstack/ovn-northd-0" Feb 19 21:45:14 crc kubenswrapper[4795]: I0219 21:45:14.697338 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c\") " pod="openstack/ovn-northd-0" Feb 19 21:45:14 crc kubenswrapper[4795]: I0219 21:45:14.697432 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-config\") pod \"ovn-northd-0\" (UID: \"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c\") " pod="openstack/ovn-northd-0" Feb 19 21:45:14 crc kubenswrapper[4795]: I0219 21:45:14.697459 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c\") " pod="openstack/ovn-northd-0" Feb 19 21:45:14 crc kubenswrapper[4795]: I0219 21:45:14.697518 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c\") " pod="openstack/ovn-northd-0" Feb 19 21:45:14 crc kubenswrapper[4795]: I0219 21:45:14.799069 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c\") " pod="openstack/ovn-northd-0" Feb 19 21:45:14 crc kubenswrapper[4795]: I0219 21:45:14.799180 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c\") " pod="openstack/ovn-northd-0" Feb 19 21:45:14 crc kubenswrapper[4795]: I0219 21:45:14.799218 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-scripts\") pod \"ovn-northd-0\" (UID: \"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c\") " pod="openstack/ovn-northd-0" Feb 19 21:45:14 crc kubenswrapper[4795]: I0219 21:45:14.799256 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c\") " pod="openstack/ovn-northd-0" Feb 19 21:45:14 crc kubenswrapper[4795]: I0219 21:45:14.799322 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxwxt\" (UniqueName: \"kubernetes.io/projected/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-kube-api-access-xxwxt\") pod \"ovn-northd-0\" (UID: \"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c\") " pod="openstack/ovn-northd-0" Feb 19 21:45:14 crc kubenswrapper[4795]: I0219 21:45:14.799352 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c\") " pod="openstack/ovn-northd-0" Feb 19 21:45:14 crc kubenswrapper[4795]: I0219 21:45:14.799420 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-config\") pod \"ovn-northd-0\" (UID: 
\"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c\") " pod="openstack/ovn-northd-0" Feb 19 21:45:14 crc kubenswrapper[4795]: I0219 21:45:14.800664 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c\") " pod="openstack/ovn-northd-0" Feb 19 21:45:14 crc kubenswrapper[4795]: I0219 21:45:14.800724 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-scripts\") pod \"ovn-northd-0\" (UID: \"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c\") " pod="openstack/ovn-northd-0" Feb 19 21:45:14 crc kubenswrapper[4795]: I0219 21:45:14.801830 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-config\") pod \"ovn-northd-0\" (UID: \"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c\") " pod="openstack/ovn-northd-0" Feb 19 21:45:14 crc kubenswrapper[4795]: I0219 21:45:14.805536 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c\") " pod="openstack/ovn-northd-0" Feb 19 21:45:14 crc kubenswrapper[4795]: I0219 21:45:14.810783 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c\") " pod="openstack/ovn-northd-0" Feb 19 21:45:14 crc kubenswrapper[4795]: I0219 21:45:14.813017 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c\") " pod="openstack/ovn-northd-0" Feb 19 21:45:14 crc kubenswrapper[4795]: I0219 21:45:14.820389 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxwxt\" (UniqueName: \"kubernetes.io/projected/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-kube-api-access-xxwxt\") pod \"ovn-northd-0\" (UID: \"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c\") " pod="openstack/ovn-northd-0" Feb 19 21:45:14 crc kubenswrapper[4795]: I0219 21:45:14.930443 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Feb 19 21:45:15 crc kubenswrapper[4795]: I0219 21:45:15.385021 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 19 21:45:15 crc kubenswrapper[4795]: I0219 21:45:15.935292 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c","Type":"ContainerStarted","Data":"dbab531f1a8f22d58c44dcac6c6209fda329451de2d8664028adcfc876aa2507"} Feb 19 21:45:15 crc kubenswrapper[4795]: I0219 21:45:15.936686 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99","Type":"ContainerStarted","Data":"106dec830838772106b6621ed4bf25acb79616ec219d016a3ce4bb8a514abaf0"} Feb 19 21:45:16 crc kubenswrapper[4795]: I0219 21:45:16.101775 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Feb 19 21:45:16 crc kubenswrapper[4795]: I0219 21:45:16.943722 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c","Type":"ContainerStarted","Data":"1947803c21d9e4b80d454677a036fe260f1c04cd37249f1f066a15e8b611b1c3"} Feb 19 21:45:16 crc kubenswrapper[4795]: I0219 21:45:16.944118 4795 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c","Type":"ContainerStarted","Data":"2cccec2e02e2c16f7cff73b24c7a62191291d809702ff698745b7a193cbe2c2f"} Feb 19 21:45:16 crc kubenswrapper[4795]: I0219 21:45:16.945272 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Feb 19 21:45:16 crc kubenswrapper[4795]: I0219 21:45:16.964774 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.002435226 podStartE2EDuration="2.964750197s" podCreationTimestamp="2026-02-19 21:45:14 +0000 UTC" firstStartedPulling="2026-02-19 21:45:15.400831007 +0000 UTC m=+1026.593348871" lastFinishedPulling="2026-02-19 21:45:16.363145978 +0000 UTC m=+1027.555663842" observedRunningTime="2026-02-19 21:45:16.963017718 +0000 UTC m=+1028.155535582" watchObservedRunningTime="2026-02-19 21:45:16.964750197 +0000 UTC m=+1028.157268091" Feb 19 21:45:19 crc kubenswrapper[4795]: I0219 21:45:19.606240 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-jlwf9"] Feb 19 21:45:19 crc kubenswrapper[4795]: I0219 21:45:19.608849 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-jlwf9" Feb 19 21:45:19 crc kubenswrapper[4795]: I0219 21:45:19.611027 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Feb 19 21:45:19 crc kubenswrapper[4795]: I0219 21:45:19.618196 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-jlwf9"] Feb 19 21:45:19 crc kubenswrapper[4795]: I0219 21:45:19.778017 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cchds\" (UniqueName: \"kubernetes.io/projected/11cc466e-0752-46b5-9775-c29748b13724-kube-api-access-cchds\") pod \"root-account-create-update-jlwf9\" (UID: \"11cc466e-0752-46b5-9775-c29748b13724\") " pod="openstack/root-account-create-update-jlwf9" Feb 19 21:45:19 crc kubenswrapper[4795]: I0219 21:45:19.778104 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11cc466e-0752-46b5-9775-c29748b13724-operator-scripts\") pod \"root-account-create-update-jlwf9\" (UID: \"11cc466e-0752-46b5-9775-c29748b13724\") " pod="openstack/root-account-create-update-jlwf9" Feb 19 21:45:19 crc kubenswrapper[4795]: I0219 21:45:19.879351 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cchds\" (UniqueName: \"kubernetes.io/projected/11cc466e-0752-46b5-9775-c29748b13724-kube-api-access-cchds\") pod \"root-account-create-update-jlwf9\" (UID: \"11cc466e-0752-46b5-9775-c29748b13724\") " pod="openstack/root-account-create-update-jlwf9" Feb 19 21:45:19 crc kubenswrapper[4795]: I0219 21:45:19.879463 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11cc466e-0752-46b5-9775-c29748b13724-operator-scripts\") pod \"root-account-create-update-jlwf9\" (UID: 
\"11cc466e-0752-46b5-9775-c29748b13724\") " pod="openstack/root-account-create-update-jlwf9" Feb 19 21:45:19 crc kubenswrapper[4795]: I0219 21:45:19.881005 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11cc466e-0752-46b5-9775-c29748b13724-operator-scripts\") pod \"root-account-create-update-jlwf9\" (UID: \"11cc466e-0752-46b5-9775-c29748b13724\") " pod="openstack/root-account-create-update-jlwf9" Feb 19 21:45:19 crc kubenswrapper[4795]: I0219 21:45:19.901203 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cchds\" (UniqueName: \"kubernetes.io/projected/11cc466e-0752-46b5-9775-c29748b13724-kube-api-access-cchds\") pod \"root-account-create-update-jlwf9\" (UID: \"11cc466e-0752-46b5-9775-c29748b13724\") " pod="openstack/root-account-create-update-jlwf9" Feb 19 21:45:19 crc kubenswrapper[4795]: I0219 21:45:19.955813 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-jlwf9" Feb 19 21:45:19 crc kubenswrapper[4795]: I0219 21:45:19.967220 4795 generic.go:334] "Generic (PLEG): container finished" podID="0bbc6c00-2fc9-42cb-9c5a-9a160903ae99" containerID="106dec830838772106b6621ed4bf25acb79616ec219d016a3ce4bb8a514abaf0" exitCode=0 Feb 19 21:45:19 crc kubenswrapper[4795]: I0219 21:45:19.967261 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99","Type":"ContainerDied","Data":"106dec830838772106b6621ed4bf25acb79616ec219d016a3ce4bb8a514abaf0"} Feb 19 21:45:20 crc kubenswrapper[4795]: I0219 21:45:20.365376 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57bdd75c-sdlxp" Feb 19 21:45:20 crc kubenswrapper[4795]: I0219 21:45:20.414453 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-jlwf9"] Feb 19 21:45:20 crc kubenswrapper[4795]: W0219 21:45:20.420674 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod11cc466e_0752_46b5_9775_c29748b13724.slice/crio-979235bd6eae8ecfcd6ef62b42508d6ffd09e3a3584da53da154d43decebd790 WatchSource:0}: Error finding container 979235bd6eae8ecfcd6ef62b42508d6ffd09e3a3584da53da154d43decebd790: Status 404 returned error can't find the container with id 979235bd6eae8ecfcd6ef62b42508d6ffd09e3a3584da53da154d43decebd790 Feb 19 21:45:20 crc kubenswrapper[4795]: I0219 21:45:20.513875 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-75b7bcc64f-hk588" Feb 19 21:45:20 crc kubenswrapper[4795]: I0219 21:45:20.577694 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57bdd75c-sdlxp"] Feb 19 21:45:20 crc kubenswrapper[4795]: I0219 21:45:20.974649 4795 generic.go:334] "Generic (PLEG): container finished" 
podID="11cc466e-0752-46b5-9775-c29748b13724" containerID="0c92f3d0df6fb4ffa1967f7b15d462ab0538b07666cad5a1fe2cc6d118293fe8" exitCode=0 Feb 19 21:45:20 crc kubenswrapper[4795]: I0219 21:45:20.974743 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-jlwf9" event={"ID":"11cc466e-0752-46b5-9775-c29748b13724","Type":"ContainerDied","Data":"0c92f3d0df6fb4ffa1967f7b15d462ab0538b07666cad5a1fe2cc6d118293fe8"} Feb 19 21:45:20 crc kubenswrapper[4795]: I0219 21:45:20.974772 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-jlwf9" event={"ID":"11cc466e-0752-46b5-9775-c29748b13724","Type":"ContainerStarted","Data":"979235bd6eae8ecfcd6ef62b42508d6ffd09e3a3584da53da154d43decebd790"} Feb 19 21:45:20 crc kubenswrapper[4795]: I0219 21:45:20.976595 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57bdd75c-sdlxp" podUID="267a3a62-4f3e-43c8-a1a8-8b47e9d17e80" containerName="dnsmasq-dns" containerID="cri-o://66417e96a24bb0cf529cc2cfe3ecba9c66996b31063cea3b0971bb91b892c130" gracePeriod=10 Feb 19 21:45:20 crc kubenswrapper[4795]: I0219 21:45:20.976663 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99","Type":"ContainerStarted","Data":"0f7b958a502b531c3fff207837d630d66468893d907fbb748e2bd58503327153"} Feb 19 21:45:21 crc kubenswrapper[4795]: I0219 21:45:21.018940 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=-9223371992.835857 podStartE2EDuration="44.018918202s" podCreationTimestamp="2026-02-19 21:44:37 +0000 UTC" firstStartedPulling="2026-02-19 21:44:40.037786119 +0000 UTC m=+991.230303983" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:45:21.013576201 +0000 UTC m=+1032.206094085" watchObservedRunningTime="2026-02-19 21:45:21.018918202 
+0000 UTC m=+1032.211436066" Feb 19 21:45:21 crc kubenswrapper[4795]: I0219 21:45:21.415822 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57bdd75c-sdlxp" Feb 19 21:45:21 crc kubenswrapper[4795]: I0219 21:45:21.507030 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/267a3a62-4f3e-43c8-a1a8-8b47e9d17e80-dns-svc\") pod \"267a3a62-4f3e-43c8-a1a8-8b47e9d17e80\" (UID: \"267a3a62-4f3e-43c8-a1a8-8b47e9d17e80\") " Feb 19 21:45:21 crc kubenswrapper[4795]: I0219 21:45:21.507134 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2p7v\" (UniqueName: \"kubernetes.io/projected/267a3a62-4f3e-43c8-a1a8-8b47e9d17e80-kube-api-access-n2p7v\") pod \"267a3a62-4f3e-43c8-a1a8-8b47e9d17e80\" (UID: \"267a3a62-4f3e-43c8-a1a8-8b47e9d17e80\") " Feb 19 21:45:21 crc kubenswrapper[4795]: I0219 21:45:21.507207 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/267a3a62-4f3e-43c8-a1a8-8b47e9d17e80-config\") pod \"267a3a62-4f3e-43c8-a1a8-8b47e9d17e80\" (UID: \"267a3a62-4f3e-43c8-a1a8-8b47e9d17e80\") " Feb 19 21:45:21 crc kubenswrapper[4795]: I0219 21:45:21.507285 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/267a3a62-4f3e-43c8-a1a8-8b47e9d17e80-ovsdbserver-nb\") pod \"267a3a62-4f3e-43c8-a1a8-8b47e9d17e80\" (UID: \"267a3a62-4f3e-43c8-a1a8-8b47e9d17e80\") " Feb 19 21:45:21 crc kubenswrapper[4795]: I0219 21:45:21.520698 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/267a3a62-4f3e-43c8-a1a8-8b47e9d17e80-kube-api-access-n2p7v" (OuterVolumeSpecName: "kube-api-access-n2p7v") pod "267a3a62-4f3e-43c8-a1a8-8b47e9d17e80" (UID: "267a3a62-4f3e-43c8-a1a8-8b47e9d17e80"). 
InnerVolumeSpecName "kube-api-access-n2p7v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:45:21 crc kubenswrapper[4795]: I0219 21:45:21.544587 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/267a3a62-4f3e-43c8-a1a8-8b47e9d17e80-config" (OuterVolumeSpecName: "config") pod "267a3a62-4f3e-43c8-a1a8-8b47e9d17e80" (UID: "267a3a62-4f3e-43c8-a1a8-8b47e9d17e80"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:45:21 crc kubenswrapper[4795]: I0219 21:45:21.548321 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/267a3a62-4f3e-43c8-a1a8-8b47e9d17e80-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "267a3a62-4f3e-43c8-a1a8-8b47e9d17e80" (UID: "267a3a62-4f3e-43c8-a1a8-8b47e9d17e80"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:45:21 crc kubenswrapper[4795]: I0219 21:45:21.552832 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/267a3a62-4f3e-43c8-a1a8-8b47e9d17e80-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "267a3a62-4f3e-43c8-a1a8-8b47e9d17e80" (UID: "267a3a62-4f3e-43c8-a1a8-8b47e9d17e80"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:45:21 crc kubenswrapper[4795]: I0219 21:45:21.609902 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n2p7v\" (UniqueName: \"kubernetes.io/projected/267a3a62-4f3e-43c8-a1a8-8b47e9d17e80-kube-api-access-n2p7v\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:21 crc kubenswrapper[4795]: I0219 21:45:21.610284 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/267a3a62-4f3e-43c8-a1a8-8b47e9d17e80-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:21 crc kubenswrapper[4795]: I0219 21:45:21.610350 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/267a3a62-4f3e-43c8-a1a8-8b47e9d17e80-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:21 crc kubenswrapper[4795]: I0219 21:45:21.610380 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/267a3a62-4f3e-43c8-a1a8-8b47e9d17e80-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:21 crc kubenswrapper[4795]: I0219 21:45:21.984371 4795 generic.go:334] "Generic (PLEG): container finished" podID="267a3a62-4f3e-43c8-a1a8-8b47e9d17e80" containerID="66417e96a24bb0cf529cc2cfe3ecba9c66996b31063cea3b0971bb91b892c130" exitCode=0 Feb 19 21:45:21 crc kubenswrapper[4795]: I0219 21:45:21.984422 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57bdd75c-sdlxp" Feb 19 21:45:21 crc kubenswrapper[4795]: I0219 21:45:21.984445 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57bdd75c-sdlxp" event={"ID":"267a3a62-4f3e-43c8-a1a8-8b47e9d17e80","Type":"ContainerDied","Data":"66417e96a24bb0cf529cc2cfe3ecba9c66996b31063cea3b0971bb91b892c130"} Feb 19 21:45:21 crc kubenswrapper[4795]: I0219 21:45:21.984697 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57bdd75c-sdlxp" event={"ID":"267a3a62-4f3e-43c8-a1a8-8b47e9d17e80","Type":"ContainerDied","Data":"f96b347d54ce26b56709e3b669b660d82336d8ce5d8f3ba525a2a9bc9a62104b"} Feb 19 21:45:21 crc kubenswrapper[4795]: I0219 21:45:21.984720 4795 scope.go:117] "RemoveContainer" containerID="66417e96a24bb0cf529cc2cfe3ecba9c66996b31063cea3b0971bb91b892c130" Feb 19 21:45:22 crc kubenswrapper[4795]: I0219 21:45:22.002606 4795 scope.go:117] "RemoveContainer" containerID="7eff9819df5a88a947e58d27d572069ed1a99c521c3bc92a815ae47f9f49adad" Feb 19 21:45:22 crc kubenswrapper[4795]: I0219 21:45:22.018210 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57bdd75c-sdlxp"] Feb 19 21:45:22 crc kubenswrapper[4795]: I0219 21:45:22.023644 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57bdd75c-sdlxp"] Feb 19 21:45:22 crc kubenswrapper[4795]: I0219 21:45:22.042326 4795 scope.go:117] "RemoveContainer" containerID="66417e96a24bb0cf529cc2cfe3ecba9c66996b31063cea3b0971bb91b892c130" Feb 19 21:45:22 crc kubenswrapper[4795]: E0219 21:45:22.044309 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66417e96a24bb0cf529cc2cfe3ecba9c66996b31063cea3b0971bb91b892c130\": container with ID starting with 66417e96a24bb0cf529cc2cfe3ecba9c66996b31063cea3b0971bb91b892c130 not found: ID does not exist" 
containerID="66417e96a24bb0cf529cc2cfe3ecba9c66996b31063cea3b0971bb91b892c130" Feb 19 21:45:22 crc kubenswrapper[4795]: I0219 21:45:22.044356 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66417e96a24bb0cf529cc2cfe3ecba9c66996b31063cea3b0971bb91b892c130"} err="failed to get container status \"66417e96a24bb0cf529cc2cfe3ecba9c66996b31063cea3b0971bb91b892c130\": rpc error: code = NotFound desc = could not find container \"66417e96a24bb0cf529cc2cfe3ecba9c66996b31063cea3b0971bb91b892c130\": container with ID starting with 66417e96a24bb0cf529cc2cfe3ecba9c66996b31063cea3b0971bb91b892c130 not found: ID does not exist" Feb 19 21:45:22 crc kubenswrapper[4795]: I0219 21:45:22.044389 4795 scope.go:117] "RemoveContainer" containerID="7eff9819df5a88a947e58d27d572069ed1a99c521c3bc92a815ae47f9f49adad" Feb 19 21:45:22 crc kubenswrapper[4795]: E0219 21:45:22.044763 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7eff9819df5a88a947e58d27d572069ed1a99c521c3bc92a815ae47f9f49adad\": container with ID starting with 7eff9819df5a88a947e58d27d572069ed1a99c521c3bc92a815ae47f9f49adad not found: ID does not exist" containerID="7eff9819df5a88a947e58d27d572069ed1a99c521c3bc92a815ae47f9f49adad" Feb 19 21:45:22 crc kubenswrapper[4795]: I0219 21:45:22.044800 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7eff9819df5a88a947e58d27d572069ed1a99c521c3bc92a815ae47f9f49adad"} err="failed to get container status \"7eff9819df5a88a947e58d27d572069ed1a99c521c3bc92a815ae47f9f49adad\": rpc error: code = NotFound desc = could not find container \"7eff9819df5a88a947e58d27d572069ed1a99c521c3bc92a815ae47f9f49adad\": container with ID starting with 7eff9819df5a88a947e58d27d572069ed1a99c521c3bc92a815ae47f9f49adad not found: ID does not exist" Feb 19 21:45:22 crc kubenswrapper[4795]: I0219 21:45:22.303966 4795 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-jlwf9" Feb 19 21:45:22 crc kubenswrapper[4795]: I0219 21:45:22.421149 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11cc466e-0752-46b5-9775-c29748b13724-operator-scripts\") pod \"11cc466e-0752-46b5-9775-c29748b13724\" (UID: \"11cc466e-0752-46b5-9775-c29748b13724\") " Feb 19 21:45:22 crc kubenswrapper[4795]: I0219 21:45:22.421532 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cchds\" (UniqueName: \"kubernetes.io/projected/11cc466e-0752-46b5-9775-c29748b13724-kube-api-access-cchds\") pod \"11cc466e-0752-46b5-9775-c29748b13724\" (UID: \"11cc466e-0752-46b5-9775-c29748b13724\") " Feb 19 21:45:22 crc kubenswrapper[4795]: I0219 21:45:22.421706 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11cc466e-0752-46b5-9775-c29748b13724-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "11cc466e-0752-46b5-9775-c29748b13724" (UID: "11cc466e-0752-46b5-9775-c29748b13724"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:45:22 crc kubenswrapper[4795]: I0219 21:45:22.429563 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11cc466e-0752-46b5-9775-c29748b13724-kube-api-access-cchds" (OuterVolumeSpecName: "kube-api-access-cchds") pod "11cc466e-0752-46b5-9775-c29748b13724" (UID: "11cc466e-0752-46b5-9775-c29748b13724"). InnerVolumeSpecName "kube-api-access-cchds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:45:22 crc kubenswrapper[4795]: I0219 21:45:22.523861 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11cc466e-0752-46b5-9775-c29748b13724-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:22 crc kubenswrapper[4795]: I0219 21:45:22.523908 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cchds\" (UniqueName: \"kubernetes.io/projected/11cc466e-0752-46b5-9775-c29748b13724-kube-api-access-cchds\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:23 crc kubenswrapper[4795]: I0219 21:45:23.000287 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-jlwf9" event={"ID":"11cc466e-0752-46b5-9775-c29748b13724","Type":"ContainerDied","Data":"979235bd6eae8ecfcd6ef62b42508d6ffd09e3a3584da53da154d43decebd790"} Feb 19 21:45:23 crc kubenswrapper[4795]: I0219 21:45:23.000356 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-jlwf9" Feb 19 21:45:23 crc kubenswrapper[4795]: I0219 21:45:23.000365 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="979235bd6eae8ecfcd6ef62b42508d6ffd09e3a3584da53da154d43decebd790" Feb 19 21:45:23 crc kubenswrapper[4795]: I0219 21:45:23.152756 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-689df5d84f-hpfn4"] Feb 19 21:45:23 crc kubenswrapper[4795]: E0219 21:45:23.153476 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="267a3a62-4f3e-43c8-a1a8-8b47e9d17e80" containerName="dnsmasq-dns" Feb 19 21:45:23 crc kubenswrapper[4795]: I0219 21:45:23.153501 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="267a3a62-4f3e-43c8-a1a8-8b47e9d17e80" containerName="dnsmasq-dns" Feb 19 21:45:23 crc kubenswrapper[4795]: E0219 21:45:23.153528 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="267a3a62-4f3e-43c8-a1a8-8b47e9d17e80" containerName="init" Feb 19 21:45:23 crc kubenswrapper[4795]: I0219 21:45:23.153536 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="267a3a62-4f3e-43c8-a1a8-8b47e9d17e80" containerName="init" Feb 19 21:45:23 crc kubenswrapper[4795]: E0219 21:45:23.153558 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11cc466e-0752-46b5-9775-c29748b13724" containerName="mariadb-account-create-update" Feb 19 21:45:23 crc kubenswrapper[4795]: I0219 21:45:23.153567 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="11cc466e-0752-46b5-9775-c29748b13724" containerName="mariadb-account-create-update" Feb 19 21:45:23 crc kubenswrapper[4795]: I0219 21:45:23.153786 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="267a3a62-4f3e-43c8-a1a8-8b47e9d17e80" containerName="dnsmasq-dns" Feb 19 21:45:23 crc kubenswrapper[4795]: I0219 21:45:23.153807 4795 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="11cc466e-0752-46b5-9775-c29748b13724" containerName="mariadb-account-create-update" Feb 19 21:45:23 crc kubenswrapper[4795]: I0219 21:45:23.154811 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-689df5d84f-hpfn4" Feb 19 21:45:23 crc kubenswrapper[4795]: I0219 21:45:23.208508 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-689df5d84f-hpfn4"] Feb 19 21:45:23 crc kubenswrapper[4795]: I0219 21:45:23.234932 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b4dab90-2dba-4e1f-95fe-5c435d4e270a-config\") pod \"dnsmasq-dns-689df5d84f-hpfn4\" (UID: \"7b4dab90-2dba-4e1f-95fe-5c435d4e270a\") " pod="openstack/dnsmasq-dns-689df5d84f-hpfn4" Feb 19 21:45:23 crc kubenswrapper[4795]: I0219 21:45:23.235268 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7b4dab90-2dba-4e1f-95fe-5c435d4e270a-dns-svc\") pod \"dnsmasq-dns-689df5d84f-hpfn4\" (UID: \"7b4dab90-2dba-4e1f-95fe-5c435d4e270a\") " pod="openstack/dnsmasq-dns-689df5d84f-hpfn4" Feb 19 21:45:23 crc kubenswrapper[4795]: I0219 21:45:23.235404 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpq8v\" (UniqueName: \"kubernetes.io/projected/7b4dab90-2dba-4e1f-95fe-5c435d4e270a-kube-api-access-cpq8v\") pod \"dnsmasq-dns-689df5d84f-hpfn4\" (UID: \"7b4dab90-2dba-4e1f-95fe-5c435d4e270a\") " pod="openstack/dnsmasq-dns-689df5d84f-hpfn4" Feb 19 21:45:23 crc kubenswrapper[4795]: I0219 21:45:23.235564 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7b4dab90-2dba-4e1f-95fe-5c435d4e270a-ovsdbserver-sb\") pod \"dnsmasq-dns-689df5d84f-hpfn4\" (UID: 
\"7b4dab90-2dba-4e1f-95fe-5c435d4e270a\") " pod="openstack/dnsmasq-dns-689df5d84f-hpfn4" Feb 19 21:45:23 crc kubenswrapper[4795]: I0219 21:45:23.235689 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7b4dab90-2dba-4e1f-95fe-5c435d4e270a-ovsdbserver-nb\") pod \"dnsmasq-dns-689df5d84f-hpfn4\" (UID: \"7b4dab90-2dba-4e1f-95fe-5c435d4e270a\") " pod="openstack/dnsmasq-dns-689df5d84f-hpfn4" Feb 19 21:45:23 crc kubenswrapper[4795]: I0219 21:45:23.337125 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b4dab90-2dba-4e1f-95fe-5c435d4e270a-config\") pod \"dnsmasq-dns-689df5d84f-hpfn4\" (UID: \"7b4dab90-2dba-4e1f-95fe-5c435d4e270a\") " pod="openstack/dnsmasq-dns-689df5d84f-hpfn4" Feb 19 21:45:23 crc kubenswrapper[4795]: I0219 21:45:23.337411 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7b4dab90-2dba-4e1f-95fe-5c435d4e270a-dns-svc\") pod \"dnsmasq-dns-689df5d84f-hpfn4\" (UID: \"7b4dab90-2dba-4e1f-95fe-5c435d4e270a\") " pod="openstack/dnsmasq-dns-689df5d84f-hpfn4" Feb 19 21:45:23 crc kubenswrapper[4795]: I0219 21:45:23.337546 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpq8v\" (UniqueName: \"kubernetes.io/projected/7b4dab90-2dba-4e1f-95fe-5c435d4e270a-kube-api-access-cpq8v\") pod \"dnsmasq-dns-689df5d84f-hpfn4\" (UID: \"7b4dab90-2dba-4e1f-95fe-5c435d4e270a\") " pod="openstack/dnsmasq-dns-689df5d84f-hpfn4" Feb 19 21:45:23 crc kubenswrapper[4795]: I0219 21:45:23.337692 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7b4dab90-2dba-4e1f-95fe-5c435d4e270a-ovsdbserver-sb\") pod \"dnsmasq-dns-689df5d84f-hpfn4\" (UID: \"7b4dab90-2dba-4e1f-95fe-5c435d4e270a\") " 
pod="openstack/dnsmasq-dns-689df5d84f-hpfn4" Feb 19 21:45:23 crc kubenswrapper[4795]: I0219 21:45:23.337815 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7b4dab90-2dba-4e1f-95fe-5c435d4e270a-ovsdbserver-nb\") pod \"dnsmasq-dns-689df5d84f-hpfn4\" (UID: \"7b4dab90-2dba-4e1f-95fe-5c435d4e270a\") " pod="openstack/dnsmasq-dns-689df5d84f-hpfn4" Feb 19 21:45:23 crc kubenswrapper[4795]: I0219 21:45:23.338028 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b4dab90-2dba-4e1f-95fe-5c435d4e270a-config\") pod \"dnsmasq-dns-689df5d84f-hpfn4\" (UID: \"7b4dab90-2dba-4e1f-95fe-5c435d4e270a\") " pod="openstack/dnsmasq-dns-689df5d84f-hpfn4" Feb 19 21:45:23 crc kubenswrapper[4795]: I0219 21:45:23.338368 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7b4dab90-2dba-4e1f-95fe-5c435d4e270a-dns-svc\") pod \"dnsmasq-dns-689df5d84f-hpfn4\" (UID: \"7b4dab90-2dba-4e1f-95fe-5c435d4e270a\") " pod="openstack/dnsmasq-dns-689df5d84f-hpfn4" Feb 19 21:45:23 crc kubenswrapper[4795]: I0219 21:45:23.338652 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7b4dab90-2dba-4e1f-95fe-5c435d4e270a-ovsdbserver-sb\") pod \"dnsmasq-dns-689df5d84f-hpfn4\" (UID: \"7b4dab90-2dba-4e1f-95fe-5c435d4e270a\") " pod="openstack/dnsmasq-dns-689df5d84f-hpfn4" Feb 19 21:45:23 crc kubenswrapper[4795]: I0219 21:45:23.338729 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7b4dab90-2dba-4e1f-95fe-5c435d4e270a-ovsdbserver-nb\") pod \"dnsmasq-dns-689df5d84f-hpfn4\" (UID: \"7b4dab90-2dba-4e1f-95fe-5c435d4e270a\") " pod="openstack/dnsmasq-dns-689df5d84f-hpfn4" Feb 19 21:45:23 crc kubenswrapper[4795]: I0219 21:45:23.360026 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpq8v\" (UniqueName: \"kubernetes.io/projected/7b4dab90-2dba-4e1f-95fe-5c435d4e270a-kube-api-access-cpq8v\") pod \"dnsmasq-dns-689df5d84f-hpfn4\" (UID: \"7b4dab90-2dba-4e1f-95fe-5c435d4e270a\") " pod="openstack/dnsmasq-dns-689df5d84f-hpfn4" Feb 19 21:45:23 crc kubenswrapper[4795]: I0219 21:45:23.495258 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-689df5d84f-hpfn4" Feb 19 21:45:23 crc kubenswrapper[4795]: I0219 21:45:23.525262 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="267a3a62-4f3e-43c8-a1a8-8b47e9d17e80" path="/var/lib/kubelet/pods/267a3a62-4f3e-43c8-a1a8-8b47e9d17e80/volumes" Feb 19 21:45:23 crc kubenswrapper[4795]: I0219 21:45:23.926888 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-689df5d84f-hpfn4"] Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.009175 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-689df5d84f-hpfn4" event={"ID":"7b4dab90-2dba-4e1f-95fe-5c435d4e270a","Type":"ContainerStarted","Data":"6a5cc19b4b04424e6441590aca0708de64c3cb94b9b2a21745480c88aa7a5c4f"} Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.267017 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.271758 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.273913 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.274294 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.281104 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-slsb4" Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.281120 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.296567 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.351780 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/6c773ec2-a400-42a9-8784-ed9c295c3bb4-cache\") pod \"swift-storage-0\" (UID: \"6c773ec2-a400-42a9-8784-ed9c295c3bb4\") " pod="openstack/swift-storage-0" Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.351827 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bh4z\" (UniqueName: \"kubernetes.io/projected/6c773ec2-a400-42a9-8784-ed9c295c3bb4-kube-api-access-7bh4z\") pod \"swift-storage-0\" (UID: \"6c773ec2-a400-42a9-8784-ed9c295c3bb4\") " pod="openstack/swift-storage-0" Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.351879 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c773ec2-a400-42a9-8784-ed9c295c3bb4-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"6c773ec2-a400-42a9-8784-ed9c295c3bb4\") " 
pod="openstack/swift-storage-0" Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.351908 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6c773ec2-a400-42a9-8784-ed9c295c3bb4-etc-swift\") pod \"swift-storage-0\" (UID: \"6c773ec2-a400-42a9-8784-ed9c295c3bb4\") " pod="openstack/swift-storage-0" Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.351927 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/6c773ec2-a400-42a9-8784-ed9c295c3bb4-lock\") pod \"swift-storage-0\" (UID: \"6c773ec2-a400-42a9-8784-ed9c295c3bb4\") " pod="openstack/swift-storage-0" Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.351961 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"6c773ec2-a400-42a9-8784-ed9c295c3bb4\") " pod="openstack/swift-storage-0" Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.453544 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/6c773ec2-a400-42a9-8784-ed9c295c3bb4-cache\") pod \"swift-storage-0\" (UID: \"6c773ec2-a400-42a9-8784-ed9c295c3bb4\") " pod="openstack/swift-storage-0" Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.453604 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bh4z\" (UniqueName: \"kubernetes.io/projected/6c773ec2-a400-42a9-8784-ed9c295c3bb4-kube-api-access-7bh4z\") pod \"swift-storage-0\" (UID: \"6c773ec2-a400-42a9-8784-ed9c295c3bb4\") " pod="openstack/swift-storage-0" Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.453653 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c773ec2-a400-42a9-8784-ed9c295c3bb4-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"6c773ec2-a400-42a9-8784-ed9c295c3bb4\") " pod="openstack/swift-storage-0" Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.453686 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6c773ec2-a400-42a9-8784-ed9c295c3bb4-etc-swift\") pod \"swift-storage-0\" (UID: \"6c773ec2-a400-42a9-8784-ed9c295c3bb4\") " pod="openstack/swift-storage-0" Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.453703 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/6c773ec2-a400-42a9-8784-ed9c295c3bb4-lock\") pod \"swift-storage-0\" (UID: \"6c773ec2-a400-42a9-8784-ed9c295c3bb4\") " pod="openstack/swift-storage-0" Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.453744 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"6c773ec2-a400-42a9-8784-ed9c295c3bb4\") " pod="openstack/swift-storage-0" Feb 19 21:45:24 crc kubenswrapper[4795]: E0219 21:45:24.454097 4795 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 19 21:45:24 crc kubenswrapper[4795]: E0219 21:45:24.454217 4795 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 19 21:45:24 crc kubenswrapper[4795]: E0219 21:45:24.454319 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6c773ec2-a400-42a9-8784-ed9c295c3bb4-etc-swift podName:6c773ec2-a400-42a9-8784-ed9c295c3bb4 nodeName:}" failed. 
No retries permitted until 2026-02-19 21:45:24.954299582 +0000 UTC m=+1036.146817446 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6c773ec2-a400-42a9-8784-ed9c295c3bb4-etc-swift") pod "swift-storage-0" (UID: "6c773ec2-a400-42a9-8784-ed9c295c3bb4") : configmap "swift-ring-files" not found Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.454371 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/6c773ec2-a400-42a9-8784-ed9c295c3bb4-lock\") pod \"swift-storage-0\" (UID: \"6c773ec2-a400-42a9-8784-ed9c295c3bb4\") " pod="openstack/swift-storage-0" Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.454108 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"6c773ec2-a400-42a9-8784-ed9c295c3bb4\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/swift-storage-0" Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.454136 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/6c773ec2-a400-42a9-8784-ed9c295c3bb4-cache\") pod \"swift-storage-0\" (UID: \"6c773ec2-a400-42a9-8784-ed9c295c3bb4\") " pod="openstack/swift-storage-0" Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.473401 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c773ec2-a400-42a9-8784-ed9c295c3bb4-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"6c773ec2-a400-42a9-8784-ed9c295c3bb4\") " pod="openstack/swift-storage-0" Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.483508 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bh4z\" (UniqueName: 
\"kubernetes.io/projected/6c773ec2-a400-42a9-8784-ed9c295c3bb4-kube-api-access-7bh4z\") pod \"swift-storage-0\" (UID: \"6c773ec2-a400-42a9-8784-ed9c295c3bb4\") " pod="openstack/swift-storage-0" Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.489450 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"6c773ec2-a400-42a9-8784-ed9c295c3bb4\") " pod="openstack/swift-storage-0" Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.737052 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-xmbg2"] Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.738228 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-xmbg2" Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.739821 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.740672 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.741874 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.756586 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-xmbg2"] Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.863232 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f8945f31-b1d9-4c65-9f8c-2619f87d4237-scripts\") pod \"swift-ring-rebalance-xmbg2\" (UID: \"f8945f31-b1d9-4c65-9f8c-2619f87d4237\") " pod="openstack/swift-ring-rebalance-xmbg2" Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.863368 4795 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svtdp\" (UniqueName: \"kubernetes.io/projected/f8945f31-b1d9-4c65-9f8c-2619f87d4237-kube-api-access-svtdp\") pod \"swift-ring-rebalance-xmbg2\" (UID: \"f8945f31-b1d9-4c65-9f8c-2619f87d4237\") " pod="openstack/swift-ring-rebalance-xmbg2" Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.863462 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f8945f31-b1d9-4c65-9f8c-2619f87d4237-dispersionconf\") pod \"swift-ring-rebalance-xmbg2\" (UID: \"f8945f31-b1d9-4c65-9f8c-2619f87d4237\") " pod="openstack/swift-ring-rebalance-xmbg2" Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.863566 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f8945f31-b1d9-4c65-9f8c-2619f87d4237-ring-data-devices\") pod \"swift-ring-rebalance-xmbg2\" (UID: \"f8945f31-b1d9-4c65-9f8c-2619f87d4237\") " pod="openstack/swift-ring-rebalance-xmbg2" Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.863672 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f8945f31-b1d9-4c65-9f8c-2619f87d4237-etc-swift\") pod \"swift-ring-rebalance-xmbg2\" (UID: \"f8945f31-b1d9-4c65-9f8c-2619f87d4237\") " pod="openstack/swift-ring-rebalance-xmbg2" Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.863710 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f8945f31-b1d9-4c65-9f8c-2619f87d4237-swiftconf\") pod \"swift-ring-rebalance-xmbg2\" (UID: \"f8945f31-b1d9-4c65-9f8c-2619f87d4237\") " pod="openstack/swift-ring-rebalance-xmbg2" Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 
21:45:24.863760 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8945f31-b1d9-4c65-9f8c-2619f87d4237-combined-ca-bundle\") pod \"swift-ring-rebalance-xmbg2\" (UID: \"f8945f31-b1d9-4c65-9f8c-2619f87d4237\") " pod="openstack/swift-ring-rebalance-xmbg2" Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.964724 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svtdp\" (UniqueName: \"kubernetes.io/projected/f8945f31-b1d9-4c65-9f8c-2619f87d4237-kube-api-access-svtdp\") pod \"swift-ring-rebalance-xmbg2\" (UID: \"f8945f31-b1d9-4c65-9f8c-2619f87d4237\") " pod="openstack/swift-ring-rebalance-xmbg2" Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.965005 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6c773ec2-a400-42a9-8784-ed9c295c3bb4-etc-swift\") pod \"swift-storage-0\" (UID: \"6c773ec2-a400-42a9-8784-ed9c295c3bb4\") " pod="openstack/swift-storage-0" Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.965116 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f8945f31-b1d9-4c65-9f8c-2619f87d4237-dispersionconf\") pod \"swift-ring-rebalance-xmbg2\" (UID: \"f8945f31-b1d9-4c65-9f8c-2619f87d4237\") " pod="openstack/swift-ring-rebalance-xmbg2" Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.965221 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f8945f31-b1d9-4c65-9f8c-2619f87d4237-ring-data-devices\") pod \"swift-ring-rebalance-xmbg2\" (UID: \"f8945f31-b1d9-4c65-9f8c-2619f87d4237\") " pod="openstack/swift-ring-rebalance-xmbg2" Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.965318 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f8945f31-b1d9-4c65-9f8c-2619f87d4237-etc-swift\") pod \"swift-ring-rebalance-xmbg2\" (UID: \"f8945f31-b1d9-4c65-9f8c-2619f87d4237\") " pod="openstack/swift-ring-rebalance-xmbg2" Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.965393 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f8945f31-b1d9-4c65-9f8c-2619f87d4237-swiftconf\") pod \"swift-ring-rebalance-xmbg2\" (UID: \"f8945f31-b1d9-4c65-9f8c-2619f87d4237\") " pod="openstack/swift-ring-rebalance-xmbg2" Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.965473 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8945f31-b1d9-4c65-9f8c-2619f87d4237-combined-ca-bundle\") pod \"swift-ring-rebalance-xmbg2\" (UID: \"f8945f31-b1d9-4c65-9f8c-2619f87d4237\") " pod="openstack/swift-ring-rebalance-xmbg2" Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.965568 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f8945f31-b1d9-4c65-9f8c-2619f87d4237-scripts\") pod \"swift-ring-rebalance-xmbg2\" (UID: \"f8945f31-b1d9-4c65-9f8c-2619f87d4237\") " pod="openstack/swift-ring-rebalance-xmbg2" Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.965936 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f8945f31-b1d9-4c65-9f8c-2619f87d4237-etc-swift\") pod \"swift-ring-rebalance-xmbg2\" (UID: \"f8945f31-b1d9-4c65-9f8c-2619f87d4237\") " pod="openstack/swift-ring-rebalance-xmbg2" Feb 19 21:45:24 crc kubenswrapper[4795]: E0219 21:45:24.965206 4795 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 19 21:45:24 crc kubenswrapper[4795]: 
E0219 21:45:24.965992 4795 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 19 21:45:24 crc kubenswrapper[4795]: E0219 21:45:24.966032 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6c773ec2-a400-42a9-8784-ed9c295c3bb4-etc-swift podName:6c773ec2-a400-42a9-8784-ed9c295c3bb4 nodeName:}" failed. No retries permitted until 2026-02-19 21:45:25.96601713 +0000 UTC m=+1037.158534994 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6c773ec2-a400-42a9-8784-ed9c295c3bb4-etc-swift") pod "swift-storage-0" (UID: "6c773ec2-a400-42a9-8784-ed9c295c3bb4") : configmap "swift-ring-files" not found Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.966496 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f8945f31-b1d9-4c65-9f8c-2619f87d4237-scripts\") pod \"swift-ring-rebalance-xmbg2\" (UID: \"f8945f31-b1d9-4c65-9f8c-2619f87d4237\") " pod="openstack/swift-ring-rebalance-xmbg2" Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.966646 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f8945f31-b1d9-4c65-9f8c-2619f87d4237-ring-data-devices\") pod \"swift-ring-rebalance-xmbg2\" (UID: \"f8945f31-b1d9-4c65-9f8c-2619f87d4237\") " pod="openstack/swift-ring-rebalance-xmbg2" Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.969903 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f8945f31-b1d9-4c65-9f8c-2619f87d4237-dispersionconf\") pod \"swift-ring-rebalance-xmbg2\" (UID: \"f8945f31-b1d9-4c65-9f8c-2619f87d4237\") " pod="openstack/swift-ring-rebalance-xmbg2" Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.970497 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8945f31-b1d9-4c65-9f8c-2619f87d4237-combined-ca-bundle\") pod \"swift-ring-rebalance-xmbg2\" (UID: \"f8945f31-b1d9-4c65-9f8c-2619f87d4237\") " pod="openstack/swift-ring-rebalance-xmbg2" Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.970790 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f8945f31-b1d9-4c65-9f8c-2619f87d4237-swiftconf\") pod \"swift-ring-rebalance-xmbg2\" (UID: \"f8945f31-b1d9-4c65-9f8c-2619f87d4237\") " pod="openstack/swift-ring-rebalance-xmbg2" Feb 19 21:45:24 crc kubenswrapper[4795]: I0219 21:45:24.985068 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svtdp\" (UniqueName: \"kubernetes.io/projected/f8945f31-b1d9-4c65-9f8c-2619f87d4237-kube-api-access-svtdp\") pod \"swift-ring-rebalance-xmbg2\" (UID: \"f8945f31-b1d9-4c65-9f8c-2619f87d4237\") " pod="openstack/swift-ring-rebalance-xmbg2" Feb 19 21:45:25 crc kubenswrapper[4795]: I0219 21:45:25.017586 4795 generic.go:334] "Generic (PLEG): container finished" podID="7b4dab90-2dba-4e1f-95fe-5c435d4e270a" containerID="c1eba6aa8c078daa91e1cfd1acb049a25d5fbe528714a75b4a04c5fe223c7ced" exitCode=0 Feb 19 21:45:25 crc kubenswrapper[4795]: I0219 21:45:25.017701 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-689df5d84f-hpfn4" event={"ID":"7b4dab90-2dba-4e1f-95fe-5c435d4e270a","Type":"ContainerDied","Data":"c1eba6aa8c078daa91e1cfd1acb049a25d5fbe528714a75b4a04c5fe223c7ced"} Feb 19 21:45:25 crc kubenswrapper[4795]: I0219 21:45:25.054039 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-xmbg2" Feb 19 21:45:25 crc kubenswrapper[4795]: I0219 21:45:25.522655 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-xmbg2"] Feb 19 21:45:25 crc kubenswrapper[4795]: W0219 21:45:25.537151 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8945f31_b1d9_4c65_9f8c_2619f87d4237.slice/crio-ca2816f59591a52db0fc74a4911b062bb5362403053e13a03adbbd1512ca5f93 WatchSource:0}: Error finding container ca2816f59591a52db0fc74a4911b062bb5362403053e13a03adbbd1512ca5f93: Status 404 returned error can't find the container with id ca2816f59591a52db0fc74a4911b062bb5362403053e13a03adbbd1512ca5f93 Feb 19 21:45:25 crc kubenswrapper[4795]: I0219 21:45:25.985758 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6c773ec2-a400-42a9-8784-ed9c295c3bb4-etc-swift\") pod \"swift-storage-0\" (UID: \"6c773ec2-a400-42a9-8784-ed9c295c3bb4\") " pod="openstack/swift-storage-0" Feb 19 21:45:25 crc kubenswrapper[4795]: E0219 21:45:25.985940 4795 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 19 21:45:25 crc kubenswrapper[4795]: E0219 21:45:25.985961 4795 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 19 21:45:25 crc kubenswrapper[4795]: E0219 21:45:25.986013 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6c773ec2-a400-42a9-8784-ed9c295c3bb4-etc-swift podName:6c773ec2-a400-42a9-8784-ed9c295c3bb4 nodeName:}" failed. No retries permitted until 2026-02-19 21:45:27.985997614 +0000 UTC m=+1039.178515478 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6c773ec2-a400-42a9-8784-ed9c295c3bb4-etc-swift") pod "swift-storage-0" (UID: "6c773ec2-a400-42a9-8784-ed9c295c3bb4") : configmap "swift-ring-files" not found Feb 19 21:45:26 crc kubenswrapper[4795]: I0219 21:45:26.029231 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-689df5d84f-hpfn4" event={"ID":"7b4dab90-2dba-4e1f-95fe-5c435d4e270a","Type":"ContainerStarted","Data":"5482b8f1b44d8eeecca016192cda7d894246fb50c4d06a9b45628279359dfa86"} Feb 19 21:45:26 crc kubenswrapper[4795]: I0219 21:45:26.029385 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-689df5d84f-hpfn4" Feb 19 21:45:26 crc kubenswrapper[4795]: I0219 21:45:26.030603 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-xmbg2" event={"ID":"f8945f31-b1d9-4c65-9f8c-2619f87d4237","Type":"ContainerStarted","Data":"ca2816f59591a52db0fc74a4911b062bb5362403053e13a03adbbd1512ca5f93"} Feb 19 21:45:28 crc kubenswrapper[4795]: I0219 21:45:28.026231 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6c773ec2-a400-42a9-8784-ed9c295c3bb4-etc-swift\") pod \"swift-storage-0\" (UID: \"6c773ec2-a400-42a9-8784-ed9c295c3bb4\") " pod="openstack/swift-storage-0" Feb 19 21:45:28 crc kubenswrapper[4795]: E0219 21:45:28.026513 4795 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 19 21:45:28 crc kubenswrapper[4795]: E0219 21:45:28.027748 4795 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 19 21:45:28 crc kubenswrapper[4795]: E0219 21:45:28.027832 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6c773ec2-a400-42a9-8784-ed9c295c3bb4-etc-swift 
podName:6c773ec2-a400-42a9-8784-ed9c295c3bb4 nodeName:}" failed. No retries permitted until 2026-02-19 21:45:32.027805159 +0000 UTC m=+1043.220323043 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6c773ec2-a400-42a9-8784-ed9c295c3bb4-etc-swift") pod "swift-storage-0" (UID: "6c773ec2-a400-42a9-8784-ed9c295c3bb4") : configmap "swift-ring-files" not found Feb 19 21:45:29 crc kubenswrapper[4795]: I0219 21:45:29.330731 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Feb 19 21:45:29 crc kubenswrapper[4795]: I0219 21:45:29.331074 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Feb 19 21:45:29 crc kubenswrapper[4795]: I0219 21:45:29.409933 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Feb 19 21:45:29 crc kubenswrapper[4795]: I0219 21:45:29.433833 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-689df5d84f-hpfn4" podStartSLOduration=6.433813901 podStartE2EDuration="6.433813901s" podCreationTimestamp="2026-02-19 21:45:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:45:26.046965794 +0000 UTC m=+1037.239483658" watchObservedRunningTime="2026-02-19 21:45:29.433813901 +0000 UTC m=+1040.626331765" Feb 19 21:45:30 crc kubenswrapper[4795]: I0219 21:45:30.126825 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Feb 19 21:45:31 crc kubenswrapper[4795]: I0219 21:45:31.067460 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-xmbg2" event={"ID":"f8945f31-b1d9-4c65-9f8c-2619f87d4237","Type":"ContainerStarted","Data":"2c21dc2092bc9760ef50273deb040b6251a319d32c5e2c0fdd3bf1678ba55094"} Feb 
19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.099643 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6c773ec2-a400-42a9-8784-ed9c295c3bb4-etc-swift\") pod \"swift-storage-0\" (UID: \"6c773ec2-a400-42a9-8784-ed9c295c3bb4\") " pod="openstack/swift-storage-0" Feb 19 21:45:32 crc kubenswrapper[4795]: E0219 21:45:32.099910 4795 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 19 21:45:32 crc kubenswrapper[4795]: E0219 21:45:32.099925 4795 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 19 21:45:32 crc kubenswrapper[4795]: E0219 21:45:32.099966 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6c773ec2-a400-42a9-8784-ed9c295c3bb4-etc-swift podName:6c773ec2-a400-42a9-8784-ed9c295c3bb4 nodeName:}" failed. No retries permitted until 2026-02-19 21:45:40.099952573 +0000 UTC m=+1051.292470427 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6c773ec2-a400-42a9-8784-ed9c295c3bb4-etc-swift") pod "swift-storage-0" (UID: "6c773ec2-a400-42a9-8784-ed9c295c3bb4") : configmap "swift-ring-files" not found Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.107141 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-xmbg2" podStartSLOduration=3.5614602250000003 podStartE2EDuration="8.107121795s" podCreationTimestamp="2026-02-19 21:45:24 +0000 UTC" firstStartedPulling="2026-02-19 21:45:25.541327133 +0000 UTC m=+1036.733844997" lastFinishedPulling="2026-02-19 21:45:30.086988703 +0000 UTC m=+1041.279506567" observedRunningTime="2026-02-19 21:45:31.096054531 +0000 UTC m=+1042.288572395" watchObservedRunningTime="2026-02-19 21:45:32.107121795 +0000 UTC m=+1043.299639659" Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.110280 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-q8t82"] Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.112040 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-q8t82" Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.120854 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-q8t82"] Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.201236 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65449c45-b8f9-445e-80e7-6e3c8541c62c-operator-scripts\") pod \"keystone-db-create-q8t82\" (UID: \"65449c45-b8f9-445e-80e7-6e3c8541c62c\") " pod="openstack/keystone-db-create-q8t82" Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.201617 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tchxf\" (UniqueName: \"kubernetes.io/projected/65449c45-b8f9-445e-80e7-6e3c8541c62c-kube-api-access-tchxf\") pod \"keystone-db-create-q8t82\" (UID: \"65449c45-b8f9-445e-80e7-6e3c8541c62c\") " pod="openstack/keystone-db-create-q8t82" Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.225621 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-41fb-account-create-update-ntc9w"] Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.226635 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-41fb-account-create-update-ntc9w" Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.233755 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.239153 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-41fb-account-create-update-ntc9w"] Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.304562 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tchxf\" (UniqueName: \"kubernetes.io/projected/65449c45-b8f9-445e-80e7-6e3c8541c62c-kube-api-access-tchxf\") pod \"keystone-db-create-q8t82\" (UID: \"65449c45-b8f9-445e-80e7-6e3c8541c62c\") " pod="openstack/keystone-db-create-q8t82" Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.304666 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65449c45-b8f9-445e-80e7-6e3c8541c62c-operator-scripts\") pod \"keystone-db-create-q8t82\" (UID: \"65449c45-b8f9-445e-80e7-6e3c8541c62c\") " pod="openstack/keystone-db-create-q8t82" Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.304700 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-969df\" (UniqueName: \"kubernetes.io/projected/fb1c9562-0143-4fa4-86d3-f1ed93f3fa31-kube-api-access-969df\") pod \"keystone-41fb-account-create-update-ntc9w\" (UID: \"fb1c9562-0143-4fa4-86d3-f1ed93f3fa31\") " pod="openstack/keystone-41fb-account-create-update-ntc9w" Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.304757 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb1c9562-0143-4fa4-86d3-f1ed93f3fa31-operator-scripts\") pod \"keystone-41fb-account-create-update-ntc9w\" (UID: 
\"fb1c9562-0143-4fa4-86d3-f1ed93f3fa31\") " pod="openstack/keystone-41fb-account-create-update-ntc9w" Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.305472 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65449c45-b8f9-445e-80e7-6e3c8541c62c-operator-scripts\") pod \"keystone-db-create-q8t82\" (UID: \"65449c45-b8f9-445e-80e7-6e3c8541c62c\") " pod="openstack/keystone-db-create-q8t82" Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.317899 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-lqp9l"] Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.318882 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-lqp9l" Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.329297 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-lqp9l"] Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.329948 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tchxf\" (UniqueName: \"kubernetes.io/projected/65449c45-b8f9-445e-80e7-6e3c8541c62c-kube-api-access-tchxf\") pod \"keystone-db-create-q8t82\" (UID: \"65449c45-b8f9-445e-80e7-6e3c8541c62c\") " pod="openstack/keystone-db-create-q8t82" Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.406378 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-969df\" (UniqueName: \"kubernetes.io/projected/fb1c9562-0143-4fa4-86d3-f1ed93f3fa31-kube-api-access-969df\") pod \"keystone-41fb-account-create-update-ntc9w\" (UID: \"fb1c9562-0143-4fa4-86d3-f1ed93f3fa31\") " pod="openstack/keystone-41fb-account-create-update-ntc9w" Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.406439 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/fb1c9562-0143-4fa4-86d3-f1ed93f3fa31-operator-scripts\") pod \"keystone-41fb-account-create-update-ntc9w\" (UID: \"fb1c9562-0143-4fa4-86d3-f1ed93f3fa31\") " pod="openstack/keystone-41fb-account-create-update-ntc9w" Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.406465 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ddebb2b6-7bc0-45af-ba68-ae108b0d91fd-operator-scripts\") pod \"placement-db-create-lqp9l\" (UID: \"ddebb2b6-7bc0-45af-ba68-ae108b0d91fd\") " pod="openstack/placement-db-create-lqp9l" Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.406497 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hmkw\" (UniqueName: \"kubernetes.io/projected/ddebb2b6-7bc0-45af-ba68-ae108b0d91fd-kube-api-access-4hmkw\") pod \"placement-db-create-lqp9l\" (UID: \"ddebb2b6-7bc0-45af-ba68-ae108b0d91fd\") " pod="openstack/placement-db-create-lqp9l" Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.407562 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb1c9562-0143-4fa4-86d3-f1ed93f3fa31-operator-scripts\") pod \"keystone-41fb-account-create-update-ntc9w\" (UID: \"fb1c9562-0143-4fa4-86d3-f1ed93f3fa31\") " pod="openstack/keystone-41fb-account-create-update-ntc9w" Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.408893 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-c741-account-create-update-hdlzx"] Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.410158 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-c741-account-create-update-hdlzx"
Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.411985 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret"
Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.419632 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-c741-account-create-update-hdlzx"]
Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.423925 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-969df\" (UniqueName: \"kubernetes.io/projected/fb1c9562-0143-4fa4-86d3-f1ed93f3fa31-kube-api-access-969df\") pod \"keystone-41fb-account-create-update-ntc9w\" (UID: \"fb1c9562-0143-4fa4-86d3-f1ed93f3fa31\") " pod="openstack/keystone-41fb-account-create-update-ntc9w"
Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.498128 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-q8t82"
Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.516537 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xf4dk\" (UniqueName: \"kubernetes.io/projected/541fd524-94f2-4149-b16b-ab11a716ff95-kube-api-access-xf4dk\") pod \"placement-c741-account-create-update-hdlzx\" (UID: \"541fd524-94f2-4149-b16b-ab11a716ff95\") " pod="openstack/placement-c741-account-create-update-hdlzx"
Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.516944 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ddebb2b6-7bc0-45af-ba68-ae108b0d91fd-operator-scripts\") pod \"placement-db-create-lqp9l\" (UID: \"ddebb2b6-7bc0-45af-ba68-ae108b0d91fd\") " pod="openstack/placement-db-create-lqp9l"
Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.516976 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/541fd524-94f2-4149-b16b-ab11a716ff95-operator-scripts\") pod \"placement-c741-account-create-update-hdlzx\" (UID: \"541fd524-94f2-4149-b16b-ab11a716ff95\") " pod="openstack/placement-c741-account-create-update-hdlzx"
Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.517020 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hmkw\" (UniqueName: \"kubernetes.io/projected/ddebb2b6-7bc0-45af-ba68-ae108b0d91fd-kube-api-access-4hmkw\") pod \"placement-db-create-lqp9l\" (UID: \"ddebb2b6-7bc0-45af-ba68-ae108b0d91fd\") " pod="openstack/placement-db-create-lqp9l"
Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.518225 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ddebb2b6-7bc0-45af-ba68-ae108b0d91fd-operator-scripts\") pod \"placement-db-create-lqp9l\" (UID: \"ddebb2b6-7bc0-45af-ba68-ae108b0d91fd\") " pod="openstack/placement-db-create-lqp9l"
Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.534133 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hmkw\" (UniqueName: \"kubernetes.io/projected/ddebb2b6-7bc0-45af-ba68-ae108b0d91fd-kube-api-access-4hmkw\") pod \"placement-db-create-lqp9l\" (UID: \"ddebb2b6-7bc0-45af-ba68-ae108b0d91fd\") " pod="openstack/placement-db-create-lqp9l"
Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.541552 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-41fb-account-create-update-ntc9w"
Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.618062 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xf4dk\" (UniqueName: \"kubernetes.io/projected/541fd524-94f2-4149-b16b-ab11a716ff95-kube-api-access-xf4dk\") pod \"placement-c741-account-create-update-hdlzx\" (UID: \"541fd524-94f2-4149-b16b-ab11a716ff95\") " pod="openstack/placement-c741-account-create-update-hdlzx"
Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.618179 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/541fd524-94f2-4149-b16b-ab11a716ff95-operator-scripts\") pod \"placement-c741-account-create-update-hdlzx\" (UID: \"541fd524-94f2-4149-b16b-ab11a716ff95\") " pod="openstack/placement-c741-account-create-update-hdlzx"
Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.619575 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/541fd524-94f2-4149-b16b-ab11a716ff95-operator-scripts\") pod \"placement-c741-account-create-update-hdlzx\" (UID: \"541fd524-94f2-4149-b16b-ab11a716ff95\") " pod="openstack/placement-c741-account-create-update-hdlzx"
Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.644742 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xf4dk\" (UniqueName: \"kubernetes.io/projected/541fd524-94f2-4149-b16b-ab11a716ff95-kube-api-access-xf4dk\") pod \"placement-c741-account-create-update-hdlzx\" (UID: \"541fd524-94f2-4149-b16b-ab11a716ff95\") " pod="openstack/placement-c741-account-create-update-hdlzx"
Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.661820 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-lqp9l"
Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.755544 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-c741-account-create-update-hdlzx"
Feb 19 21:45:32 crc kubenswrapper[4795]: I0219 21:45:32.949472 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-q8t82"]
Feb 19 21:45:33 crc kubenswrapper[4795]: W0219 21:45:33.055895 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb1c9562_0143_4fa4_86d3_f1ed93f3fa31.slice/crio-4383db569460ac992c685e32755baabb7fa4619c28b9fce54927b757c2667447 WatchSource:0}: Error finding container 4383db569460ac992c685e32755baabb7fa4619c28b9fce54927b757c2667447: Status 404 returned error can't find the container with id 4383db569460ac992c685e32755baabb7fa4619c28b9fce54927b757c2667447
Feb 19 21:45:33 crc kubenswrapper[4795]: I0219 21:45:33.058044 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-41fb-account-create-update-ntc9w"]
Feb 19 21:45:33 crc kubenswrapper[4795]: I0219 21:45:33.089317 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-41fb-account-create-update-ntc9w" event={"ID":"fb1c9562-0143-4fa4-86d3-f1ed93f3fa31","Type":"ContainerStarted","Data":"4383db569460ac992c685e32755baabb7fa4619c28b9fce54927b757c2667447"}
Feb 19 21:45:33 crc kubenswrapper[4795]: I0219 21:45:33.090552 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-q8t82" event={"ID":"65449c45-b8f9-445e-80e7-6e3c8541c62c","Type":"ContainerStarted","Data":"867f54bd3a17809eff72b8c0887ca1d6cd2443800eed44f06f2ec6266a03d5ea"}
Feb 19 21:45:33 crc kubenswrapper[4795]: I0219 21:45:33.139881 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-lqp9l"]
Feb 19 21:45:33 crc kubenswrapper[4795]: I0219 21:45:33.236339 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-c741-account-create-update-hdlzx"]
Feb 19 21:45:33 crc kubenswrapper[4795]: W0219 21:45:33.240563 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod541fd524_94f2_4149_b16b_ab11a716ff95.slice/crio-3e05bcd97e2ba2640c0bab9eed29f375dc6d5ddbeb2df079a97ff158c1bcdbf5 WatchSource:0}: Error finding container 3e05bcd97e2ba2640c0bab9eed29f375dc6d5ddbeb2df079a97ff158c1bcdbf5: Status 404 returned error can't find the container with id 3e05bcd97e2ba2640c0bab9eed29f375dc6d5ddbeb2df079a97ff158c1bcdbf5
Feb 19 21:45:33 crc kubenswrapper[4795]: I0219 21:45:33.498355 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-689df5d84f-hpfn4"
Feb 19 21:45:33 crc kubenswrapper[4795]: I0219 21:45:33.555279 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75b7bcc64f-hk588"]
Feb 19 21:45:33 crc kubenswrapper[4795]: I0219 21:45:33.555537 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-75b7bcc64f-hk588" podUID="9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24" containerName="dnsmasq-dns" containerID="cri-o://05ca792cce2c1842654ca399f54a44b1f3a2e077f6b7464bb9380e6c32ea3603" gracePeriod=10
Feb 19 21:45:34 crc kubenswrapper[4795]: I0219 21:45:34.098370 4795 generic.go:334] "Generic (PLEG): container finished" podID="541fd524-94f2-4149-b16b-ab11a716ff95" containerID="95c63f1640980c95f161952724dcddb5ac630545c512a2aa1ea3882e47e48df9" exitCode=0
Feb 19 21:45:34 crc kubenswrapper[4795]: I0219 21:45:34.098579 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c741-account-create-update-hdlzx" event={"ID":"541fd524-94f2-4149-b16b-ab11a716ff95","Type":"ContainerDied","Data":"95c63f1640980c95f161952724dcddb5ac630545c512a2aa1ea3882e47e48df9"}
Feb 19 21:45:34 crc kubenswrapper[4795]: I0219 21:45:34.098602 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c741-account-create-update-hdlzx" event={"ID":"541fd524-94f2-4149-b16b-ab11a716ff95","Type":"ContainerStarted","Data":"3e05bcd97e2ba2640c0bab9eed29f375dc6d5ddbeb2df079a97ff158c1bcdbf5"}
Feb 19 21:45:34 crc kubenswrapper[4795]: I0219 21:45:34.101630 4795 generic.go:334] "Generic (PLEG): container finished" podID="ddebb2b6-7bc0-45af-ba68-ae108b0d91fd" containerID="c3d76989d78df6e0877ea687626ca27c833a6485113f8ba0b36de864c9998267" exitCode=0
Feb 19 21:45:34 crc kubenswrapper[4795]: I0219 21:45:34.101786 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-lqp9l" event={"ID":"ddebb2b6-7bc0-45af-ba68-ae108b0d91fd","Type":"ContainerDied","Data":"c3d76989d78df6e0877ea687626ca27c833a6485113f8ba0b36de864c9998267"}
Feb 19 21:45:34 crc kubenswrapper[4795]: I0219 21:45:34.101811 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-lqp9l" event={"ID":"ddebb2b6-7bc0-45af-ba68-ae108b0d91fd","Type":"ContainerStarted","Data":"4dddb577653131d07177fdafb62bca6d10cf85c3012b351bdb9691eeb4630158"}
Feb 19 21:45:34 crc kubenswrapper[4795]: I0219 21:45:34.105356 4795 generic.go:334] "Generic (PLEG): container finished" podID="fb1c9562-0143-4fa4-86d3-f1ed93f3fa31" containerID="83bc6dd6efe6f9962ae47d6fb7eb26b3205715ecf2651537919f42966aa1ce32" exitCode=0
Feb 19 21:45:34 crc kubenswrapper[4795]: I0219 21:45:34.105407 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-41fb-account-create-update-ntc9w" event={"ID":"fb1c9562-0143-4fa4-86d3-f1ed93f3fa31","Type":"ContainerDied","Data":"83bc6dd6efe6f9962ae47d6fb7eb26b3205715ecf2651537919f42966aa1ce32"}
Feb 19 21:45:34 crc kubenswrapper[4795]: I0219 21:45:34.106898 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75b7bcc64f-hk588"
Feb 19 21:45:34 crc kubenswrapper[4795]: I0219 21:45:34.107048 4795 generic.go:334] "Generic (PLEG): container finished" podID="65449c45-b8f9-445e-80e7-6e3c8541c62c" containerID="d90138519ffcbc0102f0a7e9dc5bfa3f8f09f718ea4e3fd3a5f7e537cfb53122" exitCode=0
Feb 19 21:45:34 crc kubenswrapper[4795]: I0219 21:45:34.107110 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-q8t82" event={"ID":"65449c45-b8f9-445e-80e7-6e3c8541c62c","Type":"ContainerDied","Data":"d90138519ffcbc0102f0a7e9dc5bfa3f8f09f718ea4e3fd3a5f7e537cfb53122"}
Feb 19 21:45:34 crc kubenswrapper[4795]: I0219 21:45:34.108527 4795 generic.go:334] "Generic (PLEG): container finished" podID="9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24" containerID="05ca792cce2c1842654ca399f54a44b1f3a2e077f6b7464bb9380e6c32ea3603" exitCode=0
Feb 19 21:45:34 crc kubenswrapper[4795]: I0219 21:45:34.108552 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75b7bcc64f-hk588" event={"ID":"9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24","Type":"ContainerDied","Data":"05ca792cce2c1842654ca399f54a44b1f3a2e077f6b7464bb9380e6c32ea3603"}
Feb 19 21:45:34 crc kubenswrapper[4795]: I0219 21:45:34.108567 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75b7bcc64f-hk588" event={"ID":"9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24","Type":"ContainerDied","Data":"30d488a86fb4ac5c67d31e2d6e5fe67d75ebbbcc1b14297f26623cb276be726d"}
Feb 19 21:45:34 crc kubenswrapper[4795]: I0219 21:45:34.108582 4795 scope.go:117] "RemoveContainer" containerID="05ca792cce2c1842654ca399f54a44b1f3a2e077f6b7464bb9380e6c32ea3603"
Feb 19 21:45:34 crc kubenswrapper[4795]: I0219 21:45:34.108656 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75b7bcc64f-hk588"
Feb 19 21:45:34 crc kubenswrapper[4795]: I0219 21:45:34.130701 4795 scope.go:117] "RemoveContainer" containerID="7dadad1b3dff5c98d556b48e60505152135bae2b96f577c3d9bca2b9ab991bb6"
Feb 19 21:45:34 crc kubenswrapper[4795]: I0219 21:45:34.161553 4795 scope.go:117] "RemoveContainer" containerID="05ca792cce2c1842654ca399f54a44b1f3a2e077f6b7464bb9380e6c32ea3603"
Feb 19 21:45:34 crc kubenswrapper[4795]: E0219 21:45:34.164674 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05ca792cce2c1842654ca399f54a44b1f3a2e077f6b7464bb9380e6c32ea3603\": container with ID starting with 05ca792cce2c1842654ca399f54a44b1f3a2e077f6b7464bb9380e6c32ea3603 not found: ID does not exist" containerID="05ca792cce2c1842654ca399f54a44b1f3a2e077f6b7464bb9380e6c32ea3603"
Feb 19 21:45:34 crc kubenswrapper[4795]: I0219 21:45:34.164711 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05ca792cce2c1842654ca399f54a44b1f3a2e077f6b7464bb9380e6c32ea3603"} err="failed to get container status \"05ca792cce2c1842654ca399f54a44b1f3a2e077f6b7464bb9380e6c32ea3603\": rpc error: code = NotFound desc = could not find container \"05ca792cce2c1842654ca399f54a44b1f3a2e077f6b7464bb9380e6c32ea3603\": container with ID starting with 05ca792cce2c1842654ca399f54a44b1f3a2e077f6b7464bb9380e6c32ea3603 not found: ID does not exist"
Feb 19 21:45:34 crc kubenswrapper[4795]: I0219 21:45:34.164732 4795 scope.go:117] "RemoveContainer" containerID="7dadad1b3dff5c98d556b48e60505152135bae2b96f577c3d9bca2b9ab991bb6"
Feb 19 21:45:34 crc kubenswrapper[4795]: E0219 21:45:34.166360 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7dadad1b3dff5c98d556b48e60505152135bae2b96f577c3d9bca2b9ab991bb6\": container with ID starting with 7dadad1b3dff5c98d556b48e60505152135bae2b96f577c3d9bca2b9ab991bb6 not found: ID does not exist" containerID="7dadad1b3dff5c98d556b48e60505152135bae2b96f577c3d9bca2b9ab991bb6"
Feb 19 21:45:34 crc kubenswrapper[4795]: I0219 21:45:34.166385 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7dadad1b3dff5c98d556b48e60505152135bae2b96f577c3d9bca2b9ab991bb6"} err="failed to get container status \"7dadad1b3dff5c98d556b48e60505152135bae2b96f577c3d9bca2b9ab991bb6\": rpc error: code = NotFound desc = could not find container \"7dadad1b3dff5c98d556b48e60505152135bae2b96f577c3d9bca2b9ab991bb6\": container with ID starting with 7dadad1b3dff5c98d556b48e60505152135bae2b96f577c3d9bca2b9ab991bb6 not found: ID does not exist"
Feb 19 21:45:34 crc kubenswrapper[4795]: I0219 21:45:34.255999 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24-config\") pod \"9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24\" (UID: \"9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24\") "
Feb 19 21:45:34 crc kubenswrapper[4795]: I0219 21:45:34.256147 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xddmw\" (UniqueName: \"kubernetes.io/projected/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24-kube-api-access-xddmw\") pod \"9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24\" (UID: \"9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24\") "
Feb 19 21:45:34 crc kubenswrapper[4795]: I0219 21:45:34.256198 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24-ovsdbserver-sb\") pod \"9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24\" (UID: \"9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24\") "
Feb 19 21:45:34 crc kubenswrapper[4795]: I0219 21:45:34.256257 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24-ovsdbserver-nb\") pod \"9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24\" (UID: \"9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24\") "
Feb 19 21:45:34 crc kubenswrapper[4795]: I0219 21:45:34.256274 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24-dns-svc\") pod \"9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24\" (UID: \"9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24\") "
Feb 19 21:45:34 crc kubenswrapper[4795]: I0219 21:45:34.261670 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24-kube-api-access-xddmw" (OuterVolumeSpecName: "kube-api-access-xddmw") pod "9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24" (UID: "9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24"). InnerVolumeSpecName "kube-api-access-xddmw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:45:34 crc kubenswrapper[4795]: E0219 21:45:34.300680 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24-dns-svc podName:9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24 nodeName:}" failed. No retries permitted until 2026-02-19 21:45:34.800654069 +0000 UTC m=+1045.993171933 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "dns-svc" (UniqueName: "kubernetes.io/configmap/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24-dns-svc") pod "9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24" (UID: "9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24") : error deleting /var/lib/kubelet/pods/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24/volume-subpaths: remove /var/lib/kubelet/pods/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24/volume-subpaths: no such file or directory
Feb 19 21:45:34 crc kubenswrapper[4795]: E0219 21:45:34.300792 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24-ovsdbserver-sb podName:9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24 nodeName:}" failed. No retries permitted until 2026-02-19 21:45:34.800783413 +0000 UTC m=+1045.993301287 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "ovsdbserver-sb" (UniqueName: "kubernetes.io/configmap/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24-ovsdbserver-sb") pod "9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24" (UID: "9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24") : error deleting /var/lib/kubelet/pods/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24/volume-subpaths: remove /var/lib/kubelet/pods/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24/volume-subpaths: no such file or directory
Feb 19 21:45:34 crc kubenswrapper[4795]: I0219 21:45:34.300979 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24" (UID: "9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 21:45:34 crc kubenswrapper[4795]: I0219 21:45:34.301045 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24-config" (OuterVolumeSpecName: "config") pod "9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24" (UID: "9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 21:45:34 crc kubenswrapper[4795]: I0219 21:45:34.357709 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xddmw\" (UniqueName: \"kubernetes.io/projected/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24-kube-api-access-xddmw\") on node \"crc\" DevicePath \"\""
Feb 19 21:45:34 crc kubenswrapper[4795]: I0219 21:45:34.357920 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 19 21:45:34 crc kubenswrapper[4795]: I0219 21:45:34.357983 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24-config\") on node \"crc\" DevicePath \"\""
Feb 19 21:45:34 crc kubenswrapper[4795]: I0219 21:45:34.864362 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24-ovsdbserver-sb\") pod \"9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24\" (UID: \"9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24\") "
Feb 19 21:45:34 crc kubenswrapper[4795]: I0219 21:45:34.864784 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24-dns-svc\") pod \"9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24\" (UID: \"9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24\") "
Feb 19 21:45:34 crc kubenswrapper[4795]: I0219 21:45:34.865386 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24" (UID: "9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 21:45:34 crc kubenswrapper[4795]: I0219 21:45:34.865572 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24" (UID: "9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 21:45:34 crc kubenswrapper[4795]: I0219 21:45:34.967328 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 19 21:45:34 crc kubenswrapper[4795]: I0219 21:45:34.967374 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 19 21:45:34 crc kubenswrapper[4795]: I0219 21:45:34.996146 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0"
Feb 19 21:45:35 crc kubenswrapper[4795]: I0219 21:45:35.072388 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75b7bcc64f-hk588"]
Feb 19 21:45:35 crc kubenswrapper[4795]: I0219 21:45:35.075483 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75b7bcc64f-hk588"]
Feb 19 21:45:35 crc kubenswrapper[4795]: I0219 21:45:35.120275 4795 generic.go:334] "Generic (PLEG): container finished" podID="ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d" containerID="f5dc53fcd687359370a9224413921410e03027d27bb1e741143948af4422db6c" exitCode=0
Feb 19 21:45:35 crc kubenswrapper[4795]: I0219 21:45:35.120426 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d","Type":"ContainerDied","Data":"f5dc53fcd687359370a9224413921410e03027d27bb1e741143948af4422db6c"}
Feb 19 21:45:35 crc kubenswrapper[4795]: I0219 21:45:35.556116 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24" path="/var/lib/kubelet/pods/9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24/volumes"
Feb 19 21:45:35 crc kubenswrapper[4795]: I0219 21:45:35.579121 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-c741-account-create-update-hdlzx"
Feb 19 21:45:35 crc kubenswrapper[4795]: I0219 21:45:35.689730 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xf4dk\" (UniqueName: \"kubernetes.io/projected/541fd524-94f2-4149-b16b-ab11a716ff95-kube-api-access-xf4dk\") pod \"541fd524-94f2-4149-b16b-ab11a716ff95\" (UID: \"541fd524-94f2-4149-b16b-ab11a716ff95\") "
Feb 19 21:45:35 crc kubenswrapper[4795]: I0219 21:45:35.689784 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/541fd524-94f2-4149-b16b-ab11a716ff95-operator-scripts\") pod \"541fd524-94f2-4149-b16b-ab11a716ff95\" (UID: \"541fd524-94f2-4149-b16b-ab11a716ff95\") "
Feb 19 21:45:35 crc kubenswrapper[4795]: I0219 21:45:35.690736 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/541fd524-94f2-4149-b16b-ab11a716ff95-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "541fd524-94f2-4149-b16b-ab11a716ff95" (UID: "541fd524-94f2-4149-b16b-ab11a716ff95"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 21:45:35 crc kubenswrapper[4795]: I0219 21:45:35.704008 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/541fd524-94f2-4149-b16b-ab11a716ff95-kube-api-access-xf4dk" (OuterVolumeSpecName: "kube-api-access-xf4dk") pod "541fd524-94f2-4149-b16b-ab11a716ff95" (UID: "541fd524-94f2-4149-b16b-ab11a716ff95"). InnerVolumeSpecName "kube-api-access-xf4dk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:45:35 crc kubenswrapper[4795]: I0219 21:45:35.787544 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-lqp9l"
Feb 19 21:45:35 crc kubenswrapper[4795]: I0219 21:45:35.791669 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xf4dk\" (UniqueName: \"kubernetes.io/projected/541fd524-94f2-4149-b16b-ab11a716ff95-kube-api-access-xf4dk\") on node \"crc\" DevicePath \"\""
Feb 19 21:45:35 crc kubenswrapper[4795]: I0219 21:45:35.791700 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/541fd524-94f2-4149-b16b-ab11a716ff95-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 21:45:35 crc kubenswrapper[4795]: I0219 21:45:35.794559 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-q8t82"
Feb 19 21:45:35 crc kubenswrapper[4795]: I0219 21:45:35.799468 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-41fb-account-create-update-ntc9w"
Feb 19 21:45:35 crc kubenswrapper[4795]: I0219 21:45:35.892774 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-969df\" (UniqueName: \"kubernetes.io/projected/fb1c9562-0143-4fa4-86d3-f1ed93f3fa31-kube-api-access-969df\") pod \"fb1c9562-0143-4fa4-86d3-f1ed93f3fa31\" (UID: \"fb1c9562-0143-4fa4-86d3-f1ed93f3fa31\") "
Feb 19 21:45:35 crc kubenswrapper[4795]: I0219 21:45:35.892846 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65449c45-b8f9-445e-80e7-6e3c8541c62c-operator-scripts\") pod \"65449c45-b8f9-445e-80e7-6e3c8541c62c\" (UID: \"65449c45-b8f9-445e-80e7-6e3c8541c62c\") "
Feb 19 21:45:35 crc kubenswrapper[4795]: I0219 21:45:35.892874 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb1c9562-0143-4fa4-86d3-f1ed93f3fa31-operator-scripts\") pod \"fb1c9562-0143-4fa4-86d3-f1ed93f3fa31\" (UID: \"fb1c9562-0143-4fa4-86d3-f1ed93f3fa31\") "
Feb 19 21:45:35 crc kubenswrapper[4795]: I0219 21:45:35.892939 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ddebb2b6-7bc0-45af-ba68-ae108b0d91fd-operator-scripts\") pod \"ddebb2b6-7bc0-45af-ba68-ae108b0d91fd\" (UID: \"ddebb2b6-7bc0-45af-ba68-ae108b0d91fd\") "
Feb 19 21:45:35 crc kubenswrapper[4795]: I0219 21:45:35.892957 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hmkw\" (UniqueName: \"kubernetes.io/projected/ddebb2b6-7bc0-45af-ba68-ae108b0d91fd-kube-api-access-4hmkw\") pod \"ddebb2b6-7bc0-45af-ba68-ae108b0d91fd\" (UID: \"ddebb2b6-7bc0-45af-ba68-ae108b0d91fd\") "
Feb 19 21:45:35 crc kubenswrapper[4795]: I0219 21:45:35.892977 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tchxf\" (UniqueName: \"kubernetes.io/projected/65449c45-b8f9-445e-80e7-6e3c8541c62c-kube-api-access-tchxf\") pod \"65449c45-b8f9-445e-80e7-6e3c8541c62c\" (UID: \"65449c45-b8f9-445e-80e7-6e3c8541c62c\") "
Feb 19 21:45:35 crc kubenswrapper[4795]: I0219 21:45:35.893648 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddebb2b6-7bc0-45af-ba68-ae108b0d91fd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ddebb2b6-7bc0-45af-ba68-ae108b0d91fd" (UID: "ddebb2b6-7bc0-45af-ba68-ae108b0d91fd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 21:45:35 crc kubenswrapper[4795]: I0219 21:45:35.893647 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65449c45-b8f9-445e-80e7-6e3c8541c62c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "65449c45-b8f9-445e-80e7-6e3c8541c62c" (UID: "65449c45-b8f9-445e-80e7-6e3c8541c62c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 21:45:35 crc kubenswrapper[4795]: I0219 21:45:35.893647 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb1c9562-0143-4fa4-86d3-f1ed93f3fa31-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fb1c9562-0143-4fa4-86d3-f1ed93f3fa31" (UID: "fb1c9562-0143-4fa4-86d3-f1ed93f3fa31"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 21:45:35 crc kubenswrapper[4795]: I0219 21:45:35.897691 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb1c9562-0143-4fa4-86d3-f1ed93f3fa31-kube-api-access-969df" (OuterVolumeSpecName: "kube-api-access-969df") pod "fb1c9562-0143-4fa4-86d3-f1ed93f3fa31" (UID: "fb1c9562-0143-4fa4-86d3-f1ed93f3fa31"). InnerVolumeSpecName "kube-api-access-969df". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:45:35 crc kubenswrapper[4795]: I0219 21:45:35.897834 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddebb2b6-7bc0-45af-ba68-ae108b0d91fd-kube-api-access-4hmkw" (OuterVolumeSpecName: "kube-api-access-4hmkw") pod "ddebb2b6-7bc0-45af-ba68-ae108b0d91fd" (UID: "ddebb2b6-7bc0-45af-ba68-ae108b0d91fd"). InnerVolumeSpecName "kube-api-access-4hmkw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:45:35 crc kubenswrapper[4795]: I0219 21:45:35.898601 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65449c45-b8f9-445e-80e7-6e3c8541c62c-kube-api-access-tchxf" (OuterVolumeSpecName: "kube-api-access-tchxf") pod "65449c45-b8f9-445e-80e7-6e3c8541c62c" (UID: "65449c45-b8f9-445e-80e7-6e3c8541c62c"). InnerVolumeSpecName "kube-api-access-tchxf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:45:35 crc kubenswrapper[4795]: I0219 21:45:35.994507 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65449c45-b8f9-445e-80e7-6e3c8541c62c-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 21:45:35 crc kubenswrapper[4795]: I0219 21:45:35.994536 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb1c9562-0143-4fa4-86d3-f1ed93f3fa31-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 21:45:35 crc kubenswrapper[4795]: I0219 21:45:35.994546 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ddebb2b6-7bc0-45af-ba68-ae108b0d91fd-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 21:45:35 crc kubenswrapper[4795]: I0219 21:45:35.994555 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4hmkw\" (UniqueName: \"kubernetes.io/projected/ddebb2b6-7bc0-45af-ba68-ae108b0d91fd-kube-api-access-4hmkw\") on node \"crc\" DevicePath \"\""
Feb 19 21:45:35 crc kubenswrapper[4795]: I0219 21:45:35.994564 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tchxf\" (UniqueName: \"kubernetes.io/projected/65449c45-b8f9-445e-80e7-6e3c8541c62c-kube-api-access-tchxf\") on node \"crc\" DevicePath \"\""
Feb 19 21:45:35 crc kubenswrapper[4795]: I0219 21:45:35.994574 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-969df\" (UniqueName: \"kubernetes.io/projected/fb1c9562-0143-4fa4-86d3-f1ed93f3fa31-kube-api-access-969df\") on node \"crc\" DevicePath \"\""
Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.128345 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-lqp9l" event={"ID":"ddebb2b6-7bc0-45af-ba68-ae108b0d91fd","Type":"ContainerDied","Data":"4dddb577653131d07177fdafb62bca6d10cf85c3012b351bdb9691eeb4630158"}
Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.128681 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4dddb577653131d07177fdafb62bca6d10cf85c3012b351bdb9691eeb4630158"
Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.128742 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-lqp9l"
Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.141691 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-41fb-account-create-update-ntc9w"
Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.141904 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-41fb-account-create-update-ntc9w" event={"ID":"fb1c9562-0143-4fa4-86d3-f1ed93f3fa31","Type":"ContainerDied","Data":"4383db569460ac992c685e32755baabb7fa4619c28b9fce54927b757c2667447"}
Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.141938 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4383db569460ac992c685e32755baabb7fa4619c28b9fce54927b757c2667447"
Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.144435 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-q8t82"
Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.144439 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-q8t82" event={"ID":"65449c45-b8f9-445e-80e7-6e3c8541c62c","Type":"ContainerDied","Data":"867f54bd3a17809eff72b8c0887ca1d6cd2443800eed44f06f2ec6266a03d5ea"}
Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.144555 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="867f54bd3a17809eff72b8c0887ca1d6cd2443800eed44f06f2ec6266a03d5ea"
Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.147001 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d","Type":"ContainerStarted","Data":"5a6b19520891e7087129c9dfe002592444a956d213365be36d88dc721e7adc6e"}
Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.147571 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.148800 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c741-account-create-update-hdlzx" event={"ID":"541fd524-94f2-4149-b16b-ab11a716ff95","Type":"ContainerDied","Data":"3e05bcd97e2ba2640c0bab9eed29f375dc6d5ddbeb2df079a97ff158c1bcdbf5"}
Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.148827 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e05bcd97e2ba2640c0bab9eed29f375dc6d5ddbeb2df079a97ff158c1bcdbf5"
Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.148858 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-c741-account-create-update-hdlzx"
Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.187414 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.154715879 podStartE2EDuration="1m0.18739142s" podCreationTimestamp="2026-02-19 21:44:36 +0000 UTC" firstStartedPulling="2026-02-19 21:44:38.460652946 +0000 UTC m=+989.653170810" lastFinishedPulling="2026-02-19 21:45:01.493328487 +0000 UTC m=+1012.685846351" observedRunningTime="2026-02-19 21:45:36.180941408 +0000 UTC m=+1047.373459282" watchObservedRunningTime="2026-02-19 21:45:36.18739142 +0000 UTC m=+1047.379909284"
Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.249094 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-7ktnd"]
Feb 19 21:45:36 crc kubenswrapper[4795]: E0219 21:45:36.250220 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24" containerName="init"
Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.250244 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24" containerName="init"
Feb 19 21:45:36 crc kubenswrapper[4795]: E0219 21:45:36.250268 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb1c9562-0143-4fa4-86d3-f1ed93f3fa31" containerName="mariadb-account-create-update"
Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.250277 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb1c9562-0143-4fa4-86d3-f1ed93f3fa31" containerName="mariadb-account-create-update"
Feb 19 21:45:36 crc kubenswrapper[4795]: E0219 21:45:36.250291 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="541fd524-94f2-4149-b16b-ab11a716ff95" containerName="mariadb-account-create-update"
Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.250300 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="541fd524-94f2-4149-b16b-ab11a716ff95" containerName="mariadb-account-create-update"
Feb 19 21:45:36 crc kubenswrapper[4795]: E0219 21:45:36.250310 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24" containerName="dnsmasq-dns"
Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.250318 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24" containerName="dnsmasq-dns"
Feb 19 21:45:36 crc kubenswrapper[4795]: E0219 21:45:36.250350 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65449c45-b8f9-445e-80e7-6e3c8541c62c" containerName="mariadb-database-create"
Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.250358 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="65449c45-b8f9-445e-80e7-6e3c8541c62c" containerName="mariadb-database-create"
Feb 19 21:45:36 crc kubenswrapper[4795]: E0219 21:45:36.250373 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddebb2b6-7bc0-45af-ba68-ae108b0d91fd" containerName="mariadb-database-create"
Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.250380 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddebb2b6-7bc0-45af-ba68-ae108b0d91fd" containerName="mariadb-database-create"
Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.250588 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddebb2b6-7bc0-45af-ba68-ae108b0d91fd"
containerName="mariadb-database-create" Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.250607 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="65449c45-b8f9-445e-80e7-6e3c8541c62c" containerName="mariadb-database-create" Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.250620 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="541fd524-94f2-4149-b16b-ab11a716ff95" containerName="mariadb-account-create-update" Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.250637 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="9aa18e6a-5d0f-4f6e-b36c-3a2b9e2d0d24" containerName="dnsmasq-dns" Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.250651 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb1c9562-0143-4fa4-86d3-f1ed93f3fa31" containerName="mariadb-account-create-update" Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.251251 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-7ktnd" Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.265662 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-7ktnd"] Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.348649 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-f769-account-create-update-25m5x"] Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.349625 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-f769-account-create-update-25m5x" Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.366638 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.372387 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-f769-account-create-update-25m5x"] Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.406148 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a2c0e289-4e3b-4b5a-93db-d38621a870ec-operator-scripts\") pod \"glance-db-create-7ktnd\" (UID: \"a2c0e289-4e3b-4b5a-93db-d38621a870ec\") " pod="openstack/glance-db-create-7ktnd" Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.406266 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4wn5\" (UniqueName: \"kubernetes.io/projected/a2c0e289-4e3b-4b5a-93db-d38621a870ec-kube-api-access-t4wn5\") pod \"glance-db-create-7ktnd\" (UID: \"a2c0e289-4e3b-4b5a-93db-d38621a870ec\") " pod="openstack/glance-db-create-7ktnd" Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.510000 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a2c0e289-4e3b-4b5a-93db-d38621a870ec-operator-scripts\") pod \"glance-db-create-7ktnd\" (UID: \"a2c0e289-4e3b-4b5a-93db-d38621a870ec\") " pod="openstack/glance-db-create-7ktnd" Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.510057 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlbk2\" (UniqueName: \"kubernetes.io/projected/890a044b-0060-4feb-866b-9a9e80bfa706-kube-api-access-tlbk2\") pod \"glance-f769-account-create-update-25m5x\" (UID: \"890a044b-0060-4feb-866b-9a9e80bfa706\") " 
pod="openstack/glance-f769-account-create-update-25m5x" Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.510115 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4wn5\" (UniqueName: \"kubernetes.io/projected/a2c0e289-4e3b-4b5a-93db-d38621a870ec-kube-api-access-t4wn5\") pod \"glance-db-create-7ktnd\" (UID: \"a2c0e289-4e3b-4b5a-93db-d38621a870ec\") " pod="openstack/glance-db-create-7ktnd" Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.510151 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/890a044b-0060-4feb-866b-9a9e80bfa706-operator-scripts\") pod \"glance-f769-account-create-update-25m5x\" (UID: \"890a044b-0060-4feb-866b-9a9e80bfa706\") " pod="openstack/glance-f769-account-create-update-25m5x" Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.511047 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a2c0e289-4e3b-4b5a-93db-d38621a870ec-operator-scripts\") pod \"glance-db-create-7ktnd\" (UID: \"a2c0e289-4e3b-4b5a-93db-d38621a870ec\") " pod="openstack/glance-db-create-7ktnd" Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.529513 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4wn5\" (UniqueName: \"kubernetes.io/projected/a2c0e289-4e3b-4b5a-93db-d38621a870ec-kube-api-access-t4wn5\") pod \"glance-db-create-7ktnd\" (UID: \"a2c0e289-4e3b-4b5a-93db-d38621a870ec\") " pod="openstack/glance-db-create-7ktnd" Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.570478 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-7ktnd" Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.611515 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlbk2\" (UniqueName: \"kubernetes.io/projected/890a044b-0060-4feb-866b-9a9e80bfa706-kube-api-access-tlbk2\") pod \"glance-f769-account-create-update-25m5x\" (UID: \"890a044b-0060-4feb-866b-9a9e80bfa706\") " pod="openstack/glance-f769-account-create-update-25m5x" Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.611620 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/890a044b-0060-4feb-866b-9a9e80bfa706-operator-scripts\") pod \"glance-f769-account-create-update-25m5x\" (UID: \"890a044b-0060-4feb-866b-9a9e80bfa706\") " pod="openstack/glance-f769-account-create-update-25m5x" Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.612673 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/890a044b-0060-4feb-866b-9a9e80bfa706-operator-scripts\") pod \"glance-f769-account-create-update-25m5x\" (UID: \"890a044b-0060-4feb-866b-9a9e80bfa706\") " pod="openstack/glance-f769-account-create-update-25m5x" Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.638678 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlbk2\" (UniqueName: \"kubernetes.io/projected/890a044b-0060-4feb-866b-9a9e80bfa706-kube-api-access-tlbk2\") pod \"glance-f769-account-create-update-25m5x\" (UID: \"890a044b-0060-4feb-866b-9a9e80bfa706\") " pod="openstack/glance-f769-account-create-update-25m5x" Feb 19 21:45:36 crc kubenswrapper[4795]: I0219 21:45:36.682967 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-f769-account-create-update-25m5x" Feb 19 21:45:37 crc kubenswrapper[4795]: I0219 21:45:37.081911 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-7ktnd"] Feb 19 21:45:37 crc kubenswrapper[4795]: W0219 21:45:37.083116 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda2c0e289_4e3b_4b5a_93db_d38621a870ec.slice/crio-f4a0b272510dcb48f64a7a1f3dfc756614d3c0e18ddd17eec3be36dac261739e WatchSource:0}: Error finding container f4a0b272510dcb48f64a7a1f3dfc756614d3c0e18ddd17eec3be36dac261739e: Status 404 returned error can't find the container with id f4a0b272510dcb48f64a7a1f3dfc756614d3c0e18ddd17eec3be36dac261739e Feb 19 21:45:37 crc kubenswrapper[4795]: I0219 21:45:37.156595 4795 generic.go:334] "Generic (PLEG): container finished" podID="f8945f31-b1d9-4c65-9f8c-2619f87d4237" containerID="2c21dc2092bc9760ef50273deb040b6251a319d32c5e2c0fdd3bf1678ba55094" exitCode=0 Feb 19 21:45:37 crc kubenswrapper[4795]: I0219 21:45:37.156683 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-xmbg2" event={"ID":"f8945f31-b1d9-4c65-9f8c-2619f87d4237","Type":"ContainerDied","Data":"2c21dc2092bc9760ef50273deb040b6251a319d32c5e2c0fdd3bf1678ba55094"} Feb 19 21:45:37 crc kubenswrapper[4795]: I0219 21:45:37.157798 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-7ktnd" event={"ID":"a2c0e289-4e3b-4b5a-93db-d38621a870ec","Type":"ContainerStarted","Data":"f4a0b272510dcb48f64a7a1f3dfc756614d3c0e18ddd17eec3be36dac261739e"} Feb 19 21:45:37 crc kubenswrapper[4795]: I0219 21:45:37.231216 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-f769-account-create-update-25m5x"] Feb 19 21:45:37 crc kubenswrapper[4795]: I0219 21:45:37.644210 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-w9fbs" 
podUID="c30e8522-2d7f-4f10-a0b4-a7cfc351d093" containerName="ovn-controller" probeResult="failure" output=< Feb 19 21:45:37 crc kubenswrapper[4795]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Feb 19 21:45:37 crc kubenswrapper[4795]: > Feb 19 21:45:37 crc kubenswrapper[4795]: I0219 21:45:37.980324 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-jlwf9"] Feb 19 21:45:37 crc kubenswrapper[4795]: I0219 21:45:37.985223 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-jlwf9"] Feb 19 21:45:38 crc kubenswrapper[4795]: I0219 21:45:38.078894 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-jr6xc"] Feb 19 21:45:38 crc kubenswrapper[4795]: I0219 21:45:38.080123 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-jr6xc" Feb 19 21:45:38 crc kubenswrapper[4795]: I0219 21:45:38.082188 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Feb 19 21:45:38 crc kubenswrapper[4795]: I0219 21:45:38.099290 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-jr6xc"] Feb 19 21:45:38 crc kubenswrapper[4795]: I0219 21:45:38.174186 4795 generic.go:334] "Generic (PLEG): container finished" podID="890a044b-0060-4feb-866b-9a9e80bfa706" containerID="f13d7ab12cdcba95909b27dcd1c1e77cc3443dbab3be6042ac8015c2168e9280" exitCode=0 Feb 19 21:45:38 crc kubenswrapper[4795]: I0219 21:45:38.174278 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f769-account-create-update-25m5x" event={"ID":"890a044b-0060-4feb-866b-9a9e80bfa706","Type":"ContainerDied","Data":"f13d7ab12cdcba95909b27dcd1c1e77cc3443dbab3be6042ac8015c2168e9280"} Feb 19 21:45:38 crc kubenswrapper[4795]: I0219 21:45:38.174323 4795 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/glance-f769-account-create-update-25m5x" event={"ID":"890a044b-0060-4feb-866b-9a9e80bfa706","Type":"ContainerStarted","Data":"62a552ba7e9d49668b066ddc893227489242cacf8fb293f6b71e0d3a5c13b2b2"} Feb 19 21:45:38 crc kubenswrapper[4795]: I0219 21:45:38.176131 4795 generic.go:334] "Generic (PLEG): container finished" podID="a2c0e289-4e3b-4b5a-93db-d38621a870ec" containerID="00ef13a6fc1812f0a18c09ccfd124e49c733a0a626dc7ae23746169010b14516" exitCode=0 Feb 19 21:45:38 crc kubenswrapper[4795]: I0219 21:45:38.176280 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-7ktnd" event={"ID":"a2c0e289-4e3b-4b5a-93db-d38621a870ec","Type":"ContainerDied","Data":"00ef13a6fc1812f0a18c09ccfd124e49c733a0a626dc7ae23746169010b14516"} Feb 19 21:45:38 crc kubenswrapper[4795]: I0219 21:45:38.242662 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dvbp\" (UniqueName: \"kubernetes.io/projected/5b15ba11-a170-4fac-bac1-15ecf9de7379-kube-api-access-7dvbp\") pod \"root-account-create-update-jr6xc\" (UID: \"5b15ba11-a170-4fac-bac1-15ecf9de7379\") " pod="openstack/root-account-create-update-jr6xc" Feb 19 21:45:38 crc kubenswrapper[4795]: I0219 21:45:38.242776 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b15ba11-a170-4fac-bac1-15ecf9de7379-operator-scripts\") pod \"root-account-create-update-jr6xc\" (UID: \"5b15ba11-a170-4fac-bac1-15ecf9de7379\") " pod="openstack/root-account-create-update-jr6xc" Feb 19 21:45:38 crc kubenswrapper[4795]: I0219 21:45:38.343711 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b15ba11-a170-4fac-bac1-15ecf9de7379-operator-scripts\") pod \"root-account-create-update-jr6xc\" (UID: \"5b15ba11-a170-4fac-bac1-15ecf9de7379\") " 
pod="openstack/root-account-create-update-jr6xc" Feb 19 21:45:38 crc kubenswrapper[4795]: I0219 21:45:38.343800 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dvbp\" (UniqueName: \"kubernetes.io/projected/5b15ba11-a170-4fac-bac1-15ecf9de7379-kube-api-access-7dvbp\") pod \"root-account-create-update-jr6xc\" (UID: \"5b15ba11-a170-4fac-bac1-15ecf9de7379\") " pod="openstack/root-account-create-update-jr6xc" Feb 19 21:45:38 crc kubenswrapper[4795]: I0219 21:45:38.345380 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b15ba11-a170-4fac-bac1-15ecf9de7379-operator-scripts\") pod \"root-account-create-update-jr6xc\" (UID: \"5b15ba11-a170-4fac-bac1-15ecf9de7379\") " pod="openstack/root-account-create-update-jr6xc" Feb 19 21:45:38 crc kubenswrapper[4795]: I0219 21:45:38.361147 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dvbp\" (UniqueName: \"kubernetes.io/projected/5b15ba11-a170-4fac-bac1-15ecf9de7379-kube-api-access-7dvbp\") pod \"root-account-create-update-jr6xc\" (UID: \"5b15ba11-a170-4fac-bac1-15ecf9de7379\") " pod="openstack/root-account-create-update-jr6xc" Feb 19 21:45:38 crc kubenswrapper[4795]: I0219 21:45:38.434480 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-jr6xc" Feb 19 21:45:38 crc kubenswrapper[4795]: I0219 21:45:38.527888 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-xmbg2" Feb 19 21:45:38 crc kubenswrapper[4795]: I0219 21:45:38.659353 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f8945f31-b1d9-4c65-9f8c-2619f87d4237-etc-swift\") pod \"f8945f31-b1d9-4c65-9f8c-2619f87d4237\" (UID: \"f8945f31-b1d9-4c65-9f8c-2619f87d4237\") " Feb 19 21:45:38 crc kubenswrapper[4795]: I0219 21:45:38.659399 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-svtdp\" (UniqueName: \"kubernetes.io/projected/f8945f31-b1d9-4c65-9f8c-2619f87d4237-kube-api-access-svtdp\") pod \"f8945f31-b1d9-4c65-9f8c-2619f87d4237\" (UID: \"f8945f31-b1d9-4c65-9f8c-2619f87d4237\") " Feb 19 21:45:38 crc kubenswrapper[4795]: I0219 21:45:38.659460 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f8945f31-b1d9-4c65-9f8c-2619f87d4237-swiftconf\") pod \"f8945f31-b1d9-4c65-9f8c-2619f87d4237\" (UID: \"f8945f31-b1d9-4c65-9f8c-2619f87d4237\") " Feb 19 21:45:38 crc kubenswrapper[4795]: I0219 21:45:38.659522 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f8945f31-b1d9-4c65-9f8c-2619f87d4237-scripts\") pod \"f8945f31-b1d9-4c65-9f8c-2619f87d4237\" (UID: \"f8945f31-b1d9-4c65-9f8c-2619f87d4237\") " Feb 19 21:45:38 crc kubenswrapper[4795]: I0219 21:45:38.659556 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8945f31-b1d9-4c65-9f8c-2619f87d4237-combined-ca-bundle\") pod \"f8945f31-b1d9-4c65-9f8c-2619f87d4237\" (UID: \"f8945f31-b1d9-4c65-9f8c-2619f87d4237\") " Feb 19 21:45:38 crc kubenswrapper[4795]: I0219 21:45:38.659654 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/f8945f31-b1d9-4c65-9f8c-2619f87d4237-dispersionconf\") pod \"f8945f31-b1d9-4c65-9f8c-2619f87d4237\" (UID: \"f8945f31-b1d9-4c65-9f8c-2619f87d4237\") " Feb 19 21:45:38 crc kubenswrapper[4795]: I0219 21:45:38.659691 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f8945f31-b1d9-4c65-9f8c-2619f87d4237-ring-data-devices\") pod \"f8945f31-b1d9-4c65-9f8c-2619f87d4237\" (UID: \"f8945f31-b1d9-4c65-9f8c-2619f87d4237\") " Feb 19 21:45:38 crc kubenswrapper[4795]: I0219 21:45:38.660430 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8945f31-b1d9-4c65-9f8c-2619f87d4237-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "f8945f31-b1d9-4c65-9f8c-2619f87d4237" (UID: "f8945f31-b1d9-4c65-9f8c-2619f87d4237"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:45:38 crc kubenswrapper[4795]: I0219 21:45:38.660804 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8945f31-b1d9-4c65-9f8c-2619f87d4237-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "f8945f31-b1d9-4c65-9f8c-2619f87d4237" (UID: "f8945f31-b1d9-4c65-9f8c-2619f87d4237"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:45:38 crc kubenswrapper[4795]: I0219 21:45:38.683784 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8945f31-b1d9-4c65-9f8c-2619f87d4237-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "f8945f31-b1d9-4c65-9f8c-2619f87d4237" (UID: "f8945f31-b1d9-4c65-9f8c-2619f87d4237"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:45:38 crc kubenswrapper[4795]: I0219 21:45:38.688311 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8945f31-b1d9-4c65-9f8c-2619f87d4237-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "f8945f31-b1d9-4c65-9f8c-2619f87d4237" (UID: "f8945f31-b1d9-4c65-9f8c-2619f87d4237"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:45:38 crc kubenswrapper[4795]: I0219 21:45:38.690577 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8945f31-b1d9-4c65-9f8c-2619f87d4237-kube-api-access-svtdp" (OuterVolumeSpecName: "kube-api-access-svtdp") pod "f8945f31-b1d9-4c65-9f8c-2619f87d4237" (UID: "f8945f31-b1d9-4c65-9f8c-2619f87d4237"). InnerVolumeSpecName "kube-api-access-svtdp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:45:38 crc kubenswrapper[4795]: I0219 21:45:38.690777 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8945f31-b1d9-4c65-9f8c-2619f87d4237-scripts" (OuterVolumeSpecName: "scripts") pod "f8945f31-b1d9-4c65-9f8c-2619f87d4237" (UID: "f8945f31-b1d9-4c65-9f8c-2619f87d4237"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:45:38 crc kubenswrapper[4795]: I0219 21:45:38.706682 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8945f31-b1d9-4c65-9f8c-2619f87d4237-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f8945f31-b1d9-4c65-9f8c-2619f87d4237" (UID: "f8945f31-b1d9-4c65-9f8c-2619f87d4237"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:45:38 crc kubenswrapper[4795]: I0219 21:45:38.761641 4795 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f8945f31-b1d9-4c65-9f8c-2619f87d4237-swiftconf\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:38 crc kubenswrapper[4795]: I0219 21:45:38.761674 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f8945f31-b1d9-4c65-9f8c-2619f87d4237-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:38 crc kubenswrapper[4795]: I0219 21:45:38.761684 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8945f31-b1d9-4c65-9f8c-2619f87d4237-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:38 crc kubenswrapper[4795]: I0219 21:45:38.761695 4795 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f8945f31-b1d9-4c65-9f8c-2619f87d4237-dispersionconf\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:38 crc kubenswrapper[4795]: I0219 21:45:38.761713 4795 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f8945f31-b1d9-4c65-9f8c-2619f87d4237-ring-data-devices\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:38 crc kubenswrapper[4795]: I0219 21:45:38.761721 4795 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f8945f31-b1d9-4c65-9f8c-2619f87d4237-etc-swift\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:38 crc kubenswrapper[4795]: I0219 21:45:38.761729 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-svtdp\" (UniqueName: \"kubernetes.io/projected/f8945f31-b1d9-4c65-9f8c-2619f87d4237-kube-api-access-svtdp\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:38 crc kubenswrapper[4795]: W0219 21:45:38.894362 4795 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b15ba11_a170_4fac_bac1_15ecf9de7379.slice/crio-31f177d86dd9cbd251e0070c154b15ebc1a97c0cad3ad1618ee4fd7eb8b37ce3 WatchSource:0}: Error finding container 31f177d86dd9cbd251e0070c154b15ebc1a97c0cad3ad1618ee4fd7eb8b37ce3: Status 404 returned error can't find the container with id 31f177d86dd9cbd251e0070c154b15ebc1a97c0cad3ad1618ee4fd7eb8b37ce3 Feb 19 21:45:38 crc kubenswrapper[4795]: I0219 21:45:38.895965 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-jr6xc"] Feb 19 21:45:39 crc kubenswrapper[4795]: I0219 21:45:39.185664 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-jr6xc" event={"ID":"5b15ba11-a170-4fac-bac1-15ecf9de7379","Type":"ContainerStarted","Data":"27f5e555e30e338ab9e5ae7facd6ba963cf6336bbb4bec245c099c4afa8bb6a3"} Feb 19 21:45:39 crc kubenswrapper[4795]: I0219 21:45:39.185964 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-jr6xc" event={"ID":"5b15ba11-a170-4fac-bac1-15ecf9de7379","Type":"ContainerStarted","Data":"31f177d86dd9cbd251e0070c154b15ebc1a97c0cad3ad1618ee4fd7eb8b37ce3"} Feb 19 21:45:39 crc kubenswrapper[4795]: I0219 21:45:39.187452 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-xmbg2" event={"ID":"f8945f31-b1d9-4c65-9f8c-2619f87d4237","Type":"ContainerDied","Data":"ca2816f59591a52db0fc74a4911b062bb5362403053e13a03adbbd1512ca5f93"} Feb 19 21:45:39 crc kubenswrapper[4795]: I0219 21:45:39.187493 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca2816f59591a52db0fc74a4911b062bb5362403053e13a03adbbd1512ca5f93" Feb 19 21:45:39 crc kubenswrapper[4795]: I0219 21:45:39.187567 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-xmbg2" Feb 19 21:45:39 crc kubenswrapper[4795]: I0219 21:45:39.216111 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-jr6xc" podStartSLOduration=1.216094648 podStartE2EDuration="1.216094648s" podCreationTimestamp="2026-02-19 21:45:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:45:39.20941898 +0000 UTC m=+1050.401936844" watchObservedRunningTime="2026-02-19 21:45:39.216094648 +0000 UTC m=+1050.408612512" Feb 19 21:45:39 crc kubenswrapper[4795]: I0219 21:45:39.525649 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11cc466e-0752-46b5-9775-c29748b13724" path="/var/lib/kubelet/pods/11cc466e-0752-46b5-9775-c29748b13724/volumes" Feb 19 21:45:39 crc kubenswrapper[4795]: I0219 21:45:39.557640 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-f769-account-create-update-25m5x" Feb 19 21:45:39 crc kubenswrapper[4795]: I0219 21:45:39.616277 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-7ktnd" Feb 19 21:45:39 crc kubenswrapper[4795]: I0219 21:45:39.700618 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/890a044b-0060-4feb-866b-9a9e80bfa706-operator-scripts\") pod \"890a044b-0060-4feb-866b-9a9e80bfa706\" (UID: \"890a044b-0060-4feb-866b-9a9e80bfa706\") " Feb 19 21:45:39 crc kubenswrapper[4795]: I0219 21:45:39.701284 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tlbk2\" (UniqueName: \"kubernetes.io/projected/890a044b-0060-4feb-866b-9a9e80bfa706-kube-api-access-tlbk2\") pod \"890a044b-0060-4feb-866b-9a9e80bfa706\" (UID: \"890a044b-0060-4feb-866b-9a9e80bfa706\") " Feb 19 21:45:39 crc kubenswrapper[4795]: I0219 21:45:39.702330 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/890a044b-0060-4feb-866b-9a9e80bfa706-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "890a044b-0060-4feb-866b-9a9e80bfa706" (UID: "890a044b-0060-4feb-866b-9a9e80bfa706"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:45:39 crc kubenswrapper[4795]: I0219 21:45:39.708840 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/890a044b-0060-4feb-866b-9a9e80bfa706-kube-api-access-tlbk2" (OuterVolumeSpecName: "kube-api-access-tlbk2") pod "890a044b-0060-4feb-866b-9a9e80bfa706" (UID: "890a044b-0060-4feb-866b-9a9e80bfa706"). InnerVolumeSpecName "kube-api-access-tlbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:45:39 crc kubenswrapper[4795]: I0219 21:45:39.802260 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a2c0e289-4e3b-4b5a-93db-d38621a870ec-operator-scripts\") pod \"a2c0e289-4e3b-4b5a-93db-d38621a870ec\" (UID: \"a2c0e289-4e3b-4b5a-93db-d38621a870ec\") " Feb 19 21:45:39 crc kubenswrapper[4795]: I0219 21:45:39.802297 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4wn5\" (UniqueName: \"kubernetes.io/projected/a2c0e289-4e3b-4b5a-93db-d38621a870ec-kube-api-access-t4wn5\") pod \"a2c0e289-4e3b-4b5a-93db-d38621a870ec\" (UID: \"a2c0e289-4e3b-4b5a-93db-d38621a870ec\") " Feb 19 21:45:39 crc kubenswrapper[4795]: I0219 21:45:39.802620 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/890a044b-0060-4feb-866b-9a9e80bfa706-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:39 crc kubenswrapper[4795]: I0219 21:45:39.802637 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tlbk2\" (UniqueName: \"kubernetes.io/projected/890a044b-0060-4feb-866b-9a9e80bfa706-kube-api-access-tlbk2\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:39 crc kubenswrapper[4795]: I0219 21:45:39.802822 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2c0e289-4e3b-4b5a-93db-d38621a870ec-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a2c0e289-4e3b-4b5a-93db-d38621a870ec" (UID: "a2c0e289-4e3b-4b5a-93db-d38621a870ec"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:45:39 crc kubenswrapper[4795]: I0219 21:45:39.805324 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2c0e289-4e3b-4b5a-93db-d38621a870ec-kube-api-access-t4wn5" (OuterVolumeSpecName: "kube-api-access-t4wn5") pod "a2c0e289-4e3b-4b5a-93db-d38621a870ec" (UID: "a2c0e289-4e3b-4b5a-93db-d38621a870ec"). InnerVolumeSpecName "kube-api-access-t4wn5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:45:39 crc kubenswrapper[4795]: I0219 21:45:39.921642 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a2c0e289-4e3b-4b5a-93db-d38621a870ec-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:39 crc kubenswrapper[4795]: I0219 21:45:39.921685 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t4wn5\" (UniqueName: \"kubernetes.io/projected/a2c0e289-4e3b-4b5a-93db-d38621a870ec-kube-api-access-t4wn5\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:40 crc kubenswrapper[4795]: I0219 21:45:40.124581 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6c773ec2-a400-42a9-8784-ed9c295c3bb4-etc-swift\") pod \"swift-storage-0\" (UID: \"6c773ec2-a400-42a9-8784-ed9c295c3bb4\") " pod="openstack/swift-storage-0" Feb 19 21:45:40 crc kubenswrapper[4795]: I0219 21:45:40.129191 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6c773ec2-a400-42a9-8784-ed9c295c3bb4-etc-swift\") pod \"swift-storage-0\" (UID: \"6c773ec2-a400-42a9-8784-ed9c295c3bb4\") " pod="openstack/swift-storage-0" Feb 19 21:45:40 crc kubenswrapper[4795]: I0219 21:45:40.189775 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Feb 19 21:45:40 crc kubenswrapper[4795]: I0219 21:45:40.201934 4795 generic.go:334] "Generic (PLEG): container finished" podID="5b15ba11-a170-4fac-bac1-15ecf9de7379" containerID="27f5e555e30e338ab9e5ae7facd6ba963cf6336bbb4bec245c099c4afa8bb6a3" exitCode=0 Feb 19 21:45:40 crc kubenswrapper[4795]: I0219 21:45:40.202035 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-jr6xc" event={"ID":"5b15ba11-a170-4fac-bac1-15ecf9de7379","Type":"ContainerDied","Data":"27f5e555e30e338ab9e5ae7facd6ba963cf6336bbb4bec245c099c4afa8bb6a3"} Feb 19 21:45:40 crc kubenswrapper[4795]: I0219 21:45:40.203768 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f769-account-create-update-25m5x" event={"ID":"890a044b-0060-4feb-866b-9a9e80bfa706","Type":"ContainerDied","Data":"62a552ba7e9d49668b066ddc893227489242cacf8fb293f6b71e0d3a5c13b2b2"} Feb 19 21:45:40 crc kubenswrapper[4795]: I0219 21:45:40.203806 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62a552ba7e9d49668b066ddc893227489242cacf8fb293f6b71e0d3a5c13b2b2" Feb 19 21:45:40 crc kubenswrapper[4795]: I0219 21:45:40.203806 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-f769-account-create-update-25m5x" Feb 19 21:45:40 crc kubenswrapper[4795]: I0219 21:45:40.205019 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-7ktnd" event={"ID":"a2c0e289-4e3b-4b5a-93db-d38621a870ec","Type":"ContainerDied","Data":"f4a0b272510dcb48f64a7a1f3dfc756614d3c0e18ddd17eec3be36dac261739e"} Feb 19 21:45:40 crc kubenswrapper[4795]: I0219 21:45:40.205037 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4a0b272510dcb48f64a7a1f3dfc756614d3c0e18ddd17eec3be36dac261739e" Feb 19 21:45:40 crc kubenswrapper[4795]: I0219 21:45:40.205083 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-7ktnd" Feb 19 21:45:40 crc kubenswrapper[4795]: I0219 21:45:40.810316 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 19 21:45:40 crc kubenswrapper[4795]: W0219 21:45:40.812530 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c773ec2_a400_42a9_8784_ed9c295c3bb4.slice/crio-5f46001ee34a633ce141975121f3de2f61c9a83244b699ec2a130f6af5efbc36 WatchSource:0}: Error finding container 5f46001ee34a633ce141975121f3de2f61c9a83244b699ec2a130f6af5efbc36: Status 404 returned error can't find the container with id 5f46001ee34a633ce141975121f3de2f61c9a83244b699ec2a130f6af5efbc36 Feb 19 21:45:41 crc kubenswrapper[4795]: I0219 21:45:41.213235 4795 generic.go:334] "Generic (PLEG): container finished" podID="7b096325-542d-4ac6-8d16-8aa0937013b2" containerID="90e58bd420672619d861e1841541403fdbfcac1af5a896cc071e58d6af65cf55" exitCode=0 Feb 19 21:45:41 crc kubenswrapper[4795]: I0219 21:45:41.213325 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"7b096325-542d-4ac6-8d16-8aa0937013b2","Type":"ContainerDied","Data":"90e58bd420672619d861e1841541403fdbfcac1af5a896cc071e58d6af65cf55"} Feb 19 21:45:41 crc kubenswrapper[4795]: I0219 21:45:41.214553 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6c773ec2-a400-42a9-8784-ed9c295c3bb4","Type":"ContainerStarted","Data":"5f46001ee34a633ce141975121f3de2f61c9a83244b699ec2a130f6af5efbc36"} Feb 19 21:45:41 crc kubenswrapper[4795]: I0219 21:45:41.597134 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-2wbff"] Feb 19 21:45:41 crc kubenswrapper[4795]: E0219 21:45:41.597862 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="890a044b-0060-4feb-866b-9a9e80bfa706" containerName="mariadb-account-create-update" Feb 19 21:45:41 crc kubenswrapper[4795]: I0219 21:45:41.597878 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="890a044b-0060-4feb-866b-9a9e80bfa706" containerName="mariadb-account-create-update" Feb 19 21:45:41 crc kubenswrapper[4795]: E0219 21:45:41.597896 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2c0e289-4e3b-4b5a-93db-d38621a870ec" containerName="mariadb-database-create" Feb 19 21:45:41 crc kubenswrapper[4795]: I0219 21:45:41.597904 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2c0e289-4e3b-4b5a-93db-d38621a870ec" containerName="mariadb-database-create" Feb 19 21:45:41 crc kubenswrapper[4795]: E0219 21:45:41.597917 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8945f31-b1d9-4c65-9f8c-2619f87d4237" containerName="swift-ring-rebalance" Feb 19 21:45:41 crc kubenswrapper[4795]: I0219 21:45:41.597923 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8945f31-b1d9-4c65-9f8c-2619f87d4237" containerName="swift-ring-rebalance" Feb 19 21:45:41 crc kubenswrapper[4795]: I0219 21:45:41.598075 4795 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="a2c0e289-4e3b-4b5a-93db-d38621a870ec" containerName="mariadb-database-create" Feb 19 21:45:41 crc kubenswrapper[4795]: I0219 21:45:41.598087 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8945f31-b1d9-4c65-9f8c-2619f87d4237" containerName="swift-ring-rebalance" Feb 19 21:45:41 crc kubenswrapper[4795]: I0219 21:45:41.598102 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="890a044b-0060-4feb-866b-9a9e80bfa706" containerName="mariadb-account-create-update" Feb 19 21:45:41 crc kubenswrapper[4795]: I0219 21:45:41.598597 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-2wbff" Feb 19 21:45:41 crc kubenswrapper[4795]: I0219 21:45:41.603177 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Feb 19 21:45:41 crc kubenswrapper[4795]: I0219 21:45:41.603439 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-6rhfn" Feb 19 21:45:41 crc kubenswrapper[4795]: I0219 21:45:41.616739 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-2wbff"] Feb 19 21:45:41 crc kubenswrapper[4795]: I0219 21:45:41.757797 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-jr6xc" Feb 19 21:45:41 crc kubenswrapper[4795]: I0219 21:45:41.758940 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8-db-sync-config-data\") pod \"glance-db-sync-2wbff\" (UID: \"ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8\") " pod="openstack/glance-db-sync-2wbff" Feb 19 21:45:41 crc kubenswrapper[4795]: I0219 21:45:41.758964 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8-combined-ca-bundle\") pod \"glance-db-sync-2wbff\" (UID: \"ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8\") " pod="openstack/glance-db-sync-2wbff" Feb 19 21:45:41 crc kubenswrapper[4795]: I0219 21:45:41.758988 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8-config-data\") pod \"glance-db-sync-2wbff\" (UID: \"ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8\") " pod="openstack/glance-db-sync-2wbff" Feb 19 21:45:41 crc kubenswrapper[4795]: I0219 21:45:41.759019 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6q9z2\" (UniqueName: \"kubernetes.io/projected/ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8-kube-api-access-6q9z2\") pod \"glance-db-sync-2wbff\" (UID: \"ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8\") " pod="openstack/glance-db-sync-2wbff" Feb 19 21:45:41 crc kubenswrapper[4795]: I0219 21:45:41.860682 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b15ba11-a170-4fac-bac1-15ecf9de7379-operator-scripts\") pod \"5b15ba11-a170-4fac-bac1-15ecf9de7379\" (UID: 
\"5b15ba11-a170-4fac-bac1-15ecf9de7379\") " Feb 19 21:45:41 crc kubenswrapper[4795]: I0219 21:45:41.860778 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7dvbp\" (UniqueName: \"kubernetes.io/projected/5b15ba11-a170-4fac-bac1-15ecf9de7379-kube-api-access-7dvbp\") pod \"5b15ba11-a170-4fac-bac1-15ecf9de7379\" (UID: \"5b15ba11-a170-4fac-bac1-15ecf9de7379\") " Feb 19 21:45:41 crc kubenswrapper[4795]: I0219 21:45:41.861074 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8-combined-ca-bundle\") pod \"glance-db-sync-2wbff\" (UID: \"ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8\") " pod="openstack/glance-db-sync-2wbff" Feb 19 21:45:41 crc kubenswrapper[4795]: I0219 21:45:41.861093 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8-db-sync-config-data\") pod \"glance-db-sync-2wbff\" (UID: \"ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8\") " pod="openstack/glance-db-sync-2wbff" Feb 19 21:45:41 crc kubenswrapper[4795]: I0219 21:45:41.861116 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8-config-data\") pod \"glance-db-sync-2wbff\" (UID: \"ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8\") " pod="openstack/glance-db-sync-2wbff" Feb 19 21:45:41 crc kubenswrapper[4795]: I0219 21:45:41.861190 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6q9z2\" (UniqueName: \"kubernetes.io/projected/ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8-kube-api-access-6q9z2\") pod \"glance-db-sync-2wbff\" (UID: \"ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8\") " pod="openstack/glance-db-sync-2wbff" Feb 19 21:45:41 crc kubenswrapper[4795]: I0219 21:45:41.861715 4795 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b15ba11-a170-4fac-bac1-15ecf9de7379-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5b15ba11-a170-4fac-bac1-15ecf9de7379" (UID: "5b15ba11-a170-4fac-bac1-15ecf9de7379"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:45:41 crc kubenswrapper[4795]: I0219 21:45:41.865527 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8-config-data\") pod \"glance-db-sync-2wbff\" (UID: \"ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8\") " pod="openstack/glance-db-sync-2wbff" Feb 19 21:45:41 crc kubenswrapper[4795]: I0219 21:45:41.865989 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8-combined-ca-bundle\") pod \"glance-db-sync-2wbff\" (UID: \"ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8\") " pod="openstack/glance-db-sync-2wbff" Feb 19 21:45:41 crc kubenswrapper[4795]: I0219 21:45:41.866353 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b15ba11-a170-4fac-bac1-15ecf9de7379-kube-api-access-7dvbp" (OuterVolumeSpecName: "kube-api-access-7dvbp") pod "5b15ba11-a170-4fac-bac1-15ecf9de7379" (UID: "5b15ba11-a170-4fac-bac1-15ecf9de7379"). InnerVolumeSpecName "kube-api-access-7dvbp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:45:41 crc kubenswrapper[4795]: I0219 21:45:41.875070 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8-db-sync-config-data\") pod \"glance-db-sync-2wbff\" (UID: \"ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8\") " pod="openstack/glance-db-sync-2wbff" Feb 19 21:45:41 crc kubenswrapper[4795]: I0219 21:45:41.890804 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6q9z2\" (UniqueName: \"kubernetes.io/projected/ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8-kube-api-access-6q9z2\") pod \"glance-db-sync-2wbff\" (UID: \"ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8\") " pod="openstack/glance-db-sync-2wbff" Feb 19 21:45:41 crc kubenswrapper[4795]: I0219 21:45:41.918494 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-2wbff" Feb 19 21:45:41 crc kubenswrapper[4795]: I0219 21:45:41.963865 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b15ba11-a170-4fac-bac1-15ecf9de7379-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:41 crc kubenswrapper[4795]: I0219 21:45:41.963903 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7dvbp\" (UniqueName: \"kubernetes.io/projected/5b15ba11-a170-4fac-bac1-15ecf9de7379-kube-api-access-7dvbp\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:42 crc kubenswrapper[4795]: I0219 21:45:42.226807 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7b096325-542d-4ac6-8d16-8aa0937013b2","Type":"ContainerStarted","Data":"65d80d81718a8cacc6c1500df8f269c37262bad204c9b2124ccbee151d27a66a"} Feb 19 21:45:42 crc kubenswrapper[4795]: I0219 21:45:42.227023 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/rabbitmq-server-0" Feb 19 21:45:42 crc kubenswrapper[4795]: I0219 21:45:42.229777 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-jr6xc" event={"ID":"5b15ba11-a170-4fac-bac1-15ecf9de7379","Type":"ContainerDied","Data":"31f177d86dd9cbd251e0070c154b15ebc1a97c0cad3ad1618ee4fd7eb8b37ce3"} Feb 19 21:45:42 crc kubenswrapper[4795]: I0219 21:45:42.230048 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31f177d86dd9cbd251e0070c154b15ebc1a97c0cad3ad1618ee4fd7eb8b37ce3" Feb 19 21:45:42 crc kubenswrapper[4795]: I0219 21:45:42.229870 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-jr6xc" Feb 19 21:45:42 crc kubenswrapper[4795]: I0219 21:45:42.259605 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=-9223371970.595188 podStartE2EDuration="1m6.259588432s" podCreationTimestamp="2026-02-19 21:44:36 +0000 UTC" firstStartedPulling="2026-02-19 21:44:37.978295277 +0000 UTC m=+989.170813141" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:45:42.252293187 +0000 UTC m=+1053.444811051" watchObservedRunningTime="2026-02-19 21:45:42.259588432 +0000 UTC m=+1053.452106296" Feb 19 21:45:42 crc kubenswrapper[4795]: I0219 21:45:42.556745 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-2wbff"] Feb 19 21:45:42 crc kubenswrapper[4795]: I0219 21:45:42.646221 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-w9fbs" podUID="c30e8522-2d7f-4f10-a0b4-a7cfc351d093" containerName="ovn-controller" probeResult="failure" output=< Feb 19 21:45:42 crc kubenswrapper[4795]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Feb 19 21:45:42 crc kubenswrapper[4795]: > Feb 19 21:45:42 crc kubenswrapper[4795]: I0219 
21:45:42.685194 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-tl5hf" Feb 19 21:45:42 crc kubenswrapper[4795]: I0219 21:45:42.695567 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-tl5hf" Feb 19 21:45:42 crc kubenswrapper[4795]: I0219 21:45:42.923637 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-w9fbs-config-d9rr5"] Feb 19 21:45:42 crc kubenswrapper[4795]: E0219 21:45:42.924363 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b15ba11-a170-4fac-bac1-15ecf9de7379" containerName="mariadb-account-create-update" Feb 19 21:45:42 crc kubenswrapper[4795]: I0219 21:45:42.924376 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b15ba11-a170-4fac-bac1-15ecf9de7379" containerName="mariadb-account-create-update" Feb 19 21:45:42 crc kubenswrapper[4795]: I0219 21:45:42.924535 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b15ba11-a170-4fac-bac1-15ecf9de7379" containerName="mariadb-account-create-update" Feb 19 21:45:42 crc kubenswrapper[4795]: I0219 21:45:42.925042 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-w9fbs-config-d9rr5" Feb 19 21:45:42 crc kubenswrapper[4795]: I0219 21:45:42.928573 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Feb 19 21:45:42 crc kubenswrapper[4795]: I0219 21:45:42.931661 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-w9fbs-config-d9rr5"] Feb 19 21:45:43 crc kubenswrapper[4795]: I0219 21:45:43.081802 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd-var-run-ovn\") pod \"ovn-controller-w9fbs-config-d9rr5\" (UID: \"478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd\") " pod="openstack/ovn-controller-w9fbs-config-d9rr5" Feb 19 21:45:43 crc kubenswrapper[4795]: I0219 21:45:43.081863 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd-additional-scripts\") pod \"ovn-controller-w9fbs-config-d9rr5\" (UID: \"478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd\") " pod="openstack/ovn-controller-w9fbs-config-d9rr5" Feb 19 21:45:43 crc kubenswrapper[4795]: I0219 21:45:43.081891 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd-var-run\") pod \"ovn-controller-w9fbs-config-d9rr5\" (UID: \"478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd\") " pod="openstack/ovn-controller-w9fbs-config-d9rr5" Feb 19 21:45:43 crc kubenswrapper[4795]: I0219 21:45:43.082073 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd-scripts\") pod \"ovn-controller-w9fbs-config-d9rr5\" (UID: 
\"478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd\") " pod="openstack/ovn-controller-w9fbs-config-d9rr5" Feb 19 21:45:43 crc kubenswrapper[4795]: I0219 21:45:43.082155 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd-var-log-ovn\") pod \"ovn-controller-w9fbs-config-d9rr5\" (UID: \"478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd\") " pod="openstack/ovn-controller-w9fbs-config-d9rr5" Feb 19 21:45:43 crc kubenswrapper[4795]: I0219 21:45:43.082266 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4d8gs\" (UniqueName: \"kubernetes.io/projected/478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd-kube-api-access-4d8gs\") pod \"ovn-controller-w9fbs-config-d9rr5\" (UID: \"478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd\") " pod="openstack/ovn-controller-w9fbs-config-d9rr5" Feb 19 21:45:43 crc kubenswrapper[4795]: I0219 21:45:43.184271 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4d8gs\" (UniqueName: \"kubernetes.io/projected/478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd-kube-api-access-4d8gs\") pod \"ovn-controller-w9fbs-config-d9rr5\" (UID: \"478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd\") " pod="openstack/ovn-controller-w9fbs-config-d9rr5" Feb 19 21:45:43 crc kubenswrapper[4795]: I0219 21:45:43.184363 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd-var-run-ovn\") pod \"ovn-controller-w9fbs-config-d9rr5\" (UID: \"478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd\") " pod="openstack/ovn-controller-w9fbs-config-d9rr5" Feb 19 21:45:43 crc kubenswrapper[4795]: I0219 21:45:43.184397 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: 
\"kubernetes.io/configmap/478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd-additional-scripts\") pod \"ovn-controller-w9fbs-config-d9rr5\" (UID: \"478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd\") " pod="openstack/ovn-controller-w9fbs-config-d9rr5" Feb 19 21:45:43 crc kubenswrapper[4795]: I0219 21:45:43.184439 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd-var-run\") pod \"ovn-controller-w9fbs-config-d9rr5\" (UID: \"478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd\") " pod="openstack/ovn-controller-w9fbs-config-d9rr5" Feb 19 21:45:43 crc kubenswrapper[4795]: I0219 21:45:43.184478 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd-scripts\") pod \"ovn-controller-w9fbs-config-d9rr5\" (UID: \"478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd\") " pod="openstack/ovn-controller-w9fbs-config-d9rr5" Feb 19 21:45:43 crc kubenswrapper[4795]: I0219 21:45:43.184508 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd-var-log-ovn\") pod \"ovn-controller-w9fbs-config-d9rr5\" (UID: \"478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd\") " pod="openstack/ovn-controller-w9fbs-config-d9rr5" Feb 19 21:45:43 crc kubenswrapper[4795]: I0219 21:45:43.184830 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd-var-log-ovn\") pod \"ovn-controller-w9fbs-config-d9rr5\" (UID: \"478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd\") " pod="openstack/ovn-controller-w9fbs-config-d9rr5" Feb 19 21:45:43 crc kubenswrapper[4795]: I0219 21:45:43.184831 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd-var-run-ovn\") pod \"ovn-controller-w9fbs-config-d9rr5\" (UID: \"478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd\") " pod="openstack/ovn-controller-w9fbs-config-d9rr5" Feb 19 21:45:43 crc kubenswrapper[4795]: I0219 21:45:43.184874 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd-var-run\") pod \"ovn-controller-w9fbs-config-d9rr5\" (UID: \"478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd\") " pod="openstack/ovn-controller-w9fbs-config-d9rr5" Feb 19 21:45:43 crc kubenswrapper[4795]: I0219 21:45:43.185455 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd-additional-scripts\") pod \"ovn-controller-w9fbs-config-d9rr5\" (UID: \"478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd\") " pod="openstack/ovn-controller-w9fbs-config-d9rr5" Feb 19 21:45:43 crc kubenswrapper[4795]: I0219 21:45:43.186364 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd-scripts\") pod \"ovn-controller-w9fbs-config-d9rr5\" (UID: \"478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd\") " pod="openstack/ovn-controller-w9fbs-config-d9rr5" Feb 19 21:45:43 crc kubenswrapper[4795]: I0219 21:45:43.204719 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4d8gs\" (UniqueName: \"kubernetes.io/projected/478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd-kube-api-access-4d8gs\") pod \"ovn-controller-w9fbs-config-d9rr5\" (UID: \"478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd\") " pod="openstack/ovn-controller-w9fbs-config-d9rr5" Feb 19 21:45:43 crc kubenswrapper[4795]: I0219 21:45:43.245105 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"6c773ec2-a400-42a9-8784-ed9c295c3bb4","Type":"ContainerStarted","Data":"51c55baa52f08bcb95276c0f7a67a7ef348b9bd02a9fc401f50e679f37e0c117"} Feb 19 21:45:43 crc kubenswrapper[4795]: I0219 21:45:43.246906 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-2wbff" event={"ID":"ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8","Type":"ContainerStarted","Data":"256eb7cf5acf7b851870ab83b4723dabaee1fbaa60ee04f8ea61103c387af2ad"} Feb 19 21:45:43 crc kubenswrapper[4795]: I0219 21:45:43.329095 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-w9fbs-config-d9rr5" Feb 19 21:45:43 crc kubenswrapper[4795]: I0219 21:45:43.819633 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-w9fbs-config-d9rr5"] Feb 19 21:45:44 crc kubenswrapper[4795]: I0219 21:45:44.262717 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-w9fbs-config-d9rr5" event={"ID":"478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd","Type":"ContainerStarted","Data":"7c32d427d23010f8ec7644388fc5c6ced9ee00bbf0a6575354abad70dc15498d"} Feb 19 21:45:44 crc kubenswrapper[4795]: I0219 21:45:44.263009 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-w9fbs-config-d9rr5" event={"ID":"478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd","Type":"ContainerStarted","Data":"65876dc5d7abd6c76f19edc3979a74c5098fbab43150468cdad7e42ab8b1ff48"} Feb 19 21:45:44 crc kubenswrapper[4795]: I0219 21:45:44.265892 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6c773ec2-a400-42a9-8784-ed9c295c3bb4","Type":"ContainerStarted","Data":"b4b0474dd3a4192273fa0b2dc273a792e955cd0dde33e24a1afa65bb56656eaa"} Feb 19 21:45:44 crc kubenswrapper[4795]: I0219 21:45:44.265934 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"6c773ec2-a400-42a9-8784-ed9c295c3bb4","Type":"ContainerStarted","Data":"d2281de4777acfa86c800d06ed4c2e0ac8613cf4008b8449cd7089d057ee51ec"} Feb 19 21:45:44 crc kubenswrapper[4795]: I0219 21:45:44.265944 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6c773ec2-a400-42a9-8784-ed9c295c3bb4","Type":"ContainerStarted","Data":"22441008e17545864d7c6366d4ab4fa8333a1c04e36b9961e8fdfdfaeec8b1b6"} Feb 19 21:45:44 crc kubenswrapper[4795]: I0219 21:45:44.282312 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-w9fbs-config-d9rr5" podStartSLOduration=2.282298148 podStartE2EDuration="2.282298148s" podCreationTimestamp="2026-02-19 21:45:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:45:44.281962818 +0000 UTC m=+1055.474480672" watchObservedRunningTime="2026-02-19 21:45:44.282298148 +0000 UTC m=+1055.474816002" Feb 19 21:45:45 crc kubenswrapper[4795]: I0219 21:45:45.279123 4795 generic.go:334] "Generic (PLEG): container finished" podID="478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd" containerID="7c32d427d23010f8ec7644388fc5c6ced9ee00bbf0a6575354abad70dc15498d" exitCode=0 Feb 19 21:45:45 crc kubenswrapper[4795]: I0219 21:45:45.279376 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-w9fbs-config-d9rr5" event={"ID":"478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd","Type":"ContainerDied","Data":"7c32d427d23010f8ec7644388fc5c6ced9ee00bbf0a6575354abad70dc15498d"} Feb 19 21:45:46 crc kubenswrapper[4795]: I0219 21:45:46.289460 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6c773ec2-a400-42a9-8784-ed9c295c3bb4","Type":"ContainerStarted","Data":"f2b55f40d9ab92e06fdc09f65e72764f7f9c63fbb1f126ede2058624236d001f"} Feb 19 21:45:46 crc kubenswrapper[4795]: I0219 21:45:46.289773 4795 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6c773ec2-a400-42a9-8784-ed9c295c3bb4","Type":"ContainerStarted","Data":"5019fca913d092cd1d004e058553c364ac08007bafeae027b681e3bb6eb59026"} Feb 19 21:45:46 crc kubenswrapper[4795]: I0219 21:45:46.289784 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6c773ec2-a400-42a9-8784-ed9c295c3bb4","Type":"ContainerStarted","Data":"be1c394688c447ed772b4929317159025a1e97491b40b847644ed369351532b5"} Feb 19 21:45:46 crc kubenswrapper[4795]: I0219 21:45:46.289795 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6c773ec2-a400-42a9-8784-ed9c295c3bb4","Type":"ContainerStarted","Data":"a86db1a02ddab5086097179b35e6f17d71c32f36157625ee70912b23839603d4"} Feb 19 21:45:46 crc kubenswrapper[4795]: I0219 21:45:46.617705 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-w9fbs-config-d9rr5" Feb 19 21:45:46 crc kubenswrapper[4795]: I0219 21:45:46.762969 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d8gs\" (UniqueName: \"kubernetes.io/projected/478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd-kube-api-access-4d8gs\") pod \"478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd\" (UID: \"478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd\") " Feb 19 21:45:46 crc kubenswrapper[4795]: I0219 21:45:46.763108 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd-var-run\") pod \"478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd\" (UID: \"478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd\") " Feb 19 21:45:46 crc kubenswrapper[4795]: I0219 21:45:46.763204 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd-var-run-ovn\") pod 
\"478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd\" (UID: \"478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd\") " Feb 19 21:45:46 crc kubenswrapper[4795]: I0219 21:45:46.763224 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd-var-run" (OuterVolumeSpecName: "var-run") pod "478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd" (UID: "478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:45:46 crc kubenswrapper[4795]: I0219 21:45:46.763254 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd-scripts\") pod \"478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd\" (UID: \"478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd\") " Feb 19 21:45:46 crc kubenswrapper[4795]: I0219 21:45:46.763326 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd" (UID: "478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:45:46 crc kubenswrapper[4795]: I0219 21:45:46.763377 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd-additional-scripts\") pod \"478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd\" (UID: \"478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd\") " Feb 19 21:45:46 crc kubenswrapper[4795]: I0219 21:45:46.763433 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd-var-log-ovn\") pod \"478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd\" (UID: \"478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd\") " Feb 19 21:45:46 crc kubenswrapper[4795]: I0219 21:45:46.763541 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd" (UID: "478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:45:46 crc kubenswrapper[4795]: I0219 21:45:46.763911 4795 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd-var-log-ovn\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:46 crc kubenswrapper[4795]: I0219 21:45:46.763929 4795 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd-var-run\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:46 crc kubenswrapper[4795]: I0219 21:45:46.763941 4795 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd-var-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:46 crc kubenswrapper[4795]: I0219 21:45:46.763958 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd" (UID: "478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:45:46 crc kubenswrapper[4795]: I0219 21:45:46.764135 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd-scripts" (OuterVolumeSpecName: "scripts") pod "478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd" (UID: "478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:45:46 crc kubenswrapper[4795]: I0219 21:45:46.768688 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd-kube-api-access-4d8gs" (OuterVolumeSpecName: "kube-api-access-4d8gs") pod "478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd" (UID: "478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd"). InnerVolumeSpecName "kube-api-access-4d8gs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:45:46 crc kubenswrapper[4795]: I0219 21:45:46.865653 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:46 crc kubenswrapper[4795]: I0219 21:45:46.865679 4795 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd-additional-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:46 crc kubenswrapper[4795]: I0219 21:45:46.865690 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d8gs\" (UniqueName: \"kubernetes.io/projected/478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd-kube-api-access-4d8gs\") on node \"crc\" DevicePath \"\"" Feb 19 21:45:47 crc kubenswrapper[4795]: I0219 21:45:47.301207 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-w9fbs-config-d9rr5" event={"ID":"478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd","Type":"ContainerDied","Data":"65876dc5d7abd6c76f19edc3979a74c5098fbab43150468cdad7e42ab8b1ff48"} Feb 19 21:45:47 crc kubenswrapper[4795]: I0219 21:45:47.301583 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="65876dc5d7abd6c76f19edc3979a74c5098fbab43150468cdad7e42ab8b1ff48" Feb 19 21:45:47 crc kubenswrapper[4795]: I0219 21:45:47.301662 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-w9fbs-config-d9rr5" Feb 19 21:45:47 crc kubenswrapper[4795]: I0219 21:45:47.315704 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6c773ec2-a400-42a9-8784-ed9c295c3bb4","Type":"ContainerStarted","Data":"c8d438309dbe4ee742eb9d7a2b93e755a74c9ba2dd39409dcf7caf84dee6405a"} Feb 19 21:45:47 crc kubenswrapper[4795]: I0219 21:45:47.315742 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6c773ec2-a400-42a9-8784-ed9c295c3bb4","Type":"ContainerStarted","Data":"c5d0640985105b2140d43ecf956f5621f6e82eb5a3d40d95fb3d09d303406c84"} Feb 19 21:45:47 crc kubenswrapper[4795]: I0219 21:45:47.405877 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-w9fbs-config-d9rr5"] Feb 19 21:45:47 crc kubenswrapper[4795]: I0219 21:45:47.415472 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-w9fbs-config-d9rr5"] Feb 19 21:45:47 crc kubenswrapper[4795]: I0219 21:45:47.521700 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd" path="/var/lib/kubelet/pods/478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd/volumes" Feb 19 21:45:47 crc kubenswrapper[4795]: I0219 21:45:47.653938 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-w9fbs" Feb 19 21:45:47 crc kubenswrapper[4795]: I0219 21:45:47.936351 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:45:48 crc kubenswrapper[4795]: I0219 21:45:48.333107 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6c773ec2-a400-42a9-8784-ed9c295c3bb4","Type":"ContainerStarted","Data":"955740cbd5ac4eda735378957980240001d5c0ce0905f2fca18b4155f3fb6c98"} Feb 19 21:45:48 crc kubenswrapper[4795]: I0219 21:45:48.333667 4795 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6c773ec2-a400-42a9-8784-ed9c295c3bb4","Type":"ContainerStarted","Data":"bb960073c3b2955d7aa2d18d3eb2e0958e7e98f4cd499d7077f5064d1e43a05e"} Feb 19 21:45:48 crc kubenswrapper[4795]: I0219 21:45:48.333681 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6c773ec2-a400-42a9-8784-ed9c295c3bb4","Type":"ContainerStarted","Data":"8d3a569ab5140e595996d2c82fd170ed28aa9420de4fdae36e9b5854b2e0bd5e"} Feb 19 21:45:48 crc kubenswrapper[4795]: I0219 21:45:48.333691 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6c773ec2-a400-42a9-8784-ed9c295c3bb4","Type":"ContainerStarted","Data":"2ce7f7a343a6c79fd57a6b4c7cea8f6f21ccfbc5ea5261b4c592c5cc2035910e"} Feb 19 21:45:48 crc kubenswrapper[4795]: I0219 21:45:48.333703 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6c773ec2-a400-42a9-8784-ed9c295c3bb4","Type":"ContainerStarted","Data":"cb0176a835c07bb843ea9834f19b5792b6d9700c5cd61a140ec8b99a66854f5f"} Feb 19 21:45:48 crc kubenswrapper[4795]: I0219 21:45:48.669355 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=19.519956086 podStartE2EDuration="25.669335813s" podCreationTimestamp="2026-02-19 21:45:23 +0000 UTC" firstStartedPulling="2026-02-19 21:45:40.815880836 +0000 UTC m=+1052.008398700" lastFinishedPulling="2026-02-19 21:45:46.965260563 +0000 UTC m=+1058.157778427" observedRunningTime="2026-02-19 21:45:48.381026682 +0000 UTC m=+1059.573544546" watchObservedRunningTime="2026-02-19 21:45:48.669335813 +0000 UTC m=+1059.861853677" Feb 19 21:45:48 crc kubenswrapper[4795]: I0219 21:45:48.675508 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-768666cd57-5c55b"] Feb 19 21:45:48 crc kubenswrapper[4795]: E0219 21:45:48.675845 4795 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd" containerName="ovn-config" Feb 19 21:45:48 crc kubenswrapper[4795]: I0219 21:45:48.675861 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd" containerName="ovn-config" Feb 19 21:45:48 crc kubenswrapper[4795]: I0219 21:45:48.676027 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="478e80b5-04c6-4b1a-8c6d-c30ea45ba9fd" containerName="ovn-config" Feb 19 21:45:48 crc kubenswrapper[4795]: I0219 21:45:48.676979 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-768666cd57-5c55b" Feb 19 21:45:48 crc kubenswrapper[4795]: I0219 21:45:48.680357 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Feb 19 21:45:48 crc kubenswrapper[4795]: I0219 21:45:48.687000 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-768666cd57-5c55b"] Feb 19 21:45:48 crc kubenswrapper[4795]: I0219 21:45:48.814528 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2f384824-f8ad-42d9-b09b-decb5280b448-ovsdbserver-nb\") pod \"dnsmasq-dns-768666cd57-5c55b\" (UID: \"2f384824-f8ad-42d9-b09b-decb5280b448\") " pod="openstack/dnsmasq-dns-768666cd57-5c55b" Feb 19 21:45:48 crc kubenswrapper[4795]: I0219 21:45:48.814598 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f384824-f8ad-42d9-b09b-decb5280b448-dns-svc\") pod \"dnsmasq-dns-768666cd57-5c55b\" (UID: \"2f384824-f8ad-42d9-b09b-decb5280b448\") " pod="openstack/dnsmasq-dns-768666cd57-5c55b" Feb 19 21:45:48 crc kubenswrapper[4795]: I0219 21:45:48.814790 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2f384824-f8ad-42d9-b09b-decb5280b448-ovsdbserver-sb\") pod \"dnsmasq-dns-768666cd57-5c55b\" (UID: \"2f384824-f8ad-42d9-b09b-decb5280b448\") " pod="openstack/dnsmasq-dns-768666cd57-5c55b" Feb 19 21:45:48 crc kubenswrapper[4795]: I0219 21:45:48.814900 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w95wm\" (UniqueName: \"kubernetes.io/projected/2f384824-f8ad-42d9-b09b-decb5280b448-kube-api-access-w95wm\") pod \"dnsmasq-dns-768666cd57-5c55b\" (UID: \"2f384824-f8ad-42d9-b09b-decb5280b448\") " pod="openstack/dnsmasq-dns-768666cd57-5c55b" Feb 19 21:45:48 crc kubenswrapper[4795]: I0219 21:45:48.814991 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f384824-f8ad-42d9-b09b-decb5280b448-config\") pod \"dnsmasq-dns-768666cd57-5c55b\" (UID: \"2f384824-f8ad-42d9-b09b-decb5280b448\") " pod="openstack/dnsmasq-dns-768666cd57-5c55b" Feb 19 21:45:48 crc kubenswrapper[4795]: I0219 21:45:48.815025 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2f384824-f8ad-42d9-b09b-decb5280b448-dns-swift-storage-0\") pod \"dnsmasq-dns-768666cd57-5c55b\" (UID: \"2f384824-f8ad-42d9-b09b-decb5280b448\") " pod="openstack/dnsmasq-dns-768666cd57-5c55b" Feb 19 21:45:48 crc kubenswrapper[4795]: I0219 21:45:48.916702 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2f384824-f8ad-42d9-b09b-decb5280b448-ovsdbserver-nb\") pod \"dnsmasq-dns-768666cd57-5c55b\" (UID: \"2f384824-f8ad-42d9-b09b-decb5280b448\") " pod="openstack/dnsmasq-dns-768666cd57-5c55b" Feb 19 21:45:48 crc kubenswrapper[4795]: I0219 21:45:48.916762 4795 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f384824-f8ad-42d9-b09b-decb5280b448-dns-svc\") pod \"dnsmasq-dns-768666cd57-5c55b\" (UID: \"2f384824-f8ad-42d9-b09b-decb5280b448\") " pod="openstack/dnsmasq-dns-768666cd57-5c55b" Feb 19 21:45:48 crc kubenswrapper[4795]: I0219 21:45:48.916788 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2f384824-f8ad-42d9-b09b-decb5280b448-ovsdbserver-sb\") pod \"dnsmasq-dns-768666cd57-5c55b\" (UID: \"2f384824-f8ad-42d9-b09b-decb5280b448\") " pod="openstack/dnsmasq-dns-768666cd57-5c55b" Feb 19 21:45:48 crc kubenswrapper[4795]: I0219 21:45:48.916819 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w95wm\" (UniqueName: \"kubernetes.io/projected/2f384824-f8ad-42d9-b09b-decb5280b448-kube-api-access-w95wm\") pod \"dnsmasq-dns-768666cd57-5c55b\" (UID: \"2f384824-f8ad-42d9-b09b-decb5280b448\") " pod="openstack/dnsmasq-dns-768666cd57-5c55b" Feb 19 21:45:48 crc kubenswrapper[4795]: I0219 21:45:48.916851 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f384824-f8ad-42d9-b09b-decb5280b448-config\") pod \"dnsmasq-dns-768666cd57-5c55b\" (UID: \"2f384824-f8ad-42d9-b09b-decb5280b448\") " pod="openstack/dnsmasq-dns-768666cd57-5c55b" Feb 19 21:45:48 crc kubenswrapper[4795]: I0219 21:45:48.916871 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2f384824-f8ad-42d9-b09b-decb5280b448-dns-swift-storage-0\") pod \"dnsmasq-dns-768666cd57-5c55b\" (UID: \"2f384824-f8ad-42d9-b09b-decb5280b448\") " pod="openstack/dnsmasq-dns-768666cd57-5c55b" Feb 19 21:45:48 crc kubenswrapper[4795]: I0219 21:45:48.917711 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/2f384824-f8ad-42d9-b09b-decb5280b448-dns-swift-storage-0\") pod \"dnsmasq-dns-768666cd57-5c55b\" (UID: \"2f384824-f8ad-42d9-b09b-decb5280b448\") " pod="openstack/dnsmasq-dns-768666cd57-5c55b" Feb 19 21:45:48 crc kubenswrapper[4795]: I0219 21:45:48.918325 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2f384824-f8ad-42d9-b09b-decb5280b448-ovsdbserver-nb\") pod \"dnsmasq-dns-768666cd57-5c55b\" (UID: \"2f384824-f8ad-42d9-b09b-decb5280b448\") " pod="openstack/dnsmasq-dns-768666cd57-5c55b" Feb 19 21:45:48 crc kubenswrapper[4795]: I0219 21:45:48.919503 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f384824-f8ad-42d9-b09b-decb5280b448-dns-svc\") pod \"dnsmasq-dns-768666cd57-5c55b\" (UID: \"2f384824-f8ad-42d9-b09b-decb5280b448\") " pod="openstack/dnsmasq-dns-768666cd57-5c55b" Feb 19 21:45:48 crc kubenswrapper[4795]: I0219 21:45:48.920018 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f384824-f8ad-42d9-b09b-decb5280b448-config\") pod \"dnsmasq-dns-768666cd57-5c55b\" (UID: \"2f384824-f8ad-42d9-b09b-decb5280b448\") " pod="openstack/dnsmasq-dns-768666cd57-5c55b" Feb 19 21:45:48 crc kubenswrapper[4795]: I0219 21:45:48.920110 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2f384824-f8ad-42d9-b09b-decb5280b448-ovsdbserver-sb\") pod \"dnsmasq-dns-768666cd57-5c55b\" (UID: \"2f384824-f8ad-42d9-b09b-decb5280b448\") " pod="openstack/dnsmasq-dns-768666cd57-5c55b" Feb 19 21:45:48 crc kubenswrapper[4795]: I0219 21:45:48.942596 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w95wm\" (UniqueName: \"kubernetes.io/projected/2f384824-f8ad-42d9-b09b-decb5280b448-kube-api-access-w95wm\") pod 
\"dnsmasq-dns-768666cd57-5c55b\" (UID: \"2f384824-f8ad-42d9-b09b-decb5280b448\") " pod="openstack/dnsmasq-dns-768666cd57-5c55b" Feb 19 21:45:49 crc kubenswrapper[4795]: I0219 21:45:49.036755 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-768666cd57-5c55b" Feb 19 21:45:49 crc kubenswrapper[4795]: I0219 21:45:49.479648 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-768666cd57-5c55b"] Feb 19 21:45:57 crc kubenswrapper[4795]: I0219 21:45:57.446587 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Feb 19 21:45:57 crc kubenswrapper[4795]: I0219 21:45:57.772052 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-snb69"] Feb 19 21:45:57 crc kubenswrapper[4795]: I0219 21:45:57.773333 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-snb69" Feb 19 21:45:57 crc kubenswrapper[4795]: I0219 21:45:57.786838 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-snb69"] Feb 19 21:45:57 crc kubenswrapper[4795]: I0219 21:45:57.863606 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-9f51-account-create-update-n57zq"] Feb 19 21:45:57 crc kubenswrapper[4795]: I0219 21:45:57.864717 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-9f51-account-create-update-n57zq" Feb 19 21:45:57 crc kubenswrapper[4795]: I0219 21:45:57.867146 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Feb 19 21:45:57 crc kubenswrapper[4795]: I0219 21:45:57.886292 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-9f51-account-create-update-n57zq"] Feb 19 21:45:57 crc kubenswrapper[4795]: I0219 21:45:57.893062 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88vvr\" (UniqueName: \"kubernetes.io/projected/18561896-d336-4962-8e9e-4ccf748f8605-kube-api-access-88vvr\") pod \"cinder-db-create-snb69\" (UID: \"18561896-d336-4962-8e9e-4ccf748f8605\") " pod="openstack/cinder-db-create-snb69" Feb 19 21:45:57 crc kubenswrapper[4795]: I0219 21:45:57.893221 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/18561896-d336-4962-8e9e-4ccf748f8605-operator-scripts\") pod \"cinder-db-create-snb69\" (UID: \"18561896-d336-4962-8e9e-4ccf748f8605\") " pod="openstack/cinder-db-create-snb69" Feb 19 21:45:57 crc kubenswrapper[4795]: W0219 21:45:57.966992 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2f384824_f8ad_42d9_b09b_decb5280b448.slice/crio-13682a647f5a86053553d890ba06f71e1b6b17acba7850bc7912292cc1d6509d WatchSource:0}: Error finding container 13682a647f5a86053553d890ba06f71e1b6b17acba7850bc7912292cc1d6509d: Status 404 returned error can't find the container with id 13682a647f5a86053553d890ba06f71e1b6b17acba7850bc7912292cc1d6509d Feb 19 21:45:57 crc kubenswrapper[4795]: I0219 21:45:57.968996 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-vlmnn"] Feb 19 21:45:57 crc kubenswrapper[4795]: I0219 21:45:57.970017 4795 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-vlmnn" Feb 19 21:45:57 crc kubenswrapper[4795]: I0219 21:45:57.975283 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-vlmnn"] Feb 19 21:45:57 crc kubenswrapper[4795]: I0219 21:45:57.995793 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88vvr\" (UniqueName: \"kubernetes.io/projected/18561896-d336-4962-8e9e-4ccf748f8605-kube-api-access-88vvr\") pod \"cinder-db-create-snb69\" (UID: \"18561896-d336-4962-8e9e-4ccf748f8605\") " pod="openstack/cinder-db-create-snb69" Feb 19 21:45:57 crc kubenswrapper[4795]: I0219 21:45:57.995865 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d152069-2c3d-4cf4-94e8-3068e24def9f-operator-scripts\") pod \"cinder-9f51-account-create-update-n57zq\" (UID: \"7d152069-2c3d-4cf4-94e8-3068e24def9f\") " pod="openstack/cinder-9f51-account-create-update-n57zq" Feb 19 21:45:57 crc kubenswrapper[4795]: I0219 21:45:57.995916 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v72cs\" (UniqueName: \"kubernetes.io/projected/7d152069-2c3d-4cf4-94e8-3068e24def9f-kube-api-access-v72cs\") pod \"cinder-9f51-account-create-update-n57zq\" (UID: \"7d152069-2c3d-4cf4-94e8-3068e24def9f\") " pod="openstack/cinder-9f51-account-create-update-n57zq" Feb 19 21:45:57 crc kubenswrapper[4795]: I0219 21:45:57.995966 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/18561896-d336-4962-8e9e-4ccf748f8605-operator-scripts\") pod \"cinder-db-create-snb69\" (UID: \"18561896-d336-4962-8e9e-4ccf748f8605\") " pod="openstack/cinder-db-create-snb69" Feb 19 21:45:57 crc kubenswrapper[4795]: I0219 21:45:57.996608 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/18561896-d336-4962-8e9e-4ccf748f8605-operator-scripts\") pod \"cinder-db-create-snb69\" (UID: \"18561896-d336-4962-8e9e-4ccf748f8605\") " pod="openstack/cinder-db-create-snb69" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.017505 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88vvr\" (UniqueName: \"kubernetes.io/projected/18561896-d336-4962-8e9e-4ccf748f8605-kube-api-access-88vvr\") pod \"cinder-db-create-snb69\" (UID: \"18561896-d336-4962-8e9e-4ccf748f8605\") " pod="openstack/cinder-db-create-snb69" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.075679 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-d602-account-create-update-mc6fv"] Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.076667 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-d602-account-create-update-mc6fv" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.078529 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.091279 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-d602-account-create-update-mc6fv"] Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.091484 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-snb69" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.097418 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hxjx\" (UniqueName: \"kubernetes.io/projected/73f01f44-1467-442f-b91f-ac1765626a3d-kube-api-access-9hxjx\") pod \"barbican-db-create-vlmnn\" (UID: \"73f01f44-1467-442f-b91f-ac1765626a3d\") " pod="openstack/barbican-db-create-vlmnn" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.097576 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d152069-2c3d-4cf4-94e8-3068e24def9f-operator-scripts\") pod \"cinder-9f51-account-create-update-n57zq\" (UID: \"7d152069-2c3d-4cf4-94e8-3068e24def9f\") " pod="openstack/cinder-9f51-account-create-update-n57zq" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.097648 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v72cs\" (UniqueName: \"kubernetes.io/projected/7d152069-2c3d-4cf4-94e8-3068e24def9f-kube-api-access-v72cs\") pod \"cinder-9f51-account-create-update-n57zq\" (UID: \"7d152069-2c3d-4cf4-94e8-3068e24def9f\") " pod="openstack/cinder-9f51-account-create-update-n57zq" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.097687 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73f01f44-1467-442f-b91f-ac1765626a3d-operator-scripts\") pod \"barbican-db-create-vlmnn\" (UID: \"73f01f44-1467-442f-b91f-ac1765626a3d\") " pod="openstack/barbican-db-create-vlmnn" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.098460 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d152069-2c3d-4cf4-94e8-3068e24def9f-operator-scripts\") pod 
\"cinder-9f51-account-create-update-n57zq\" (UID: \"7d152069-2c3d-4cf4-94e8-3068e24def9f\") " pod="openstack/cinder-9f51-account-create-update-n57zq" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.116292 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v72cs\" (UniqueName: \"kubernetes.io/projected/7d152069-2c3d-4cf4-94e8-3068e24def9f-kube-api-access-v72cs\") pod \"cinder-9f51-account-create-update-n57zq\" (UID: \"7d152069-2c3d-4cf4-94e8-3068e24def9f\") " pod="openstack/cinder-9f51-account-create-update-n57zq" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.151355 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-dxql7"] Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.152248 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-dxql7" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.157889 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.158259 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.158630 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-8pz2x" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.165388 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.182519 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-9f51-account-create-update-n57zq" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.190917 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-dxql7"] Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.199273 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hxjx\" (UniqueName: \"kubernetes.io/projected/73f01f44-1467-442f-b91f-ac1765626a3d-kube-api-access-9hxjx\") pod \"barbican-db-create-vlmnn\" (UID: \"73f01f44-1467-442f-b91f-ac1765626a3d\") " pod="openstack/barbican-db-create-vlmnn" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.199325 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vf27b\" (UniqueName: \"kubernetes.io/projected/cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23-kube-api-access-vf27b\") pod \"keystone-db-sync-dxql7\" (UID: \"cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23\") " pod="openstack/keystone-db-sync-dxql7" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.199369 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23-config-data\") pod \"keystone-db-sync-dxql7\" (UID: \"cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23\") " pod="openstack/keystone-db-sync-dxql7" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.199387 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/57961551-d4f8-4586-b255-8810fbdb499a-operator-scripts\") pod \"barbican-d602-account-create-update-mc6fv\" (UID: \"57961551-d4f8-4586-b255-8810fbdb499a\") " pod="openstack/barbican-d602-account-create-update-mc6fv" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.199433 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kkx2\" (UniqueName: \"kubernetes.io/projected/57961551-d4f8-4586-b255-8810fbdb499a-kube-api-access-4kkx2\") pod \"barbican-d602-account-create-update-mc6fv\" (UID: \"57961551-d4f8-4586-b255-8810fbdb499a\") " pod="openstack/barbican-d602-account-create-update-mc6fv" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.199591 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23-combined-ca-bundle\") pod \"keystone-db-sync-dxql7\" (UID: \"cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23\") " pod="openstack/keystone-db-sync-dxql7" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.199728 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73f01f44-1467-442f-b91f-ac1765626a3d-operator-scripts\") pod \"barbican-db-create-vlmnn\" (UID: \"73f01f44-1467-442f-b91f-ac1765626a3d\") " pod="openstack/barbican-db-create-vlmnn" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.200521 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73f01f44-1467-442f-b91f-ac1765626a3d-operator-scripts\") pod \"barbican-db-create-vlmnn\" (UID: \"73f01f44-1467-442f-b91f-ac1765626a3d\") " pod="openstack/barbican-db-create-vlmnn" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.210997 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-8jt8c"] Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.212020 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-8jt8c" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.226673 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hxjx\" (UniqueName: \"kubernetes.io/projected/73f01f44-1467-442f-b91f-ac1765626a3d-kube-api-access-9hxjx\") pod \"barbican-db-create-vlmnn\" (UID: \"73f01f44-1467-442f-b91f-ac1765626a3d\") " pod="openstack/barbican-db-create-vlmnn" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.251182 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-8jt8c"] Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.295517 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7c93-account-create-update-ptrqq"] Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.298064 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7c93-account-create-update-ptrqq" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.301131 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23-combined-ca-bundle\") pod \"keystone-db-sync-dxql7\" (UID: \"cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23\") " pod="openstack/keystone-db-sync-dxql7" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.301208 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b9165f4-18c2-4003-a3e6-2ed6e0fbdd3c-operator-scripts\") pod \"neutron-db-create-8jt8c\" (UID: \"6b9165f4-18c2-4003-a3e6-2ed6e0fbdd3c\") " pod="openstack/neutron-db-create-8jt8c" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.301293 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vf27b\" (UniqueName: 
\"kubernetes.io/projected/cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23-kube-api-access-vf27b\") pod \"keystone-db-sync-dxql7\" (UID: \"cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23\") " pod="openstack/keystone-db-sync-dxql7" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.301324 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scxqb\" (UniqueName: \"kubernetes.io/projected/6b9165f4-18c2-4003-a3e6-2ed6e0fbdd3c-kube-api-access-scxqb\") pod \"neutron-db-create-8jt8c\" (UID: \"6b9165f4-18c2-4003-a3e6-2ed6e0fbdd3c\") " pod="openstack/neutron-db-create-8jt8c" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.301346 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23-config-data\") pod \"keystone-db-sync-dxql7\" (UID: \"cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23\") " pod="openstack/keystone-db-sync-dxql7" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.301363 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/57961551-d4f8-4586-b255-8810fbdb499a-operator-scripts\") pod \"barbican-d602-account-create-update-mc6fv\" (UID: \"57961551-d4f8-4586-b255-8810fbdb499a\") " pod="openstack/barbican-d602-account-create-update-mc6fv" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.301534 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kkx2\" (UniqueName: \"kubernetes.io/projected/57961551-d4f8-4586-b255-8810fbdb499a-kube-api-access-4kkx2\") pod \"barbican-d602-account-create-update-mc6fv\" (UID: \"57961551-d4f8-4586-b255-8810fbdb499a\") " pod="openstack/barbican-d602-account-create-update-mc6fv" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.301972 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/57961551-d4f8-4586-b255-8810fbdb499a-operator-scripts\") pod \"barbican-d602-account-create-update-mc6fv\" (UID: \"57961551-d4f8-4586-b255-8810fbdb499a\") " pod="openstack/barbican-d602-account-create-update-mc6fv" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.304630 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7c93-account-create-update-ptrqq"] Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.305010 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.305106 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23-combined-ca-bundle\") pod \"keystone-db-sync-dxql7\" (UID: \"cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23\") " pod="openstack/keystone-db-sync-dxql7" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.316797 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23-config-data\") pod \"keystone-db-sync-dxql7\" (UID: \"cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23\") " pod="openstack/keystone-db-sync-dxql7" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.323761 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kkx2\" (UniqueName: \"kubernetes.io/projected/57961551-d4f8-4586-b255-8810fbdb499a-kube-api-access-4kkx2\") pod \"barbican-d602-account-create-update-mc6fv\" (UID: \"57961551-d4f8-4586-b255-8810fbdb499a\") " pod="openstack/barbican-d602-account-create-update-mc6fv" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.325018 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vf27b\" (UniqueName: 
\"kubernetes.io/projected/cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23-kube-api-access-vf27b\") pod \"keystone-db-sync-dxql7\" (UID: \"cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23\") " pod="openstack/keystone-db-sync-dxql7" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.402684 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b9165f4-18c2-4003-a3e6-2ed6e0fbdd3c-operator-scripts\") pod \"neutron-db-create-8jt8c\" (UID: \"6b9165f4-18c2-4003-a3e6-2ed6e0fbdd3c\") " pod="openstack/neutron-db-create-8jt8c" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.402741 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4cb700b2-4c29-4deb-a379-d18f2695dcaf-operator-scripts\") pod \"neutron-7c93-account-create-update-ptrqq\" (UID: \"4cb700b2-4c29-4deb-a379-d18f2695dcaf\") " pod="openstack/neutron-7c93-account-create-update-ptrqq" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.402804 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfjgm\" (UniqueName: \"kubernetes.io/projected/4cb700b2-4c29-4deb-a379-d18f2695dcaf-kube-api-access-rfjgm\") pod \"neutron-7c93-account-create-update-ptrqq\" (UID: \"4cb700b2-4c29-4deb-a379-d18f2695dcaf\") " pod="openstack/neutron-7c93-account-create-update-ptrqq" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.402848 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scxqb\" (UniqueName: \"kubernetes.io/projected/6b9165f4-18c2-4003-a3e6-2ed6e0fbdd3c-kube-api-access-scxqb\") pod \"neutron-db-create-8jt8c\" (UID: \"6b9165f4-18c2-4003-a3e6-2ed6e0fbdd3c\") " pod="openstack/neutron-db-create-8jt8c" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.405359 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b9165f4-18c2-4003-a3e6-2ed6e0fbdd3c-operator-scripts\") pod \"neutron-db-create-8jt8c\" (UID: \"6b9165f4-18c2-4003-a3e6-2ed6e0fbdd3c\") " pod="openstack/neutron-db-create-8jt8c" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.425714 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scxqb\" (UniqueName: \"kubernetes.io/projected/6b9165f4-18c2-4003-a3e6-2ed6e0fbdd3c-kube-api-access-scxqb\") pod \"neutron-db-create-8jt8c\" (UID: \"6b9165f4-18c2-4003-a3e6-2ed6e0fbdd3c\") " pod="openstack/neutron-db-create-8jt8c" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.442977 4795 generic.go:334] "Generic (PLEG): container finished" podID="2f384824-f8ad-42d9-b09b-decb5280b448" containerID="269c23977ee5c8c89a33c0c6142172272e66947879bd10de823a2ea6ed071bb6" exitCode=0 Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.443028 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-768666cd57-5c55b" event={"ID":"2f384824-f8ad-42d9-b09b-decb5280b448","Type":"ContainerDied","Data":"269c23977ee5c8c89a33c0c6142172272e66947879bd10de823a2ea6ed071bb6"} Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.443061 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-768666cd57-5c55b" event={"ID":"2f384824-f8ad-42d9-b09b-decb5280b448","Type":"ContainerStarted","Data":"13682a647f5a86053553d890ba06f71e1b6b17acba7850bc7912292cc1d6509d"} Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.446923 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-snb69"] Feb 19 21:45:58 crc kubenswrapper[4795]: W0219 21:45:58.448386 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18561896_d336_4962_8e9e_4ccf748f8605.slice/crio-237556236a6852992de9809aa75761000b557b98fe94e76438ed8fb257b1f0d3 WatchSource:0}: 
Error finding container 237556236a6852992de9809aa75761000b557b98fe94e76438ed8fb257b1f0d3: Status 404 returned error can't find the container with id 237556236a6852992de9809aa75761000b557b98fe94e76438ed8fb257b1f0d3 Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.494683 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-vlmnn" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.503627 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfjgm\" (UniqueName: \"kubernetes.io/projected/4cb700b2-4c29-4deb-a379-d18f2695dcaf-kube-api-access-rfjgm\") pod \"neutron-7c93-account-create-update-ptrqq\" (UID: \"4cb700b2-4c29-4deb-a379-d18f2695dcaf\") " pod="openstack/neutron-7c93-account-create-update-ptrqq" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.503801 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4cb700b2-4c29-4deb-a379-d18f2695dcaf-operator-scripts\") pod \"neutron-7c93-account-create-update-ptrqq\" (UID: \"4cb700b2-4c29-4deb-a379-d18f2695dcaf\") " pod="openstack/neutron-7c93-account-create-update-ptrqq" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.504851 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4cb700b2-4c29-4deb-a379-d18f2695dcaf-operator-scripts\") pod \"neutron-7c93-account-create-update-ptrqq\" (UID: \"4cb700b2-4c29-4deb-a379-d18f2695dcaf\") " pod="openstack/neutron-7c93-account-create-update-ptrqq" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.511493 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-d602-account-create-update-mc6fv" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.518589 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-dxql7" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.520009 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfjgm\" (UniqueName: \"kubernetes.io/projected/4cb700b2-4c29-4deb-a379-d18f2695dcaf-kube-api-access-rfjgm\") pod \"neutron-7c93-account-create-update-ptrqq\" (UID: \"4cb700b2-4c29-4deb-a379-d18f2695dcaf\") " pod="openstack/neutron-7c93-account-create-update-ptrqq" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.546935 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-8jt8c" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.722155 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7c93-account-create-update-ptrqq" Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.781213 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-9f51-account-create-update-n57zq"] Feb 19 21:45:58 crc kubenswrapper[4795]: I0219 21:45:58.861695 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-vlmnn"] Feb 19 21:45:59 crc kubenswrapper[4795]: I0219 21:45:59.146028 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-8jt8c"] Feb 19 21:45:59 crc kubenswrapper[4795]: I0219 21:45:59.154963 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-d602-account-create-update-mc6fv"] Feb 19 21:45:59 crc kubenswrapper[4795]: I0219 21:45:59.255354 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-dxql7"] Feb 19 21:45:59 crc kubenswrapper[4795]: W0219 21:45:59.267110 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcc70b8f2_4f1b_4b6e_b657_66aac1cbfa23.slice/crio-884276ae22779ec15ffcd93cb5953527ba29f916cfb173ab2b7943b9d65999d4 
WatchSource:0}: Error finding container 884276ae22779ec15ffcd93cb5953527ba29f916cfb173ab2b7943b9d65999d4: Status 404 returned error can't find the container with id 884276ae22779ec15ffcd93cb5953527ba29f916cfb173ab2b7943b9d65999d4 Feb 19 21:45:59 crc kubenswrapper[4795]: I0219 21:45:59.358926 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7c93-account-create-update-ptrqq"] Feb 19 21:45:59 crc kubenswrapper[4795]: I0219 21:45:59.456084 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-d602-account-create-update-mc6fv" event={"ID":"57961551-d4f8-4586-b255-8810fbdb499a","Type":"ContainerStarted","Data":"122378ef931bbdce5fbd1c31b593170eeae5e5fed8a74b40fbb0d536de3a3d22"} Feb 19 21:45:59 crc kubenswrapper[4795]: I0219 21:45:59.456127 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-d602-account-create-update-mc6fv" event={"ID":"57961551-d4f8-4586-b255-8810fbdb499a","Type":"ContainerStarted","Data":"22d7d81ad8b9d1148d11d4477982eeefb07c589c029993c9da669c9f0edf5d61"} Feb 19 21:45:59 crc kubenswrapper[4795]: I0219 21:45:59.469754 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-2wbff" event={"ID":"ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8","Type":"ContainerStarted","Data":"f3e360ac25dc639935c61b2baa9937c7f07b88ebec95aede182ab25a4907955c"} Feb 19 21:45:59 crc kubenswrapper[4795]: I0219 21:45:59.473979 4795 generic.go:334] "Generic (PLEG): container finished" podID="18561896-d336-4962-8e9e-4ccf748f8605" containerID="6b0c6da228cd7d133c73866bc31005f08b881ce2ffda7a1dc8905fe2cbf580b7" exitCode=0 Feb 19 21:45:59 crc kubenswrapper[4795]: I0219 21:45:59.474081 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-snb69" event={"ID":"18561896-d336-4962-8e9e-4ccf748f8605","Type":"ContainerDied","Data":"6b0c6da228cd7d133c73866bc31005f08b881ce2ffda7a1dc8905fe2cbf580b7"} Feb 19 21:45:59 crc kubenswrapper[4795]: I0219 21:45:59.474105 4795 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-snb69" event={"ID":"18561896-d336-4962-8e9e-4ccf748f8605","Type":"ContainerStarted","Data":"237556236a6852992de9809aa75761000b557b98fe94e76438ed8fb257b1f0d3"} Feb 19 21:45:59 crc kubenswrapper[4795]: I0219 21:45:59.476253 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-d602-account-create-update-mc6fv" podStartSLOduration=1.476237266 podStartE2EDuration="1.476237266s" podCreationTimestamp="2026-02-19 21:45:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:45:59.474462245 +0000 UTC m=+1070.666980109" watchObservedRunningTime="2026-02-19 21:45:59.476237266 +0000 UTC m=+1070.668755130" Feb 19 21:45:59 crc kubenswrapper[4795]: I0219 21:45:59.479807 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-dxql7" event={"ID":"cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23","Type":"ContainerStarted","Data":"884276ae22779ec15ffcd93cb5953527ba29f916cfb173ab2b7943b9d65999d4"} Feb 19 21:45:59 crc kubenswrapper[4795]: I0219 21:45:59.483223 4795 generic.go:334] "Generic (PLEG): container finished" podID="73f01f44-1467-442f-b91f-ac1765626a3d" containerID="464384efe0cd85358add102341feabf48351464a71bfc2ad9ed2ce3ea45a4a94" exitCode=0 Feb 19 21:45:59 crc kubenswrapper[4795]: I0219 21:45:59.483312 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-vlmnn" event={"ID":"73f01f44-1467-442f-b91f-ac1765626a3d","Type":"ContainerDied","Data":"464384efe0cd85358add102341feabf48351464a71bfc2ad9ed2ce3ea45a4a94"} Feb 19 21:45:59 crc kubenswrapper[4795]: I0219 21:45:59.483362 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-vlmnn" 
event={"ID":"73f01f44-1467-442f-b91f-ac1765626a3d","Type":"ContainerStarted","Data":"0abef5e7947f0e3cbddf50027fa385aebc0f4b6ccd2ad482c073299aac2c5e2e"} Feb 19 21:45:59 crc kubenswrapper[4795]: I0219 21:45:59.484653 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7c93-account-create-update-ptrqq" event={"ID":"4cb700b2-4c29-4deb-a379-d18f2695dcaf","Type":"ContainerStarted","Data":"3989bba1eed50aec45faea3f156750277605314604317a5496d6bb8b3902c171"} Feb 19 21:45:59 crc kubenswrapper[4795]: I0219 21:45:59.490919 4795 generic.go:334] "Generic (PLEG): container finished" podID="7d152069-2c3d-4cf4-94e8-3068e24def9f" containerID="62e8bfd862eff78d929edc90f98ab271d6cbd53192119320b7d77f0dddb03767" exitCode=0 Feb 19 21:45:59 crc kubenswrapper[4795]: I0219 21:45:59.490980 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-9f51-account-create-update-n57zq" event={"ID":"7d152069-2c3d-4cf4-94e8-3068e24def9f","Type":"ContainerDied","Data":"62e8bfd862eff78d929edc90f98ab271d6cbd53192119320b7d77f0dddb03767"} Feb 19 21:45:59 crc kubenswrapper[4795]: I0219 21:45:59.491003 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-9f51-account-create-update-n57zq" event={"ID":"7d152069-2c3d-4cf4-94e8-3068e24def9f","Type":"ContainerStarted","Data":"a0c267ff5a8f3bc00149f43c149c6b31e9b75bde6755d31bd67947fa2475dd75"} Feb 19 21:45:59 crc kubenswrapper[4795]: I0219 21:45:59.492665 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-8jt8c" event={"ID":"6b9165f4-18c2-4003-a3e6-2ed6e0fbdd3c","Type":"ContainerStarted","Data":"83550b0e412e3c0f2147b2389e1c3220efde13a4057f9e67e612e0461d3172fc"} Feb 19 21:45:59 crc kubenswrapper[4795]: I0219 21:45:59.492697 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-8jt8c" 
event={"ID":"6b9165f4-18c2-4003-a3e6-2ed6e0fbdd3c","Type":"ContainerStarted","Data":"6e153c4a49c239a97a8eb900ea2dbb494088e83e1efd11bec5254c0600bc3983"} Feb 19 21:45:59 crc kubenswrapper[4795]: I0219 21:45:59.505926 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-768666cd57-5c55b" event={"ID":"2f384824-f8ad-42d9-b09b-decb5280b448","Type":"ContainerStarted","Data":"2ad1ba517bb7be9e1a9d1797c2af47441870593656bca08175b2e33f72c8f9fc"} Feb 19 21:45:59 crc kubenswrapper[4795]: I0219 21:45:59.506493 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-768666cd57-5c55b" Feb 19 21:45:59 crc kubenswrapper[4795]: I0219 21:45:59.514730 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-2wbff" podStartSLOduration=3.085979002 podStartE2EDuration="18.514709771s" podCreationTimestamp="2026-02-19 21:45:41 +0000 UTC" firstStartedPulling="2026-02-19 21:45:42.819619277 +0000 UTC m=+1054.012137141" lastFinishedPulling="2026-02-19 21:45:58.248350046 +0000 UTC m=+1069.440867910" observedRunningTime="2026-02-19 21:45:59.514193806 +0000 UTC m=+1070.706711670" watchObservedRunningTime="2026-02-19 21:45:59.514709771 +0000 UTC m=+1070.707227635" Feb 19 21:45:59 crc kubenswrapper[4795]: I0219 21:45:59.596929 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-8jt8c" podStartSLOduration=1.5969009889999999 podStartE2EDuration="1.596900989s" podCreationTimestamp="2026-02-19 21:45:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:45:59.591861776 +0000 UTC m=+1070.784379640" watchObservedRunningTime="2026-02-19 21:45:59.596900989 +0000 UTC m=+1070.789418853" Feb 19 21:45:59 crc kubenswrapper[4795]: I0219 21:45:59.635047 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/dnsmasq-dns-768666cd57-5c55b" podStartSLOduration=11.635026294 podStartE2EDuration="11.635026294s" podCreationTimestamp="2026-02-19 21:45:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:45:59.630480386 +0000 UTC m=+1070.822998250" watchObservedRunningTime="2026-02-19 21:45:59.635026294 +0000 UTC m=+1070.827544158" Feb 19 21:46:00 crc kubenswrapper[4795]: I0219 21:46:00.516792 4795 generic.go:334] "Generic (PLEG): container finished" podID="6b9165f4-18c2-4003-a3e6-2ed6e0fbdd3c" containerID="83550b0e412e3c0f2147b2389e1c3220efde13a4057f9e67e612e0461d3172fc" exitCode=0 Feb 19 21:46:00 crc kubenswrapper[4795]: I0219 21:46:00.516871 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-8jt8c" event={"ID":"6b9165f4-18c2-4003-a3e6-2ed6e0fbdd3c","Type":"ContainerDied","Data":"83550b0e412e3c0f2147b2389e1c3220efde13a4057f9e67e612e0461d3172fc"} Feb 19 21:46:00 crc kubenswrapper[4795]: I0219 21:46:00.522134 4795 generic.go:334] "Generic (PLEG): container finished" podID="57961551-d4f8-4586-b255-8810fbdb499a" containerID="122378ef931bbdce5fbd1c31b593170eeae5e5fed8a74b40fbb0d536de3a3d22" exitCode=0 Feb 19 21:46:00 crc kubenswrapper[4795]: I0219 21:46:00.522199 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-d602-account-create-update-mc6fv" event={"ID":"57961551-d4f8-4586-b255-8810fbdb499a","Type":"ContainerDied","Data":"122378ef931bbdce5fbd1c31b593170eeae5e5fed8a74b40fbb0d536de3a3d22"} Feb 19 21:46:00 crc kubenswrapper[4795]: I0219 21:46:00.535420 4795 generic.go:334] "Generic (PLEG): container finished" podID="4cb700b2-4c29-4deb-a379-d18f2695dcaf" containerID="88bfd25384ba94c2658b1d015dafd73e32d146937b8caf9b1891261850b11f22" exitCode=0 Feb 19 21:46:00 crc kubenswrapper[4795]: I0219 21:46:00.535713 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-7c93-account-create-update-ptrqq" event={"ID":"4cb700b2-4c29-4deb-a379-d18f2695dcaf","Type":"ContainerDied","Data":"88bfd25384ba94c2658b1d015dafd73e32d146937b8caf9b1891261850b11f22"} Feb 19 21:46:00 crc kubenswrapper[4795]: I0219 21:46:00.968449 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-vlmnn" Feb 19 21:46:00 crc kubenswrapper[4795]: I0219 21:46:00.969862 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-9f51-account-create-update-n57zq" Feb 19 21:46:00 crc kubenswrapper[4795]: I0219 21:46:00.975250 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-snb69" Feb 19 21:46:01 crc kubenswrapper[4795]: I0219 21:46:01.081248 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d152069-2c3d-4cf4-94e8-3068e24def9f-operator-scripts\") pod \"7d152069-2c3d-4cf4-94e8-3068e24def9f\" (UID: \"7d152069-2c3d-4cf4-94e8-3068e24def9f\") " Feb 19 21:46:01 crc kubenswrapper[4795]: I0219 21:46:01.081299 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88vvr\" (UniqueName: \"kubernetes.io/projected/18561896-d336-4962-8e9e-4ccf748f8605-kube-api-access-88vvr\") pod \"18561896-d336-4962-8e9e-4ccf748f8605\" (UID: \"18561896-d336-4962-8e9e-4ccf748f8605\") " Feb 19 21:46:01 crc kubenswrapper[4795]: I0219 21:46:01.081332 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hxjx\" (UniqueName: \"kubernetes.io/projected/73f01f44-1467-442f-b91f-ac1765626a3d-kube-api-access-9hxjx\") pod \"73f01f44-1467-442f-b91f-ac1765626a3d\" (UID: \"73f01f44-1467-442f-b91f-ac1765626a3d\") " Feb 19 21:46:01 crc kubenswrapper[4795]: I0219 21:46:01.081381 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73f01f44-1467-442f-b91f-ac1765626a3d-operator-scripts\") pod \"73f01f44-1467-442f-b91f-ac1765626a3d\" (UID: \"73f01f44-1467-442f-b91f-ac1765626a3d\") " Feb 19 21:46:01 crc kubenswrapper[4795]: I0219 21:46:01.081494 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v72cs\" (UniqueName: \"kubernetes.io/projected/7d152069-2c3d-4cf4-94e8-3068e24def9f-kube-api-access-v72cs\") pod \"7d152069-2c3d-4cf4-94e8-3068e24def9f\" (UID: \"7d152069-2c3d-4cf4-94e8-3068e24def9f\") " Feb 19 21:46:01 crc kubenswrapper[4795]: I0219 21:46:01.081521 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/18561896-d336-4962-8e9e-4ccf748f8605-operator-scripts\") pod \"18561896-d336-4962-8e9e-4ccf748f8605\" (UID: \"18561896-d336-4962-8e9e-4ccf748f8605\") " Feb 19 21:46:01 crc kubenswrapper[4795]: I0219 21:46:01.085820 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d152069-2c3d-4cf4-94e8-3068e24def9f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7d152069-2c3d-4cf4-94e8-3068e24def9f" (UID: "7d152069-2c3d-4cf4-94e8-3068e24def9f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:46:01 crc kubenswrapper[4795]: I0219 21:46:01.086306 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73f01f44-1467-442f-b91f-ac1765626a3d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "73f01f44-1467-442f-b91f-ac1765626a3d" (UID: "73f01f44-1467-442f-b91f-ac1765626a3d"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:46:01 crc kubenswrapper[4795]: I0219 21:46:01.087111 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18561896-d336-4962-8e9e-4ccf748f8605-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "18561896-d336-4962-8e9e-4ccf748f8605" (UID: "18561896-d336-4962-8e9e-4ccf748f8605"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:46:01 crc kubenswrapper[4795]: I0219 21:46:01.089020 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d152069-2c3d-4cf4-94e8-3068e24def9f-kube-api-access-v72cs" (OuterVolumeSpecName: "kube-api-access-v72cs") pod "7d152069-2c3d-4cf4-94e8-3068e24def9f" (UID: "7d152069-2c3d-4cf4-94e8-3068e24def9f"). InnerVolumeSpecName "kube-api-access-v72cs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:46:01 crc kubenswrapper[4795]: I0219 21:46:01.089328 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18561896-d336-4962-8e9e-4ccf748f8605-kube-api-access-88vvr" (OuterVolumeSpecName: "kube-api-access-88vvr") pod "18561896-d336-4962-8e9e-4ccf748f8605" (UID: "18561896-d336-4962-8e9e-4ccf748f8605"). InnerVolumeSpecName "kube-api-access-88vvr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:46:01 crc kubenswrapper[4795]: I0219 21:46:01.091356 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73f01f44-1467-442f-b91f-ac1765626a3d-kube-api-access-9hxjx" (OuterVolumeSpecName: "kube-api-access-9hxjx") pod "73f01f44-1467-442f-b91f-ac1765626a3d" (UID: "73f01f44-1467-442f-b91f-ac1765626a3d"). InnerVolumeSpecName "kube-api-access-9hxjx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:46:01 crc kubenswrapper[4795]: I0219 21:46:01.182849 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v72cs\" (UniqueName: \"kubernetes.io/projected/7d152069-2c3d-4cf4-94e8-3068e24def9f-kube-api-access-v72cs\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:01 crc kubenswrapper[4795]: I0219 21:46:01.183150 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/18561896-d336-4962-8e9e-4ccf748f8605-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:01 crc kubenswrapper[4795]: I0219 21:46:01.183160 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d152069-2c3d-4cf4-94e8-3068e24def9f-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:01 crc kubenswrapper[4795]: I0219 21:46:01.183180 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88vvr\" (UniqueName: \"kubernetes.io/projected/18561896-d336-4962-8e9e-4ccf748f8605-kube-api-access-88vvr\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:01 crc kubenswrapper[4795]: I0219 21:46:01.183189 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hxjx\" (UniqueName: \"kubernetes.io/projected/73f01f44-1467-442f-b91f-ac1765626a3d-kube-api-access-9hxjx\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:01 crc kubenswrapper[4795]: I0219 21:46:01.183199 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73f01f44-1467-442f-b91f-ac1765626a3d-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:01 crc kubenswrapper[4795]: I0219 21:46:01.550434 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-9f51-account-create-update-n57zq" 
event={"ID":"7d152069-2c3d-4cf4-94e8-3068e24def9f","Type":"ContainerDied","Data":"a0c267ff5a8f3bc00149f43c149c6b31e9b75bde6755d31bd67947fa2475dd75"} Feb 19 21:46:01 crc kubenswrapper[4795]: I0219 21:46:01.550480 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0c267ff5a8f3bc00149f43c149c6b31e9b75bde6755d31bd67947fa2475dd75" Feb 19 21:46:01 crc kubenswrapper[4795]: I0219 21:46:01.550556 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-9f51-account-create-update-n57zq" Feb 19 21:46:01 crc kubenswrapper[4795]: I0219 21:46:01.553251 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-snb69" event={"ID":"18561896-d336-4962-8e9e-4ccf748f8605","Type":"ContainerDied","Data":"237556236a6852992de9809aa75761000b557b98fe94e76438ed8fb257b1f0d3"} Feb 19 21:46:01 crc kubenswrapper[4795]: I0219 21:46:01.553279 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="237556236a6852992de9809aa75761000b557b98fe94e76438ed8fb257b1f0d3" Feb 19 21:46:01 crc kubenswrapper[4795]: I0219 21:46:01.553327 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-snb69" Feb 19 21:46:01 crc kubenswrapper[4795]: I0219 21:46:01.555592 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-vlmnn" Feb 19 21:46:01 crc kubenswrapper[4795]: I0219 21:46:01.556233 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-vlmnn" event={"ID":"73f01f44-1467-442f-b91f-ac1765626a3d","Type":"ContainerDied","Data":"0abef5e7947f0e3cbddf50027fa385aebc0f4b6ccd2ad482c073299aac2c5e2e"} Feb 19 21:46:01 crc kubenswrapper[4795]: I0219 21:46:01.556262 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0abef5e7947f0e3cbddf50027fa385aebc0f4b6ccd2ad482c073299aac2c5e2e" Feb 19 21:46:03 crc kubenswrapper[4795]: I0219 21:46:03.736803 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7c93-account-create-update-ptrqq" Feb 19 21:46:03 crc kubenswrapper[4795]: I0219 21:46:03.763650 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-d602-account-create-update-mc6fv" Feb 19 21:46:03 crc kubenswrapper[4795]: I0219 21:46:03.771843 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-8jt8c" Feb 19 21:46:03 crc kubenswrapper[4795]: I0219 21:46:03.830348 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4cb700b2-4c29-4deb-a379-d18f2695dcaf-operator-scripts\") pod \"4cb700b2-4c29-4deb-a379-d18f2695dcaf\" (UID: \"4cb700b2-4c29-4deb-a379-d18f2695dcaf\") " Feb 19 21:46:03 crc kubenswrapper[4795]: I0219 21:46:03.832947 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cb700b2-4c29-4deb-a379-d18f2695dcaf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4cb700b2-4c29-4deb-a379-d18f2695dcaf" (UID: "4cb700b2-4c29-4deb-a379-d18f2695dcaf"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:46:03 crc kubenswrapper[4795]: I0219 21:46:03.833640 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rfjgm\" (UniqueName: \"kubernetes.io/projected/4cb700b2-4c29-4deb-a379-d18f2695dcaf-kube-api-access-rfjgm\") pod \"4cb700b2-4c29-4deb-a379-d18f2695dcaf\" (UID: \"4cb700b2-4c29-4deb-a379-d18f2695dcaf\") " Feb 19 21:46:03 crc kubenswrapper[4795]: I0219 21:46:03.835072 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-scxqb\" (UniqueName: \"kubernetes.io/projected/6b9165f4-18c2-4003-a3e6-2ed6e0fbdd3c-kube-api-access-scxqb\") pod \"6b9165f4-18c2-4003-a3e6-2ed6e0fbdd3c\" (UID: \"6b9165f4-18c2-4003-a3e6-2ed6e0fbdd3c\") " Feb 19 21:46:03 crc kubenswrapper[4795]: I0219 21:46:03.836098 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b9165f4-18c2-4003-a3e6-2ed6e0fbdd3c-operator-scripts\") pod \"6b9165f4-18c2-4003-a3e6-2ed6e0fbdd3c\" (UID: \"6b9165f4-18c2-4003-a3e6-2ed6e0fbdd3c\") " Feb 19 21:46:03 crc kubenswrapper[4795]: I0219 21:46:03.836496 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/57961551-d4f8-4586-b255-8810fbdb499a-operator-scripts\") pod \"57961551-d4f8-4586-b255-8810fbdb499a\" (UID: \"57961551-d4f8-4586-b255-8810fbdb499a\") " Feb 19 21:46:03 crc kubenswrapper[4795]: I0219 21:46:03.836660 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4kkx2\" (UniqueName: \"kubernetes.io/projected/57961551-d4f8-4586-b255-8810fbdb499a-kube-api-access-4kkx2\") pod \"57961551-d4f8-4586-b255-8810fbdb499a\" (UID: \"57961551-d4f8-4586-b255-8810fbdb499a\") " Feb 19 21:46:03 crc kubenswrapper[4795]: I0219 21:46:03.836772 4795 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/configmap/6b9165f4-18c2-4003-a3e6-2ed6e0fbdd3c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6b9165f4-18c2-4003-a3e6-2ed6e0fbdd3c" (UID: "6b9165f4-18c2-4003-a3e6-2ed6e0fbdd3c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:46:03 crc kubenswrapper[4795]: I0219 21:46:03.837095 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57961551-d4f8-4586-b255-8810fbdb499a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "57961551-d4f8-4586-b255-8810fbdb499a" (UID: "57961551-d4f8-4586-b255-8810fbdb499a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:46:03 crc kubenswrapper[4795]: I0219 21:46:03.838416 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4cb700b2-4c29-4deb-a379-d18f2695dcaf-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:03 crc kubenswrapper[4795]: I0219 21:46:03.838439 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b9165f4-18c2-4003-a3e6-2ed6e0fbdd3c-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:03 crc kubenswrapper[4795]: I0219 21:46:03.838450 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/57961551-d4f8-4586-b255-8810fbdb499a-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:03 crc kubenswrapper[4795]: I0219 21:46:03.839672 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cb700b2-4c29-4deb-a379-d18f2695dcaf-kube-api-access-rfjgm" (OuterVolumeSpecName: "kube-api-access-rfjgm") pod "4cb700b2-4c29-4deb-a379-d18f2695dcaf" (UID: "4cb700b2-4c29-4deb-a379-d18f2695dcaf"). InnerVolumeSpecName "kube-api-access-rfjgm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:46:03 crc kubenswrapper[4795]: I0219 21:46:03.840561 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b9165f4-18c2-4003-a3e6-2ed6e0fbdd3c-kube-api-access-scxqb" (OuterVolumeSpecName: "kube-api-access-scxqb") pod "6b9165f4-18c2-4003-a3e6-2ed6e0fbdd3c" (UID: "6b9165f4-18c2-4003-a3e6-2ed6e0fbdd3c"). InnerVolumeSpecName "kube-api-access-scxqb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:46:03 crc kubenswrapper[4795]: I0219 21:46:03.840969 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57961551-d4f8-4586-b255-8810fbdb499a-kube-api-access-4kkx2" (OuterVolumeSpecName: "kube-api-access-4kkx2") pod "57961551-d4f8-4586-b255-8810fbdb499a" (UID: "57961551-d4f8-4586-b255-8810fbdb499a"). InnerVolumeSpecName "kube-api-access-4kkx2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:46:03 crc kubenswrapper[4795]: I0219 21:46:03.940387 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rfjgm\" (UniqueName: \"kubernetes.io/projected/4cb700b2-4c29-4deb-a379-d18f2695dcaf-kube-api-access-rfjgm\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:03 crc kubenswrapper[4795]: I0219 21:46:03.940417 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-scxqb\" (UniqueName: \"kubernetes.io/projected/6b9165f4-18c2-4003-a3e6-2ed6e0fbdd3c-kube-api-access-scxqb\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:03 crc kubenswrapper[4795]: I0219 21:46:03.940426 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4kkx2\" (UniqueName: \"kubernetes.io/projected/57961551-d4f8-4586-b255-8810fbdb499a-kube-api-access-4kkx2\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.039418 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/dnsmasq-dns-768666cd57-5c55b" Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.118661 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-689df5d84f-hpfn4"] Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.118932 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-689df5d84f-hpfn4" podUID="7b4dab90-2dba-4e1f-95fe-5c435d4e270a" containerName="dnsmasq-dns" containerID="cri-o://5482b8f1b44d8eeecca016192cda7d894246fb50c4d06a9b45628279359dfa86" gracePeriod=10 Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.473782 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-689df5d84f-hpfn4" Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.549416 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7b4dab90-2dba-4e1f-95fe-5c435d4e270a-ovsdbserver-nb\") pod \"7b4dab90-2dba-4e1f-95fe-5c435d4e270a\" (UID: \"7b4dab90-2dba-4e1f-95fe-5c435d4e270a\") " Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.549491 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7b4dab90-2dba-4e1f-95fe-5c435d4e270a-ovsdbserver-sb\") pod \"7b4dab90-2dba-4e1f-95fe-5c435d4e270a\" (UID: \"7b4dab90-2dba-4e1f-95fe-5c435d4e270a\") " Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.549559 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7b4dab90-2dba-4e1f-95fe-5c435d4e270a-dns-svc\") pod \"7b4dab90-2dba-4e1f-95fe-5c435d4e270a\" (UID: \"7b4dab90-2dba-4e1f-95fe-5c435d4e270a\") " Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.549604 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cpq8v\" (UniqueName: 
\"kubernetes.io/projected/7b4dab90-2dba-4e1f-95fe-5c435d4e270a-kube-api-access-cpq8v\") pod \"7b4dab90-2dba-4e1f-95fe-5c435d4e270a\" (UID: \"7b4dab90-2dba-4e1f-95fe-5c435d4e270a\") " Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.549640 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b4dab90-2dba-4e1f-95fe-5c435d4e270a-config\") pod \"7b4dab90-2dba-4e1f-95fe-5c435d4e270a\" (UID: \"7b4dab90-2dba-4e1f-95fe-5c435d4e270a\") " Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.554681 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b4dab90-2dba-4e1f-95fe-5c435d4e270a-kube-api-access-cpq8v" (OuterVolumeSpecName: "kube-api-access-cpq8v") pod "7b4dab90-2dba-4e1f-95fe-5c435d4e270a" (UID: "7b4dab90-2dba-4e1f-95fe-5c435d4e270a"). InnerVolumeSpecName "kube-api-access-cpq8v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.583212 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-8jt8c" event={"ID":"6b9165f4-18c2-4003-a3e6-2ed6e0fbdd3c","Type":"ContainerDied","Data":"6e153c4a49c239a97a8eb900ea2dbb494088e83e1efd11bec5254c0600bc3983"} Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.583256 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e153c4a49c239a97a8eb900ea2dbb494088e83e1efd11bec5254c0600bc3983" Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.583339 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-8jt8c" Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.585745 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-dxql7" event={"ID":"cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23","Type":"ContainerStarted","Data":"088f63c71340a869532178f76fc11ab9c0cbbdc5c1aeacd6e851cf5d40a46fa0"} Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.593700 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-d602-account-create-update-mc6fv" event={"ID":"57961551-d4f8-4586-b255-8810fbdb499a","Type":"ContainerDied","Data":"22d7d81ad8b9d1148d11d4477982eeefb07c589c029993c9da669c9f0edf5d61"} Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.593743 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="22d7d81ad8b9d1148d11d4477982eeefb07c589c029993c9da669c9f0edf5d61" Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.593834 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-d602-account-create-update-mc6fv" Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.595659 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b4dab90-2dba-4e1f-95fe-5c435d4e270a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7b4dab90-2dba-4e1f-95fe-5c435d4e270a" (UID: "7b4dab90-2dba-4e1f-95fe-5c435d4e270a"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.600325 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7c93-account-create-update-ptrqq" event={"ID":"4cb700b2-4c29-4deb-a379-d18f2695dcaf","Type":"ContainerDied","Data":"3989bba1eed50aec45faea3f156750277605314604317a5496d6bb8b3902c171"} Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.600367 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3989bba1eed50aec45faea3f156750277605314604317a5496d6bb8b3902c171" Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.600452 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7c93-account-create-update-ptrqq" Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.607185 4795 generic.go:334] "Generic (PLEG): container finished" podID="7b4dab90-2dba-4e1f-95fe-5c435d4e270a" containerID="5482b8f1b44d8eeecca016192cda7d894246fb50c4d06a9b45628279359dfa86" exitCode=0 Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.607238 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-689df5d84f-hpfn4" event={"ID":"7b4dab90-2dba-4e1f-95fe-5c435d4e270a","Type":"ContainerDied","Data":"5482b8f1b44d8eeecca016192cda7d894246fb50c4d06a9b45628279359dfa86"} Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.607278 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-689df5d84f-hpfn4" event={"ID":"7b4dab90-2dba-4e1f-95fe-5c435d4e270a","Type":"ContainerDied","Data":"6a5cc19b4b04424e6441590aca0708de64c3cb94b9b2a21745480c88aa7a5c4f"} Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.607307 4795 scope.go:117] "RemoveContainer" containerID="5482b8f1b44d8eeecca016192cda7d894246fb50c4d06a9b45628279359dfa86" Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.608282 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-689df5d84f-hpfn4" Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.617793 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-dxql7" podStartSLOduration=2.255890913 podStartE2EDuration="6.61777349s" podCreationTimestamp="2026-02-19 21:45:58 +0000 UTC" firstStartedPulling="2026-02-19 21:45:59.270295187 +0000 UTC m=+1070.462813051" lastFinishedPulling="2026-02-19 21:46:03.632177754 +0000 UTC m=+1074.824695628" observedRunningTime="2026-02-19 21:46:04.604747742 +0000 UTC m=+1075.797265596" watchObservedRunningTime="2026-02-19 21:46:04.61777349 +0000 UTC m=+1075.810291354" Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.620616 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b4dab90-2dba-4e1f-95fe-5c435d4e270a-config" (OuterVolumeSpecName: "config") pod "7b4dab90-2dba-4e1f-95fe-5c435d4e270a" (UID: "7b4dab90-2dba-4e1f-95fe-5c435d4e270a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.622078 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b4dab90-2dba-4e1f-95fe-5c435d4e270a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7b4dab90-2dba-4e1f-95fe-5c435d4e270a" (UID: "7b4dab90-2dba-4e1f-95fe-5c435d4e270a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.628653 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b4dab90-2dba-4e1f-95fe-5c435d4e270a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7b4dab90-2dba-4e1f-95fe-5c435d4e270a" (UID: "7b4dab90-2dba-4e1f-95fe-5c435d4e270a"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.652387 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7b4dab90-2dba-4e1f-95fe-5c435d4e270a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.652424 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7b4dab90-2dba-4e1f-95fe-5c435d4e270a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.652438 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7b4dab90-2dba-4e1f-95fe-5c435d4e270a-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.652449 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cpq8v\" (UniqueName: \"kubernetes.io/projected/7b4dab90-2dba-4e1f-95fe-5c435d4e270a-kube-api-access-cpq8v\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.652462 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b4dab90-2dba-4e1f-95fe-5c435d4e270a-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.692938 4795 scope.go:117] "RemoveContainer" containerID="c1eba6aa8c078daa91e1cfd1acb049a25d5fbe528714a75b4a04c5fe223c7ced" Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.720048 4795 scope.go:117] "RemoveContainer" containerID="5482b8f1b44d8eeecca016192cda7d894246fb50c4d06a9b45628279359dfa86" Feb 19 21:46:04 crc kubenswrapper[4795]: E0219 21:46:04.720577 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5482b8f1b44d8eeecca016192cda7d894246fb50c4d06a9b45628279359dfa86\": 
container with ID starting with 5482b8f1b44d8eeecca016192cda7d894246fb50c4d06a9b45628279359dfa86 not found: ID does not exist" containerID="5482b8f1b44d8eeecca016192cda7d894246fb50c4d06a9b45628279359dfa86" Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.720617 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5482b8f1b44d8eeecca016192cda7d894246fb50c4d06a9b45628279359dfa86"} err="failed to get container status \"5482b8f1b44d8eeecca016192cda7d894246fb50c4d06a9b45628279359dfa86\": rpc error: code = NotFound desc = could not find container \"5482b8f1b44d8eeecca016192cda7d894246fb50c4d06a9b45628279359dfa86\": container with ID starting with 5482b8f1b44d8eeecca016192cda7d894246fb50c4d06a9b45628279359dfa86 not found: ID does not exist" Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.720644 4795 scope.go:117] "RemoveContainer" containerID="c1eba6aa8c078daa91e1cfd1acb049a25d5fbe528714a75b4a04c5fe223c7ced" Feb 19 21:46:04 crc kubenswrapper[4795]: E0219 21:46:04.720991 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1eba6aa8c078daa91e1cfd1acb049a25d5fbe528714a75b4a04c5fe223c7ced\": container with ID starting with c1eba6aa8c078daa91e1cfd1acb049a25d5fbe528714a75b4a04c5fe223c7ced not found: ID does not exist" containerID="c1eba6aa8c078daa91e1cfd1acb049a25d5fbe528714a75b4a04c5fe223c7ced" Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.721014 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1eba6aa8c078daa91e1cfd1acb049a25d5fbe528714a75b4a04c5fe223c7ced"} err="failed to get container status \"c1eba6aa8c078daa91e1cfd1acb049a25d5fbe528714a75b4a04c5fe223c7ced\": rpc error: code = NotFound desc = could not find container \"c1eba6aa8c078daa91e1cfd1acb049a25d5fbe528714a75b4a04c5fe223c7ced\": container with ID starting with 
c1eba6aa8c078daa91e1cfd1acb049a25d5fbe528714a75b4a04c5fe223c7ced not found: ID does not exist" Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.945082 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-689df5d84f-hpfn4"] Feb 19 21:46:04 crc kubenswrapper[4795]: I0219 21:46:04.952056 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-689df5d84f-hpfn4"] Feb 19 21:46:05 crc kubenswrapper[4795]: I0219 21:46:05.529258 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b4dab90-2dba-4e1f-95fe-5c435d4e270a" path="/var/lib/kubelet/pods/7b4dab90-2dba-4e1f-95fe-5c435d4e270a/volumes" Feb 19 21:46:05 crc kubenswrapper[4795]: I0219 21:46:05.626040 4795 generic.go:334] "Generic (PLEG): container finished" podID="ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8" containerID="f3e360ac25dc639935c61b2baa9937c7f07b88ebec95aede182ab25a4907955c" exitCode=0 Feb 19 21:46:05 crc kubenswrapper[4795]: I0219 21:46:05.626115 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-2wbff" event={"ID":"ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8","Type":"ContainerDied","Data":"f3e360ac25dc639935c61b2baa9937c7f07b88ebec95aede182ab25a4907955c"} Feb 19 21:46:06 crc kubenswrapper[4795]: I0219 21:46:06.638483 4795 generic.go:334] "Generic (PLEG): container finished" podID="cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23" containerID="088f63c71340a869532178f76fc11ab9c0cbbdc5c1aeacd6e851cf5d40a46fa0" exitCode=0 Feb 19 21:46:06 crc kubenswrapper[4795]: I0219 21:46:06.638574 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-dxql7" event={"ID":"cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23","Type":"ContainerDied","Data":"088f63c71340a869532178f76fc11ab9c0cbbdc5c1aeacd6e851cf5d40a46fa0"} Feb 19 21:46:07 crc kubenswrapper[4795]: I0219 21:46:07.147523 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-2wbff" Feb 19 21:46:07 crc kubenswrapper[4795]: I0219 21:46:07.197866 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6q9z2\" (UniqueName: \"kubernetes.io/projected/ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8-kube-api-access-6q9z2\") pod \"ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8\" (UID: \"ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8\") " Feb 19 21:46:07 crc kubenswrapper[4795]: I0219 21:46:07.197981 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8-db-sync-config-data\") pod \"ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8\" (UID: \"ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8\") " Feb 19 21:46:07 crc kubenswrapper[4795]: I0219 21:46:07.198018 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8-config-data\") pod \"ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8\" (UID: \"ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8\") " Feb 19 21:46:07 crc kubenswrapper[4795]: I0219 21:46:07.198109 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8-combined-ca-bundle\") pod \"ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8\" (UID: \"ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8\") " Feb 19 21:46:07 crc kubenswrapper[4795]: I0219 21:46:07.203684 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8-kube-api-access-6q9z2" (OuterVolumeSpecName: "kube-api-access-6q9z2") pod "ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8" (UID: "ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8"). InnerVolumeSpecName "kube-api-access-6q9z2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:46:07 crc kubenswrapper[4795]: I0219 21:46:07.203741 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8" (UID: "ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:07 crc kubenswrapper[4795]: I0219 21:46:07.221969 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8" (UID: "ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:07 crc kubenswrapper[4795]: I0219 21:46:07.240814 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8-config-data" (OuterVolumeSpecName: "config-data") pod "ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8" (UID: "ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:07 crc kubenswrapper[4795]: I0219 21:46:07.300918 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6q9z2\" (UniqueName: \"kubernetes.io/projected/ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8-kube-api-access-6q9z2\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:07 crc kubenswrapper[4795]: I0219 21:46:07.300975 4795 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:07 crc kubenswrapper[4795]: I0219 21:46:07.300984 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:07 crc kubenswrapper[4795]: I0219 21:46:07.300994 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:07 crc kubenswrapper[4795]: I0219 21:46:07.648404 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-2wbff" event={"ID":"ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8","Type":"ContainerDied","Data":"256eb7cf5acf7b851870ab83b4723dabaee1fbaa60ee04f8ea61103c387af2ad"} Feb 19 21:46:07 crc kubenswrapper[4795]: I0219 21:46:07.648814 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="256eb7cf5acf7b851870ab83b4723dabaee1fbaa60ee04f8ea61103c387af2ad" Feb 19 21:46:07 crc kubenswrapper[4795]: I0219 21:46:07.648437 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-2wbff" Feb 19 21:46:07 crc kubenswrapper[4795]: I0219 21:46:07.902673 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-dxql7" Feb 19 21:46:07 crc kubenswrapper[4795]: I0219 21:46:07.912067 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vf27b\" (UniqueName: \"kubernetes.io/projected/cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23-kube-api-access-vf27b\") pod \"cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23\" (UID: \"cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23\") " Feb 19 21:46:07 crc kubenswrapper[4795]: I0219 21:46:07.912250 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23-combined-ca-bundle\") pod \"cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23\" (UID: \"cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23\") " Feb 19 21:46:07 crc kubenswrapper[4795]: I0219 21:46:07.912325 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23-config-data\") pod \"cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23\" (UID: \"cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23\") " Feb 19 21:46:07 crc kubenswrapper[4795]: I0219 21:46:07.927857 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23-kube-api-access-vf27b" (OuterVolumeSpecName: "kube-api-access-vf27b") pod "cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23" (UID: "cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23"). InnerVolumeSpecName "kube-api-access-vf27b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:46:07 crc kubenswrapper[4795]: I0219 21:46:07.956487 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23" (UID: "cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:07 crc kubenswrapper[4795]: I0219 21:46:07.989006 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23-config-data" (OuterVolumeSpecName: "config-data") pod "cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23" (UID: "cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.014454 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.014487 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.014499 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vf27b\" (UniqueName: \"kubernetes.io/projected/cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23-kube-api-access-vf27b\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.016049 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-68677f88c9-jp7k7"] Feb 19 21:46:08 crc kubenswrapper[4795]: E0219 21:46:08.016497 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cb700b2-4c29-4deb-a379-d18f2695dcaf" containerName="mariadb-account-create-update" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.016520 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cb700b2-4c29-4deb-a379-d18f2695dcaf" containerName="mariadb-account-create-update" Feb 19 21:46:08 crc kubenswrapper[4795]: E0219 21:46:08.016541 4795 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="6b9165f4-18c2-4003-a3e6-2ed6e0fbdd3c" containerName="mariadb-database-create" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.016550 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b9165f4-18c2-4003-a3e6-2ed6e0fbdd3c" containerName="mariadb-database-create" Feb 19 21:46:08 crc kubenswrapper[4795]: E0219 21:46:08.016559 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8" containerName="glance-db-sync" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.016568 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8" containerName="glance-db-sync" Feb 19 21:46:08 crc kubenswrapper[4795]: E0219 21:46:08.016582 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18561896-d336-4962-8e9e-4ccf748f8605" containerName="mariadb-database-create" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.016588 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="18561896-d336-4962-8e9e-4ccf748f8605" containerName="mariadb-database-create" Feb 19 21:46:08 crc kubenswrapper[4795]: E0219 21:46:08.016602 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b4dab90-2dba-4e1f-95fe-5c435d4e270a" containerName="init" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.016609 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b4dab90-2dba-4e1f-95fe-5c435d4e270a" containerName="init" Feb 19 21:46:08 crc kubenswrapper[4795]: E0219 21:46:08.016628 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d152069-2c3d-4cf4-94e8-3068e24def9f" containerName="mariadb-account-create-update" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.016636 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d152069-2c3d-4cf4-94e8-3068e24def9f" containerName="mariadb-account-create-update" Feb 19 21:46:08 crc kubenswrapper[4795]: E0219 21:46:08.016649 4795 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23" containerName="keystone-db-sync" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.016655 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23" containerName="keystone-db-sync" Feb 19 21:46:08 crc kubenswrapper[4795]: E0219 21:46:08.016664 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57961551-d4f8-4586-b255-8810fbdb499a" containerName="mariadb-account-create-update" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.016672 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="57961551-d4f8-4586-b255-8810fbdb499a" containerName="mariadb-account-create-update" Feb 19 21:46:08 crc kubenswrapper[4795]: E0219 21:46:08.016693 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b4dab90-2dba-4e1f-95fe-5c435d4e270a" containerName="dnsmasq-dns" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.016703 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b4dab90-2dba-4e1f-95fe-5c435d4e270a" containerName="dnsmasq-dns" Feb 19 21:46:08 crc kubenswrapper[4795]: E0219 21:46:08.016715 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73f01f44-1467-442f-b91f-ac1765626a3d" containerName="mariadb-database-create" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.016723 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="73f01f44-1467-442f-b91f-ac1765626a3d" containerName="mariadb-database-create" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.016896 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b4dab90-2dba-4e1f-95fe-5c435d4e270a" containerName="dnsmasq-dns" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.016909 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="73f01f44-1467-442f-b91f-ac1765626a3d" containerName="mariadb-database-create" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 
21:46:08.016919 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8" containerName="glance-db-sync" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.016925 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cb700b2-4c29-4deb-a379-d18f2695dcaf" containerName="mariadb-account-create-update" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.016937 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b9165f4-18c2-4003-a3e6-2ed6e0fbdd3c" containerName="mariadb-database-create" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.016947 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d152069-2c3d-4cf4-94e8-3068e24def9f" containerName="mariadb-account-create-update" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.016955 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="18561896-d336-4962-8e9e-4ccf748f8605" containerName="mariadb-database-create" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.016966 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23" containerName="keystone-db-sync" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.016982 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="57961551-d4f8-4586-b255-8810fbdb499a" containerName="mariadb-account-create-update" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.017857 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-68677f88c9-jp7k7" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.031080 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68677f88c9-jp7k7"] Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.115724 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8df3e76b-9aa2-476b-aa19-62518a8ddd1e-ovsdbserver-nb\") pod \"dnsmasq-dns-68677f88c9-jp7k7\" (UID: \"8df3e76b-9aa2-476b-aa19-62518a8ddd1e\") " pod="openstack/dnsmasq-dns-68677f88c9-jp7k7" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.115826 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8df3e76b-9aa2-476b-aa19-62518a8ddd1e-dns-swift-storage-0\") pod \"dnsmasq-dns-68677f88c9-jp7k7\" (UID: \"8df3e76b-9aa2-476b-aa19-62518a8ddd1e\") " pod="openstack/dnsmasq-dns-68677f88c9-jp7k7" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.115888 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8df3e76b-9aa2-476b-aa19-62518a8ddd1e-dns-svc\") pod \"dnsmasq-dns-68677f88c9-jp7k7\" (UID: \"8df3e76b-9aa2-476b-aa19-62518a8ddd1e\") " pod="openstack/dnsmasq-dns-68677f88c9-jp7k7" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.115920 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crvcs\" (UniqueName: \"kubernetes.io/projected/8df3e76b-9aa2-476b-aa19-62518a8ddd1e-kube-api-access-crvcs\") pod \"dnsmasq-dns-68677f88c9-jp7k7\" (UID: \"8df3e76b-9aa2-476b-aa19-62518a8ddd1e\") " pod="openstack/dnsmasq-dns-68677f88c9-jp7k7" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.116003 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8df3e76b-9aa2-476b-aa19-62518a8ddd1e-ovsdbserver-sb\") pod \"dnsmasq-dns-68677f88c9-jp7k7\" (UID: \"8df3e76b-9aa2-476b-aa19-62518a8ddd1e\") " pod="openstack/dnsmasq-dns-68677f88c9-jp7k7" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.116046 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8df3e76b-9aa2-476b-aa19-62518a8ddd1e-config\") pod \"dnsmasq-dns-68677f88c9-jp7k7\" (UID: \"8df3e76b-9aa2-476b-aa19-62518a8ddd1e\") " pod="openstack/dnsmasq-dns-68677f88c9-jp7k7" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.217758 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8df3e76b-9aa2-476b-aa19-62518a8ddd1e-dns-svc\") pod \"dnsmasq-dns-68677f88c9-jp7k7\" (UID: \"8df3e76b-9aa2-476b-aa19-62518a8ddd1e\") " pod="openstack/dnsmasq-dns-68677f88c9-jp7k7" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.217801 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crvcs\" (UniqueName: \"kubernetes.io/projected/8df3e76b-9aa2-476b-aa19-62518a8ddd1e-kube-api-access-crvcs\") pod \"dnsmasq-dns-68677f88c9-jp7k7\" (UID: \"8df3e76b-9aa2-476b-aa19-62518a8ddd1e\") " pod="openstack/dnsmasq-dns-68677f88c9-jp7k7" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.217832 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8df3e76b-9aa2-476b-aa19-62518a8ddd1e-ovsdbserver-sb\") pod \"dnsmasq-dns-68677f88c9-jp7k7\" (UID: \"8df3e76b-9aa2-476b-aa19-62518a8ddd1e\") " pod="openstack/dnsmasq-dns-68677f88c9-jp7k7" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.217862 4795 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8df3e76b-9aa2-476b-aa19-62518a8ddd1e-config\") pod \"dnsmasq-dns-68677f88c9-jp7k7\" (UID: \"8df3e76b-9aa2-476b-aa19-62518a8ddd1e\") " pod="openstack/dnsmasq-dns-68677f88c9-jp7k7" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.217908 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8df3e76b-9aa2-476b-aa19-62518a8ddd1e-ovsdbserver-nb\") pod \"dnsmasq-dns-68677f88c9-jp7k7\" (UID: \"8df3e76b-9aa2-476b-aa19-62518a8ddd1e\") " pod="openstack/dnsmasq-dns-68677f88c9-jp7k7" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.217943 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8df3e76b-9aa2-476b-aa19-62518a8ddd1e-dns-swift-storage-0\") pod \"dnsmasq-dns-68677f88c9-jp7k7\" (UID: \"8df3e76b-9aa2-476b-aa19-62518a8ddd1e\") " pod="openstack/dnsmasq-dns-68677f88c9-jp7k7" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.218845 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8df3e76b-9aa2-476b-aa19-62518a8ddd1e-ovsdbserver-sb\") pod \"dnsmasq-dns-68677f88c9-jp7k7\" (UID: \"8df3e76b-9aa2-476b-aa19-62518a8ddd1e\") " pod="openstack/dnsmasq-dns-68677f88c9-jp7k7" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.218888 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8df3e76b-9aa2-476b-aa19-62518a8ddd1e-config\") pod \"dnsmasq-dns-68677f88c9-jp7k7\" (UID: \"8df3e76b-9aa2-476b-aa19-62518a8ddd1e\") " pod="openstack/dnsmasq-dns-68677f88c9-jp7k7" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.218922 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/8df3e76b-9aa2-476b-aa19-62518a8ddd1e-dns-svc\") pod \"dnsmasq-dns-68677f88c9-jp7k7\" (UID: \"8df3e76b-9aa2-476b-aa19-62518a8ddd1e\") " pod="openstack/dnsmasq-dns-68677f88c9-jp7k7" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.219082 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8df3e76b-9aa2-476b-aa19-62518a8ddd1e-dns-swift-storage-0\") pod \"dnsmasq-dns-68677f88c9-jp7k7\" (UID: \"8df3e76b-9aa2-476b-aa19-62518a8ddd1e\") " pod="openstack/dnsmasq-dns-68677f88c9-jp7k7" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.219117 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8df3e76b-9aa2-476b-aa19-62518a8ddd1e-ovsdbserver-nb\") pod \"dnsmasq-dns-68677f88c9-jp7k7\" (UID: \"8df3e76b-9aa2-476b-aa19-62518a8ddd1e\") " pod="openstack/dnsmasq-dns-68677f88c9-jp7k7" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.238535 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crvcs\" (UniqueName: \"kubernetes.io/projected/8df3e76b-9aa2-476b-aa19-62518a8ddd1e-kube-api-access-crvcs\") pod \"dnsmasq-dns-68677f88c9-jp7k7\" (UID: \"8df3e76b-9aa2-476b-aa19-62518a8ddd1e\") " pod="openstack/dnsmasq-dns-68677f88c9-jp7k7" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.431584 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-68677f88c9-jp7k7" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.657338 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-dxql7" event={"ID":"cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23","Type":"ContainerDied","Data":"884276ae22779ec15ffcd93cb5953527ba29f916cfb173ab2b7943b9d65999d4"} Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.657943 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="884276ae22779ec15ffcd93cb5953527ba29f916cfb173ab2b7943b9d65999d4" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.657733 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-dxql7" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.853864 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68677f88c9-jp7k7"] Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.881644 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7d67cdfc8f-sh4j9"] Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.893222 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d67cdfc8f-sh4j9" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.904019 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d67cdfc8f-sh4j9"] Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.920682 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68677f88c9-jp7k7"] Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.929424 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-xp5gn"] Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.930563 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-xp5gn" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.932109 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/76a3a185-4911-4b2e-ab11-5ea1c61c2b69-dns-swift-storage-0\") pod \"dnsmasq-dns-7d67cdfc8f-sh4j9\" (UID: \"76a3a185-4911-4b2e-ab11-5ea1c61c2b69\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-sh4j9" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.932157 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/76a3a185-4911-4b2e-ab11-5ea1c61c2b69-ovsdbserver-nb\") pod \"dnsmasq-dns-7d67cdfc8f-sh4j9\" (UID: \"76a3a185-4911-4b2e-ab11-5ea1c61c2b69\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-sh4j9" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.932212 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/76a3a185-4911-4b2e-ab11-5ea1c61c2b69-ovsdbserver-sb\") pod \"dnsmasq-dns-7d67cdfc8f-sh4j9\" (UID: \"76a3a185-4911-4b2e-ab11-5ea1c61c2b69\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-sh4j9" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.932316 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76a3a185-4911-4b2e-ab11-5ea1c61c2b69-config\") pod \"dnsmasq-dns-7d67cdfc8f-sh4j9\" (UID: \"76a3a185-4911-4b2e-ab11-5ea1c61c2b69\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-sh4j9" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.932378 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/76a3a185-4911-4b2e-ab11-5ea1c61c2b69-dns-svc\") pod \"dnsmasq-dns-7d67cdfc8f-sh4j9\" (UID: 
\"76a3a185-4911-4b2e-ab11-5ea1c61c2b69\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-sh4j9" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.932525 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjfkf\" (UniqueName: \"kubernetes.io/projected/76a3a185-4911-4b2e-ab11-5ea1c61c2b69-kube-api-access-bjfkf\") pod \"dnsmasq-dns-7d67cdfc8f-sh4j9\" (UID: \"76a3a185-4911-4b2e-ab11-5ea1c61c2b69\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-sh4j9" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.932919 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.933203 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.933379 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.933601 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.933806 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-8pz2x" Feb 19 21:46:08 crc kubenswrapper[4795]: I0219 21:46:08.943003 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-xp5gn"] Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.036302 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/76a3a185-4911-4b2e-ab11-5ea1c61c2b69-dns-swift-storage-0\") pod \"dnsmasq-dns-7d67cdfc8f-sh4j9\" (UID: \"76a3a185-4911-4b2e-ab11-5ea1c61c2b69\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-sh4j9" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.036365 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60623dff-0241-4bc9-8b17-de61d7271e19-combined-ca-bundle\") pod \"keystone-bootstrap-xp5gn\" (UID: \"60623dff-0241-4bc9-8b17-de61d7271e19\") " pod="openstack/keystone-bootstrap-xp5gn" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.036389 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/76a3a185-4911-4b2e-ab11-5ea1c61c2b69-ovsdbserver-nb\") pod \"dnsmasq-dns-7d67cdfc8f-sh4j9\" (UID: \"76a3a185-4911-4b2e-ab11-5ea1c61c2b69\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-sh4j9" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.036408 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/60623dff-0241-4bc9-8b17-de61d7271e19-credential-keys\") pod \"keystone-bootstrap-xp5gn\" (UID: \"60623dff-0241-4bc9-8b17-de61d7271e19\") " pod="openstack/keystone-bootstrap-xp5gn" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.036437 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/60623dff-0241-4bc9-8b17-de61d7271e19-fernet-keys\") pod \"keystone-bootstrap-xp5gn\" (UID: \"60623dff-0241-4bc9-8b17-de61d7271e19\") " pod="openstack/keystone-bootstrap-xp5gn" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.036460 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/76a3a185-4911-4b2e-ab11-5ea1c61c2b69-ovsdbserver-sb\") pod \"dnsmasq-dns-7d67cdfc8f-sh4j9\" (UID: \"76a3a185-4911-4b2e-ab11-5ea1c61c2b69\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-sh4j9" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.036497 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76a3a185-4911-4b2e-ab11-5ea1c61c2b69-config\") pod \"dnsmasq-dns-7d67cdfc8f-sh4j9\" (UID: \"76a3a185-4911-4b2e-ab11-5ea1c61c2b69\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-sh4j9" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.036515 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/76a3a185-4911-4b2e-ab11-5ea1c61c2b69-dns-svc\") pod \"dnsmasq-dns-7d67cdfc8f-sh4j9\" (UID: \"76a3a185-4911-4b2e-ab11-5ea1c61c2b69\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-sh4j9" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.036538 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60623dff-0241-4bc9-8b17-de61d7271e19-scripts\") pod \"keystone-bootstrap-xp5gn\" (UID: \"60623dff-0241-4bc9-8b17-de61d7271e19\") " pod="openstack/keystone-bootstrap-xp5gn" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.036569 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkx2w\" (UniqueName: \"kubernetes.io/projected/60623dff-0241-4bc9-8b17-de61d7271e19-kube-api-access-kkx2w\") pod \"keystone-bootstrap-xp5gn\" (UID: \"60623dff-0241-4bc9-8b17-de61d7271e19\") " pod="openstack/keystone-bootstrap-xp5gn" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.036586 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60623dff-0241-4bc9-8b17-de61d7271e19-config-data\") pod \"keystone-bootstrap-xp5gn\" (UID: \"60623dff-0241-4bc9-8b17-de61d7271e19\") " pod="openstack/keystone-bootstrap-xp5gn" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.036606 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-bjfkf\" (UniqueName: \"kubernetes.io/projected/76a3a185-4911-4b2e-ab11-5ea1c61c2b69-kube-api-access-bjfkf\") pod \"dnsmasq-dns-7d67cdfc8f-sh4j9\" (UID: \"76a3a185-4911-4b2e-ab11-5ea1c61c2b69\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-sh4j9" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.037777 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/76a3a185-4911-4b2e-ab11-5ea1c61c2b69-dns-swift-storage-0\") pod \"dnsmasq-dns-7d67cdfc8f-sh4j9\" (UID: \"76a3a185-4911-4b2e-ab11-5ea1c61c2b69\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-sh4j9" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.038055 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/76a3a185-4911-4b2e-ab11-5ea1c61c2b69-ovsdbserver-nb\") pod \"dnsmasq-dns-7d67cdfc8f-sh4j9\" (UID: \"76a3a185-4911-4b2e-ab11-5ea1c61c2b69\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-sh4j9" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.038363 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76a3a185-4911-4b2e-ab11-5ea1c61c2b69-config\") pod \"dnsmasq-dns-7d67cdfc8f-sh4j9\" (UID: \"76a3a185-4911-4b2e-ab11-5ea1c61c2b69\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-sh4j9" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.038889 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/76a3a185-4911-4b2e-ab11-5ea1c61c2b69-ovsdbserver-sb\") pod \"dnsmasq-dns-7d67cdfc8f-sh4j9\" (UID: \"76a3a185-4911-4b2e-ab11-5ea1c61c2b69\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-sh4j9" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.039009 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/76a3a185-4911-4b2e-ab11-5ea1c61c2b69-dns-svc\") pod \"dnsmasq-dns-7d67cdfc8f-sh4j9\" (UID: \"76a3a185-4911-4b2e-ab11-5ea1c61c2b69\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-sh4j9" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.092897 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-zjbsw"] Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.094077 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-zjbsw" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.096283 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjfkf\" (UniqueName: \"kubernetes.io/projected/76a3a185-4911-4b2e-ab11-5ea1c61c2b69-kube-api-access-bjfkf\") pod \"dnsmasq-dns-7d67cdfc8f-sh4j9\" (UID: \"76a3a185-4911-4b2e-ab11-5ea1c61c2b69\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-sh4j9" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.107958 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.108227 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-wf9zm" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.108728 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.138071 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5084e7b9-4923-449e-b0d7-28c602faeff0-scripts\") pod \"cinder-db-sync-zjbsw\" (UID: \"5084e7b9-4923-449e-b0d7-28c602faeff0\") " pod="openstack/cinder-db-sync-zjbsw" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.138181 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/60623dff-0241-4bc9-8b17-de61d7271e19-combined-ca-bundle\") pod \"keystone-bootstrap-xp5gn\" (UID: \"60623dff-0241-4bc9-8b17-de61d7271e19\") " pod="openstack/keystone-bootstrap-xp5gn" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.138213 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5084e7b9-4923-449e-b0d7-28c602faeff0-db-sync-config-data\") pod \"cinder-db-sync-zjbsw\" (UID: \"5084e7b9-4923-449e-b0d7-28c602faeff0\") " pod="openstack/cinder-db-sync-zjbsw" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.138244 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/60623dff-0241-4bc9-8b17-de61d7271e19-credential-keys\") pod \"keystone-bootstrap-xp5gn\" (UID: \"60623dff-0241-4bc9-8b17-de61d7271e19\") " pod="openstack/keystone-bootstrap-xp5gn" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.138285 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbqsc\" (UniqueName: \"kubernetes.io/projected/5084e7b9-4923-449e-b0d7-28c602faeff0-kube-api-access-qbqsc\") pod \"cinder-db-sync-zjbsw\" (UID: \"5084e7b9-4923-449e-b0d7-28c602faeff0\") " pod="openstack/cinder-db-sync-zjbsw" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.138315 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/60623dff-0241-4bc9-8b17-de61d7271e19-fernet-keys\") pod \"keystone-bootstrap-xp5gn\" (UID: \"60623dff-0241-4bc9-8b17-de61d7271e19\") " pod="openstack/keystone-bootstrap-xp5gn" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.138337 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/5084e7b9-4923-449e-b0d7-28c602faeff0-etc-machine-id\") pod \"cinder-db-sync-zjbsw\" (UID: \"5084e7b9-4923-449e-b0d7-28c602faeff0\") " pod="openstack/cinder-db-sync-zjbsw" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.138409 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5084e7b9-4923-449e-b0d7-28c602faeff0-combined-ca-bundle\") pod \"cinder-db-sync-zjbsw\" (UID: \"5084e7b9-4923-449e-b0d7-28c602faeff0\") " pod="openstack/cinder-db-sync-zjbsw" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.138451 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60623dff-0241-4bc9-8b17-de61d7271e19-scripts\") pod \"keystone-bootstrap-xp5gn\" (UID: \"60623dff-0241-4bc9-8b17-de61d7271e19\") " pod="openstack/keystone-bootstrap-xp5gn" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.138486 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5084e7b9-4923-449e-b0d7-28c602faeff0-config-data\") pod \"cinder-db-sync-zjbsw\" (UID: \"5084e7b9-4923-449e-b0d7-28c602faeff0\") " pod="openstack/cinder-db-sync-zjbsw" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.138515 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkx2w\" (UniqueName: \"kubernetes.io/projected/60623dff-0241-4bc9-8b17-de61d7271e19-kube-api-access-kkx2w\") pod \"keystone-bootstrap-xp5gn\" (UID: \"60623dff-0241-4bc9-8b17-de61d7271e19\") " pod="openstack/keystone-bootstrap-xp5gn" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.138542 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60623dff-0241-4bc9-8b17-de61d7271e19-config-data\") pod 
\"keystone-bootstrap-xp5gn\" (UID: \"60623dff-0241-4bc9-8b17-de61d7271e19\") " pod="openstack/keystone-bootstrap-xp5gn" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.142004 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/60623dff-0241-4bc9-8b17-de61d7271e19-fernet-keys\") pod \"keystone-bootstrap-xp5gn\" (UID: \"60623dff-0241-4bc9-8b17-de61d7271e19\") " pod="openstack/keystone-bootstrap-xp5gn" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.144227 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-zjbsw"] Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.149616 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60623dff-0241-4bc9-8b17-de61d7271e19-combined-ca-bundle\") pod \"keystone-bootstrap-xp5gn\" (UID: \"60623dff-0241-4bc9-8b17-de61d7271e19\") " pod="openstack/keystone-bootstrap-xp5gn" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.150013 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60623dff-0241-4bc9-8b17-de61d7271e19-scripts\") pod \"keystone-bootstrap-xp5gn\" (UID: \"60623dff-0241-4bc9-8b17-de61d7271e19\") " pod="openstack/keystone-bootstrap-xp5gn" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.155994 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60623dff-0241-4bc9-8b17-de61d7271e19-config-data\") pod \"keystone-bootstrap-xp5gn\" (UID: \"60623dff-0241-4bc9-8b17-de61d7271e19\") " pod="openstack/keystone-bootstrap-xp5gn" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.160563 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/60623dff-0241-4bc9-8b17-de61d7271e19-credential-keys\") pod 
\"keystone-bootstrap-xp5gn\" (UID: \"60623dff-0241-4bc9-8b17-de61d7271e19\") " pod="openstack/keystone-bootstrap-xp5gn" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.182904 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.197222 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.200557 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.200752 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.233640 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.250602 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5084e7b9-4923-449e-b0d7-28c602faeff0-combined-ca-bundle\") pod \"cinder-db-sync-zjbsw\" (UID: \"5084e7b9-4923-449e-b0d7-28c602faeff0\") " pod="openstack/cinder-db-sync-zjbsw" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.250656 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6698443-b029-4098-81d6-dba6d5f239f2-config-data\") pod \"ceilometer-0\" (UID: \"f6698443-b029-4098-81d6-dba6d5f239f2\") " pod="openstack/ceilometer-0" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.250673 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6698443-b029-4098-81d6-dba6d5f239f2-run-httpd\") pod \"ceilometer-0\" (UID: \"f6698443-b029-4098-81d6-dba6d5f239f2\") " 
pod="openstack/ceilometer-0" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.250699 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6698443-b029-4098-81d6-dba6d5f239f2-scripts\") pod \"ceilometer-0\" (UID: \"f6698443-b029-4098-81d6-dba6d5f239f2\") " pod="openstack/ceilometer-0" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.250720 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5084e7b9-4923-449e-b0d7-28c602faeff0-config-data\") pod \"cinder-db-sync-zjbsw\" (UID: \"5084e7b9-4923-449e-b0d7-28c602faeff0\") " pod="openstack/cinder-db-sync-zjbsw" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.250737 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f6698443-b029-4098-81d6-dba6d5f239f2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f6698443-b029-4098-81d6-dba6d5f239f2\") " pod="openstack/ceilometer-0" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.250775 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drpsg\" (UniqueName: \"kubernetes.io/projected/f6698443-b029-4098-81d6-dba6d5f239f2-kube-api-access-drpsg\") pod \"ceilometer-0\" (UID: \"f6698443-b029-4098-81d6-dba6d5f239f2\") " pod="openstack/ceilometer-0" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.250801 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5084e7b9-4923-449e-b0d7-28c602faeff0-scripts\") pod \"cinder-db-sync-zjbsw\" (UID: \"5084e7b9-4923-449e-b0d7-28c602faeff0\") " pod="openstack/cinder-db-sync-zjbsw" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.250835 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6698443-b029-4098-81d6-dba6d5f239f2-log-httpd\") pod \"ceilometer-0\" (UID: \"f6698443-b029-4098-81d6-dba6d5f239f2\") " pod="openstack/ceilometer-0" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.250863 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5084e7b9-4923-449e-b0d7-28c602faeff0-db-sync-config-data\") pod \"cinder-db-sync-zjbsw\" (UID: \"5084e7b9-4923-449e-b0d7-28c602faeff0\") " pod="openstack/cinder-db-sync-zjbsw" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.250885 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6698443-b029-4098-81d6-dba6d5f239f2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f6698443-b029-4098-81d6-dba6d5f239f2\") " pod="openstack/ceilometer-0" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.250908 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbqsc\" (UniqueName: \"kubernetes.io/projected/5084e7b9-4923-449e-b0d7-28c602faeff0-kube-api-access-qbqsc\") pod \"cinder-db-sync-zjbsw\" (UID: \"5084e7b9-4923-449e-b0d7-28c602faeff0\") " pod="openstack/cinder-db-sync-zjbsw" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.250924 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5084e7b9-4923-449e-b0d7-28c602faeff0-etc-machine-id\") pod \"cinder-db-sync-zjbsw\" (UID: \"5084e7b9-4923-449e-b0d7-28c602faeff0\") " pod="openstack/cinder-db-sync-zjbsw" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.267812 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkx2w\" (UniqueName: 
\"kubernetes.io/projected/60623dff-0241-4bc9-8b17-de61d7271e19-kube-api-access-kkx2w\") pod \"keystone-bootstrap-xp5gn\" (UID: \"60623dff-0241-4bc9-8b17-de61d7271e19\") " pod="openstack/keystone-bootstrap-xp5gn" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.272111 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5084e7b9-4923-449e-b0d7-28c602faeff0-scripts\") pod \"cinder-db-sync-zjbsw\" (UID: \"5084e7b9-4923-449e-b0d7-28c602faeff0\") " pod="openstack/cinder-db-sync-zjbsw" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.272563 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5084e7b9-4923-449e-b0d7-28c602faeff0-etc-machine-id\") pod \"cinder-db-sync-zjbsw\" (UID: \"5084e7b9-4923-449e-b0d7-28c602faeff0\") " pod="openstack/cinder-db-sync-zjbsw" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.279727 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d67cdfc8f-sh4j9" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.282310 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5084e7b9-4923-449e-b0d7-28c602faeff0-config-data\") pod \"cinder-db-sync-zjbsw\" (UID: \"5084e7b9-4923-449e-b0d7-28c602faeff0\") " pod="openstack/cinder-db-sync-zjbsw" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.310729 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-xp5gn" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.335869 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbqsc\" (UniqueName: \"kubernetes.io/projected/5084e7b9-4923-449e-b0d7-28c602faeff0-kube-api-access-qbqsc\") pod \"cinder-db-sync-zjbsw\" (UID: \"5084e7b9-4923-449e-b0d7-28c602faeff0\") " pod="openstack/cinder-db-sync-zjbsw" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.339634 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5084e7b9-4923-449e-b0d7-28c602faeff0-db-sync-config-data\") pod \"cinder-db-sync-zjbsw\" (UID: \"5084e7b9-4923-449e-b0d7-28c602faeff0\") " pod="openstack/cinder-db-sync-zjbsw" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.343494 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5084e7b9-4923-449e-b0d7-28c602faeff0-combined-ca-bundle\") pod \"cinder-db-sync-zjbsw\" (UID: \"5084e7b9-4923-449e-b0d7-28c602faeff0\") " pod="openstack/cinder-db-sync-zjbsw" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.355128 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-b4bcd"] Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.356829 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-b4bcd" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.363965 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.364590 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6698443-b029-4098-81d6-dba6d5f239f2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f6698443-b029-4098-81d6-dba6d5f239f2\") " pod="openstack/ceilometer-0" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.376759 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6698443-b029-4098-81d6-dba6d5f239f2-config-data\") pod \"ceilometer-0\" (UID: \"f6698443-b029-4098-81d6-dba6d5f239f2\") " pod="openstack/ceilometer-0" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.376871 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6698443-b029-4098-81d6-dba6d5f239f2-run-httpd\") pod \"ceilometer-0\" (UID: \"f6698443-b029-4098-81d6-dba6d5f239f2\") " pod="openstack/ceilometer-0" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.376996 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6698443-b029-4098-81d6-dba6d5f239f2-scripts\") pod \"ceilometer-0\" (UID: \"f6698443-b029-4098-81d6-dba6d5f239f2\") " pod="openstack/ceilometer-0" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.377488 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f6698443-b029-4098-81d6-dba6d5f239f2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f6698443-b029-4098-81d6-dba6d5f239f2\") " pod="openstack/ceilometer-0" Feb 19 21:46:09 crc 
kubenswrapper[4795]: I0219 21:46:09.377630 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drpsg\" (UniqueName: \"kubernetes.io/projected/f6698443-b029-4098-81d6-dba6d5f239f2-kube-api-access-drpsg\") pod \"ceilometer-0\" (UID: \"f6698443-b029-4098-81d6-dba6d5f239f2\") " pod="openstack/ceilometer-0" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.377762 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6698443-b029-4098-81d6-dba6d5f239f2-log-httpd\") pod \"ceilometer-0\" (UID: \"f6698443-b029-4098-81d6-dba6d5f239f2\") " pod="openstack/ceilometer-0" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.378931 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6698443-b029-4098-81d6-dba6d5f239f2-log-httpd\") pod \"ceilometer-0\" (UID: \"f6698443-b029-4098-81d6-dba6d5f239f2\") " pod="openstack/ceilometer-0" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.379301 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-nf9cn" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.379909 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6698443-b029-4098-81d6-dba6d5f239f2-run-httpd\") pod \"ceilometer-0\" (UID: \"f6698443-b029-4098-81d6-dba6d5f239f2\") " pod="openstack/ceilometer-0" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.384653 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.414343 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-zjbsw" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.415314 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6698443-b029-4098-81d6-dba6d5f239f2-scripts\") pod \"ceilometer-0\" (UID: \"f6698443-b029-4098-81d6-dba6d5f239f2\") " pod="openstack/ceilometer-0" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.440564 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f6698443-b029-4098-81d6-dba6d5f239f2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f6698443-b029-4098-81d6-dba6d5f239f2\") " pod="openstack/ceilometer-0" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.441117 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drpsg\" (UniqueName: \"kubernetes.io/projected/f6698443-b029-4098-81d6-dba6d5f239f2-kube-api-access-drpsg\") pod \"ceilometer-0\" (UID: \"f6698443-b029-4098-81d6-dba6d5f239f2\") " pod="openstack/ceilometer-0" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.450859 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6698443-b029-4098-81d6-dba6d5f239f2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f6698443-b029-4098-81d6-dba6d5f239f2\") " pod="openstack/ceilometer-0" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.498435 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6698443-b029-4098-81d6-dba6d5f239f2-config-data\") pod \"ceilometer-0\" (UID: \"f6698443-b029-4098-81d6-dba6d5f239f2\") " pod="openstack/ceilometer-0" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.516438 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-jkspq"] Feb 19 21:46:09 crc kubenswrapper[4795]: 
I0219 21:46:09.517641 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-jkspq" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.531503 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec-combined-ca-bundle\") pod \"neutron-db-sync-b4bcd\" (UID: \"1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec\") " pod="openstack/neutron-db-sync-b4bcd" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.531666 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec-config\") pod \"neutron-db-sync-b4bcd\" (UID: \"1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec\") " pod="openstack/neutron-db-sync-b4bcd" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.531689 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6xqk\" (UniqueName: \"kubernetes.io/projected/1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec-kube-api-access-c6xqk\") pod \"neutron-db-sync-b4bcd\" (UID: \"1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec\") " pod="openstack/neutron-db-sync-b4bcd" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.533875 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-bkmsl" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.534012 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.539534 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.600464 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-ttz5x"] Feb 19 21:46:09 
crc kubenswrapper[4795]: I0219 21:46:09.602346 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-ttz5x" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.605930 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-xv49j" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.606121 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.628034 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-ttz5x"] Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.632953 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec-combined-ca-bundle\") pod \"neutron-db-sync-b4bcd\" (UID: \"1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec\") " pod="openstack/neutron-db-sync-b4bcd" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.632994 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/454af6b2-4c9e-4706-a537-b3e3d468353d-config-data\") pod \"placement-db-sync-jkspq\" (UID: \"454af6b2-4c9e-4706-a537-b3e3d468353d\") " pod="openstack/placement-db-sync-jkspq" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.633051 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7sch\" (UniqueName: \"kubernetes.io/projected/e61f40e0-d6c3-49f7-a93f-d9956f086d4b-kube-api-access-k7sch\") pod \"barbican-db-sync-ttz5x\" (UID: \"e61f40e0-d6c3-49f7-a93f-d9956f086d4b\") " pod="openstack/barbican-db-sync-ttz5x" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.633090 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/454af6b2-4c9e-4706-a537-b3e3d468353d-combined-ca-bundle\") pod \"placement-db-sync-jkspq\" (UID: \"454af6b2-4c9e-4706-a537-b3e3d468353d\") " pod="openstack/placement-db-sync-jkspq" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.633126 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e61f40e0-d6c3-49f7-a93f-d9956f086d4b-combined-ca-bundle\") pod \"barbican-db-sync-ttz5x\" (UID: \"e61f40e0-d6c3-49f7-a93f-d9956f086d4b\") " pod="openstack/barbican-db-sync-ttz5x" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.633149 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e61f40e0-d6c3-49f7-a93f-d9956f086d4b-db-sync-config-data\") pod \"barbican-db-sync-ttz5x\" (UID: \"e61f40e0-d6c3-49f7-a93f-d9956f086d4b\") " pod="openstack/barbican-db-sync-ttz5x" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.633187 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/454af6b2-4c9e-4706-a537-b3e3d468353d-logs\") pod \"placement-db-sync-jkspq\" (UID: \"454af6b2-4c9e-4706-a537-b3e3d468353d\") " pod="openstack/placement-db-sync-jkspq" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.633205 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/454af6b2-4c9e-4706-a537-b3e3d468353d-scripts\") pod \"placement-db-sync-jkspq\" (UID: \"454af6b2-4c9e-4706-a537-b3e3d468353d\") " pod="openstack/placement-db-sync-jkspq" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.633222 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec-config\") pod \"neutron-db-sync-b4bcd\" (UID: \"1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec\") " pod="openstack/neutron-db-sync-b4bcd" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.633372 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6xqk\" (UniqueName: \"kubernetes.io/projected/1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec-kube-api-access-c6xqk\") pod \"neutron-db-sync-b4bcd\" (UID: \"1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec\") " pod="openstack/neutron-db-sync-b4bcd" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.633409 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppjmb\" (UniqueName: \"kubernetes.io/projected/454af6b2-4c9e-4706-a537-b3e3d468353d-kube-api-access-ppjmb\") pod \"placement-db-sync-jkspq\" (UID: \"454af6b2-4c9e-4706-a537-b3e3d468353d\") " pod="openstack/placement-db-sync-jkspq" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.641792 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.645389 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec-combined-ca-bundle\") pod \"neutron-db-sync-b4bcd\" (UID: \"1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec\") " pod="openstack/neutron-db-sync-b4bcd" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.646995 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec-config\") pod \"neutron-db-sync-b4bcd\" (UID: \"1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec\") " pod="openstack/neutron-db-sync-b4bcd" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.649122 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/neutron-db-sync-b4bcd"] Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.649337 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6xqk\" (UniqueName: \"kubernetes.io/projected/1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec-kube-api-access-c6xqk\") pod \"neutron-db-sync-b4bcd\" (UID: \"1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec\") " pod="openstack/neutron-db-sync-b4bcd" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.699231 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-jkspq"] Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.700559 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68677f88c9-jp7k7" event={"ID":"8df3e76b-9aa2-476b-aa19-62518a8ddd1e","Type":"ContainerStarted","Data":"d98c932d8b2fc29804500d56b2954097b63f156f2f810e2111cc071a2a6acce2"} Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.720588 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d67cdfc8f-sh4j9"] Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.734836 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e61f40e0-d6c3-49f7-a93f-d9956f086d4b-combined-ca-bundle\") pod \"barbican-db-sync-ttz5x\" (UID: \"e61f40e0-d6c3-49f7-a93f-d9956f086d4b\") " pod="openstack/barbican-db-sync-ttz5x" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.734881 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e61f40e0-d6c3-49f7-a93f-d9956f086d4b-db-sync-config-data\") pod \"barbican-db-sync-ttz5x\" (UID: \"e61f40e0-d6c3-49f7-a93f-d9956f086d4b\") " pod="openstack/barbican-db-sync-ttz5x" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.734933 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/454af6b2-4c9e-4706-a537-b3e3d468353d-logs\") pod \"placement-db-sync-jkspq\" (UID: \"454af6b2-4c9e-4706-a537-b3e3d468353d\") " pod="openstack/placement-db-sync-jkspq" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.734954 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/454af6b2-4c9e-4706-a537-b3e3d468353d-scripts\") pod \"placement-db-sync-jkspq\" (UID: \"454af6b2-4c9e-4706-a537-b3e3d468353d\") " pod="openstack/placement-db-sync-jkspq" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.735004 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppjmb\" (UniqueName: \"kubernetes.io/projected/454af6b2-4c9e-4706-a537-b3e3d468353d-kube-api-access-ppjmb\") pod \"placement-db-sync-jkspq\" (UID: \"454af6b2-4c9e-4706-a537-b3e3d468353d\") " pod="openstack/placement-db-sync-jkspq" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.735064 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/454af6b2-4c9e-4706-a537-b3e3d468353d-config-data\") pod \"placement-db-sync-jkspq\" (UID: \"454af6b2-4c9e-4706-a537-b3e3d468353d\") " pod="openstack/placement-db-sync-jkspq" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.735104 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7sch\" (UniqueName: \"kubernetes.io/projected/e61f40e0-d6c3-49f7-a93f-d9956f086d4b-kube-api-access-k7sch\") pod \"barbican-db-sync-ttz5x\" (UID: \"e61f40e0-d6c3-49f7-a93f-d9956f086d4b\") " pod="openstack/barbican-db-sync-ttz5x" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.735159 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/454af6b2-4c9e-4706-a537-b3e3d468353d-combined-ca-bundle\") pod 
\"placement-db-sync-jkspq\" (UID: \"454af6b2-4c9e-4706-a537-b3e3d468353d\") " pod="openstack/placement-db-sync-jkspq" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.735382 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/454af6b2-4c9e-4706-a537-b3e3d468353d-logs\") pod \"placement-db-sync-jkspq\" (UID: \"454af6b2-4c9e-4706-a537-b3e3d468353d\") " pod="openstack/placement-db-sync-jkspq" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.742376 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/454af6b2-4c9e-4706-a537-b3e3d468353d-scripts\") pod \"placement-db-sync-jkspq\" (UID: \"454af6b2-4c9e-4706-a537-b3e3d468353d\") " pod="openstack/placement-db-sync-jkspq" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.742544 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e61f40e0-d6c3-49f7-a93f-d9956f086d4b-db-sync-config-data\") pod \"barbican-db-sync-ttz5x\" (UID: \"e61f40e0-d6c3-49f7-a93f-d9956f086d4b\") " pod="openstack/barbican-db-sync-ttz5x" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.743432 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/454af6b2-4c9e-4706-a537-b3e3d468353d-config-data\") pod \"placement-db-sync-jkspq\" (UID: \"454af6b2-4c9e-4706-a537-b3e3d468353d\") " pod="openstack/placement-db-sync-jkspq" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.743798 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e61f40e0-d6c3-49f7-a93f-d9956f086d4b-combined-ca-bundle\") pod \"barbican-db-sync-ttz5x\" (UID: \"e61f40e0-d6c3-49f7-a93f-d9956f086d4b\") " pod="openstack/barbican-db-sync-ttz5x" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.750156 
4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.750447 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/454af6b2-4c9e-4706-a537-b3e3d468353d-combined-ca-bundle\") pod \"placement-db-sync-jkspq\" (UID: \"454af6b2-4c9e-4706-a537-b3e3d468353d\") " pod="openstack/placement-db-sync-jkspq" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.751457 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7sch\" (UniqueName: \"kubernetes.io/projected/e61f40e0-d6c3-49f7-a93f-d9956f086d4b-kube-api-access-k7sch\") pod \"barbican-db-sync-ttz5x\" (UID: \"e61f40e0-d6c3-49f7-a93f-d9956f086d4b\") " pod="openstack/barbican-db-sync-ttz5x" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.753650 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppjmb\" (UniqueName: \"kubernetes.io/projected/454af6b2-4c9e-4706-a537-b3e3d468353d-kube-api-access-ppjmb\") pod \"placement-db-sync-jkspq\" (UID: \"454af6b2-4c9e-4706-a537-b3e3d468353d\") " pod="openstack/placement-db-sync-jkspq" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.758505 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67dccc895-rxl4z"] Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.763590 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67dccc895-rxl4z" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.765663 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67dccc895-rxl4z"] Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.814776 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-nf9cn" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.823271 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-b4bcd" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.836872 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/db8db625-527b-49de-bab0-c2065360d792-ovsdbserver-nb\") pod \"dnsmasq-dns-67dccc895-rxl4z\" (UID: \"db8db625-527b-49de-bab0-c2065360d792\") " pod="openstack/dnsmasq-dns-67dccc895-rxl4z" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.836918 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfv2b\" (UniqueName: \"kubernetes.io/projected/db8db625-527b-49de-bab0-c2065360d792-kube-api-access-jfv2b\") pod \"dnsmasq-dns-67dccc895-rxl4z\" (UID: \"db8db625-527b-49de-bab0-c2065360d792\") " pod="openstack/dnsmasq-dns-67dccc895-rxl4z" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.836934 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/db8db625-527b-49de-bab0-c2065360d792-ovsdbserver-sb\") pod \"dnsmasq-dns-67dccc895-rxl4z\" (UID: \"db8db625-527b-49de-bab0-c2065360d792\") " pod="openstack/dnsmasq-dns-67dccc895-rxl4z" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.836951 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/db8db625-527b-49de-bab0-c2065360d792-dns-swift-storage-0\") pod \"dnsmasq-dns-67dccc895-rxl4z\" (UID: \"db8db625-527b-49de-bab0-c2065360d792\") " pod="openstack/dnsmasq-dns-67dccc895-rxl4z" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.836967 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db8db625-527b-49de-bab0-c2065360d792-config\") pod \"dnsmasq-dns-67dccc895-rxl4z\" (UID: \"db8db625-527b-49de-bab0-c2065360d792\") " pod="openstack/dnsmasq-dns-67dccc895-rxl4z" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.837044 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/db8db625-527b-49de-bab0-c2065360d792-dns-svc\") pod \"dnsmasq-dns-67dccc895-rxl4z\" (UID: \"db8db625-527b-49de-bab0-c2065360d792\") " pod="openstack/dnsmasq-dns-67dccc895-rxl4z" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.922755 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-jkspq" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.940268 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/db8db625-527b-49de-bab0-c2065360d792-dns-svc\") pod \"dnsmasq-dns-67dccc895-rxl4z\" (UID: \"db8db625-527b-49de-bab0-c2065360d792\") " pod="openstack/dnsmasq-dns-67dccc895-rxl4z" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.940376 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/db8db625-527b-49de-bab0-c2065360d792-ovsdbserver-nb\") pod \"dnsmasq-dns-67dccc895-rxl4z\" (UID: \"db8db625-527b-49de-bab0-c2065360d792\") " pod="openstack/dnsmasq-dns-67dccc895-rxl4z" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.940403 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfv2b\" (UniqueName: \"kubernetes.io/projected/db8db625-527b-49de-bab0-c2065360d792-kube-api-access-jfv2b\") pod \"dnsmasq-dns-67dccc895-rxl4z\" (UID: \"db8db625-527b-49de-bab0-c2065360d792\") " pod="openstack/dnsmasq-dns-67dccc895-rxl4z" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.940418 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/db8db625-527b-49de-bab0-c2065360d792-ovsdbserver-sb\") pod \"dnsmasq-dns-67dccc895-rxl4z\" (UID: \"db8db625-527b-49de-bab0-c2065360d792\") " pod="openstack/dnsmasq-dns-67dccc895-rxl4z" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.940437 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/db8db625-527b-49de-bab0-c2065360d792-dns-swift-storage-0\") pod \"dnsmasq-dns-67dccc895-rxl4z\" (UID: \"db8db625-527b-49de-bab0-c2065360d792\") " 
pod="openstack/dnsmasq-dns-67dccc895-rxl4z" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.940452 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db8db625-527b-49de-bab0-c2065360d792-config\") pod \"dnsmasq-dns-67dccc895-rxl4z\" (UID: \"db8db625-527b-49de-bab0-c2065360d792\") " pod="openstack/dnsmasq-dns-67dccc895-rxl4z" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.941694 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/db8db625-527b-49de-bab0-c2065360d792-dns-svc\") pod \"dnsmasq-dns-67dccc895-rxl4z\" (UID: \"db8db625-527b-49de-bab0-c2065360d792\") " pod="openstack/dnsmasq-dns-67dccc895-rxl4z" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.943142 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/db8db625-527b-49de-bab0-c2065360d792-ovsdbserver-nb\") pod \"dnsmasq-dns-67dccc895-rxl4z\" (UID: \"db8db625-527b-49de-bab0-c2065360d792\") " pod="openstack/dnsmasq-dns-67dccc895-rxl4z" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.945085 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/db8db625-527b-49de-bab0-c2065360d792-ovsdbserver-sb\") pod \"dnsmasq-dns-67dccc895-rxl4z\" (UID: \"db8db625-527b-49de-bab0-c2065360d792\") " pod="openstack/dnsmasq-dns-67dccc895-rxl4z" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.946292 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/db8db625-527b-49de-bab0-c2065360d792-dns-swift-storage-0\") pod \"dnsmasq-dns-67dccc895-rxl4z\" (UID: \"db8db625-527b-49de-bab0-c2065360d792\") " pod="openstack/dnsmasq-dns-67dccc895-rxl4z" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.947926 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db8db625-527b-49de-bab0-c2065360d792-config\") pod \"dnsmasq-dns-67dccc895-rxl4z\" (UID: \"db8db625-527b-49de-bab0-c2065360d792\") " pod="openstack/dnsmasq-dns-67dccc895-rxl4z" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.955332 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-ttz5x" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.974588 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-xp5gn"] Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.985586 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfv2b\" (UniqueName: \"kubernetes.io/projected/db8db625-527b-49de-bab0-c2065360d792-kube-api-access-jfv2b\") pod \"dnsmasq-dns-67dccc895-rxl4z\" (UID: \"db8db625-527b-49de-bab0-c2065360d792\") " pod="openstack/dnsmasq-dns-67dccc895-rxl4z" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.989219 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.991818 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 21:46:09 crc kubenswrapper[4795]: I0219 21:46:09.996701 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:09.999026 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-6rhfn" Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.003526 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.010838 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.047685 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9x6w\" (UniqueName: \"kubernetes.io/projected/5f9a41f3-9dae-4426-8687-368f5911a834-kube-api-access-r9x6w\") pod \"glance-default-internal-api-0\" (UID: \"5f9a41f3-9dae-4426-8687-368f5911a834\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.047752 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f9a41f3-9dae-4426-8687-368f5911a834-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5f9a41f3-9dae-4426-8687-368f5911a834\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.047777 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5f9a41f3-9dae-4426-8687-368f5911a834-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5f9a41f3-9dae-4426-8687-368f5911a834\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:46:10 crc 
kubenswrapper[4795]: I0219 21:46:10.047805 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f9a41f3-9dae-4426-8687-368f5911a834-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5f9a41f3-9dae-4426-8687-368f5911a834\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.047836 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"5f9a41f3-9dae-4426-8687-368f5911a834\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.047915 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f9a41f3-9dae-4426-8687-368f5911a834-logs\") pod \"glance-default-internal-api-0\" (UID: \"5f9a41f3-9dae-4426-8687-368f5911a834\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.048248 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f9a41f3-9dae-4426-8687-368f5911a834-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5f9a41f3-9dae-4426-8687-368f5911a834\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:46:10 crc kubenswrapper[4795]: W0219 21:46:10.053903 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60623dff_0241_4bc9_8b17_de61d7271e19.slice/crio-5919e2bd3e6456e8a3e11cb00ab17617e176207d63da18ffb2be8092647e71a8 WatchSource:0}: Error finding container 5919e2bd3e6456e8a3e11cb00ab17617e176207d63da18ffb2be8092647e71a8: Status 404 
returned error can't find the container with id 5919e2bd3e6456e8a3e11cb00ab17617e176207d63da18ffb2be8092647e71a8 Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.086878 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.096690 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67dccc895-rxl4z" Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.109410 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.110755 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.125765 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.136725 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.183850 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.187852 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"5f9a41f3-9dae-4426-8687-368f5911a834\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.187939 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f9a41f3-9dae-4426-8687-368f5911a834-logs\") pod \"glance-default-internal-api-0\" (UID: \"5f9a41f3-9dae-4426-8687-368f5911a834\") " 
pod="openstack/glance-default-internal-api-0" Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.187976 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f9a41f3-9dae-4426-8687-368f5911a834-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5f9a41f3-9dae-4426-8687-368f5911a834\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.188017 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9x6w\" (UniqueName: \"kubernetes.io/projected/5f9a41f3-9dae-4426-8687-368f5911a834-kube-api-access-r9x6w\") pod \"glance-default-internal-api-0\" (UID: \"5f9a41f3-9dae-4426-8687-368f5911a834\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.188069 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f9a41f3-9dae-4426-8687-368f5911a834-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5f9a41f3-9dae-4426-8687-368f5911a834\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.188092 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5f9a41f3-9dae-4426-8687-368f5911a834-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5f9a41f3-9dae-4426-8687-368f5911a834\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.188181 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f9a41f3-9dae-4426-8687-368f5911a834-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5f9a41f3-9dae-4426-8687-368f5911a834\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:46:10 
crc kubenswrapper[4795]: W0219 21:46:10.192863 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5084e7b9_4923_449e_b0d7_28c602faeff0.slice/crio-afea16e4a0f822c8b9769634b3e5dede9add2d921ddeb3308c942ab11c4579f9 WatchSource:0}: Error finding container afea16e4a0f822c8b9769634b3e5dede9add2d921ddeb3308c942ab11c4579f9: Status 404 returned error can't find the container with id afea16e4a0f822c8b9769634b3e5dede9add2d921ddeb3308c942ab11c4579f9 Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.193901 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"5f9a41f3-9dae-4426-8687-368f5911a834\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0" Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.194228 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5f9a41f3-9dae-4426-8687-368f5911a834-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5f9a41f3-9dae-4426-8687-368f5911a834\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.194966 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f9a41f3-9dae-4426-8687-368f5911a834-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5f9a41f3-9dae-4426-8687-368f5911a834\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.195786 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f9a41f3-9dae-4426-8687-368f5911a834-scripts\") pod \"glance-default-internal-api-0\" (UID: 
\"5f9a41f3-9dae-4426-8687-368f5911a834\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.196465 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f9a41f3-9dae-4426-8687-368f5911a834-logs\") pod \"glance-default-internal-api-0\" (UID: \"5f9a41f3-9dae-4426-8687-368f5911a834\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.208719 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f9a41f3-9dae-4426-8687-368f5911a834-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5f9a41f3-9dae-4426-8687-368f5911a834\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.227648 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9x6w\" (UniqueName: \"kubernetes.io/projected/5f9a41f3-9dae-4426-8687-368f5911a834-kube-api-access-r9x6w\") pod \"glance-default-internal-api-0\" (UID: \"5f9a41f3-9dae-4426-8687-368f5911a834\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.262054 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"5f9a41f3-9dae-4426-8687-368f5911a834\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.268911 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d67cdfc8f-sh4j9"] Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.290757 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2b668591-0c11-42f0-b813-94b76c8cbd1b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2b668591-0c11-42f0-b813-94b76c8cbd1b\") " pod="openstack/glance-default-external-api-0" Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.290838 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b668591-0c11-42f0-b813-94b76c8cbd1b-scripts\") pod \"glance-default-external-api-0\" (UID: \"2b668591-0c11-42f0-b813-94b76c8cbd1b\") " pod="openstack/glance-default-external-api-0" Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.290887 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"2b668591-0c11-42f0-b813-94b76c8cbd1b\") " pod="openstack/glance-default-external-api-0" Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.291412 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b668591-0c11-42f0-b813-94b76c8cbd1b-config-data\") pod \"glance-default-external-api-0\" (UID: \"2b668591-0c11-42f0-b813-94b76c8cbd1b\") " pod="openstack/glance-default-external-api-0" Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.291458 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2rcc\" (UniqueName: \"kubernetes.io/projected/2b668591-0c11-42f0-b813-94b76c8cbd1b-kube-api-access-v2rcc\") pod \"glance-default-external-api-0\" (UID: \"2b668591-0c11-42f0-b813-94b76c8cbd1b\") " pod="openstack/glance-default-external-api-0" Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.291634 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" 
(UniqueName: \"kubernetes.io/empty-dir/2b668591-0c11-42f0-b813-94b76c8cbd1b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2b668591-0c11-42f0-b813-94b76c8cbd1b\") " pod="openstack/glance-default-external-api-0" Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.291668 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b668591-0c11-42f0-b813-94b76c8cbd1b-logs\") pod \"glance-default-external-api-0\" (UID: \"2b668591-0c11-42f0-b813-94b76c8cbd1b\") " pod="openstack/glance-default-external-api-0" Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.308070 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-zjbsw"] Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.315960 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.325475 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-b4bcd"] Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.393357 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b668591-0c11-42f0-b813-94b76c8cbd1b-config-data\") pod \"glance-default-external-api-0\" (UID: \"2b668591-0c11-42f0-b813-94b76c8cbd1b\") " pod="openstack/glance-default-external-api-0" Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.400119 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2rcc\" (UniqueName: \"kubernetes.io/projected/2b668591-0c11-42f0-b813-94b76c8cbd1b-kube-api-access-v2rcc\") pod \"glance-default-external-api-0\" (UID: \"2b668591-0c11-42f0-b813-94b76c8cbd1b\") " pod="openstack/glance-default-external-api-0" Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.400294 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2b668591-0c11-42f0-b813-94b76c8cbd1b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2b668591-0c11-42f0-b813-94b76c8cbd1b\") " pod="openstack/glance-default-external-api-0" Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.400369 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b668591-0c11-42f0-b813-94b76c8cbd1b-logs\") pod \"glance-default-external-api-0\" (UID: \"2b668591-0c11-42f0-b813-94b76c8cbd1b\") " pod="openstack/glance-default-external-api-0" Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.400502 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b668591-0c11-42f0-b813-94b76c8cbd1b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2b668591-0c11-42f0-b813-94b76c8cbd1b\") " pod="openstack/glance-default-external-api-0" Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.400664 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b668591-0c11-42f0-b813-94b76c8cbd1b-scripts\") pod \"glance-default-external-api-0\" (UID: \"2b668591-0c11-42f0-b813-94b76c8cbd1b\") " pod="openstack/glance-default-external-api-0" Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.400755 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"2b668591-0c11-42f0-b813-94b76c8cbd1b\") " pod="openstack/glance-default-external-api-0" Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.401036 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/2b668591-0c11-42f0-b813-94b76c8cbd1b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2b668591-0c11-42f0-b813-94b76c8cbd1b\") " pod="openstack/glance-default-external-api-0" Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.401061 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b668591-0c11-42f0-b813-94b76c8cbd1b-logs\") pod \"glance-default-external-api-0\" (UID: \"2b668591-0c11-42f0-b813-94b76c8cbd1b\") " pod="openstack/glance-default-external-api-0" Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.401302 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"2b668591-0c11-42f0-b813-94b76c8cbd1b\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.413308 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b668591-0c11-42f0-b813-94b76c8cbd1b-scripts\") pod \"glance-default-external-api-0\" (UID: \"2b668591-0c11-42f0-b813-94b76c8cbd1b\") " pod="openstack/glance-default-external-api-0" Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.415835 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b668591-0c11-42f0-b813-94b76c8cbd1b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2b668591-0c11-42f0-b813-94b76c8cbd1b\") " pod="openstack/glance-default-external-api-0" Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.420272 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b668591-0c11-42f0-b813-94b76c8cbd1b-config-data\") pod 
\"glance-default-external-api-0\" (UID: \"2b668591-0c11-42f0-b813-94b76c8cbd1b\") " pod="openstack/glance-default-external-api-0" Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.422903 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2rcc\" (UniqueName: \"kubernetes.io/projected/2b668591-0c11-42f0-b813-94b76c8cbd1b-kube-api-access-v2rcc\") pod \"glance-default-external-api-0\" (UID: \"2b668591-0c11-42f0-b813-94b76c8cbd1b\") " pod="openstack/glance-default-external-api-0" Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.447697 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"2b668591-0c11-42f0-b813-94b76c8cbd1b\") " pod="openstack/glance-default-external-api-0" Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.641610 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-ttz5x"] Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.649330 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-jkspq"] Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.664313 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67dccc895-rxl4z"] Feb 19 21:46:10 crc kubenswrapper[4795]: W0219 21:46:10.671255 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddb8db625_527b_49de_bab0_c2065360d792.slice/crio-4fa1b7dc967ebd0720fc00eaa4e4599d7f0abf1d8d22f716edb0032cd429d0aa WatchSource:0}: Error finding container 4fa1b7dc967ebd0720fc00eaa4e4599d7f0abf1d8d22f716edb0032cd429d0aa: Status 404 returned error can't find the container with id 4fa1b7dc967ebd0720fc00eaa4e4599d7f0abf1d8d22f716edb0032cd429d0aa Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.715188 4795 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xp5gn" event={"ID":"60623dff-0241-4bc9-8b17-de61d7271e19","Type":"ContainerStarted","Data":"9ae6b68ead4d625967dc9f8e67a83bcadf072cd6d6b5d617f89d10bb543678b8"} Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.715252 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xp5gn" event={"ID":"60623dff-0241-4bc9-8b17-de61d7271e19","Type":"ContainerStarted","Data":"5919e2bd3e6456e8a3e11cb00ab17617e176207d63da18ffb2be8092647e71a8"} Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.719013 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.719797 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f6698443-b029-4098-81d6-dba6d5f239f2","Type":"ContainerStarted","Data":"333abcac5e9ec8b6d96f2784182bddc2611944e9296eb36664d925e9b90b96b2"} Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.721090 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-jkspq" event={"ID":"454af6b2-4c9e-4706-a537-b3e3d468353d","Type":"ContainerStarted","Data":"2f56b820ecab38a5f4f0b9d439b2a430292837a387794fcf0430bf0243ab99f1"} Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.721855 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-zjbsw" event={"ID":"5084e7b9-4923-449e-b0d7-28c602faeff0","Type":"ContainerStarted","Data":"afea16e4a0f822c8b9769634b3e5dede9add2d921ddeb3308c942ab11c4579f9"} Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.723219 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-ttz5x" event={"ID":"e61f40e0-d6c3-49f7-a93f-d9956f086d4b","Type":"ContainerStarted","Data":"e8b62c3b7599823f742eaf7d07a077b7a0c11e4d7c8b7810bc63fd0eb218d413"} Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 
21:46:10.725089 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-b4bcd" event={"ID":"1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec","Type":"ContainerStarted","Data":"b0effc166b07beb1020b33c2901a97075e3341db6cb3398e72144f393ccb6850"} Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.725129 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-b4bcd" event={"ID":"1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec","Type":"ContainerStarted","Data":"9f412f3971266eb4c52987ab5a1d19a79a80d58662b3e55c9fe1df068708db16"} Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.728708 4795 generic.go:334] "Generic (PLEG): container finished" podID="8df3e76b-9aa2-476b-aa19-62518a8ddd1e" containerID="0896c578e55d0f5168d156a0553e2f4852050668424c462319ac2e65528da7b8" exitCode=0 Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.728785 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68677f88c9-jp7k7" event={"ID":"8df3e76b-9aa2-476b-aa19-62518a8ddd1e","Type":"ContainerDied","Data":"0896c578e55d0f5168d156a0553e2f4852050668424c462319ac2e65528da7b8"} Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.740933 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-xp5gn" podStartSLOduration=2.740917479 podStartE2EDuration="2.740917479s" podCreationTimestamp="2026-02-19 21:46:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:46:10.737427801 +0000 UTC m=+1081.929945695" watchObservedRunningTime="2026-02-19 21:46:10.740917479 +0000 UTC m=+1081.933435343" Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.744180 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67dccc895-rxl4z" 
event={"ID":"db8db625-527b-49de-bab0-c2065360d792","Type":"ContainerStarted","Data":"4fa1b7dc967ebd0720fc00eaa4e4599d7f0abf1d8d22f716edb0032cd429d0aa"} Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.749610 4795 generic.go:334] "Generic (PLEG): container finished" podID="76a3a185-4911-4b2e-ab11-5ea1c61c2b69" containerID="729c91e32f5713fa403f44c9ceb7ddfc0360242d02f92dad958a357428c7d54a" exitCode=0 Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.749653 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d67cdfc8f-sh4j9" event={"ID":"76a3a185-4911-4b2e-ab11-5ea1c61c2b69","Type":"ContainerDied","Data":"729c91e32f5713fa403f44c9ceb7ddfc0360242d02f92dad958a357428c7d54a"} Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.749695 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d67cdfc8f-sh4j9" event={"ID":"76a3a185-4911-4b2e-ab11-5ea1c61c2b69","Type":"ContainerStarted","Data":"1deaacbccd11506cbf41e18d38101cecbd3bd4809459a1cb284d3391b2eee2e5"} Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.768292 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-b4bcd" podStartSLOduration=1.768269791 podStartE2EDuration="1.768269791s" podCreationTimestamp="2026-02-19 21:46:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:46:10.75688568 +0000 UTC m=+1081.949403544" watchObservedRunningTime="2026-02-19 21:46:10.768269791 +0000 UTC m=+1081.960787665" Feb 19 21:46:10 crc kubenswrapper[4795]: I0219 21:46:10.948312 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.301218 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-68677f88c9-jp7k7" Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.324044 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d67cdfc8f-sh4j9" Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.424065 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8df3e76b-9aa2-476b-aa19-62518a8ddd1e-dns-swift-storage-0\") pod \"8df3e76b-9aa2-476b-aa19-62518a8ddd1e\" (UID: \"8df3e76b-9aa2-476b-aa19-62518a8ddd1e\") " Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.424135 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/76a3a185-4911-4b2e-ab11-5ea1c61c2b69-ovsdbserver-nb\") pod \"76a3a185-4911-4b2e-ab11-5ea1c61c2b69\" (UID: \"76a3a185-4911-4b2e-ab11-5ea1c61c2b69\") " Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.424238 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76a3a185-4911-4b2e-ab11-5ea1c61c2b69-config\") pod \"76a3a185-4911-4b2e-ab11-5ea1c61c2b69\" (UID: \"76a3a185-4911-4b2e-ab11-5ea1c61c2b69\") " Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.424265 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crvcs\" (UniqueName: \"kubernetes.io/projected/8df3e76b-9aa2-476b-aa19-62518a8ddd1e-kube-api-access-crvcs\") pod \"8df3e76b-9aa2-476b-aa19-62518a8ddd1e\" (UID: \"8df3e76b-9aa2-476b-aa19-62518a8ddd1e\") " Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.424296 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8df3e76b-9aa2-476b-aa19-62518a8ddd1e-config\") pod \"8df3e76b-9aa2-476b-aa19-62518a8ddd1e\" (UID: 
\"8df3e76b-9aa2-476b-aa19-62518a8ddd1e\") " Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.424345 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/76a3a185-4911-4b2e-ab11-5ea1c61c2b69-dns-swift-storage-0\") pod \"76a3a185-4911-4b2e-ab11-5ea1c61c2b69\" (UID: \"76a3a185-4911-4b2e-ab11-5ea1c61c2b69\") " Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.424369 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8df3e76b-9aa2-476b-aa19-62518a8ddd1e-ovsdbserver-sb\") pod \"8df3e76b-9aa2-476b-aa19-62518a8ddd1e\" (UID: \"8df3e76b-9aa2-476b-aa19-62518a8ddd1e\") " Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.424388 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8df3e76b-9aa2-476b-aa19-62518a8ddd1e-dns-svc\") pod \"8df3e76b-9aa2-476b-aa19-62518a8ddd1e\" (UID: \"8df3e76b-9aa2-476b-aa19-62518a8ddd1e\") " Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.424408 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/76a3a185-4911-4b2e-ab11-5ea1c61c2b69-dns-svc\") pod \"76a3a185-4911-4b2e-ab11-5ea1c61c2b69\" (UID: \"76a3a185-4911-4b2e-ab11-5ea1c61c2b69\") " Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.424431 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8df3e76b-9aa2-476b-aa19-62518a8ddd1e-ovsdbserver-nb\") pod \"8df3e76b-9aa2-476b-aa19-62518a8ddd1e\" (UID: \"8df3e76b-9aa2-476b-aa19-62518a8ddd1e\") " Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.424446 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/76a3a185-4911-4b2e-ab11-5ea1c61c2b69-ovsdbserver-sb\") pod \"76a3a185-4911-4b2e-ab11-5ea1c61c2b69\" (UID: \"76a3a185-4911-4b2e-ab11-5ea1c61c2b69\") " Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.424482 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjfkf\" (UniqueName: \"kubernetes.io/projected/76a3a185-4911-4b2e-ab11-5ea1c61c2b69-kube-api-access-bjfkf\") pod \"76a3a185-4911-4b2e-ab11-5ea1c61c2b69\" (UID: \"76a3a185-4911-4b2e-ab11-5ea1c61c2b69\") " Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.445924 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8df3e76b-9aa2-476b-aa19-62518a8ddd1e-kube-api-access-crvcs" (OuterVolumeSpecName: "kube-api-access-crvcs") pod "8df3e76b-9aa2-476b-aa19-62518a8ddd1e" (UID: "8df3e76b-9aa2-476b-aa19-62518a8ddd1e"). InnerVolumeSpecName "kube-api-access-crvcs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.464003 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.466621 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76a3a185-4911-4b2e-ab11-5ea1c61c2b69-kube-api-access-bjfkf" (OuterVolumeSpecName: "kube-api-access-bjfkf") pod "76a3a185-4911-4b2e-ab11-5ea1c61c2b69" (UID: "76a3a185-4911-4b2e-ab11-5ea1c61c2b69"). InnerVolumeSpecName "kube-api-access-bjfkf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:46:11 crc kubenswrapper[4795]: W0219 21:46:11.513644 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b668591_0c11_42f0_b813_94b76c8cbd1b.slice/crio-2d8f5079a7199ad590b8e7ae61509c061499525039c4df8317fdb2a9d4218e70 WatchSource:0}: Error finding container 2d8f5079a7199ad590b8e7ae61509c061499525039c4df8317fdb2a9d4218e70: Status 404 returned error can't find the container with id 2d8f5079a7199ad590b8e7ae61509c061499525039c4df8317fdb2a9d4218e70 Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.518533 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76a3a185-4911-4b2e-ab11-5ea1c61c2b69-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "76a3a185-4911-4b2e-ab11-5ea1c61c2b69" (UID: "76a3a185-4911-4b2e-ab11-5ea1c61c2b69"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.520700 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76a3a185-4911-4b2e-ab11-5ea1c61c2b69-config" (OuterVolumeSpecName: "config") pod "76a3a185-4911-4b2e-ab11-5ea1c61c2b69" (UID: "76a3a185-4911-4b2e-ab11-5ea1c61c2b69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.525843 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjfkf\" (UniqueName: \"kubernetes.io/projected/76a3a185-4911-4b2e-ab11-5ea1c61c2b69-kube-api-access-bjfkf\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.525871 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76a3a185-4911-4b2e-ab11-5ea1c61c2b69-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.525881 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-crvcs\" (UniqueName: \"kubernetes.io/projected/8df3e76b-9aa2-476b-aa19-62518a8ddd1e-kube-api-access-crvcs\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.525891 4795 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/76a3a185-4911-4b2e-ab11-5ea1c61c2b69-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.527434 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8df3e76b-9aa2-476b-aa19-62518a8ddd1e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8df3e76b-9aa2-476b-aa19-62518a8ddd1e" (UID: "8df3e76b-9aa2-476b-aa19-62518a8ddd1e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.532080 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8df3e76b-9aa2-476b-aa19-62518a8ddd1e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8df3e76b-9aa2-476b-aa19-62518a8ddd1e" (UID: "8df3e76b-9aa2-476b-aa19-62518a8ddd1e"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.539081 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8df3e76b-9aa2-476b-aa19-62518a8ddd1e-config" (OuterVolumeSpecName: "config") pod "8df3e76b-9aa2-476b-aa19-62518a8ddd1e" (UID: "8df3e76b-9aa2-476b-aa19-62518a8ddd1e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.540089 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76a3a185-4911-4b2e-ab11-5ea1c61c2b69-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "76a3a185-4911-4b2e-ab11-5ea1c61c2b69" (UID: "76a3a185-4911-4b2e-ab11-5ea1c61c2b69"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.541928 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76a3a185-4911-4b2e-ab11-5ea1c61c2b69-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "76a3a185-4911-4b2e-ab11-5ea1c61c2b69" (UID: "76a3a185-4911-4b2e-ab11-5ea1c61c2b69"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.546997 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8df3e76b-9aa2-476b-aa19-62518a8ddd1e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8df3e76b-9aa2-476b-aa19-62518a8ddd1e" (UID: "8df3e76b-9aa2-476b-aa19-62518a8ddd1e"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.548141 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8df3e76b-9aa2-476b-aa19-62518a8ddd1e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8df3e76b-9aa2-476b-aa19-62518a8ddd1e" (UID: "8df3e76b-9aa2-476b-aa19-62518a8ddd1e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.560670 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76a3a185-4911-4b2e-ab11-5ea1c61c2b69-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "76a3a185-4911-4b2e-ab11-5ea1c61c2b69" (UID: "76a3a185-4911-4b2e-ab11-5ea1c61c2b69"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.628006 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8df3e76b-9aa2-476b-aa19-62518a8ddd1e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.628545 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8df3e76b-9aa2-476b-aa19-62518a8ddd1e-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.628562 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/76a3a185-4911-4b2e-ab11-5ea1c61c2b69-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.628756 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8df3e76b-9aa2-476b-aa19-62518a8ddd1e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:11 crc 
kubenswrapper[4795]: I0219 21:46:11.629301 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/76a3a185-4911-4b2e-ab11-5ea1c61c2b69-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.629460 4795 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8df3e76b-9aa2-476b-aa19-62518a8ddd1e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.629481 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/76a3a185-4911-4b2e-ab11-5ea1c61c2b69-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.629532 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8df3e76b-9aa2-476b-aa19-62518a8ddd1e-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.762659 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68677f88c9-jp7k7" event={"ID":"8df3e76b-9aa2-476b-aa19-62518a8ddd1e","Type":"ContainerDied","Data":"d98c932d8b2fc29804500d56b2954097b63f156f2f810e2111cc071a2a6acce2"} Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.762679 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-68677f88c9-jp7k7" Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.762717 4795 scope.go:117] "RemoveContainer" containerID="0896c578e55d0f5168d156a0553e2f4852050668424c462319ac2e65528da7b8" Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.765623 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5f9a41f3-9dae-4426-8687-368f5911a834","Type":"ContainerStarted","Data":"74e861972e0b08341b7578c99f45877ca28dee930187dfc7c5d68702065ba963"} Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.766813 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2b668591-0c11-42f0-b813-94b76c8cbd1b","Type":"ContainerStarted","Data":"2d8f5079a7199ad590b8e7ae61509c061499525039c4df8317fdb2a9d4218e70"} Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.768885 4795 generic.go:334] "Generic (PLEG): container finished" podID="db8db625-527b-49de-bab0-c2065360d792" containerID="3f9a72a27cc208f60203c5a7cd9bfc613d8657082c6f35b5044badc8bc0caace" exitCode=0 Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.769090 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67dccc895-rxl4z" event={"ID":"db8db625-527b-49de-bab0-c2065360d792","Type":"ContainerDied","Data":"3f9a72a27cc208f60203c5a7cd9bfc613d8657082c6f35b5044badc8bc0caace"} Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.772709 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d67cdfc8f-sh4j9" event={"ID":"76a3a185-4911-4b2e-ab11-5ea1c61c2b69","Type":"ContainerDied","Data":"1deaacbccd11506cbf41e18d38101cecbd3bd4809459a1cb284d3391b2eee2e5"} Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.772776 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d67cdfc8f-sh4j9" Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.913527 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68677f88c9-jp7k7"] Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.956831 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-68677f88c9-jp7k7"] Feb 19 21:46:11 crc kubenswrapper[4795]: I0219 21:46:11.987229 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d67cdfc8f-sh4j9"] Feb 19 21:46:12 crc kubenswrapper[4795]: I0219 21:46:11.999477 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7d67cdfc8f-sh4j9"] Feb 19 21:46:12 crc kubenswrapper[4795]: I0219 21:46:12.007688 4795 scope.go:117] "RemoveContainer" containerID="729c91e32f5713fa403f44c9ceb7ddfc0360242d02f92dad958a357428c7d54a" Feb 19 21:46:12 crc kubenswrapper[4795]: I0219 21:46:12.086462 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:46:12 crc kubenswrapper[4795]: I0219 21:46:12.124646 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 21:46:12 crc kubenswrapper[4795]: I0219 21:46:12.200603 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 21:46:12 crc kubenswrapper[4795]: I0219 21:46:12.800773 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5f9a41f3-9dae-4426-8687-368f5911a834","Type":"ContainerStarted","Data":"cbf59bdd46660f6210995b0eb2fc2126976500e2a6c6cee2fb345b41a51900b2"} Feb 19 21:46:12 crc kubenswrapper[4795]: I0219 21:46:12.804606 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"2b668591-0c11-42f0-b813-94b76c8cbd1b","Type":"ContainerStarted","Data":"f5cfd3f9869f87af702a2e0e738d0b4feb92ba480863b5a46d242043d700ac19"} Feb 19 21:46:12 crc kubenswrapper[4795]: I0219 21:46:12.807066 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67dccc895-rxl4z" event={"ID":"db8db625-527b-49de-bab0-c2065360d792","Type":"ContainerStarted","Data":"c943900c340586ccd6fb6dfea1e9575c93f20a12cfb4d8dd7d04eaf8b69bb4cb"} Feb 19 21:46:12 crc kubenswrapper[4795]: I0219 21:46:12.807389 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-67dccc895-rxl4z" Feb 19 21:46:12 crc kubenswrapper[4795]: I0219 21:46:12.828219 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-67dccc895-rxl4z" podStartSLOduration=3.828205896 podStartE2EDuration="3.828205896s" podCreationTimestamp="2026-02-19 21:46:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:46:12.827585998 +0000 UTC m=+1084.020103862" watchObservedRunningTime="2026-02-19 21:46:12.828205896 +0000 UTC m=+1084.020723760" Feb 19 21:46:13 crc kubenswrapper[4795]: I0219 21:46:13.520862 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76a3a185-4911-4b2e-ab11-5ea1c61c2b69" path="/var/lib/kubelet/pods/76a3a185-4911-4b2e-ab11-5ea1c61c2b69/volumes" Feb 19 21:46:13 crc kubenswrapper[4795]: I0219 21:46:13.523550 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8df3e76b-9aa2-476b-aa19-62518a8ddd1e" path="/var/lib/kubelet/pods/8df3e76b-9aa2-476b-aa19-62518a8ddd1e/volumes" Feb 19 21:46:13 crc kubenswrapper[4795]: I0219 21:46:13.818075 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"2b668591-0c11-42f0-b813-94b76c8cbd1b","Type":"ContainerStarted","Data":"ecaa7d4a6273a3af65b1f5d9585a04b8dd375c28b94a4127bae9a591c4a91d3e"} Feb 19 21:46:13 crc kubenswrapper[4795]: I0219 21:46:13.818249 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="2b668591-0c11-42f0-b813-94b76c8cbd1b" containerName="glance-log" containerID="cri-o://f5cfd3f9869f87af702a2e0e738d0b4feb92ba480863b5a46d242043d700ac19" gracePeriod=30 Feb 19 21:46:13 crc kubenswrapper[4795]: I0219 21:46:13.818341 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="2b668591-0c11-42f0-b813-94b76c8cbd1b" containerName="glance-httpd" containerID="cri-o://ecaa7d4a6273a3af65b1f5d9585a04b8dd375c28b94a4127bae9a591c4a91d3e" gracePeriod=30 Feb 19 21:46:13 crc kubenswrapper[4795]: I0219 21:46:13.825007 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5f9a41f3-9dae-4426-8687-368f5911a834","Type":"ContainerStarted","Data":"a8c2432cd1a201f5714626d8c9b69ee61c26692ea73aa3955e3d0f4d2094d962"} Feb 19 21:46:13 crc kubenswrapper[4795]: I0219 21:46:13.825137 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="5f9a41f3-9dae-4426-8687-368f5911a834" containerName="glance-log" containerID="cri-o://cbf59bdd46660f6210995b0eb2fc2126976500e2a6c6cee2fb345b41a51900b2" gracePeriod=30 Feb 19 21:46:13 crc kubenswrapper[4795]: I0219 21:46:13.825275 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="5f9a41f3-9dae-4426-8687-368f5911a834" containerName="glance-httpd" containerID="cri-o://a8c2432cd1a201f5714626d8c9b69ee61c26692ea73aa3955e3d0f4d2094d962" gracePeriod=30 Feb 19 21:46:13 crc kubenswrapper[4795]: I0219 21:46:13.841607 4795 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.841588076 podStartE2EDuration="4.841588076s" podCreationTimestamp="2026-02-19 21:46:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:46:13.835455373 +0000 UTC m=+1085.027973237" watchObservedRunningTime="2026-02-19 21:46:13.841588076 +0000 UTC m=+1085.034105940" Feb 19 21:46:13 crc kubenswrapper[4795]: I0219 21:46:13.877745 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.877719575 podStartE2EDuration="5.877719575s" podCreationTimestamp="2026-02-19 21:46:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:46:13.869655738 +0000 UTC m=+1085.062173622" watchObservedRunningTime="2026-02-19 21:46:13.877719575 +0000 UTC m=+1085.070237439" Feb 19 21:46:14 crc kubenswrapper[4795]: I0219 21:46:14.835368 4795 generic.go:334] "Generic (PLEG): container finished" podID="5f9a41f3-9dae-4426-8687-368f5911a834" containerID="a8c2432cd1a201f5714626d8c9b69ee61c26692ea73aa3955e3d0f4d2094d962" exitCode=0 Feb 19 21:46:14 crc kubenswrapper[4795]: I0219 21:46:14.835638 4795 generic.go:334] "Generic (PLEG): container finished" podID="5f9a41f3-9dae-4426-8687-368f5911a834" containerID="cbf59bdd46660f6210995b0eb2fc2126976500e2a6c6cee2fb345b41a51900b2" exitCode=143 Feb 19 21:46:14 crc kubenswrapper[4795]: I0219 21:46:14.835400 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5f9a41f3-9dae-4426-8687-368f5911a834","Type":"ContainerDied","Data":"a8c2432cd1a201f5714626d8c9b69ee61c26692ea73aa3955e3d0f4d2094d962"} Feb 19 21:46:14 crc kubenswrapper[4795]: I0219 21:46:14.835706 4795 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5f9a41f3-9dae-4426-8687-368f5911a834","Type":"ContainerDied","Data":"cbf59bdd46660f6210995b0eb2fc2126976500e2a6c6cee2fb345b41a51900b2"} Feb 19 21:46:14 crc kubenswrapper[4795]: I0219 21:46:14.838972 4795 generic.go:334] "Generic (PLEG): container finished" podID="2b668591-0c11-42f0-b813-94b76c8cbd1b" containerID="ecaa7d4a6273a3af65b1f5d9585a04b8dd375c28b94a4127bae9a591c4a91d3e" exitCode=0 Feb 19 21:46:14 crc kubenswrapper[4795]: I0219 21:46:14.838991 4795 generic.go:334] "Generic (PLEG): container finished" podID="2b668591-0c11-42f0-b813-94b76c8cbd1b" containerID="f5cfd3f9869f87af702a2e0e738d0b4feb92ba480863b5a46d242043d700ac19" exitCode=143 Feb 19 21:46:14 crc kubenswrapper[4795]: I0219 21:46:14.839013 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2b668591-0c11-42f0-b813-94b76c8cbd1b","Type":"ContainerDied","Data":"ecaa7d4a6273a3af65b1f5d9585a04b8dd375c28b94a4127bae9a591c4a91d3e"} Feb 19 21:46:14 crc kubenswrapper[4795]: I0219 21:46:14.839057 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2b668591-0c11-42f0-b813-94b76c8cbd1b","Type":"ContainerDied","Data":"f5cfd3f9869f87af702a2e0e738d0b4feb92ba480863b5a46d242043d700ac19"} Feb 19 21:46:14 crc kubenswrapper[4795]: I0219 21:46:14.840705 4795 generic.go:334] "Generic (PLEG): container finished" podID="60623dff-0241-4bc9-8b17-de61d7271e19" containerID="9ae6b68ead4d625967dc9f8e67a83bcadf072cd6d6b5d617f89d10bb543678b8" exitCode=0 Feb 19 21:46:14 crc kubenswrapper[4795]: I0219 21:46:14.840732 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xp5gn" event={"ID":"60623dff-0241-4bc9-8b17-de61d7271e19","Type":"ContainerDied","Data":"9ae6b68ead4d625967dc9f8e67a83bcadf072cd6d6b5d617f89d10bb543678b8"} Feb 19 21:46:20 crc kubenswrapper[4795]: I0219 21:46:20.098956 4795 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-67dccc895-rxl4z" Feb 19 21:46:20 crc kubenswrapper[4795]: I0219 21:46:20.173224 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-768666cd57-5c55b"] Feb 19 21:46:20 crc kubenswrapper[4795]: I0219 21:46:20.173555 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-768666cd57-5c55b" podUID="2f384824-f8ad-42d9-b09b-decb5280b448" containerName="dnsmasq-dns" containerID="cri-o://2ad1ba517bb7be9e1a9d1797c2af47441870593656bca08175b2e33f72c8f9fc" gracePeriod=10 Feb 19 21:46:20 crc kubenswrapper[4795]: I0219 21:46:20.908245 4795 generic.go:334] "Generic (PLEG): container finished" podID="2f384824-f8ad-42d9-b09b-decb5280b448" containerID="2ad1ba517bb7be9e1a9d1797c2af47441870593656bca08175b2e33f72c8f9fc" exitCode=0 Feb 19 21:46:20 crc kubenswrapper[4795]: I0219 21:46:20.908301 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-768666cd57-5c55b" event={"ID":"2f384824-f8ad-42d9-b09b-decb5280b448","Type":"ContainerDied","Data":"2ad1ba517bb7be9e1a9d1797c2af47441870593656bca08175b2e33f72c8f9fc"} Feb 19 21:46:21 crc kubenswrapper[4795]: I0219 21:46:21.924476 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xp5gn" event={"ID":"60623dff-0241-4bc9-8b17-de61d7271e19","Type":"ContainerDied","Data":"5919e2bd3e6456e8a3e11cb00ab17617e176207d63da18ffb2be8092647e71a8"} Feb 19 21:46:21 crc kubenswrapper[4795]: I0219 21:46:21.924745 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5919e2bd3e6456e8a3e11cb00ab17617e176207d63da18ffb2be8092647e71a8" Feb 19 21:46:21 crc kubenswrapper[4795]: I0219 21:46:21.966160 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-xp5gn" Feb 19 21:46:22 crc kubenswrapper[4795]: I0219 21:46:22.041833 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60623dff-0241-4bc9-8b17-de61d7271e19-scripts\") pod \"60623dff-0241-4bc9-8b17-de61d7271e19\" (UID: \"60623dff-0241-4bc9-8b17-de61d7271e19\") " Feb 19 21:46:22 crc kubenswrapper[4795]: I0219 21:46:22.041912 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/60623dff-0241-4bc9-8b17-de61d7271e19-fernet-keys\") pod \"60623dff-0241-4bc9-8b17-de61d7271e19\" (UID: \"60623dff-0241-4bc9-8b17-de61d7271e19\") " Feb 19 21:46:22 crc kubenswrapper[4795]: I0219 21:46:22.042039 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60623dff-0241-4bc9-8b17-de61d7271e19-config-data\") pod \"60623dff-0241-4bc9-8b17-de61d7271e19\" (UID: \"60623dff-0241-4bc9-8b17-de61d7271e19\") " Feb 19 21:46:22 crc kubenswrapper[4795]: I0219 21:46:22.042087 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60623dff-0241-4bc9-8b17-de61d7271e19-combined-ca-bundle\") pod \"60623dff-0241-4bc9-8b17-de61d7271e19\" (UID: \"60623dff-0241-4bc9-8b17-de61d7271e19\") " Feb 19 21:46:22 crc kubenswrapper[4795]: I0219 21:46:22.042132 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkx2w\" (UniqueName: \"kubernetes.io/projected/60623dff-0241-4bc9-8b17-de61d7271e19-kube-api-access-kkx2w\") pod \"60623dff-0241-4bc9-8b17-de61d7271e19\" (UID: \"60623dff-0241-4bc9-8b17-de61d7271e19\") " Feb 19 21:46:22 crc kubenswrapper[4795]: I0219 21:46:22.042159 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/60623dff-0241-4bc9-8b17-de61d7271e19-credential-keys\") pod \"60623dff-0241-4bc9-8b17-de61d7271e19\" (UID: \"60623dff-0241-4bc9-8b17-de61d7271e19\") " Feb 19 21:46:22 crc kubenswrapper[4795]: I0219 21:46:22.049547 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60623dff-0241-4bc9-8b17-de61d7271e19-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "60623dff-0241-4bc9-8b17-de61d7271e19" (UID: "60623dff-0241-4bc9-8b17-de61d7271e19"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:22 crc kubenswrapper[4795]: I0219 21:46:22.049687 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60623dff-0241-4bc9-8b17-de61d7271e19-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "60623dff-0241-4bc9-8b17-de61d7271e19" (UID: "60623dff-0241-4bc9-8b17-de61d7271e19"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:22 crc kubenswrapper[4795]: I0219 21:46:22.056604 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60623dff-0241-4bc9-8b17-de61d7271e19-scripts" (OuterVolumeSpecName: "scripts") pod "60623dff-0241-4bc9-8b17-de61d7271e19" (UID: "60623dff-0241-4bc9-8b17-de61d7271e19"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:22 crc kubenswrapper[4795]: I0219 21:46:22.056661 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60623dff-0241-4bc9-8b17-de61d7271e19-kube-api-access-kkx2w" (OuterVolumeSpecName: "kube-api-access-kkx2w") pod "60623dff-0241-4bc9-8b17-de61d7271e19" (UID: "60623dff-0241-4bc9-8b17-de61d7271e19"). InnerVolumeSpecName "kube-api-access-kkx2w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:46:22 crc kubenswrapper[4795]: I0219 21:46:22.082104 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60623dff-0241-4bc9-8b17-de61d7271e19-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "60623dff-0241-4bc9-8b17-de61d7271e19" (UID: "60623dff-0241-4bc9-8b17-de61d7271e19"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:22 crc kubenswrapper[4795]: I0219 21:46:22.087347 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60623dff-0241-4bc9-8b17-de61d7271e19-config-data" (OuterVolumeSpecName: "config-data") pod "60623dff-0241-4bc9-8b17-de61d7271e19" (UID: "60623dff-0241-4bc9-8b17-de61d7271e19"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:22 crc kubenswrapper[4795]: I0219 21:46:22.143838 4795 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/60623dff-0241-4bc9-8b17-de61d7271e19-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:22 crc kubenswrapper[4795]: I0219 21:46:22.143869 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60623dff-0241-4bc9-8b17-de61d7271e19-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:22 crc kubenswrapper[4795]: I0219 21:46:22.143877 4795 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/60623dff-0241-4bc9-8b17-de61d7271e19-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:22 crc kubenswrapper[4795]: I0219 21:46:22.143885 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60623dff-0241-4bc9-8b17-de61d7271e19-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:22 crc 
kubenswrapper[4795]: I0219 21:46:22.143893 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60623dff-0241-4bc9-8b17-de61d7271e19-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:22 crc kubenswrapper[4795]: I0219 21:46:22.143901 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkx2w\" (UniqueName: \"kubernetes.io/projected/60623dff-0241-4bc9-8b17-de61d7271e19-kube-api-access-kkx2w\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:22 crc kubenswrapper[4795]: I0219 21:46:22.931819 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xp5gn" Feb 19 21:46:23 crc kubenswrapper[4795]: I0219 21:46:23.054979 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-xp5gn"] Feb 19 21:46:23 crc kubenswrapper[4795]: I0219 21:46:23.064147 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-xp5gn"] Feb 19 21:46:23 crc kubenswrapper[4795]: I0219 21:46:23.150546 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-nffrq"] Feb 19 21:46:23 crc kubenswrapper[4795]: E0219 21:46:23.151153 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76a3a185-4911-4b2e-ab11-5ea1c61c2b69" containerName="init" Feb 19 21:46:23 crc kubenswrapper[4795]: I0219 21:46:23.151199 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="76a3a185-4911-4b2e-ab11-5ea1c61c2b69" containerName="init" Feb 19 21:46:23 crc kubenswrapper[4795]: E0219 21:46:23.151217 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8df3e76b-9aa2-476b-aa19-62518a8ddd1e" containerName="init" Feb 19 21:46:23 crc kubenswrapper[4795]: I0219 21:46:23.151225 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="8df3e76b-9aa2-476b-aa19-62518a8ddd1e" containerName="init" Feb 19 21:46:23 crc kubenswrapper[4795]: E0219 
21:46:23.151284 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60623dff-0241-4bc9-8b17-de61d7271e19" containerName="keystone-bootstrap" Feb 19 21:46:23 crc kubenswrapper[4795]: I0219 21:46:23.151296 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="60623dff-0241-4bc9-8b17-de61d7271e19" containerName="keystone-bootstrap" Feb 19 21:46:23 crc kubenswrapper[4795]: I0219 21:46:23.151695 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="8df3e76b-9aa2-476b-aa19-62518a8ddd1e" containerName="init" Feb 19 21:46:23 crc kubenswrapper[4795]: I0219 21:46:23.151746 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="60623dff-0241-4bc9-8b17-de61d7271e19" containerName="keystone-bootstrap" Feb 19 21:46:23 crc kubenswrapper[4795]: I0219 21:46:23.151767 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="76a3a185-4911-4b2e-ab11-5ea1c61c2b69" containerName="init" Feb 19 21:46:23 crc kubenswrapper[4795]: I0219 21:46:23.155511 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-nffrq" Feb 19 21:46:23 crc kubenswrapper[4795]: I0219 21:46:23.158848 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 19 21:46:23 crc kubenswrapper[4795]: I0219 21:46:23.159081 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 19 21:46:23 crc kubenswrapper[4795]: I0219 21:46:23.159079 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 19 21:46:23 crc kubenswrapper[4795]: I0219 21:46:23.159512 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-8pz2x" Feb 19 21:46:23 crc kubenswrapper[4795]: I0219 21:46:23.159660 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 19 21:46:23 crc kubenswrapper[4795]: I0219 21:46:23.180941 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-nffrq"] Feb 19 21:46:23 crc kubenswrapper[4795]: I0219 21:46:23.262848 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a2a7b298-40b6-43b3-9099-ec74f2f0bfad-fernet-keys\") pod \"keystone-bootstrap-nffrq\" (UID: \"a2a7b298-40b6-43b3-9099-ec74f2f0bfad\") " pod="openstack/keystone-bootstrap-nffrq" Feb 19 21:46:23 crc kubenswrapper[4795]: I0219 21:46:23.262926 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g59s7\" (UniqueName: \"kubernetes.io/projected/a2a7b298-40b6-43b3-9099-ec74f2f0bfad-kube-api-access-g59s7\") pod \"keystone-bootstrap-nffrq\" (UID: \"a2a7b298-40b6-43b3-9099-ec74f2f0bfad\") " pod="openstack/keystone-bootstrap-nffrq" Feb 19 21:46:23 crc kubenswrapper[4795]: I0219 21:46:23.262981 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2a7b298-40b6-43b3-9099-ec74f2f0bfad-config-data\") pod \"keystone-bootstrap-nffrq\" (UID: \"a2a7b298-40b6-43b3-9099-ec74f2f0bfad\") " pod="openstack/keystone-bootstrap-nffrq" Feb 19 21:46:23 crc kubenswrapper[4795]: I0219 21:46:23.263050 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a2a7b298-40b6-43b3-9099-ec74f2f0bfad-scripts\") pod \"keystone-bootstrap-nffrq\" (UID: \"a2a7b298-40b6-43b3-9099-ec74f2f0bfad\") " pod="openstack/keystone-bootstrap-nffrq" Feb 19 21:46:23 crc kubenswrapper[4795]: I0219 21:46:23.263191 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a2a7b298-40b6-43b3-9099-ec74f2f0bfad-credential-keys\") pod \"keystone-bootstrap-nffrq\" (UID: \"a2a7b298-40b6-43b3-9099-ec74f2f0bfad\") " pod="openstack/keystone-bootstrap-nffrq" Feb 19 21:46:23 crc kubenswrapper[4795]: I0219 21:46:23.263337 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2a7b298-40b6-43b3-9099-ec74f2f0bfad-combined-ca-bundle\") pod \"keystone-bootstrap-nffrq\" (UID: \"a2a7b298-40b6-43b3-9099-ec74f2f0bfad\") " pod="openstack/keystone-bootstrap-nffrq" Feb 19 21:46:23 crc kubenswrapper[4795]: I0219 21:46:23.364341 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a2a7b298-40b6-43b3-9099-ec74f2f0bfad-fernet-keys\") pod \"keystone-bootstrap-nffrq\" (UID: \"a2a7b298-40b6-43b3-9099-ec74f2f0bfad\") " pod="openstack/keystone-bootstrap-nffrq" Feb 19 21:46:23 crc kubenswrapper[4795]: I0219 21:46:23.364389 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g59s7\" (UniqueName: 
\"kubernetes.io/projected/a2a7b298-40b6-43b3-9099-ec74f2f0bfad-kube-api-access-g59s7\") pod \"keystone-bootstrap-nffrq\" (UID: \"a2a7b298-40b6-43b3-9099-ec74f2f0bfad\") " pod="openstack/keystone-bootstrap-nffrq" Feb 19 21:46:23 crc kubenswrapper[4795]: I0219 21:46:23.364412 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2a7b298-40b6-43b3-9099-ec74f2f0bfad-config-data\") pod \"keystone-bootstrap-nffrq\" (UID: \"a2a7b298-40b6-43b3-9099-ec74f2f0bfad\") " pod="openstack/keystone-bootstrap-nffrq" Feb 19 21:46:23 crc kubenswrapper[4795]: I0219 21:46:23.364462 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a2a7b298-40b6-43b3-9099-ec74f2f0bfad-scripts\") pod \"keystone-bootstrap-nffrq\" (UID: \"a2a7b298-40b6-43b3-9099-ec74f2f0bfad\") " pod="openstack/keystone-bootstrap-nffrq" Feb 19 21:46:23 crc kubenswrapper[4795]: I0219 21:46:23.364496 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a2a7b298-40b6-43b3-9099-ec74f2f0bfad-credential-keys\") pod \"keystone-bootstrap-nffrq\" (UID: \"a2a7b298-40b6-43b3-9099-ec74f2f0bfad\") " pod="openstack/keystone-bootstrap-nffrq" Feb 19 21:46:23 crc kubenswrapper[4795]: I0219 21:46:23.364540 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2a7b298-40b6-43b3-9099-ec74f2f0bfad-combined-ca-bundle\") pod \"keystone-bootstrap-nffrq\" (UID: \"a2a7b298-40b6-43b3-9099-ec74f2f0bfad\") " pod="openstack/keystone-bootstrap-nffrq" Feb 19 21:46:23 crc kubenswrapper[4795]: I0219 21:46:23.370746 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2a7b298-40b6-43b3-9099-ec74f2f0bfad-combined-ca-bundle\") pod 
\"keystone-bootstrap-nffrq\" (UID: \"a2a7b298-40b6-43b3-9099-ec74f2f0bfad\") " pod="openstack/keystone-bootstrap-nffrq" Feb 19 21:46:23 crc kubenswrapper[4795]: I0219 21:46:23.370990 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a2a7b298-40b6-43b3-9099-ec74f2f0bfad-fernet-keys\") pod \"keystone-bootstrap-nffrq\" (UID: \"a2a7b298-40b6-43b3-9099-ec74f2f0bfad\") " pod="openstack/keystone-bootstrap-nffrq" Feb 19 21:46:23 crc kubenswrapper[4795]: I0219 21:46:23.371105 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a2a7b298-40b6-43b3-9099-ec74f2f0bfad-credential-keys\") pod \"keystone-bootstrap-nffrq\" (UID: \"a2a7b298-40b6-43b3-9099-ec74f2f0bfad\") " pod="openstack/keystone-bootstrap-nffrq" Feb 19 21:46:23 crc kubenswrapper[4795]: I0219 21:46:23.371455 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2a7b298-40b6-43b3-9099-ec74f2f0bfad-config-data\") pod \"keystone-bootstrap-nffrq\" (UID: \"a2a7b298-40b6-43b3-9099-ec74f2f0bfad\") " pod="openstack/keystone-bootstrap-nffrq" Feb 19 21:46:23 crc kubenswrapper[4795]: I0219 21:46:23.371885 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a2a7b298-40b6-43b3-9099-ec74f2f0bfad-scripts\") pod \"keystone-bootstrap-nffrq\" (UID: \"a2a7b298-40b6-43b3-9099-ec74f2f0bfad\") " pod="openstack/keystone-bootstrap-nffrq" Feb 19 21:46:23 crc kubenswrapper[4795]: I0219 21:46:23.381477 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g59s7\" (UniqueName: \"kubernetes.io/projected/a2a7b298-40b6-43b3-9099-ec74f2f0bfad-kube-api-access-g59s7\") pod \"keystone-bootstrap-nffrq\" (UID: \"a2a7b298-40b6-43b3-9099-ec74f2f0bfad\") " pod="openstack/keystone-bootstrap-nffrq" Feb 19 21:46:23 crc 
kubenswrapper[4795]: I0219 21:46:23.477576 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-nffrq" Feb 19 21:46:23 crc kubenswrapper[4795]: I0219 21:46:23.522185 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60623dff-0241-4bc9-8b17-de61d7271e19" path="/var/lib/kubelet/pods/60623dff-0241-4bc9-8b17-de61d7271e19/volumes" Feb 19 21:46:28 crc kubenswrapper[4795]: I0219 21:46:28.428126 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 21:46:28 crc kubenswrapper[4795]: I0219 21:46:28.428931 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.038755 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-768666cd57-5c55b" podUID="2f384824-f8ad-42d9-b09b-decb5280b448" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.129:5353: i/o timeout" Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.767794 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.782273 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.870922 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b668591-0c11-42f0-b813-94b76c8cbd1b-logs\") pod \"2b668591-0c11-42f0-b813-94b76c8cbd1b\" (UID: \"2b668591-0c11-42f0-b813-94b76c8cbd1b\") " Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.870984 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2b668591-0c11-42f0-b813-94b76c8cbd1b-httpd-run\") pod \"2b668591-0c11-42f0-b813-94b76c8cbd1b\" (UID: \"2b668591-0c11-42f0-b813-94b76c8cbd1b\") " Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.871009 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b668591-0c11-42f0-b813-94b76c8cbd1b-scripts\") pod \"2b668591-0c11-42f0-b813-94b76c8cbd1b\" (UID: \"2b668591-0c11-42f0-b813-94b76c8cbd1b\") " Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.871047 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f9a41f3-9dae-4426-8687-368f5911a834-config-data\") pod \"5f9a41f3-9dae-4426-8687-368f5911a834\" (UID: \"5f9a41f3-9dae-4426-8687-368f5911a834\") " Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.871067 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f9a41f3-9dae-4426-8687-368f5911a834-logs\") pod \"5f9a41f3-9dae-4426-8687-368f5911a834\" (UID: \"5f9a41f3-9dae-4426-8687-368f5911a834\") " Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.871098 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod 
\"2b668591-0c11-42f0-b813-94b76c8cbd1b\" (UID: \"2b668591-0c11-42f0-b813-94b76c8cbd1b\") " Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.871120 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b668591-0c11-42f0-b813-94b76c8cbd1b-combined-ca-bundle\") pod \"2b668591-0c11-42f0-b813-94b76c8cbd1b\" (UID: \"2b668591-0c11-42f0-b813-94b76c8cbd1b\") " Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.871151 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f9a41f3-9dae-4426-8687-368f5911a834-scripts\") pod \"5f9a41f3-9dae-4426-8687-368f5911a834\" (UID: \"5f9a41f3-9dae-4426-8687-368f5911a834\") " Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.871199 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2rcc\" (UniqueName: \"kubernetes.io/projected/2b668591-0c11-42f0-b813-94b76c8cbd1b-kube-api-access-v2rcc\") pod \"2b668591-0c11-42f0-b813-94b76c8cbd1b\" (UID: \"2b668591-0c11-42f0-b813-94b76c8cbd1b\") " Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.871219 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b668591-0c11-42f0-b813-94b76c8cbd1b-config-data\") pod \"2b668591-0c11-42f0-b813-94b76c8cbd1b\" (UID: \"2b668591-0c11-42f0-b813-94b76c8cbd1b\") " Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.871251 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5f9a41f3-9dae-4426-8687-368f5911a834-httpd-run\") pod \"5f9a41f3-9dae-4426-8687-368f5911a834\" (UID: \"5f9a41f3-9dae-4426-8687-368f5911a834\") " Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.871301 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-r9x6w\" (UniqueName: \"kubernetes.io/projected/5f9a41f3-9dae-4426-8687-368f5911a834-kube-api-access-r9x6w\") pod \"5f9a41f3-9dae-4426-8687-368f5911a834\" (UID: \"5f9a41f3-9dae-4426-8687-368f5911a834\") " Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.871330 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"5f9a41f3-9dae-4426-8687-368f5911a834\" (UID: \"5f9a41f3-9dae-4426-8687-368f5911a834\") " Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.871357 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f9a41f3-9dae-4426-8687-368f5911a834-combined-ca-bundle\") pod \"5f9a41f3-9dae-4426-8687-368f5911a834\" (UID: \"5f9a41f3-9dae-4426-8687-368f5911a834\") " Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.872878 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f9a41f3-9dae-4426-8687-368f5911a834-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "5f9a41f3-9dae-4426-8687-368f5911a834" (UID: "5f9a41f3-9dae-4426-8687-368f5911a834"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.873063 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b668591-0c11-42f0-b813-94b76c8cbd1b-logs" (OuterVolumeSpecName: "logs") pod "2b668591-0c11-42f0-b813-94b76c8cbd1b" (UID: "2b668591-0c11-42f0-b813-94b76c8cbd1b"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.873246 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b668591-0c11-42f0-b813-94b76c8cbd1b-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "2b668591-0c11-42f0-b813-94b76c8cbd1b" (UID: "2b668591-0c11-42f0-b813-94b76c8cbd1b"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.873402 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f9a41f3-9dae-4426-8687-368f5911a834-logs" (OuterVolumeSpecName: "logs") pod "5f9a41f3-9dae-4426-8687-368f5911a834" (UID: "5f9a41f3-9dae-4426-8687-368f5911a834"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.878458 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b668591-0c11-42f0-b813-94b76c8cbd1b-kube-api-access-v2rcc" (OuterVolumeSpecName: "kube-api-access-v2rcc") pod "2b668591-0c11-42f0-b813-94b76c8cbd1b" (UID: "2b668591-0c11-42f0-b813-94b76c8cbd1b"). InnerVolumeSpecName "kube-api-access-v2rcc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.878937 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f9a41f3-9dae-4426-8687-368f5911a834-scripts" (OuterVolumeSpecName: "scripts") pod "5f9a41f3-9dae-4426-8687-368f5911a834" (UID: "5f9a41f3-9dae-4426-8687-368f5911a834"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.879042 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "5f9a41f3-9dae-4426-8687-368f5911a834" (UID: "5f9a41f3-9dae-4426-8687-368f5911a834"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.879534 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f9a41f3-9dae-4426-8687-368f5911a834-kube-api-access-r9x6w" (OuterVolumeSpecName: "kube-api-access-r9x6w") pod "5f9a41f3-9dae-4426-8687-368f5911a834" (UID: "5f9a41f3-9dae-4426-8687-368f5911a834"). InnerVolumeSpecName "kube-api-access-r9x6w". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.881486 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "2b668591-0c11-42f0-b813-94b76c8cbd1b" (UID: "2b668591-0c11-42f0-b813-94b76c8cbd1b"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.892484 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b668591-0c11-42f0-b813-94b76c8cbd1b-scripts" (OuterVolumeSpecName: "scripts") pod "2b668591-0c11-42f0-b813-94b76c8cbd1b" (UID: "2b668591-0c11-42f0-b813-94b76c8cbd1b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.898805 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b668591-0c11-42f0-b813-94b76c8cbd1b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2b668591-0c11-42f0-b813-94b76c8cbd1b" (UID: "2b668591-0c11-42f0-b813-94b76c8cbd1b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.900639 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f9a41f3-9dae-4426-8687-368f5911a834-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5f9a41f3-9dae-4426-8687-368f5911a834" (UID: "5f9a41f3-9dae-4426-8687-368f5911a834"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.923266 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f9a41f3-9dae-4426-8687-368f5911a834-config-data" (OuterVolumeSpecName: "config-data") pod "5f9a41f3-9dae-4426-8687-368f5911a834" (UID: "5f9a41f3-9dae-4426-8687-368f5911a834"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.925748 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b668591-0c11-42f0-b813-94b76c8cbd1b-config-data" (OuterVolumeSpecName: "config-data") pod "2b668591-0c11-42f0-b813-94b76c8cbd1b" (UID: "2b668591-0c11-42f0-b813-94b76c8cbd1b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.973831 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f9a41f3-9dae-4426-8687-368f5911a834-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.973867 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f9a41f3-9dae-4426-8687-368f5911a834-logs\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.973922 4795 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.973933 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b668591-0c11-42f0-b813-94b76c8cbd1b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.973944 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f9a41f3-9dae-4426-8687-368f5911a834-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.973952 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2rcc\" (UniqueName: \"kubernetes.io/projected/2b668591-0c11-42f0-b813-94b76c8cbd1b-kube-api-access-v2rcc\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.973987 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b668591-0c11-42f0-b813-94b76c8cbd1b-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.973994 4795 reconciler_common.go:293] "Volume detached for volume 
\"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5f9a41f3-9dae-4426-8687-368f5911a834-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.974003 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r9x6w\" (UniqueName: \"kubernetes.io/projected/5f9a41f3-9dae-4426-8687-368f5911a834-kube-api-access-r9x6w\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.974018 4795 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.974026 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f9a41f3-9dae-4426-8687-368f5911a834-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.974034 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b668591-0c11-42f0-b813-94b76c8cbd1b-logs\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.974043 4795 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2b668591-0c11-42f0-b813-94b76c8cbd1b-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.974054 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b668591-0c11-42f0-b813-94b76c8cbd1b-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.986067 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"5f9a41f3-9dae-4426-8687-368f5911a834","Type":"ContainerDied","Data":"74e861972e0b08341b7578c99f45877ca28dee930187dfc7c5d68702065ba963"} Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.986377 4795 scope.go:117] "RemoveContainer" containerID="a8c2432cd1a201f5714626d8c9b69ee61c26692ea73aa3955e3d0f4d2094d962" Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.986079 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.992461 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2b668591-0c11-42f0-b813-94b76c8cbd1b","Type":"ContainerDied","Data":"2d8f5079a7199ad590b8e7ae61509c061499525039c4df8317fdb2a9d4218e70"} Feb 19 21:46:29 crc kubenswrapper[4795]: I0219 21:46:29.992581 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.000326 4795 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.002796 4795 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.020473 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.027050 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.075450 4795 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.075488 4795 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.089288 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 21:46:30 crc kubenswrapper[4795]: E0219 21:46:30.089693 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b668591-0c11-42f0-b813-94b76c8cbd1b" containerName="glance-log" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.089710 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b668591-0c11-42f0-b813-94b76c8cbd1b" containerName="glance-log" Feb 19 21:46:30 crc kubenswrapper[4795]: E0219 21:46:30.089722 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b668591-0c11-42f0-b813-94b76c8cbd1b" containerName="glance-httpd" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.089728 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b668591-0c11-42f0-b813-94b76c8cbd1b" containerName="glance-httpd" Feb 19 21:46:30 crc kubenswrapper[4795]: E0219 21:46:30.089742 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f9a41f3-9dae-4426-8687-368f5911a834" containerName="glance-httpd" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.089748 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f9a41f3-9dae-4426-8687-368f5911a834" containerName="glance-httpd" Feb 19 21:46:30 crc kubenswrapper[4795]: E0219 21:46:30.089757 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f9a41f3-9dae-4426-8687-368f5911a834" containerName="glance-log" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.089763 4795 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="5f9a41f3-9dae-4426-8687-368f5911a834" containerName="glance-log" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.089936 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b668591-0c11-42f0-b813-94b76c8cbd1b" containerName="glance-httpd" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.089949 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f9a41f3-9dae-4426-8687-368f5911a834" containerName="glance-httpd" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.089967 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f9a41f3-9dae-4426-8687-368f5911a834" containerName="glance-log" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.089976 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b668591-0c11-42f0-b813-94b76c8cbd1b" containerName="glance-log" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.090827 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.093851 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.097642 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.097694 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.099309 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.105904 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-6rhfn" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.112937 4795 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.125319 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.141750 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.143336 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.149774 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.149951 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.157367 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.180358 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"b449064d-5c14-4362-ba7b-a24ee9292789\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.180423 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b449064d-5c14-4362-ba7b-a24ee9292789-logs\") pod \"glance-default-internal-api-0\" (UID: \"b449064d-5c14-4362-ba7b-a24ee9292789\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.180476 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b449064d-5c14-4362-ba7b-a24ee9292789-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b449064d-5c14-4362-ba7b-a24ee9292789\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.180555 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b449064d-5c14-4362-ba7b-a24ee9292789-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b449064d-5c14-4362-ba7b-a24ee9292789\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.180626 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b449064d-5c14-4362-ba7b-a24ee9292789-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b449064d-5c14-4362-ba7b-a24ee9292789\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.180752 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b449064d-5c14-4362-ba7b-a24ee9292789-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b449064d-5c14-4362-ba7b-a24ee9292789\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.180804 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b449064d-5c14-4362-ba7b-a24ee9292789-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b449064d-5c14-4362-ba7b-a24ee9292789\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.180852 4795 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zp5z\" (UniqueName: \"kubernetes.io/projected/b449064d-5c14-4362-ba7b-a24ee9292789-kube-api-access-7zp5z\") pod \"glance-default-internal-api-0\" (UID: \"b449064d-5c14-4362-ba7b-a24ee9292789\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.284572 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b449064d-5c14-4362-ba7b-a24ee9292789-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b449064d-5c14-4362-ba7b-a24ee9292789\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.284632 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b449064d-5c14-4362-ba7b-a24ee9292789-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b449064d-5c14-4362-ba7b-a24ee9292789\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.284650 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zp5z\" (UniqueName: \"kubernetes.io/projected/b449064d-5c14-4362-ba7b-a24ee9292789-kube-api-access-7zp5z\") pod \"glance-default-internal-api-0\" (UID: \"b449064d-5c14-4362-ba7b-a24ee9292789\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.284698 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ac1caac2-edf5-453d-a76d-e1c65b7f038b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ac1caac2-edf5-453d-a76d-e1c65b7f038b\") " pod="openstack/glance-default-external-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.284721 4795 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"ac1caac2-edf5-453d-a76d-e1c65b7f038b\") " pod="openstack/glance-default-external-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.284747 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"b449064d-5c14-4362-ba7b-a24ee9292789\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.284765 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b449064d-5c14-4362-ba7b-a24ee9292789-logs\") pod \"glance-default-internal-api-0\" (UID: \"b449064d-5c14-4362-ba7b-a24ee9292789\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.284788 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b449064d-5c14-4362-ba7b-a24ee9292789-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b449064d-5c14-4362-ba7b-a24ee9292789\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.284816 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac1caac2-edf5-453d-a76d-e1c65b7f038b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ac1caac2-edf5-453d-a76d-e1c65b7f038b\") " pod="openstack/glance-default-external-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.284831 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac1caac2-edf5-453d-a76d-e1c65b7f038b-scripts\") pod \"glance-default-external-api-0\" (UID: \"ac1caac2-edf5-453d-a76d-e1c65b7f038b\") " pod="openstack/glance-default-external-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.284848 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac1caac2-edf5-453d-a76d-e1c65b7f038b-logs\") pod \"glance-default-external-api-0\" (UID: \"ac1caac2-edf5-453d-a76d-e1c65b7f038b\") " pod="openstack/glance-default-external-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.284864 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b449064d-5c14-4362-ba7b-a24ee9292789-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b449064d-5c14-4362-ba7b-a24ee9292789\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.284882 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac1caac2-edf5-453d-a76d-e1c65b7f038b-config-data\") pod \"glance-default-external-api-0\" (UID: \"ac1caac2-edf5-453d-a76d-e1c65b7f038b\") " pod="openstack/glance-default-external-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.284903 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hz2gw\" (UniqueName: \"kubernetes.io/projected/ac1caac2-edf5-453d-a76d-e1c65b7f038b-kube-api-access-hz2gw\") pod \"glance-default-external-api-0\" (UID: \"ac1caac2-edf5-453d-a76d-e1c65b7f038b\") " pod="openstack/glance-default-external-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.284925 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b449064d-5c14-4362-ba7b-a24ee9292789-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b449064d-5c14-4362-ba7b-a24ee9292789\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.284946 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac1caac2-edf5-453d-a76d-e1c65b7f038b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ac1caac2-edf5-453d-a76d-e1c65b7f038b\") " pod="openstack/glance-default-external-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.286372 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"b449064d-5c14-4362-ba7b-a24ee9292789\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.286435 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b449064d-5c14-4362-ba7b-a24ee9292789-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b449064d-5c14-4362-ba7b-a24ee9292789\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.286709 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b449064d-5c14-4362-ba7b-a24ee9292789-logs\") pod \"glance-default-internal-api-0\" (UID: \"b449064d-5c14-4362-ba7b-a24ee9292789\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.290805 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b449064d-5c14-4362-ba7b-a24ee9292789-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b449064d-5c14-4362-ba7b-a24ee9292789\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.292752 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b449064d-5c14-4362-ba7b-a24ee9292789-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b449064d-5c14-4362-ba7b-a24ee9292789\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.293648 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b449064d-5c14-4362-ba7b-a24ee9292789-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b449064d-5c14-4362-ba7b-a24ee9292789\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.300887 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b449064d-5c14-4362-ba7b-a24ee9292789-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b449064d-5c14-4362-ba7b-a24ee9292789\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.317437 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zp5z\" (UniqueName: \"kubernetes.io/projected/b449064d-5c14-4362-ba7b-a24ee9292789-kube-api-access-7zp5z\") pod \"glance-default-internal-api-0\" (UID: \"b449064d-5c14-4362-ba7b-a24ee9292789\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.338055 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"b449064d-5c14-4362-ba7b-a24ee9292789\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.386811 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac1caac2-edf5-453d-a76d-e1c65b7f038b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ac1caac2-edf5-453d-a76d-e1c65b7f038b\") " pod="openstack/glance-default-external-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.386860 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac1caac2-edf5-453d-a76d-e1c65b7f038b-scripts\") pod \"glance-default-external-api-0\" (UID: \"ac1caac2-edf5-453d-a76d-e1c65b7f038b\") " pod="openstack/glance-default-external-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.386884 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac1caac2-edf5-453d-a76d-e1c65b7f038b-logs\") pod \"glance-default-external-api-0\" (UID: \"ac1caac2-edf5-453d-a76d-e1c65b7f038b\") " pod="openstack/glance-default-external-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.386913 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac1caac2-edf5-453d-a76d-e1c65b7f038b-config-data\") pod \"glance-default-external-api-0\" (UID: \"ac1caac2-edf5-453d-a76d-e1c65b7f038b\") " pod="openstack/glance-default-external-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.386939 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hz2gw\" (UniqueName: \"kubernetes.io/projected/ac1caac2-edf5-453d-a76d-e1c65b7f038b-kube-api-access-hz2gw\") pod 
\"glance-default-external-api-0\" (UID: \"ac1caac2-edf5-453d-a76d-e1c65b7f038b\") " pod="openstack/glance-default-external-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.386983 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac1caac2-edf5-453d-a76d-e1c65b7f038b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ac1caac2-edf5-453d-a76d-e1c65b7f038b\") " pod="openstack/glance-default-external-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.387089 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ac1caac2-edf5-453d-a76d-e1c65b7f038b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ac1caac2-edf5-453d-a76d-e1c65b7f038b\") " pod="openstack/glance-default-external-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.387115 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"ac1caac2-edf5-453d-a76d-e1c65b7f038b\") " pod="openstack/glance-default-external-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.387492 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"ac1caac2-edf5-453d-a76d-e1c65b7f038b\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.388190 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ac1caac2-edf5-453d-a76d-e1c65b7f038b-httpd-run\") pod \"glance-default-external-api-0\" (UID: 
\"ac1caac2-edf5-453d-a76d-e1c65b7f038b\") " pod="openstack/glance-default-external-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.388421 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac1caac2-edf5-453d-a76d-e1c65b7f038b-logs\") pod \"glance-default-external-api-0\" (UID: \"ac1caac2-edf5-453d-a76d-e1c65b7f038b\") " pod="openstack/glance-default-external-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.392293 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac1caac2-edf5-453d-a76d-e1c65b7f038b-config-data\") pod \"glance-default-external-api-0\" (UID: \"ac1caac2-edf5-453d-a76d-e1c65b7f038b\") " pod="openstack/glance-default-external-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.393064 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac1caac2-edf5-453d-a76d-e1c65b7f038b-scripts\") pod \"glance-default-external-api-0\" (UID: \"ac1caac2-edf5-453d-a76d-e1c65b7f038b\") " pod="openstack/glance-default-external-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.393200 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac1caac2-edf5-453d-a76d-e1c65b7f038b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ac1caac2-edf5-453d-a76d-e1c65b7f038b\") " pod="openstack/glance-default-external-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.393919 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac1caac2-edf5-453d-a76d-e1c65b7f038b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ac1caac2-edf5-453d-a76d-e1c65b7f038b\") " pod="openstack/glance-default-external-api-0" Feb 19 21:46:30 crc 
kubenswrapper[4795]: I0219 21:46:30.404932 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hz2gw\" (UniqueName: \"kubernetes.io/projected/ac1caac2-edf5-453d-a76d-e1c65b7f038b-kube-api-access-hz2gw\") pod \"glance-default-external-api-0\" (UID: \"ac1caac2-edf5-453d-a76d-e1c65b7f038b\") " pod="openstack/glance-default-external-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.413772 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"ac1caac2-edf5-453d-a76d-e1c65b7f038b\") " pod="openstack/glance-default-external-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.434469 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 21:46:30 crc kubenswrapper[4795]: I0219 21:46:30.472635 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 21:46:31 crc kubenswrapper[4795]: E0219 21:46:31.194364 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:3fa6e687aa002b92fedbfe2c1ccaa2906b399c58d17bf9ecece2c4cd69a0210b" Feb 19 21:46:31 crc kubenswrapper[4795]: E0219 21:46:31.194544 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:3fa6e687aa002b92fedbfe2c1ccaa2906b399c58d17bf9ecece2c4cd69a0210b,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},Volum
eMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qbqsc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-zjbsw_openstack(5084e7b9-4923-449e-b0d7-28c602faeff0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 21:46:31 crc kubenswrapper[4795]: E0219 21:46:31.195878 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-zjbsw" podUID="5084e7b9-4923-449e-b0d7-28c602faeff0" Feb 19 21:46:31 crc kubenswrapper[4795]: I0219 21:46:31.204924 4795 scope.go:117] "RemoveContainer" containerID="cbf59bdd46660f6210995b0eb2fc2126976500e2a6c6cee2fb345b41a51900b2" Feb 19 21:46:31 crc kubenswrapper[4795]: I0219 21:46:31.352030 4795 scope.go:117] "RemoveContainer" containerID="ecaa7d4a6273a3af65b1f5d9585a04b8dd375c28b94a4127bae9a591c4a91d3e" Feb 19 21:46:31 crc kubenswrapper[4795]: I0219 21:46:31.436505 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-768666cd57-5c55b" Feb 19 21:46:31 crc kubenswrapper[4795]: I0219 21:46:31.455929 4795 scope.go:117] "RemoveContainer" containerID="f5cfd3f9869f87af702a2e0e738d0b4feb92ba480863b5a46d242043d700ac19" Feb 19 21:46:31 crc kubenswrapper[4795]: I0219 21:46:31.525388 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b668591-0c11-42f0-b813-94b76c8cbd1b" path="/var/lib/kubelet/pods/2b668591-0c11-42f0-b813-94b76c8cbd1b/volumes" Feb 19 21:46:31 crc kubenswrapper[4795]: I0219 21:46:31.526734 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f9a41f3-9dae-4426-8687-368f5911a834" path="/var/lib/kubelet/pods/5f9a41f3-9dae-4426-8687-368f5911a834/volumes" Feb 19 21:46:31 crc kubenswrapper[4795]: I0219 21:46:31.547870 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2f384824-f8ad-42d9-b09b-decb5280b448-ovsdbserver-sb\") pod \"2f384824-f8ad-42d9-b09b-decb5280b448\" (UID: \"2f384824-f8ad-42d9-b09b-decb5280b448\") " Feb 19 21:46:31 crc kubenswrapper[4795]: I0219 21:46:31.548335 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2f384824-f8ad-42d9-b09b-decb5280b448-dns-swift-storage-0\") pod \"2f384824-f8ad-42d9-b09b-decb5280b448\" (UID: \"2f384824-f8ad-42d9-b09b-decb5280b448\") " Feb 19 21:46:31 crc kubenswrapper[4795]: I0219 21:46:31.548372 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w95wm\" (UniqueName: \"kubernetes.io/projected/2f384824-f8ad-42d9-b09b-decb5280b448-kube-api-access-w95wm\") pod \"2f384824-f8ad-42d9-b09b-decb5280b448\" (UID: \"2f384824-f8ad-42d9-b09b-decb5280b448\") " Feb 19 21:46:31 crc kubenswrapper[4795]: I0219 21:46:31.548397 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/2f384824-f8ad-42d9-b09b-decb5280b448-ovsdbserver-nb\") pod \"2f384824-f8ad-42d9-b09b-decb5280b448\" (UID: \"2f384824-f8ad-42d9-b09b-decb5280b448\") " Feb 19 21:46:31 crc kubenswrapper[4795]: I0219 21:46:31.548726 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f384824-f8ad-42d9-b09b-decb5280b448-config\") pod \"2f384824-f8ad-42d9-b09b-decb5280b448\" (UID: \"2f384824-f8ad-42d9-b09b-decb5280b448\") " Feb 19 21:46:31 crc kubenswrapper[4795]: I0219 21:46:31.548759 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f384824-f8ad-42d9-b09b-decb5280b448-dns-svc\") pod \"2f384824-f8ad-42d9-b09b-decb5280b448\" (UID: \"2f384824-f8ad-42d9-b09b-decb5280b448\") " Feb 19 21:46:31 crc kubenswrapper[4795]: I0219 21:46:31.554277 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f384824-f8ad-42d9-b09b-decb5280b448-kube-api-access-w95wm" (OuterVolumeSpecName: "kube-api-access-w95wm") pod "2f384824-f8ad-42d9-b09b-decb5280b448" (UID: "2f384824-f8ad-42d9-b09b-decb5280b448"). InnerVolumeSpecName "kube-api-access-w95wm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:46:31 crc kubenswrapper[4795]: I0219 21:46:31.594341 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f384824-f8ad-42d9-b09b-decb5280b448-config" (OuterVolumeSpecName: "config") pod "2f384824-f8ad-42d9-b09b-decb5280b448" (UID: "2f384824-f8ad-42d9-b09b-decb5280b448"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:46:31 crc kubenswrapper[4795]: I0219 21:46:31.594921 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f384824-f8ad-42d9-b09b-decb5280b448-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2f384824-f8ad-42d9-b09b-decb5280b448" (UID: "2f384824-f8ad-42d9-b09b-decb5280b448"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:46:31 crc kubenswrapper[4795]: I0219 21:46:31.595689 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f384824-f8ad-42d9-b09b-decb5280b448-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2f384824-f8ad-42d9-b09b-decb5280b448" (UID: "2f384824-f8ad-42d9-b09b-decb5280b448"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:46:31 crc kubenswrapper[4795]: I0219 21:46:31.599543 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f384824-f8ad-42d9-b09b-decb5280b448-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2f384824-f8ad-42d9-b09b-decb5280b448" (UID: "2f384824-f8ad-42d9-b09b-decb5280b448"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:46:31 crc kubenswrapper[4795]: I0219 21:46:31.605775 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f384824-f8ad-42d9-b09b-decb5280b448-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2f384824-f8ad-42d9-b09b-decb5280b448" (UID: "2f384824-f8ad-42d9-b09b-decb5280b448"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:46:31 crc kubenswrapper[4795]: I0219 21:46:31.650586 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f384824-f8ad-42d9-b09b-decb5280b448-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:31 crc kubenswrapper[4795]: I0219 21:46:31.650621 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2f384824-f8ad-42d9-b09b-decb5280b448-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:31 crc kubenswrapper[4795]: I0219 21:46:31.650630 4795 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2f384824-f8ad-42d9-b09b-decb5280b448-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:31 crc kubenswrapper[4795]: I0219 21:46:31.650640 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w95wm\" (UniqueName: \"kubernetes.io/projected/2f384824-f8ad-42d9-b09b-decb5280b448-kube-api-access-w95wm\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:31 crc kubenswrapper[4795]: I0219 21:46:31.650648 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2f384824-f8ad-42d9-b09b-decb5280b448-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:31 crc kubenswrapper[4795]: I0219 21:46:31.650657 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f384824-f8ad-42d9-b09b-decb5280b448-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:31 crc kubenswrapper[4795]: I0219 21:46:31.695630 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-nffrq"] Feb 19 21:46:31 crc kubenswrapper[4795]: W0219 21:46:31.696694 4795 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda2a7b298_40b6_43b3_9099_ec74f2f0bfad.slice/crio-476e1da2e4d000d91846aabf81bf70ab643e3d012d9f8cc8cadac543daeaf6ad WatchSource:0}: Error finding container 476e1da2e4d000d91846aabf81bf70ab643e3d012d9f8cc8cadac543daeaf6ad: Status 404 returned error can't find the container with id 476e1da2e4d000d91846aabf81bf70ab643e3d012d9f8cc8cadac543daeaf6ad Feb 19 21:46:31 crc kubenswrapper[4795]: I0219 21:46:31.866159 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 21:46:32 crc kubenswrapper[4795]: I0219 21:46:32.009903 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-ttz5x" event={"ID":"e61f40e0-d6c3-49f7-a93f-d9956f086d4b","Type":"ContainerStarted","Data":"76c63f1533e7c3fa5624057c7772474f1729c45e568e970f6a98d3e337ab74ae"} Feb 19 21:46:32 crc kubenswrapper[4795]: I0219 21:46:32.011671 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f6698443-b029-4098-81d6-dba6d5f239f2","Type":"ContainerStarted","Data":"b61288d5e6e3a8ef1a9301b874e513c48be039a918932bba9b9edb5495a72e81"} Feb 19 21:46:32 crc kubenswrapper[4795]: I0219 21:46:32.013650 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-jkspq" event={"ID":"454af6b2-4c9e-4706-a537-b3e3d468353d","Type":"ContainerStarted","Data":"a55b1033c9d990cf35d1c92321ef2ffff86d5c602754349685623befa9e70257"} Feb 19 21:46:32 crc kubenswrapper[4795]: I0219 21:46:32.015356 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-nffrq" event={"ID":"a2a7b298-40b6-43b3-9099-ec74f2f0bfad","Type":"ContainerStarted","Data":"da621ffa210adf5451b6eb9d78a9cbed2608c7b5e2b1a9ce00eba1c1e65c6ec4"} Feb 19 21:46:32 crc kubenswrapper[4795]: I0219 21:46:32.015380 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-nffrq" 
event={"ID":"a2a7b298-40b6-43b3-9099-ec74f2f0bfad","Type":"ContainerStarted","Data":"476e1da2e4d000d91846aabf81bf70ab643e3d012d9f8cc8cadac543daeaf6ad"} Feb 19 21:46:32 crc kubenswrapper[4795]: I0219 21:46:32.018421 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-768666cd57-5c55b" event={"ID":"2f384824-f8ad-42d9-b09b-decb5280b448","Type":"ContainerDied","Data":"13682a647f5a86053553d890ba06f71e1b6b17acba7850bc7912292cc1d6509d"} Feb 19 21:46:32 crc kubenswrapper[4795]: I0219 21:46:32.018450 4795 scope.go:117] "RemoveContainer" containerID="2ad1ba517bb7be9e1a9d1797c2af47441870593656bca08175b2e33f72c8f9fc" Feb 19 21:46:32 crc kubenswrapper[4795]: I0219 21:46:32.018538 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-768666cd57-5c55b" Feb 19 21:46:32 crc kubenswrapper[4795]: I0219 21:46:32.024896 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-ttz5x" podStartSLOduration=2.465339131 podStartE2EDuration="23.024878702s" podCreationTimestamp="2026-02-19 21:46:09 +0000 UTC" firstStartedPulling="2026-02-19 21:46:10.648954286 +0000 UTC m=+1081.841472150" lastFinishedPulling="2026-02-19 21:46:31.208493857 +0000 UTC m=+1102.401011721" observedRunningTime="2026-02-19 21:46:32.024693486 +0000 UTC m=+1103.217211350" watchObservedRunningTime="2026-02-19 21:46:32.024878702 +0000 UTC m=+1103.217396566" Feb 19 21:46:32 crc kubenswrapper[4795]: I0219 21:46:32.038769 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b449064d-5c14-4362-ba7b-a24ee9292789","Type":"ContainerStarted","Data":"fe3c35dfb7e24d88f06d64c3416cafe5c8ebe7a75022634f77450664da8f2158"} Feb 19 21:46:32 crc kubenswrapper[4795]: E0219 21:46:32.045136 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:3fa6e687aa002b92fedbfe2c1ccaa2906b399c58d17bf9ecece2c4cd69a0210b\\\"\"" pod="openstack/cinder-db-sync-zjbsw" podUID="5084e7b9-4923-449e-b0d7-28c602faeff0" Feb 19 21:46:32 crc kubenswrapper[4795]: I0219 21:46:32.053032 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-jkspq" podStartSLOduration=2.538526204 podStartE2EDuration="23.053013725s" podCreationTimestamp="2026-02-19 21:46:09 +0000 UTC" firstStartedPulling="2026-02-19 21:46:10.651319762 +0000 UTC m=+1081.843837636" lastFinishedPulling="2026-02-19 21:46:31.165807293 +0000 UTC m=+1102.358325157" observedRunningTime="2026-02-19 21:46:32.042019035 +0000 UTC m=+1103.234536899" watchObservedRunningTime="2026-02-19 21:46:32.053013725 +0000 UTC m=+1103.245531589" Feb 19 21:46:32 crc kubenswrapper[4795]: I0219 21:46:32.067280 4795 scope.go:117] "RemoveContainer" containerID="269c23977ee5c8c89a33c0c6142172272e66947879bd10de823a2ea6ed071bb6" Feb 19 21:46:32 crc kubenswrapper[4795]: I0219 21:46:32.067326 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-nffrq" podStartSLOduration=9.067304598 podStartE2EDuration="9.067304598s" podCreationTimestamp="2026-02-19 21:46:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:46:32.058948142 +0000 UTC m=+1103.251466016" watchObservedRunningTime="2026-02-19 21:46:32.067304598 +0000 UTC m=+1103.259822482" Feb 19 21:46:32 crc kubenswrapper[4795]: I0219 21:46:32.098986 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-768666cd57-5c55b"] Feb 19 21:46:32 crc kubenswrapper[4795]: I0219 21:46:32.105537 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-768666cd57-5c55b"] Feb 19 21:46:32 crc kubenswrapper[4795]: I0219 21:46:32.435446 4795 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 21:46:32 crc kubenswrapper[4795]: W0219 21:46:32.622977 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac1caac2_edf5_453d_a76d_e1c65b7f038b.slice/crio-3b74237da4db43dc7c546ec4709a7afdab3dbcd047ed3e21f44f8f9ee5a66753 WatchSource:0}: Error finding container 3b74237da4db43dc7c546ec4709a7afdab3dbcd047ed3e21f44f8f9ee5a66753: Status 404 returned error can't find the container with id 3b74237da4db43dc7c546ec4709a7afdab3dbcd047ed3e21f44f8f9ee5a66753 Feb 19 21:46:33 crc kubenswrapper[4795]: I0219 21:46:33.052618 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f6698443-b029-4098-81d6-dba6d5f239f2","Type":"ContainerStarted","Data":"e126e33b52d2dc6b19be356eca9905c5a1a510ca7016f4ef6d57dc72b82f12fd"} Feb 19 21:46:33 crc kubenswrapper[4795]: I0219 21:46:33.053998 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ac1caac2-edf5-453d-a76d-e1c65b7f038b","Type":"ContainerStarted","Data":"3b74237da4db43dc7c546ec4709a7afdab3dbcd047ed3e21f44f8f9ee5a66753"} Feb 19 21:46:33 crc kubenswrapper[4795]: I0219 21:46:33.055904 4795 generic.go:334] "Generic (PLEG): container finished" podID="1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec" containerID="b0effc166b07beb1020b33c2901a97075e3341db6cb3398e72144f393ccb6850" exitCode=0 Feb 19 21:46:33 crc kubenswrapper[4795]: I0219 21:46:33.055939 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-b4bcd" event={"ID":"1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec","Type":"ContainerDied","Data":"b0effc166b07beb1020b33c2901a97075e3341db6cb3398e72144f393ccb6850"} Feb 19 21:46:33 crc kubenswrapper[4795]: I0219 21:46:33.060367 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"b449064d-5c14-4362-ba7b-a24ee9292789","Type":"ContainerStarted","Data":"1cce505138da629126ee8b9c2d975891bbe03cb0751f01b86b7395429a4b3b1a"} Feb 19 21:46:33 crc kubenswrapper[4795]: I0219 21:46:33.527131 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f384824-f8ad-42d9-b09b-decb5280b448" path="/var/lib/kubelet/pods/2f384824-f8ad-42d9-b09b-decb5280b448/volumes" Feb 19 21:46:34 crc kubenswrapper[4795]: I0219 21:46:34.039301 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-768666cd57-5c55b" podUID="2f384824-f8ad-42d9-b09b-decb5280b448" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.129:5353: i/o timeout" Feb 19 21:46:34 crc kubenswrapper[4795]: I0219 21:46:34.071517 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b449064d-5c14-4362-ba7b-a24ee9292789","Type":"ContainerStarted","Data":"24b4c0022da799cfde408a58dfc57a52b4277c9601d8bb59da3e5c565074603d"} Feb 19 21:46:34 crc kubenswrapper[4795]: I0219 21:46:34.075535 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ac1caac2-edf5-453d-a76d-e1c65b7f038b","Type":"ContainerStarted","Data":"32a8f97c4bdeeea5fdb4b24c48b5ee8e3dda515976e342ae4939a42ab5261eec"} Feb 19 21:46:34 crc kubenswrapper[4795]: I0219 21:46:34.075588 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ac1caac2-edf5-453d-a76d-e1c65b7f038b","Type":"ContainerStarted","Data":"7826593332e6fe0d625ec77b566a78383702574fe609c8ca89088f745857f981"} Feb 19 21:46:34 crc kubenswrapper[4795]: I0219 21:46:34.077574 4795 generic.go:334] "Generic (PLEG): container finished" podID="454af6b2-4c9e-4706-a537-b3e3d468353d" containerID="a55b1033c9d990cf35d1c92321ef2ffff86d5c602754349685623befa9e70257" exitCode=0 Feb 19 21:46:34 crc kubenswrapper[4795]: I0219 21:46:34.077622 4795 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-jkspq" event={"ID":"454af6b2-4c9e-4706-a537-b3e3d468353d","Type":"ContainerDied","Data":"a55b1033c9d990cf35d1c92321ef2ffff86d5c602754349685623befa9e70257"} Feb 19 21:46:34 crc kubenswrapper[4795]: I0219 21:46:34.106923 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.10690428 podStartE2EDuration="4.10690428s" podCreationTimestamp="2026-02-19 21:46:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:46:34.094620163 +0000 UTC m=+1105.287138057" watchObservedRunningTime="2026-02-19 21:46:34.10690428 +0000 UTC m=+1105.299422144" Feb 19 21:46:34 crc kubenswrapper[4795]: I0219 21:46:34.128772 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.128747196 podStartE2EDuration="4.128747196s" podCreationTimestamp="2026-02-19 21:46:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:46:34.122360496 +0000 UTC m=+1105.314878380" watchObservedRunningTime="2026-02-19 21:46:34.128747196 +0000 UTC m=+1105.321265060" Feb 19 21:46:34 crc kubenswrapper[4795]: I0219 21:46:34.495098 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-b4bcd" Feb 19 21:46:34 crc kubenswrapper[4795]: I0219 21:46:34.608132 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec-config\") pod \"1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec\" (UID: \"1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec\") " Feb 19 21:46:34 crc kubenswrapper[4795]: I0219 21:46:34.608206 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6xqk\" (UniqueName: \"kubernetes.io/projected/1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec-kube-api-access-c6xqk\") pod \"1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec\" (UID: \"1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec\") " Feb 19 21:46:34 crc kubenswrapper[4795]: I0219 21:46:34.608237 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec-combined-ca-bundle\") pod \"1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec\" (UID: \"1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec\") " Feb 19 21:46:34 crc kubenswrapper[4795]: I0219 21:46:34.620453 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec-kube-api-access-c6xqk" (OuterVolumeSpecName: "kube-api-access-c6xqk") pod "1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec" (UID: "1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec"). InnerVolumeSpecName "kube-api-access-c6xqk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:46:34 crc kubenswrapper[4795]: I0219 21:46:34.638095 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec-config" (OuterVolumeSpecName: "config") pod "1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec" (UID: "1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:34 crc kubenswrapper[4795]: I0219 21:46:34.644410 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec" (UID: "1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:34 crc kubenswrapper[4795]: I0219 21:46:34.710895 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6xqk\" (UniqueName: \"kubernetes.io/projected/1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec-kube-api-access-c6xqk\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:34 crc kubenswrapper[4795]: I0219 21:46:34.710951 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:34 crc kubenswrapper[4795]: I0219 21:46:34.710974 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.089463 4795 generic.go:334] "Generic (PLEG): container finished" podID="e61f40e0-d6c3-49f7-a93f-d9956f086d4b" containerID="76c63f1533e7c3fa5624057c7772474f1729c45e568e970f6a98d3e337ab74ae" exitCode=0 Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.089517 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-ttz5x" event={"ID":"e61f40e0-d6c3-49f7-a93f-d9956f086d4b","Type":"ContainerDied","Data":"76c63f1533e7c3fa5624057c7772474f1729c45e568e970f6a98d3e337ab74ae"} Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.091254 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-db-sync-b4bcd" event={"ID":"1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec","Type":"ContainerDied","Data":"9f412f3971266eb4c52987ab5a1d19a79a80d58662b3e55c9fe1df068708db16"} Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.091280 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f412f3971266eb4c52987ab5a1d19a79a80d58662b3e55c9fe1df068708db16" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.091372 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-b4bcd" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.104287 4795 generic.go:334] "Generic (PLEG): container finished" podID="a2a7b298-40b6-43b3-9099-ec74f2f0bfad" containerID="da621ffa210adf5451b6eb9d78a9cbed2608c7b5e2b1a9ce00eba1c1e65c6ec4" exitCode=0 Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.104409 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-nffrq" event={"ID":"a2a7b298-40b6-43b3-9099-ec74f2f0bfad","Type":"ContainerDied","Data":"da621ffa210adf5451b6eb9d78a9cbed2608c7b5e2b1a9ce00eba1c1e65c6ec4"} Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.318120 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-db5c97f8f-vsr59"] Feb 19 21:46:35 crc kubenswrapper[4795]: E0219 21:46:35.318516 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f384824-f8ad-42d9-b09b-decb5280b448" containerName="init" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.318530 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f384824-f8ad-42d9-b09b-decb5280b448" containerName="init" Feb 19 21:46:35 crc kubenswrapper[4795]: E0219 21:46:35.318546 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f384824-f8ad-42d9-b09b-decb5280b448" containerName="dnsmasq-dns" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.318553 4795 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="2f384824-f8ad-42d9-b09b-decb5280b448" containerName="dnsmasq-dns" Feb 19 21:46:35 crc kubenswrapper[4795]: E0219 21:46:35.318562 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec" containerName="neutron-db-sync" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.318568 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec" containerName="neutron-db-sync" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.318703 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f384824-f8ad-42d9-b09b-decb5280b448" containerName="dnsmasq-dns" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.318728 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec" containerName="neutron-db-sync" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.319619 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-db5c97f8f-vsr59" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.332985 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-db5c97f8f-vsr59"] Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.424900 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1b5d1bab-ed1a-4a5e-a194-ad6b25d16210-ovsdbserver-nb\") pod \"dnsmasq-dns-db5c97f8f-vsr59\" (UID: \"1b5d1bab-ed1a-4a5e-a194-ad6b25d16210\") " pod="openstack/dnsmasq-dns-db5c97f8f-vsr59" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.424979 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1b5d1bab-ed1a-4a5e-a194-ad6b25d16210-ovsdbserver-sb\") pod \"dnsmasq-dns-db5c97f8f-vsr59\" (UID: \"1b5d1bab-ed1a-4a5e-a194-ad6b25d16210\") " 
pod="openstack/dnsmasq-dns-db5c97f8f-vsr59" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.425020 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1b5d1bab-ed1a-4a5e-a194-ad6b25d16210-dns-swift-storage-0\") pod \"dnsmasq-dns-db5c97f8f-vsr59\" (UID: \"1b5d1bab-ed1a-4a5e-a194-ad6b25d16210\") " pod="openstack/dnsmasq-dns-db5c97f8f-vsr59" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.425038 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rcql\" (UniqueName: \"kubernetes.io/projected/1b5d1bab-ed1a-4a5e-a194-ad6b25d16210-kube-api-access-9rcql\") pod \"dnsmasq-dns-db5c97f8f-vsr59\" (UID: \"1b5d1bab-ed1a-4a5e-a194-ad6b25d16210\") " pod="openstack/dnsmasq-dns-db5c97f8f-vsr59" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.425109 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b5d1bab-ed1a-4a5e-a194-ad6b25d16210-dns-svc\") pod \"dnsmasq-dns-db5c97f8f-vsr59\" (UID: \"1b5d1bab-ed1a-4a5e-a194-ad6b25d16210\") " pod="openstack/dnsmasq-dns-db5c97f8f-vsr59" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.425231 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b5d1bab-ed1a-4a5e-a194-ad6b25d16210-config\") pod \"dnsmasq-dns-db5c97f8f-vsr59\" (UID: \"1b5d1bab-ed1a-4a5e-a194-ad6b25d16210\") " pod="openstack/dnsmasq-dns-db5c97f8f-vsr59" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.478383 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-78d7c97684-8rgnf"] Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.480606 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-78d7c97684-8rgnf" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.483565 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.483669 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.483755 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.483855 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-nf9cn" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.525022 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-78d7c97684-8rgnf"] Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.526248 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2af40d0f-93fe-4592-a07b-0cee3eefbde5-ovndb-tls-certs\") pod \"neutron-78d7c97684-8rgnf\" (UID: \"2af40d0f-93fe-4592-a07b-0cee3eefbde5\") " pod="openstack/neutron-78d7c97684-8rgnf" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.526295 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kp5fp\" (UniqueName: \"kubernetes.io/projected/2af40d0f-93fe-4592-a07b-0cee3eefbde5-kube-api-access-kp5fp\") pod \"neutron-78d7c97684-8rgnf\" (UID: \"2af40d0f-93fe-4592-a07b-0cee3eefbde5\") " pod="openstack/neutron-78d7c97684-8rgnf" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.526323 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2af40d0f-93fe-4592-a07b-0cee3eefbde5-config\") pod \"neutron-78d7c97684-8rgnf\" (UID: 
\"2af40d0f-93fe-4592-a07b-0cee3eefbde5\") " pod="openstack/neutron-78d7c97684-8rgnf" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.526354 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b5d1bab-ed1a-4a5e-a194-ad6b25d16210-config\") pod \"dnsmasq-dns-db5c97f8f-vsr59\" (UID: \"1b5d1bab-ed1a-4a5e-a194-ad6b25d16210\") " pod="openstack/dnsmasq-dns-db5c97f8f-vsr59" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.526378 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2af40d0f-93fe-4592-a07b-0cee3eefbde5-combined-ca-bundle\") pod \"neutron-78d7c97684-8rgnf\" (UID: \"2af40d0f-93fe-4592-a07b-0cee3eefbde5\") " pod="openstack/neutron-78d7c97684-8rgnf" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.526398 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1b5d1bab-ed1a-4a5e-a194-ad6b25d16210-ovsdbserver-nb\") pod \"dnsmasq-dns-db5c97f8f-vsr59\" (UID: \"1b5d1bab-ed1a-4a5e-a194-ad6b25d16210\") " pod="openstack/dnsmasq-dns-db5c97f8f-vsr59" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.526423 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1b5d1bab-ed1a-4a5e-a194-ad6b25d16210-ovsdbserver-sb\") pod \"dnsmasq-dns-db5c97f8f-vsr59\" (UID: \"1b5d1bab-ed1a-4a5e-a194-ad6b25d16210\") " pod="openstack/dnsmasq-dns-db5c97f8f-vsr59" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.526441 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2af40d0f-93fe-4592-a07b-0cee3eefbde5-httpd-config\") pod \"neutron-78d7c97684-8rgnf\" (UID: \"2af40d0f-93fe-4592-a07b-0cee3eefbde5\") " 
pod="openstack/neutron-78d7c97684-8rgnf" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.526469 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1b5d1bab-ed1a-4a5e-a194-ad6b25d16210-dns-swift-storage-0\") pod \"dnsmasq-dns-db5c97f8f-vsr59\" (UID: \"1b5d1bab-ed1a-4a5e-a194-ad6b25d16210\") " pod="openstack/dnsmasq-dns-db5c97f8f-vsr59" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.526484 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rcql\" (UniqueName: \"kubernetes.io/projected/1b5d1bab-ed1a-4a5e-a194-ad6b25d16210-kube-api-access-9rcql\") pod \"dnsmasq-dns-db5c97f8f-vsr59\" (UID: \"1b5d1bab-ed1a-4a5e-a194-ad6b25d16210\") " pod="openstack/dnsmasq-dns-db5c97f8f-vsr59" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.526505 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b5d1bab-ed1a-4a5e-a194-ad6b25d16210-dns-svc\") pod \"dnsmasq-dns-db5c97f8f-vsr59\" (UID: \"1b5d1bab-ed1a-4a5e-a194-ad6b25d16210\") " pod="openstack/dnsmasq-dns-db5c97f8f-vsr59" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.527662 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b5d1bab-ed1a-4a5e-a194-ad6b25d16210-dns-svc\") pod \"dnsmasq-dns-db5c97f8f-vsr59\" (UID: \"1b5d1bab-ed1a-4a5e-a194-ad6b25d16210\") " pod="openstack/dnsmasq-dns-db5c97f8f-vsr59" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.528985 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1b5d1bab-ed1a-4a5e-a194-ad6b25d16210-ovsdbserver-sb\") pod \"dnsmasq-dns-db5c97f8f-vsr59\" (UID: \"1b5d1bab-ed1a-4a5e-a194-ad6b25d16210\") " pod="openstack/dnsmasq-dns-db5c97f8f-vsr59" Feb 19 21:46:35 crc kubenswrapper[4795]: 
I0219 21:46:35.529503 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1b5d1bab-ed1a-4a5e-a194-ad6b25d16210-ovsdbserver-nb\") pod \"dnsmasq-dns-db5c97f8f-vsr59\" (UID: \"1b5d1bab-ed1a-4a5e-a194-ad6b25d16210\") " pod="openstack/dnsmasq-dns-db5c97f8f-vsr59" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.529971 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1b5d1bab-ed1a-4a5e-a194-ad6b25d16210-dns-swift-storage-0\") pod \"dnsmasq-dns-db5c97f8f-vsr59\" (UID: \"1b5d1bab-ed1a-4a5e-a194-ad6b25d16210\") " pod="openstack/dnsmasq-dns-db5c97f8f-vsr59" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.531456 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b5d1bab-ed1a-4a5e-a194-ad6b25d16210-config\") pod \"dnsmasq-dns-db5c97f8f-vsr59\" (UID: \"1b5d1bab-ed1a-4a5e-a194-ad6b25d16210\") " pod="openstack/dnsmasq-dns-db5c97f8f-vsr59" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.549951 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rcql\" (UniqueName: \"kubernetes.io/projected/1b5d1bab-ed1a-4a5e-a194-ad6b25d16210-kube-api-access-9rcql\") pod \"dnsmasq-dns-db5c97f8f-vsr59\" (UID: \"1b5d1bab-ed1a-4a5e-a194-ad6b25d16210\") " pod="openstack/dnsmasq-dns-db5c97f8f-vsr59" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.618829 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-jkspq" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.628129 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2af40d0f-93fe-4592-a07b-0cee3eefbde5-ovndb-tls-certs\") pod \"neutron-78d7c97684-8rgnf\" (UID: \"2af40d0f-93fe-4592-a07b-0cee3eefbde5\") " pod="openstack/neutron-78d7c97684-8rgnf" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.628544 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kp5fp\" (UniqueName: \"kubernetes.io/projected/2af40d0f-93fe-4592-a07b-0cee3eefbde5-kube-api-access-kp5fp\") pod \"neutron-78d7c97684-8rgnf\" (UID: \"2af40d0f-93fe-4592-a07b-0cee3eefbde5\") " pod="openstack/neutron-78d7c97684-8rgnf" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.628584 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2af40d0f-93fe-4592-a07b-0cee3eefbde5-config\") pod \"neutron-78d7c97684-8rgnf\" (UID: \"2af40d0f-93fe-4592-a07b-0cee3eefbde5\") " pod="openstack/neutron-78d7c97684-8rgnf" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.628623 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2af40d0f-93fe-4592-a07b-0cee3eefbde5-combined-ca-bundle\") pod \"neutron-78d7c97684-8rgnf\" (UID: \"2af40d0f-93fe-4592-a07b-0cee3eefbde5\") " pod="openstack/neutron-78d7c97684-8rgnf" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.628650 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2af40d0f-93fe-4592-a07b-0cee3eefbde5-httpd-config\") pod \"neutron-78d7c97684-8rgnf\" (UID: \"2af40d0f-93fe-4592-a07b-0cee3eefbde5\") " pod="openstack/neutron-78d7c97684-8rgnf" Feb 19 21:46:35 crc 
kubenswrapper[4795]: I0219 21:46:35.634726 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2af40d0f-93fe-4592-a07b-0cee3eefbde5-httpd-config\") pod \"neutron-78d7c97684-8rgnf\" (UID: \"2af40d0f-93fe-4592-a07b-0cee3eefbde5\") " pod="openstack/neutron-78d7c97684-8rgnf" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.641032 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2af40d0f-93fe-4592-a07b-0cee3eefbde5-combined-ca-bundle\") pod \"neutron-78d7c97684-8rgnf\" (UID: \"2af40d0f-93fe-4592-a07b-0cee3eefbde5\") " pod="openstack/neutron-78d7c97684-8rgnf" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.642942 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2af40d0f-93fe-4592-a07b-0cee3eefbde5-ovndb-tls-certs\") pod \"neutron-78d7c97684-8rgnf\" (UID: \"2af40d0f-93fe-4592-a07b-0cee3eefbde5\") " pod="openstack/neutron-78d7c97684-8rgnf" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.643419 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-db5c97f8f-vsr59" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.647623 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/2af40d0f-93fe-4592-a07b-0cee3eefbde5-config\") pod \"neutron-78d7c97684-8rgnf\" (UID: \"2af40d0f-93fe-4592-a07b-0cee3eefbde5\") " pod="openstack/neutron-78d7c97684-8rgnf" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.660930 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kp5fp\" (UniqueName: \"kubernetes.io/projected/2af40d0f-93fe-4592-a07b-0cee3eefbde5-kube-api-access-kp5fp\") pod \"neutron-78d7c97684-8rgnf\" (UID: \"2af40d0f-93fe-4592-a07b-0cee3eefbde5\") " pod="openstack/neutron-78d7c97684-8rgnf" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.730241 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/454af6b2-4c9e-4706-a537-b3e3d468353d-logs\") pod \"454af6b2-4c9e-4706-a537-b3e3d468353d\" (UID: \"454af6b2-4c9e-4706-a537-b3e3d468353d\") " Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.730363 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppjmb\" (UniqueName: \"kubernetes.io/projected/454af6b2-4c9e-4706-a537-b3e3d468353d-kube-api-access-ppjmb\") pod \"454af6b2-4c9e-4706-a537-b3e3d468353d\" (UID: \"454af6b2-4c9e-4706-a537-b3e3d468353d\") " Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.730476 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/454af6b2-4c9e-4706-a537-b3e3d468353d-combined-ca-bundle\") pod \"454af6b2-4c9e-4706-a537-b3e3d468353d\" (UID: \"454af6b2-4c9e-4706-a537-b3e3d468353d\") " Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.730529 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/454af6b2-4c9e-4706-a537-b3e3d468353d-scripts\") pod \"454af6b2-4c9e-4706-a537-b3e3d468353d\" (UID: \"454af6b2-4c9e-4706-a537-b3e3d468353d\") " Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.730588 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/454af6b2-4c9e-4706-a537-b3e3d468353d-config-data\") pod \"454af6b2-4c9e-4706-a537-b3e3d468353d\" (UID: \"454af6b2-4c9e-4706-a537-b3e3d468353d\") " Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.732233 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/454af6b2-4c9e-4706-a537-b3e3d468353d-logs" (OuterVolumeSpecName: "logs") pod "454af6b2-4c9e-4706-a537-b3e3d468353d" (UID: "454af6b2-4c9e-4706-a537-b3e3d468353d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.737408 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/454af6b2-4c9e-4706-a537-b3e3d468353d-scripts" (OuterVolumeSpecName: "scripts") pod "454af6b2-4c9e-4706-a537-b3e3d468353d" (UID: "454af6b2-4c9e-4706-a537-b3e3d468353d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.737718 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/454af6b2-4c9e-4706-a537-b3e3d468353d-kube-api-access-ppjmb" (OuterVolumeSpecName: "kube-api-access-ppjmb") pod "454af6b2-4c9e-4706-a537-b3e3d468353d" (UID: "454af6b2-4c9e-4706-a537-b3e3d468353d"). InnerVolumeSpecName "kube-api-access-ppjmb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.788378 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/454af6b2-4c9e-4706-a537-b3e3d468353d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "454af6b2-4c9e-4706-a537-b3e3d468353d" (UID: "454af6b2-4c9e-4706-a537-b3e3d468353d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.793474 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/454af6b2-4c9e-4706-a537-b3e3d468353d-config-data" (OuterVolumeSpecName: "config-data") pod "454af6b2-4c9e-4706-a537-b3e3d468353d" (UID: "454af6b2-4c9e-4706-a537-b3e3d468353d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.812511 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-78d7c97684-8rgnf" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.833657 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/454af6b2-4c9e-4706-a537-b3e3d468353d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.833690 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/454af6b2-4c9e-4706-a537-b3e3d468353d-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.833700 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/454af6b2-4c9e-4706-a537-b3e3d468353d-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.833712 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/454af6b2-4c9e-4706-a537-b3e3d468353d-logs\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:35 crc kubenswrapper[4795]: I0219 21:46:35.833722 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppjmb\" (UniqueName: \"kubernetes.io/projected/454af6b2-4c9e-4706-a537-b3e3d468353d-kube-api-access-ppjmb\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.121046 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-jkspq" event={"ID":"454af6b2-4c9e-4706-a537-b3e3d468353d","Type":"ContainerDied","Data":"2f56b820ecab38a5f4f0b9d439b2a430292837a387794fcf0430bf0243ab99f1"} Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.121512 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f56b820ecab38a5f4f0b9d439b2a430292837a387794fcf0430bf0243ab99f1" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.121397 4795 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-jkspq" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.148893 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-db5c97f8f-vsr59"] Feb 19 21:46:36 crc kubenswrapper[4795]: W0219 21:46:36.161440 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b5d1bab_ed1a_4a5e_a194_ad6b25d16210.slice/crio-d1d8cd14d0dca706ef7e60716badbe8723a3cc32c8f0ce72a5ad49bcb63f1b70 WatchSource:0}: Error finding container d1d8cd14d0dca706ef7e60716badbe8723a3cc32c8f0ce72a5ad49bcb63f1b70: Status 404 returned error can't find the container with id d1d8cd14d0dca706ef7e60716badbe8723a3cc32c8f0ce72a5ad49bcb63f1b70 Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.283932 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-76bfdcc9c4-d56mx"] Feb 19 21:46:36 crc kubenswrapper[4795]: E0219 21:46:36.284318 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="454af6b2-4c9e-4706-a537-b3e3d468353d" containerName="placement-db-sync" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.284335 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="454af6b2-4c9e-4706-a537-b3e3d468353d" containerName="placement-db-sync" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.284488 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="454af6b2-4c9e-4706-a537-b3e3d468353d" containerName="placement-db-sync" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.285343 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-76bfdcc9c4-d56mx" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.287295 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.287739 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.287831 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.288006 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-bkmsl" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.290111 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.303580 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-76bfdcc9c4-d56mx"] Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.355829 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd5855e1-cadb-4170-8339-5f10945c6ce9-scripts\") pod \"placement-76bfdcc9c4-d56mx\" (UID: \"bd5855e1-cadb-4170-8339-5f10945c6ce9\") " pod="openstack/placement-76bfdcc9c4-d56mx" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.355919 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdw57\" (UniqueName: \"kubernetes.io/projected/bd5855e1-cadb-4170-8339-5f10945c6ce9-kube-api-access-fdw57\") pod \"placement-76bfdcc9c4-d56mx\" (UID: \"bd5855e1-cadb-4170-8339-5f10945c6ce9\") " pod="openstack/placement-76bfdcc9c4-d56mx" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.355944 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd5855e1-cadb-4170-8339-5f10945c6ce9-public-tls-certs\") pod \"placement-76bfdcc9c4-d56mx\" (UID: \"bd5855e1-cadb-4170-8339-5f10945c6ce9\") " pod="openstack/placement-76bfdcc9c4-d56mx" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.355999 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd5855e1-cadb-4170-8339-5f10945c6ce9-combined-ca-bundle\") pod \"placement-76bfdcc9c4-d56mx\" (UID: \"bd5855e1-cadb-4170-8339-5f10945c6ce9\") " pod="openstack/placement-76bfdcc9c4-d56mx" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.356425 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd5855e1-cadb-4170-8339-5f10945c6ce9-internal-tls-certs\") pod \"placement-76bfdcc9c4-d56mx\" (UID: \"bd5855e1-cadb-4170-8339-5f10945c6ce9\") " pod="openstack/placement-76bfdcc9c4-d56mx" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.356696 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd5855e1-cadb-4170-8339-5f10945c6ce9-logs\") pod \"placement-76bfdcc9c4-d56mx\" (UID: \"bd5855e1-cadb-4170-8339-5f10945c6ce9\") " pod="openstack/placement-76bfdcc9c4-d56mx" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.356811 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd5855e1-cadb-4170-8339-5f10945c6ce9-config-data\") pod \"placement-76bfdcc9c4-d56mx\" (UID: \"bd5855e1-cadb-4170-8339-5f10945c6ce9\") " pod="openstack/placement-76bfdcc9c4-d56mx" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.445608 4795 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/neutron-78d7c97684-8rgnf"] Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.466700 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd5855e1-cadb-4170-8339-5f10945c6ce9-logs\") pod \"placement-76bfdcc9c4-d56mx\" (UID: \"bd5855e1-cadb-4170-8339-5f10945c6ce9\") " pod="openstack/placement-76bfdcc9c4-d56mx" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.466814 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd5855e1-cadb-4170-8339-5f10945c6ce9-config-data\") pod \"placement-76bfdcc9c4-d56mx\" (UID: \"bd5855e1-cadb-4170-8339-5f10945c6ce9\") " pod="openstack/placement-76bfdcc9c4-d56mx" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.466921 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd5855e1-cadb-4170-8339-5f10945c6ce9-scripts\") pod \"placement-76bfdcc9c4-d56mx\" (UID: \"bd5855e1-cadb-4170-8339-5f10945c6ce9\") " pod="openstack/placement-76bfdcc9c4-d56mx" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.467434 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd5855e1-cadb-4170-8339-5f10945c6ce9-logs\") pod \"placement-76bfdcc9c4-d56mx\" (UID: \"bd5855e1-cadb-4170-8339-5f10945c6ce9\") " pod="openstack/placement-76bfdcc9c4-d56mx" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.467976 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdw57\" (UniqueName: \"kubernetes.io/projected/bd5855e1-cadb-4170-8339-5f10945c6ce9-kube-api-access-fdw57\") pod \"placement-76bfdcc9c4-d56mx\" (UID: \"bd5855e1-cadb-4170-8339-5f10945c6ce9\") " pod="openstack/placement-76bfdcc9c4-d56mx" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 
21:46:36.468017 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd5855e1-cadb-4170-8339-5f10945c6ce9-public-tls-certs\") pod \"placement-76bfdcc9c4-d56mx\" (UID: \"bd5855e1-cadb-4170-8339-5f10945c6ce9\") " pod="openstack/placement-76bfdcc9c4-d56mx" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.468055 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd5855e1-cadb-4170-8339-5f10945c6ce9-combined-ca-bundle\") pod \"placement-76bfdcc9c4-d56mx\" (UID: \"bd5855e1-cadb-4170-8339-5f10945c6ce9\") " pod="openstack/placement-76bfdcc9c4-d56mx" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.468193 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd5855e1-cadb-4170-8339-5f10945c6ce9-internal-tls-certs\") pod \"placement-76bfdcc9c4-d56mx\" (UID: \"bd5855e1-cadb-4170-8339-5f10945c6ce9\") " pod="openstack/placement-76bfdcc9c4-d56mx" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.475937 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd5855e1-cadb-4170-8339-5f10945c6ce9-public-tls-certs\") pod \"placement-76bfdcc9c4-d56mx\" (UID: \"bd5855e1-cadb-4170-8339-5f10945c6ce9\") " pod="openstack/placement-76bfdcc9c4-d56mx" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.475969 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd5855e1-cadb-4170-8339-5f10945c6ce9-combined-ca-bundle\") pod \"placement-76bfdcc9c4-d56mx\" (UID: \"bd5855e1-cadb-4170-8339-5f10945c6ce9\") " pod="openstack/placement-76bfdcc9c4-d56mx" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.476355 4795 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd5855e1-cadb-4170-8339-5f10945c6ce9-config-data\") pod \"placement-76bfdcc9c4-d56mx\" (UID: \"bd5855e1-cadb-4170-8339-5f10945c6ce9\") " pod="openstack/placement-76bfdcc9c4-d56mx" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.479125 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd5855e1-cadb-4170-8339-5f10945c6ce9-internal-tls-certs\") pod \"placement-76bfdcc9c4-d56mx\" (UID: \"bd5855e1-cadb-4170-8339-5f10945c6ce9\") " pod="openstack/placement-76bfdcc9c4-d56mx" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.479523 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd5855e1-cadb-4170-8339-5f10945c6ce9-scripts\") pod \"placement-76bfdcc9c4-d56mx\" (UID: \"bd5855e1-cadb-4170-8339-5f10945c6ce9\") " pod="openstack/placement-76bfdcc9c4-d56mx" Feb 19 21:46:36 crc kubenswrapper[4795]: W0219 21:46:36.481513 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2af40d0f_93fe_4592_a07b_0cee3eefbde5.slice/crio-0c9496366294be7a896bc72ec610b06b4b589bece135bb9b445591c9fe5a825f WatchSource:0}: Error finding container 0c9496366294be7a896bc72ec610b06b4b589bece135bb9b445591c9fe5a825f: Status 404 returned error can't find the container with id 0c9496366294be7a896bc72ec610b06b4b589bece135bb9b445591c9fe5a825f Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.487625 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdw57\" (UniqueName: \"kubernetes.io/projected/bd5855e1-cadb-4170-8339-5f10945c6ce9-kube-api-access-fdw57\") pod \"placement-76bfdcc9c4-d56mx\" (UID: \"bd5855e1-cadb-4170-8339-5f10945c6ce9\") " pod="openstack/placement-76bfdcc9c4-d56mx" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.617832 4795 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-76bfdcc9c4-d56mx" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.816991 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-nffrq" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.820259 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-ttz5x" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.875756 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e61f40e0-d6c3-49f7-a93f-d9956f086d4b-db-sync-config-data\") pod \"e61f40e0-d6c3-49f7-a93f-d9956f086d4b\" (UID: \"e61f40e0-d6c3-49f7-a93f-d9956f086d4b\") " Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.875795 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a2a7b298-40b6-43b3-9099-ec74f2f0bfad-credential-keys\") pod \"a2a7b298-40b6-43b3-9099-ec74f2f0bfad\" (UID: \"a2a7b298-40b6-43b3-9099-ec74f2f0bfad\") " Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.875831 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2a7b298-40b6-43b3-9099-ec74f2f0bfad-config-data\") pod \"a2a7b298-40b6-43b3-9099-ec74f2f0bfad\" (UID: \"a2a7b298-40b6-43b3-9099-ec74f2f0bfad\") " Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.875878 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g59s7\" (UniqueName: \"kubernetes.io/projected/a2a7b298-40b6-43b3-9099-ec74f2f0bfad-kube-api-access-g59s7\") pod \"a2a7b298-40b6-43b3-9099-ec74f2f0bfad\" (UID: \"a2a7b298-40b6-43b3-9099-ec74f2f0bfad\") " Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.875919 4795 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a2a7b298-40b6-43b3-9099-ec74f2f0bfad-fernet-keys\") pod \"a2a7b298-40b6-43b3-9099-ec74f2f0bfad\" (UID: \"a2a7b298-40b6-43b3-9099-ec74f2f0bfad\") " Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.875945 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e61f40e0-d6c3-49f7-a93f-d9956f086d4b-combined-ca-bundle\") pod \"e61f40e0-d6c3-49f7-a93f-d9956f086d4b\" (UID: \"e61f40e0-d6c3-49f7-a93f-d9956f086d4b\") " Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.875987 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2a7b298-40b6-43b3-9099-ec74f2f0bfad-combined-ca-bundle\") pod \"a2a7b298-40b6-43b3-9099-ec74f2f0bfad\" (UID: \"a2a7b298-40b6-43b3-9099-ec74f2f0bfad\") " Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.876026 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7sch\" (UniqueName: \"kubernetes.io/projected/e61f40e0-d6c3-49f7-a93f-d9956f086d4b-kube-api-access-k7sch\") pod \"e61f40e0-d6c3-49f7-a93f-d9956f086d4b\" (UID: \"e61f40e0-d6c3-49f7-a93f-d9956f086d4b\") " Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.876095 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a2a7b298-40b6-43b3-9099-ec74f2f0bfad-scripts\") pod \"a2a7b298-40b6-43b3-9099-ec74f2f0bfad\" (UID: \"a2a7b298-40b6-43b3-9099-ec74f2f0bfad\") " Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.888997 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2a7b298-40b6-43b3-9099-ec74f2f0bfad-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "a2a7b298-40b6-43b3-9099-ec74f2f0bfad" 
(UID: "a2a7b298-40b6-43b3-9099-ec74f2f0bfad"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.890114 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2a7b298-40b6-43b3-9099-ec74f2f0bfad-scripts" (OuterVolumeSpecName: "scripts") pod "a2a7b298-40b6-43b3-9099-ec74f2f0bfad" (UID: "a2a7b298-40b6-43b3-9099-ec74f2f0bfad"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.890548 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e61f40e0-d6c3-49f7-a93f-d9956f086d4b-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "e61f40e0-d6c3-49f7-a93f-d9956f086d4b" (UID: "e61f40e0-d6c3-49f7-a93f-d9956f086d4b"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.890908 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2a7b298-40b6-43b3-9099-ec74f2f0bfad-kube-api-access-g59s7" (OuterVolumeSpecName: "kube-api-access-g59s7") pod "a2a7b298-40b6-43b3-9099-ec74f2f0bfad" (UID: "a2a7b298-40b6-43b3-9099-ec74f2f0bfad"). InnerVolumeSpecName "kube-api-access-g59s7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.892641 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e61f40e0-d6c3-49f7-a93f-d9956f086d4b-kube-api-access-k7sch" (OuterVolumeSpecName: "kube-api-access-k7sch") pod "e61f40e0-d6c3-49f7-a93f-d9956f086d4b" (UID: "e61f40e0-d6c3-49f7-a93f-d9956f086d4b"). InnerVolumeSpecName "kube-api-access-k7sch". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.899522 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2a7b298-40b6-43b3-9099-ec74f2f0bfad-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "a2a7b298-40b6-43b3-9099-ec74f2f0bfad" (UID: "a2a7b298-40b6-43b3-9099-ec74f2f0bfad"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.933759 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2a7b298-40b6-43b3-9099-ec74f2f0bfad-config-data" (OuterVolumeSpecName: "config-data") pod "a2a7b298-40b6-43b3-9099-ec74f2f0bfad" (UID: "a2a7b298-40b6-43b3-9099-ec74f2f0bfad"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.942235 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2a7b298-40b6-43b3-9099-ec74f2f0bfad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a2a7b298-40b6-43b3-9099-ec74f2f0bfad" (UID: "a2a7b298-40b6-43b3-9099-ec74f2f0bfad"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.954858 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e61f40e0-d6c3-49f7-a93f-d9956f086d4b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e61f40e0-d6c3-49f7-a93f-d9956f086d4b" (UID: "e61f40e0-d6c3-49f7-a93f-d9956f086d4b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.979636 4795 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a2a7b298-40b6-43b3-9099-ec74f2f0bfad-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.979684 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e61f40e0-d6c3-49f7-a93f-d9956f086d4b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.979695 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2a7b298-40b6-43b3-9099-ec74f2f0bfad-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.979705 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7sch\" (UniqueName: \"kubernetes.io/projected/e61f40e0-d6c3-49f7-a93f-d9956f086d4b-kube-api-access-k7sch\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.979713 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a2a7b298-40b6-43b3-9099-ec74f2f0bfad-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.979721 4795 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e61f40e0-d6c3-49f7-a93f-d9956f086d4b-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.979730 4795 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a2a7b298-40b6-43b3-9099-ec74f2f0bfad-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.979737 
4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2a7b298-40b6-43b3-9099-ec74f2f0bfad-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:36 crc kubenswrapper[4795]: I0219 21:46:36.979745 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g59s7\" (UniqueName: \"kubernetes.io/projected/a2a7b298-40b6-43b3-9099-ec74f2f0bfad-kube-api-access-g59s7\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.097253 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-76bfdcc9c4-d56mx"] Feb 19 21:46:37 crc kubenswrapper[4795]: W0219 21:46:37.106332 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd5855e1_cadb_4170_8339_5f10945c6ce9.slice/crio-fd2853bf461ad171b9d2aba20648d237de7c6fd1d48533d33bcc6a56c0d7fd46 WatchSource:0}: Error finding container fd2853bf461ad171b9d2aba20648d237de7c6fd1d48533d33bcc6a56c0d7fd46: Status 404 returned error can't find the container with id fd2853bf461ad171b9d2aba20648d237de7c6fd1d48533d33bcc6a56c0d7fd46 Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.135613 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-nffrq" event={"ID":"a2a7b298-40b6-43b3-9099-ec74f2f0bfad","Type":"ContainerDied","Data":"476e1da2e4d000d91846aabf81bf70ab643e3d012d9f8cc8cadac543daeaf6ad"} Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.135668 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="476e1da2e4d000d91846aabf81bf70ab643e3d012d9f8cc8cadac543daeaf6ad" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.135680 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-nffrq" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.137921 4795 generic.go:334] "Generic (PLEG): container finished" podID="1b5d1bab-ed1a-4a5e-a194-ad6b25d16210" containerID="c0639f1ced9910964acb68b012da7b3244f83a7c9a5d9a111a018fafc99f2185" exitCode=0 Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.137984 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-db5c97f8f-vsr59" event={"ID":"1b5d1bab-ed1a-4a5e-a194-ad6b25d16210","Type":"ContainerDied","Data":"c0639f1ced9910964acb68b012da7b3244f83a7c9a5d9a111a018fafc99f2185"} Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.138009 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-db5c97f8f-vsr59" event={"ID":"1b5d1bab-ed1a-4a5e-a194-ad6b25d16210","Type":"ContainerStarted","Data":"d1d8cd14d0dca706ef7e60716badbe8723a3cc32c8f0ce72a5ad49bcb63f1b70"} Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.144777 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-78d7c97684-8rgnf" event={"ID":"2af40d0f-93fe-4592-a07b-0cee3eefbde5","Type":"ContainerStarted","Data":"2b75649a0876fc8b2fab0d3c547c6041ea1386b0e2f40ee0a4dc2afb93521ff9"} Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.144833 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-78d7c97684-8rgnf" event={"ID":"2af40d0f-93fe-4592-a07b-0cee3eefbde5","Type":"ContainerStarted","Data":"45b6d7396950a55ef31e5b8bf7af7c2f0a2555905f99260630834dbe1b48c393"} Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.144848 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-78d7c97684-8rgnf" event={"ID":"2af40d0f-93fe-4592-a07b-0cee3eefbde5","Type":"ContainerStarted","Data":"0c9496366294be7a896bc72ec610b06b4b589bece135bb9b445591c9fe5a825f"} Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.145001 4795 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/neutron-78d7c97684-8rgnf" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.146441 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-76bfdcc9c4-d56mx" event={"ID":"bd5855e1-cadb-4170-8339-5f10945c6ce9","Type":"ContainerStarted","Data":"fd2853bf461ad171b9d2aba20648d237de7c6fd1d48533d33bcc6a56c0d7fd46"} Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.175704 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-ttz5x" event={"ID":"e61f40e0-d6c3-49f7-a93f-d9956f086d4b","Type":"ContainerDied","Data":"e8b62c3b7599823f742eaf7d07a077b7a0c11e4d7c8b7810bc63fd0eb218d413"} Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.175739 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e8b62c3b7599823f742eaf7d07a077b7a0c11e4d7c8b7810bc63fd0eb218d413" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.175800 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-ttz5x" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.192912 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-78d7c97684-8rgnf" podStartSLOduration=2.192895103 podStartE2EDuration="2.192895103s" podCreationTimestamp="2026-02-19 21:46:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:46:37.190768123 +0000 UTC m=+1108.383286007" watchObservedRunningTime="2026-02-19 21:46:37.192895103 +0000 UTC m=+1108.385412967" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.235779 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-6945f64f65-rnq2b"] Feb 19 21:46:37 crc kubenswrapper[4795]: E0219 21:46:37.236072 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2a7b298-40b6-43b3-9099-ec74f2f0bfad" containerName="keystone-bootstrap" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.236089 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2a7b298-40b6-43b3-9099-ec74f2f0bfad" containerName="keystone-bootstrap" Feb 19 21:46:37 crc kubenswrapper[4795]: E0219 21:46:37.236116 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e61f40e0-d6c3-49f7-a93f-d9956f086d4b" containerName="barbican-db-sync" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.236124 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e61f40e0-d6c3-49f7-a93f-d9956f086d4b" containerName="barbican-db-sync" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.236289 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="e61f40e0-d6c3-49f7-a93f-d9956f086d4b" containerName="barbican-db-sync" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.236302 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2a7b298-40b6-43b3-9099-ec74f2f0bfad" containerName="keystone-bootstrap" 
Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.236817 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6945f64f65-rnq2b" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.250154 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-8pz2x" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.250439 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.253506 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.253653 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.253940 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.256418 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.276133 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6945f64f65-rnq2b"] Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.403995 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-internal-tls-certs\") pod \"keystone-6945f64f65-rnq2b\" (UID: \"4b928260-ac65-479d-bd4b-f14b48d24ddb\") " pod="openstack/keystone-6945f64f65-rnq2b" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.404343 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-scripts\") pod \"keystone-6945f64f65-rnq2b\" (UID: \"4b928260-ac65-479d-bd4b-f14b48d24ddb\") " pod="openstack/keystone-6945f64f65-rnq2b" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.404452 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-credential-keys\") pod \"keystone-6945f64f65-rnq2b\" (UID: \"4b928260-ac65-479d-bd4b-f14b48d24ddb\") " pod="openstack/keystone-6945f64f65-rnq2b" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.404469 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-fernet-keys\") pod \"keystone-6945f64f65-rnq2b\" (UID: \"4b928260-ac65-479d-bd4b-f14b48d24ddb\") " pod="openstack/keystone-6945f64f65-rnq2b" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.404492 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8hb5\" (UniqueName: \"kubernetes.io/projected/4b928260-ac65-479d-bd4b-f14b48d24ddb-kube-api-access-v8hb5\") pod \"keystone-6945f64f65-rnq2b\" (UID: \"4b928260-ac65-479d-bd4b-f14b48d24ddb\") " pod="openstack/keystone-6945f64f65-rnq2b" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.404536 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-config-data\") pod \"keystone-6945f64f65-rnq2b\" (UID: \"4b928260-ac65-479d-bd4b-f14b48d24ddb\") " pod="openstack/keystone-6945f64f65-rnq2b" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.404552 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-public-tls-certs\") pod \"keystone-6945f64f65-rnq2b\" (UID: \"4b928260-ac65-479d-bd4b-f14b48d24ddb\") " pod="openstack/keystone-6945f64f65-rnq2b" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.404575 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-combined-ca-bundle\") pod \"keystone-6945f64f65-rnq2b\" (UID: \"4b928260-ac65-479d-bd4b-f14b48d24ddb\") " pod="openstack/keystone-6945f64f65-rnq2b" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.429275 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-54956bdb55-m77pq"] Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.431014 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-54956bdb55-m77pq" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.455183 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-5d487967c6-th49z"] Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.458148 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-5d487967c6-th49z" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.469472 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-54956bdb55-m77pq"] Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.472643 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-xv49j" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.472783 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.472881 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.473074 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.476429 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5d487967c6-th49z"] Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.506046 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/159cf586-d51b-49e8-bea8-99af238b8a3e-config-data\") pod \"barbican-worker-54956bdb55-m77pq\" (UID: \"159cf586-d51b-49e8-bea8-99af238b8a3e\") " pod="openstack/barbican-worker-54956bdb55-m77pq" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.506094 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/159cf586-d51b-49e8-bea8-99af238b8a3e-config-data-custom\") pod \"barbican-worker-54956bdb55-m77pq\" (UID: \"159cf586-d51b-49e8-bea8-99af238b8a3e\") " pod="openstack/barbican-worker-54956bdb55-m77pq" Feb 19 21:46:37 crc kubenswrapper[4795]: 
I0219 21:46:37.506121 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-credential-keys\") pod \"keystone-6945f64f65-rnq2b\" (UID: \"4b928260-ac65-479d-bd4b-f14b48d24ddb\") " pod="openstack/keystone-6945f64f65-rnq2b" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.506135 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-fernet-keys\") pod \"keystone-6945f64f65-rnq2b\" (UID: \"4b928260-ac65-479d-bd4b-f14b48d24ddb\") " pod="openstack/keystone-6945f64f65-rnq2b" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.506158 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8hb5\" (UniqueName: \"kubernetes.io/projected/4b928260-ac65-479d-bd4b-f14b48d24ddb-kube-api-access-v8hb5\") pod \"keystone-6945f64f65-rnq2b\" (UID: \"4b928260-ac65-479d-bd4b-f14b48d24ddb\") " pod="openstack/keystone-6945f64f65-rnq2b" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.506209 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/159cf586-d51b-49e8-bea8-99af238b8a3e-combined-ca-bundle\") pod \"barbican-worker-54956bdb55-m77pq\" (UID: \"159cf586-d51b-49e8-bea8-99af238b8a3e\") " pod="openstack/barbican-worker-54956bdb55-m77pq" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.506251 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-config-data\") pod \"keystone-6945f64f65-rnq2b\" (UID: \"4b928260-ac65-479d-bd4b-f14b48d24ddb\") " pod="openstack/keystone-6945f64f65-rnq2b" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.506267 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-public-tls-certs\") pod \"keystone-6945f64f65-rnq2b\" (UID: \"4b928260-ac65-479d-bd4b-f14b48d24ddb\") " pod="openstack/keystone-6945f64f65-rnq2b" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.506287 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/159cf586-d51b-49e8-bea8-99af238b8a3e-logs\") pod \"barbican-worker-54956bdb55-m77pq\" (UID: \"159cf586-d51b-49e8-bea8-99af238b8a3e\") " pod="openstack/barbican-worker-54956bdb55-m77pq" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.506307 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-combined-ca-bundle\") pod \"keystone-6945f64f65-rnq2b\" (UID: \"4b928260-ac65-479d-bd4b-f14b48d24ddb\") " pod="openstack/keystone-6945f64f65-rnq2b" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.506332 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-internal-tls-certs\") pod \"keystone-6945f64f65-rnq2b\" (UID: \"4b928260-ac65-479d-bd4b-f14b48d24ddb\") " pod="openstack/keystone-6945f64f65-rnq2b" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.506347 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cp762\" (UniqueName: \"kubernetes.io/projected/159cf586-d51b-49e8-bea8-99af238b8a3e-kube-api-access-cp762\") pod \"barbican-worker-54956bdb55-m77pq\" (UID: \"159cf586-d51b-49e8-bea8-99af238b8a3e\") " pod="openstack/barbican-worker-54956bdb55-m77pq" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.506367 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-scripts\") pod \"keystone-6945f64f65-rnq2b\" (UID: \"4b928260-ac65-479d-bd4b-f14b48d24ddb\") " pod="openstack/keystone-6945f64f65-rnq2b" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.552029 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-combined-ca-bundle\") pod \"keystone-6945f64f65-rnq2b\" (UID: \"4b928260-ac65-479d-bd4b-f14b48d24ddb\") " pod="openstack/keystone-6945f64f65-rnq2b" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.552683 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-config-data\") pod \"keystone-6945f64f65-rnq2b\" (UID: \"4b928260-ac65-479d-bd4b-f14b48d24ddb\") " pod="openstack/keystone-6945f64f65-rnq2b" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.552994 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-credential-keys\") pod \"keystone-6945f64f65-rnq2b\" (UID: \"4b928260-ac65-479d-bd4b-f14b48d24ddb\") " pod="openstack/keystone-6945f64f65-rnq2b" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.553410 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-internal-tls-certs\") pod \"keystone-6945f64f65-rnq2b\" (UID: \"4b928260-ac65-479d-bd4b-f14b48d24ddb\") " pod="openstack/keystone-6945f64f65-rnq2b" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.557433 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-public-tls-certs\") pod \"keystone-6945f64f65-rnq2b\" (UID: \"4b928260-ac65-479d-bd4b-f14b48d24ddb\") " pod="openstack/keystone-6945f64f65-rnq2b" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.557848 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-scripts\") pod \"keystone-6945f64f65-rnq2b\" (UID: \"4b928260-ac65-479d-bd4b-f14b48d24ddb\") " pod="openstack/keystone-6945f64f65-rnq2b" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.567554 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-fernet-keys\") pod \"keystone-6945f64f65-rnq2b\" (UID: \"4b928260-ac65-479d-bd4b-f14b48d24ddb\") " pod="openstack/keystone-6945f64f65-rnq2b" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.611447 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/159cf586-d51b-49e8-bea8-99af238b8a3e-logs\") pod \"barbican-worker-54956bdb55-m77pq\" (UID: \"159cf586-d51b-49e8-bea8-99af238b8a3e\") " pod="openstack/barbican-worker-54956bdb55-m77pq" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.611506 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62013c47-67bc-44c5-a250-390102661c05-logs\") pod \"barbican-keystone-listener-5d487967c6-th49z\" (UID: \"62013c47-67bc-44c5-a250-390102661c05\") " pod="openstack/barbican-keystone-listener-5d487967c6-th49z" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.611533 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cp762\" (UniqueName: \"kubernetes.io/projected/159cf586-d51b-49e8-bea8-99af238b8a3e-kube-api-access-cp762\") pod 
\"barbican-worker-54956bdb55-m77pq\" (UID: \"159cf586-d51b-49e8-bea8-99af238b8a3e\") " pod="openstack/barbican-worker-54956bdb55-m77pq" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.617654 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/159cf586-d51b-49e8-bea8-99af238b8a3e-logs\") pod \"barbican-worker-54956bdb55-m77pq\" (UID: \"159cf586-d51b-49e8-bea8-99af238b8a3e\") " pod="openstack/barbican-worker-54956bdb55-m77pq" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.633049 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slh8s\" (UniqueName: \"kubernetes.io/projected/62013c47-67bc-44c5-a250-390102661c05-kube-api-access-slh8s\") pod \"barbican-keystone-listener-5d487967c6-th49z\" (UID: \"62013c47-67bc-44c5-a250-390102661c05\") " pod="openstack/barbican-keystone-listener-5d487967c6-th49z" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.633191 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62013c47-67bc-44c5-a250-390102661c05-config-data\") pod \"barbican-keystone-listener-5d487967c6-th49z\" (UID: \"62013c47-67bc-44c5-a250-390102661c05\") " pod="openstack/barbican-keystone-listener-5d487967c6-th49z" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.633218 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/62013c47-67bc-44c5-a250-390102661c05-config-data-custom\") pod \"barbican-keystone-listener-5d487967c6-th49z\" (UID: \"62013c47-67bc-44c5-a250-390102661c05\") " pod="openstack/barbican-keystone-listener-5d487967c6-th49z" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.633256 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/159cf586-d51b-49e8-bea8-99af238b8a3e-config-data\") pod \"barbican-worker-54956bdb55-m77pq\" (UID: \"159cf586-d51b-49e8-bea8-99af238b8a3e\") " pod="openstack/barbican-worker-54956bdb55-m77pq" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.633310 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/159cf586-d51b-49e8-bea8-99af238b8a3e-config-data-custom\") pod \"barbican-worker-54956bdb55-m77pq\" (UID: \"159cf586-d51b-49e8-bea8-99af238b8a3e\") " pod="openstack/barbican-worker-54956bdb55-m77pq" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.633399 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/159cf586-d51b-49e8-bea8-99af238b8a3e-combined-ca-bundle\") pod \"barbican-worker-54956bdb55-m77pq\" (UID: \"159cf586-d51b-49e8-bea8-99af238b8a3e\") " pod="openstack/barbican-worker-54956bdb55-m77pq" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.633451 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62013c47-67bc-44c5-a250-390102661c05-combined-ca-bundle\") pod \"barbican-keystone-listener-5d487967c6-th49z\" (UID: \"62013c47-67bc-44c5-a250-390102661c05\") " pod="openstack/barbican-keystone-listener-5d487967c6-th49z" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.796546 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slh8s\" (UniqueName: \"kubernetes.io/projected/62013c47-67bc-44c5-a250-390102661c05-kube-api-access-slh8s\") pod \"barbican-keystone-listener-5d487967c6-th49z\" (UID: \"62013c47-67bc-44c5-a250-390102661c05\") " pod="openstack/barbican-keystone-listener-5d487967c6-th49z" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.796613 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62013c47-67bc-44c5-a250-390102661c05-config-data\") pod \"barbican-keystone-listener-5d487967c6-th49z\" (UID: \"62013c47-67bc-44c5-a250-390102661c05\") " pod="openstack/barbican-keystone-listener-5d487967c6-th49z" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.796632 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/62013c47-67bc-44c5-a250-390102661c05-config-data-custom\") pod \"barbican-keystone-listener-5d487967c6-th49z\" (UID: \"62013c47-67bc-44c5-a250-390102661c05\") " pod="openstack/barbican-keystone-listener-5d487967c6-th49z" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.796702 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62013c47-67bc-44c5-a250-390102661c05-combined-ca-bundle\") pod \"barbican-keystone-listener-5d487967c6-th49z\" (UID: \"62013c47-67bc-44c5-a250-390102661c05\") " pod="openstack/barbican-keystone-listener-5d487967c6-th49z" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.796783 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62013c47-67bc-44c5-a250-390102661c05-logs\") pod \"barbican-keystone-listener-5d487967c6-th49z\" (UID: \"62013c47-67bc-44c5-a250-390102661c05\") " pod="openstack/barbican-keystone-listener-5d487967c6-th49z" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.797283 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62013c47-67bc-44c5-a250-390102661c05-logs\") pod \"barbican-keystone-listener-5d487967c6-th49z\" (UID: \"62013c47-67bc-44c5-a250-390102661c05\") " pod="openstack/barbican-keystone-listener-5d487967c6-th49z" Feb 19 21:46:37 crc 
kubenswrapper[4795]: I0219 21:46:37.839231 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-db5c97f8f-vsr59"] Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.895581 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-9d49dd75f-xfpr5"] Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.897137 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9d49dd75f-xfpr5" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.903483 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5f5874b546-54bp8"] Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.904966 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5f5874b546-54bp8" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.906530 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.910300 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9d49dd75f-xfpr5"] Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.914931 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8hb5\" (UniqueName: \"kubernetes.io/projected/4b928260-ac65-479d-bd4b-f14b48d24ddb-kube-api-access-v8hb5\") pod \"keystone-6945f64f65-rnq2b\" (UID: \"4b928260-ac65-479d-bd4b-f14b48d24ddb\") " pod="openstack/keystone-6945f64f65-rnq2b" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.915609 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/159cf586-d51b-49e8-bea8-99af238b8a3e-combined-ca-bundle\") pod \"barbican-worker-54956bdb55-m77pq\" (UID: \"159cf586-d51b-49e8-bea8-99af238b8a3e\") " pod="openstack/barbican-worker-54956bdb55-m77pq" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 
21:46:37.916413 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/159cf586-d51b-49e8-bea8-99af238b8a3e-config-data\") pod \"barbican-worker-54956bdb55-m77pq\" (UID: \"159cf586-d51b-49e8-bea8-99af238b8a3e\") " pod="openstack/barbican-worker-54956bdb55-m77pq" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.916752 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/159cf586-d51b-49e8-bea8-99af238b8a3e-config-data-custom\") pod \"barbican-worker-54956bdb55-m77pq\" (UID: \"159cf586-d51b-49e8-bea8-99af238b8a3e\") " pod="openstack/barbican-worker-54956bdb55-m77pq" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.919046 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cp762\" (UniqueName: \"kubernetes.io/projected/159cf586-d51b-49e8-bea8-99af238b8a3e-kube-api-access-cp762\") pod \"barbican-worker-54956bdb55-m77pq\" (UID: \"159cf586-d51b-49e8-bea8-99af238b8a3e\") " pod="openstack/barbican-worker-54956bdb55-m77pq" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.919133 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62013c47-67bc-44c5-a250-390102661c05-combined-ca-bundle\") pod \"barbican-keystone-listener-5d487967c6-th49z\" (UID: \"62013c47-67bc-44c5-a250-390102661c05\") " pod="openstack/barbican-keystone-listener-5d487967c6-th49z" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.921257 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-7cd95cf589-2gw48"] Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.924776 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-7cd95cf589-2gw48" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.926719 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/62013c47-67bc-44c5-a250-390102661c05-config-data-custom\") pod \"barbican-keystone-listener-5d487967c6-th49z\" (UID: \"62013c47-67bc-44c5-a250-390102661c05\") " pod="openstack/barbican-keystone-listener-5d487967c6-th49z" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.929922 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62013c47-67bc-44c5-a250-390102661c05-config-data\") pod \"barbican-keystone-listener-5d487967c6-th49z\" (UID: \"62013c47-67bc-44c5-a250-390102661c05\") " pod="openstack/barbican-keystone-listener-5d487967c6-th49z" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.930838 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slh8s\" (UniqueName: \"kubernetes.io/projected/62013c47-67bc-44c5-a250-390102661c05-kube-api-access-slh8s\") pod \"barbican-keystone-listener-5d487967c6-th49z\" (UID: \"62013c47-67bc-44c5-a250-390102661c05\") " pod="openstack/barbican-keystone-listener-5d487967c6-th49z" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.932439 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-6b6d98fbd4-svzc8"] Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.934133 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-6b6d98fbd4-svzc8" Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.941344 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5f5874b546-54bp8"] Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.947264 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6b6d98fbd4-svzc8"] Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.955312 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7cd95cf589-2gw48"] Feb 19 21:46:37 crc kubenswrapper[4795]: I0219 21:46:37.999363 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b-dns-swift-storage-0\") pod \"dnsmasq-dns-9d49dd75f-xfpr5\" (UID: \"6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b\") " pod="openstack/dnsmasq-dns-9d49dd75f-xfpr5" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.000365 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b-ovsdbserver-nb\") pod \"dnsmasq-dns-9d49dd75f-xfpr5\" (UID: \"6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b\") " pod="openstack/dnsmasq-dns-9d49dd75f-xfpr5" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.000403 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b-dns-svc\") pod \"dnsmasq-dns-9d49dd75f-xfpr5\" (UID: \"6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b\") " pod="openstack/dnsmasq-dns-9d49dd75f-xfpr5" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.000435 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b-config\") pod \"dnsmasq-dns-9d49dd75f-xfpr5\" (UID: \"6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b\") " pod="openstack/dnsmasq-dns-9d49dd75f-xfpr5" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.000457 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b-ovsdbserver-sb\") pod \"dnsmasq-dns-9d49dd75f-xfpr5\" (UID: \"6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b\") " pod="openstack/dnsmasq-dns-9d49dd75f-xfpr5" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.000488 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49w8w\" (UniqueName: \"kubernetes.io/projected/6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b-kube-api-access-49w8w\") pod \"dnsmasq-dns-9d49dd75f-xfpr5\" (UID: \"6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b\") " pod="openstack/dnsmasq-dns-9d49dd75f-xfpr5" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.019142 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5fdbf9784-tjjsd"] Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.034405 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5fdbf9784-tjjsd"] Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.034550 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5fdbf9784-tjjsd" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.076285 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-54956bdb55-m77pq" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.082298 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-5d487967c6-th49z" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.103274 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6npn5\" (UniqueName: \"kubernetes.io/projected/250e9cae-06d9-44da-88af-239d15356a3c-kube-api-access-6npn5\") pod \"barbican-worker-7cd95cf589-2gw48\" (UID: \"250e9cae-06d9-44da-88af-239d15356a3c\") " pod="openstack/barbican-worker-7cd95cf589-2gw48" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.103334 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b-dns-svc\") pod \"dnsmasq-dns-9d49dd75f-xfpr5\" (UID: \"6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b\") " pod="openstack/dnsmasq-dns-9d49dd75f-xfpr5" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.103361 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b-config\") pod \"dnsmasq-dns-9d49dd75f-xfpr5\" (UID: \"6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b\") " pod="openstack/dnsmasq-dns-9d49dd75f-xfpr5" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.103380 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4532c069-4eb7-48ab-b575-b6a130e2b438-logs\") pod \"barbican-keystone-listener-6b6d98fbd4-svzc8\" (UID: \"4532c069-4eb7-48ab-b575-b6a130e2b438\") " pod="openstack/barbican-keystone-listener-6b6d98fbd4-svzc8" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.103415 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgvdx\" (UniqueName: \"kubernetes.io/projected/c1ee7c17-521f-45b3-bdb4-748939838e60-kube-api-access-xgvdx\") pod 
\"barbican-api-5f5874b546-54bp8\" (UID: \"c1ee7c17-521f-45b3-bdb4-748939838e60\") " pod="openstack/barbican-api-5f5874b546-54bp8" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.103436 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4hjh\" (UniqueName: \"kubernetes.io/projected/4532c069-4eb7-48ab-b575-b6a130e2b438-kube-api-access-v4hjh\") pod \"barbican-keystone-listener-6b6d98fbd4-svzc8\" (UID: \"4532c069-4eb7-48ab-b575-b6a130e2b438\") " pod="openstack/barbican-keystone-listener-6b6d98fbd4-svzc8" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.103456 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b-ovsdbserver-sb\") pod \"dnsmasq-dns-9d49dd75f-xfpr5\" (UID: \"6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b\") " pod="openstack/dnsmasq-dns-9d49dd75f-xfpr5" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.103488 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1ee7c17-521f-45b3-bdb4-748939838e60-config-data\") pod \"barbican-api-5f5874b546-54bp8\" (UID: \"c1ee7c17-521f-45b3-bdb4-748939838e60\") " pod="openstack/barbican-api-5f5874b546-54bp8" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.103504 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4krhj\" (UniqueName: \"kubernetes.io/projected/ff9dd6ee-d043-41af-bcfa-8385ae786038-kube-api-access-4krhj\") pod \"barbican-api-5fdbf9784-tjjsd\" (UID: \"ff9dd6ee-d043-41af-bcfa-8385ae786038\") " pod="openstack/barbican-api-5fdbf9784-tjjsd" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.103521 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/4532c069-4eb7-48ab-b575-b6a130e2b438-config-data-custom\") pod \"barbican-keystone-listener-6b6d98fbd4-svzc8\" (UID: \"4532c069-4eb7-48ab-b575-b6a130e2b438\") " pod="openstack/barbican-keystone-listener-6b6d98fbd4-svzc8" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.103543 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1ee7c17-521f-45b3-bdb4-748939838e60-combined-ca-bundle\") pod \"barbican-api-5f5874b546-54bp8\" (UID: \"c1ee7c17-521f-45b3-bdb4-748939838e60\") " pod="openstack/barbican-api-5f5874b546-54bp8" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.103585 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49w8w\" (UniqueName: \"kubernetes.io/projected/6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b-kube-api-access-49w8w\") pod \"dnsmasq-dns-9d49dd75f-xfpr5\" (UID: \"6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b\") " pod="openstack/dnsmasq-dns-9d49dd75f-xfpr5" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.103755 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/250e9cae-06d9-44da-88af-239d15356a3c-combined-ca-bundle\") pod \"barbican-worker-7cd95cf589-2gw48\" (UID: \"250e9cae-06d9-44da-88af-239d15356a3c\") " pod="openstack/barbican-worker-7cd95cf589-2gw48" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.103802 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/250e9cae-06d9-44da-88af-239d15356a3c-config-data\") pod \"barbican-worker-7cd95cf589-2gw48\" (UID: \"250e9cae-06d9-44da-88af-239d15356a3c\") " pod="openstack/barbican-worker-7cd95cf589-2gw48" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.103842 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c1ee7c17-521f-45b3-bdb4-748939838e60-config-data-custom\") pod \"barbican-api-5f5874b546-54bp8\" (UID: \"c1ee7c17-521f-45b3-bdb4-748939838e60\") " pod="openstack/barbican-api-5f5874b546-54bp8" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.103866 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4532c069-4eb7-48ab-b575-b6a130e2b438-config-data\") pod \"barbican-keystone-listener-6b6d98fbd4-svzc8\" (UID: \"4532c069-4eb7-48ab-b575-b6a130e2b438\") " pod="openstack/barbican-keystone-listener-6b6d98fbd4-svzc8" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.103897 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4532c069-4eb7-48ab-b575-b6a130e2b438-combined-ca-bundle\") pod \"barbican-keystone-listener-6b6d98fbd4-svzc8\" (UID: \"4532c069-4eb7-48ab-b575-b6a130e2b438\") " pod="openstack/barbican-keystone-listener-6b6d98fbd4-svzc8" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.103986 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/250e9cae-06d9-44da-88af-239d15356a3c-config-data-custom\") pod \"barbican-worker-7cd95cf589-2gw48\" (UID: \"250e9cae-06d9-44da-88af-239d15356a3c\") " pod="openstack/barbican-worker-7cd95cf589-2gw48" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.104090 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c1ee7c17-521f-45b3-bdb4-748939838e60-logs\") pod \"barbican-api-5f5874b546-54bp8\" (UID: \"c1ee7c17-521f-45b3-bdb4-748939838e60\") " 
pod="openstack/barbican-api-5f5874b546-54bp8" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.104141 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff9dd6ee-d043-41af-bcfa-8385ae786038-combined-ca-bundle\") pod \"barbican-api-5fdbf9784-tjjsd\" (UID: \"ff9dd6ee-d043-41af-bcfa-8385ae786038\") " pod="openstack/barbican-api-5fdbf9784-tjjsd" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.104243 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff9dd6ee-d043-41af-bcfa-8385ae786038-config-data\") pod \"barbican-api-5fdbf9784-tjjsd\" (UID: \"ff9dd6ee-d043-41af-bcfa-8385ae786038\") " pod="openstack/barbican-api-5fdbf9784-tjjsd" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.104331 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff9dd6ee-d043-41af-bcfa-8385ae786038-logs\") pod \"barbican-api-5fdbf9784-tjjsd\" (UID: \"ff9dd6ee-d043-41af-bcfa-8385ae786038\") " pod="openstack/barbican-api-5fdbf9784-tjjsd" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.104373 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b-dns-swift-storage-0\") pod \"dnsmasq-dns-9d49dd75f-xfpr5\" (UID: \"6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b\") " pod="openstack/dnsmasq-dns-9d49dd75f-xfpr5" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.104412 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/250e9cae-06d9-44da-88af-239d15356a3c-logs\") pod \"barbican-worker-7cd95cf589-2gw48\" (UID: \"250e9cae-06d9-44da-88af-239d15356a3c\") " 
pod="openstack/barbican-worker-7cd95cf589-2gw48" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.104429 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ff9dd6ee-d043-41af-bcfa-8385ae786038-config-data-custom\") pod \"barbican-api-5fdbf9784-tjjsd\" (UID: \"ff9dd6ee-d043-41af-bcfa-8385ae786038\") " pod="openstack/barbican-api-5fdbf9784-tjjsd" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.104479 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b-ovsdbserver-nb\") pod \"dnsmasq-dns-9d49dd75f-xfpr5\" (UID: \"6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b\") " pod="openstack/dnsmasq-dns-9d49dd75f-xfpr5" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.104486 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b-dns-svc\") pod \"dnsmasq-dns-9d49dd75f-xfpr5\" (UID: \"6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b\") " pod="openstack/dnsmasq-dns-9d49dd75f-xfpr5" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.104646 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b-config\") pod \"dnsmasq-dns-9d49dd75f-xfpr5\" (UID: \"6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b\") " pod="openstack/dnsmasq-dns-9d49dd75f-xfpr5" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.105294 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b-ovsdbserver-nb\") pod \"dnsmasq-dns-9d49dd75f-xfpr5\" (UID: \"6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b\") " pod="openstack/dnsmasq-dns-9d49dd75f-xfpr5" Feb 19 21:46:38 crc kubenswrapper[4795]: 
I0219 21:46:38.105534 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b-dns-swift-storage-0\") pod \"dnsmasq-dns-9d49dd75f-xfpr5\" (UID: \"6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b\") " pod="openstack/dnsmasq-dns-9d49dd75f-xfpr5" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.119014 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b-ovsdbserver-sb\") pod \"dnsmasq-dns-9d49dd75f-xfpr5\" (UID: \"6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b\") " pod="openstack/dnsmasq-dns-9d49dd75f-xfpr5" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.137843 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49w8w\" (UniqueName: \"kubernetes.io/projected/6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b-kube-api-access-49w8w\") pod \"dnsmasq-dns-9d49dd75f-xfpr5\" (UID: \"6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b\") " pod="openstack/dnsmasq-dns-9d49dd75f-xfpr5" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.187609 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-6945f64f65-rnq2b" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.210474 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff9dd6ee-d043-41af-bcfa-8385ae786038-combined-ca-bundle\") pod \"barbican-api-5fdbf9784-tjjsd\" (UID: \"ff9dd6ee-d043-41af-bcfa-8385ae786038\") " pod="openstack/barbican-api-5fdbf9784-tjjsd" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.210752 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff9dd6ee-d043-41af-bcfa-8385ae786038-config-data\") pod \"barbican-api-5fdbf9784-tjjsd\" (UID: \"ff9dd6ee-d043-41af-bcfa-8385ae786038\") " pod="openstack/barbican-api-5fdbf9784-tjjsd" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.210789 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff9dd6ee-d043-41af-bcfa-8385ae786038-logs\") pod \"barbican-api-5fdbf9784-tjjsd\" (UID: \"ff9dd6ee-d043-41af-bcfa-8385ae786038\") " pod="openstack/barbican-api-5fdbf9784-tjjsd" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.210841 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/250e9cae-06d9-44da-88af-239d15356a3c-logs\") pod \"barbican-worker-7cd95cf589-2gw48\" (UID: \"250e9cae-06d9-44da-88af-239d15356a3c\") " pod="openstack/barbican-worker-7cd95cf589-2gw48" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.210863 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ff9dd6ee-d043-41af-bcfa-8385ae786038-config-data-custom\") pod \"barbican-api-5fdbf9784-tjjsd\" (UID: \"ff9dd6ee-d043-41af-bcfa-8385ae786038\") " pod="openstack/barbican-api-5fdbf9784-tjjsd" Feb 19 21:46:38 
crc kubenswrapper[4795]: I0219 21:46:38.210923 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6npn5\" (UniqueName: \"kubernetes.io/projected/250e9cae-06d9-44da-88af-239d15356a3c-kube-api-access-6npn5\") pod \"barbican-worker-7cd95cf589-2gw48\" (UID: \"250e9cae-06d9-44da-88af-239d15356a3c\") " pod="openstack/barbican-worker-7cd95cf589-2gw48" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.210992 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4532c069-4eb7-48ab-b575-b6a130e2b438-logs\") pod \"barbican-keystone-listener-6b6d98fbd4-svzc8\" (UID: \"4532c069-4eb7-48ab-b575-b6a130e2b438\") " pod="openstack/barbican-keystone-listener-6b6d98fbd4-svzc8" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.211022 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgvdx\" (UniqueName: \"kubernetes.io/projected/c1ee7c17-521f-45b3-bdb4-748939838e60-kube-api-access-xgvdx\") pod \"barbican-api-5f5874b546-54bp8\" (UID: \"c1ee7c17-521f-45b3-bdb4-748939838e60\") " pod="openstack/barbican-api-5f5874b546-54bp8" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.211054 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4hjh\" (UniqueName: \"kubernetes.io/projected/4532c069-4eb7-48ab-b575-b6a130e2b438-kube-api-access-v4hjh\") pod \"barbican-keystone-listener-6b6d98fbd4-svzc8\" (UID: \"4532c069-4eb7-48ab-b575-b6a130e2b438\") " pod="openstack/barbican-keystone-listener-6b6d98fbd4-svzc8" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.212346 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff9dd6ee-d043-41af-bcfa-8385ae786038-logs\") pod \"barbican-api-5fdbf9784-tjjsd\" (UID: \"ff9dd6ee-d043-41af-bcfa-8385ae786038\") " pod="openstack/barbican-api-5fdbf9784-tjjsd" Feb 19 
21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.212759 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1ee7c17-521f-45b3-bdb4-748939838e60-config-data\") pod \"barbican-api-5f5874b546-54bp8\" (UID: \"c1ee7c17-521f-45b3-bdb4-748939838e60\") " pod="openstack/barbican-api-5f5874b546-54bp8" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.212793 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4krhj\" (UniqueName: \"kubernetes.io/projected/ff9dd6ee-d043-41af-bcfa-8385ae786038-kube-api-access-4krhj\") pod \"barbican-api-5fdbf9784-tjjsd\" (UID: \"ff9dd6ee-d043-41af-bcfa-8385ae786038\") " pod="openstack/barbican-api-5fdbf9784-tjjsd" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.212823 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4532c069-4eb7-48ab-b575-b6a130e2b438-config-data-custom\") pod \"barbican-keystone-listener-6b6d98fbd4-svzc8\" (UID: \"4532c069-4eb7-48ab-b575-b6a130e2b438\") " pod="openstack/barbican-keystone-listener-6b6d98fbd4-svzc8" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.212857 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1ee7c17-521f-45b3-bdb4-748939838e60-combined-ca-bundle\") pod \"barbican-api-5f5874b546-54bp8\" (UID: \"c1ee7c17-521f-45b3-bdb4-748939838e60\") " pod="openstack/barbican-api-5f5874b546-54bp8" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.212963 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/250e9cae-06d9-44da-88af-239d15356a3c-combined-ca-bundle\") pod \"barbican-worker-7cd95cf589-2gw48\" (UID: \"250e9cae-06d9-44da-88af-239d15356a3c\") " pod="openstack/barbican-worker-7cd95cf589-2gw48" 
Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.212996 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/250e9cae-06d9-44da-88af-239d15356a3c-config-data\") pod \"barbican-worker-7cd95cf589-2gw48\" (UID: \"250e9cae-06d9-44da-88af-239d15356a3c\") " pod="openstack/barbican-worker-7cd95cf589-2gw48" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.213030 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c1ee7c17-521f-45b3-bdb4-748939838e60-config-data-custom\") pod \"barbican-api-5f5874b546-54bp8\" (UID: \"c1ee7c17-521f-45b3-bdb4-748939838e60\") " pod="openstack/barbican-api-5f5874b546-54bp8" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.213065 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4532c069-4eb7-48ab-b575-b6a130e2b438-config-data\") pod \"barbican-keystone-listener-6b6d98fbd4-svzc8\" (UID: \"4532c069-4eb7-48ab-b575-b6a130e2b438\") " pod="openstack/barbican-keystone-listener-6b6d98fbd4-svzc8" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.213088 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4532c069-4eb7-48ab-b575-b6a130e2b438-combined-ca-bundle\") pod \"barbican-keystone-listener-6b6d98fbd4-svzc8\" (UID: \"4532c069-4eb7-48ab-b575-b6a130e2b438\") " pod="openstack/barbican-keystone-listener-6b6d98fbd4-svzc8" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.213338 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/250e9cae-06d9-44da-88af-239d15356a3c-config-data-custom\") pod \"barbican-worker-7cd95cf589-2gw48\" (UID: \"250e9cae-06d9-44da-88af-239d15356a3c\") " 
pod="openstack/barbican-worker-7cd95cf589-2gw48" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.213344 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/250e9cae-06d9-44da-88af-239d15356a3c-logs\") pod \"barbican-worker-7cd95cf589-2gw48\" (UID: \"250e9cae-06d9-44da-88af-239d15356a3c\") " pod="openstack/barbican-worker-7cd95cf589-2gw48" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.213389 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c1ee7c17-521f-45b3-bdb4-748939838e60-logs\") pod \"barbican-api-5f5874b546-54bp8\" (UID: \"c1ee7c17-521f-45b3-bdb4-748939838e60\") " pod="openstack/barbican-api-5f5874b546-54bp8" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.213950 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c1ee7c17-521f-45b3-bdb4-748939838e60-logs\") pod \"barbican-api-5f5874b546-54bp8\" (UID: \"c1ee7c17-521f-45b3-bdb4-748939838e60\") " pod="openstack/barbican-api-5f5874b546-54bp8" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.214315 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4532c069-4eb7-48ab-b575-b6a130e2b438-logs\") pod \"barbican-keystone-listener-6b6d98fbd4-svzc8\" (UID: \"4532c069-4eb7-48ab-b575-b6a130e2b438\") " pod="openstack/barbican-keystone-listener-6b6d98fbd4-svzc8" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.215235 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-76bfdcc9c4-d56mx" event={"ID":"bd5855e1-cadb-4170-8339-5f10945c6ce9","Type":"ContainerStarted","Data":"07e0d10967942d332970ea7b450b84e11544b898580db0b6d739db3ac549d6ec"} Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.220300 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff9dd6ee-d043-41af-bcfa-8385ae786038-combined-ca-bundle\") pod \"barbican-api-5fdbf9784-tjjsd\" (UID: \"ff9dd6ee-d043-41af-bcfa-8385ae786038\") " pod="openstack/barbican-api-5fdbf9784-tjjsd" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.223551 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ff9dd6ee-d043-41af-bcfa-8385ae786038-config-data-custom\") pod \"barbican-api-5fdbf9784-tjjsd\" (UID: \"ff9dd6ee-d043-41af-bcfa-8385ae786038\") " pod="openstack/barbican-api-5fdbf9784-tjjsd" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.224323 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4532c069-4eb7-48ab-b575-b6a130e2b438-config-data-custom\") pod \"barbican-keystone-listener-6b6d98fbd4-svzc8\" (UID: \"4532c069-4eb7-48ab-b575-b6a130e2b438\") " pod="openstack/barbican-keystone-listener-6b6d98fbd4-svzc8" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.225641 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/250e9cae-06d9-44da-88af-239d15356a3c-config-data\") pod \"barbican-worker-7cd95cf589-2gw48\" (UID: \"250e9cae-06d9-44da-88af-239d15356a3c\") " pod="openstack/barbican-worker-7cd95cf589-2gw48" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.226647 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/250e9cae-06d9-44da-88af-239d15356a3c-combined-ca-bundle\") pod \"barbican-worker-7cd95cf589-2gw48\" (UID: \"250e9cae-06d9-44da-88af-239d15356a3c\") " pod="openstack/barbican-worker-7cd95cf589-2gw48" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.228326 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/250e9cae-06d9-44da-88af-239d15356a3c-config-data-custom\") pod \"barbican-worker-7cd95cf589-2gw48\" (UID: \"250e9cae-06d9-44da-88af-239d15356a3c\") " pod="openstack/barbican-worker-7cd95cf589-2gw48" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.229895 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1ee7c17-521f-45b3-bdb4-748939838e60-combined-ca-bundle\") pod \"barbican-api-5f5874b546-54bp8\" (UID: \"c1ee7c17-521f-45b3-bdb4-748939838e60\") " pod="openstack/barbican-api-5f5874b546-54bp8" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.230215 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c1ee7c17-521f-45b3-bdb4-748939838e60-config-data-custom\") pod \"barbican-api-5f5874b546-54bp8\" (UID: \"c1ee7c17-521f-45b3-bdb4-748939838e60\") " pod="openstack/barbican-api-5f5874b546-54bp8" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.236636 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4hjh\" (UniqueName: \"kubernetes.io/projected/4532c069-4eb7-48ab-b575-b6a130e2b438-kube-api-access-v4hjh\") pod \"barbican-keystone-listener-6b6d98fbd4-svzc8\" (UID: \"4532c069-4eb7-48ab-b575-b6a130e2b438\") " pod="openstack/barbican-keystone-listener-6b6d98fbd4-svzc8" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.238263 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff9dd6ee-d043-41af-bcfa-8385ae786038-config-data\") pod \"barbican-api-5fdbf9784-tjjsd\" (UID: \"ff9dd6ee-d043-41af-bcfa-8385ae786038\") " pod="openstack/barbican-api-5fdbf9784-tjjsd" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.238442 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/4532c069-4eb7-48ab-b575-b6a130e2b438-config-data\") pod \"barbican-keystone-listener-6b6d98fbd4-svzc8\" (UID: \"4532c069-4eb7-48ab-b575-b6a130e2b438\") " pod="openstack/barbican-keystone-listener-6b6d98fbd4-svzc8" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.239267 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4532c069-4eb7-48ab-b575-b6a130e2b438-combined-ca-bundle\") pod \"barbican-keystone-listener-6b6d98fbd4-svzc8\" (UID: \"4532c069-4eb7-48ab-b575-b6a130e2b438\") " pod="openstack/barbican-keystone-listener-6b6d98fbd4-svzc8" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.245787 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6npn5\" (UniqueName: \"kubernetes.io/projected/250e9cae-06d9-44da-88af-239d15356a3c-kube-api-access-6npn5\") pod \"barbican-worker-7cd95cf589-2gw48\" (UID: \"250e9cae-06d9-44da-88af-239d15356a3c\") " pod="openstack/barbican-worker-7cd95cf589-2gw48" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.246103 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-db5c97f8f-vsr59" event={"ID":"1b5d1bab-ed1a-4a5e-a194-ad6b25d16210","Type":"ContainerStarted","Data":"67ae128465c4be3388435d97fad254a20c1f3ba339605e04ac27d63cac4910f3"} Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.246560 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-db5c97f8f-vsr59" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.247667 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-db5c97f8f-vsr59" podUID="1b5d1bab-ed1a-4a5e-a194-ad6b25d16210" containerName="dnsmasq-dns" containerID="cri-o://67ae128465c4be3388435d97fad254a20c1f3ba339605e04ac27d63cac4910f3" gracePeriod=10 Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.251028 4795 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-xgvdx\" (UniqueName: \"kubernetes.io/projected/c1ee7c17-521f-45b3-bdb4-748939838e60-kube-api-access-xgvdx\") pod \"barbican-api-5f5874b546-54bp8\" (UID: \"c1ee7c17-521f-45b3-bdb4-748939838e60\") " pod="openstack/barbican-api-5f5874b546-54bp8" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.252905 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4krhj\" (UniqueName: \"kubernetes.io/projected/ff9dd6ee-d043-41af-bcfa-8385ae786038-kube-api-access-4krhj\") pod \"barbican-api-5fdbf9784-tjjsd\" (UID: \"ff9dd6ee-d043-41af-bcfa-8385ae786038\") " pod="openstack/barbican-api-5fdbf9784-tjjsd" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.273229 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1ee7c17-521f-45b3-bdb4-748939838e60-config-data\") pod \"barbican-api-5f5874b546-54bp8\" (UID: \"c1ee7c17-521f-45b3-bdb4-748939838e60\") " pod="openstack/barbican-api-5f5874b546-54bp8" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.301939 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-db5c97f8f-vsr59" podStartSLOduration=3.30190427 podStartE2EDuration="3.30190427s" podCreationTimestamp="2026-02-19 21:46:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:46:38.26998949 +0000 UTC m=+1109.462507354" watchObservedRunningTime="2026-02-19 21:46:38.30190427 +0000 UTC m=+1109.494422134" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.425368 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9d49dd75f-xfpr5" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.434245 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5f5874b546-54bp8" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.457104 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7cd95cf589-2gw48" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.485206 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6b6d98fbd4-svzc8" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.514763 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5fdbf9784-tjjsd" Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.653695 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-54956bdb55-m77pq"] Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.746768 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5d487967c6-th49z"] Feb 19 21:46:38 crc kubenswrapper[4795]: I0219 21:46:38.935777 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6945f64f65-rnq2b"] Feb 19 21:46:38 crc kubenswrapper[4795]: W0219 21:46:38.955195 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b928260_ac65_479d_bd4b_f14b48d24ddb.slice/crio-bdfd7065946391c6d353cb9168c7226ec5dc670a459303e629e9266fcea79077 WatchSource:0}: Error finding container bdfd7065946391c6d353cb9168c7226ec5dc670a459303e629e9266fcea79077: Status 404 returned error can't find the container with id bdfd7065946391c6d353cb9168c7226ec5dc670a459303e629e9266fcea79077 Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.018082 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-db5c97f8f-vsr59" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.157929 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1b5d1bab-ed1a-4a5e-a194-ad6b25d16210-ovsdbserver-nb\") pod \"1b5d1bab-ed1a-4a5e-a194-ad6b25d16210\" (UID: \"1b5d1bab-ed1a-4a5e-a194-ad6b25d16210\") " Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.157972 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1b5d1bab-ed1a-4a5e-a194-ad6b25d16210-ovsdbserver-sb\") pod \"1b5d1bab-ed1a-4a5e-a194-ad6b25d16210\" (UID: \"1b5d1bab-ed1a-4a5e-a194-ad6b25d16210\") " Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.158017 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b5d1bab-ed1a-4a5e-a194-ad6b25d16210-config\") pod \"1b5d1bab-ed1a-4a5e-a194-ad6b25d16210\" (UID: \"1b5d1bab-ed1a-4a5e-a194-ad6b25d16210\") " Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.158077 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1b5d1bab-ed1a-4a5e-a194-ad6b25d16210-dns-swift-storage-0\") pod \"1b5d1bab-ed1a-4a5e-a194-ad6b25d16210\" (UID: \"1b5d1bab-ed1a-4a5e-a194-ad6b25d16210\") " Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.158134 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b5d1bab-ed1a-4a5e-a194-ad6b25d16210-dns-svc\") pod \"1b5d1bab-ed1a-4a5e-a194-ad6b25d16210\" (UID: \"1b5d1bab-ed1a-4a5e-a194-ad6b25d16210\") " Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.158231 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9rcql\" 
(UniqueName: \"kubernetes.io/projected/1b5d1bab-ed1a-4a5e-a194-ad6b25d16210-kube-api-access-9rcql\") pod \"1b5d1bab-ed1a-4a5e-a194-ad6b25d16210\" (UID: \"1b5d1bab-ed1a-4a5e-a194-ad6b25d16210\") " Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.165329 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b5d1bab-ed1a-4a5e-a194-ad6b25d16210-kube-api-access-9rcql" (OuterVolumeSpecName: "kube-api-access-9rcql") pod "1b5d1bab-ed1a-4a5e-a194-ad6b25d16210" (UID: "1b5d1bab-ed1a-4a5e-a194-ad6b25d16210"). InnerVolumeSpecName "kube-api-access-9rcql". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.210829 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9d49dd75f-xfpr5"] Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.231904 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5f5874b546-54bp8"] Feb 19 21:46:39 crc kubenswrapper[4795]: W0219 21:46:39.246076 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ae0e35c_6ee4_4e25_a76d_7033c2a3f09b.slice/crio-e42790fe679a04fe8e1a343d997c7b72c08e413899c7c4625b70ee788a6866db WatchSource:0}: Error finding container e42790fe679a04fe8e1a343d997c7b72c08e413899c7c4625b70ee788a6866db: Status 404 returned error can't find the container with id e42790fe679a04fe8e1a343d997c7b72c08e413899c7c4625b70ee788a6866db Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.303498 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9rcql\" (UniqueName: \"kubernetes.io/projected/1b5d1bab-ed1a-4a5e-a194-ad6b25d16210-kube-api-access-9rcql\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.316344 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/1b5d1bab-ed1a-4a5e-a194-ad6b25d16210-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1b5d1bab-ed1a-4a5e-a194-ad6b25d16210" (UID: "1b5d1bab-ed1a-4a5e-a194-ad6b25d16210"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.329856 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b5d1bab-ed1a-4a5e-a194-ad6b25d16210-config" (OuterVolumeSpecName: "config") pod "1b5d1bab-ed1a-4a5e-a194-ad6b25d16210" (UID: "1b5d1bab-ed1a-4a5e-a194-ad6b25d16210"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.354447 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5d487967c6-th49z" event={"ID":"62013c47-67bc-44c5-a250-390102661c05","Type":"ContainerStarted","Data":"03d4ba2c4c2e12794fc2992e51f7d78cc75a2714702ce6582a4cf8391b4968e8"} Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.366540 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b5d1bab-ed1a-4a5e-a194-ad6b25d16210-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1b5d1bab-ed1a-4a5e-a194-ad6b25d16210" (UID: "1b5d1bab-ed1a-4a5e-a194-ad6b25d16210"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.368871 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b5d1bab-ed1a-4a5e-a194-ad6b25d16210-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1b5d1bab-ed1a-4a5e-a194-ad6b25d16210" (UID: "1b5d1bab-ed1a-4a5e-a194-ad6b25d16210"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.370594 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-76bfdcc9c4-d56mx" event={"ID":"bd5855e1-cadb-4170-8339-5f10945c6ce9","Type":"ContainerStarted","Data":"79eaa173c80318b05a89d976dd736c832c6c7ad22a55bf02162ef029353efbf8"} Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.370823 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-76bfdcc9c4-d56mx" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.370852 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-76bfdcc9c4-d56mx" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.373042 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9d49dd75f-xfpr5" event={"ID":"6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b","Type":"ContainerStarted","Data":"e42790fe679a04fe8e1a343d997c7b72c08e413899c7c4625b70ee788a6866db"} Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.375754 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5fdbf9784-tjjsd"] Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.384633 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b5d1bab-ed1a-4a5e-a194-ad6b25d16210-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1b5d1bab-ed1a-4a5e-a194-ad6b25d16210" (UID: "1b5d1bab-ed1a-4a5e-a194-ad6b25d16210"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.398011 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-54956bdb55-m77pq" event={"ID":"159cf586-d51b-49e8-bea8-99af238b8a3e","Type":"ContainerStarted","Data":"4e28f3e465acd4c6f672fce5b1b114fcf10939feb9816a3820f5b6f281f46c1c"} Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.407142 4795 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1b5d1bab-ed1a-4a5e-a194-ad6b25d16210-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.407189 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b5d1bab-ed1a-4a5e-a194-ad6b25d16210-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.407198 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1b5d1bab-ed1a-4a5e-a194-ad6b25d16210-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.407207 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1b5d1bab-ed1a-4a5e-a194-ad6b25d16210-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.407216 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b5d1bab-ed1a-4a5e-a194-ad6b25d16210-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.410392 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7cd95cf589-2gw48"] Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.411245 4795 generic.go:334] "Generic (PLEG): container finished" 
podID="1b5d1bab-ed1a-4a5e-a194-ad6b25d16210" containerID="67ae128465c4be3388435d97fad254a20c1f3ba339605e04ac27d63cac4910f3" exitCode=0 Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.411353 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-db5c97f8f-vsr59" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.411360 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-db5c97f8f-vsr59" event={"ID":"1b5d1bab-ed1a-4a5e-a194-ad6b25d16210","Type":"ContainerDied","Data":"67ae128465c4be3388435d97fad254a20c1f3ba339605e04ac27d63cac4910f3"} Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.411397 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-db5c97f8f-vsr59" event={"ID":"1b5d1bab-ed1a-4a5e-a194-ad6b25d16210","Type":"ContainerDied","Data":"d1d8cd14d0dca706ef7e60716badbe8723a3cc32c8f0ce72a5ad49bcb63f1b70"} Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.411416 4795 scope.go:117] "RemoveContainer" containerID="67ae128465c4be3388435d97fad254a20c1f3ba339605e04ac27d63cac4910f3" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.420428 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-76bfdcc9c4-d56mx" podStartSLOduration=3.420403535 podStartE2EDuration="3.420403535s" podCreationTimestamp="2026-02-19 21:46:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:46:39.391466859 +0000 UTC m=+1110.583984733" watchObservedRunningTime="2026-02-19 21:46:39.420403535 +0000 UTC m=+1110.612921399" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.422887 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6945f64f65-rnq2b" 
event={"ID":"4b928260-ac65-479d-bd4b-f14b48d24ddb","Type":"ContainerStarted","Data":"bdfd7065946391c6d353cb9168c7226ec5dc670a459303e629e9266fcea79077"} Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.448156 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6b6d98fbd4-svzc8"] Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.478341 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-db5c97f8f-vsr59"] Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.487983 4795 scope.go:117] "RemoveContainer" containerID="c0639f1ced9910964acb68b012da7b3244f83a7c9a5d9a111a018fafc99f2185" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.492855 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-db5c97f8f-vsr59"] Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.535457 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b5d1bab-ed1a-4a5e-a194-ad6b25d16210" path="/var/lib/kubelet/pods/1b5d1bab-ed1a-4a5e-a194-ad6b25d16210/volumes" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.536031 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-576c65f985-r97z7"] Feb 19 21:46:39 crc kubenswrapper[4795]: E0219 21:46:39.536443 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b5d1bab-ed1a-4a5e-a194-ad6b25d16210" containerName="dnsmasq-dns" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.536459 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b5d1bab-ed1a-4a5e-a194-ad6b25d16210" containerName="dnsmasq-dns" Feb 19 21:46:39 crc kubenswrapper[4795]: E0219 21:46:39.536512 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b5d1bab-ed1a-4a5e-a194-ad6b25d16210" containerName="init" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.536521 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b5d1bab-ed1a-4a5e-a194-ad6b25d16210" 
containerName="init" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.536786 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b5d1bab-ed1a-4a5e-a194-ad6b25d16210" containerName="dnsmasq-dns" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.537769 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-576c65f985-r97z7" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.551042 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.552508 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-576c65f985-r97z7"] Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.551136 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.570347 4795 scope.go:117] "RemoveContainer" containerID="67ae128465c4be3388435d97fad254a20c1f3ba339605e04ac27d63cac4910f3" Feb 19 21:46:39 crc kubenswrapper[4795]: E0219 21:46:39.571365 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67ae128465c4be3388435d97fad254a20c1f3ba339605e04ac27d63cac4910f3\": container with ID starting with 67ae128465c4be3388435d97fad254a20c1f3ba339605e04ac27d63cac4910f3 not found: ID does not exist" containerID="67ae128465c4be3388435d97fad254a20c1f3ba339605e04ac27d63cac4910f3" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.571470 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67ae128465c4be3388435d97fad254a20c1f3ba339605e04ac27d63cac4910f3"} err="failed to get container status \"67ae128465c4be3388435d97fad254a20c1f3ba339605e04ac27d63cac4910f3\": rpc error: code = NotFound desc = could not find container 
\"67ae128465c4be3388435d97fad254a20c1f3ba339605e04ac27d63cac4910f3\": container with ID starting with 67ae128465c4be3388435d97fad254a20c1f3ba339605e04ac27d63cac4910f3 not found: ID does not exist" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.571558 4795 scope.go:117] "RemoveContainer" containerID="c0639f1ced9910964acb68b012da7b3244f83a7c9a5d9a111a018fafc99f2185" Feb 19 21:46:39 crc kubenswrapper[4795]: E0219 21:46:39.575383 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0639f1ced9910964acb68b012da7b3244f83a7c9a5d9a111a018fafc99f2185\": container with ID starting with c0639f1ced9910964acb68b012da7b3244f83a7c9a5d9a111a018fafc99f2185 not found: ID does not exist" containerID="c0639f1ced9910964acb68b012da7b3244f83a7c9a5d9a111a018fafc99f2185" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.575428 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0639f1ced9910964acb68b012da7b3244f83a7c9a5d9a111a018fafc99f2185"} err="failed to get container status \"c0639f1ced9910964acb68b012da7b3244f83a7c9a5d9a111a018fafc99f2185\": rpc error: code = NotFound desc = could not find container \"c0639f1ced9910964acb68b012da7b3244f83a7c9a5d9a111a018fafc99f2185\": container with ID starting with c0639f1ced9910964acb68b012da7b3244f83a7c9a5d9a111a018fafc99f2185 not found: ID does not exist" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.655092 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6df95dfbd4-ftf6x"] Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.674650 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6df95dfbd4-ftf6x" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.675961 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6df95dfbd4-ftf6x"] Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.714261 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/30f2c894-7a7a-4e5a-a090-a28ab50c766a-ovndb-tls-certs\") pod \"neutron-576c65f985-r97z7\" (UID: \"30f2c894-7a7a-4e5a-a090-a28ab50c766a\") " pod="openstack/neutron-576c65f985-r97z7" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.714302 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/30f2c894-7a7a-4e5a-a090-a28ab50c766a-public-tls-certs\") pod \"neutron-576c65f985-r97z7\" (UID: \"30f2c894-7a7a-4e5a-a090-a28ab50c766a\") " pod="openstack/neutron-576c65f985-r97z7" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.714322 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/30f2c894-7a7a-4e5a-a090-a28ab50c766a-internal-tls-certs\") pod \"neutron-576c65f985-r97z7\" (UID: \"30f2c894-7a7a-4e5a-a090-a28ab50c766a\") " pod="openstack/neutron-576c65f985-r97z7" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.714348 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/30f2c894-7a7a-4e5a-a090-a28ab50c766a-httpd-config\") pod \"neutron-576c65f985-r97z7\" (UID: \"30f2c894-7a7a-4e5a-a090-a28ab50c766a\") " pod="openstack/neutron-576c65f985-r97z7" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.714376 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-dln52\" (UniqueName: \"kubernetes.io/projected/30f2c894-7a7a-4e5a-a090-a28ab50c766a-kube-api-access-dln52\") pod \"neutron-576c65f985-r97z7\" (UID: \"30f2c894-7a7a-4e5a-a090-a28ab50c766a\") " pod="openstack/neutron-576c65f985-r97z7" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.714414 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/30f2c894-7a7a-4e5a-a090-a28ab50c766a-config\") pod \"neutron-576c65f985-r97z7\" (UID: \"30f2c894-7a7a-4e5a-a090-a28ab50c766a\") " pod="openstack/neutron-576c65f985-r97z7" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.714462 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30f2c894-7a7a-4e5a-a090-a28ab50c766a-combined-ca-bundle\") pod \"neutron-576c65f985-r97z7\" (UID: \"30f2c894-7a7a-4e5a-a090-a28ab50c766a\") " pod="openstack/neutron-576c65f985-r97z7" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.816213 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-public-tls-certs\") pod \"placement-6df95dfbd4-ftf6x\" (UID: \"b01bcd5b-435a-4702-b0a4-8dfe8f553c23\") " pod="openstack/placement-6df95dfbd4-ftf6x" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.817562 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/30f2c894-7a7a-4e5a-a090-a28ab50c766a-config\") pod \"neutron-576c65f985-r97z7\" (UID: \"30f2c894-7a7a-4e5a-a090-a28ab50c766a\") " pod="openstack/neutron-576c65f985-r97z7" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.817609 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-internal-tls-certs\") pod \"placement-6df95dfbd4-ftf6x\" (UID: \"b01bcd5b-435a-4702-b0a4-8dfe8f553c23\") " pod="openstack/placement-6df95dfbd4-ftf6x" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.817651 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-config-data\") pod \"placement-6df95dfbd4-ftf6x\" (UID: \"b01bcd5b-435a-4702-b0a4-8dfe8f553c23\") " pod="openstack/placement-6df95dfbd4-ftf6x" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.817712 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30f2c894-7a7a-4e5a-a090-a28ab50c766a-combined-ca-bundle\") pod \"neutron-576c65f985-r97z7\" (UID: \"30f2c894-7a7a-4e5a-a090-a28ab50c766a\") " pod="openstack/neutron-576c65f985-r97z7" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.817756 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-combined-ca-bundle\") pod \"placement-6df95dfbd4-ftf6x\" (UID: \"b01bcd5b-435a-4702-b0a4-8dfe8f553c23\") " pod="openstack/placement-6df95dfbd4-ftf6x" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.817877 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-scripts\") pod \"placement-6df95dfbd4-ftf6x\" (UID: \"b01bcd5b-435a-4702-b0a4-8dfe8f553c23\") " pod="openstack/placement-6df95dfbd4-ftf6x" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.818135 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84vz2\" 
(UniqueName: \"kubernetes.io/projected/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-kube-api-access-84vz2\") pod \"placement-6df95dfbd4-ftf6x\" (UID: \"b01bcd5b-435a-4702-b0a4-8dfe8f553c23\") " pod="openstack/placement-6df95dfbd4-ftf6x" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.818224 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/30f2c894-7a7a-4e5a-a090-a28ab50c766a-ovndb-tls-certs\") pod \"neutron-576c65f985-r97z7\" (UID: \"30f2c894-7a7a-4e5a-a090-a28ab50c766a\") " pod="openstack/neutron-576c65f985-r97z7" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.818298 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-logs\") pod \"placement-6df95dfbd4-ftf6x\" (UID: \"b01bcd5b-435a-4702-b0a4-8dfe8f553c23\") " pod="openstack/placement-6df95dfbd4-ftf6x" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.818350 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/30f2c894-7a7a-4e5a-a090-a28ab50c766a-public-tls-certs\") pod \"neutron-576c65f985-r97z7\" (UID: \"30f2c894-7a7a-4e5a-a090-a28ab50c766a\") " pod="openstack/neutron-576c65f985-r97z7" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.818373 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/30f2c894-7a7a-4e5a-a090-a28ab50c766a-internal-tls-certs\") pod \"neutron-576c65f985-r97z7\" (UID: \"30f2c894-7a7a-4e5a-a090-a28ab50c766a\") " pod="openstack/neutron-576c65f985-r97z7" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.818404 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/30f2c894-7a7a-4e5a-a090-a28ab50c766a-httpd-config\") pod \"neutron-576c65f985-r97z7\" (UID: \"30f2c894-7a7a-4e5a-a090-a28ab50c766a\") " pod="openstack/neutron-576c65f985-r97z7" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.818437 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dln52\" (UniqueName: \"kubernetes.io/projected/30f2c894-7a7a-4e5a-a090-a28ab50c766a-kube-api-access-dln52\") pod \"neutron-576c65f985-r97z7\" (UID: \"30f2c894-7a7a-4e5a-a090-a28ab50c766a\") " pod="openstack/neutron-576c65f985-r97z7" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.838733 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/30f2c894-7a7a-4e5a-a090-a28ab50c766a-public-tls-certs\") pod \"neutron-576c65f985-r97z7\" (UID: \"30f2c894-7a7a-4e5a-a090-a28ab50c766a\") " pod="openstack/neutron-576c65f985-r97z7" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.838794 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/30f2c894-7a7a-4e5a-a090-a28ab50c766a-httpd-config\") pod \"neutron-576c65f985-r97z7\" (UID: \"30f2c894-7a7a-4e5a-a090-a28ab50c766a\") " pod="openstack/neutron-576c65f985-r97z7" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.838844 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/30f2c894-7a7a-4e5a-a090-a28ab50c766a-config\") pod \"neutron-576c65f985-r97z7\" (UID: \"30f2c894-7a7a-4e5a-a090-a28ab50c766a\") " pod="openstack/neutron-576c65f985-r97z7" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.839480 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/30f2c894-7a7a-4e5a-a090-a28ab50c766a-internal-tls-certs\") pod \"neutron-576c65f985-r97z7\" (UID: 
\"30f2c894-7a7a-4e5a-a090-a28ab50c766a\") " pod="openstack/neutron-576c65f985-r97z7" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.841846 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dln52\" (UniqueName: \"kubernetes.io/projected/30f2c894-7a7a-4e5a-a090-a28ab50c766a-kube-api-access-dln52\") pod \"neutron-576c65f985-r97z7\" (UID: \"30f2c894-7a7a-4e5a-a090-a28ab50c766a\") " pod="openstack/neutron-576c65f985-r97z7" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.848027 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30f2c894-7a7a-4e5a-a090-a28ab50c766a-combined-ca-bundle\") pod \"neutron-576c65f985-r97z7\" (UID: \"30f2c894-7a7a-4e5a-a090-a28ab50c766a\") " pod="openstack/neutron-576c65f985-r97z7" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.872910 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/30f2c894-7a7a-4e5a-a090-a28ab50c766a-ovndb-tls-certs\") pod \"neutron-576c65f985-r97z7\" (UID: \"30f2c894-7a7a-4e5a-a090-a28ab50c766a\") " pod="openstack/neutron-576c65f985-r97z7" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.920587 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-public-tls-certs\") pod \"placement-6df95dfbd4-ftf6x\" (UID: \"b01bcd5b-435a-4702-b0a4-8dfe8f553c23\") " pod="openstack/placement-6df95dfbd4-ftf6x" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.920659 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-internal-tls-certs\") pod \"placement-6df95dfbd4-ftf6x\" (UID: \"b01bcd5b-435a-4702-b0a4-8dfe8f553c23\") " pod="openstack/placement-6df95dfbd4-ftf6x" 
Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.920700 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-config-data\") pod \"placement-6df95dfbd4-ftf6x\" (UID: \"b01bcd5b-435a-4702-b0a4-8dfe8f553c23\") " pod="openstack/placement-6df95dfbd4-ftf6x" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.920748 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-combined-ca-bundle\") pod \"placement-6df95dfbd4-ftf6x\" (UID: \"b01bcd5b-435a-4702-b0a4-8dfe8f553c23\") " pod="openstack/placement-6df95dfbd4-ftf6x" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.920783 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-scripts\") pod \"placement-6df95dfbd4-ftf6x\" (UID: \"b01bcd5b-435a-4702-b0a4-8dfe8f553c23\") " pod="openstack/placement-6df95dfbd4-ftf6x" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.920820 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84vz2\" (UniqueName: \"kubernetes.io/projected/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-kube-api-access-84vz2\") pod \"placement-6df95dfbd4-ftf6x\" (UID: \"b01bcd5b-435a-4702-b0a4-8dfe8f553c23\") " pod="openstack/placement-6df95dfbd4-ftf6x" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.921298 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-logs\") pod \"placement-6df95dfbd4-ftf6x\" (UID: \"b01bcd5b-435a-4702-b0a4-8dfe8f553c23\") " pod="openstack/placement-6df95dfbd4-ftf6x" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.921857 4795 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-logs\") pod \"placement-6df95dfbd4-ftf6x\" (UID: \"b01bcd5b-435a-4702-b0a4-8dfe8f553c23\") " pod="openstack/placement-6df95dfbd4-ftf6x" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.926901 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-public-tls-certs\") pod \"placement-6df95dfbd4-ftf6x\" (UID: \"b01bcd5b-435a-4702-b0a4-8dfe8f553c23\") " pod="openstack/placement-6df95dfbd4-ftf6x" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.932465 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-scripts\") pod \"placement-6df95dfbd4-ftf6x\" (UID: \"b01bcd5b-435a-4702-b0a4-8dfe8f553c23\") " pod="openstack/placement-6df95dfbd4-ftf6x" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.933502 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-internal-tls-certs\") pod \"placement-6df95dfbd4-ftf6x\" (UID: \"b01bcd5b-435a-4702-b0a4-8dfe8f553c23\") " pod="openstack/placement-6df95dfbd4-ftf6x" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.940954 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-config-data\") pod \"placement-6df95dfbd4-ftf6x\" (UID: \"b01bcd5b-435a-4702-b0a4-8dfe8f553c23\") " pod="openstack/placement-6df95dfbd4-ftf6x" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.945851 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-combined-ca-bundle\") pod \"placement-6df95dfbd4-ftf6x\" (UID: \"b01bcd5b-435a-4702-b0a4-8dfe8f553c23\") " pod="openstack/placement-6df95dfbd4-ftf6x" Feb 19 21:46:39 crc kubenswrapper[4795]: I0219 21:46:39.947485 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84vz2\" (UniqueName: \"kubernetes.io/projected/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-kube-api-access-84vz2\") pod \"placement-6df95dfbd4-ftf6x\" (UID: \"b01bcd5b-435a-4702-b0a4-8dfe8f553c23\") " pod="openstack/placement-6df95dfbd4-ftf6x" Feb 19 21:46:40 crc kubenswrapper[4795]: I0219 21:46:40.021698 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-576c65f985-r97z7" Feb 19 21:46:40 crc kubenswrapper[4795]: I0219 21:46:40.058143 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6df95dfbd4-ftf6x" Feb 19 21:46:40 crc kubenswrapper[4795]: I0219 21:46:40.435081 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 19 21:46:40 crc kubenswrapper[4795]: I0219 21:46:40.435140 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 19 21:46:40 crc kubenswrapper[4795]: I0219 21:46:40.454844 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5f5874b546-54bp8" event={"ID":"c1ee7c17-521f-45b3-bdb4-748939838e60","Type":"ContainerStarted","Data":"d05922b241c5606cc7cb67f0611459164ffa20eef296f34ff91982e9383ca569"} Feb 19 21:46:40 crc kubenswrapper[4795]: I0219 21:46:40.455354 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5f5874b546-54bp8" event={"ID":"c1ee7c17-521f-45b3-bdb4-748939838e60","Type":"ContainerStarted","Data":"be89fa2e2a317aa99f43093ff5681ba4ce22ead52fd8adae9138024217c6894f"} Feb 19 21:46:40 crc 
kubenswrapper[4795]: I0219 21:46:40.458120 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6b6d98fbd4-svzc8" event={"ID":"4532c069-4eb7-48ab-b575-b6a130e2b438","Type":"ContainerStarted","Data":"c0b18fd78cd092c133f6dd779fd8c2b41870a6c99e45b8bcd625ff594cb4d9de"} Feb 19 21:46:40 crc kubenswrapper[4795]: I0219 21:46:40.460013 4795 generic.go:334] "Generic (PLEG): container finished" podID="6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b" containerID="81fbd576f753564d8cc96fbaca875356e50537cd709002f5a9b77dc1f35e7279" exitCode=0 Feb 19 21:46:40 crc kubenswrapper[4795]: I0219 21:46:40.460094 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9d49dd75f-xfpr5" event={"ID":"6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b","Type":"ContainerDied","Data":"81fbd576f753564d8cc96fbaca875356e50537cd709002f5a9b77dc1f35e7279"} Feb 19 21:46:40 crc kubenswrapper[4795]: I0219 21:46:40.461237 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7cd95cf589-2gw48" event={"ID":"250e9cae-06d9-44da-88af-239d15356a3c","Type":"ContainerStarted","Data":"26ff50c7b1851e9704bfa4221d66176820b3417a16cff032c2c82bc2945df7a8"} Feb 19 21:46:40 crc kubenswrapper[4795]: I0219 21:46:40.472875 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 19 21:46:40 crc kubenswrapper[4795]: I0219 21:46:40.473807 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 19 21:46:40 crc kubenswrapper[4795]: I0219 21:46:40.473857 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6945f64f65-rnq2b" event={"ID":"4b928260-ac65-479d-bd4b-f14b48d24ddb","Type":"ContainerStarted","Data":"77a5c881bfb4b3162733203d6d06eed351c1cde8f2967461288b978f94eeb5ba"} Feb 19 21:46:40 crc kubenswrapper[4795]: I0219 21:46:40.473894 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/keystone-6945f64f65-rnq2b" Feb 19 21:46:40 crc kubenswrapper[4795]: I0219 21:46:40.476212 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5fdbf9784-tjjsd" event={"ID":"ff9dd6ee-d043-41af-bcfa-8385ae786038","Type":"ContainerStarted","Data":"628cf011c1e3cfa652516057f286c880a7bb4ae1632cbe08579b65a9e2873451"} Feb 19 21:46:40 crc kubenswrapper[4795]: I0219 21:46:40.476256 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5fdbf9784-tjjsd" event={"ID":"ff9dd6ee-d043-41af-bcfa-8385ae786038","Type":"ContainerStarted","Data":"86ab35cfd9c80c07dee3068dbe14e2a6dadda5cbd9e0d550648a5054e671ff43"} Feb 19 21:46:40 crc kubenswrapper[4795]: I0219 21:46:40.488460 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 19 21:46:40 crc kubenswrapper[4795]: I0219 21:46:40.510243 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 19 21:46:40 crc kubenswrapper[4795]: I0219 21:46:40.519705 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-6945f64f65-rnq2b" podStartSLOduration=3.519685517 podStartE2EDuration="3.519685517s" podCreationTimestamp="2026-02-19 21:46:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:46:40.505105146 +0000 UTC m=+1111.697623020" watchObservedRunningTime="2026-02-19 21:46:40.519685517 +0000 UTC m=+1111.712203381" Feb 19 21:46:40 crc kubenswrapper[4795]: I0219 21:46:40.534956 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 19 21:46:40 crc kubenswrapper[4795]: I0219 21:46:40.563506 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 19 21:46:41 
crc kubenswrapper[4795]: I0219 21:46:41.486431 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 19 21:46:41 crc kubenswrapper[4795]: I0219 21:46:41.486471 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 19 21:46:41 crc kubenswrapper[4795]: I0219 21:46:41.486496 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 19 21:46:41 crc kubenswrapper[4795]: I0219 21:46:41.486780 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 19 21:46:42 crc kubenswrapper[4795]: I0219 21:46:42.176306 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5f5874b546-54bp8"] Feb 19 21:46:42 crc kubenswrapper[4795]: I0219 21:46:42.204922 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6f98bf9994-pr48x"] Feb 19 21:46:42 crc kubenswrapper[4795]: I0219 21:46:42.206309 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6f98bf9994-pr48x" Feb 19 21:46:42 crc kubenswrapper[4795]: I0219 21:46:42.208312 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Feb 19 21:46:42 crc kubenswrapper[4795]: I0219 21:46:42.212877 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Feb 19 21:46:42 crc kubenswrapper[4795]: I0219 21:46:42.234186 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6f98bf9994-pr48x"] Feb 19 21:46:42 crc kubenswrapper[4795]: I0219 21:46:42.366800 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6pv9\" (UniqueName: \"kubernetes.io/projected/e4a6a069-904a-4072-b98c-346f67f22def-kube-api-access-h6pv9\") pod \"barbican-api-6f98bf9994-pr48x\" (UID: \"e4a6a069-904a-4072-b98c-346f67f22def\") " pod="openstack/barbican-api-6f98bf9994-pr48x" Feb 19 21:46:42 crc kubenswrapper[4795]: I0219 21:46:42.367108 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4a6a069-904a-4072-b98c-346f67f22def-logs\") pod \"barbican-api-6f98bf9994-pr48x\" (UID: \"e4a6a069-904a-4072-b98c-346f67f22def\") " pod="openstack/barbican-api-6f98bf9994-pr48x" Feb 19 21:46:42 crc kubenswrapper[4795]: I0219 21:46:42.367226 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4a6a069-904a-4072-b98c-346f67f22def-config-data\") pod \"barbican-api-6f98bf9994-pr48x\" (UID: \"e4a6a069-904a-4072-b98c-346f67f22def\") " pod="openstack/barbican-api-6f98bf9994-pr48x" Feb 19 21:46:42 crc kubenswrapper[4795]: I0219 21:46:42.367316 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/e4a6a069-904a-4072-b98c-346f67f22def-config-data-custom\") pod \"barbican-api-6f98bf9994-pr48x\" (UID: \"e4a6a069-904a-4072-b98c-346f67f22def\") " pod="openstack/barbican-api-6f98bf9994-pr48x" Feb 19 21:46:42 crc kubenswrapper[4795]: I0219 21:46:42.367385 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4a6a069-904a-4072-b98c-346f67f22def-public-tls-certs\") pod \"barbican-api-6f98bf9994-pr48x\" (UID: \"e4a6a069-904a-4072-b98c-346f67f22def\") " pod="openstack/barbican-api-6f98bf9994-pr48x" Feb 19 21:46:42 crc kubenswrapper[4795]: I0219 21:46:42.367456 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4a6a069-904a-4072-b98c-346f67f22def-combined-ca-bundle\") pod \"barbican-api-6f98bf9994-pr48x\" (UID: \"e4a6a069-904a-4072-b98c-346f67f22def\") " pod="openstack/barbican-api-6f98bf9994-pr48x" Feb 19 21:46:42 crc kubenswrapper[4795]: I0219 21:46:42.367546 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4a6a069-904a-4072-b98c-346f67f22def-internal-tls-certs\") pod \"barbican-api-6f98bf9994-pr48x\" (UID: \"e4a6a069-904a-4072-b98c-346f67f22def\") " pod="openstack/barbican-api-6f98bf9994-pr48x" Feb 19 21:46:42 crc kubenswrapper[4795]: I0219 21:46:42.469264 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4a6a069-904a-4072-b98c-346f67f22def-internal-tls-certs\") pod \"barbican-api-6f98bf9994-pr48x\" (UID: \"e4a6a069-904a-4072-b98c-346f67f22def\") " pod="openstack/barbican-api-6f98bf9994-pr48x" Feb 19 21:46:42 crc kubenswrapper[4795]: I0219 21:46:42.469371 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-h6pv9\" (UniqueName: \"kubernetes.io/projected/e4a6a069-904a-4072-b98c-346f67f22def-kube-api-access-h6pv9\") pod \"barbican-api-6f98bf9994-pr48x\" (UID: \"e4a6a069-904a-4072-b98c-346f67f22def\") " pod="openstack/barbican-api-6f98bf9994-pr48x" Feb 19 21:46:42 crc kubenswrapper[4795]: I0219 21:46:42.469467 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4a6a069-904a-4072-b98c-346f67f22def-logs\") pod \"barbican-api-6f98bf9994-pr48x\" (UID: \"e4a6a069-904a-4072-b98c-346f67f22def\") " pod="openstack/barbican-api-6f98bf9994-pr48x" Feb 19 21:46:42 crc kubenswrapper[4795]: I0219 21:46:42.469496 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4a6a069-904a-4072-b98c-346f67f22def-config-data\") pod \"barbican-api-6f98bf9994-pr48x\" (UID: \"e4a6a069-904a-4072-b98c-346f67f22def\") " pod="openstack/barbican-api-6f98bf9994-pr48x" Feb 19 21:46:42 crc kubenswrapper[4795]: I0219 21:46:42.469525 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e4a6a069-904a-4072-b98c-346f67f22def-config-data-custom\") pod \"barbican-api-6f98bf9994-pr48x\" (UID: \"e4a6a069-904a-4072-b98c-346f67f22def\") " pod="openstack/barbican-api-6f98bf9994-pr48x" Feb 19 21:46:42 crc kubenswrapper[4795]: I0219 21:46:42.469544 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4a6a069-904a-4072-b98c-346f67f22def-public-tls-certs\") pod \"barbican-api-6f98bf9994-pr48x\" (UID: \"e4a6a069-904a-4072-b98c-346f67f22def\") " pod="openstack/barbican-api-6f98bf9994-pr48x" Feb 19 21:46:42 crc kubenswrapper[4795]: I0219 21:46:42.469571 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e4a6a069-904a-4072-b98c-346f67f22def-combined-ca-bundle\") pod \"barbican-api-6f98bf9994-pr48x\" (UID: \"e4a6a069-904a-4072-b98c-346f67f22def\") " pod="openstack/barbican-api-6f98bf9994-pr48x" Feb 19 21:46:42 crc kubenswrapper[4795]: I0219 21:46:42.470422 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4a6a069-904a-4072-b98c-346f67f22def-logs\") pod \"barbican-api-6f98bf9994-pr48x\" (UID: \"e4a6a069-904a-4072-b98c-346f67f22def\") " pod="openstack/barbican-api-6f98bf9994-pr48x" Feb 19 21:46:42 crc kubenswrapper[4795]: I0219 21:46:42.476369 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4a6a069-904a-4072-b98c-346f67f22def-public-tls-certs\") pod \"barbican-api-6f98bf9994-pr48x\" (UID: \"e4a6a069-904a-4072-b98c-346f67f22def\") " pod="openstack/barbican-api-6f98bf9994-pr48x" Feb 19 21:46:42 crc kubenswrapper[4795]: I0219 21:46:42.477056 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4a6a069-904a-4072-b98c-346f67f22def-combined-ca-bundle\") pod \"barbican-api-6f98bf9994-pr48x\" (UID: \"e4a6a069-904a-4072-b98c-346f67f22def\") " pod="openstack/barbican-api-6f98bf9994-pr48x" Feb 19 21:46:42 crc kubenswrapper[4795]: I0219 21:46:42.479305 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4a6a069-904a-4072-b98c-346f67f22def-internal-tls-certs\") pod \"barbican-api-6f98bf9994-pr48x\" (UID: \"e4a6a069-904a-4072-b98c-346f67f22def\") " pod="openstack/barbican-api-6f98bf9994-pr48x" Feb 19 21:46:42 crc kubenswrapper[4795]: I0219 21:46:42.482466 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e4a6a069-904a-4072-b98c-346f67f22def-config-data-custom\") pod 
\"barbican-api-6f98bf9994-pr48x\" (UID: \"e4a6a069-904a-4072-b98c-346f67f22def\") " pod="openstack/barbican-api-6f98bf9994-pr48x" Feb 19 21:46:42 crc kubenswrapper[4795]: I0219 21:46:42.483539 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4a6a069-904a-4072-b98c-346f67f22def-config-data\") pod \"barbican-api-6f98bf9994-pr48x\" (UID: \"e4a6a069-904a-4072-b98c-346f67f22def\") " pod="openstack/barbican-api-6f98bf9994-pr48x" Feb 19 21:46:42 crc kubenswrapper[4795]: I0219 21:46:42.490843 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6pv9\" (UniqueName: \"kubernetes.io/projected/e4a6a069-904a-4072-b98c-346f67f22def-kube-api-access-h6pv9\") pod \"barbican-api-6f98bf9994-pr48x\" (UID: \"e4a6a069-904a-4072-b98c-346f67f22def\") " pod="openstack/barbican-api-6f98bf9994-pr48x" Feb 19 21:46:42 crc kubenswrapper[4795]: I0219 21:46:42.525157 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6f98bf9994-pr48x" Feb 19 21:46:43 crc kubenswrapper[4795]: I0219 21:46:43.506579 4795 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 19 21:46:43 crc kubenswrapper[4795]: I0219 21:46:43.507074 4795 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 19 21:46:43 crc kubenswrapper[4795]: I0219 21:46:43.653086 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 19 21:46:43 crc kubenswrapper[4795]: I0219 21:46:43.660694 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 19 21:46:43 crc kubenswrapper[4795]: I0219 21:46:43.660825 4795 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 19 21:46:43 crc kubenswrapper[4795]: I0219 21:46:43.737129 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6df95dfbd4-ftf6x"] Feb 19 21:46:44 crc kubenswrapper[4795]: I0219 21:46:44.016873 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 19 21:46:44 crc kubenswrapper[4795]: I0219 21:46:44.033823 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6f98bf9994-pr48x"] Feb 19 21:46:44 crc kubenswrapper[4795]: I0219 21:46:44.123936 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 19 21:46:44 crc kubenswrapper[4795]: I0219 21:46:44.299678 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-576c65f985-r97z7"] Feb 19 21:46:44 crc kubenswrapper[4795]: I0219 21:46:44.598919 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7cd95cf589-2gw48" 
event={"ID":"250e9cae-06d9-44da-88af-239d15356a3c","Type":"ContainerStarted","Data":"0d1ad96e830846d9034790e753be1811679f1bbf15af5f12e70081c7ae374cfd"} Feb 19 21:46:44 crc kubenswrapper[4795]: I0219 21:46:44.600673 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5fdbf9784-tjjsd" event={"ID":"ff9dd6ee-d043-41af-bcfa-8385ae786038","Type":"ContainerStarted","Data":"1e747e5d80d698d410aec3a2712fd1002c46a3b555382ce77d51ba1659f59721"} Feb 19 21:46:44 crc kubenswrapper[4795]: I0219 21:46:44.602424 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5fdbf9784-tjjsd" Feb 19 21:46:44 crc kubenswrapper[4795]: I0219 21:46:44.602443 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5fdbf9784-tjjsd" Feb 19 21:46:44 crc kubenswrapper[4795]: I0219 21:46:44.607471 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5f5874b546-54bp8" event={"ID":"c1ee7c17-521f-45b3-bdb4-748939838e60","Type":"ContainerStarted","Data":"bb53ff7aacbd051823727a3e5e9c0df1aa87aded1e6321d5bbd77a8a437a808e"} Feb 19 21:46:44 crc kubenswrapper[4795]: I0219 21:46:44.607693 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5f5874b546-54bp8" podUID="c1ee7c17-521f-45b3-bdb4-748939838e60" containerName="barbican-api-log" containerID="cri-o://d05922b241c5606cc7cb67f0611459164ffa20eef296f34ff91982e9383ca569" gracePeriod=30 Feb 19 21:46:44 crc kubenswrapper[4795]: I0219 21:46:44.607804 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5f5874b546-54bp8" podUID="c1ee7c17-521f-45b3-bdb4-748939838e60" containerName="barbican-api" containerID="cri-o://bb53ff7aacbd051823727a3e5e9c0df1aa87aded1e6321d5bbd77a8a437a808e" gracePeriod=30 Feb 19 21:46:44 crc kubenswrapper[4795]: I0219 21:46:44.607898 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/barbican-api-5f5874b546-54bp8" Feb 19 21:46:44 crc kubenswrapper[4795]: I0219 21:46:44.607913 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5f5874b546-54bp8" Feb 19 21:46:44 crc kubenswrapper[4795]: I0219 21:46:44.617364 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6f98bf9994-pr48x" event={"ID":"e4a6a069-904a-4072-b98c-346f67f22def","Type":"ContainerStarted","Data":"95eca73f9943de18ca7dd19f1ef5d95e39ab42d81563dce332afbfa7377d20f4"} Feb 19 21:46:44 crc kubenswrapper[4795]: I0219 21:46:44.619944 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6df95dfbd4-ftf6x" event={"ID":"b01bcd5b-435a-4702-b0a4-8dfe8f553c23","Type":"ContainerStarted","Data":"16d1434f729c32f7f5af098d1664dafb8ed3d4636079462bf6c45b1454ee08ef"} Feb 19 21:46:44 crc kubenswrapper[4795]: I0219 21:46:44.619981 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6df95dfbd4-ftf6x" event={"ID":"b01bcd5b-435a-4702-b0a4-8dfe8f553c23","Type":"ContainerStarted","Data":"47caa9a7519cc7b778b03d7e938c02973816e703da61178a9af0e7d1bdc77812"} Feb 19 21:46:44 crc kubenswrapper[4795]: I0219 21:46:44.639704 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f6698443-b029-4098-81d6-dba6d5f239f2","Type":"ContainerStarted","Data":"f0e44fbdb8e3ea25287eea5edbbd58cd883e463ecd967da058e1d18fafd8bc62"} Feb 19 21:46:44 crc kubenswrapper[4795]: I0219 21:46:44.656253 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5fdbf9784-tjjsd" podStartSLOduration=7.656226029 podStartE2EDuration="7.656226029s" podCreationTimestamp="2026-02-19 21:46:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:46:44.637029677 +0000 UTC m=+1115.829547541" watchObservedRunningTime="2026-02-19 
21:46:44.656226029 +0000 UTC m=+1115.848743893" Feb 19 21:46:44 crc kubenswrapper[4795]: I0219 21:46:44.661615 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5f5874b546-54bp8" podStartSLOduration=7.66159949 podStartE2EDuration="7.66159949s" podCreationTimestamp="2026-02-19 21:46:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:46:44.657193726 +0000 UTC m=+1115.849711590" watchObservedRunningTime="2026-02-19 21:46:44.66159949 +0000 UTC m=+1115.854117354" Feb 19 21:46:44 crc kubenswrapper[4795]: I0219 21:46:44.667955 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9d49dd75f-xfpr5" event={"ID":"6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b","Type":"ContainerStarted","Data":"af8b691ea9f453a749ca6ae641397eb75df993ab21c7f57160c14bfd356a318c"} Feb 19 21:46:44 crc kubenswrapper[4795]: I0219 21:46:44.668877 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-9d49dd75f-xfpr5" Feb 19 21:46:44 crc kubenswrapper[4795]: I0219 21:46:44.704829 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5f5874b546-54bp8" podUID="c1ee7c17-521f-45b3-bdb4-748939838e60" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.158:9311/healthcheck\": EOF" Feb 19 21:46:44 crc kubenswrapper[4795]: I0219 21:46:44.709265 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-54956bdb55-m77pq" event={"ID":"159cf586-d51b-49e8-bea8-99af238b8a3e","Type":"ContainerStarted","Data":"6899d2d5e3605ac4b85284744881c940882947864f30c85a559f8e727b8425ab"} Feb 19 21:46:44 crc kubenswrapper[4795]: I0219 21:46:44.710492 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-9d49dd75f-xfpr5" podStartSLOduration=7.710475729 podStartE2EDuration="7.710475729s" 
podCreationTimestamp="2026-02-19 21:46:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:46:44.70912437 +0000 UTC m=+1115.901642254" watchObservedRunningTime="2026-02-19 21:46:44.710475729 +0000 UTC m=+1115.902993593" Feb 19 21:46:44 crc kubenswrapper[4795]: I0219 21:46:44.718800 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-576c65f985-r97z7" event={"ID":"30f2c894-7a7a-4e5a-a090-a28ab50c766a","Type":"ContainerStarted","Data":"cc411b717439dc2d51f309775cfcf3728048016bc68869b8b28221a90840d6fb"} Feb 19 21:46:44 crc kubenswrapper[4795]: I0219 21:46:44.747784 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6b6d98fbd4-svzc8" event={"ID":"4532c069-4eb7-48ab-b575-b6a130e2b438","Type":"ContainerStarted","Data":"374356bcc0d8632ce08f62d54e2492534b8e8fc2a40b07f92483295295d884ef"} Feb 19 21:46:44 crc kubenswrapper[4795]: I0219 21:46:44.750062 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5d487967c6-th49z" event={"ID":"62013c47-67bc-44c5-a250-390102661c05","Type":"ContainerStarted","Data":"c702260ef9d1c33ae13f9d6a85037d4106df071e3becab6adaf7dc18f0d4998b"} Feb 19 21:46:45 crc kubenswrapper[4795]: I0219 21:46:45.762876 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6df95dfbd4-ftf6x" event={"ID":"b01bcd5b-435a-4702-b0a4-8dfe8f553c23","Type":"ContainerStarted","Data":"9d2e881624f7800e5e6e027e5487084188713e96e14cab5c281c53ae829851a2"} Feb 19 21:46:45 crc kubenswrapper[4795]: I0219 21:46:45.764533 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6df95dfbd4-ftf6x" Feb 19 21:46:45 crc kubenswrapper[4795]: I0219 21:46:45.764561 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6df95dfbd4-ftf6x" Feb 19 21:46:45 crc kubenswrapper[4795]: 
I0219 21:46:45.780276 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-576c65f985-r97z7" event={"ID":"30f2c894-7a7a-4e5a-a090-a28ab50c766a","Type":"ContainerStarted","Data":"fe9cf6c1b6f422c771781be2cfb21deabc54071e241ab6b5c24d3d83414ee4ec"} Feb 19 21:46:45 crc kubenswrapper[4795]: I0219 21:46:45.780340 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-576c65f985-r97z7" event={"ID":"30f2c894-7a7a-4e5a-a090-a28ab50c766a","Type":"ContainerStarted","Data":"b4418cb3b4a933e20130924748ce8ea933d7929e7a32c146e7e7ce593a7c13a0"} Feb 19 21:46:45 crc kubenswrapper[4795]: I0219 21:46:45.780390 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-576c65f985-r97z7" Feb 19 21:46:45 crc kubenswrapper[4795]: I0219 21:46:45.796643 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-54956bdb55-m77pq" event={"ID":"159cf586-d51b-49e8-bea8-99af238b8a3e","Type":"ContainerStarted","Data":"8c4602ad6d25e5ebb7ae1f04d551006252bfd7f7785df3a87b8f2e2f91d7cae9"} Feb 19 21:46:45 crc kubenswrapper[4795]: I0219 21:46:45.800157 4795 generic.go:334] "Generic (PLEG): container finished" podID="c1ee7c17-521f-45b3-bdb4-748939838e60" containerID="d05922b241c5606cc7cb67f0611459164ffa20eef296f34ff91982e9383ca569" exitCode=143 Feb 19 21:46:45 crc kubenswrapper[4795]: I0219 21:46:45.800221 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5f5874b546-54bp8" event={"ID":"c1ee7c17-521f-45b3-bdb4-748939838e60","Type":"ContainerDied","Data":"d05922b241c5606cc7cb67f0611459164ffa20eef296f34ff91982e9383ca569"} Feb 19 21:46:45 crc kubenswrapper[4795]: I0219 21:46:45.802100 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6b6d98fbd4-svzc8" event={"ID":"4532c069-4eb7-48ab-b575-b6a130e2b438","Type":"ContainerStarted","Data":"b139002ae58e716fbddbfc8a4fe687ac4f1df52f0c26c4b6314da64bed2d0075"} Feb 19 21:46:45 crc 
kubenswrapper[4795]: I0219 21:46:45.803445 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6f98bf9994-pr48x" event={"ID":"e4a6a069-904a-4072-b98c-346f67f22def","Type":"ContainerStarted","Data":"a6cf86c6cb492d1373f389e62b5d8e8687a30a53e6a19fc785851a2a8a42b429"} Feb 19 21:46:45 crc kubenswrapper[4795]: I0219 21:46:45.803540 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6f98bf9994-pr48x" event={"ID":"e4a6a069-904a-4072-b98c-346f67f22def","Type":"ContainerStarted","Data":"d4e7caf152ae88ec057977570b5950355e310fd918c2c78401ccea557c71b632"} Feb 19 21:46:45 crc kubenswrapper[4795]: I0219 21:46:45.803619 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6f98bf9994-pr48x" Feb 19 21:46:45 crc kubenswrapper[4795]: I0219 21:46:45.803680 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6f98bf9994-pr48x" Feb 19 21:46:45 crc kubenswrapper[4795]: I0219 21:46:45.806212 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7cd95cf589-2gw48" event={"ID":"250e9cae-06d9-44da-88af-239d15356a3c","Type":"ContainerStarted","Data":"f1227cc577cca2da8d7067560947f94ac089651b67e263368202e928196a7bc8"} Feb 19 21:46:45 crc kubenswrapper[4795]: I0219 21:46:45.815518 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-6df95dfbd4-ftf6x" podStartSLOduration=6.815499723 podStartE2EDuration="6.815499723s" podCreationTimestamp="2026-02-19 21:46:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:46:45.79303627 +0000 UTC m=+1116.985554144" watchObservedRunningTime="2026-02-19 21:46:45.815499723 +0000 UTC m=+1117.008017587" Feb 19 21:46:45 crc kubenswrapper[4795]: I0219 21:46:45.819208 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/neutron-576c65f985-r97z7" podStartSLOduration=6.819198288 podStartE2EDuration="6.819198288s" podCreationTimestamp="2026-02-19 21:46:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:46:45.811618654 +0000 UTC m=+1117.004136518" watchObservedRunningTime="2026-02-19 21:46:45.819198288 +0000 UTC m=+1117.011716152" Feb 19 21:46:45 crc kubenswrapper[4795]: I0219 21:46:45.830025 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-54956bdb55-m77pq" podStartSLOduration=4.027117709 podStartE2EDuration="8.830010173s" podCreationTimestamp="2026-02-19 21:46:37 +0000 UTC" firstStartedPulling="2026-02-19 21:46:38.697210499 +0000 UTC m=+1109.889728363" lastFinishedPulling="2026-02-19 21:46:43.500102963 +0000 UTC m=+1114.692620827" observedRunningTime="2026-02-19 21:46:45.826535355 +0000 UTC m=+1117.019053219" watchObservedRunningTime="2026-02-19 21:46:45.830010173 +0000 UTC m=+1117.022528037" Feb 19 21:46:45 crc kubenswrapper[4795]: I0219 21:46:45.830438 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5d487967c6-th49z" event={"ID":"62013c47-67bc-44c5-a250-390102661c05","Type":"ContainerStarted","Data":"5dce5b8835b0e39dba2d9477235497193e003011caeae9a1bc62a747113e3075"} Feb 19 21:46:45 crc kubenswrapper[4795]: I0219 21:46:45.845916 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-zjbsw" event={"ID":"5084e7b9-4923-449e-b0d7-28c602faeff0","Type":"ContainerStarted","Data":"0852bee2926e9dad886249433c004e0f4241817093b65fb2480708d4c6c502c0"} Feb 19 21:46:45 crc kubenswrapper[4795]: I0219 21:46:45.885579 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-6b6d98fbd4-svzc8" podStartSLOduration=4.749627204 podStartE2EDuration="8.885560869s" podCreationTimestamp="2026-02-19 
21:46:37 +0000 UTC" firstStartedPulling="2026-02-19 21:46:39.446507331 +0000 UTC m=+1110.639025195" lastFinishedPulling="2026-02-19 21:46:43.582440996 +0000 UTC m=+1114.774958860" observedRunningTime="2026-02-19 21:46:45.846064695 +0000 UTC m=+1117.038582549" watchObservedRunningTime="2026-02-19 21:46:45.885560869 +0000 UTC m=+1117.078078733" Feb 19 21:46:45 crc kubenswrapper[4795]: I0219 21:46:45.894911 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-7cd95cf589-2gw48" podStartSLOduration=4.751367894 podStartE2EDuration="8.894893313s" podCreationTimestamp="2026-02-19 21:46:37 +0000 UTC" firstStartedPulling="2026-02-19 21:46:39.424741827 +0000 UTC m=+1110.617259691" lastFinishedPulling="2026-02-19 21:46:43.568267246 +0000 UTC m=+1114.760785110" observedRunningTime="2026-02-19 21:46:45.865066751 +0000 UTC m=+1117.057584605" watchObservedRunningTime="2026-02-19 21:46:45.894893313 +0000 UTC m=+1117.087411177" Feb 19 21:46:45 crc kubenswrapper[4795]: I0219 21:46:45.924186 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-5d487967c6-th49z"] Feb 19 21:46:45 crc kubenswrapper[4795]: I0219 21:46:45.930028 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6f98bf9994-pr48x" podStartSLOduration=3.930000943 podStartE2EDuration="3.930000943s" podCreationTimestamp="2026-02-19 21:46:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:46:45.892255948 +0000 UTC m=+1117.084773812" watchObservedRunningTime="2026-02-19 21:46:45.930000943 +0000 UTC m=+1117.122518817" Feb 19 21:46:45 crc kubenswrapper[4795]: I0219 21:46:45.974788 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-54956bdb55-m77pq"] Feb 19 21:46:45 crc kubenswrapper[4795]: I0219 21:46:45.986024 4795 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/cinder-db-sync-zjbsw" podStartSLOduration=3.622193946 podStartE2EDuration="36.986003302s" podCreationTimestamp="2026-02-19 21:46:09 +0000 UTC" firstStartedPulling="2026-02-19 21:46:10.219688239 +0000 UTC m=+1081.412206103" lastFinishedPulling="2026-02-19 21:46:43.583497595 +0000 UTC m=+1114.776015459" observedRunningTime="2026-02-19 21:46:45.921925185 +0000 UTC m=+1117.114443049" watchObservedRunningTime="2026-02-19 21:46:45.986003302 +0000 UTC m=+1117.178521166" Feb 19 21:46:45 crc kubenswrapper[4795]: I0219 21:46:45.993604 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-5d487967c6-th49z" podStartSLOduration=4.3497769680000005 podStartE2EDuration="8.993591346s" podCreationTimestamp="2026-02-19 21:46:37 +0000 UTC" firstStartedPulling="2026-02-19 21:46:38.828525432 +0000 UTC m=+1110.021043296" lastFinishedPulling="2026-02-19 21:46:43.47233981 +0000 UTC m=+1114.664857674" observedRunningTime="2026-02-19 21:46:45.948568786 +0000 UTC m=+1117.141086650" watchObservedRunningTime="2026-02-19 21:46:45.993591346 +0000 UTC m=+1117.186109210" Feb 19 21:46:46 crc kubenswrapper[4795]: I0219 21:46:46.854705 4795 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 19 21:46:46 crc kubenswrapper[4795]: I0219 21:46:46.921507 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5fdbf9784-tjjsd" Feb 19 21:46:47 crc kubenswrapper[4795]: I0219 21:46:47.862266 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-54956bdb55-m77pq" podUID="159cf586-d51b-49e8-bea8-99af238b8a3e" containerName="barbican-worker-log" containerID="cri-o://6899d2d5e3605ac4b85284744881c940882947864f30c85a559f8e727b8425ab" gracePeriod=30 Feb 19 21:46:47 crc kubenswrapper[4795]: I0219 21:46:47.862595 4795 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/barbican-worker-54956bdb55-m77pq" podUID="159cf586-d51b-49e8-bea8-99af238b8a3e" containerName="barbican-worker" containerID="cri-o://8c4602ad6d25e5ebb7ae1f04d551006252bfd7f7785df3a87b8f2e2f91d7cae9" gracePeriod=30 Feb 19 21:46:47 crc kubenswrapper[4795]: I0219 21:46:47.863111 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-5d487967c6-th49z" podUID="62013c47-67bc-44c5-a250-390102661c05" containerName="barbican-keystone-listener-log" containerID="cri-o://c702260ef9d1c33ae13f9d6a85037d4106df071e3becab6adaf7dc18f0d4998b" gracePeriod=30 Feb 19 21:46:47 crc kubenswrapper[4795]: I0219 21:46:47.863220 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-5d487967c6-th49z" podUID="62013c47-67bc-44c5-a250-390102661c05" containerName="barbican-keystone-listener" containerID="cri-o://5dce5b8835b0e39dba2d9477235497193e003011caeae9a1bc62a747113e3075" gracePeriod=30 Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.622912 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-54956bdb55-m77pq" Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.715092 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-5d487967c6-th49z" Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.776980 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62013c47-67bc-44c5-a250-390102661c05-combined-ca-bundle\") pod \"62013c47-67bc-44c5-a250-390102661c05\" (UID: \"62013c47-67bc-44c5-a250-390102661c05\") " Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.777028 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-slh8s\" (UniqueName: \"kubernetes.io/projected/62013c47-67bc-44c5-a250-390102661c05-kube-api-access-slh8s\") pod \"62013c47-67bc-44c5-a250-390102661c05\" (UID: \"62013c47-67bc-44c5-a250-390102661c05\") " Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.777053 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cp762\" (UniqueName: \"kubernetes.io/projected/159cf586-d51b-49e8-bea8-99af238b8a3e-kube-api-access-cp762\") pod \"159cf586-d51b-49e8-bea8-99af238b8a3e\" (UID: \"159cf586-d51b-49e8-bea8-99af238b8a3e\") " Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.777082 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/159cf586-d51b-49e8-bea8-99af238b8a3e-config-data\") pod \"159cf586-d51b-49e8-bea8-99af238b8a3e\" (UID: \"159cf586-d51b-49e8-bea8-99af238b8a3e\") " Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.777100 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/159cf586-d51b-49e8-bea8-99af238b8a3e-config-data-custom\") pod \"159cf586-d51b-49e8-bea8-99af238b8a3e\" (UID: \"159cf586-d51b-49e8-bea8-99af238b8a3e\") " Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.777120 4795 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62013c47-67bc-44c5-a250-390102661c05-config-data\") pod \"62013c47-67bc-44c5-a250-390102661c05\" (UID: \"62013c47-67bc-44c5-a250-390102661c05\") " Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.777188 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/159cf586-d51b-49e8-bea8-99af238b8a3e-logs\") pod \"159cf586-d51b-49e8-bea8-99af238b8a3e\" (UID: \"159cf586-d51b-49e8-bea8-99af238b8a3e\") " Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.777215 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/62013c47-67bc-44c5-a250-390102661c05-config-data-custom\") pod \"62013c47-67bc-44c5-a250-390102661c05\" (UID: \"62013c47-67bc-44c5-a250-390102661c05\") " Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.777258 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62013c47-67bc-44c5-a250-390102661c05-logs\") pod \"62013c47-67bc-44c5-a250-390102661c05\" (UID: \"62013c47-67bc-44c5-a250-390102661c05\") " Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.777285 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/159cf586-d51b-49e8-bea8-99af238b8a3e-combined-ca-bundle\") pod \"159cf586-d51b-49e8-bea8-99af238b8a3e\" (UID: \"159cf586-d51b-49e8-bea8-99af238b8a3e\") " Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.779027 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62013c47-67bc-44c5-a250-390102661c05-logs" (OuterVolumeSpecName: "logs") pod "62013c47-67bc-44c5-a250-390102661c05" (UID: "62013c47-67bc-44c5-a250-390102661c05"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.779221 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/159cf586-d51b-49e8-bea8-99af238b8a3e-logs" (OuterVolumeSpecName: "logs") pod "159cf586-d51b-49e8-bea8-99af238b8a3e" (UID: "159cf586-d51b-49e8-bea8-99af238b8a3e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.785523 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/159cf586-d51b-49e8-bea8-99af238b8a3e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "159cf586-d51b-49e8-bea8-99af238b8a3e" (UID: "159cf586-d51b-49e8-bea8-99af238b8a3e"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.786580 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62013c47-67bc-44c5-a250-390102661c05-kube-api-access-slh8s" (OuterVolumeSpecName: "kube-api-access-slh8s") pod "62013c47-67bc-44c5-a250-390102661c05" (UID: "62013c47-67bc-44c5-a250-390102661c05"). InnerVolumeSpecName "kube-api-access-slh8s". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.790889 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62013c47-67bc-44c5-a250-390102661c05-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "62013c47-67bc-44c5-a250-390102661c05" (UID: "62013c47-67bc-44c5-a250-390102661c05"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.792377 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/159cf586-d51b-49e8-bea8-99af238b8a3e-kube-api-access-cp762" (OuterVolumeSpecName: "kube-api-access-cp762") pod "159cf586-d51b-49e8-bea8-99af238b8a3e" (UID: "159cf586-d51b-49e8-bea8-99af238b8a3e"). InnerVolumeSpecName "kube-api-access-cp762". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.822725 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62013c47-67bc-44c5-a250-390102661c05-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "62013c47-67bc-44c5-a250-390102661c05" (UID: "62013c47-67bc-44c5-a250-390102661c05"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.822855 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/159cf586-d51b-49e8-bea8-99af238b8a3e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "159cf586-d51b-49e8-bea8-99af238b8a3e" (UID: "159cf586-d51b-49e8-bea8-99af238b8a3e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.840925 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62013c47-67bc-44c5-a250-390102661c05-config-data" (OuterVolumeSpecName: "config-data") pod "62013c47-67bc-44c5-a250-390102661c05" (UID: "62013c47-67bc-44c5-a250-390102661c05"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.863852 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/159cf586-d51b-49e8-bea8-99af238b8a3e-config-data" (OuterVolumeSpecName: "config-data") pod "159cf586-d51b-49e8-bea8-99af238b8a3e" (UID: "159cf586-d51b-49e8-bea8-99af238b8a3e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.872966 4795 generic.go:334] "Generic (PLEG): container finished" podID="159cf586-d51b-49e8-bea8-99af238b8a3e" containerID="8c4602ad6d25e5ebb7ae1f04d551006252bfd7f7785df3a87b8f2e2f91d7cae9" exitCode=0 Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.873009 4795 generic.go:334] "Generic (PLEG): container finished" podID="159cf586-d51b-49e8-bea8-99af238b8a3e" containerID="6899d2d5e3605ac4b85284744881c940882947864f30c85a559f8e727b8425ab" exitCode=143 Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.873051 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-54956bdb55-m77pq" Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.873045 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-54956bdb55-m77pq" event={"ID":"159cf586-d51b-49e8-bea8-99af238b8a3e","Type":"ContainerDied","Data":"8c4602ad6d25e5ebb7ae1f04d551006252bfd7f7785df3a87b8f2e2f91d7cae9"} Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.873137 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-54956bdb55-m77pq" event={"ID":"159cf586-d51b-49e8-bea8-99af238b8a3e","Type":"ContainerDied","Data":"6899d2d5e3605ac4b85284744881c940882947864f30c85a559f8e727b8425ab"} Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.873150 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-54956bdb55-m77pq" event={"ID":"159cf586-d51b-49e8-bea8-99af238b8a3e","Type":"ContainerDied","Data":"4e28f3e465acd4c6f672fce5b1b114fcf10939feb9816a3820f5b6f281f46c1c"} Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.873179 4795 scope.go:117] "RemoveContainer" containerID="8c4602ad6d25e5ebb7ae1f04d551006252bfd7f7785df3a87b8f2e2f91d7cae9" Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.878984 4795 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/62013c47-67bc-44c5-a250-390102661c05-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.879005 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62013c47-67bc-44c5-a250-390102661c05-logs\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.879015 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/159cf586-d51b-49e8-bea8-99af238b8a3e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 
19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.879023 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62013c47-67bc-44c5-a250-390102661c05-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.879032 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-slh8s\" (UniqueName: \"kubernetes.io/projected/62013c47-67bc-44c5-a250-390102661c05-kube-api-access-slh8s\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.879041 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cp762\" (UniqueName: \"kubernetes.io/projected/159cf586-d51b-49e8-bea8-99af238b8a3e-kube-api-access-cp762\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.879052 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/159cf586-d51b-49e8-bea8-99af238b8a3e-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.879061 4795 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/159cf586-d51b-49e8-bea8-99af238b8a3e-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.879071 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62013c47-67bc-44c5-a250-390102661c05-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.879080 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/159cf586-d51b-49e8-bea8-99af238b8a3e-logs\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.880701 4795 generic.go:334] "Generic (PLEG): container finished" 
podID="62013c47-67bc-44c5-a250-390102661c05" containerID="5dce5b8835b0e39dba2d9477235497193e003011caeae9a1bc62a747113e3075" exitCode=0 Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.880732 4795 generic.go:334] "Generic (PLEG): container finished" podID="62013c47-67bc-44c5-a250-390102661c05" containerID="c702260ef9d1c33ae13f9d6a85037d4106df071e3becab6adaf7dc18f0d4998b" exitCode=143 Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.880756 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5d487967c6-th49z" event={"ID":"62013c47-67bc-44c5-a250-390102661c05","Type":"ContainerDied","Data":"5dce5b8835b0e39dba2d9477235497193e003011caeae9a1bc62a747113e3075"} Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.880773 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5d487967c6-th49z" Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.880788 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5d487967c6-th49z" event={"ID":"62013c47-67bc-44c5-a250-390102661c05","Type":"ContainerDied","Data":"c702260ef9d1c33ae13f9d6a85037d4106df071e3becab6adaf7dc18f0d4998b"} Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.880934 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5d487967c6-th49z" event={"ID":"62013c47-67bc-44c5-a250-390102661c05","Type":"ContainerDied","Data":"03d4ba2c4c2e12794fc2992e51f7d78cc75a2714702ce6582a4cf8391b4968e8"} Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.907097 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-54956bdb55-m77pq"] Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.918075 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-54956bdb55-m77pq"] Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.922996 4795 scope.go:117] 
"RemoveContainer" containerID="6899d2d5e3605ac4b85284744881c940882947864f30c85a559f8e727b8425ab" Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.926059 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-5d487967c6-th49z"] Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.935093 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-5d487967c6-th49z"] Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.956947 4795 scope.go:117] "RemoveContainer" containerID="8c4602ad6d25e5ebb7ae1f04d551006252bfd7f7785df3a87b8f2e2f91d7cae9" Feb 19 21:46:48 crc kubenswrapper[4795]: E0219 21:46:48.957391 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c4602ad6d25e5ebb7ae1f04d551006252bfd7f7785df3a87b8f2e2f91d7cae9\": container with ID starting with 8c4602ad6d25e5ebb7ae1f04d551006252bfd7f7785df3a87b8f2e2f91d7cae9 not found: ID does not exist" containerID="8c4602ad6d25e5ebb7ae1f04d551006252bfd7f7785df3a87b8f2e2f91d7cae9" Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.957436 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c4602ad6d25e5ebb7ae1f04d551006252bfd7f7785df3a87b8f2e2f91d7cae9"} err="failed to get container status \"8c4602ad6d25e5ebb7ae1f04d551006252bfd7f7785df3a87b8f2e2f91d7cae9\": rpc error: code = NotFound desc = could not find container \"8c4602ad6d25e5ebb7ae1f04d551006252bfd7f7785df3a87b8f2e2f91d7cae9\": container with ID starting with 8c4602ad6d25e5ebb7ae1f04d551006252bfd7f7785df3a87b8f2e2f91d7cae9 not found: ID does not exist" Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.957465 4795 scope.go:117] "RemoveContainer" containerID="6899d2d5e3605ac4b85284744881c940882947864f30c85a559f8e727b8425ab" Feb 19 21:46:48 crc kubenswrapper[4795]: E0219 21:46:48.958065 4795 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"6899d2d5e3605ac4b85284744881c940882947864f30c85a559f8e727b8425ab\": container with ID starting with 6899d2d5e3605ac4b85284744881c940882947864f30c85a559f8e727b8425ab not found: ID does not exist" containerID="6899d2d5e3605ac4b85284744881c940882947864f30c85a559f8e727b8425ab" Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.958116 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6899d2d5e3605ac4b85284744881c940882947864f30c85a559f8e727b8425ab"} err="failed to get container status \"6899d2d5e3605ac4b85284744881c940882947864f30c85a559f8e727b8425ab\": rpc error: code = NotFound desc = could not find container \"6899d2d5e3605ac4b85284744881c940882947864f30c85a559f8e727b8425ab\": container with ID starting with 6899d2d5e3605ac4b85284744881c940882947864f30c85a559f8e727b8425ab not found: ID does not exist" Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.958144 4795 scope.go:117] "RemoveContainer" containerID="8c4602ad6d25e5ebb7ae1f04d551006252bfd7f7785df3a87b8f2e2f91d7cae9" Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.958589 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c4602ad6d25e5ebb7ae1f04d551006252bfd7f7785df3a87b8f2e2f91d7cae9"} err="failed to get container status \"8c4602ad6d25e5ebb7ae1f04d551006252bfd7f7785df3a87b8f2e2f91d7cae9\": rpc error: code = NotFound desc = could not find container \"8c4602ad6d25e5ebb7ae1f04d551006252bfd7f7785df3a87b8f2e2f91d7cae9\": container with ID starting with 8c4602ad6d25e5ebb7ae1f04d551006252bfd7f7785df3a87b8f2e2f91d7cae9 not found: ID does not exist" Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.958631 4795 scope.go:117] "RemoveContainer" containerID="6899d2d5e3605ac4b85284744881c940882947864f30c85a559f8e727b8425ab" Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.959057 4795 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"6899d2d5e3605ac4b85284744881c940882947864f30c85a559f8e727b8425ab"} err="failed to get container status \"6899d2d5e3605ac4b85284744881c940882947864f30c85a559f8e727b8425ab\": rpc error: code = NotFound desc = could not find container \"6899d2d5e3605ac4b85284744881c940882947864f30c85a559f8e727b8425ab\": container with ID starting with 6899d2d5e3605ac4b85284744881c940882947864f30c85a559f8e727b8425ab not found: ID does not exist" Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.959102 4795 scope.go:117] "RemoveContainer" containerID="5dce5b8835b0e39dba2d9477235497193e003011caeae9a1bc62a747113e3075" Feb 19 21:46:48 crc kubenswrapper[4795]: I0219 21:46:48.980337 4795 scope.go:117] "RemoveContainer" containerID="c702260ef9d1c33ae13f9d6a85037d4106df071e3becab6adaf7dc18f0d4998b" Feb 19 21:46:49 crc kubenswrapper[4795]: I0219 21:46:49.044889 4795 scope.go:117] "RemoveContainer" containerID="5dce5b8835b0e39dba2d9477235497193e003011caeae9a1bc62a747113e3075" Feb 19 21:46:49 crc kubenswrapper[4795]: E0219 21:46:49.045350 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5dce5b8835b0e39dba2d9477235497193e003011caeae9a1bc62a747113e3075\": container with ID starting with 5dce5b8835b0e39dba2d9477235497193e003011caeae9a1bc62a747113e3075 not found: ID does not exist" containerID="5dce5b8835b0e39dba2d9477235497193e003011caeae9a1bc62a747113e3075" Feb 19 21:46:49 crc kubenswrapper[4795]: I0219 21:46:49.045379 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5dce5b8835b0e39dba2d9477235497193e003011caeae9a1bc62a747113e3075"} err="failed to get container status \"5dce5b8835b0e39dba2d9477235497193e003011caeae9a1bc62a747113e3075\": rpc error: code = NotFound desc = could not find container \"5dce5b8835b0e39dba2d9477235497193e003011caeae9a1bc62a747113e3075\": container with ID starting with 
5dce5b8835b0e39dba2d9477235497193e003011caeae9a1bc62a747113e3075 not found: ID does not exist" Feb 19 21:46:49 crc kubenswrapper[4795]: I0219 21:46:49.045399 4795 scope.go:117] "RemoveContainer" containerID="c702260ef9d1c33ae13f9d6a85037d4106df071e3becab6adaf7dc18f0d4998b" Feb 19 21:46:49 crc kubenswrapper[4795]: E0219 21:46:49.045737 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c702260ef9d1c33ae13f9d6a85037d4106df071e3becab6adaf7dc18f0d4998b\": container with ID starting with c702260ef9d1c33ae13f9d6a85037d4106df071e3becab6adaf7dc18f0d4998b not found: ID does not exist" containerID="c702260ef9d1c33ae13f9d6a85037d4106df071e3becab6adaf7dc18f0d4998b" Feb 19 21:46:49 crc kubenswrapper[4795]: I0219 21:46:49.045766 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c702260ef9d1c33ae13f9d6a85037d4106df071e3becab6adaf7dc18f0d4998b"} err="failed to get container status \"c702260ef9d1c33ae13f9d6a85037d4106df071e3becab6adaf7dc18f0d4998b\": rpc error: code = NotFound desc = could not find container \"c702260ef9d1c33ae13f9d6a85037d4106df071e3becab6adaf7dc18f0d4998b\": container with ID starting with c702260ef9d1c33ae13f9d6a85037d4106df071e3becab6adaf7dc18f0d4998b not found: ID does not exist" Feb 19 21:46:49 crc kubenswrapper[4795]: I0219 21:46:49.045779 4795 scope.go:117] "RemoveContainer" containerID="5dce5b8835b0e39dba2d9477235497193e003011caeae9a1bc62a747113e3075" Feb 19 21:46:49 crc kubenswrapper[4795]: I0219 21:46:49.046058 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5dce5b8835b0e39dba2d9477235497193e003011caeae9a1bc62a747113e3075"} err="failed to get container status \"5dce5b8835b0e39dba2d9477235497193e003011caeae9a1bc62a747113e3075\": rpc error: code = NotFound desc = could not find container \"5dce5b8835b0e39dba2d9477235497193e003011caeae9a1bc62a747113e3075\": container with ID 
starting with 5dce5b8835b0e39dba2d9477235497193e003011caeae9a1bc62a747113e3075 not found: ID does not exist" Feb 19 21:46:49 crc kubenswrapper[4795]: I0219 21:46:49.046089 4795 scope.go:117] "RemoveContainer" containerID="c702260ef9d1c33ae13f9d6a85037d4106df071e3becab6adaf7dc18f0d4998b" Feb 19 21:46:49 crc kubenswrapper[4795]: I0219 21:46:49.046252 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c702260ef9d1c33ae13f9d6a85037d4106df071e3becab6adaf7dc18f0d4998b"} err="failed to get container status \"c702260ef9d1c33ae13f9d6a85037d4106df071e3becab6adaf7dc18f0d4998b\": rpc error: code = NotFound desc = could not find container \"c702260ef9d1c33ae13f9d6a85037d4106df071e3becab6adaf7dc18f0d4998b\": container with ID starting with c702260ef9d1c33ae13f9d6a85037d4106df071e3becab6adaf7dc18f0d4998b not found: ID does not exist" Feb 19 21:46:49 crc kubenswrapper[4795]: I0219 21:46:49.525261 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="159cf586-d51b-49e8-bea8-99af238b8a3e" path="/var/lib/kubelet/pods/159cf586-d51b-49e8-bea8-99af238b8a3e/volumes" Feb 19 21:46:49 crc kubenswrapper[4795]: I0219 21:46:49.526097 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62013c47-67bc-44c5-a250-390102661c05" path="/var/lib/kubelet/pods/62013c47-67bc-44c5-a250-390102661c05/volumes" Feb 19 21:46:49 crc kubenswrapper[4795]: I0219 21:46:49.915385 4795 generic.go:334] "Generic (PLEG): container finished" podID="5084e7b9-4923-449e-b0d7-28c602faeff0" containerID="0852bee2926e9dad886249433c004e0f4241817093b65fb2480708d4c6c502c0" exitCode=0 Feb 19 21:46:49 crc kubenswrapper[4795]: I0219 21:46:49.915479 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-zjbsw" event={"ID":"5084e7b9-4923-449e-b0d7-28c602faeff0","Type":"ContainerDied","Data":"0852bee2926e9dad886249433c004e0f4241817093b65fb2480708d4c6c502c0"} Feb 19 21:46:51 crc kubenswrapper[4795]: I0219 
21:46:51.112437 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5f5874b546-54bp8" podUID="c1ee7c17-521f-45b3-bdb4-748939838e60" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.158:9311/healthcheck\": read tcp 10.217.0.2:37872->10.217.0.158:9311: read: connection reset by peer" Feb 19 21:46:51 crc kubenswrapper[4795]: I0219 21:46:51.941272 4795 generic.go:334] "Generic (PLEG): container finished" podID="c1ee7c17-521f-45b3-bdb4-748939838e60" containerID="bb53ff7aacbd051823727a3e5e9c0df1aa87aded1e6321d5bbd77a8a437a808e" exitCode=0 Feb 19 21:46:51 crc kubenswrapper[4795]: I0219 21:46:51.941340 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5f5874b546-54bp8" event={"ID":"c1ee7c17-521f-45b3-bdb4-748939838e60","Type":"ContainerDied","Data":"bb53ff7aacbd051823727a3e5e9c0df1aa87aded1e6321d5bbd77a8a437a808e"} Feb 19 21:46:53 crc kubenswrapper[4795]: I0219 21:46:53.427468 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-9d49dd75f-xfpr5" Feb 19 21:46:53 crc kubenswrapper[4795]: I0219 21:46:53.435520 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5f5874b546-54bp8" podUID="c1ee7c17-521f-45b3-bdb4-748939838e60" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.158:9311/healthcheck\": dial tcp 10.217.0.158:9311: connect: connection refused" Feb 19 21:46:53 crc kubenswrapper[4795]: I0219 21:46:53.435763 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5f5874b546-54bp8" podUID="c1ee7c17-521f-45b3-bdb4-748939838e60" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.158:9311/healthcheck\": dial tcp 10.217.0.158:9311: connect: connection refused" Feb 19 21:46:53 crc kubenswrapper[4795]: I0219 21:46:53.593588 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-67dccc895-rxl4z"] Feb 19 21:46:53 crc kubenswrapper[4795]: I0219 21:46:53.594134 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-67dccc895-rxl4z" podUID="db8db625-527b-49de-bab0-c2065360d792" containerName="dnsmasq-dns" containerID="cri-o://c943900c340586ccd6fb6dfea1e9575c93f20a12cfb4d8dd7d04eaf8b69bb4cb" gracePeriod=10 Feb 19 21:46:53 crc kubenswrapper[4795]: I0219 21:46:53.962411 4795 generic.go:334] "Generic (PLEG): container finished" podID="db8db625-527b-49de-bab0-c2065360d792" containerID="c943900c340586ccd6fb6dfea1e9575c93f20a12cfb4d8dd7d04eaf8b69bb4cb" exitCode=0 Feb 19 21:46:53 crc kubenswrapper[4795]: I0219 21:46:53.962470 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67dccc895-rxl4z" event={"ID":"db8db625-527b-49de-bab0-c2065360d792","Type":"ContainerDied","Data":"c943900c340586ccd6fb6dfea1e9575c93f20a12cfb4d8dd7d04eaf8b69bb4cb"} Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.005043 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6f98bf9994-pr48x" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.038500 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6f98bf9994-pr48x" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.087612 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5fdbf9784-tjjsd"] Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.087860 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5fdbf9784-tjjsd" podUID="ff9dd6ee-d043-41af-bcfa-8385ae786038" containerName="barbican-api-log" containerID="cri-o://628cf011c1e3cfa652516057f286c880a7bb4ae1632cbe08579b65a9e2873451" gracePeriod=30 Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.088245 4795 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/barbican-api-5fdbf9784-tjjsd" podUID="ff9dd6ee-d043-41af-bcfa-8385ae786038" containerName="barbican-api" containerID="cri-o://1e747e5d80d698d410aec3a2712fd1002c46a3b555382ce77d51ba1659f59721" gracePeriod=30 Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.105978 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5fdbf9784-tjjsd" podUID="ff9dd6ee-d043-41af-bcfa-8385ae786038" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.161:9311/healthcheck\": EOF" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.106025 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5fdbf9784-tjjsd" podUID="ff9dd6ee-d043-41af-bcfa-8385ae786038" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.161:9311/healthcheck\": EOF" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.301299 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-zjbsw" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.357097 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5084e7b9-4923-449e-b0d7-28c602faeff0-scripts\") pod \"5084e7b9-4923-449e-b0d7-28c602faeff0\" (UID: \"5084e7b9-4923-449e-b0d7-28c602faeff0\") " Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.357361 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5084e7b9-4923-449e-b0d7-28c602faeff0-etc-machine-id\") pod \"5084e7b9-4923-449e-b0d7-28c602faeff0\" (UID: \"5084e7b9-4923-449e-b0d7-28c602faeff0\") " Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.357437 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/5084e7b9-4923-449e-b0d7-28c602faeff0-db-sync-config-data\") pod \"5084e7b9-4923-449e-b0d7-28c602faeff0\" (UID: \"5084e7b9-4923-449e-b0d7-28c602faeff0\") " Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.357480 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5084e7b9-4923-449e-b0d7-28c602faeff0-combined-ca-bundle\") pod \"5084e7b9-4923-449e-b0d7-28c602faeff0\" (UID: \"5084e7b9-4923-449e-b0d7-28c602faeff0\") " Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.357509 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5084e7b9-4923-449e-b0d7-28c602faeff0-config-data\") pod \"5084e7b9-4923-449e-b0d7-28c602faeff0\" (UID: \"5084e7b9-4923-449e-b0d7-28c602faeff0\") " Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.357570 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qbqsc\" (UniqueName: \"kubernetes.io/projected/5084e7b9-4923-449e-b0d7-28c602faeff0-kube-api-access-qbqsc\") pod \"5084e7b9-4923-449e-b0d7-28c602faeff0\" (UID: \"5084e7b9-4923-449e-b0d7-28c602faeff0\") " Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.360206 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5084e7b9-4923-449e-b0d7-28c602faeff0-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "5084e7b9-4923-449e-b0d7-28c602faeff0" (UID: "5084e7b9-4923-449e-b0d7-28c602faeff0"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.361188 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5084e7b9-4923-449e-b0d7-28c602faeff0-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "5084e7b9-4923-449e-b0d7-28c602faeff0" (UID: "5084e7b9-4923-449e-b0d7-28c602faeff0"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.365082 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5084e7b9-4923-449e-b0d7-28c602faeff0-scripts" (OuterVolumeSpecName: "scripts") pod "5084e7b9-4923-449e-b0d7-28c602faeff0" (UID: "5084e7b9-4923-449e-b0d7-28c602faeff0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.381352 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5084e7b9-4923-449e-b0d7-28c602faeff0-kube-api-access-qbqsc" (OuterVolumeSpecName: "kube-api-access-qbqsc") pod "5084e7b9-4923-449e-b0d7-28c602faeff0" (UID: "5084e7b9-4923-449e-b0d7-28c602faeff0"). InnerVolumeSpecName "kube-api-access-qbqsc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.429029 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5084e7b9-4923-449e-b0d7-28c602faeff0-config-data" (OuterVolumeSpecName: "config-data") pod "5084e7b9-4923-449e-b0d7-28c602faeff0" (UID: "5084e7b9-4923-449e-b0d7-28c602faeff0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.440301 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5084e7b9-4923-449e-b0d7-28c602faeff0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5084e7b9-4923-449e-b0d7-28c602faeff0" (UID: "5084e7b9-4923-449e-b0d7-28c602faeff0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.460632 4795 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5084e7b9-4923-449e-b0d7-28c602faeff0-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.462312 4795 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5084e7b9-4923-449e-b0d7-28c602faeff0-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.462336 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5084e7b9-4923-449e-b0d7-28c602faeff0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.462350 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5084e7b9-4923-449e-b0d7-28c602faeff0-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.462639 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qbqsc\" (UniqueName: \"kubernetes.io/projected/5084e7b9-4923-449e-b0d7-28c602faeff0-kube-api-access-qbqsc\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.462661 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/5084e7b9-4923-449e-b0d7-28c602faeff0-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.529764 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5f5874b546-54bp8" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.563763 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xgvdx\" (UniqueName: \"kubernetes.io/projected/c1ee7c17-521f-45b3-bdb4-748939838e60-kube-api-access-xgvdx\") pod \"c1ee7c17-521f-45b3-bdb4-748939838e60\" (UID: \"c1ee7c17-521f-45b3-bdb4-748939838e60\") " Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.564133 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1ee7c17-521f-45b3-bdb4-748939838e60-combined-ca-bundle\") pod \"c1ee7c17-521f-45b3-bdb4-748939838e60\" (UID: \"c1ee7c17-521f-45b3-bdb4-748939838e60\") " Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.564162 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c1ee7c17-521f-45b3-bdb4-748939838e60-config-data-custom\") pod \"c1ee7c17-521f-45b3-bdb4-748939838e60\" (UID: \"c1ee7c17-521f-45b3-bdb4-748939838e60\") " Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.564283 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c1ee7c17-521f-45b3-bdb4-748939838e60-logs\") pod \"c1ee7c17-521f-45b3-bdb4-748939838e60\" (UID: \"c1ee7c17-521f-45b3-bdb4-748939838e60\") " Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.564423 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1ee7c17-521f-45b3-bdb4-748939838e60-config-data\") pod 
\"c1ee7c17-521f-45b3-bdb4-748939838e60\" (UID: \"c1ee7c17-521f-45b3-bdb4-748939838e60\") " Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.571696 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1ee7c17-521f-45b3-bdb4-748939838e60-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c1ee7c17-521f-45b3-bdb4-748939838e60" (UID: "c1ee7c17-521f-45b3-bdb4-748939838e60"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.573459 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1ee7c17-521f-45b3-bdb4-748939838e60-kube-api-access-xgvdx" (OuterVolumeSpecName: "kube-api-access-xgvdx") pod "c1ee7c17-521f-45b3-bdb4-748939838e60" (UID: "c1ee7c17-521f-45b3-bdb4-748939838e60"). InnerVolumeSpecName "kube-api-access-xgvdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.574321 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1ee7c17-521f-45b3-bdb4-748939838e60-logs" (OuterVolumeSpecName: "logs") pod "c1ee7c17-521f-45b3-bdb4-748939838e60" (UID: "c1ee7c17-521f-45b3-bdb4-748939838e60"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.596639 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1ee7c17-521f-45b3-bdb4-748939838e60-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c1ee7c17-521f-45b3-bdb4-748939838e60" (UID: "c1ee7c17-521f-45b3-bdb4-748939838e60"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.647374 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67dccc895-rxl4z" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.647583 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1ee7c17-521f-45b3-bdb4-748939838e60-config-data" (OuterVolumeSpecName: "config-data") pod "c1ee7c17-521f-45b3-bdb4-748939838e60" (UID: "c1ee7c17-521f-45b3-bdb4-748939838e60"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.668074 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db8db625-527b-49de-bab0-c2065360d792-config\") pod \"db8db625-527b-49de-bab0-c2065360d792\" (UID: \"db8db625-527b-49de-bab0-c2065360d792\") " Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.668145 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jfv2b\" (UniqueName: \"kubernetes.io/projected/db8db625-527b-49de-bab0-c2065360d792-kube-api-access-jfv2b\") pod \"db8db625-527b-49de-bab0-c2065360d792\" (UID: \"db8db625-527b-49de-bab0-c2065360d792\") " Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.668196 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/db8db625-527b-49de-bab0-c2065360d792-ovsdbserver-sb\") pod \"db8db625-527b-49de-bab0-c2065360d792\" (UID: \"db8db625-527b-49de-bab0-c2065360d792\") " Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.668294 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/db8db625-527b-49de-bab0-c2065360d792-ovsdbserver-nb\") pod \"db8db625-527b-49de-bab0-c2065360d792\" (UID: \"db8db625-527b-49de-bab0-c2065360d792\") " Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.668344 4795 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/db8db625-527b-49de-bab0-c2065360d792-dns-swift-storage-0\") pod \"db8db625-527b-49de-bab0-c2065360d792\" (UID: \"db8db625-527b-49de-bab0-c2065360d792\") " Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.668378 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/db8db625-527b-49de-bab0-c2065360d792-dns-svc\") pod \"db8db625-527b-49de-bab0-c2065360d792\" (UID: \"db8db625-527b-49de-bab0-c2065360d792\") " Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.669094 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xgvdx\" (UniqueName: \"kubernetes.io/projected/c1ee7c17-521f-45b3-bdb4-748939838e60-kube-api-access-xgvdx\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.669112 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1ee7c17-521f-45b3-bdb4-748939838e60-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.669121 4795 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c1ee7c17-521f-45b3-bdb4-748939838e60-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.669129 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c1ee7c17-521f-45b3-bdb4-748939838e60-logs\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.669138 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1ee7c17-521f-45b3-bdb4-748939838e60-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:54 crc 
kubenswrapper[4795]: I0219 21:46:54.682501 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db8db625-527b-49de-bab0-c2065360d792-kube-api-access-jfv2b" (OuterVolumeSpecName: "kube-api-access-jfv2b") pod "db8db625-527b-49de-bab0-c2065360d792" (UID: "db8db625-527b-49de-bab0-c2065360d792"). InnerVolumeSpecName "kube-api-access-jfv2b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.735072 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db8db625-527b-49de-bab0-c2065360d792-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "db8db625-527b-49de-bab0-c2065360d792" (UID: "db8db625-527b-49de-bab0-c2065360d792"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.738762 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db8db625-527b-49de-bab0-c2065360d792-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "db8db625-527b-49de-bab0-c2065360d792" (UID: "db8db625-527b-49de-bab0-c2065360d792"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.757679 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db8db625-527b-49de-bab0-c2065360d792-config" (OuterVolumeSpecName: "config") pod "db8db625-527b-49de-bab0-c2065360d792" (UID: "db8db625-527b-49de-bab0-c2065360d792"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.758615 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db8db625-527b-49de-bab0-c2065360d792-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "db8db625-527b-49de-bab0-c2065360d792" (UID: "db8db625-527b-49de-bab0-c2065360d792"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.769984 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db8db625-527b-49de-bab0-c2065360d792-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.770027 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jfv2b\" (UniqueName: \"kubernetes.io/projected/db8db625-527b-49de-bab0-c2065360d792-kube-api-access-jfv2b\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.770037 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/db8db625-527b-49de-bab0-c2065360d792-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.770047 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/db8db625-527b-49de-bab0-c2065360d792-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.770056 4795 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/db8db625-527b-49de-bab0-c2065360d792-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.778079 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/db8db625-527b-49de-bab0-c2065360d792-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "db8db625-527b-49de-bab0-c2065360d792" (UID: "db8db625-527b-49de-bab0-c2065360d792"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.871317 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/db8db625-527b-49de-bab0-c2065360d792-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.972978 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f6698443-b029-4098-81d6-dba6d5f239f2","Type":"ContainerStarted","Data":"99e3eeb18daa232515a54095f899a9b422cbf1a8c8121e9f859ed25245a01eaf"} Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.973110 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f6698443-b029-4098-81d6-dba6d5f239f2" containerName="ceilometer-central-agent" containerID="cri-o://b61288d5e6e3a8ef1a9301b874e513c48be039a918932bba9b9edb5495a72e81" gracePeriod=30 Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.973239 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.973251 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f6698443-b029-4098-81d6-dba6d5f239f2" containerName="proxy-httpd" containerID="cri-o://99e3eeb18daa232515a54095f899a9b422cbf1a8c8121e9f859ed25245a01eaf" gracePeriod=30 Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.973311 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f6698443-b029-4098-81d6-dba6d5f239f2" containerName="sg-core" 
containerID="cri-o://f0e44fbdb8e3ea25287eea5edbbd58cd883e463ecd967da058e1d18fafd8bc62" gracePeriod=30 Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.973364 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f6698443-b029-4098-81d6-dba6d5f239f2" containerName="ceilometer-notification-agent" containerID="cri-o://e126e33b52d2dc6b19be356eca9905c5a1a510ca7016f4ef6d57dc72b82f12fd" gracePeriod=30 Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.983943 4795 generic.go:334] "Generic (PLEG): container finished" podID="ff9dd6ee-d043-41af-bcfa-8385ae786038" containerID="628cf011c1e3cfa652516057f286c880a7bb4ae1632cbe08579b65a9e2873451" exitCode=143 Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.984033 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5fdbf9784-tjjsd" event={"ID":"ff9dd6ee-d043-41af-bcfa-8385ae786038","Type":"ContainerDied","Data":"628cf011c1e3cfa652516057f286c880a7bb4ae1632cbe08579b65a9e2873451"} Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.990111 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5f5874b546-54bp8" event={"ID":"c1ee7c17-521f-45b3-bdb4-748939838e60","Type":"ContainerDied","Data":"be89fa2e2a317aa99f43093ff5681ba4ce22ead52fd8adae9138024217c6894f"} Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.990156 4795 scope.go:117] "RemoveContainer" containerID="bb53ff7aacbd051823727a3e5e9c0df1aa87aded1e6321d5bbd77a8a437a808e" Feb 19 21:46:54 crc kubenswrapper[4795]: I0219 21:46:54.990304 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5f5874b546-54bp8" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.003133 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.850288343 podStartE2EDuration="46.003120068s" podCreationTimestamp="2026-02-19 21:46:09 +0000 UTC" firstStartedPulling="2026-02-19 21:46:10.187077849 +0000 UTC m=+1081.379595713" lastFinishedPulling="2026-02-19 21:46:54.339909574 +0000 UTC m=+1125.532427438" observedRunningTime="2026-02-19 21:46:55.001498892 +0000 UTC m=+1126.194016756" watchObservedRunningTime="2026-02-19 21:46:55.003120068 +0000 UTC m=+1126.195637932" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.025754 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67dccc895-rxl4z" event={"ID":"db8db625-527b-49de-bab0-c2065360d792","Type":"ContainerDied","Data":"4fa1b7dc967ebd0720fc00eaa4e4599d7f0abf1d8d22f716edb0032cd429d0aa"} Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.025879 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67dccc895-rxl4z" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.032928 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-zjbsw" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.033016 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-zjbsw" event={"ID":"5084e7b9-4923-449e-b0d7-28c602faeff0","Type":"ContainerDied","Data":"afea16e4a0f822c8b9769634b3e5dede9add2d921ddeb3308c942ab11c4579f9"} Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.033046 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="afea16e4a0f822c8b9769634b3e5dede9add2d921ddeb3308c942ab11c4579f9" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.050464 4795 scope.go:117] "RemoveContainer" containerID="d05922b241c5606cc7cb67f0611459164ffa20eef296f34ff91982e9383ca569" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.077247 4795 scope.go:117] "RemoveContainer" containerID="c943900c340586ccd6fb6dfea1e9575c93f20a12cfb4d8dd7d04eaf8b69bb4cb" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.096831 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5f5874b546-54bp8"] Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.112415 4795 scope.go:117] "RemoveContainer" containerID="3f9a72a27cc208f60203c5a7cd9bfc613d8657082c6f35b5044badc8bc0caace" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.112670 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-5f5874b546-54bp8"] Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.120634 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67dccc895-rxl4z"] Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.128156 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67dccc895-rxl4z"] Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.520729 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1ee7c17-521f-45b3-bdb4-748939838e60" 
path="/var/lib/kubelet/pods/c1ee7c17-521f-45b3-bdb4-748939838e60/volumes" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.521318 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db8db625-527b-49de-bab0-c2065360d792" path="/var/lib/kubelet/pods/db8db625-527b-49de-bab0-c2065360d792/volumes" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.606801 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 21:46:55 crc kubenswrapper[4795]: E0219 21:46:55.607152 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1ee7c17-521f-45b3-bdb4-748939838e60" containerName="barbican-api-log" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.607195 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1ee7c17-521f-45b3-bdb4-748939838e60" containerName="barbican-api-log" Feb 19 21:46:55 crc kubenswrapper[4795]: E0219 21:46:55.607210 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62013c47-67bc-44c5-a250-390102661c05" containerName="barbican-keystone-listener-log" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.607217 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="62013c47-67bc-44c5-a250-390102661c05" containerName="barbican-keystone-listener-log" Feb 19 21:46:55 crc kubenswrapper[4795]: E0219 21:46:55.607227 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5084e7b9-4923-449e-b0d7-28c602faeff0" containerName="cinder-db-sync" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.607233 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5084e7b9-4923-449e-b0d7-28c602faeff0" containerName="cinder-db-sync" Feb 19 21:46:55 crc kubenswrapper[4795]: E0219 21:46:55.607244 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1ee7c17-521f-45b3-bdb4-748939838e60" containerName="barbican-api" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.607250 4795 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="c1ee7c17-521f-45b3-bdb4-748939838e60" containerName="barbican-api" Feb 19 21:46:55 crc kubenswrapper[4795]: E0219 21:46:55.607259 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="159cf586-d51b-49e8-bea8-99af238b8a3e" containerName="barbican-worker" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.607265 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="159cf586-d51b-49e8-bea8-99af238b8a3e" containerName="barbican-worker" Feb 19 21:46:55 crc kubenswrapper[4795]: E0219 21:46:55.607276 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db8db625-527b-49de-bab0-c2065360d792" containerName="init" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.607281 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="db8db625-527b-49de-bab0-c2065360d792" containerName="init" Feb 19 21:46:55 crc kubenswrapper[4795]: E0219 21:46:55.607289 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="159cf586-d51b-49e8-bea8-99af238b8a3e" containerName="barbican-worker-log" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.607295 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="159cf586-d51b-49e8-bea8-99af238b8a3e" containerName="barbican-worker-log" Feb 19 21:46:55 crc kubenswrapper[4795]: E0219 21:46:55.607301 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db8db625-527b-49de-bab0-c2065360d792" containerName="dnsmasq-dns" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.607309 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="db8db625-527b-49de-bab0-c2065360d792" containerName="dnsmasq-dns" Feb 19 21:46:55 crc kubenswrapper[4795]: E0219 21:46:55.607323 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62013c47-67bc-44c5-a250-390102661c05" containerName="barbican-keystone-listener" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.607329 4795 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="62013c47-67bc-44c5-a250-390102661c05" containerName="barbican-keystone-listener" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.607496 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1ee7c17-521f-45b3-bdb4-748939838e60" containerName="barbican-api" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.607512 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="62013c47-67bc-44c5-a250-390102661c05" containerName="barbican-keystone-listener" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.607520 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="db8db625-527b-49de-bab0-c2065360d792" containerName="dnsmasq-dns" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.607528 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="159cf586-d51b-49e8-bea8-99af238b8a3e" containerName="barbican-worker-log" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.607544 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="5084e7b9-4923-449e-b0d7-28c602faeff0" containerName="cinder-db-sync" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.607561 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1ee7c17-521f-45b3-bdb4-748939838e60" containerName="barbican-api-log" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.607574 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="62013c47-67bc-44c5-a250-390102661c05" containerName="barbican-keystone-listener-log" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.607585 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="159cf586-d51b-49e8-bea8-99af238b8a3e" containerName="barbican-worker" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.614202 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.619446 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.619609 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.619630 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.622076 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-wf9zm" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.625045 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.662996 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6c8dc7b4d9-wgth2"] Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.664370 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c8dc7b4d9-wgth2" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.688001 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c8dc7b4d9-wgth2"] Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.700021 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df387754-5537-4d85-950b-02743c881da8-ovsdbserver-nb\") pod \"dnsmasq-dns-6c8dc7b4d9-wgth2\" (UID: \"df387754-5537-4d85-950b-02743c881da8\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-wgth2" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.700084 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hb2s\" (UniqueName: \"kubernetes.io/projected/df387754-5537-4d85-950b-02743c881da8-kube-api-access-2hb2s\") pod \"dnsmasq-dns-6c8dc7b4d9-wgth2\" (UID: \"df387754-5537-4d85-950b-02743c881da8\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-wgth2" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.700114 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2384b48a-ae68-4495-9c68-2faf894de9f9-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"2384b48a-ae68-4495-9c68-2faf894de9f9\") " pod="openstack/cinder-scheduler-0" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.700193 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2384b48a-ae68-4495-9c68-2faf894de9f9-scripts\") pod \"cinder-scheduler-0\" (UID: \"2384b48a-ae68-4495-9c68-2faf894de9f9\") " pod="openstack/cinder-scheduler-0" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.700218 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df387754-5537-4d85-950b-02743c881da8-ovsdbserver-sb\") pod \"dnsmasq-dns-6c8dc7b4d9-wgth2\" (UID: \"df387754-5537-4d85-950b-02743c881da8\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-wgth2" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.700347 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2384b48a-ae68-4495-9c68-2faf894de9f9-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"2384b48a-ae68-4495-9c68-2faf894de9f9\") " pod="openstack/cinder-scheduler-0" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.700441 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df387754-5537-4d85-950b-02743c881da8-dns-svc\") pod \"dnsmasq-dns-6c8dc7b4d9-wgth2\" (UID: \"df387754-5537-4d85-950b-02743c881da8\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-wgth2" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.700464 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/df387754-5537-4d85-950b-02743c881da8-dns-swift-storage-0\") pod \"dnsmasq-dns-6c8dc7b4d9-wgth2\" (UID: \"df387754-5537-4d85-950b-02743c881da8\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-wgth2" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.700511 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5clv6\" (UniqueName: \"kubernetes.io/projected/2384b48a-ae68-4495-9c68-2faf894de9f9-kube-api-access-5clv6\") pod \"cinder-scheduler-0\" (UID: \"2384b48a-ae68-4495-9c68-2faf894de9f9\") " pod="openstack/cinder-scheduler-0" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.700539 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2384b48a-ae68-4495-9c68-2faf894de9f9-config-data\") pod \"cinder-scheduler-0\" (UID: \"2384b48a-ae68-4495-9c68-2faf894de9f9\") " pod="openstack/cinder-scheduler-0" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.700555 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df387754-5537-4d85-950b-02743c881da8-config\") pod \"dnsmasq-dns-6c8dc7b4d9-wgth2\" (UID: \"df387754-5537-4d85-950b-02743c881da8\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-wgth2" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.700585 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2384b48a-ae68-4495-9c68-2faf894de9f9-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2384b48a-ae68-4495-9c68-2faf894de9f9\") " pod="openstack/cinder-scheduler-0" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.803033 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5clv6\" (UniqueName: \"kubernetes.io/projected/2384b48a-ae68-4495-9c68-2faf894de9f9-kube-api-access-5clv6\") pod \"cinder-scheduler-0\" (UID: \"2384b48a-ae68-4495-9c68-2faf894de9f9\") " pod="openstack/cinder-scheduler-0" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.803092 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2384b48a-ae68-4495-9c68-2faf894de9f9-config-data\") pod \"cinder-scheduler-0\" (UID: \"2384b48a-ae68-4495-9c68-2faf894de9f9\") " pod="openstack/cinder-scheduler-0" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.803116 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/df387754-5537-4d85-950b-02743c881da8-config\") pod \"dnsmasq-dns-6c8dc7b4d9-wgth2\" (UID: \"df387754-5537-4d85-950b-02743c881da8\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-wgth2" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.803144 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2384b48a-ae68-4495-9c68-2faf894de9f9-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2384b48a-ae68-4495-9c68-2faf894de9f9\") " pod="openstack/cinder-scheduler-0" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.803183 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df387754-5537-4d85-950b-02743c881da8-ovsdbserver-nb\") pod \"dnsmasq-dns-6c8dc7b4d9-wgth2\" (UID: \"df387754-5537-4d85-950b-02743c881da8\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-wgth2" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.803206 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hb2s\" (UniqueName: \"kubernetes.io/projected/df387754-5537-4d85-950b-02743c881da8-kube-api-access-2hb2s\") pod \"dnsmasq-dns-6c8dc7b4d9-wgth2\" (UID: \"df387754-5537-4d85-950b-02743c881da8\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-wgth2" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.803222 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2384b48a-ae68-4495-9c68-2faf894de9f9-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"2384b48a-ae68-4495-9c68-2faf894de9f9\") " pod="openstack/cinder-scheduler-0" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.803250 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2384b48a-ae68-4495-9c68-2faf894de9f9-scripts\") pod 
\"cinder-scheduler-0\" (UID: \"2384b48a-ae68-4495-9c68-2faf894de9f9\") " pod="openstack/cinder-scheduler-0" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.803265 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df387754-5537-4d85-950b-02743c881da8-ovsdbserver-sb\") pod \"dnsmasq-dns-6c8dc7b4d9-wgth2\" (UID: \"df387754-5537-4d85-950b-02743c881da8\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-wgth2" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.803290 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2384b48a-ae68-4495-9c68-2faf894de9f9-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"2384b48a-ae68-4495-9c68-2faf894de9f9\") " pod="openstack/cinder-scheduler-0" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.803328 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df387754-5537-4d85-950b-02743c881da8-dns-svc\") pod \"dnsmasq-dns-6c8dc7b4d9-wgth2\" (UID: \"df387754-5537-4d85-950b-02743c881da8\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-wgth2" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.803348 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/df387754-5537-4d85-950b-02743c881da8-dns-swift-storage-0\") pod \"dnsmasq-dns-6c8dc7b4d9-wgth2\" (UID: \"df387754-5537-4d85-950b-02743c881da8\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-wgth2" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.804298 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/df387754-5537-4d85-950b-02743c881da8-dns-swift-storage-0\") pod \"dnsmasq-dns-6c8dc7b4d9-wgth2\" (UID: \"df387754-5537-4d85-950b-02743c881da8\") " 
pod="openstack/dnsmasq-dns-6c8dc7b4d9-wgth2" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.804352 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2384b48a-ae68-4495-9c68-2faf894de9f9-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"2384b48a-ae68-4495-9c68-2faf894de9f9\") " pod="openstack/cinder-scheduler-0" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.804967 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df387754-5537-4d85-950b-02743c881da8-config\") pod \"dnsmasq-dns-6c8dc7b4d9-wgth2\" (UID: \"df387754-5537-4d85-950b-02743c881da8\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-wgth2" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.805322 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df387754-5537-4d85-950b-02743c881da8-ovsdbserver-nb\") pod \"dnsmasq-dns-6c8dc7b4d9-wgth2\" (UID: \"df387754-5537-4d85-950b-02743c881da8\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-wgth2" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.805328 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df387754-5537-4d85-950b-02743c881da8-ovsdbserver-sb\") pod \"dnsmasq-dns-6c8dc7b4d9-wgth2\" (UID: \"df387754-5537-4d85-950b-02743c881da8\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-wgth2" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.805698 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df387754-5537-4d85-950b-02743c881da8-dns-svc\") pod \"dnsmasq-dns-6c8dc7b4d9-wgth2\" (UID: \"df387754-5537-4d85-950b-02743c881da8\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-wgth2" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.811395 4795 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2384b48a-ae68-4495-9c68-2faf894de9f9-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"2384b48a-ae68-4495-9c68-2faf894de9f9\") " pod="openstack/cinder-scheduler-0" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.813363 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2384b48a-ae68-4495-9c68-2faf894de9f9-config-data\") pod \"cinder-scheduler-0\" (UID: \"2384b48a-ae68-4495-9c68-2faf894de9f9\") " pod="openstack/cinder-scheduler-0" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.822690 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2384b48a-ae68-4495-9c68-2faf894de9f9-scripts\") pod \"cinder-scheduler-0\" (UID: \"2384b48a-ae68-4495-9c68-2faf894de9f9\") " pod="openstack/cinder-scheduler-0" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.830761 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2384b48a-ae68-4495-9c68-2faf894de9f9-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2384b48a-ae68-4495-9c68-2faf894de9f9\") " pod="openstack/cinder-scheduler-0" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.840856 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hb2s\" (UniqueName: \"kubernetes.io/projected/df387754-5537-4d85-950b-02743c881da8-kube-api-access-2hb2s\") pod \"dnsmasq-dns-6c8dc7b4d9-wgth2\" (UID: \"df387754-5537-4d85-950b-02743c881da8\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-wgth2" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.845027 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5clv6\" (UniqueName: \"kubernetes.io/projected/2384b48a-ae68-4495-9c68-2faf894de9f9-kube-api-access-5clv6\") pod 
\"cinder-scheduler-0\" (UID: \"2384b48a-ae68-4495-9c68-2faf894de9f9\") " pod="openstack/cinder-scheduler-0" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.907215 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.908636 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.913331 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.921410 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.932546 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 19 21:46:55 crc kubenswrapper[4795]: I0219 21:46:55.988128 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c8dc7b4d9-wgth2" Feb 19 21:46:56 crc kubenswrapper[4795]: I0219 21:46:56.016062 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dff4a455-15cc-4733-adfd-0f27404e54ed-scripts\") pod \"cinder-api-0\" (UID: \"dff4a455-15cc-4733-adfd-0f27404e54ed\") " pod="openstack/cinder-api-0" Feb 19 21:46:56 crc kubenswrapper[4795]: I0219 21:46:56.016120 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dff4a455-15cc-4733-adfd-0f27404e54ed-logs\") pod \"cinder-api-0\" (UID: \"dff4a455-15cc-4733-adfd-0f27404e54ed\") " pod="openstack/cinder-api-0" Feb 19 21:46:56 crc kubenswrapper[4795]: I0219 21:46:56.016181 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8fxd\" (UniqueName: \"kubernetes.io/projected/dff4a455-15cc-4733-adfd-0f27404e54ed-kube-api-access-r8fxd\") pod \"cinder-api-0\" (UID: \"dff4a455-15cc-4733-adfd-0f27404e54ed\") " pod="openstack/cinder-api-0" Feb 19 21:46:56 crc kubenswrapper[4795]: I0219 21:46:56.016235 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dff4a455-15cc-4733-adfd-0f27404e54ed-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"dff4a455-15cc-4733-adfd-0f27404e54ed\") " pod="openstack/cinder-api-0" Feb 19 21:46:56 crc kubenswrapper[4795]: I0219 21:46:56.016262 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dff4a455-15cc-4733-adfd-0f27404e54ed-config-data\") pod \"cinder-api-0\" (UID: \"dff4a455-15cc-4733-adfd-0f27404e54ed\") " pod="openstack/cinder-api-0" Feb 19 21:46:56 crc kubenswrapper[4795]: I0219 21:46:56.016283 4795 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dff4a455-15cc-4733-adfd-0f27404e54ed-etc-machine-id\") pod \"cinder-api-0\" (UID: \"dff4a455-15cc-4733-adfd-0f27404e54ed\") " pod="openstack/cinder-api-0" Feb 19 21:46:56 crc kubenswrapper[4795]: I0219 21:46:56.016325 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dff4a455-15cc-4733-adfd-0f27404e54ed-config-data-custom\") pod \"cinder-api-0\" (UID: \"dff4a455-15cc-4733-adfd-0f27404e54ed\") " pod="openstack/cinder-api-0" Feb 19 21:46:56 crc kubenswrapper[4795]: I0219 21:46:56.079892 4795 generic.go:334] "Generic (PLEG): container finished" podID="f6698443-b029-4098-81d6-dba6d5f239f2" containerID="99e3eeb18daa232515a54095f899a9b422cbf1a8c8121e9f859ed25245a01eaf" exitCode=0 Feb 19 21:46:56 crc kubenswrapper[4795]: I0219 21:46:56.080242 4795 generic.go:334] "Generic (PLEG): container finished" podID="f6698443-b029-4098-81d6-dba6d5f239f2" containerID="f0e44fbdb8e3ea25287eea5edbbd58cd883e463ecd967da058e1d18fafd8bc62" exitCode=2 Feb 19 21:46:56 crc kubenswrapper[4795]: I0219 21:46:56.080252 4795 generic.go:334] "Generic (PLEG): container finished" podID="f6698443-b029-4098-81d6-dba6d5f239f2" containerID="b61288d5e6e3a8ef1a9301b874e513c48be039a918932bba9b9edb5495a72e81" exitCode=0 Feb 19 21:46:56 crc kubenswrapper[4795]: I0219 21:46:56.080002 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f6698443-b029-4098-81d6-dba6d5f239f2","Type":"ContainerDied","Data":"99e3eeb18daa232515a54095f899a9b422cbf1a8c8121e9f859ed25245a01eaf"} Feb 19 21:46:56 crc kubenswrapper[4795]: I0219 21:46:56.080334 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"f6698443-b029-4098-81d6-dba6d5f239f2","Type":"ContainerDied","Data":"f0e44fbdb8e3ea25287eea5edbbd58cd883e463ecd967da058e1d18fafd8bc62"} Feb 19 21:46:56 crc kubenswrapper[4795]: I0219 21:46:56.080360 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f6698443-b029-4098-81d6-dba6d5f239f2","Type":"ContainerDied","Data":"b61288d5e6e3a8ef1a9301b874e513c48be039a918932bba9b9edb5495a72e81"} Feb 19 21:46:56 crc kubenswrapper[4795]: I0219 21:46:56.119090 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8fxd\" (UniqueName: \"kubernetes.io/projected/dff4a455-15cc-4733-adfd-0f27404e54ed-kube-api-access-r8fxd\") pod \"cinder-api-0\" (UID: \"dff4a455-15cc-4733-adfd-0f27404e54ed\") " pod="openstack/cinder-api-0" Feb 19 21:46:56 crc kubenswrapper[4795]: I0219 21:46:56.119240 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dff4a455-15cc-4733-adfd-0f27404e54ed-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"dff4a455-15cc-4733-adfd-0f27404e54ed\") " pod="openstack/cinder-api-0" Feb 19 21:46:56 crc kubenswrapper[4795]: I0219 21:46:56.119273 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dff4a455-15cc-4733-adfd-0f27404e54ed-config-data\") pod \"cinder-api-0\" (UID: \"dff4a455-15cc-4733-adfd-0f27404e54ed\") " pod="openstack/cinder-api-0" Feb 19 21:46:56 crc kubenswrapper[4795]: I0219 21:46:56.119296 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dff4a455-15cc-4733-adfd-0f27404e54ed-etc-machine-id\") pod \"cinder-api-0\" (UID: \"dff4a455-15cc-4733-adfd-0f27404e54ed\") " pod="openstack/cinder-api-0" Feb 19 21:46:56 crc kubenswrapper[4795]: I0219 21:46:56.119346 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dff4a455-15cc-4733-adfd-0f27404e54ed-config-data-custom\") pod \"cinder-api-0\" (UID: \"dff4a455-15cc-4733-adfd-0f27404e54ed\") " pod="openstack/cinder-api-0" Feb 19 21:46:56 crc kubenswrapper[4795]: I0219 21:46:56.119375 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dff4a455-15cc-4733-adfd-0f27404e54ed-scripts\") pod \"cinder-api-0\" (UID: \"dff4a455-15cc-4733-adfd-0f27404e54ed\") " pod="openstack/cinder-api-0" Feb 19 21:46:56 crc kubenswrapper[4795]: I0219 21:46:56.119400 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dff4a455-15cc-4733-adfd-0f27404e54ed-logs\") pod \"cinder-api-0\" (UID: \"dff4a455-15cc-4733-adfd-0f27404e54ed\") " pod="openstack/cinder-api-0" Feb 19 21:46:56 crc kubenswrapper[4795]: I0219 21:46:56.119783 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dff4a455-15cc-4733-adfd-0f27404e54ed-etc-machine-id\") pod \"cinder-api-0\" (UID: \"dff4a455-15cc-4733-adfd-0f27404e54ed\") " pod="openstack/cinder-api-0" Feb 19 21:46:56 crc kubenswrapper[4795]: I0219 21:46:56.119997 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dff4a455-15cc-4733-adfd-0f27404e54ed-logs\") pod \"cinder-api-0\" (UID: \"dff4a455-15cc-4733-adfd-0f27404e54ed\") " pod="openstack/cinder-api-0" Feb 19 21:46:56 crc kubenswrapper[4795]: I0219 21:46:56.123980 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dff4a455-15cc-4733-adfd-0f27404e54ed-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"dff4a455-15cc-4733-adfd-0f27404e54ed\") " pod="openstack/cinder-api-0" Feb 19 21:46:56 crc 
kubenswrapper[4795]: I0219 21:46:56.124400 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dff4a455-15cc-4733-adfd-0f27404e54ed-config-data\") pod \"cinder-api-0\" (UID: \"dff4a455-15cc-4733-adfd-0f27404e54ed\") " pod="openstack/cinder-api-0" Feb 19 21:46:56 crc kubenswrapper[4795]: I0219 21:46:56.124974 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dff4a455-15cc-4733-adfd-0f27404e54ed-config-data-custom\") pod \"cinder-api-0\" (UID: \"dff4a455-15cc-4733-adfd-0f27404e54ed\") " pod="openstack/cinder-api-0" Feb 19 21:46:56 crc kubenswrapper[4795]: I0219 21:46:56.125318 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dff4a455-15cc-4733-adfd-0f27404e54ed-scripts\") pod \"cinder-api-0\" (UID: \"dff4a455-15cc-4733-adfd-0f27404e54ed\") " pod="openstack/cinder-api-0" Feb 19 21:46:56 crc kubenswrapper[4795]: I0219 21:46:56.140574 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8fxd\" (UniqueName: \"kubernetes.io/projected/dff4a455-15cc-4733-adfd-0f27404e54ed-kube-api-access-r8fxd\") pod \"cinder-api-0\" (UID: \"dff4a455-15cc-4733-adfd-0f27404e54ed\") " pod="openstack/cinder-api-0" Feb 19 21:46:56 crc kubenswrapper[4795]: I0219 21:46:56.252080 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 19 21:46:56 crc kubenswrapper[4795]: I0219 21:46:56.584766 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 21:46:56 crc kubenswrapper[4795]: W0219 21:46:56.587227 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2384b48a_ae68_4495_9c68_2faf894de9f9.slice/crio-1bb0109c274477813b9f48090bc2380bc85368bf7c3040d3068fd48ba9c4524c WatchSource:0}: Error finding container 1bb0109c274477813b9f48090bc2380bc85368bf7c3040d3068fd48ba9c4524c: Status 404 returned error can't find the container with id 1bb0109c274477813b9f48090bc2380bc85368bf7c3040d3068fd48ba9c4524c Feb 19 21:46:56 crc kubenswrapper[4795]: I0219 21:46:56.700856 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c8dc7b4d9-wgth2"] Feb 19 21:46:56 crc kubenswrapper[4795]: I0219 21:46:56.833042 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 19 21:46:56 crc kubenswrapper[4795]: W0219 21:46:56.859314 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddff4a455_15cc_4733_adfd_0f27404e54ed.slice/crio-34626286d7c48cbd7fc95276ad3459553314042c017067627d69a5fb7eeaa601 WatchSource:0}: Error finding container 34626286d7c48cbd7fc95276ad3459553314042c017067627d69a5fb7eeaa601: Status 404 returned error can't find the container with id 34626286d7c48cbd7fc95276ad3459553314042c017067627d69a5fb7eeaa601 Feb 19 21:46:56 crc kubenswrapper[4795]: I0219 21:46:56.940576 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.038370 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f6698443-b029-4098-81d6-dba6d5f239f2-sg-core-conf-yaml\") pod \"f6698443-b029-4098-81d6-dba6d5f239f2\" (UID: \"f6698443-b029-4098-81d6-dba6d5f239f2\") " Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.038651 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6698443-b029-4098-81d6-dba6d5f239f2-scripts\") pod \"f6698443-b029-4098-81d6-dba6d5f239f2\" (UID: \"f6698443-b029-4098-81d6-dba6d5f239f2\") " Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.038722 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drpsg\" (UniqueName: \"kubernetes.io/projected/f6698443-b029-4098-81d6-dba6d5f239f2-kube-api-access-drpsg\") pod \"f6698443-b029-4098-81d6-dba6d5f239f2\" (UID: \"f6698443-b029-4098-81d6-dba6d5f239f2\") " Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.038765 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6698443-b029-4098-81d6-dba6d5f239f2-log-httpd\") pod \"f6698443-b029-4098-81d6-dba6d5f239f2\" (UID: \"f6698443-b029-4098-81d6-dba6d5f239f2\") " Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.038895 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6698443-b029-4098-81d6-dba6d5f239f2-run-httpd\") pod \"f6698443-b029-4098-81d6-dba6d5f239f2\" (UID: \"f6698443-b029-4098-81d6-dba6d5f239f2\") " Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.038942 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f6698443-b029-4098-81d6-dba6d5f239f2-config-data\") pod \"f6698443-b029-4098-81d6-dba6d5f239f2\" (UID: \"f6698443-b029-4098-81d6-dba6d5f239f2\") " Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.038990 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6698443-b029-4098-81d6-dba6d5f239f2-combined-ca-bundle\") pod \"f6698443-b029-4098-81d6-dba6d5f239f2\" (UID: \"f6698443-b029-4098-81d6-dba6d5f239f2\") " Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.040477 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6698443-b029-4098-81d6-dba6d5f239f2-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f6698443-b029-4098-81d6-dba6d5f239f2" (UID: "f6698443-b029-4098-81d6-dba6d5f239f2"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.041672 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6698443-b029-4098-81d6-dba6d5f239f2-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f6698443-b029-4098-81d6-dba6d5f239f2" (UID: "f6698443-b029-4098-81d6-dba6d5f239f2"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.044946 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6698443-b029-4098-81d6-dba6d5f239f2-scripts" (OuterVolumeSpecName: "scripts") pod "f6698443-b029-4098-81d6-dba6d5f239f2" (UID: "f6698443-b029-4098-81d6-dba6d5f239f2"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.047849 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6698443-b029-4098-81d6-dba6d5f239f2-kube-api-access-drpsg" (OuterVolumeSpecName: "kube-api-access-drpsg") pod "f6698443-b029-4098-81d6-dba6d5f239f2" (UID: "f6698443-b029-4098-81d6-dba6d5f239f2"). InnerVolumeSpecName "kube-api-access-drpsg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.065100 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6698443-b029-4098-81d6-dba6d5f239f2-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f6698443-b029-4098-81d6-dba6d5f239f2" (UID: "f6698443-b029-4098-81d6-dba6d5f239f2"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.093068 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"dff4a455-15cc-4733-adfd-0f27404e54ed","Type":"ContainerStarted","Data":"34626286d7c48cbd7fc95276ad3459553314042c017067627d69a5fb7eeaa601"} Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.094524 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2384b48a-ae68-4495-9c68-2faf894de9f9","Type":"ContainerStarted","Data":"1bb0109c274477813b9f48090bc2380bc85368bf7c3040d3068fd48ba9c4524c"} Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.098652 4795 generic.go:334] "Generic (PLEG): container finished" podID="f6698443-b029-4098-81d6-dba6d5f239f2" containerID="e126e33b52d2dc6b19be356eca9905c5a1a510ca7016f4ef6d57dc72b82f12fd" exitCode=0 Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.098736 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"f6698443-b029-4098-81d6-dba6d5f239f2","Type":"ContainerDied","Data":"e126e33b52d2dc6b19be356eca9905c5a1a510ca7016f4ef6d57dc72b82f12fd"} Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.098765 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.098786 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f6698443-b029-4098-81d6-dba6d5f239f2","Type":"ContainerDied","Data":"333abcac5e9ec8b6d96f2784182bddc2611944e9296eb36664d925e9b90b96b2"} Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.098805 4795 scope.go:117] "RemoveContainer" containerID="99e3eeb18daa232515a54095f899a9b422cbf1a8c8121e9f859ed25245a01eaf" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.113739 4795 generic.go:334] "Generic (PLEG): container finished" podID="df387754-5537-4d85-950b-02743c881da8" containerID="93a750f0a2156e5774cb54d38a230aaaa13bdcb55ce20e481877fcd04f00ee2e" exitCode=0 Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.113786 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c8dc7b4d9-wgth2" event={"ID":"df387754-5537-4d85-950b-02743c881da8","Type":"ContainerDied","Data":"93a750f0a2156e5774cb54d38a230aaaa13bdcb55ce20e481877fcd04f00ee2e"} Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.113834 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c8dc7b4d9-wgth2" event={"ID":"df387754-5537-4d85-950b-02743c881da8","Type":"ContainerStarted","Data":"8495c4433e798d1ac2e79ddf9c88ebf569c3cedf6bdc46579d7bb36aaf2eff72"} Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.117079 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6698443-b029-4098-81d6-dba6d5f239f2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f6698443-b029-4098-81d6-dba6d5f239f2" (UID: 
"f6698443-b029-4098-81d6-dba6d5f239f2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.118356 4795 scope.go:117] "RemoveContainer" containerID="f0e44fbdb8e3ea25287eea5edbbd58cd883e463ecd967da058e1d18fafd8bc62" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.138857 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6698443-b029-4098-81d6-dba6d5f239f2-config-data" (OuterVolumeSpecName: "config-data") pod "f6698443-b029-4098-81d6-dba6d5f239f2" (UID: "f6698443-b029-4098-81d6-dba6d5f239f2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.141507 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6698443-b029-4098-81d6-dba6d5f239f2-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.141528 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drpsg\" (UniqueName: \"kubernetes.io/projected/f6698443-b029-4098-81d6-dba6d5f239f2-kube-api-access-drpsg\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.141540 4795 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6698443-b029-4098-81d6-dba6d5f239f2-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.141549 4795 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6698443-b029-4098-81d6-dba6d5f239f2-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.141556 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f6698443-b029-4098-81d6-dba6d5f239f2-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.141565 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6698443-b029-4098-81d6-dba6d5f239f2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.141573 4795 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f6698443-b029-4098-81d6-dba6d5f239f2-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.153457 4795 scope.go:117] "RemoveContainer" containerID="e126e33b52d2dc6b19be356eca9905c5a1a510ca7016f4ef6d57dc72b82f12fd" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.331274 4795 scope.go:117] "RemoveContainer" containerID="b61288d5e6e3a8ef1a9301b874e513c48be039a918932bba9b9edb5495a72e81" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.360574 4795 scope.go:117] "RemoveContainer" containerID="99e3eeb18daa232515a54095f899a9b422cbf1a8c8121e9f859ed25245a01eaf" Feb 19 21:46:57 crc kubenswrapper[4795]: E0219 21:46:57.361543 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99e3eeb18daa232515a54095f899a9b422cbf1a8c8121e9f859ed25245a01eaf\": container with ID starting with 99e3eeb18daa232515a54095f899a9b422cbf1a8c8121e9f859ed25245a01eaf not found: ID does not exist" containerID="99e3eeb18daa232515a54095f899a9b422cbf1a8c8121e9f859ed25245a01eaf" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.361586 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99e3eeb18daa232515a54095f899a9b422cbf1a8c8121e9f859ed25245a01eaf"} err="failed to get container status \"99e3eeb18daa232515a54095f899a9b422cbf1a8c8121e9f859ed25245a01eaf\": rpc 
error: code = NotFound desc = could not find container \"99e3eeb18daa232515a54095f899a9b422cbf1a8c8121e9f859ed25245a01eaf\": container with ID starting with 99e3eeb18daa232515a54095f899a9b422cbf1a8c8121e9f859ed25245a01eaf not found: ID does not exist" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.361611 4795 scope.go:117] "RemoveContainer" containerID="f0e44fbdb8e3ea25287eea5edbbd58cd883e463ecd967da058e1d18fafd8bc62" Feb 19 21:46:57 crc kubenswrapper[4795]: E0219 21:46:57.361889 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0e44fbdb8e3ea25287eea5edbbd58cd883e463ecd967da058e1d18fafd8bc62\": container with ID starting with f0e44fbdb8e3ea25287eea5edbbd58cd883e463ecd967da058e1d18fafd8bc62 not found: ID does not exist" containerID="f0e44fbdb8e3ea25287eea5edbbd58cd883e463ecd967da058e1d18fafd8bc62" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.361932 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0e44fbdb8e3ea25287eea5edbbd58cd883e463ecd967da058e1d18fafd8bc62"} err="failed to get container status \"f0e44fbdb8e3ea25287eea5edbbd58cd883e463ecd967da058e1d18fafd8bc62\": rpc error: code = NotFound desc = could not find container \"f0e44fbdb8e3ea25287eea5edbbd58cd883e463ecd967da058e1d18fafd8bc62\": container with ID starting with f0e44fbdb8e3ea25287eea5edbbd58cd883e463ecd967da058e1d18fafd8bc62 not found: ID does not exist" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.361961 4795 scope.go:117] "RemoveContainer" containerID="e126e33b52d2dc6b19be356eca9905c5a1a510ca7016f4ef6d57dc72b82f12fd" Feb 19 21:46:57 crc kubenswrapper[4795]: E0219 21:46:57.362225 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e126e33b52d2dc6b19be356eca9905c5a1a510ca7016f4ef6d57dc72b82f12fd\": container with ID starting with 
e126e33b52d2dc6b19be356eca9905c5a1a510ca7016f4ef6d57dc72b82f12fd not found: ID does not exist" containerID="e126e33b52d2dc6b19be356eca9905c5a1a510ca7016f4ef6d57dc72b82f12fd" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.362248 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e126e33b52d2dc6b19be356eca9905c5a1a510ca7016f4ef6d57dc72b82f12fd"} err="failed to get container status \"e126e33b52d2dc6b19be356eca9905c5a1a510ca7016f4ef6d57dc72b82f12fd\": rpc error: code = NotFound desc = could not find container \"e126e33b52d2dc6b19be356eca9905c5a1a510ca7016f4ef6d57dc72b82f12fd\": container with ID starting with e126e33b52d2dc6b19be356eca9905c5a1a510ca7016f4ef6d57dc72b82f12fd not found: ID does not exist" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.362263 4795 scope.go:117] "RemoveContainer" containerID="b61288d5e6e3a8ef1a9301b874e513c48be039a918932bba9b9edb5495a72e81" Feb 19 21:46:57 crc kubenswrapper[4795]: E0219 21:46:57.362582 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b61288d5e6e3a8ef1a9301b874e513c48be039a918932bba9b9edb5495a72e81\": container with ID starting with b61288d5e6e3a8ef1a9301b874e513c48be039a918932bba9b9edb5495a72e81 not found: ID does not exist" containerID="b61288d5e6e3a8ef1a9301b874e513c48be039a918932bba9b9edb5495a72e81" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.362609 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b61288d5e6e3a8ef1a9301b874e513c48be039a918932bba9b9edb5495a72e81"} err="failed to get container status \"b61288d5e6e3a8ef1a9301b874e513c48be039a918932bba9b9edb5495a72e81\": rpc error: code = NotFound desc = could not find container \"b61288d5e6e3a8ef1a9301b874e513c48be039a918932bba9b9edb5495a72e81\": container with ID starting with b61288d5e6e3a8ef1a9301b874e513c48be039a918932bba9b9edb5495a72e81 not found: ID does not 
exist" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.568370 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.579759 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.625141 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:46:57 crc kubenswrapper[4795]: E0219 21:46:57.625590 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6698443-b029-4098-81d6-dba6d5f239f2" containerName="ceilometer-central-agent" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.625606 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6698443-b029-4098-81d6-dba6d5f239f2" containerName="ceilometer-central-agent" Feb 19 21:46:57 crc kubenswrapper[4795]: E0219 21:46:57.625627 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6698443-b029-4098-81d6-dba6d5f239f2" containerName="ceilometer-notification-agent" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.625634 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6698443-b029-4098-81d6-dba6d5f239f2" containerName="ceilometer-notification-agent" Feb 19 21:46:57 crc kubenswrapper[4795]: E0219 21:46:57.625663 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6698443-b029-4098-81d6-dba6d5f239f2" containerName="sg-core" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.625669 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6698443-b029-4098-81d6-dba6d5f239f2" containerName="sg-core" Feb 19 21:46:57 crc kubenswrapper[4795]: E0219 21:46:57.625681 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6698443-b029-4098-81d6-dba6d5f239f2" containerName="proxy-httpd" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.625687 4795 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f6698443-b029-4098-81d6-dba6d5f239f2" containerName="proxy-httpd" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.625860 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6698443-b029-4098-81d6-dba6d5f239f2" containerName="ceilometer-notification-agent" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.625881 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6698443-b029-4098-81d6-dba6d5f239f2" containerName="sg-core" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.625897 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6698443-b029-4098-81d6-dba6d5f239f2" containerName="proxy-httpd" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.625909 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6698443-b029-4098-81d6-dba6d5f239f2" containerName="ceilometer-central-agent" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.627417 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.630520 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.630913 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.652038 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.652640 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2b418ec-23ae-4edd-8e61-0522a69c6be4-scripts\") pod \"ceilometer-0\" (UID: \"d2b418ec-23ae-4edd-8e61-0522a69c6be4\") " pod="openstack/ceilometer-0" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.652682 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59wzg\" (UniqueName: \"kubernetes.io/projected/d2b418ec-23ae-4edd-8e61-0522a69c6be4-kube-api-access-59wzg\") pod \"ceilometer-0\" (UID: \"d2b418ec-23ae-4edd-8e61-0522a69c6be4\") " pod="openstack/ceilometer-0" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.652706 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2b418ec-23ae-4edd-8e61-0522a69c6be4-run-httpd\") pod \"ceilometer-0\" (UID: \"d2b418ec-23ae-4edd-8e61-0522a69c6be4\") " pod="openstack/ceilometer-0" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.652726 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d2b418ec-23ae-4edd-8e61-0522a69c6be4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d2b418ec-23ae-4edd-8e61-0522a69c6be4\") " 
pod="openstack/ceilometer-0" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.652765 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2b418ec-23ae-4edd-8e61-0522a69c6be4-config-data\") pod \"ceilometer-0\" (UID: \"d2b418ec-23ae-4edd-8e61-0522a69c6be4\") " pod="openstack/ceilometer-0" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.652800 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2b418ec-23ae-4edd-8e61-0522a69c6be4-log-httpd\") pod \"ceilometer-0\" (UID: \"d2b418ec-23ae-4edd-8e61-0522a69c6be4\") " pod="openstack/ceilometer-0" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.652826 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2b418ec-23ae-4edd-8e61-0522a69c6be4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d2b418ec-23ae-4edd-8e61-0522a69c6be4\") " pod="openstack/ceilometer-0" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.754074 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2b418ec-23ae-4edd-8e61-0522a69c6be4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d2b418ec-23ae-4edd-8e61-0522a69c6be4\") " pod="openstack/ceilometer-0" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.754472 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2b418ec-23ae-4edd-8e61-0522a69c6be4-scripts\") pod \"ceilometer-0\" (UID: \"d2b418ec-23ae-4edd-8e61-0522a69c6be4\") " pod="openstack/ceilometer-0" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.754497 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-59wzg\" (UniqueName: \"kubernetes.io/projected/d2b418ec-23ae-4edd-8e61-0522a69c6be4-kube-api-access-59wzg\") pod \"ceilometer-0\" (UID: \"d2b418ec-23ae-4edd-8e61-0522a69c6be4\") " pod="openstack/ceilometer-0" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.754519 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2b418ec-23ae-4edd-8e61-0522a69c6be4-run-httpd\") pod \"ceilometer-0\" (UID: \"d2b418ec-23ae-4edd-8e61-0522a69c6be4\") " pod="openstack/ceilometer-0" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.754539 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d2b418ec-23ae-4edd-8e61-0522a69c6be4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d2b418ec-23ae-4edd-8e61-0522a69c6be4\") " pod="openstack/ceilometer-0" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.754575 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2b418ec-23ae-4edd-8e61-0522a69c6be4-config-data\") pod \"ceilometer-0\" (UID: \"d2b418ec-23ae-4edd-8e61-0522a69c6be4\") " pod="openstack/ceilometer-0" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.754611 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2b418ec-23ae-4edd-8e61-0522a69c6be4-log-httpd\") pod \"ceilometer-0\" (UID: \"d2b418ec-23ae-4edd-8e61-0522a69c6be4\") " pod="openstack/ceilometer-0" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.755046 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2b418ec-23ae-4edd-8e61-0522a69c6be4-log-httpd\") pod \"ceilometer-0\" (UID: \"d2b418ec-23ae-4edd-8e61-0522a69c6be4\") " pod="openstack/ceilometer-0" Feb 19 21:46:57 crc 
kubenswrapper[4795]: I0219 21:46:57.755289 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2b418ec-23ae-4edd-8e61-0522a69c6be4-run-httpd\") pod \"ceilometer-0\" (UID: \"d2b418ec-23ae-4edd-8e61-0522a69c6be4\") " pod="openstack/ceilometer-0" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.762935 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2b418ec-23ae-4edd-8e61-0522a69c6be4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d2b418ec-23ae-4edd-8e61-0522a69c6be4\") " pod="openstack/ceilometer-0" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.769894 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2b418ec-23ae-4edd-8e61-0522a69c6be4-scripts\") pod \"ceilometer-0\" (UID: \"d2b418ec-23ae-4edd-8e61-0522a69c6be4\") " pod="openstack/ceilometer-0" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.770301 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2b418ec-23ae-4edd-8e61-0522a69c6be4-config-data\") pod \"ceilometer-0\" (UID: \"d2b418ec-23ae-4edd-8e61-0522a69c6be4\") " pod="openstack/ceilometer-0" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.778932 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d2b418ec-23ae-4edd-8e61-0522a69c6be4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d2b418ec-23ae-4edd-8e61-0522a69c6be4\") " pod="openstack/ceilometer-0" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.784692 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59wzg\" (UniqueName: \"kubernetes.io/projected/d2b418ec-23ae-4edd-8e61-0522a69c6be4-kube-api-access-59wzg\") pod \"ceilometer-0\" (UID: 
\"d2b418ec-23ae-4edd-8e61-0522a69c6be4\") " pod="openstack/ceilometer-0" Feb 19 21:46:57 crc kubenswrapper[4795]: I0219 21:46:57.951102 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:46:58 crc kubenswrapper[4795]: I0219 21:46:58.123292 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"dff4a455-15cc-4733-adfd-0f27404e54ed","Type":"ContainerStarted","Data":"de5cd4ebec90cb75563e48bb778e5b039f23386342e708db9e76d781fc48ea14"} Feb 19 21:46:58 crc kubenswrapper[4795]: I0219 21:46:58.129805 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c8dc7b4d9-wgth2" event={"ID":"df387754-5537-4d85-950b-02743c881da8","Type":"ContainerStarted","Data":"70af76578d539ade05715b4f45e530828d624afd3981206c0f1b5d67746b15d9"} Feb 19 21:46:58 crc kubenswrapper[4795]: I0219 21:46:58.129943 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6c8dc7b4d9-wgth2" Feb 19 21:46:58 crc kubenswrapper[4795]: I0219 21:46:58.167889 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6c8dc7b4d9-wgth2" podStartSLOduration=3.167872322 podStartE2EDuration="3.167872322s" podCreationTimestamp="2026-02-19 21:46:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:46:58.153339542 +0000 UTC m=+1129.345857406" watchObservedRunningTime="2026-02-19 21:46:58.167872322 +0000 UTC m=+1129.360390186" Feb 19 21:46:58 crc kubenswrapper[4795]: I0219 21:46:58.278013 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 19 21:46:58 crc kubenswrapper[4795]: I0219 21:46:58.427178 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 21:46:58 crc kubenswrapper[4795]: I0219 21:46:58.427515 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 21:46:58 crc kubenswrapper[4795]: I0219 21:46:58.472394 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:46:58 crc kubenswrapper[4795]: W0219 21:46:58.504570 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd2b418ec_23ae_4edd_8e61_0522a69c6be4.slice/crio-aca75b960d03859bf274a8a3797de1abbc9baeeee38219c6dcc1576889b95f11 WatchSource:0}: Error finding container aca75b960d03859bf274a8a3797de1abbc9baeeee38219c6dcc1576889b95f11: Status 404 returned error can't find the container with id aca75b960d03859bf274a8a3797de1abbc9baeeee38219c6dcc1576889b95f11 Feb 19 21:46:58 crc kubenswrapper[4795]: I0219 21:46:58.599972 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5fdbf9784-tjjsd" podUID="ff9dd6ee-d043-41af-bcfa-8385ae786038" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.161:9311/healthcheck\": read tcp 10.217.0.2:39470->10.217.0.161:9311: read: connection reset by peer" Feb 19 21:46:58 crc kubenswrapper[4795]: I0219 21:46:58.600020 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5fdbf9784-tjjsd" podUID="ff9dd6ee-d043-41af-bcfa-8385ae786038" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.161:9311/healthcheck\": read tcp 10.217.0.2:37784->10.217.0.161:9311: read: connection reset by peer" Feb 19 
21:46:58 crc kubenswrapper[4795]: I0219 21:46:58.600331 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5fdbf9784-tjjsd" podUID="ff9dd6ee-d043-41af-bcfa-8385ae786038" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.161:9311/healthcheck\": dial tcp 10.217.0.161:9311: connect: connection refused" Feb 19 21:46:58 crc kubenswrapper[4795]: I0219 21:46:58.934926 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5fdbf9784-tjjsd" Feb 19 21:46:58 crc kubenswrapper[4795]: I0219 21:46:58.991825 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4krhj\" (UniqueName: \"kubernetes.io/projected/ff9dd6ee-d043-41af-bcfa-8385ae786038-kube-api-access-4krhj\") pod \"ff9dd6ee-d043-41af-bcfa-8385ae786038\" (UID: \"ff9dd6ee-d043-41af-bcfa-8385ae786038\") " Feb 19 21:46:58 crc kubenswrapper[4795]: I0219 21:46:58.991963 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff9dd6ee-d043-41af-bcfa-8385ae786038-combined-ca-bundle\") pod \"ff9dd6ee-d043-41af-bcfa-8385ae786038\" (UID: \"ff9dd6ee-d043-41af-bcfa-8385ae786038\") " Feb 19 21:46:58 crc kubenswrapper[4795]: I0219 21:46:58.992052 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff9dd6ee-d043-41af-bcfa-8385ae786038-logs\") pod \"ff9dd6ee-d043-41af-bcfa-8385ae786038\" (UID: \"ff9dd6ee-d043-41af-bcfa-8385ae786038\") " Feb 19 21:46:58 crc kubenswrapper[4795]: I0219 21:46:58.992233 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff9dd6ee-d043-41af-bcfa-8385ae786038-config-data\") pod \"ff9dd6ee-d043-41af-bcfa-8385ae786038\" (UID: \"ff9dd6ee-d043-41af-bcfa-8385ae786038\") " Feb 19 21:46:58 crc 
kubenswrapper[4795]: I0219 21:46:58.992275 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ff9dd6ee-d043-41af-bcfa-8385ae786038-config-data-custom\") pod \"ff9dd6ee-d043-41af-bcfa-8385ae786038\" (UID: \"ff9dd6ee-d043-41af-bcfa-8385ae786038\") " Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:58.992681 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff9dd6ee-d043-41af-bcfa-8385ae786038-logs" (OuterVolumeSpecName: "logs") pod "ff9dd6ee-d043-41af-bcfa-8385ae786038" (UID: "ff9dd6ee-d043-41af-bcfa-8385ae786038"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.000226 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff9dd6ee-d043-41af-bcfa-8385ae786038-kube-api-access-4krhj" (OuterVolumeSpecName: "kube-api-access-4krhj") pod "ff9dd6ee-d043-41af-bcfa-8385ae786038" (UID: "ff9dd6ee-d043-41af-bcfa-8385ae786038"). InnerVolumeSpecName "kube-api-access-4krhj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.004766 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff9dd6ee-d043-41af-bcfa-8385ae786038-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ff9dd6ee-d043-41af-bcfa-8385ae786038" (UID: "ff9dd6ee-d043-41af-bcfa-8385ae786038"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.046496 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff9dd6ee-d043-41af-bcfa-8385ae786038-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ff9dd6ee-d043-41af-bcfa-8385ae786038" (UID: "ff9dd6ee-d043-41af-bcfa-8385ae786038"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.072531 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff9dd6ee-d043-41af-bcfa-8385ae786038-config-data" (OuterVolumeSpecName: "config-data") pod "ff9dd6ee-d043-41af-bcfa-8385ae786038" (UID: "ff9dd6ee-d043-41af-bcfa-8385ae786038"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.094707 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff9dd6ee-d043-41af-bcfa-8385ae786038-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.094743 4795 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ff9dd6ee-d043-41af-bcfa-8385ae786038-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.094757 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4krhj\" (UniqueName: \"kubernetes.io/projected/ff9dd6ee-d043-41af-bcfa-8385ae786038-kube-api-access-4krhj\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.094768 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff9dd6ee-d043-41af-bcfa-8385ae786038-combined-ca-bundle\") on node \"crc\" 
DevicePath \"\"" Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.094780 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff9dd6ee-d043-41af-bcfa-8385ae786038-logs\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.141445 4795 generic.go:334] "Generic (PLEG): container finished" podID="ff9dd6ee-d043-41af-bcfa-8385ae786038" containerID="1e747e5d80d698d410aec3a2712fd1002c46a3b555382ce77d51ba1659f59721" exitCode=0 Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.141489 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5fdbf9784-tjjsd" event={"ID":"ff9dd6ee-d043-41af-bcfa-8385ae786038","Type":"ContainerDied","Data":"1e747e5d80d698d410aec3a2712fd1002c46a3b555382ce77d51ba1659f59721"} Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.141532 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5fdbf9784-tjjsd" Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.141555 4795 scope.go:117] "RemoveContainer" containerID="1e747e5d80d698d410aec3a2712fd1002c46a3b555382ce77d51ba1659f59721" Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.141542 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5fdbf9784-tjjsd" event={"ID":"ff9dd6ee-d043-41af-bcfa-8385ae786038","Type":"ContainerDied","Data":"86ab35cfd9c80c07dee3068dbe14e2a6dadda5cbd9e0d550648a5054e671ff43"} Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.149087 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"dff4a455-15cc-4733-adfd-0f27404e54ed","Type":"ContainerStarted","Data":"be49afdae72352ef031139ffd60ebb1ff9f28bf334ebee8797dcc75c7cc2b237"} Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.149285 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" 
podUID="dff4a455-15cc-4733-adfd-0f27404e54ed" containerName="cinder-api-log" containerID="cri-o://de5cd4ebec90cb75563e48bb778e5b039f23386342e708db9e76d781fc48ea14" gracePeriod=30 Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.149346 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="dff4a455-15cc-4733-adfd-0f27404e54ed" containerName="cinder-api" containerID="cri-o://be49afdae72352ef031139ffd60ebb1ff9f28bf334ebee8797dcc75c7cc2b237" gracePeriod=30 Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.149436 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.157553 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2b418ec-23ae-4edd-8e61-0522a69c6be4","Type":"ContainerStarted","Data":"f5d643cd66b59a6396d5feca0a7263f30adfef3399f6c9e0364f6fef62658bb6"} Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.157605 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2b418ec-23ae-4edd-8e61-0522a69c6be4","Type":"ContainerStarted","Data":"aca75b960d03859bf274a8a3797de1abbc9baeeee38219c6dcc1576889b95f11"} Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.159576 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2384b48a-ae68-4495-9c68-2faf894de9f9","Type":"ContainerStarted","Data":"9e6688b1bae5051b5c2b4f82f1c19ba9f7ca8fd1dd09d8ac892122e2dcf35f0c"} Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.159627 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2384b48a-ae68-4495-9c68-2faf894de9f9","Type":"ContainerStarted","Data":"749f08a4249525b1452b1f0b3c7c1bdacd64cc1b2a83d3c0d1cfa675182559ad"} Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.173328 4795 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.173311988 podStartE2EDuration="4.173311988s" podCreationTimestamp="2026-02-19 21:46:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:46:59.169730297 +0000 UTC m=+1130.362248161" watchObservedRunningTime="2026-02-19 21:46:59.173311988 +0000 UTC m=+1130.365829852" Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.180862 4795 scope.go:117] "RemoveContainer" containerID="628cf011c1e3cfa652516057f286c880a7bb4ae1632cbe08579b65a9e2873451" Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.207879 4795 scope.go:117] "RemoveContainer" containerID="1e747e5d80d698d410aec3a2712fd1002c46a3b555382ce77d51ba1659f59721" Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.211315 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.416279198 podStartE2EDuration="4.211297259s" podCreationTimestamp="2026-02-19 21:46:55 +0000 UTC" firstStartedPulling="2026-02-19 21:46:56.590459146 +0000 UTC m=+1127.782977010" lastFinishedPulling="2026-02-19 21:46:57.385477207 +0000 UTC m=+1128.577995071" observedRunningTime="2026-02-19 21:46:59.190295457 +0000 UTC m=+1130.382813331" watchObservedRunningTime="2026-02-19 21:46:59.211297259 +0000 UTC m=+1130.403815123" Feb 19 21:46:59 crc kubenswrapper[4795]: E0219 21:46:59.222785 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e747e5d80d698d410aec3a2712fd1002c46a3b555382ce77d51ba1659f59721\": container with ID starting with 1e747e5d80d698d410aec3a2712fd1002c46a3b555382ce77d51ba1659f59721 not found: ID does not exist" containerID="1e747e5d80d698d410aec3a2712fd1002c46a3b555382ce77d51ba1659f59721" Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.222887 4795 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e747e5d80d698d410aec3a2712fd1002c46a3b555382ce77d51ba1659f59721"} err="failed to get container status \"1e747e5d80d698d410aec3a2712fd1002c46a3b555382ce77d51ba1659f59721\": rpc error: code = NotFound desc = could not find container \"1e747e5d80d698d410aec3a2712fd1002c46a3b555382ce77d51ba1659f59721\": container with ID starting with 1e747e5d80d698d410aec3a2712fd1002c46a3b555382ce77d51ba1659f59721 not found: ID does not exist" Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.222929 4795 scope.go:117] "RemoveContainer" containerID="628cf011c1e3cfa652516057f286c880a7bb4ae1632cbe08579b65a9e2873451" Feb 19 21:46:59 crc kubenswrapper[4795]: E0219 21:46:59.223601 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"628cf011c1e3cfa652516057f286c880a7bb4ae1632cbe08579b65a9e2873451\": container with ID starting with 628cf011c1e3cfa652516057f286c880a7bb4ae1632cbe08579b65a9e2873451 not found: ID does not exist" containerID="628cf011c1e3cfa652516057f286c880a7bb4ae1632cbe08579b65a9e2873451" Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.223654 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"628cf011c1e3cfa652516057f286c880a7bb4ae1632cbe08579b65a9e2873451"} err="failed to get container status \"628cf011c1e3cfa652516057f286c880a7bb4ae1632cbe08579b65a9e2873451\": rpc error: code = NotFound desc = could not find container \"628cf011c1e3cfa652516057f286c880a7bb4ae1632cbe08579b65a9e2873451\": container with ID starting with 628cf011c1e3cfa652516057f286c880a7bb4ae1632cbe08579b65a9e2873451 not found: ID does not exist" Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.237110 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5fdbf9784-tjjsd"] Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.248222 4795 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/barbican-api-5fdbf9784-tjjsd"] Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.542007 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6698443-b029-4098-81d6-dba6d5f239f2" path="/var/lib/kubelet/pods/f6698443-b029-4098-81d6-dba6d5f239f2/volumes" Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.543415 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff9dd6ee-d043-41af-bcfa-8385ae786038" path="/var/lib/kubelet/pods/ff9dd6ee-d043-41af-bcfa-8385ae786038/volumes" Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.750039 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.808619 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dff4a455-15cc-4733-adfd-0f27404e54ed-scripts\") pod \"dff4a455-15cc-4733-adfd-0f27404e54ed\" (UID: \"dff4a455-15cc-4733-adfd-0f27404e54ed\") " Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.808850 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dff4a455-15cc-4733-adfd-0f27404e54ed-etc-machine-id\") pod \"dff4a455-15cc-4733-adfd-0f27404e54ed\" (UID: \"dff4a455-15cc-4733-adfd-0f27404e54ed\") " Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.808997 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dff4a455-15cc-4733-adfd-0f27404e54ed-config-data\") pod \"dff4a455-15cc-4733-adfd-0f27404e54ed\" (UID: \"dff4a455-15cc-4733-adfd-0f27404e54ed\") " Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.809103 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/dff4a455-15cc-4733-adfd-0f27404e54ed-combined-ca-bundle\") pod \"dff4a455-15cc-4733-adfd-0f27404e54ed\" (UID: \"dff4a455-15cc-4733-adfd-0f27404e54ed\") " Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.809224 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8fxd\" (UniqueName: \"kubernetes.io/projected/dff4a455-15cc-4733-adfd-0f27404e54ed-kube-api-access-r8fxd\") pod \"dff4a455-15cc-4733-adfd-0f27404e54ed\" (UID: \"dff4a455-15cc-4733-adfd-0f27404e54ed\") " Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.809599 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dff4a455-15cc-4733-adfd-0f27404e54ed-logs\") pod \"dff4a455-15cc-4733-adfd-0f27404e54ed\" (UID: \"dff4a455-15cc-4733-adfd-0f27404e54ed\") " Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.809719 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dff4a455-15cc-4733-adfd-0f27404e54ed-config-data-custom\") pod \"dff4a455-15cc-4733-adfd-0f27404e54ed\" (UID: \"dff4a455-15cc-4733-adfd-0f27404e54ed\") " Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.811915 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dff4a455-15cc-4733-adfd-0f27404e54ed-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "dff4a455-15cc-4733-adfd-0f27404e54ed" (UID: "dff4a455-15cc-4733-adfd-0f27404e54ed"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.812663 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dff4a455-15cc-4733-adfd-0f27404e54ed-logs" (OuterVolumeSpecName: "logs") pod "dff4a455-15cc-4733-adfd-0f27404e54ed" (UID: "dff4a455-15cc-4733-adfd-0f27404e54ed"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.814212 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dff4a455-15cc-4733-adfd-0f27404e54ed-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "dff4a455-15cc-4733-adfd-0f27404e54ed" (UID: "dff4a455-15cc-4733-adfd-0f27404e54ed"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.816330 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dff4a455-15cc-4733-adfd-0f27404e54ed-scripts" (OuterVolumeSpecName: "scripts") pod "dff4a455-15cc-4733-adfd-0f27404e54ed" (UID: "dff4a455-15cc-4733-adfd-0f27404e54ed"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.818387 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dff4a455-15cc-4733-adfd-0f27404e54ed-kube-api-access-r8fxd" (OuterVolumeSpecName: "kube-api-access-r8fxd") pod "dff4a455-15cc-4733-adfd-0f27404e54ed" (UID: "dff4a455-15cc-4733-adfd-0f27404e54ed"). InnerVolumeSpecName "kube-api-access-r8fxd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.848349 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dff4a455-15cc-4733-adfd-0f27404e54ed-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dff4a455-15cc-4733-adfd-0f27404e54ed" (UID: "dff4a455-15cc-4733-adfd-0f27404e54ed"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.885988 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dff4a455-15cc-4733-adfd-0f27404e54ed-config-data" (OuterVolumeSpecName: "config-data") pod "dff4a455-15cc-4733-adfd-0f27404e54ed" (UID: "dff4a455-15cc-4733-adfd-0f27404e54ed"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.911995 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dff4a455-15cc-4733-adfd-0f27404e54ed-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.912028 4795 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dff4a455-15cc-4733-adfd-0f27404e54ed-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.912037 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dff4a455-15cc-4733-adfd-0f27404e54ed-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.912046 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dff4a455-15cc-4733-adfd-0f27404e54ed-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:59 
crc kubenswrapper[4795]: I0219 21:46:59.912056 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8fxd\" (UniqueName: \"kubernetes.io/projected/dff4a455-15cc-4733-adfd-0f27404e54ed-kube-api-access-r8fxd\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.912065 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dff4a455-15cc-4733-adfd-0f27404e54ed-logs\") on node \"crc\" DevicePath \"\"" Feb 19 21:46:59 crc kubenswrapper[4795]: I0219 21:46:59.912073 4795 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dff4a455-15cc-4733-adfd-0f27404e54ed-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.169679 4795 generic.go:334] "Generic (PLEG): container finished" podID="dff4a455-15cc-4733-adfd-0f27404e54ed" containerID="be49afdae72352ef031139ffd60ebb1ff9f28bf334ebee8797dcc75c7cc2b237" exitCode=0 Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.169708 4795 generic.go:334] "Generic (PLEG): container finished" podID="dff4a455-15cc-4733-adfd-0f27404e54ed" containerID="de5cd4ebec90cb75563e48bb778e5b039f23386342e708db9e76d781fc48ea14" exitCode=143 Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.169753 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.169765 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"dff4a455-15cc-4733-adfd-0f27404e54ed","Type":"ContainerDied","Data":"be49afdae72352ef031139ffd60ebb1ff9f28bf334ebee8797dcc75c7cc2b237"} Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.169794 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"dff4a455-15cc-4733-adfd-0f27404e54ed","Type":"ContainerDied","Data":"de5cd4ebec90cb75563e48bb778e5b039f23386342e708db9e76d781fc48ea14"} Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.169805 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"dff4a455-15cc-4733-adfd-0f27404e54ed","Type":"ContainerDied","Data":"34626286d7c48cbd7fc95276ad3459553314042c017067627d69a5fb7eeaa601"} Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.169820 4795 scope.go:117] "RemoveContainer" containerID="be49afdae72352ef031139ffd60ebb1ff9f28bf334ebee8797dcc75c7cc2b237" Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.172417 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2b418ec-23ae-4edd-8e61-0522a69c6be4","Type":"ContainerStarted","Data":"fdbe214ee22443ca571dd54e58e45a8266df69233f71d59ca89e8fe90ae2ab86"} Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.203070 4795 scope.go:117] "RemoveContainer" containerID="de5cd4ebec90cb75563e48bb778e5b039f23386342e708db9e76d781fc48ea14" Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.214354 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.224275 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.236723 4795 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/cinder-api-0"] Feb 19 21:47:00 crc kubenswrapper[4795]: E0219 21:47:00.237034 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dff4a455-15cc-4733-adfd-0f27404e54ed" containerName="cinder-api-log" Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.237050 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="dff4a455-15cc-4733-adfd-0f27404e54ed" containerName="cinder-api-log" Feb 19 21:47:00 crc kubenswrapper[4795]: E0219 21:47:00.237073 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff9dd6ee-d043-41af-bcfa-8385ae786038" containerName="barbican-api" Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.237079 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff9dd6ee-d043-41af-bcfa-8385ae786038" containerName="barbican-api" Feb 19 21:47:00 crc kubenswrapper[4795]: E0219 21:47:00.237107 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dff4a455-15cc-4733-adfd-0f27404e54ed" containerName="cinder-api" Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.237113 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="dff4a455-15cc-4733-adfd-0f27404e54ed" containerName="cinder-api" Feb 19 21:47:00 crc kubenswrapper[4795]: E0219 21:47:00.237123 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff9dd6ee-d043-41af-bcfa-8385ae786038" containerName="barbican-api-log" Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.237130 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff9dd6ee-d043-41af-bcfa-8385ae786038" containerName="barbican-api-log" Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.237282 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="dff4a455-15cc-4733-adfd-0f27404e54ed" containerName="cinder-api" Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.237297 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff9dd6ee-d043-41af-bcfa-8385ae786038" 
containerName="barbican-api-log" Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.237314 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff9dd6ee-d043-41af-bcfa-8385ae786038" containerName="barbican-api" Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.237324 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="dff4a455-15cc-4733-adfd-0f27404e54ed" containerName="cinder-api-log" Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.238104 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.251798 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.252054 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.253115 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.278743 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.319949 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2561f4e-0a01-4927-96f8-ee7bef69f561-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d2561f4e-0a01-4927-96f8-ee7bef69f561\") " pod="openstack/cinder-api-0" Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.319997 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gj2cq\" (UniqueName: \"kubernetes.io/projected/d2561f4e-0a01-4927-96f8-ee7bef69f561-kube-api-access-gj2cq\") pod \"cinder-api-0\" (UID: \"d2561f4e-0a01-4927-96f8-ee7bef69f561\") " 
pod="openstack/cinder-api-0" Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.320030 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2561f4e-0a01-4927-96f8-ee7bef69f561-logs\") pod \"cinder-api-0\" (UID: \"d2561f4e-0a01-4927-96f8-ee7bef69f561\") " pod="openstack/cinder-api-0" Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.320235 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2561f4e-0a01-4927-96f8-ee7bef69f561-scripts\") pod \"cinder-api-0\" (UID: \"d2561f4e-0a01-4927-96f8-ee7bef69f561\") " pod="openstack/cinder-api-0" Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.320287 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2561f4e-0a01-4927-96f8-ee7bef69f561-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"d2561f4e-0a01-4927-96f8-ee7bef69f561\") " pod="openstack/cinder-api-0" Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.320342 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2561f4e-0a01-4927-96f8-ee7bef69f561-public-tls-certs\") pod \"cinder-api-0\" (UID: \"d2561f4e-0a01-4927-96f8-ee7bef69f561\") " pod="openstack/cinder-api-0" Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.320383 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d2561f4e-0a01-4927-96f8-ee7bef69f561-config-data-custom\") pod \"cinder-api-0\" (UID: \"d2561f4e-0a01-4927-96f8-ee7bef69f561\") " pod="openstack/cinder-api-0" Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.320486 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2561f4e-0a01-4927-96f8-ee7bef69f561-config-data\") pod \"cinder-api-0\" (UID: \"d2561f4e-0a01-4927-96f8-ee7bef69f561\") " pod="openstack/cinder-api-0" Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.320645 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d2561f4e-0a01-4927-96f8-ee7bef69f561-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d2561f4e-0a01-4927-96f8-ee7bef69f561\") " pod="openstack/cinder-api-0" Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.402389 4795 scope.go:117] "RemoveContainer" containerID="be49afdae72352ef031139ffd60ebb1ff9f28bf334ebee8797dcc75c7cc2b237" Feb 19 21:47:00 crc kubenswrapper[4795]: E0219 21:47:00.403597 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be49afdae72352ef031139ffd60ebb1ff9f28bf334ebee8797dcc75c7cc2b237\": container with ID starting with be49afdae72352ef031139ffd60ebb1ff9f28bf334ebee8797dcc75c7cc2b237 not found: ID does not exist" containerID="be49afdae72352ef031139ffd60ebb1ff9f28bf334ebee8797dcc75c7cc2b237" Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.403650 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be49afdae72352ef031139ffd60ebb1ff9f28bf334ebee8797dcc75c7cc2b237"} err="failed to get container status \"be49afdae72352ef031139ffd60ebb1ff9f28bf334ebee8797dcc75c7cc2b237\": rpc error: code = NotFound desc = could not find container \"be49afdae72352ef031139ffd60ebb1ff9f28bf334ebee8797dcc75c7cc2b237\": container with ID starting with be49afdae72352ef031139ffd60ebb1ff9f28bf334ebee8797dcc75c7cc2b237 not found: ID does not exist" Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.403680 4795 scope.go:117] "RemoveContainer" 
containerID="de5cd4ebec90cb75563e48bb778e5b039f23386342e708db9e76d781fc48ea14" Feb 19 21:47:00 crc kubenswrapper[4795]: E0219 21:47:00.407465 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de5cd4ebec90cb75563e48bb778e5b039f23386342e708db9e76d781fc48ea14\": container with ID starting with de5cd4ebec90cb75563e48bb778e5b039f23386342e708db9e76d781fc48ea14 not found: ID does not exist" containerID="de5cd4ebec90cb75563e48bb778e5b039f23386342e708db9e76d781fc48ea14" Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.407499 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de5cd4ebec90cb75563e48bb778e5b039f23386342e708db9e76d781fc48ea14"} err="failed to get container status \"de5cd4ebec90cb75563e48bb778e5b039f23386342e708db9e76d781fc48ea14\": rpc error: code = NotFound desc = could not find container \"de5cd4ebec90cb75563e48bb778e5b039f23386342e708db9e76d781fc48ea14\": container with ID starting with de5cd4ebec90cb75563e48bb778e5b039f23386342e708db9e76d781fc48ea14 not found: ID does not exist" Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.407520 4795 scope.go:117] "RemoveContainer" containerID="be49afdae72352ef031139ffd60ebb1ff9f28bf334ebee8797dcc75c7cc2b237" Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.407875 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be49afdae72352ef031139ffd60ebb1ff9f28bf334ebee8797dcc75c7cc2b237"} err="failed to get container status \"be49afdae72352ef031139ffd60ebb1ff9f28bf334ebee8797dcc75c7cc2b237\": rpc error: code = NotFound desc = could not find container \"be49afdae72352ef031139ffd60ebb1ff9f28bf334ebee8797dcc75c7cc2b237\": container with ID starting with be49afdae72352ef031139ffd60ebb1ff9f28bf334ebee8797dcc75c7cc2b237 not found: ID does not exist" Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.407929 4795 scope.go:117] 
"RemoveContainer" containerID="de5cd4ebec90cb75563e48bb778e5b039f23386342e708db9e76d781fc48ea14" Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.408531 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de5cd4ebec90cb75563e48bb778e5b039f23386342e708db9e76d781fc48ea14"} err="failed to get container status \"de5cd4ebec90cb75563e48bb778e5b039f23386342e708db9e76d781fc48ea14\": rpc error: code = NotFound desc = could not find container \"de5cd4ebec90cb75563e48bb778e5b039f23386342e708db9e76d781fc48ea14\": container with ID starting with de5cd4ebec90cb75563e48bb778e5b039f23386342e708db9e76d781fc48ea14 not found: ID does not exist" Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.422486 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2561f4e-0a01-4927-96f8-ee7bef69f561-logs\") pod \"cinder-api-0\" (UID: \"d2561f4e-0a01-4927-96f8-ee7bef69f561\") " pod="openstack/cinder-api-0" Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.422547 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2561f4e-0a01-4927-96f8-ee7bef69f561-scripts\") pod \"cinder-api-0\" (UID: \"d2561f4e-0a01-4927-96f8-ee7bef69f561\") " pod="openstack/cinder-api-0" Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.422566 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2561f4e-0a01-4927-96f8-ee7bef69f561-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"d2561f4e-0a01-4927-96f8-ee7bef69f561\") " pod="openstack/cinder-api-0" Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.422582 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2561f4e-0a01-4927-96f8-ee7bef69f561-public-tls-certs\") pod 
\"cinder-api-0\" (UID: \"d2561f4e-0a01-4927-96f8-ee7bef69f561\") " pod="openstack/cinder-api-0" Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.422598 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d2561f4e-0a01-4927-96f8-ee7bef69f561-config-data-custom\") pod \"cinder-api-0\" (UID: \"d2561f4e-0a01-4927-96f8-ee7bef69f561\") " pod="openstack/cinder-api-0" Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.422635 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2561f4e-0a01-4927-96f8-ee7bef69f561-config-data\") pod \"cinder-api-0\" (UID: \"d2561f4e-0a01-4927-96f8-ee7bef69f561\") " pod="openstack/cinder-api-0" Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.422687 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d2561f4e-0a01-4927-96f8-ee7bef69f561-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d2561f4e-0a01-4927-96f8-ee7bef69f561\") " pod="openstack/cinder-api-0" Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.422732 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2561f4e-0a01-4927-96f8-ee7bef69f561-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d2561f4e-0a01-4927-96f8-ee7bef69f561\") " pod="openstack/cinder-api-0" Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.422758 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gj2cq\" (UniqueName: \"kubernetes.io/projected/d2561f4e-0a01-4927-96f8-ee7bef69f561-kube-api-access-gj2cq\") pod \"cinder-api-0\" (UID: \"d2561f4e-0a01-4927-96f8-ee7bef69f561\") " pod="openstack/cinder-api-0" Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.423283 4795 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2561f4e-0a01-4927-96f8-ee7bef69f561-logs\") pod \"cinder-api-0\" (UID: \"d2561f4e-0a01-4927-96f8-ee7bef69f561\") " pod="openstack/cinder-api-0" Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.423352 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d2561f4e-0a01-4927-96f8-ee7bef69f561-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d2561f4e-0a01-4927-96f8-ee7bef69f561\") " pod="openstack/cinder-api-0" Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.432610 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2561f4e-0a01-4927-96f8-ee7bef69f561-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"d2561f4e-0a01-4927-96f8-ee7bef69f561\") " pod="openstack/cinder-api-0" Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.432786 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d2561f4e-0a01-4927-96f8-ee7bef69f561-config-data-custom\") pod \"cinder-api-0\" (UID: \"d2561f4e-0a01-4927-96f8-ee7bef69f561\") " pod="openstack/cinder-api-0" Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.439428 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2561f4e-0a01-4927-96f8-ee7bef69f561-config-data\") pod \"cinder-api-0\" (UID: \"d2561f4e-0a01-4927-96f8-ee7bef69f561\") " pod="openstack/cinder-api-0" Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.439658 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2561f4e-0a01-4927-96f8-ee7bef69f561-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d2561f4e-0a01-4927-96f8-ee7bef69f561\") " pod="openstack/cinder-api-0" Feb 19 21:47:00 
crc kubenswrapper[4795]: I0219 21:47:00.451833 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2561f4e-0a01-4927-96f8-ee7bef69f561-scripts\") pod \"cinder-api-0\" (UID: \"d2561f4e-0a01-4927-96f8-ee7bef69f561\") " pod="openstack/cinder-api-0" Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.458259 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2561f4e-0a01-4927-96f8-ee7bef69f561-public-tls-certs\") pod \"cinder-api-0\" (UID: \"d2561f4e-0a01-4927-96f8-ee7bef69f561\") " pod="openstack/cinder-api-0" Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.464895 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gj2cq\" (UniqueName: \"kubernetes.io/projected/d2561f4e-0a01-4927-96f8-ee7bef69f561-kube-api-access-gj2cq\") pod \"cinder-api-0\" (UID: \"d2561f4e-0a01-4927-96f8-ee7bef69f561\") " pod="openstack/cinder-api-0" Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.690765 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 19 21:47:00 crc kubenswrapper[4795]: I0219 21:47:00.934040 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 19 21:47:01 crc kubenswrapper[4795]: I0219 21:47:01.157479 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 19 21:47:01 crc kubenswrapper[4795]: I0219 21:47:01.180515 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d2561f4e-0a01-4927-96f8-ee7bef69f561","Type":"ContainerStarted","Data":"72a722e4ebd20de4e2ab880d4812af758e115c7dc2dbe4b6fadf7ad0adda880d"} Feb 19 21:47:01 crc kubenswrapper[4795]: I0219 21:47:01.185644 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2b418ec-23ae-4edd-8e61-0522a69c6be4","Type":"ContainerStarted","Data":"a592134b61717d57f63788206b6ae0f1532d3c76a4a60eee8a3ddfb80940ab9a"} Feb 19 21:47:01 crc kubenswrapper[4795]: I0219 21:47:01.523578 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dff4a455-15cc-4733-adfd-0f27404e54ed" path="/var/lib/kubelet/pods/dff4a455-15cc-4733-adfd-0f27404e54ed/volumes" Feb 19 21:47:02 crc kubenswrapper[4795]: I0219 21:47:02.196630 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2b418ec-23ae-4edd-8e61-0522a69c6be4","Type":"ContainerStarted","Data":"d19bddd1f9142f8194c0a0050a40c02459b91997d36600ebdae2e82529e31c8c"} Feb 19 21:47:02 crc kubenswrapper[4795]: I0219 21:47:02.196954 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 19 21:47:02 crc kubenswrapper[4795]: I0219 21:47:02.201394 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d2561f4e-0a01-4927-96f8-ee7bef69f561","Type":"ContainerStarted","Data":"46487241a29d4cc3bff33a03b2f13ce2e328740a30388d55d6c987233cf2d399"} Feb 19 21:47:02 
crc kubenswrapper[4795]: I0219 21:47:02.225703 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.8638595100000002 podStartE2EDuration="5.225685472s" podCreationTimestamp="2026-02-19 21:46:57 +0000 UTC" firstStartedPulling="2026-02-19 21:46:58.516183685 +0000 UTC m=+1129.708701549" lastFinishedPulling="2026-02-19 21:47:01.878009637 +0000 UTC m=+1133.070527511" observedRunningTime="2026-02-19 21:47:02.220939269 +0000 UTC m=+1133.413457123" watchObservedRunningTime="2026-02-19 21:47:02.225685472 +0000 UTC m=+1133.418203336"
Feb 19 21:47:03 crc kubenswrapper[4795]: I0219 21:47:03.211774 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d2561f4e-0a01-4927-96f8-ee7bef69f561","Type":"ContainerStarted","Data":"067a3784bb90d910b6b73dca0dc993d5c4844e46f49fddffcbfe6f467e1645d3"}
Feb 19 21:47:03 crc kubenswrapper[4795]: I0219 21:47:03.231974 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.231957312 podStartE2EDuration="3.231957312s" podCreationTimestamp="2026-02-19 21:47:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:47:03.227915988 +0000 UTC m=+1134.420433852" watchObservedRunningTime="2026-02-19 21:47:03.231957312 +0000 UTC m=+1134.424475166"
Feb 19 21:47:04 crc kubenswrapper[4795]: I0219 21:47:04.220559 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Feb 19 21:47:05 crc kubenswrapper[4795]: I0219 21:47:05.823973 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-78d7c97684-8rgnf"
Feb 19 21:47:05 crc kubenswrapper[4795]: I0219 21:47:05.990441 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6c8dc7b4d9-wgth2"
Feb 19 21:47:06 crc kubenswrapper[4795]: I0219 21:47:06.057290 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9d49dd75f-xfpr5"]
Feb 19 21:47:06 crc kubenswrapper[4795]: I0219 21:47:06.057569 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-9d49dd75f-xfpr5" podUID="6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b" containerName="dnsmasq-dns" containerID="cri-o://af8b691ea9f453a749ca6ae641397eb75df993ab21c7f57160c14bfd356a318c" gracePeriod=10
Feb 19 21:47:06 crc kubenswrapper[4795]: I0219 21:47:06.139135 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Feb 19 21:47:06 crc kubenswrapper[4795]: I0219 21:47:06.206160 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 19 21:47:06 crc kubenswrapper[4795]: I0219 21:47:06.245413 4795 generic.go:334] "Generic (PLEG): container finished" podID="6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b" containerID="af8b691ea9f453a749ca6ae641397eb75df993ab21c7f57160c14bfd356a318c" exitCode=0
Feb 19 21:47:06 crc kubenswrapper[4795]: I0219 21:47:06.245487 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9d49dd75f-xfpr5" event={"ID":"6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b","Type":"ContainerDied","Data":"af8b691ea9f453a749ca6ae641397eb75df993ab21c7f57160c14bfd356a318c"}
Feb 19 21:47:06 crc kubenswrapper[4795]: I0219 21:47:06.245617 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="2384b48a-ae68-4495-9c68-2faf894de9f9" containerName="cinder-scheduler" containerID="cri-o://749f08a4249525b1452b1f0b3c7c1bdacd64cc1b2a83d3c0d1cfa675182559ad" gracePeriod=30
Feb 19 21:47:06 crc kubenswrapper[4795]: I0219 21:47:06.245656 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="2384b48a-ae68-4495-9c68-2faf894de9f9" containerName="probe" containerID="cri-o://9e6688b1bae5051b5c2b4f82f1c19ba9f7ca8fd1dd09d8ac892122e2dcf35f0c" gracePeriod=30
Feb 19 21:47:06 crc kubenswrapper[4795]: I0219 21:47:06.561550 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9d49dd75f-xfpr5"
Feb 19 21:47:06 crc kubenswrapper[4795]: I0219 21:47:06.655723 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49w8w\" (UniqueName: \"kubernetes.io/projected/6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b-kube-api-access-49w8w\") pod \"6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b\" (UID: \"6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b\") "
Feb 19 21:47:06 crc kubenswrapper[4795]: I0219 21:47:06.655759 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b-dns-svc\") pod \"6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b\" (UID: \"6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b\") "
Feb 19 21:47:06 crc kubenswrapper[4795]: I0219 21:47:06.655789 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b-dns-swift-storage-0\") pod \"6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b\" (UID: \"6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b\") "
Feb 19 21:47:06 crc kubenswrapper[4795]: I0219 21:47:06.655871 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b-ovsdbserver-sb\") pod \"6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b\" (UID: \"6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b\") "
Feb 19 21:47:06 crc kubenswrapper[4795]: I0219 21:47:06.655997 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b-ovsdbserver-nb\") pod \"6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b\" (UID: \"6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b\") "
Feb 19 21:47:06 crc kubenswrapper[4795]: I0219 21:47:06.656016 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b-config\") pod \"6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b\" (UID: \"6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b\") "
Feb 19 21:47:06 crc kubenswrapper[4795]: I0219 21:47:06.677045 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b-kube-api-access-49w8w" (OuterVolumeSpecName: "kube-api-access-49w8w") pod "6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b" (UID: "6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b"). InnerVolumeSpecName "kube-api-access-49w8w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:47:06 crc kubenswrapper[4795]: I0219 21:47:06.710932 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b-config" (OuterVolumeSpecName: "config") pod "6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b" (UID: "6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 21:47:06 crc kubenswrapper[4795]: I0219 21:47:06.719350 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b" (UID: "6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 21:47:06 crc kubenswrapper[4795]: I0219 21:47:06.731145 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b" (UID: "6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 21:47:06 crc kubenswrapper[4795]: I0219 21:47:06.743646 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b" (UID: "6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 21:47:06 crc kubenswrapper[4795]: I0219 21:47:06.748624 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b" (UID: "6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 21:47:06 crc kubenswrapper[4795]: I0219 21:47:06.758473 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 19 21:47:06 crc kubenswrapper[4795]: I0219 21:47:06.758531 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b-config\") on node \"crc\" DevicePath \"\""
Feb 19 21:47:06 crc kubenswrapper[4795]: I0219 21:47:06.758542 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49w8w\" (UniqueName: \"kubernetes.io/projected/6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b-kube-api-access-49w8w\") on node \"crc\" DevicePath \"\""
Feb 19 21:47:06 crc kubenswrapper[4795]: I0219 21:47:06.758552 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 19 21:47:06 crc kubenswrapper[4795]: I0219 21:47:06.758561 4795 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 19 21:47:06 crc kubenswrapper[4795]: I0219 21:47:06.758569 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 19 21:47:07 crc kubenswrapper[4795]: I0219 21:47:07.256528 4795 generic.go:334] "Generic (PLEG): container finished" podID="2384b48a-ae68-4495-9c68-2faf894de9f9" containerID="9e6688b1bae5051b5c2b4f82f1c19ba9f7ca8fd1dd09d8ac892122e2dcf35f0c" exitCode=0
Feb 19 21:47:07 crc kubenswrapper[4795]: I0219 21:47:07.256607 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2384b48a-ae68-4495-9c68-2faf894de9f9","Type":"ContainerDied","Data":"9e6688b1bae5051b5c2b4f82f1c19ba9f7ca8fd1dd09d8ac892122e2dcf35f0c"}
Feb 19 21:47:07 crc kubenswrapper[4795]: I0219 21:47:07.258572 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9d49dd75f-xfpr5" event={"ID":"6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b","Type":"ContainerDied","Data":"e42790fe679a04fe8e1a343d997c7b72c08e413899c7c4625b70ee788a6866db"}
Feb 19 21:47:07 crc kubenswrapper[4795]: I0219 21:47:07.258614 4795 scope.go:117] "RemoveContainer" containerID="af8b691ea9f453a749ca6ae641397eb75df993ab21c7f57160c14bfd356a318c"
Feb 19 21:47:07 crc kubenswrapper[4795]: I0219 21:47:07.258637 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9d49dd75f-xfpr5"
Feb 19 21:47:07 crc kubenswrapper[4795]: I0219 21:47:07.287090 4795 scope.go:117] "RemoveContainer" containerID="81fbd576f753564d8cc96fbaca875356e50537cd709002f5a9b77dc1f35e7279"
Feb 19 21:47:07 crc kubenswrapper[4795]: I0219 21:47:07.290029 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9d49dd75f-xfpr5"]
Feb 19 21:47:07 crc kubenswrapper[4795]: I0219 21:47:07.308782 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-9d49dd75f-xfpr5"]
Feb 19 21:47:07 crc kubenswrapper[4795]: I0219 21:47:07.525855 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b" path="/var/lib/kubelet/pods/6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b/volumes"
Feb 19 21:47:07 crc kubenswrapper[4795]: I0219 21:47:07.693213 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-76bfdcc9c4-d56mx"
Feb 19 21:47:07 crc kubenswrapper[4795]: I0219 21:47:07.733867 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-76bfdcc9c4-d56mx"
Feb 19 21:47:09 crc kubenswrapper[4795]: I0219 21:47:09.625872 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-6945f64f65-rnq2b"
Feb 19 21:47:09 crc kubenswrapper[4795]: I0219 21:47:09.925607 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.014684 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2384b48a-ae68-4495-9c68-2faf894de9f9-combined-ca-bundle\") pod \"2384b48a-ae68-4495-9c68-2faf894de9f9\" (UID: \"2384b48a-ae68-4495-9c68-2faf894de9f9\") "
Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.014761 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5clv6\" (UniqueName: \"kubernetes.io/projected/2384b48a-ae68-4495-9c68-2faf894de9f9-kube-api-access-5clv6\") pod \"2384b48a-ae68-4495-9c68-2faf894de9f9\" (UID: \"2384b48a-ae68-4495-9c68-2faf894de9f9\") "
Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.014802 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2384b48a-ae68-4495-9c68-2faf894de9f9-config-data-custom\") pod \"2384b48a-ae68-4495-9c68-2faf894de9f9\" (UID: \"2384b48a-ae68-4495-9c68-2faf894de9f9\") "
Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.014855 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2384b48a-ae68-4495-9c68-2faf894de9f9-config-data\") pod \"2384b48a-ae68-4495-9c68-2faf894de9f9\" (UID: \"2384b48a-ae68-4495-9c68-2faf894de9f9\") "
Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.014950 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2384b48a-ae68-4495-9c68-2faf894de9f9-scripts\") pod \"2384b48a-ae68-4495-9c68-2faf894de9f9\" (UID: \"2384b48a-ae68-4495-9c68-2faf894de9f9\") "
Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.014970 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2384b48a-ae68-4495-9c68-2faf894de9f9-etc-machine-id\") pod \"2384b48a-ae68-4495-9c68-2faf894de9f9\" (UID: \"2384b48a-ae68-4495-9c68-2faf894de9f9\") "
Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.015415 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2384b48a-ae68-4495-9c68-2faf894de9f9-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "2384b48a-ae68-4495-9c68-2faf894de9f9" (UID: "2384b48a-ae68-4495-9c68-2faf894de9f9"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.019959 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2384b48a-ae68-4495-9c68-2faf894de9f9-scripts" (OuterVolumeSpecName: "scripts") pod "2384b48a-ae68-4495-9c68-2faf894de9f9" (UID: "2384b48a-ae68-4495-9c68-2faf894de9f9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.020023 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2384b48a-ae68-4495-9c68-2faf894de9f9-kube-api-access-5clv6" (OuterVolumeSpecName: "kube-api-access-5clv6") pod "2384b48a-ae68-4495-9c68-2faf894de9f9" (UID: "2384b48a-ae68-4495-9c68-2faf894de9f9"). InnerVolumeSpecName "kube-api-access-5clv6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.020364 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2384b48a-ae68-4495-9c68-2faf894de9f9-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2384b48a-ae68-4495-9c68-2faf894de9f9" (UID: "2384b48a-ae68-4495-9c68-2faf894de9f9"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.046593 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-576c65f985-r97z7"
Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.121769 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2384b48a-ae68-4495-9c68-2faf894de9f9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2384b48a-ae68-4495-9c68-2faf894de9f9" (UID: "2384b48a-ae68-4495-9c68-2faf894de9f9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.124901 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2384b48a-ae68-4495-9c68-2faf894de9f9-combined-ca-bundle\") pod \"2384b48a-ae68-4495-9c68-2faf894de9f9\" (UID: \"2384b48a-ae68-4495-9c68-2faf894de9f9\") "
Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.125532 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2384b48a-ae68-4495-9c68-2faf894de9f9-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.125546 4795 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2384b48a-ae68-4495-9c68-2faf894de9f9-etc-machine-id\") on node \"crc\" DevicePath \"\""
Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.125565 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5clv6\" (UniqueName: \"kubernetes.io/projected/2384b48a-ae68-4495-9c68-2faf894de9f9-kube-api-access-5clv6\") on node \"crc\" DevicePath \"\""
Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.125574 4795 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2384b48a-ae68-4495-9c68-2faf894de9f9-config-data-custom\") on node \"crc\" DevicePath \"\""
Feb 19 21:47:10 crc kubenswrapper[4795]: W0219 21:47:10.126046 4795 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/2384b48a-ae68-4495-9c68-2faf894de9f9/volumes/kubernetes.io~secret/combined-ca-bundle
Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.126056 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2384b48a-ae68-4495-9c68-2faf894de9f9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2384b48a-ae68-4495-9c68-2faf894de9f9" (UID: "2384b48a-ae68-4495-9c68-2faf894de9f9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.127155 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-78d7c97684-8rgnf"]
Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.127362 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-78d7c97684-8rgnf" podUID="2af40d0f-93fe-4592-a07b-0cee3eefbde5" containerName="neutron-api" containerID="cri-o://45b6d7396950a55ef31e5b8bf7af7c2f0a2555905f99260630834dbe1b48c393" gracePeriod=30
Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.127821 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-78d7c97684-8rgnf" podUID="2af40d0f-93fe-4592-a07b-0cee3eefbde5" containerName="neutron-httpd" containerID="cri-o://2b75649a0876fc8b2fab0d3c547c6041ea1386b0e2f40ee0a4dc2afb93521ff9" gracePeriod=30
Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.187971 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2384b48a-ae68-4495-9c68-2faf894de9f9-config-data" (OuterVolumeSpecName: "config-data") pod "2384b48a-ae68-4495-9c68-2faf894de9f9" (UID: "2384b48a-ae68-4495-9c68-2faf894de9f9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.226557 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2384b48a-ae68-4495-9c68-2faf894de9f9-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.226583 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2384b48a-ae68-4495-9c68-2faf894de9f9-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.293999 4795 generic.go:334] "Generic (PLEG): container finished" podID="2384b48a-ae68-4495-9c68-2faf894de9f9" containerID="749f08a4249525b1452b1f0b3c7c1bdacd64cc1b2a83d3c0d1cfa675182559ad" exitCode=0
Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.294075 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2384b48a-ae68-4495-9c68-2faf894de9f9","Type":"ContainerDied","Data":"749f08a4249525b1452b1f0b3c7c1bdacd64cc1b2a83d3c0d1cfa675182559ad"}
Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.294112 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2384b48a-ae68-4495-9c68-2faf894de9f9","Type":"ContainerDied","Data":"1bb0109c274477813b9f48090bc2380bc85368bf7c3040d3068fd48ba9c4524c"}
Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.294135 4795 scope.go:117] "RemoveContainer" containerID="9e6688b1bae5051b5c2b4f82f1c19ba9f7ca8fd1dd09d8ac892122e2dcf35f0c"
Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.294330 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.307137 4795 generic.go:334] "Generic (PLEG): container finished" podID="2af40d0f-93fe-4592-a07b-0cee3eefbde5" containerID="2b75649a0876fc8b2fab0d3c547c6041ea1386b0e2f40ee0a4dc2afb93521ff9" exitCode=0
Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.307203 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-78d7c97684-8rgnf" event={"ID":"2af40d0f-93fe-4592-a07b-0cee3eefbde5","Type":"ContainerDied","Data":"2b75649a0876fc8b2fab0d3c547c6041ea1386b0e2f40ee0a4dc2afb93521ff9"}
Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.340665 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.361578 4795 scope.go:117] "RemoveContainer" containerID="749f08a4249525b1452b1f0b3c7c1bdacd64cc1b2a83d3c0d1cfa675182559ad"
Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.366580 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.396404 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 19 21:47:10 crc kubenswrapper[4795]: E0219 21:47:10.396752 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b" containerName="dnsmasq-dns"
Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.396769 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b" containerName="dnsmasq-dns"
Feb 19 21:47:10 crc kubenswrapper[4795]: E0219 21:47:10.396785 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2384b48a-ae68-4495-9c68-2faf894de9f9" containerName="probe"
Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.396792 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2384b48a-ae68-4495-9c68-2faf894de9f9" containerName="probe"
Feb 19 21:47:10 crc kubenswrapper[4795]: E0219 21:47:10.396811 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b" containerName="init"
Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.396817 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b" containerName="init"
Feb 19 21:47:10 crc kubenswrapper[4795]: E0219 21:47:10.396829 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2384b48a-ae68-4495-9c68-2faf894de9f9" containerName="cinder-scheduler"
Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.396835 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2384b48a-ae68-4495-9c68-2faf894de9f9" containerName="cinder-scheduler"
Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.396990 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="2384b48a-ae68-4495-9c68-2faf894de9f9" containerName="cinder-scheduler"
Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.397006 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="2384b48a-ae68-4495-9c68-2faf894de9f9" containerName="probe"
Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.397020 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ae0e35c-6ee4-4e25-a76d-7033c2a3f09b" containerName="dnsmasq-dns"
Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.397866 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.405375 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.409292 4795 scope.go:117] "RemoveContainer" containerID="9e6688b1bae5051b5c2b4f82f1c19ba9f7ca8fd1dd09d8ac892122e2dcf35f0c"
Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.409987 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 19 21:47:10 crc kubenswrapper[4795]: E0219 21:47:10.414138 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e6688b1bae5051b5c2b4f82f1c19ba9f7ca8fd1dd09d8ac892122e2dcf35f0c\": container with ID starting with 9e6688b1bae5051b5c2b4f82f1c19ba9f7ca8fd1dd09d8ac892122e2dcf35f0c not found: ID does not exist" containerID="9e6688b1bae5051b5c2b4f82f1c19ba9f7ca8fd1dd09d8ac892122e2dcf35f0c"
Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.414251 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e6688b1bae5051b5c2b4f82f1c19ba9f7ca8fd1dd09d8ac892122e2dcf35f0c"} err="failed to get container status \"9e6688b1bae5051b5c2b4f82f1c19ba9f7ca8fd1dd09d8ac892122e2dcf35f0c\": rpc error: code = NotFound desc = could not find container \"9e6688b1bae5051b5c2b4f82f1c19ba9f7ca8fd1dd09d8ac892122e2dcf35f0c\": container with ID starting with 9e6688b1bae5051b5c2b4f82f1c19ba9f7ca8fd1dd09d8ac892122e2dcf35f0c not found: ID does not exist"
Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.414273 4795 scope.go:117] "RemoveContainer" containerID="749f08a4249525b1452b1f0b3c7c1bdacd64cc1b2a83d3c0d1cfa675182559ad"
Feb 19 21:47:10 crc kubenswrapper[4795]: E0219 21:47:10.418379 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"749f08a4249525b1452b1f0b3c7c1bdacd64cc1b2a83d3c0d1cfa675182559ad\": container with ID starting with 749f08a4249525b1452b1f0b3c7c1bdacd64cc1b2a83d3c0d1cfa675182559ad not found: ID does not exist" containerID="749f08a4249525b1452b1f0b3c7c1bdacd64cc1b2a83d3c0d1cfa675182559ad"
Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.418437 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"749f08a4249525b1452b1f0b3c7c1bdacd64cc1b2a83d3c0d1cfa675182559ad"} err="failed to get container status \"749f08a4249525b1452b1f0b3c7c1bdacd64cc1b2a83d3c0d1cfa675182559ad\": rpc error: code = NotFound desc = could not find container \"749f08a4249525b1452b1f0b3c7c1bdacd64cc1b2a83d3c0d1cfa675182559ad\": container with ID starting with 749f08a4249525b1452b1f0b3c7c1bdacd64cc1b2a83d3c0d1cfa675182559ad not found: ID does not exist"
Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.439194 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c54f77a4-1095-4ff1-bc74-b845cde659d9-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c54f77a4-1095-4ff1-bc74-b845cde659d9\") " pod="openstack/cinder-scheduler-0"
Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.439269 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c54f77a4-1095-4ff1-bc74-b845cde659d9-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c54f77a4-1095-4ff1-bc74-b845cde659d9\") " pod="openstack/cinder-scheduler-0"
Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.439287 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c54f77a4-1095-4ff1-bc74-b845cde659d9-scripts\") pod \"cinder-scheduler-0\" (UID: \"c54f77a4-1095-4ff1-bc74-b845cde659d9\") " pod="openstack/cinder-scheduler-0"
Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.439318 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c54f77a4-1095-4ff1-bc74-b845cde659d9-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"c54f77a4-1095-4ff1-bc74-b845cde659d9\") " pod="openstack/cinder-scheduler-0"
Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.439366 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4bdc\" (UniqueName: \"kubernetes.io/projected/c54f77a4-1095-4ff1-bc74-b845cde659d9-kube-api-access-b4bdc\") pod \"cinder-scheduler-0\" (UID: \"c54f77a4-1095-4ff1-bc74-b845cde659d9\") " pod="openstack/cinder-scheduler-0"
Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.439407 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c54f77a4-1095-4ff1-bc74-b845cde659d9-config-data\") pod \"cinder-scheduler-0\" (UID: \"c54f77a4-1095-4ff1-bc74-b845cde659d9\") " pod="openstack/cinder-scheduler-0"
Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.543368 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c54f77a4-1095-4ff1-bc74-b845cde659d9-config-data\") pod \"cinder-scheduler-0\" (UID: \"c54f77a4-1095-4ff1-bc74-b845cde659d9\") " pod="openstack/cinder-scheduler-0"
Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.543474 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c54f77a4-1095-4ff1-bc74-b845cde659d9-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c54f77a4-1095-4ff1-bc74-b845cde659d9\") " pod="openstack/cinder-scheduler-0"
Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.543527 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c54f77a4-1095-4ff1-bc74-b845cde659d9-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c54f77a4-1095-4ff1-bc74-b845cde659d9\") " pod="openstack/cinder-scheduler-0"
Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.543546 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c54f77a4-1095-4ff1-bc74-b845cde659d9-scripts\") pod \"cinder-scheduler-0\" (UID: \"c54f77a4-1095-4ff1-bc74-b845cde659d9\") " pod="openstack/cinder-scheduler-0"
Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.543580 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c54f77a4-1095-4ff1-bc74-b845cde659d9-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"c54f77a4-1095-4ff1-bc74-b845cde659d9\") " pod="openstack/cinder-scheduler-0"
Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.543632 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4bdc\" (UniqueName: \"kubernetes.io/projected/c54f77a4-1095-4ff1-bc74-b845cde659d9-kube-api-access-b4bdc\") pod \"cinder-scheduler-0\" (UID: \"c54f77a4-1095-4ff1-bc74-b845cde659d9\") " pod="openstack/cinder-scheduler-0"
Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.544272 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c54f77a4-1095-4ff1-bc74-b845cde659d9-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c54f77a4-1095-4ff1-bc74-b845cde659d9\") " pod="openstack/cinder-scheduler-0"
Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.549486 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c54f77a4-1095-4ff1-bc74-b845cde659d9-scripts\") pod \"cinder-scheduler-0\" (UID: \"c54f77a4-1095-4ff1-bc74-b845cde659d9\") " pod="openstack/cinder-scheduler-0"
Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.551099 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c54f77a4-1095-4ff1-bc74-b845cde659d9-config-data\") pod \"cinder-scheduler-0\" (UID: \"c54f77a4-1095-4ff1-bc74-b845cde659d9\") " pod="openstack/cinder-scheduler-0"
Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.551895 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c54f77a4-1095-4ff1-bc74-b845cde659d9-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"c54f77a4-1095-4ff1-bc74-b845cde659d9\") " pod="openstack/cinder-scheduler-0"
Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.552689 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c54f77a4-1095-4ff1-bc74-b845cde659d9-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c54f77a4-1095-4ff1-bc74-b845cde659d9\") " pod="openstack/cinder-scheduler-0"
Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.565660 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4bdc\" (UniqueName: \"kubernetes.io/projected/c54f77a4-1095-4ff1-bc74-b845cde659d9-kube-api-access-b4bdc\") pod \"cinder-scheduler-0\" (UID: \"c54f77a4-1095-4ff1-bc74-b845cde659d9\") " pod="openstack/cinder-scheduler-0"
Feb 19 21:47:10 crc kubenswrapper[4795]: I0219 21:47:10.728583 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Feb 19 21:47:11 crc kubenswrapper[4795]: W0219 21:47:11.350597 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc54f77a4_1095_4ff1_bc74_b845cde659d9.slice/crio-edafb0a067aa15654ff7c76ec822cff24dc2310ed79e9d7093d41cbc935fc540 WatchSource:0}: Error finding container edafb0a067aa15654ff7c76ec822cff24dc2310ed79e9d7093d41cbc935fc540: Status 404 returned error can't find the container with id edafb0a067aa15654ff7c76ec822cff24dc2310ed79e9d7093d41cbc935fc540
Feb 19 21:47:11 crc kubenswrapper[4795]: I0219 21:47:11.354040 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 19 21:47:11 crc kubenswrapper[4795]: I0219 21:47:11.357438 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6df95dfbd4-ftf6x"
Feb 19 21:47:11 crc kubenswrapper[4795]: I0219 21:47:11.381096 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6df95dfbd4-ftf6x"
Feb 19 21:47:11 crc kubenswrapper[4795]: I0219 21:47:11.450453 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-76bfdcc9c4-d56mx"]
Feb 19 21:47:11 crc kubenswrapper[4795]: I0219 21:47:11.454357 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-76bfdcc9c4-d56mx" podUID="bd5855e1-cadb-4170-8339-5f10945c6ce9" containerName="placement-log" containerID="cri-o://07e0d10967942d332970ea7b450b84e11544b898580db0b6d739db3ac549d6ec" gracePeriod=30
Feb 19 21:47:11 crc kubenswrapper[4795]: I0219 21:47:11.454534 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-76bfdcc9c4-d56mx" podUID="bd5855e1-cadb-4170-8339-5f10945c6ce9" containerName="placement-api" containerID="cri-o://79eaa173c80318b05a89d976dd736c832c6c7ad22a55bf02162ef029353efbf8" gracePeriod=30
Feb
19 21:47:11 crc kubenswrapper[4795]: I0219 21:47:11.531416 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2384b48a-ae68-4495-9c68-2faf894de9f9" path="/var/lib/kubelet/pods/2384b48a-ae68-4495-9c68-2faf894de9f9/volumes" Feb 19 21:47:12 crc kubenswrapper[4795]: I0219 21:47:12.116092 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-78d7c97684-8rgnf" Feb 19 21:47:12 crc kubenswrapper[4795]: I0219 21:47:12.185940 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kp5fp\" (UniqueName: \"kubernetes.io/projected/2af40d0f-93fe-4592-a07b-0cee3eefbde5-kube-api-access-kp5fp\") pod \"2af40d0f-93fe-4592-a07b-0cee3eefbde5\" (UID: \"2af40d0f-93fe-4592-a07b-0cee3eefbde5\") " Feb 19 21:47:12 crc kubenswrapper[4795]: I0219 21:47:12.186287 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2af40d0f-93fe-4592-a07b-0cee3eefbde5-combined-ca-bundle\") pod \"2af40d0f-93fe-4592-a07b-0cee3eefbde5\" (UID: \"2af40d0f-93fe-4592-a07b-0cee3eefbde5\") " Feb 19 21:47:12 crc kubenswrapper[4795]: I0219 21:47:12.186350 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2af40d0f-93fe-4592-a07b-0cee3eefbde5-httpd-config\") pod \"2af40d0f-93fe-4592-a07b-0cee3eefbde5\" (UID: \"2af40d0f-93fe-4592-a07b-0cee3eefbde5\") " Feb 19 21:47:12 crc kubenswrapper[4795]: I0219 21:47:12.186400 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2af40d0f-93fe-4592-a07b-0cee3eefbde5-config\") pod \"2af40d0f-93fe-4592-a07b-0cee3eefbde5\" (UID: \"2af40d0f-93fe-4592-a07b-0cee3eefbde5\") " Feb 19 21:47:12 crc kubenswrapper[4795]: I0219 21:47:12.186590 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2af40d0f-93fe-4592-a07b-0cee3eefbde5-ovndb-tls-certs\") pod \"2af40d0f-93fe-4592-a07b-0cee3eefbde5\" (UID: \"2af40d0f-93fe-4592-a07b-0cee3eefbde5\") " Feb 19 21:47:12 crc kubenswrapper[4795]: I0219 21:47:12.197349 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2af40d0f-93fe-4592-a07b-0cee3eefbde5-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "2af40d0f-93fe-4592-a07b-0cee3eefbde5" (UID: "2af40d0f-93fe-4592-a07b-0cee3eefbde5"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:47:12 crc kubenswrapper[4795]: I0219 21:47:12.198281 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2af40d0f-93fe-4592-a07b-0cee3eefbde5-kube-api-access-kp5fp" (OuterVolumeSpecName: "kube-api-access-kp5fp") pod "2af40d0f-93fe-4592-a07b-0cee3eefbde5" (UID: "2af40d0f-93fe-4592-a07b-0cee3eefbde5"). InnerVolumeSpecName "kube-api-access-kp5fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:47:12 crc kubenswrapper[4795]: I0219 21:47:12.240359 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2af40d0f-93fe-4592-a07b-0cee3eefbde5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2af40d0f-93fe-4592-a07b-0cee3eefbde5" (UID: "2af40d0f-93fe-4592-a07b-0cee3eefbde5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:47:12 crc kubenswrapper[4795]: I0219 21:47:12.240912 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2af40d0f-93fe-4592-a07b-0cee3eefbde5-config" (OuterVolumeSpecName: "config") pod "2af40d0f-93fe-4592-a07b-0cee3eefbde5" (UID: "2af40d0f-93fe-4592-a07b-0cee3eefbde5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:47:12 crc kubenswrapper[4795]: I0219 21:47:12.272443 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2af40d0f-93fe-4592-a07b-0cee3eefbde5-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "2af40d0f-93fe-4592-a07b-0cee3eefbde5" (UID: "2af40d0f-93fe-4592-a07b-0cee3eefbde5"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:47:12 crc kubenswrapper[4795]: I0219 21:47:12.288377 4795 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2af40d0f-93fe-4592-a07b-0cee3eefbde5-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:12 crc kubenswrapper[4795]: I0219 21:47:12.288410 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kp5fp\" (UniqueName: \"kubernetes.io/projected/2af40d0f-93fe-4592-a07b-0cee3eefbde5-kube-api-access-kp5fp\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:12 crc kubenswrapper[4795]: I0219 21:47:12.288424 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2af40d0f-93fe-4592-a07b-0cee3eefbde5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:12 crc kubenswrapper[4795]: I0219 21:47:12.288432 4795 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2af40d0f-93fe-4592-a07b-0cee3eefbde5-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:12 crc kubenswrapper[4795]: I0219 21:47:12.288442 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/2af40d0f-93fe-4592-a07b-0cee3eefbde5-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:12 crc kubenswrapper[4795]: I0219 21:47:12.326682 4795 generic.go:334] "Generic (PLEG): container finished" podID="2af40d0f-93fe-4592-a07b-0cee3eefbde5" 
containerID="45b6d7396950a55ef31e5b8bf7af7c2f0a2555905f99260630834dbe1b48c393" exitCode=0 Feb 19 21:47:12 crc kubenswrapper[4795]: I0219 21:47:12.326725 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-78d7c97684-8rgnf" event={"ID":"2af40d0f-93fe-4592-a07b-0cee3eefbde5","Type":"ContainerDied","Data":"45b6d7396950a55ef31e5b8bf7af7c2f0a2555905f99260630834dbe1b48c393"} Feb 19 21:47:12 crc kubenswrapper[4795]: I0219 21:47:12.326771 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-78d7c97684-8rgnf" event={"ID":"2af40d0f-93fe-4592-a07b-0cee3eefbde5","Type":"ContainerDied","Data":"0c9496366294be7a896bc72ec610b06b4b589bece135bb9b445591c9fe5a825f"} Feb 19 21:47:12 crc kubenswrapper[4795]: I0219 21:47:12.326766 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-78d7c97684-8rgnf" Feb 19 21:47:12 crc kubenswrapper[4795]: I0219 21:47:12.326786 4795 scope.go:117] "RemoveContainer" containerID="2b75649a0876fc8b2fab0d3c547c6041ea1386b0e2f40ee0a4dc2afb93521ff9" Feb 19 21:47:12 crc kubenswrapper[4795]: I0219 21:47:12.328976 4795 generic.go:334] "Generic (PLEG): container finished" podID="bd5855e1-cadb-4170-8339-5f10945c6ce9" containerID="07e0d10967942d332970ea7b450b84e11544b898580db0b6d739db3ac549d6ec" exitCode=143 Feb 19 21:47:12 crc kubenswrapper[4795]: I0219 21:47:12.329032 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-76bfdcc9c4-d56mx" event={"ID":"bd5855e1-cadb-4170-8339-5f10945c6ce9","Type":"ContainerDied","Data":"07e0d10967942d332970ea7b450b84e11544b898580db0b6d739db3ac549d6ec"} Feb 19 21:47:12 crc kubenswrapper[4795]: I0219 21:47:12.333421 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c54f77a4-1095-4ff1-bc74-b845cde659d9","Type":"ContainerStarted","Data":"5256ff7cce2d8a61bc1b79dbf58379aff2f08fd19479fafdd317965f24177cc8"} Feb 19 21:47:12 crc kubenswrapper[4795]: I0219 
21:47:12.333471 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c54f77a4-1095-4ff1-bc74-b845cde659d9","Type":"ContainerStarted","Data":"edafb0a067aa15654ff7c76ec822cff24dc2310ed79e9d7093d41cbc935fc540"} Feb 19 21:47:12 crc kubenswrapper[4795]: I0219 21:47:12.359902 4795 scope.go:117] "RemoveContainer" containerID="45b6d7396950a55ef31e5b8bf7af7c2f0a2555905f99260630834dbe1b48c393" Feb 19 21:47:12 crc kubenswrapper[4795]: I0219 21:47:12.361707 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-78d7c97684-8rgnf"] Feb 19 21:47:12 crc kubenswrapper[4795]: I0219 21:47:12.373393 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-78d7c97684-8rgnf"] Feb 19 21:47:12 crc kubenswrapper[4795]: I0219 21:47:12.387641 4795 scope.go:117] "RemoveContainer" containerID="2b75649a0876fc8b2fab0d3c547c6041ea1386b0e2f40ee0a4dc2afb93521ff9" Feb 19 21:47:12 crc kubenswrapper[4795]: E0219 21:47:12.388052 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b75649a0876fc8b2fab0d3c547c6041ea1386b0e2f40ee0a4dc2afb93521ff9\": container with ID starting with 2b75649a0876fc8b2fab0d3c547c6041ea1386b0e2f40ee0a4dc2afb93521ff9 not found: ID does not exist" containerID="2b75649a0876fc8b2fab0d3c547c6041ea1386b0e2f40ee0a4dc2afb93521ff9" Feb 19 21:47:12 crc kubenswrapper[4795]: I0219 21:47:12.388082 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b75649a0876fc8b2fab0d3c547c6041ea1386b0e2f40ee0a4dc2afb93521ff9"} err="failed to get container status \"2b75649a0876fc8b2fab0d3c547c6041ea1386b0e2f40ee0a4dc2afb93521ff9\": rpc error: code = NotFound desc = could not find container \"2b75649a0876fc8b2fab0d3c547c6041ea1386b0e2f40ee0a4dc2afb93521ff9\": container with ID starting with 2b75649a0876fc8b2fab0d3c547c6041ea1386b0e2f40ee0a4dc2afb93521ff9 not found: ID does not exist" Feb 
19 21:47:12 crc kubenswrapper[4795]: I0219 21:47:12.388102 4795 scope.go:117] "RemoveContainer" containerID="45b6d7396950a55ef31e5b8bf7af7c2f0a2555905f99260630834dbe1b48c393" Feb 19 21:47:12 crc kubenswrapper[4795]: E0219 21:47:12.388715 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45b6d7396950a55ef31e5b8bf7af7c2f0a2555905f99260630834dbe1b48c393\": container with ID starting with 45b6d7396950a55ef31e5b8bf7af7c2f0a2555905f99260630834dbe1b48c393 not found: ID does not exist" containerID="45b6d7396950a55ef31e5b8bf7af7c2f0a2555905f99260630834dbe1b48c393" Feb 19 21:47:12 crc kubenswrapper[4795]: I0219 21:47:12.388736 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45b6d7396950a55ef31e5b8bf7af7c2f0a2555905f99260630834dbe1b48c393"} err="failed to get container status \"45b6d7396950a55ef31e5b8bf7af7c2f0a2555905f99260630834dbe1b48c393\": rpc error: code = NotFound desc = could not find container \"45b6d7396950a55ef31e5b8bf7af7c2f0a2555905f99260630834dbe1b48c393\": container with ID starting with 45b6d7396950a55ef31e5b8bf7af7c2f0a2555905f99260630834dbe1b48c393 not found: ID does not exist" Feb 19 21:47:13 crc kubenswrapper[4795]: I0219 21:47:13.183757 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Feb 19 21:47:13 crc kubenswrapper[4795]: I0219 21:47:13.345753 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c54f77a4-1095-4ff1-bc74-b845cde659d9","Type":"ContainerStarted","Data":"b84203bdd91f64f0aed89ae5d8334985c117eb72cc298fbd0865d60f726218c1"} Feb 19 21:47:13 crc kubenswrapper[4795]: I0219 21:47:13.367752 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.367734398 podStartE2EDuration="3.367734398s" podCreationTimestamp="2026-02-19 21:47:10 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:47:13.367238394 +0000 UTC m=+1144.559756268" watchObservedRunningTime="2026-02-19 21:47:13.367734398 +0000 UTC m=+1144.560252262" Feb 19 21:47:13 crc kubenswrapper[4795]: I0219 21:47:13.524679 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2af40d0f-93fe-4592-a07b-0cee3eefbde5" path="/var/lib/kubelet/pods/2af40d0f-93fe-4592-a07b-0cee3eefbde5/volumes" Feb 19 21:47:13 crc kubenswrapper[4795]: I0219 21:47:13.570856 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 19 21:47:13 crc kubenswrapper[4795]: E0219 21:47:13.571244 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2af40d0f-93fe-4592-a07b-0cee3eefbde5" containerName="neutron-httpd" Feb 19 21:47:13 crc kubenswrapper[4795]: I0219 21:47:13.571266 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2af40d0f-93fe-4592-a07b-0cee3eefbde5" containerName="neutron-httpd" Feb 19 21:47:13 crc kubenswrapper[4795]: E0219 21:47:13.571293 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2af40d0f-93fe-4592-a07b-0cee3eefbde5" containerName="neutron-api" Feb 19 21:47:13 crc kubenswrapper[4795]: I0219 21:47:13.571302 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2af40d0f-93fe-4592-a07b-0cee3eefbde5" containerName="neutron-api" Feb 19 21:47:13 crc kubenswrapper[4795]: I0219 21:47:13.571490 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="2af40d0f-93fe-4592-a07b-0cee3eefbde5" containerName="neutron-httpd" Feb 19 21:47:13 crc kubenswrapper[4795]: I0219 21:47:13.571508 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="2af40d0f-93fe-4592-a07b-0cee3eefbde5" containerName="neutron-api" Feb 19 21:47:13 crc kubenswrapper[4795]: I0219 21:47:13.572088 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 19 21:47:13 crc kubenswrapper[4795]: I0219 21:47:13.573793 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Feb 19 21:47:13 crc kubenswrapper[4795]: I0219 21:47:13.573804 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-r9sqp" Feb 19 21:47:13 crc kubenswrapper[4795]: I0219 21:47:13.575778 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Feb 19 21:47:13 crc kubenswrapper[4795]: I0219 21:47:13.581246 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 19 21:47:13 crc kubenswrapper[4795]: I0219 21:47:13.613878 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/336beec4-e534-448f-8367-78645b53650e-combined-ca-bundle\") pod \"openstackclient\" (UID: \"336beec4-e534-448f-8367-78645b53650e\") " pod="openstack/openstackclient" Feb 19 21:47:13 crc kubenswrapper[4795]: I0219 21:47:13.613923 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/336beec4-e534-448f-8367-78645b53650e-openstack-config-secret\") pod \"openstackclient\" (UID: \"336beec4-e534-448f-8367-78645b53650e\") " pod="openstack/openstackclient" Feb 19 21:47:13 crc kubenswrapper[4795]: I0219 21:47:13.614011 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/336beec4-e534-448f-8367-78645b53650e-openstack-config\") pod \"openstackclient\" (UID: \"336beec4-e534-448f-8367-78645b53650e\") " pod="openstack/openstackclient" Feb 19 21:47:13 crc kubenswrapper[4795]: I0219 21:47:13.614144 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbtv2\" (UniqueName: \"kubernetes.io/projected/336beec4-e534-448f-8367-78645b53650e-kube-api-access-bbtv2\") pod \"openstackclient\" (UID: \"336beec4-e534-448f-8367-78645b53650e\") " pod="openstack/openstackclient" Feb 19 21:47:13 crc kubenswrapper[4795]: I0219 21:47:13.722635 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/336beec4-e534-448f-8367-78645b53650e-openstack-config\") pod \"openstackclient\" (UID: \"336beec4-e534-448f-8367-78645b53650e\") " pod="openstack/openstackclient" Feb 19 21:47:13 crc kubenswrapper[4795]: I0219 21:47:13.722777 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbtv2\" (UniqueName: \"kubernetes.io/projected/336beec4-e534-448f-8367-78645b53650e-kube-api-access-bbtv2\") pod \"openstackclient\" (UID: \"336beec4-e534-448f-8367-78645b53650e\") " pod="openstack/openstackclient" Feb 19 21:47:13 crc kubenswrapper[4795]: I0219 21:47:13.722832 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/336beec4-e534-448f-8367-78645b53650e-combined-ca-bundle\") pod \"openstackclient\" (UID: \"336beec4-e534-448f-8367-78645b53650e\") " pod="openstack/openstackclient" Feb 19 21:47:13 crc kubenswrapper[4795]: I0219 21:47:13.722858 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/336beec4-e534-448f-8367-78645b53650e-openstack-config-secret\") pod \"openstackclient\" (UID: \"336beec4-e534-448f-8367-78645b53650e\") " pod="openstack/openstackclient" Feb 19 21:47:13 crc kubenswrapper[4795]: I0219 21:47:13.743286 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/336beec4-e534-448f-8367-78645b53650e-openstack-config\") pod \"openstackclient\" (UID: \"336beec4-e534-448f-8367-78645b53650e\") " pod="openstack/openstackclient" Feb 19 21:47:13 crc kubenswrapper[4795]: I0219 21:47:13.754434 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/336beec4-e534-448f-8367-78645b53650e-combined-ca-bundle\") pod \"openstackclient\" (UID: \"336beec4-e534-448f-8367-78645b53650e\") " pod="openstack/openstackclient" Feb 19 21:47:13 crc kubenswrapper[4795]: I0219 21:47:13.761251 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/336beec4-e534-448f-8367-78645b53650e-openstack-config-secret\") pod \"openstackclient\" (UID: \"336beec4-e534-448f-8367-78645b53650e\") " pod="openstack/openstackclient" Feb 19 21:47:13 crc kubenswrapper[4795]: I0219 21:47:13.789599 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbtv2\" (UniqueName: \"kubernetes.io/projected/336beec4-e534-448f-8367-78645b53650e-kube-api-access-bbtv2\") pod \"openstackclient\" (UID: \"336beec4-e534-448f-8367-78645b53650e\") " pod="openstack/openstackclient" Feb 19 21:47:13 crc kubenswrapper[4795]: I0219 21:47:13.893634 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 19 21:47:14 crc kubenswrapper[4795]: I0219 21:47:14.395453 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 19 21:47:15 crc kubenswrapper[4795]: I0219 21:47:15.074157 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-76bfdcc9c4-d56mx" Feb 19 21:47:15 crc kubenswrapper[4795]: I0219 21:47:15.151244 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdw57\" (UniqueName: \"kubernetes.io/projected/bd5855e1-cadb-4170-8339-5f10945c6ce9-kube-api-access-fdw57\") pod \"bd5855e1-cadb-4170-8339-5f10945c6ce9\" (UID: \"bd5855e1-cadb-4170-8339-5f10945c6ce9\") " Feb 19 21:47:15 crc kubenswrapper[4795]: I0219 21:47:15.151290 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd5855e1-cadb-4170-8339-5f10945c6ce9-internal-tls-certs\") pod \"bd5855e1-cadb-4170-8339-5f10945c6ce9\" (UID: \"bd5855e1-cadb-4170-8339-5f10945c6ce9\") " Feb 19 21:47:15 crc kubenswrapper[4795]: I0219 21:47:15.151329 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd5855e1-cadb-4170-8339-5f10945c6ce9-logs\") pod \"bd5855e1-cadb-4170-8339-5f10945c6ce9\" (UID: \"bd5855e1-cadb-4170-8339-5f10945c6ce9\") " Feb 19 21:47:15 crc kubenswrapper[4795]: I0219 21:47:15.151367 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd5855e1-cadb-4170-8339-5f10945c6ce9-public-tls-certs\") pod \"bd5855e1-cadb-4170-8339-5f10945c6ce9\" (UID: \"bd5855e1-cadb-4170-8339-5f10945c6ce9\") " Feb 19 21:47:15 crc kubenswrapper[4795]: I0219 21:47:15.151494 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd5855e1-cadb-4170-8339-5f10945c6ce9-combined-ca-bundle\") pod \"bd5855e1-cadb-4170-8339-5f10945c6ce9\" (UID: \"bd5855e1-cadb-4170-8339-5f10945c6ce9\") " Feb 19 21:47:15 crc kubenswrapper[4795]: I0219 21:47:15.151527 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/bd5855e1-cadb-4170-8339-5f10945c6ce9-config-data\") pod \"bd5855e1-cadb-4170-8339-5f10945c6ce9\" (UID: \"bd5855e1-cadb-4170-8339-5f10945c6ce9\") " Feb 19 21:47:15 crc kubenswrapper[4795]: I0219 21:47:15.151630 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd5855e1-cadb-4170-8339-5f10945c6ce9-scripts\") pod \"bd5855e1-cadb-4170-8339-5f10945c6ce9\" (UID: \"bd5855e1-cadb-4170-8339-5f10945c6ce9\") " Feb 19 21:47:15 crc kubenswrapper[4795]: I0219 21:47:15.152915 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd5855e1-cadb-4170-8339-5f10945c6ce9-logs" (OuterVolumeSpecName: "logs") pod "bd5855e1-cadb-4170-8339-5f10945c6ce9" (UID: "bd5855e1-cadb-4170-8339-5f10945c6ce9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:47:15 crc kubenswrapper[4795]: I0219 21:47:15.165268 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd5855e1-cadb-4170-8339-5f10945c6ce9-scripts" (OuterVolumeSpecName: "scripts") pod "bd5855e1-cadb-4170-8339-5f10945c6ce9" (UID: "bd5855e1-cadb-4170-8339-5f10945c6ce9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:47:15 crc kubenswrapper[4795]: I0219 21:47:15.193717 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd5855e1-cadb-4170-8339-5f10945c6ce9-kube-api-access-fdw57" (OuterVolumeSpecName: "kube-api-access-fdw57") pod "bd5855e1-cadb-4170-8339-5f10945c6ce9" (UID: "bd5855e1-cadb-4170-8339-5f10945c6ce9"). InnerVolumeSpecName "kube-api-access-fdw57". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:47:15 crc kubenswrapper[4795]: I0219 21:47:15.221730 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd5855e1-cadb-4170-8339-5f10945c6ce9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bd5855e1-cadb-4170-8339-5f10945c6ce9" (UID: "bd5855e1-cadb-4170-8339-5f10945c6ce9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:47:15 crc kubenswrapper[4795]: I0219 21:47:15.227214 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd5855e1-cadb-4170-8339-5f10945c6ce9-config-data" (OuterVolumeSpecName: "config-data") pod "bd5855e1-cadb-4170-8339-5f10945c6ce9" (UID: "bd5855e1-cadb-4170-8339-5f10945c6ce9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:47:15 crc kubenswrapper[4795]: I0219 21:47:15.253428 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd5855e1-cadb-4170-8339-5f10945c6ce9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:15 crc kubenswrapper[4795]: I0219 21:47:15.253481 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd5855e1-cadb-4170-8339-5f10945c6ce9-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:15 crc kubenswrapper[4795]: I0219 21:47:15.253493 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd5855e1-cadb-4170-8339-5f10945c6ce9-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:15 crc kubenswrapper[4795]: I0219 21:47:15.253501 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdw57\" (UniqueName: \"kubernetes.io/projected/bd5855e1-cadb-4170-8339-5f10945c6ce9-kube-api-access-fdw57\") on node \"crc\" DevicePath \"\"" Feb 
19 21:47:15 crc kubenswrapper[4795]: I0219 21:47:15.253510 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd5855e1-cadb-4170-8339-5f10945c6ce9-logs\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:15 crc kubenswrapper[4795]: I0219 21:47:15.272738 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd5855e1-cadb-4170-8339-5f10945c6ce9-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "bd5855e1-cadb-4170-8339-5f10945c6ce9" (UID: "bd5855e1-cadb-4170-8339-5f10945c6ce9"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:47:15 crc kubenswrapper[4795]: I0219 21:47:15.314326 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd5855e1-cadb-4170-8339-5f10945c6ce9-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "bd5855e1-cadb-4170-8339-5f10945c6ce9" (UID: "bd5855e1-cadb-4170-8339-5f10945c6ce9"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:47:15 crc kubenswrapper[4795]: I0219 21:47:15.355303 4795 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd5855e1-cadb-4170-8339-5f10945c6ce9-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:15 crc kubenswrapper[4795]: I0219 21:47:15.355342 4795 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd5855e1-cadb-4170-8339-5f10945c6ce9-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:15 crc kubenswrapper[4795]: I0219 21:47:15.361978 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"336beec4-e534-448f-8367-78645b53650e","Type":"ContainerStarted","Data":"9f40a6e1a339f74b374579f38441616a24d07c91d67da5f54b0e5c6df69736a0"} Feb 19 21:47:15 crc kubenswrapper[4795]: I0219 21:47:15.364075 4795 generic.go:334] "Generic (PLEG): container finished" podID="bd5855e1-cadb-4170-8339-5f10945c6ce9" containerID="79eaa173c80318b05a89d976dd736c832c6c7ad22a55bf02162ef029353efbf8" exitCode=0 Feb 19 21:47:15 crc kubenswrapper[4795]: I0219 21:47:15.364112 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-76bfdcc9c4-d56mx" event={"ID":"bd5855e1-cadb-4170-8339-5f10945c6ce9","Type":"ContainerDied","Data":"79eaa173c80318b05a89d976dd736c832c6c7ad22a55bf02162ef029353efbf8"} Feb 19 21:47:15 crc kubenswrapper[4795]: I0219 21:47:15.364137 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-76bfdcc9c4-d56mx" event={"ID":"bd5855e1-cadb-4170-8339-5f10945c6ce9","Type":"ContainerDied","Data":"fd2853bf461ad171b9d2aba20648d237de7c6fd1d48533d33bcc6a56c0d7fd46"} Feb 19 21:47:15 crc kubenswrapper[4795]: I0219 21:47:15.364153 4795 scope.go:117] "RemoveContainer" containerID="79eaa173c80318b05a89d976dd736c832c6c7ad22a55bf02162ef029353efbf8" Feb 19 21:47:15 crc kubenswrapper[4795]: I0219 
21:47:15.364208 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-76bfdcc9c4-d56mx" Feb 19 21:47:15 crc kubenswrapper[4795]: I0219 21:47:15.397964 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-76bfdcc9c4-d56mx"] Feb 19 21:47:15 crc kubenswrapper[4795]: I0219 21:47:15.406100 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-76bfdcc9c4-d56mx"] Feb 19 21:47:15 crc kubenswrapper[4795]: I0219 21:47:15.413296 4795 scope.go:117] "RemoveContainer" containerID="07e0d10967942d332970ea7b450b84e11544b898580db0b6d739db3ac549d6ec" Feb 19 21:47:15 crc kubenswrapper[4795]: I0219 21:47:15.458382 4795 scope.go:117] "RemoveContainer" containerID="79eaa173c80318b05a89d976dd736c832c6c7ad22a55bf02162ef029353efbf8" Feb 19 21:47:15 crc kubenswrapper[4795]: E0219 21:47:15.458855 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79eaa173c80318b05a89d976dd736c832c6c7ad22a55bf02162ef029353efbf8\": container with ID starting with 79eaa173c80318b05a89d976dd736c832c6c7ad22a55bf02162ef029353efbf8 not found: ID does not exist" containerID="79eaa173c80318b05a89d976dd736c832c6c7ad22a55bf02162ef029353efbf8" Feb 19 21:47:15 crc kubenswrapper[4795]: I0219 21:47:15.458897 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79eaa173c80318b05a89d976dd736c832c6c7ad22a55bf02162ef029353efbf8"} err="failed to get container status \"79eaa173c80318b05a89d976dd736c832c6c7ad22a55bf02162ef029353efbf8\": rpc error: code = NotFound desc = could not find container \"79eaa173c80318b05a89d976dd736c832c6c7ad22a55bf02162ef029353efbf8\": container with ID starting with 79eaa173c80318b05a89d976dd736c832c6c7ad22a55bf02162ef029353efbf8 not found: ID does not exist" Feb 19 21:47:15 crc kubenswrapper[4795]: I0219 21:47:15.458925 4795 scope.go:117] "RemoveContainer" 
containerID="07e0d10967942d332970ea7b450b84e11544b898580db0b6d739db3ac549d6ec" Feb 19 21:47:15 crc kubenswrapper[4795]: E0219 21:47:15.459389 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07e0d10967942d332970ea7b450b84e11544b898580db0b6d739db3ac549d6ec\": container with ID starting with 07e0d10967942d332970ea7b450b84e11544b898580db0b6d739db3ac549d6ec not found: ID does not exist" containerID="07e0d10967942d332970ea7b450b84e11544b898580db0b6d739db3ac549d6ec" Feb 19 21:47:15 crc kubenswrapper[4795]: I0219 21:47:15.459431 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07e0d10967942d332970ea7b450b84e11544b898580db0b6d739db3ac549d6ec"} err="failed to get container status \"07e0d10967942d332970ea7b450b84e11544b898580db0b6d739db3ac549d6ec\": rpc error: code = NotFound desc = could not find container \"07e0d10967942d332970ea7b450b84e11544b898580db0b6d739db3ac549d6ec\": container with ID starting with 07e0d10967942d332970ea7b450b84e11544b898580db0b6d739db3ac549d6ec not found: ID does not exist" Feb 19 21:47:15 crc kubenswrapper[4795]: I0219 21:47:15.521879 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd5855e1-cadb-4170-8339-5f10945c6ce9" path="/var/lib/kubelet/pods/bd5855e1-cadb-4170-8339-5f10945c6ce9/volumes" Feb 19 21:47:15 crc kubenswrapper[4795]: I0219 21:47:15.729492 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 19 21:47:16 crc kubenswrapper[4795]: I0219 21:47:16.031813 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-858c4dcd57-whkj2"] Feb 19 21:47:16 crc kubenswrapper[4795]: E0219 21:47:16.032156 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd5855e1-cadb-4170-8339-5f10945c6ce9" containerName="placement-log" Feb 19 21:47:16 crc kubenswrapper[4795]: I0219 21:47:16.032186 4795 
state_mem.go:107] "Deleted CPUSet assignment" podUID="bd5855e1-cadb-4170-8339-5f10945c6ce9" containerName="placement-log" Feb 19 21:47:16 crc kubenswrapper[4795]: E0219 21:47:16.032223 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd5855e1-cadb-4170-8339-5f10945c6ce9" containerName="placement-api" Feb 19 21:47:16 crc kubenswrapper[4795]: I0219 21:47:16.032229 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd5855e1-cadb-4170-8339-5f10945c6ce9" containerName="placement-api" Feb 19 21:47:16 crc kubenswrapper[4795]: I0219 21:47:16.032401 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd5855e1-cadb-4170-8339-5f10945c6ce9" containerName="placement-log" Feb 19 21:47:16 crc kubenswrapper[4795]: I0219 21:47:16.032426 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd5855e1-cadb-4170-8339-5f10945c6ce9" containerName="placement-api" Feb 19 21:47:16 crc kubenswrapper[4795]: I0219 21:47:16.038398 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-858c4dcd57-whkj2" Feb 19 21:47:16 crc kubenswrapper[4795]: I0219 21:47:16.041132 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Feb 19 21:47:16 crc kubenswrapper[4795]: I0219 21:47:16.041233 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 19 21:47:16 crc kubenswrapper[4795]: I0219 21:47:16.044029 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Feb 19 21:47:16 crc kubenswrapper[4795]: I0219 21:47:16.048821 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-858c4dcd57-whkj2"] Feb 19 21:47:16 crc kubenswrapper[4795]: I0219 21:47:16.172298 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da2e3f89-bf0b-4371-8e5b-a0037f266c70-combined-ca-bundle\") pod \"swift-proxy-858c4dcd57-whkj2\" (UID: \"da2e3f89-bf0b-4371-8e5b-a0037f266c70\") " pod="openstack/swift-proxy-858c4dcd57-whkj2" Feb 19 21:47:16 crc kubenswrapper[4795]: I0219 21:47:16.172380 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da2e3f89-bf0b-4371-8e5b-a0037f266c70-config-data\") pod \"swift-proxy-858c4dcd57-whkj2\" (UID: \"da2e3f89-bf0b-4371-8e5b-a0037f266c70\") " pod="openstack/swift-proxy-858c4dcd57-whkj2" Feb 19 21:47:16 crc kubenswrapper[4795]: I0219 21:47:16.172409 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/da2e3f89-bf0b-4371-8e5b-a0037f266c70-etc-swift\") pod \"swift-proxy-858c4dcd57-whkj2\" (UID: \"da2e3f89-bf0b-4371-8e5b-a0037f266c70\") " pod="openstack/swift-proxy-858c4dcd57-whkj2" Feb 19 21:47:16 crc kubenswrapper[4795]: I0219 
21:47:16.172433 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da2e3f89-bf0b-4371-8e5b-a0037f266c70-log-httpd\") pod \"swift-proxy-858c4dcd57-whkj2\" (UID: \"da2e3f89-bf0b-4371-8e5b-a0037f266c70\") " pod="openstack/swift-proxy-858c4dcd57-whkj2" Feb 19 21:47:16 crc kubenswrapper[4795]: I0219 21:47:16.172455 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wfxq\" (UniqueName: \"kubernetes.io/projected/da2e3f89-bf0b-4371-8e5b-a0037f266c70-kube-api-access-2wfxq\") pod \"swift-proxy-858c4dcd57-whkj2\" (UID: \"da2e3f89-bf0b-4371-8e5b-a0037f266c70\") " pod="openstack/swift-proxy-858c4dcd57-whkj2" Feb 19 21:47:16 crc kubenswrapper[4795]: I0219 21:47:16.172612 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da2e3f89-bf0b-4371-8e5b-a0037f266c70-run-httpd\") pod \"swift-proxy-858c4dcd57-whkj2\" (UID: \"da2e3f89-bf0b-4371-8e5b-a0037f266c70\") " pod="openstack/swift-proxy-858c4dcd57-whkj2" Feb 19 21:47:16 crc kubenswrapper[4795]: I0219 21:47:16.172669 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/da2e3f89-bf0b-4371-8e5b-a0037f266c70-internal-tls-certs\") pod \"swift-proxy-858c4dcd57-whkj2\" (UID: \"da2e3f89-bf0b-4371-8e5b-a0037f266c70\") " pod="openstack/swift-proxy-858c4dcd57-whkj2" Feb 19 21:47:16 crc kubenswrapper[4795]: I0219 21:47:16.172835 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/da2e3f89-bf0b-4371-8e5b-a0037f266c70-public-tls-certs\") pod \"swift-proxy-858c4dcd57-whkj2\" (UID: \"da2e3f89-bf0b-4371-8e5b-a0037f266c70\") " pod="openstack/swift-proxy-858c4dcd57-whkj2" Feb 19 
21:47:16 crc kubenswrapper[4795]: I0219 21:47:16.274706 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/da2e3f89-bf0b-4371-8e5b-a0037f266c70-internal-tls-certs\") pod \"swift-proxy-858c4dcd57-whkj2\" (UID: \"da2e3f89-bf0b-4371-8e5b-a0037f266c70\") " pod="openstack/swift-proxy-858c4dcd57-whkj2" Feb 19 21:47:16 crc kubenswrapper[4795]: I0219 21:47:16.274832 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/da2e3f89-bf0b-4371-8e5b-a0037f266c70-public-tls-certs\") pod \"swift-proxy-858c4dcd57-whkj2\" (UID: \"da2e3f89-bf0b-4371-8e5b-a0037f266c70\") " pod="openstack/swift-proxy-858c4dcd57-whkj2" Feb 19 21:47:16 crc kubenswrapper[4795]: I0219 21:47:16.274855 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da2e3f89-bf0b-4371-8e5b-a0037f266c70-combined-ca-bundle\") pod \"swift-proxy-858c4dcd57-whkj2\" (UID: \"da2e3f89-bf0b-4371-8e5b-a0037f266c70\") " pod="openstack/swift-proxy-858c4dcd57-whkj2" Feb 19 21:47:16 crc kubenswrapper[4795]: I0219 21:47:16.274906 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da2e3f89-bf0b-4371-8e5b-a0037f266c70-config-data\") pod \"swift-proxy-858c4dcd57-whkj2\" (UID: \"da2e3f89-bf0b-4371-8e5b-a0037f266c70\") " pod="openstack/swift-proxy-858c4dcd57-whkj2" Feb 19 21:47:16 crc kubenswrapper[4795]: I0219 21:47:16.274935 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/da2e3f89-bf0b-4371-8e5b-a0037f266c70-etc-swift\") pod \"swift-proxy-858c4dcd57-whkj2\" (UID: \"da2e3f89-bf0b-4371-8e5b-a0037f266c70\") " pod="openstack/swift-proxy-858c4dcd57-whkj2" Feb 19 21:47:16 crc kubenswrapper[4795]: I0219 21:47:16.274961 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da2e3f89-bf0b-4371-8e5b-a0037f266c70-log-httpd\") pod \"swift-proxy-858c4dcd57-whkj2\" (UID: \"da2e3f89-bf0b-4371-8e5b-a0037f266c70\") " pod="openstack/swift-proxy-858c4dcd57-whkj2" Feb 19 21:47:16 crc kubenswrapper[4795]: I0219 21:47:16.274986 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wfxq\" (UniqueName: \"kubernetes.io/projected/da2e3f89-bf0b-4371-8e5b-a0037f266c70-kube-api-access-2wfxq\") pod \"swift-proxy-858c4dcd57-whkj2\" (UID: \"da2e3f89-bf0b-4371-8e5b-a0037f266c70\") " pod="openstack/swift-proxy-858c4dcd57-whkj2" Feb 19 21:47:16 crc kubenswrapper[4795]: I0219 21:47:16.275027 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da2e3f89-bf0b-4371-8e5b-a0037f266c70-run-httpd\") pod \"swift-proxy-858c4dcd57-whkj2\" (UID: \"da2e3f89-bf0b-4371-8e5b-a0037f266c70\") " pod="openstack/swift-proxy-858c4dcd57-whkj2" Feb 19 21:47:16 crc kubenswrapper[4795]: I0219 21:47:16.275484 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da2e3f89-bf0b-4371-8e5b-a0037f266c70-log-httpd\") pod \"swift-proxy-858c4dcd57-whkj2\" (UID: \"da2e3f89-bf0b-4371-8e5b-a0037f266c70\") " pod="openstack/swift-proxy-858c4dcd57-whkj2" Feb 19 21:47:16 crc kubenswrapper[4795]: I0219 21:47:16.275514 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da2e3f89-bf0b-4371-8e5b-a0037f266c70-run-httpd\") pod \"swift-proxy-858c4dcd57-whkj2\" (UID: \"da2e3f89-bf0b-4371-8e5b-a0037f266c70\") " pod="openstack/swift-proxy-858c4dcd57-whkj2" Feb 19 21:47:16 crc kubenswrapper[4795]: I0219 21:47:16.279585 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/da2e3f89-bf0b-4371-8e5b-a0037f266c70-public-tls-certs\") pod \"swift-proxy-858c4dcd57-whkj2\" (UID: \"da2e3f89-bf0b-4371-8e5b-a0037f266c70\") " pod="openstack/swift-proxy-858c4dcd57-whkj2" Feb 19 21:47:16 crc kubenswrapper[4795]: I0219 21:47:16.288841 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/da2e3f89-bf0b-4371-8e5b-a0037f266c70-internal-tls-certs\") pod \"swift-proxy-858c4dcd57-whkj2\" (UID: \"da2e3f89-bf0b-4371-8e5b-a0037f266c70\") " pod="openstack/swift-proxy-858c4dcd57-whkj2" Feb 19 21:47:16 crc kubenswrapper[4795]: I0219 21:47:16.288894 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da2e3f89-bf0b-4371-8e5b-a0037f266c70-config-data\") pod \"swift-proxy-858c4dcd57-whkj2\" (UID: \"da2e3f89-bf0b-4371-8e5b-a0037f266c70\") " pod="openstack/swift-proxy-858c4dcd57-whkj2" Feb 19 21:47:16 crc kubenswrapper[4795]: I0219 21:47:16.289321 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da2e3f89-bf0b-4371-8e5b-a0037f266c70-combined-ca-bundle\") pod \"swift-proxy-858c4dcd57-whkj2\" (UID: \"da2e3f89-bf0b-4371-8e5b-a0037f266c70\") " pod="openstack/swift-proxy-858c4dcd57-whkj2" Feb 19 21:47:16 crc kubenswrapper[4795]: I0219 21:47:16.289374 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/da2e3f89-bf0b-4371-8e5b-a0037f266c70-etc-swift\") pod \"swift-proxy-858c4dcd57-whkj2\" (UID: \"da2e3f89-bf0b-4371-8e5b-a0037f266c70\") " pod="openstack/swift-proxy-858c4dcd57-whkj2" Feb 19 21:47:16 crc kubenswrapper[4795]: I0219 21:47:16.292012 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wfxq\" (UniqueName: 
\"kubernetes.io/projected/da2e3f89-bf0b-4371-8e5b-a0037f266c70-kube-api-access-2wfxq\") pod \"swift-proxy-858c4dcd57-whkj2\" (UID: \"da2e3f89-bf0b-4371-8e5b-a0037f266c70\") " pod="openstack/swift-proxy-858c4dcd57-whkj2" Feb 19 21:47:16 crc kubenswrapper[4795]: I0219 21:47:16.353676 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-858c4dcd57-whkj2" Feb 19 21:47:17 crc kubenswrapper[4795]: I0219 21:47:17.057688 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-858c4dcd57-whkj2"] Feb 19 21:47:17 crc kubenswrapper[4795]: I0219 21:47:17.091711 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:47:17 crc kubenswrapper[4795]: I0219 21:47:17.091949 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d2b418ec-23ae-4edd-8e61-0522a69c6be4" containerName="ceilometer-central-agent" containerID="cri-o://f5d643cd66b59a6396d5feca0a7263f30adfef3399f6c9e0364f6fef62658bb6" gracePeriod=30 Feb 19 21:47:17 crc kubenswrapper[4795]: I0219 21:47:17.095569 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d2b418ec-23ae-4edd-8e61-0522a69c6be4" containerName="sg-core" containerID="cri-o://a592134b61717d57f63788206b6ae0f1532d3c76a4a60eee8a3ddfb80940ab9a" gracePeriod=30 Feb 19 21:47:17 crc kubenswrapper[4795]: I0219 21:47:17.095647 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d2b418ec-23ae-4edd-8e61-0522a69c6be4" containerName="proxy-httpd" containerID="cri-o://d19bddd1f9142f8194c0a0050a40c02459b91997d36600ebdae2e82529e31c8c" gracePeriod=30 Feb 19 21:47:17 crc kubenswrapper[4795]: I0219 21:47:17.095709 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d2b418ec-23ae-4edd-8e61-0522a69c6be4" 
containerName="ceilometer-notification-agent" containerID="cri-o://fdbe214ee22443ca571dd54e58e45a8266df69233f71d59ca89e8fe90ae2ab86" gracePeriod=30 Feb 19 21:47:17 crc kubenswrapper[4795]: I0219 21:47:17.239303 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="d2b418ec-23ae-4edd-8e61-0522a69c6be4" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Feb 19 21:47:17 crc kubenswrapper[4795]: I0219 21:47:17.421834 4795 generic.go:334] "Generic (PLEG): container finished" podID="d2b418ec-23ae-4edd-8e61-0522a69c6be4" containerID="a592134b61717d57f63788206b6ae0f1532d3c76a4a60eee8a3ddfb80940ab9a" exitCode=2 Feb 19 21:47:17 crc kubenswrapper[4795]: I0219 21:47:17.421912 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2b418ec-23ae-4edd-8e61-0522a69c6be4","Type":"ContainerDied","Data":"a592134b61717d57f63788206b6ae0f1532d3c76a4a60eee8a3ddfb80940ab9a"} Feb 19 21:47:17 crc kubenswrapper[4795]: I0219 21:47:17.426569 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-858c4dcd57-whkj2" event={"ID":"da2e3f89-bf0b-4371-8e5b-a0037f266c70","Type":"ContainerStarted","Data":"38273143a291266be6dd29c71788a99ae4aa366ccd575844722fcc6687631e66"} Feb 19 21:47:17 crc kubenswrapper[4795]: I0219 21:47:17.426606 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-858c4dcd57-whkj2" event={"ID":"da2e3f89-bf0b-4371-8e5b-a0037f266c70","Type":"ContainerStarted","Data":"fd10d4f85e04ded895f7718dd53443f09a3be089bf6f4718e6d017852d997436"} Feb 19 21:47:18 crc kubenswrapper[4795]: I0219 21:47:18.711356 4795 generic.go:334] "Generic (PLEG): container finished" podID="d2b418ec-23ae-4edd-8e61-0522a69c6be4" containerID="d19bddd1f9142f8194c0a0050a40c02459b91997d36600ebdae2e82529e31c8c" exitCode=0 Feb 19 21:47:18 crc kubenswrapper[4795]: I0219 21:47:18.711608 4795 generic.go:334] "Generic (PLEG): container finished" 
podID="d2b418ec-23ae-4edd-8e61-0522a69c6be4" containerID="f5d643cd66b59a6396d5feca0a7263f30adfef3399f6c9e0364f6fef62658bb6" exitCode=0 Feb 19 21:47:18 crc kubenswrapper[4795]: I0219 21:47:18.711648 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2b418ec-23ae-4edd-8e61-0522a69c6be4","Type":"ContainerDied","Data":"d19bddd1f9142f8194c0a0050a40c02459b91997d36600ebdae2e82529e31c8c"} Feb 19 21:47:18 crc kubenswrapper[4795]: I0219 21:47:18.711674 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2b418ec-23ae-4edd-8e61-0522a69c6be4","Type":"ContainerDied","Data":"f5d643cd66b59a6396d5feca0a7263f30adfef3399f6c9e0364f6fef62658bb6"} Feb 19 21:47:18 crc kubenswrapper[4795]: I0219 21:47:18.718132 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-858c4dcd57-whkj2" event={"ID":"da2e3f89-bf0b-4371-8e5b-a0037f266c70","Type":"ContainerStarted","Data":"812bd35c706001e18c0da5e2c3dd17a059f42e984bdd2e12b92f8fd91195b1e2"} Feb 19 21:47:18 crc kubenswrapper[4795]: I0219 21:47:18.719390 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-858c4dcd57-whkj2" Feb 19 21:47:18 crc kubenswrapper[4795]: I0219 21:47:18.719415 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-858c4dcd57-whkj2" Feb 19 21:47:18 crc kubenswrapper[4795]: I0219 21:47:18.744740 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-858c4dcd57-whkj2" podStartSLOduration=2.744714763 podStartE2EDuration="2.744714763s" podCreationTimestamp="2026-02-19 21:47:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:47:18.734153835 +0000 UTC m=+1149.926671699" watchObservedRunningTime="2026-02-19 21:47:18.744714763 +0000 UTC m=+1149.937232627" Feb 19 21:47:20 crc 
kubenswrapper[4795]: I0219 21:47:20.750718 4795 generic.go:334] "Generic (PLEG): container finished" podID="d2b418ec-23ae-4edd-8e61-0522a69c6be4" containerID="fdbe214ee22443ca571dd54e58e45a8266df69233f71d59ca89e8fe90ae2ab86" exitCode=0 Feb 19 21:47:20 crc kubenswrapper[4795]: I0219 21:47:20.750788 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2b418ec-23ae-4edd-8e61-0522a69c6be4","Type":"ContainerDied","Data":"fdbe214ee22443ca571dd54e58e45a8266df69233f71d59ca89e8fe90ae2ab86"} Feb 19 21:47:21 crc kubenswrapper[4795]: I0219 21:47:21.039750 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.011833 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-r8v4f"] Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.013469 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-r8v4f" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.036624 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-r8v4f"] Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.099805 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-h72xz"] Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.100990 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-h72xz" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.109777 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-2d62-account-create-update-jrx2c"] Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.110968 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-2d62-account-create-update-jrx2c" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.114478 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.127348 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/573a7aa5-43d9-4523-8eea-4c1a36da49fb-operator-scripts\") pod \"nova-api-db-create-r8v4f\" (UID: \"573a7aa5-43d9-4523-8eea-4c1a36da49fb\") " pod="openstack/nova-api-db-create-r8v4f" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.127394 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhtwv\" (UniqueName: \"kubernetes.io/projected/573a7aa5-43d9-4523-8eea-4c1a36da49fb-kube-api-access-vhtwv\") pod \"nova-api-db-create-r8v4f\" (UID: \"573a7aa5-43d9-4523-8eea-4c1a36da49fb\") " pod="openstack/nova-api-db-create-r8v4f" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.132934 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-h72xz"] Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.145023 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-2d62-account-create-update-jrx2c"] Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.228871 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29d85454-a8db-47bc-b616-bbdb4f6d8920-operator-scripts\") pod \"nova-cell0-db-create-h72xz\" (UID: \"29d85454-a8db-47bc-b616-bbdb4f6d8920\") " pod="openstack/nova-cell0-db-create-h72xz" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.228961 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqs8n\" 
(UniqueName: \"kubernetes.io/projected/29d85454-a8db-47bc-b616-bbdb4f6d8920-kube-api-access-dqs8n\") pod \"nova-cell0-db-create-h72xz\" (UID: \"29d85454-a8db-47bc-b616-bbdb4f6d8920\") " pod="openstack/nova-cell0-db-create-h72xz" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.229302 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79mth\" (UniqueName: \"kubernetes.io/projected/282595b2-0eaa-4deb-9af4-288241817325-kube-api-access-79mth\") pod \"nova-api-2d62-account-create-update-jrx2c\" (UID: \"282595b2-0eaa-4deb-9af4-288241817325\") " pod="openstack/nova-api-2d62-account-create-update-jrx2c" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.229702 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/573a7aa5-43d9-4523-8eea-4c1a36da49fb-operator-scripts\") pod \"nova-api-db-create-r8v4f\" (UID: \"573a7aa5-43d9-4523-8eea-4c1a36da49fb\") " pod="openstack/nova-api-db-create-r8v4f" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.229731 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/282595b2-0eaa-4deb-9af4-288241817325-operator-scripts\") pod \"nova-api-2d62-account-create-update-jrx2c\" (UID: \"282595b2-0eaa-4deb-9af4-288241817325\") " pod="openstack/nova-api-2d62-account-create-update-jrx2c" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.229758 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhtwv\" (UniqueName: \"kubernetes.io/projected/573a7aa5-43d9-4523-8eea-4c1a36da49fb-kube-api-access-vhtwv\") pod \"nova-api-db-create-r8v4f\" (UID: \"573a7aa5-43d9-4523-8eea-4c1a36da49fb\") " pod="openstack/nova-api-db-create-r8v4f" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.230420 4795 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/573a7aa5-43d9-4523-8eea-4c1a36da49fb-operator-scripts\") pod \"nova-api-db-create-r8v4f\" (UID: \"573a7aa5-43d9-4523-8eea-4c1a36da49fb\") " pod="openstack/nova-api-db-create-r8v4f" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.254002 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhtwv\" (UniqueName: \"kubernetes.io/projected/573a7aa5-43d9-4523-8eea-4c1a36da49fb-kube-api-access-vhtwv\") pod \"nova-api-db-create-r8v4f\" (UID: \"573a7aa5-43d9-4523-8eea-4c1a36da49fb\") " pod="openstack/nova-api-db-create-r8v4f" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.307029 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-p7s8n"] Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.308460 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-p7s8n" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.331525 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3778f66e-fd7f-4af5-ae3e-2a7c272785a0-operator-scripts\") pod \"nova-cell1-db-create-p7s8n\" (UID: \"3778f66e-fd7f-4af5-ae3e-2a7c272785a0\") " pod="openstack/nova-cell1-db-create-p7s8n" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.331576 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/282595b2-0eaa-4deb-9af4-288241817325-operator-scripts\") pod \"nova-api-2d62-account-create-update-jrx2c\" (UID: \"282595b2-0eaa-4deb-9af4-288241817325\") " pod="openstack/nova-api-2d62-account-create-update-jrx2c" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.331670 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29d85454-a8db-47bc-b616-bbdb4f6d8920-operator-scripts\") pod \"nova-cell0-db-create-h72xz\" (UID: \"29d85454-a8db-47bc-b616-bbdb4f6d8920\") " pod="openstack/nova-cell0-db-create-h72xz" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.331704 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4q9l\" (UniqueName: \"kubernetes.io/projected/3778f66e-fd7f-4af5-ae3e-2a7c272785a0-kube-api-access-d4q9l\") pod \"nova-cell1-db-create-p7s8n\" (UID: \"3778f66e-fd7f-4af5-ae3e-2a7c272785a0\") " pod="openstack/nova-cell1-db-create-p7s8n" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.331760 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqs8n\" (UniqueName: \"kubernetes.io/projected/29d85454-a8db-47bc-b616-bbdb4f6d8920-kube-api-access-dqs8n\") pod \"nova-cell0-db-create-h72xz\" (UID: \"29d85454-a8db-47bc-b616-bbdb4f6d8920\") " pod="openstack/nova-cell0-db-create-h72xz" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.331784 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79mth\" (UniqueName: \"kubernetes.io/projected/282595b2-0eaa-4deb-9af4-288241817325-kube-api-access-79mth\") pod \"nova-api-2d62-account-create-update-jrx2c\" (UID: \"282595b2-0eaa-4deb-9af4-288241817325\") " pod="openstack/nova-api-2d62-account-create-update-jrx2c" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.332531 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/282595b2-0eaa-4deb-9af4-288241817325-operator-scripts\") pod \"nova-api-2d62-account-create-update-jrx2c\" (UID: \"282595b2-0eaa-4deb-9af4-288241817325\") " pod="openstack/nova-api-2d62-account-create-update-jrx2c" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.332612 4795 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29d85454-a8db-47bc-b616-bbdb4f6d8920-operator-scripts\") pod \"nova-cell0-db-create-h72xz\" (UID: \"29d85454-a8db-47bc-b616-bbdb4f6d8920\") " pod="openstack/nova-cell0-db-create-h72xz" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.339244 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-r8v4f" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.339584 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-p7s8n"] Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.360500 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79mth\" (UniqueName: \"kubernetes.io/projected/282595b2-0eaa-4deb-9af4-288241817325-kube-api-access-79mth\") pod \"nova-api-2d62-account-create-update-jrx2c\" (UID: \"282595b2-0eaa-4deb-9af4-288241817325\") " pod="openstack/nova-api-2d62-account-create-update-jrx2c" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.361871 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-858c4dcd57-whkj2" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.363636 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-858c4dcd57-whkj2" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.370646 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-1922-account-create-update-bnqt2"] Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.374305 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-1922-account-create-update-bnqt2" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.375329 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqs8n\" (UniqueName: \"kubernetes.io/projected/29d85454-a8db-47bc-b616-bbdb4f6d8920-kube-api-access-dqs8n\") pod \"nova-cell0-db-create-h72xz\" (UID: \"29d85454-a8db-47bc-b616-bbdb4f6d8920\") " pod="openstack/nova-cell0-db-create-h72xz" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.379150 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.391720 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-1922-account-create-update-bnqt2"] Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.423211 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-h72xz" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.433632 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3778f66e-fd7f-4af5-ae3e-2a7c272785a0-operator-scripts\") pod \"nova-cell1-db-create-p7s8n\" (UID: \"3778f66e-fd7f-4af5-ae3e-2a7c272785a0\") " pod="openstack/nova-cell1-db-create-p7s8n" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.433765 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4q9l\" (UniqueName: \"kubernetes.io/projected/3778f66e-fd7f-4af5-ae3e-2a7c272785a0-kube-api-access-d4q9l\") pod \"nova-cell1-db-create-p7s8n\" (UID: \"3778f66e-fd7f-4af5-ae3e-2a7c272785a0\") " pod="openstack/nova-cell1-db-create-p7s8n" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.436245 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/3778f66e-fd7f-4af5-ae3e-2a7c272785a0-operator-scripts\") pod \"nova-cell1-db-create-p7s8n\" (UID: \"3778f66e-fd7f-4af5-ae3e-2a7c272785a0\") " pod="openstack/nova-cell1-db-create-p7s8n" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.442153 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-2d62-account-create-update-jrx2c" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.467150 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4q9l\" (UniqueName: \"kubernetes.io/projected/3778f66e-fd7f-4af5-ae3e-2a7c272785a0-kube-api-access-d4q9l\") pod \"nova-cell1-db-create-p7s8n\" (UID: \"3778f66e-fd7f-4af5-ae3e-2a7c272785a0\") " pod="openstack/nova-cell1-db-create-p7s8n" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.517773 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-e48f-account-create-update-48v7f"] Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.518969 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-e48f-account-create-update-48v7f" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.523656 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.534126 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-e48f-account-create-update-48v7f"] Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.537450 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkn9b\" (UniqueName: \"kubernetes.io/projected/1946f4fd-5254-4e66-8739-5a51af23e963-kube-api-access-dkn9b\") pod \"nova-cell0-1922-account-create-update-bnqt2\" (UID: \"1946f4fd-5254-4e66-8739-5a51af23e963\") " pod="openstack/nova-cell0-1922-account-create-update-bnqt2" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.537552 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1946f4fd-5254-4e66-8739-5a51af23e963-operator-scripts\") pod \"nova-cell0-1922-account-create-update-bnqt2\" (UID: \"1946f4fd-5254-4e66-8739-5a51af23e963\") " pod="openstack/nova-cell0-1922-account-create-update-bnqt2" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.631645 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-p7s8n" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.639321 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvxgl\" (UniqueName: \"kubernetes.io/projected/6952c796-d85e-49b3-b931-60966311a0c0-kube-api-access-jvxgl\") pod \"nova-cell1-e48f-account-create-update-48v7f\" (UID: \"6952c796-d85e-49b3-b931-60966311a0c0\") " pod="openstack/nova-cell1-e48f-account-create-update-48v7f" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.639410 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6952c796-d85e-49b3-b931-60966311a0c0-operator-scripts\") pod \"nova-cell1-e48f-account-create-update-48v7f\" (UID: \"6952c796-d85e-49b3-b931-60966311a0c0\") " pod="openstack/nova-cell1-e48f-account-create-update-48v7f" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.639524 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkn9b\" (UniqueName: \"kubernetes.io/projected/1946f4fd-5254-4e66-8739-5a51af23e963-kube-api-access-dkn9b\") pod \"nova-cell0-1922-account-create-update-bnqt2\" (UID: \"1946f4fd-5254-4e66-8739-5a51af23e963\") " pod="openstack/nova-cell0-1922-account-create-update-bnqt2" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.639542 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1946f4fd-5254-4e66-8739-5a51af23e963-operator-scripts\") pod \"nova-cell0-1922-account-create-update-bnqt2\" (UID: \"1946f4fd-5254-4e66-8739-5a51af23e963\") " pod="openstack/nova-cell0-1922-account-create-update-bnqt2" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.640325 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/1946f4fd-5254-4e66-8739-5a51af23e963-operator-scripts\") pod \"nova-cell0-1922-account-create-update-bnqt2\" (UID: \"1946f4fd-5254-4e66-8739-5a51af23e963\") " pod="openstack/nova-cell0-1922-account-create-update-bnqt2" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.672291 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkn9b\" (UniqueName: \"kubernetes.io/projected/1946f4fd-5254-4e66-8739-5a51af23e963-kube-api-access-dkn9b\") pod \"nova-cell0-1922-account-create-update-bnqt2\" (UID: \"1946f4fd-5254-4e66-8739-5a51af23e963\") " pod="openstack/nova-cell0-1922-account-create-update-bnqt2" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.734189 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.734412 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="ac1caac2-edf5-453d-a76d-e1c65b7f038b" containerName="glance-log" containerID="cri-o://7826593332e6fe0d625ec77b566a78383702574fe609c8ca89088f745857f981" gracePeriod=30 Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.734582 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="ac1caac2-edf5-453d-a76d-e1c65b7f038b" containerName="glance-httpd" containerID="cri-o://32a8f97c4bdeeea5fdb4b24c48b5ee8e3dda515976e342ae4939a42ab5261eec" gracePeriod=30 Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.741971 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvxgl\" (UniqueName: \"kubernetes.io/projected/6952c796-d85e-49b3-b931-60966311a0c0-kube-api-access-jvxgl\") pod \"nova-cell1-e48f-account-create-update-48v7f\" (UID: \"6952c796-d85e-49b3-b931-60966311a0c0\") " pod="openstack/nova-cell1-e48f-account-create-update-48v7f" Feb 19 
21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.742496 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6952c796-d85e-49b3-b931-60966311a0c0-operator-scripts\") pod \"nova-cell1-e48f-account-create-update-48v7f\" (UID: \"6952c796-d85e-49b3-b931-60966311a0c0\") " pod="openstack/nova-cell1-e48f-account-create-update-48v7f" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.743369 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6952c796-d85e-49b3-b931-60966311a0c0-operator-scripts\") pod \"nova-cell1-e48f-account-create-update-48v7f\" (UID: \"6952c796-d85e-49b3-b931-60966311a0c0\") " pod="openstack/nova-cell1-e48f-account-create-update-48v7f" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.768636 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvxgl\" (UniqueName: \"kubernetes.io/projected/6952c796-d85e-49b3-b931-60966311a0c0-kube-api-access-jvxgl\") pod \"nova-cell1-e48f-account-create-update-48v7f\" (UID: \"6952c796-d85e-49b3-b931-60966311a0c0\") " pod="openstack/nova-cell1-e48f-account-create-update-48v7f" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.850676 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-1922-account-create-update-bnqt2" Feb 19 21:47:26 crc kubenswrapper[4795]: I0219 21:47:26.860139 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-e48f-account-create-update-48v7f" Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.110468 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.268284 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d2b418ec-23ae-4edd-8e61-0522a69c6be4-sg-core-conf-yaml\") pod \"d2b418ec-23ae-4edd-8e61-0522a69c6be4\" (UID: \"d2b418ec-23ae-4edd-8e61-0522a69c6be4\") " Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.268470 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2b418ec-23ae-4edd-8e61-0522a69c6be4-config-data\") pod \"d2b418ec-23ae-4edd-8e61-0522a69c6be4\" (UID: \"d2b418ec-23ae-4edd-8e61-0522a69c6be4\") " Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.268639 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2b418ec-23ae-4edd-8e61-0522a69c6be4-log-httpd\") pod \"d2b418ec-23ae-4edd-8e61-0522a69c6be4\" (UID: \"d2b418ec-23ae-4edd-8e61-0522a69c6be4\") " Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.268944 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2b418ec-23ae-4edd-8e61-0522a69c6be4-combined-ca-bundle\") pod \"d2b418ec-23ae-4edd-8e61-0522a69c6be4\" (UID: \"d2b418ec-23ae-4edd-8e61-0522a69c6be4\") " Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.269059 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2b418ec-23ae-4edd-8e61-0522a69c6be4-scripts\") pod \"d2b418ec-23ae-4edd-8e61-0522a69c6be4\" (UID: \"d2b418ec-23ae-4edd-8e61-0522a69c6be4\") " Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.269101 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59wzg\" (UniqueName: 
\"kubernetes.io/projected/d2b418ec-23ae-4edd-8e61-0522a69c6be4-kube-api-access-59wzg\") pod \"d2b418ec-23ae-4edd-8e61-0522a69c6be4\" (UID: \"d2b418ec-23ae-4edd-8e61-0522a69c6be4\") " Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.269125 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2b418ec-23ae-4edd-8e61-0522a69c6be4-run-httpd\") pod \"d2b418ec-23ae-4edd-8e61-0522a69c6be4\" (UID: \"d2b418ec-23ae-4edd-8e61-0522a69c6be4\") " Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.270132 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2b418ec-23ae-4edd-8e61-0522a69c6be4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d2b418ec-23ae-4edd-8e61-0522a69c6be4" (UID: "d2b418ec-23ae-4edd-8e61-0522a69c6be4"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.272279 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2b418ec-23ae-4edd-8e61-0522a69c6be4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d2b418ec-23ae-4edd-8e61-0522a69c6be4" (UID: "d2b418ec-23ae-4edd-8e61-0522a69c6be4"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.280847 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2b418ec-23ae-4edd-8e61-0522a69c6be4-kube-api-access-59wzg" (OuterVolumeSpecName: "kube-api-access-59wzg") pod "d2b418ec-23ae-4edd-8e61-0522a69c6be4" (UID: "d2b418ec-23ae-4edd-8e61-0522a69c6be4"). InnerVolumeSpecName "kube-api-access-59wzg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.286408 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2b418ec-23ae-4edd-8e61-0522a69c6be4-scripts" (OuterVolumeSpecName: "scripts") pod "d2b418ec-23ae-4edd-8e61-0522a69c6be4" (UID: "d2b418ec-23ae-4edd-8e61-0522a69c6be4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.328105 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-r8v4f"] Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.331055 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2b418ec-23ae-4edd-8e61-0522a69c6be4-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d2b418ec-23ae-4edd-8e61-0522a69c6be4" (UID: "d2b418ec-23ae-4edd-8e61-0522a69c6be4"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:47:27 crc kubenswrapper[4795]: W0219 21:47:27.342902 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod282595b2_0eaa_4deb_9af4_288241817325.slice/crio-f5a89b0c123f3a7776b51e9bb99d6cb38a702a838a7a412b9f1def886d685c88 WatchSource:0}: Error finding container f5a89b0c123f3a7776b51e9bb99d6cb38a702a838a7a412b9f1def886d685c88: Status 404 returned error can't find the container with id f5a89b0c123f3a7776b51e9bb99d6cb38a702a838a7a412b9f1def886d685c88 Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.370991 4795 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2b418ec-23ae-4edd-8e61-0522a69c6be4-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.371099 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2b418ec-23ae-4edd-8e61-0522a69c6be4-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.371154 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59wzg\" (UniqueName: \"kubernetes.io/projected/d2b418ec-23ae-4edd-8e61-0522a69c6be4-kube-api-access-59wzg\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.372016 4795 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2b418ec-23ae-4edd-8e61-0522a69c6be4-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.372048 4795 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d2b418ec-23ae-4edd-8e61-0522a69c6be4-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.397933 4795 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-h72xz"] Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.404617 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-2d62-account-create-update-jrx2c"] Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.425396 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2b418ec-23ae-4edd-8e61-0522a69c6be4-config-data" (OuterVolumeSpecName: "config-data") pod "d2b418ec-23ae-4edd-8e61-0522a69c6be4" (UID: "d2b418ec-23ae-4edd-8e61-0522a69c6be4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.434134 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2b418ec-23ae-4edd-8e61-0522a69c6be4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d2b418ec-23ae-4edd-8e61-0522a69c6be4" (UID: "d2b418ec-23ae-4edd-8e61-0522a69c6be4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.458415 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-p7s8n"] Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.480031 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2b418ec-23ae-4edd-8e61-0522a69c6be4-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.480059 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2b418ec-23ae-4edd-8e61-0522a69c6be4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.624649 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-1922-account-create-update-bnqt2"] Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.649958 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-e48f-account-create-update-48v7f"] Feb 19 21:47:27 crc kubenswrapper[4795]: W0219 21:47:27.699916 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6952c796_d85e_49b3_b931_60966311a0c0.slice/crio-0748dc0aa50d4d39dfbdabcf9ed72680e0e959eb3eb965f01e22779a3df14276 WatchSource:0}: Error finding container 0748dc0aa50d4d39dfbdabcf9ed72680e0e959eb3eb965f01e22779a3df14276: Status 404 returned error can't find the container with id 0748dc0aa50d4d39dfbdabcf9ed72680e0e959eb3eb965f01e22779a3df14276 Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.819647 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-p7s8n" event={"ID":"3778f66e-fd7f-4af5-ae3e-2a7c272785a0","Type":"ContainerStarted","Data":"68df0b7ecf93ce028557646109c36262df3e9ce4a929fd966aed1e0fe5f5daff"} Feb 19 21:47:27 crc kubenswrapper[4795]: 
I0219 21:47:27.820699 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"336beec4-e534-448f-8367-78645b53650e","Type":"ContainerStarted","Data":"6f5fd7fb0db869022abdd3586a7debaf90502df56c7437b86541c8afbbd3687a"} Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.825909 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2b418ec-23ae-4edd-8e61-0522a69c6be4","Type":"ContainerDied","Data":"aca75b960d03859bf274a8a3797de1abbc9baeeee38219c6dcc1576889b95f11"} Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.825936 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.825963 4795 scope.go:117] "RemoveContainer" containerID="d19bddd1f9142f8194c0a0050a40c02459b91997d36600ebdae2e82529e31c8c" Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.842857 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.490945388 podStartE2EDuration="14.842838503s" podCreationTimestamp="2026-02-19 21:47:13 +0000 UTC" firstStartedPulling="2026-02-19 21:47:14.406319799 +0000 UTC m=+1145.598837663" lastFinishedPulling="2026-02-19 21:47:26.758212914 +0000 UTC m=+1157.950730778" observedRunningTime="2026-02-19 21:47:27.842260237 +0000 UTC m=+1159.034778101" watchObservedRunningTime="2026-02-19 21:47:27.842838503 +0000 UTC m=+1159.035356367" Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.869215 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.878628 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-r8v4f" event={"ID":"573a7aa5-43d9-4523-8eea-4c1a36da49fb","Type":"ContainerStarted","Data":"b1129ce7375e075c1c1844910d66cfc1b6308074f1d6f73dea4c3d974a9f4054"} Feb 19 21:47:27 
crc kubenswrapper[4795]: I0219 21:47:27.885243 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.899981 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:47:27 crc kubenswrapper[4795]: E0219 21:47:27.900453 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2b418ec-23ae-4edd-8e61-0522a69c6be4" containerName="proxy-httpd" Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.900487 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2b418ec-23ae-4edd-8e61-0522a69c6be4" containerName="proxy-httpd" Feb 19 21:47:27 crc kubenswrapper[4795]: E0219 21:47:27.900515 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2b418ec-23ae-4edd-8e61-0522a69c6be4" containerName="ceilometer-central-agent" Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.900521 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2b418ec-23ae-4edd-8e61-0522a69c6be4" containerName="ceilometer-central-agent" Feb 19 21:47:27 crc kubenswrapper[4795]: E0219 21:47:27.900551 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2b418ec-23ae-4edd-8e61-0522a69c6be4" containerName="ceilometer-notification-agent" Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.900558 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2b418ec-23ae-4edd-8e61-0522a69c6be4" containerName="ceilometer-notification-agent" Feb 19 21:47:27 crc kubenswrapper[4795]: E0219 21:47:27.900568 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2b418ec-23ae-4edd-8e61-0522a69c6be4" containerName="sg-core" Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.900573 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2b418ec-23ae-4edd-8e61-0522a69c6be4" containerName="sg-core" Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.900796 4795 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="d2b418ec-23ae-4edd-8e61-0522a69c6be4" containerName="sg-core" Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.900817 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2b418ec-23ae-4edd-8e61-0522a69c6be4" containerName="proxy-httpd" Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.900834 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2b418ec-23ae-4edd-8e61-0522a69c6be4" containerName="ceilometer-central-agent" Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.900842 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2b418ec-23ae-4edd-8e61-0522a69c6be4" containerName="ceilometer-notification-agent" Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.902947 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.905432 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.905528 4795 generic.go:334] "Generic (PLEG): container finished" podID="ac1caac2-edf5-453d-a76d-e1c65b7f038b" containerID="7826593332e6fe0d625ec77b566a78383702574fe609c8ca89088f745857f981" exitCode=143 Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.905659 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ac1caac2-edf5-453d-a76d-e1c65b7f038b","Type":"ContainerDied","Data":"7826593332e6fe0d625ec77b566a78383702574fe609c8ca89088f745857f981"} Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.905713 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.908690 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-e48f-account-create-update-48v7f" 
event={"ID":"6952c796-d85e-49b3-b931-60966311a0c0","Type":"ContainerStarted","Data":"0748dc0aa50d4d39dfbdabcf9ed72680e0e959eb3eb965f01e22779a3df14276"} Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.913106 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.930009 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-h72xz" event={"ID":"29d85454-a8db-47bc-b616-bbdb4f6d8920","Type":"ContainerStarted","Data":"af2bb649161fdcaac08f1bb77ef6e41cc4143b0e27292cc1c3c0624b994df5bf"} Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.940854 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-2d62-account-create-update-jrx2c" event={"ID":"282595b2-0eaa-4deb-9af4-288241817325","Type":"ContainerStarted","Data":"f5a89b0c123f3a7776b51e9bb99d6cb38a702a838a7a412b9f1def886d685c88"} Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.942195 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-1922-account-create-update-bnqt2" event={"ID":"1946f4fd-5254-4e66-8739-5a51af23e963","Type":"ContainerStarted","Data":"1c20c0048ca6ea52ec70d130bc06285f1a2ec973f63fcbb08a3874e51e60bfaa"} Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.949000 4795 scope.go:117] "RemoveContainer" containerID="a592134b61717d57f63788206b6ae0f1532d3c76a4a60eee8a3ddfb80940ab9a" Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.976264 4795 scope.go:117] "RemoveContainer" containerID="fdbe214ee22443ca571dd54e58e45a8266df69233f71d59ca89e8fe90ae2ab86" Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.996849 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/253d2f67-fdba-4a38-9b30-8544e6e54cc4-scripts\") pod \"ceilometer-0\" (UID: \"253d2f67-fdba-4a38-9b30-8544e6e54cc4\") " pod="openstack/ceilometer-0" Feb 19 
21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.996887 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqzt6\" (UniqueName: \"kubernetes.io/projected/253d2f67-fdba-4a38-9b30-8544e6e54cc4-kube-api-access-hqzt6\") pod \"ceilometer-0\" (UID: \"253d2f67-fdba-4a38-9b30-8544e6e54cc4\") " pod="openstack/ceilometer-0" Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.996998 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/253d2f67-fdba-4a38-9b30-8544e6e54cc4-config-data\") pod \"ceilometer-0\" (UID: \"253d2f67-fdba-4a38-9b30-8544e6e54cc4\") " pod="openstack/ceilometer-0" Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.997024 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/253d2f67-fdba-4a38-9b30-8544e6e54cc4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"253d2f67-fdba-4a38-9b30-8544e6e54cc4\") " pod="openstack/ceilometer-0" Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.997042 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/253d2f67-fdba-4a38-9b30-8544e6e54cc4-run-httpd\") pod \"ceilometer-0\" (UID: \"253d2f67-fdba-4a38-9b30-8544e6e54cc4\") " pod="openstack/ceilometer-0" Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.997057 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/253d2f67-fdba-4a38-9b30-8544e6e54cc4-log-httpd\") pod \"ceilometer-0\" (UID: \"253d2f67-fdba-4a38-9b30-8544e6e54cc4\") " pod="openstack/ceilometer-0" Feb 19 21:47:27 crc kubenswrapper[4795]: I0219 21:47:27.997070 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/253d2f67-fdba-4a38-9b30-8544e6e54cc4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"253d2f67-fdba-4a38-9b30-8544e6e54cc4\") " pod="openstack/ceilometer-0" Feb 19 21:47:28 crc kubenswrapper[4795]: I0219 21:47:28.010318 4795 scope.go:117] "RemoveContainer" containerID="f5d643cd66b59a6396d5feca0a7263f30adfef3399f6c9e0364f6fef62658bb6" Feb 19 21:47:28 crc kubenswrapper[4795]: I0219 21:47:28.098362 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/253d2f67-fdba-4a38-9b30-8544e6e54cc4-config-data\") pod \"ceilometer-0\" (UID: \"253d2f67-fdba-4a38-9b30-8544e6e54cc4\") " pod="openstack/ceilometer-0" Feb 19 21:47:28 crc kubenswrapper[4795]: I0219 21:47:28.098411 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/253d2f67-fdba-4a38-9b30-8544e6e54cc4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"253d2f67-fdba-4a38-9b30-8544e6e54cc4\") " pod="openstack/ceilometer-0" Feb 19 21:47:28 crc kubenswrapper[4795]: I0219 21:47:28.098431 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/253d2f67-fdba-4a38-9b30-8544e6e54cc4-run-httpd\") pod \"ceilometer-0\" (UID: \"253d2f67-fdba-4a38-9b30-8544e6e54cc4\") " pod="openstack/ceilometer-0" Feb 19 21:47:28 crc kubenswrapper[4795]: I0219 21:47:28.098446 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/253d2f67-fdba-4a38-9b30-8544e6e54cc4-log-httpd\") pod \"ceilometer-0\" (UID: \"253d2f67-fdba-4a38-9b30-8544e6e54cc4\") " pod="openstack/ceilometer-0" Feb 19 21:47:28 crc kubenswrapper[4795]: I0219 21:47:28.098459 4795 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/253d2f67-fdba-4a38-9b30-8544e6e54cc4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"253d2f67-fdba-4a38-9b30-8544e6e54cc4\") " pod="openstack/ceilometer-0" Feb 19 21:47:28 crc kubenswrapper[4795]: I0219 21:47:28.098511 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/253d2f67-fdba-4a38-9b30-8544e6e54cc4-scripts\") pod \"ceilometer-0\" (UID: \"253d2f67-fdba-4a38-9b30-8544e6e54cc4\") " pod="openstack/ceilometer-0" Feb 19 21:47:28 crc kubenswrapper[4795]: I0219 21:47:28.098535 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqzt6\" (UniqueName: \"kubernetes.io/projected/253d2f67-fdba-4a38-9b30-8544e6e54cc4-kube-api-access-hqzt6\") pod \"ceilometer-0\" (UID: \"253d2f67-fdba-4a38-9b30-8544e6e54cc4\") " pod="openstack/ceilometer-0" Feb 19 21:47:28 crc kubenswrapper[4795]: I0219 21:47:28.099053 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/253d2f67-fdba-4a38-9b30-8544e6e54cc4-log-httpd\") pod \"ceilometer-0\" (UID: \"253d2f67-fdba-4a38-9b30-8544e6e54cc4\") " pod="openstack/ceilometer-0" Feb 19 21:47:28 crc kubenswrapper[4795]: I0219 21:47:28.099447 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/253d2f67-fdba-4a38-9b30-8544e6e54cc4-run-httpd\") pod \"ceilometer-0\" (UID: \"253d2f67-fdba-4a38-9b30-8544e6e54cc4\") " pod="openstack/ceilometer-0" Feb 19 21:47:28 crc kubenswrapper[4795]: I0219 21:47:28.105020 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/253d2f67-fdba-4a38-9b30-8544e6e54cc4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"253d2f67-fdba-4a38-9b30-8544e6e54cc4\") " pod="openstack/ceilometer-0" Feb 19 21:47:28 crc 
kubenswrapper[4795]: I0219 21:47:28.106590 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/253d2f67-fdba-4a38-9b30-8544e6e54cc4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"253d2f67-fdba-4a38-9b30-8544e6e54cc4\") " pod="openstack/ceilometer-0" Feb 19 21:47:28 crc kubenswrapper[4795]: I0219 21:47:28.106862 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/253d2f67-fdba-4a38-9b30-8544e6e54cc4-config-data\") pod \"ceilometer-0\" (UID: \"253d2f67-fdba-4a38-9b30-8544e6e54cc4\") " pod="openstack/ceilometer-0" Feb 19 21:47:28 crc kubenswrapper[4795]: I0219 21:47:28.109029 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/253d2f67-fdba-4a38-9b30-8544e6e54cc4-scripts\") pod \"ceilometer-0\" (UID: \"253d2f67-fdba-4a38-9b30-8544e6e54cc4\") " pod="openstack/ceilometer-0" Feb 19 21:47:28 crc kubenswrapper[4795]: I0219 21:47:28.122107 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqzt6\" (UniqueName: \"kubernetes.io/projected/253d2f67-fdba-4a38-9b30-8544e6e54cc4-kube-api-access-hqzt6\") pod \"ceilometer-0\" (UID: \"253d2f67-fdba-4a38-9b30-8544e6e54cc4\") " pod="openstack/ceilometer-0" Feb 19 21:47:28 crc kubenswrapper[4795]: I0219 21:47:28.231191 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:47:28 crc kubenswrapper[4795]: I0219 21:47:28.427282 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 21:47:28 crc kubenswrapper[4795]: I0219 21:47:28.430770 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 21:47:28 crc kubenswrapper[4795]: I0219 21:47:28.430827 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" Feb 19 21:47:28 crc kubenswrapper[4795]: I0219 21:47:28.442128 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"26a524895f2f97a5543d7713a3a6fad00cc54e588f3f27507aef436c0255d593"} pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 21:47:28 crc kubenswrapper[4795]: I0219 21:47:28.442209 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" containerID="cri-o://26a524895f2f97a5543d7713a3a6fad00cc54e588f3f27507aef436c0255d593" gracePeriod=600 Feb 19 21:47:28 crc kubenswrapper[4795]: I0219 21:47:28.753508 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:47:28 
crc kubenswrapper[4795]: I0219 21:47:28.952202 4795 generic.go:334] "Generic (PLEG): container finished" podID="6952c796-d85e-49b3-b931-60966311a0c0" containerID="df968256162833d5078e440bc65555f6f0195c60776c40f5962f3cbd9e8c0552" exitCode=0 Feb 19 21:47:28 crc kubenswrapper[4795]: I0219 21:47:28.952287 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-e48f-account-create-update-48v7f" event={"ID":"6952c796-d85e-49b3-b931-60966311a0c0","Type":"ContainerDied","Data":"df968256162833d5078e440bc65555f6f0195c60776c40f5962f3cbd9e8c0552"} Feb 19 21:47:28 crc kubenswrapper[4795]: I0219 21:47:28.954055 4795 generic.go:334] "Generic (PLEG): container finished" podID="29d85454-a8db-47bc-b616-bbdb4f6d8920" containerID="a0f7c3f34c80b9a269fbc6418536c8bd187965bff2a033810a317f2452cc049c" exitCode=0 Feb 19 21:47:28 crc kubenswrapper[4795]: I0219 21:47:28.954175 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-h72xz" event={"ID":"29d85454-a8db-47bc-b616-bbdb4f6d8920","Type":"ContainerDied","Data":"a0f7c3f34c80b9a269fbc6418536c8bd187965bff2a033810a317f2452cc049c"} Feb 19 21:47:28 crc kubenswrapper[4795]: I0219 21:47:28.955698 4795 generic.go:334] "Generic (PLEG): container finished" podID="282595b2-0eaa-4deb-9af4-288241817325" containerID="5c8a8280c91cc71f3e85d0b7e361ab76b49d5efc392b87ec1bb737d4336102fe" exitCode=0 Feb 19 21:47:28 crc kubenswrapper[4795]: I0219 21:47:28.955721 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-2d62-account-create-update-jrx2c" event={"ID":"282595b2-0eaa-4deb-9af4-288241817325","Type":"ContainerDied","Data":"5c8a8280c91cc71f3e85d0b7e361ab76b49d5efc392b87ec1bb737d4336102fe"} Feb 19 21:47:28 crc kubenswrapper[4795]: I0219 21:47:28.958058 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"253d2f67-fdba-4a38-9b30-8544e6e54cc4","Type":"ContainerStarted","Data":"84b0bcbc0062b2eec5eb90cbad2c6d5b12462c44819f0b7f936e0b7ddb57186c"} Feb 19 21:47:28 crc kubenswrapper[4795]: I0219 21:47:28.960593 4795 generic.go:334] "Generic (PLEG): container finished" podID="3778f66e-fd7f-4af5-ae3e-2a7c272785a0" containerID="e324028f2d7a41155077f14e0d48b2d58c21ebdbf00e4a7e4bd8a8141187b6bd" exitCode=0 Feb 19 21:47:28 crc kubenswrapper[4795]: I0219 21:47:28.960632 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-p7s8n" event={"ID":"3778f66e-fd7f-4af5-ae3e-2a7c272785a0","Type":"ContainerDied","Data":"e324028f2d7a41155077f14e0d48b2d58c21ebdbf00e4a7e4bd8a8141187b6bd"} Feb 19 21:47:28 crc kubenswrapper[4795]: I0219 21:47:28.964649 4795 generic.go:334] "Generic (PLEG): container finished" podID="7591bc58-96f5-486a-8653-0ad93938b019" containerID="26a524895f2f97a5543d7713a3a6fad00cc54e588f3f27507aef436c0255d593" exitCode=0 Feb 19 21:47:28 crc kubenswrapper[4795]: I0219 21:47:28.964709 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerDied","Data":"26a524895f2f97a5543d7713a3a6fad00cc54e588f3f27507aef436c0255d593"} Feb 19 21:47:28 crc kubenswrapper[4795]: I0219 21:47:28.964741 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerStarted","Data":"d1fe8d148e55484c7e32b6632ef0256602ab969af6bc815e1058a95087794811"} Feb 19 21:47:28 crc kubenswrapper[4795]: I0219 21:47:28.964755 4795 scope.go:117] "RemoveContainer" containerID="baca31a8ff8f8b420ab1c2fee031dade1a9efccb0543c74090535aa06f41da2f" Feb 19 21:47:28 crc kubenswrapper[4795]: I0219 21:47:28.966666 4795 generic.go:334] "Generic (PLEG): container finished" podID="573a7aa5-43d9-4523-8eea-4c1a36da49fb" 
containerID="a5072d3cb490beed9fbcf82cc94732aed7e308348a7a93463173543a97f32666" exitCode=0 Feb 19 21:47:28 crc kubenswrapper[4795]: I0219 21:47:28.966736 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-r8v4f" event={"ID":"573a7aa5-43d9-4523-8eea-4c1a36da49fb","Type":"ContainerDied","Data":"a5072d3cb490beed9fbcf82cc94732aed7e308348a7a93463173543a97f32666"} Feb 19 21:47:28 crc kubenswrapper[4795]: I0219 21:47:28.968653 4795 generic.go:334] "Generic (PLEG): container finished" podID="1946f4fd-5254-4e66-8739-5a51af23e963" containerID="c82c86becfd7208e347af805b27fa40c1a5698022b6509cd540c7832c74ea578" exitCode=0 Feb 19 21:47:28 crc kubenswrapper[4795]: I0219 21:47:28.968711 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-1922-account-create-update-bnqt2" event={"ID":"1946f4fd-5254-4e66-8739-5a51af23e963","Type":"ContainerDied","Data":"c82c86becfd7208e347af805b27fa40c1a5698022b6509cd540c7832c74ea578"} Feb 19 21:47:29 crc kubenswrapper[4795]: I0219 21:47:29.524019 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2b418ec-23ae-4edd-8e61-0522a69c6be4" path="/var/lib/kubelet/pods/d2b418ec-23ae-4edd-8e61-0522a69c6be4/volumes" Feb 19 21:47:29 crc kubenswrapper[4795]: I0219 21:47:29.892129 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 21:47:29 crc kubenswrapper[4795]: I0219 21:47:29.892440 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="b449064d-5c14-4362-ba7b-a24ee9292789" containerName="glance-log" containerID="cri-o://1cce505138da629126ee8b9c2d975891bbe03cb0751f01b86b7395429a4b3b1a" gracePeriod=30 Feb 19 21:47:29 crc kubenswrapper[4795]: I0219 21:47:29.892962 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="b449064d-5c14-4362-ba7b-a24ee9292789" 
containerName="glance-httpd" containerID="cri-o://24b4c0022da799cfde408a58dfc57a52b4277c9601d8bb59da3e5c565074603d" gracePeriod=30 Feb 19 21:47:29 crc kubenswrapper[4795]: I0219 21:47:29.981735 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"253d2f67-fdba-4a38-9b30-8544e6e54cc4","Type":"ContainerStarted","Data":"92a2e0a3dc40a655fd29047a14680db6caca8d95bdb8b99de07de6f7f007aee5"} Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.485424 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="ac1caac2-edf5-453d-a76d-e1c65b7f038b" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.150:9292/healthcheck\": dial tcp 10.217.0.150:9292: connect: connection refused" Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.485449 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="ac1caac2-edf5-453d-a76d-e1c65b7f038b" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.150:9292/healthcheck\": dial tcp 10.217.0.150:9292: connect: connection refused" Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.558840 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-r8v4f" Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.664454 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/573a7aa5-43d9-4523-8eea-4c1a36da49fb-operator-scripts\") pod \"573a7aa5-43d9-4523-8eea-4c1a36da49fb\" (UID: \"573a7aa5-43d9-4523-8eea-4c1a36da49fb\") " Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.664562 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhtwv\" (UniqueName: \"kubernetes.io/projected/573a7aa5-43d9-4523-8eea-4c1a36da49fb-kube-api-access-vhtwv\") pod \"573a7aa5-43d9-4523-8eea-4c1a36da49fb\" (UID: \"573a7aa5-43d9-4523-8eea-4c1a36da49fb\") " Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.665397 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/573a7aa5-43d9-4523-8eea-4c1a36da49fb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "573a7aa5-43d9-4523-8eea-4c1a36da49fb" (UID: "573a7aa5-43d9-4523-8eea-4c1a36da49fb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.690411 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/573a7aa5-43d9-4523-8eea-4c1a36da49fb-kube-api-access-vhtwv" (OuterVolumeSpecName: "kube-api-access-vhtwv") pod "573a7aa5-43d9-4523-8eea-4c1a36da49fb" (UID: "573a7aa5-43d9-4523-8eea-4c1a36da49fb"). InnerVolumeSpecName "kube-api-access-vhtwv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.767783 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/573a7aa5-43d9-4523-8eea-4c1a36da49fb-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.767819 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhtwv\" (UniqueName: \"kubernetes.io/projected/573a7aa5-43d9-4523-8eea-4c1a36da49fb-kube-api-access-vhtwv\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.797525 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-p7s8n" Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.817544 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-2d62-account-create-update-jrx2c" Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.821692 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-1922-account-create-update-bnqt2" Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.824022 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-h72xz" Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.830900 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-e48f-account-create-update-48v7f" Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.970962 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4q9l\" (UniqueName: \"kubernetes.io/projected/3778f66e-fd7f-4af5-ae3e-2a7c272785a0-kube-api-access-d4q9l\") pod \"3778f66e-fd7f-4af5-ae3e-2a7c272785a0\" (UID: \"3778f66e-fd7f-4af5-ae3e-2a7c272785a0\") " Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.971021 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-79mth\" (UniqueName: \"kubernetes.io/projected/282595b2-0eaa-4deb-9af4-288241817325-kube-api-access-79mth\") pod \"282595b2-0eaa-4deb-9af4-288241817325\" (UID: \"282595b2-0eaa-4deb-9af4-288241817325\") " Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.971060 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqs8n\" (UniqueName: \"kubernetes.io/projected/29d85454-a8db-47bc-b616-bbdb4f6d8920-kube-api-access-dqs8n\") pod \"29d85454-a8db-47bc-b616-bbdb4f6d8920\" (UID: \"29d85454-a8db-47bc-b616-bbdb4f6d8920\") " Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.971083 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvxgl\" (UniqueName: \"kubernetes.io/projected/6952c796-d85e-49b3-b931-60966311a0c0-kube-api-access-jvxgl\") pod \"6952c796-d85e-49b3-b931-60966311a0c0\" (UID: \"6952c796-d85e-49b3-b931-60966311a0c0\") " Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.971219 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6952c796-d85e-49b3-b931-60966311a0c0-operator-scripts\") pod \"6952c796-d85e-49b3-b931-60966311a0c0\" (UID: \"6952c796-d85e-49b3-b931-60966311a0c0\") " Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.972035 4795 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/282595b2-0eaa-4deb-9af4-288241817325-operator-scripts\") pod \"282595b2-0eaa-4deb-9af4-288241817325\" (UID: \"282595b2-0eaa-4deb-9af4-288241817325\") " Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.972075 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29d85454-a8db-47bc-b616-bbdb4f6d8920-operator-scripts\") pod \"29d85454-a8db-47bc-b616-bbdb4f6d8920\" (UID: \"29d85454-a8db-47bc-b616-bbdb4f6d8920\") " Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.972475 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6952c796-d85e-49b3-b931-60966311a0c0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6952c796-d85e-49b3-b931-60966311a0c0" (UID: "6952c796-d85e-49b3-b931-60966311a0c0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.972526 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/282595b2-0eaa-4deb-9af4-288241817325-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "282595b2-0eaa-4deb-9af4-288241817325" (UID: "282595b2-0eaa-4deb-9af4-288241817325"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.972648 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1946f4fd-5254-4e66-8739-5a51af23e963-operator-scripts\") pod \"1946f4fd-5254-4e66-8739-5a51af23e963\" (UID: \"1946f4fd-5254-4e66-8739-5a51af23e963\") " Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.972678 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29d85454-a8db-47bc-b616-bbdb4f6d8920-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "29d85454-a8db-47bc-b616-bbdb4f6d8920" (UID: "29d85454-a8db-47bc-b616-bbdb4f6d8920"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.972676 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3778f66e-fd7f-4af5-ae3e-2a7c272785a0-operator-scripts\") pod \"3778f66e-fd7f-4af5-ae3e-2a7c272785a0\" (UID: \"3778f66e-fd7f-4af5-ae3e-2a7c272785a0\") " Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.972722 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dkn9b\" (UniqueName: \"kubernetes.io/projected/1946f4fd-5254-4e66-8739-5a51af23e963-kube-api-access-dkn9b\") pod \"1946f4fd-5254-4e66-8739-5a51af23e963\" (UID: \"1946f4fd-5254-4e66-8739-5a51af23e963\") " Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.973017 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3778f66e-fd7f-4af5-ae3e-2a7c272785a0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3778f66e-fd7f-4af5-ae3e-2a7c272785a0" (UID: "3778f66e-fd7f-4af5-ae3e-2a7c272785a0"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.973023 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1946f4fd-5254-4e66-8739-5a51af23e963-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1946f4fd-5254-4e66-8739-5a51af23e963" (UID: "1946f4fd-5254-4e66-8739-5a51af23e963"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.973448 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6952c796-d85e-49b3-b931-60966311a0c0-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.973487 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/282595b2-0eaa-4deb-9af4-288241817325-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.973498 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29d85454-a8db-47bc-b616-bbdb4f6d8920-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.973507 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1946f4fd-5254-4e66-8739-5a51af23e963-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.973515 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3778f66e-fd7f-4af5-ae3e-2a7c272785a0-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.976752 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/projected/29d85454-a8db-47bc-b616-bbdb4f6d8920-kube-api-access-dqs8n" (OuterVolumeSpecName: "kube-api-access-dqs8n") pod "29d85454-a8db-47bc-b616-bbdb4f6d8920" (UID: "29d85454-a8db-47bc-b616-bbdb4f6d8920"). InnerVolumeSpecName "kube-api-access-dqs8n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.977497 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/282595b2-0eaa-4deb-9af4-288241817325-kube-api-access-79mth" (OuterVolumeSpecName: "kube-api-access-79mth") pod "282595b2-0eaa-4deb-9af4-288241817325" (UID: "282595b2-0eaa-4deb-9af4-288241817325"). InnerVolumeSpecName "kube-api-access-79mth". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.977774 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3778f66e-fd7f-4af5-ae3e-2a7c272785a0-kube-api-access-d4q9l" (OuterVolumeSpecName: "kube-api-access-d4q9l") pod "3778f66e-fd7f-4af5-ae3e-2a7c272785a0" (UID: "3778f66e-fd7f-4af5-ae3e-2a7c272785a0"). InnerVolumeSpecName "kube-api-access-d4q9l". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.982146 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6952c796-d85e-49b3-b931-60966311a0c0-kube-api-access-jvxgl" (OuterVolumeSpecName: "kube-api-access-jvxgl") pod "6952c796-d85e-49b3-b931-60966311a0c0" (UID: "6952c796-d85e-49b3-b931-60966311a0c0"). InnerVolumeSpecName "kube-api-access-jvxgl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.983118 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1946f4fd-5254-4e66-8739-5a51af23e963-kube-api-access-dkn9b" (OuterVolumeSpecName: "kube-api-access-dkn9b") pod "1946f4fd-5254-4e66-8739-5a51af23e963" (UID: "1946f4fd-5254-4e66-8739-5a51af23e963"). InnerVolumeSpecName "kube-api-access-dkn9b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.999768 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-h72xz" event={"ID":"29d85454-a8db-47bc-b616-bbdb4f6d8920","Type":"ContainerDied","Data":"af2bb649161fdcaac08f1bb77ef6e41cc4143b0e27292cc1c3c0624b994df5bf"} Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.999807 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af2bb649161fdcaac08f1bb77ef6e41cc4143b0e27292cc1c3c0624b994df5bf" Feb 19 21:47:30 crc kubenswrapper[4795]: I0219 21:47:30.999891 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-h72xz" Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.004642 4795 generic.go:334] "Generic (PLEG): container finished" podID="b449064d-5c14-4362-ba7b-a24ee9292789" containerID="1cce505138da629126ee8b9c2d975891bbe03cb0751f01b86b7395429a4b3b1a" exitCode=143 Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.004753 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b449064d-5c14-4362-ba7b-a24ee9292789","Type":"ContainerDied","Data":"1cce505138da629126ee8b9c2d975891bbe03cb0751f01b86b7395429a4b3b1a"} Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.006432 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-2d62-account-create-update-jrx2c" event={"ID":"282595b2-0eaa-4deb-9af4-288241817325","Type":"ContainerDied","Data":"f5a89b0c123f3a7776b51e9bb99d6cb38a702a838a7a412b9f1def886d685c88"} Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.006520 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5a89b0c123f3a7776b51e9bb99d6cb38a702a838a7a412b9f1def886d685c88" Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.006652 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-2d62-account-create-update-jrx2c" Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.014410 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-r8v4f" event={"ID":"573a7aa5-43d9-4523-8eea-4c1a36da49fb","Type":"ContainerDied","Data":"b1129ce7375e075c1c1844910d66cfc1b6308074f1d6f73dea4c3d974a9f4054"} Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.014451 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b1129ce7375e075c1c1844910d66cfc1b6308074f1d6f73dea4c3d974a9f4054" Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.014467 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-r8v4f" Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.018413 4795 generic.go:334] "Generic (PLEG): container finished" podID="ac1caac2-edf5-453d-a76d-e1c65b7f038b" containerID="32a8f97c4bdeeea5fdb4b24c48b5ee8e3dda515976e342ae4939a42ab5261eec" exitCode=0 Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.018487 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ac1caac2-edf5-453d-a76d-e1c65b7f038b","Type":"ContainerDied","Data":"32a8f97c4bdeeea5fdb4b24c48b5ee8e3dda515976e342ae4939a42ab5261eec"} Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.044753 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-e48f-account-create-update-48v7f" event={"ID":"6952c796-d85e-49b3-b931-60966311a0c0","Type":"ContainerDied","Data":"0748dc0aa50d4d39dfbdabcf9ed72680e0e959eb3eb965f01e22779a3df14276"} Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.044786 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0748dc0aa50d4d39dfbdabcf9ed72680e0e959eb3eb965f01e22779a3df14276" Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.044838 4795 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-e48f-account-create-update-48v7f" Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.050943 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"253d2f67-fdba-4a38-9b30-8544e6e54cc4","Type":"ContainerStarted","Data":"1d73a882957bce93c029299c95b3bfddc0c0d2a2db28a352e94f0a5c556118f9"} Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.056441 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.058674 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-p7s8n" Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.058970 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-p7s8n" event={"ID":"3778f66e-fd7f-4af5-ae3e-2a7c272785a0","Type":"ContainerDied","Data":"68df0b7ecf93ce028557646109c36262df3e9ce4a929fd966aed1e0fe5f5daff"} Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.058999 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68df0b7ecf93ce028557646109c36262df3e9ce4a929fd966aed1e0fe5f5daff" Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.060914 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-1922-account-create-update-bnqt2" event={"ID":"1946f4fd-5254-4e66-8739-5a51af23e963","Type":"ContainerDied","Data":"1c20c0048ca6ea52ec70d130bc06285f1a2ec973f63fcbb08a3874e51e60bfaa"} Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.060951 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c20c0048ca6ea52ec70d130bc06285f1a2ec973f63fcbb08a3874e51e60bfaa" Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.061011 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-1922-account-create-update-bnqt2" Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.075107 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dkn9b\" (UniqueName: \"kubernetes.io/projected/1946f4fd-5254-4e66-8739-5a51af23e963-kube-api-access-dkn9b\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.075131 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4q9l\" (UniqueName: \"kubernetes.io/projected/3778f66e-fd7f-4af5-ae3e-2a7c272785a0-kube-api-access-d4q9l\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.075141 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-79mth\" (UniqueName: \"kubernetes.io/projected/282595b2-0eaa-4deb-9af4-288241817325-kube-api-access-79mth\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.075150 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqs8n\" (UniqueName: \"kubernetes.io/projected/29d85454-a8db-47bc-b616-bbdb4f6d8920-kube-api-access-dqs8n\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.075158 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvxgl\" (UniqueName: \"kubernetes.io/projected/6952c796-d85e-49b3-b931-60966311a0c0-kube-api-access-jvxgl\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.176605 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac1caac2-edf5-453d-a76d-e1c65b7f038b-logs\") pod \"ac1caac2-edf5-453d-a76d-e1c65b7f038b\" (UID: \"ac1caac2-edf5-453d-a76d-e1c65b7f038b\") " Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.176966 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/ac1caac2-edf5-453d-a76d-e1c65b7f038b-public-tls-certs\") pod \"ac1caac2-edf5-453d-a76d-e1c65b7f038b\" (UID: \"ac1caac2-edf5-453d-a76d-e1c65b7f038b\") " Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.177040 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac1caac2-edf5-453d-a76d-e1c65b7f038b-config-data\") pod \"ac1caac2-edf5-453d-a76d-e1c65b7f038b\" (UID: \"ac1caac2-edf5-453d-a76d-e1c65b7f038b\") " Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.177076 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ac1caac2-edf5-453d-a76d-e1c65b7f038b\" (UID: \"ac1caac2-edf5-453d-a76d-e1c65b7f038b\") " Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.177105 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac1caac2-edf5-453d-a76d-e1c65b7f038b-combined-ca-bundle\") pod \"ac1caac2-edf5-453d-a76d-e1c65b7f038b\" (UID: \"ac1caac2-edf5-453d-a76d-e1c65b7f038b\") " Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.177123 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ac1caac2-edf5-453d-a76d-e1c65b7f038b-httpd-run\") pod \"ac1caac2-edf5-453d-a76d-e1c65b7f038b\" (UID: \"ac1caac2-edf5-453d-a76d-e1c65b7f038b\") " Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.177156 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac1caac2-edf5-453d-a76d-e1c65b7f038b-scripts\") pod \"ac1caac2-edf5-453d-a76d-e1c65b7f038b\" (UID: \"ac1caac2-edf5-453d-a76d-e1c65b7f038b\") " Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.177176 4795 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac1caac2-edf5-453d-a76d-e1c65b7f038b-logs" (OuterVolumeSpecName: "logs") pod "ac1caac2-edf5-453d-a76d-e1c65b7f038b" (UID: "ac1caac2-edf5-453d-a76d-e1c65b7f038b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.177207 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hz2gw\" (UniqueName: \"kubernetes.io/projected/ac1caac2-edf5-453d-a76d-e1c65b7f038b-kube-api-access-hz2gw\") pod \"ac1caac2-edf5-453d-a76d-e1c65b7f038b\" (UID: \"ac1caac2-edf5-453d-a76d-e1c65b7f038b\") " Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.177359 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac1caac2-edf5-453d-a76d-e1c65b7f038b-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ac1caac2-edf5-453d-a76d-e1c65b7f038b" (UID: "ac1caac2-edf5-453d-a76d-e1c65b7f038b"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.177628 4795 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ac1caac2-edf5-453d-a76d-e1c65b7f038b-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.182128 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac1caac2-edf5-453d-a76d-e1c65b7f038b-scripts" (OuterVolumeSpecName: "scripts") pod "ac1caac2-edf5-453d-a76d-e1c65b7f038b" (UID: "ac1caac2-edf5-453d-a76d-e1c65b7f038b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.185302 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac1caac2-edf5-453d-a76d-e1c65b7f038b-kube-api-access-hz2gw" (OuterVolumeSpecName: "kube-api-access-hz2gw") pod "ac1caac2-edf5-453d-a76d-e1c65b7f038b" (UID: "ac1caac2-edf5-453d-a76d-e1c65b7f038b"). InnerVolumeSpecName "kube-api-access-hz2gw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.187347 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "ac1caac2-edf5-453d-a76d-e1c65b7f038b" (UID: "ac1caac2-edf5-453d-a76d-e1c65b7f038b"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.227238 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac1caac2-edf5-453d-a76d-e1c65b7f038b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ac1caac2-edf5-453d-a76d-e1c65b7f038b" (UID: "ac1caac2-edf5-453d-a76d-e1c65b7f038b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.241154 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac1caac2-edf5-453d-a76d-e1c65b7f038b-config-data" (OuterVolumeSpecName: "config-data") pod "ac1caac2-edf5-453d-a76d-e1c65b7f038b" (UID: "ac1caac2-edf5-453d-a76d-e1c65b7f038b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.247387 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac1caac2-edf5-453d-a76d-e1c65b7f038b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ac1caac2-edf5-453d-a76d-e1c65b7f038b" (UID: "ac1caac2-edf5-453d-a76d-e1c65b7f038b"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.281790 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac1caac2-edf5-453d-a76d-e1c65b7f038b-logs\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.281839 4795 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac1caac2-edf5-453d-a76d-e1c65b7f038b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.281854 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac1caac2-edf5-453d-a76d-e1c65b7f038b-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.281901 4795 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.281914 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac1caac2-edf5-453d-a76d-e1c65b7f038b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.281924 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/ac1caac2-edf5-453d-a76d-e1c65b7f038b-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.281933 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hz2gw\" (UniqueName: \"kubernetes.io/projected/ac1caac2-edf5-453d-a76d-e1c65b7f038b-kube-api-access-hz2gw\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.306926 4795 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.383366 4795 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:31 crc kubenswrapper[4795]: I0219 21:47:31.603772 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.071080 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ac1caac2-edf5-453d-a76d-e1c65b7f038b","Type":"ContainerDied","Data":"3b74237da4db43dc7c546ec4709a7afdab3dbcd047ed3e21f44f8f9ee5a66753"} Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.071120 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.071137 4795 scope.go:117] "RemoveContainer" containerID="32a8f97c4bdeeea5fdb4b24c48b5ee8e3dda515976e342ae4939a42ab5261eec" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.075845 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"253d2f67-fdba-4a38-9b30-8544e6e54cc4","Type":"ContainerStarted","Data":"bf60e2985b3070dfff9d45f1b36e7c5e63972e743b6bc5b2679f84fa20f46972"} Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.096627 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.108055 4795 scope.go:117] "RemoveContainer" containerID="7826593332e6fe0d625ec77b566a78383702574fe609c8ca89088f745857f981" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.109760 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.122768 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 21:47:32 crc kubenswrapper[4795]: E0219 21:47:32.123332 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac1caac2-edf5-453d-a76d-e1c65b7f038b" containerName="glance-httpd" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.123356 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac1caac2-edf5-453d-a76d-e1c65b7f038b" containerName="glance-httpd" Feb 19 21:47:32 crc kubenswrapper[4795]: E0219 21:47:32.123377 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="282595b2-0eaa-4deb-9af4-288241817325" containerName="mariadb-account-create-update" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.123386 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="282595b2-0eaa-4deb-9af4-288241817325" 
containerName="mariadb-account-create-update" Feb 19 21:47:32 crc kubenswrapper[4795]: E0219 21:47:32.123395 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="573a7aa5-43d9-4523-8eea-4c1a36da49fb" containerName="mariadb-database-create" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.123403 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="573a7aa5-43d9-4523-8eea-4c1a36da49fb" containerName="mariadb-database-create" Feb 19 21:47:32 crc kubenswrapper[4795]: E0219 21:47:32.123417 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29d85454-a8db-47bc-b616-bbdb4f6d8920" containerName="mariadb-database-create" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.123424 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="29d85454-a8db-47bc-b616-bbdb4f6d8920" containerName="mariadb-database-create" Feb 19 21:47:32 crc kubenswrapper[4795]: E0219 21:47:32.123440 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6952c796-d85e-49b3-b931-60966311a0c0" containerName="mariadb-account-create-update" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.123448 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6952c796-d85e-49b3-b931-60966311a0c0" containerName="mariadb-account-create-update" Feb 19 21:47:32 crc kubenswrapper[4795]: E0219 21:47:32.123461 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1946f4fd-5254-4e66-8739-5a51af23e963" containerName="mariadb-account-create-update" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.123468 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="1946f4fd-5254-4e66-8739-5a51af23e963" containerName="mariadb-account-create-update" Feb 19 21:47:32 crc kubenswrapper[4795]: E0219 21:47:32.123488 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac1caac2-edf5-453d-a76d-e1c65b7f038b" containerName="glance-log" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.123496 4795 
state_mem.go:107] "Deleted CPUSet assignment" podUID="ac1caac2-edf5-453d-a76d-e1c65b7f038b" containerName="glance-log" Feb 19 21:47:32 crc kubenswrapper[4795]: E0219 21:47:32.123505 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3778f66e-fd7f-4af5-ae3e-2a7c272785a0" containerName="mariadb-database-create" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.123513 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="3778f66e-fd7f-4af5-ae3e-2a7c272785a0" containerName="mariadb-database-create" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.123726 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="29d85454-a8db-47bc-b616-bbdb4f6d8920" containerName="mariadb-database-create" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.123750 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="1946f4fd-5254-4e66-8739-5a51af23e963" containerName="mariadb-account-create-update" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.123767 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="3778f66e-fd7f-4af5-ae3e-2a7c272785a0" containerName="mariadb-database-create" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.123779 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="573a7aa5-43d9-4523-8eea-4c1a36da49fb" containerName="mariadb-database-create" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.123794 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="282595b2-0eaa-4deb-9af4-288241817325" containerName="mariadb-account-create-update" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.123807 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac1caac2-edf5-453d-a76d-e1c65b7f038b" containerName="glance-log" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.123816 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac1caac2-edf5-453d-a76d-e1c65b7f038b" containerName="glance-httpd" 
Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.123830 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6952c796-d85e-49b3-b931-60966311a0c0" containerName="mariadb-account-create-update" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.125140 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.127398 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.127627 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.149054 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.298222 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-scripts\") pod \"glance-default-external-api-0\" (UID: \"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\") " pod="openstack/glance-default-external-api-0" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.298279 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-config-data\") pod \"glance-default-external-api-0\" (UID: \"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\") " pod="openstack/glance-default-external-api-0" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.298304 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-combined-ca-bundle\") 
pod \"glance-default-external-api-0\" (UID: \"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\") " pod="openstack/glance-default-external-api-0" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.298447 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-logs\") pod \"glance-default-external-api-0\" (UID: \"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\") " pod="openstack/glance-default-external-api-0" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.298658 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8fsc\" (UniqueName: \"kubernetes.io/projected/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-kube-api-access-x8fsc\") pod \"glance-default-external-api-0\" (UID: \"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\") " pod="openstack/glance-default-external-api-0" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.298809 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\") " pod="openstack/glance-default-external-api-0" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.298883 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\") " pod="openstack/glance-default-external-api-0" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.299046 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\") " pod="openstack/glance-default-external-api-0" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.400511 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\") " pod="openstack/glance-default-external-api-0" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.400580 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-scripts\") pod \"glance-default-external-api-0\" (UID: \"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\") " pod="openstack/glance-default-external-api-0" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.400614 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-config-data\") pod \"glance-default-external-api-0\" (UID: \"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\") " pod="openstack/glance-default-external-api-0" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.400634 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\") " pod="openstack/glance-default-external-api-0" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.400661 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-logs\") pod \"glance-default-external-api-0\" (UID: 
\"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\") " pod="openstack/glance-default-external-api-0" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.400705 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8fsc\" (UniqueName: \"kubernetes.io/projected/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-kube-api-access-x8fsc\") pod \"glance-default-external-api-0\" (UID: \"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\") " pod="openstack/glance-default-external-api-0" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.400742 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\") " pod="openstack/glance-default-external-api-0" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.400765 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\") " pod="openstack/glance-default-external-api-0" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.401355 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\") " pod="openstack/glance-default-external-api-0" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.401783 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\") device mount path 
\"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.402688 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-logs\") pod \"glance-default-external-api-0\" (UID: \"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\") " pod="openstack/glance-default-external-api-0" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.409743 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-config-data\") pod \"glance-default-external-api-0\" (UID: \"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\") " pod="openstack/glance-default-external-api-0" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.410793 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\") " pod="openstack/glance-default-external-api-0" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.410986 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-scripts\") pod \"glance-default-external-api-0\" (UID: \"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\") " pod="openstack/glance-default-external-api-0" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.411265 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\") " pod="openstack/glance-default-external-api-0" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 
21:47:32.417942 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8fsc\" (UniqueName: \"kubernetes.io/projected/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-kube-api-access-x8fsc\") pod \"glance-default-external-api-0\" (UID: \"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\") " pod="openstack/glance-default-external-api-0" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.427889 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\") " pod="openstack/glance-default-external-api-0" Feb 19 21:47:32 crc kubenswrapper[4795]: I0219 21:47:32.457400 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 21:47:33 crc kubenswrapper[4795]: I0219 21:47:33.085915 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"253d2f67-fdba-4a38-9b30-8544e6e54cc4","Type":"ContainerStarted","Data":"78614fb581b9ca38ffccfdb9316d7293579f666e082d3786d4b25f75d92eddaa"} Feb 19 21:47:33 crc kubenswrapper[4795]: I0219 21:47:33.086335 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 19 21:47:33 crc kubenswrapper[4795]: I0219 21:47:33.086246 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="253d2f67-fdba-4a38-9b30-8544e6e54cc4" containerName="proxy-httpd" containerID="cri-o://78614fb581b9ca38ffccfdb9316d7293579f666e082d3786d4b25f75d92eddaa" gracePeriod=30 Feb 19 21:47:33 crc kubenswrapper[4795]: I0219 21:47:33.086114 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="253d2f67-fdba-4a38-9b30-8544e6e54cc4" containerName="ceilometer-central-agent" 
containerID="cri-o://92a2e0a3dc40a655fd29047a14680db6caca8d95bdb8b99de07de6f7f007aee5" gracePeriod=30 Feb 19 21:47:33 crc kubenswrapper[4795]: I0219 21:47:33.086287 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="253d2f67-fdba-4a38-9b30-8544e6e54cc4" containerName="ceilometer-notification-agent" containerID="cri-o://1d73a882957bce93c029299c95b3bfddc0c0d2a2db28a352e94f0a5c556118f9" gracePeriod=30 Feb 19 21:47:33 crc kubenswrapper[4795]: I0219 21:47:33.086295 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="253d2f67-fdba-4a38-9b30-8544e6e54cc4" containerName="sg-core" containerID="cri-o://bf60e2985b3070dfff9d45f1b36e7c5e63972e743b6bc5b2679f84fa20f46972" gracePeriod=30 Feb 19 21:47:33 crc kubenswrapper[4795]: I0219 21:47:33.110241 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.338536036 podStartE2EDuration="6.110227328s" podCreationTimestamp="2026-02-19 21:47:27 +0000 UTC" firstStartedPulling="2026-02-19 21:47:28.779058868 +0000 UTC m=+1159.971576722" lastFinishedPulling="2026-02-19 21:47:32.55075015 +0000 UTC m=+1163.743268014" observedRunningTime="2026-02-19 21:47:33.109452157 +0000 UTC m=+1164.301970021" watchObservedRunningTime="2026-02-19 21:47:33.110227328 +0000 UTC m=+1164.302745192" Feb 19 21:47:33 crc kubenswrapper[4795]: I0219 21:47:33.180836 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 21:47:33 crc kubenswrapper[4795]: I0219 21:47:33.520046 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac1caac2-edf5-453d-a76d-e1c65b7f038b" path="/var/lib/kubelet/pods/ac1caac2-edf5-453d-a76d-e1c65b7f038b/volumes" Feb 19 21:47:33 crc kubenswrapper[4795]: I0219 21:47:33.556354 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" 
podUID="b449064d-5c14-4362-ba7b-a24ee9292789" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.149:9292/healthcheck\": read tcp 10.217.0.2:60368->10.217.0.149:9292: read: connection reset by peer" Feb 19 21:47:33 crc kubenswrapper[4795]: I0219 21:47:33.556543 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="b449064d-5c14-4362-ba7b-a24ee9292789" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.149:9292/healthcheck\": read tcp 10.217.0.2:60376->10.217.0.149:9292: read: connection reset by peer" Feb 19 21:47:33 crc kubenswrapper[4795]: I0219 21:47:33.984107 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.106887 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9","Type":"ContainerStarted","Data":"46cb4f812b6ae6e3f71ffef829cc67ce3250106de6851d56a2c280735844fd0f"} Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.107405 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9","Type":"ContainerStarted","Data":"6d8d28f68ae7a05b3b24448d485df065e39bc0509b04817346db5c0af58598b8"} Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.110797 4795 generic.go:334] "Generic (PLEG): container finished" podID="b449064d-5c14-4362-ba7b-a24ee9292789" containerID="24b4c0022da799cfde408a58dfc57a52b4277c9601d8bb59da3e5c565074603d" exitCode=0 Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.110875 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"b449064d-5c14-4362-ba7b-a24ee9292789","Type":"ContainerDied","Data":"24b4c0022da799cfde408a58dfc57a52b4277c9601d8bb59da3e5c565074603d"} Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.110895 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b449064d-5c14-4362-ba7b-a24ee9292789","Type":"ContainerDied","Data":"fe3c35dfb7e24d88f06d64c3416cafe5c8ebe7a75022634f77450664da8f2158"} Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.110914 4795 scope.go:117] "RemoveContainer" containerID="24b4c0022da799cfde408a58dfc57a52b4277c9601d8bb59da3e5c565074603d" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.110931 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.126499 4795 generic.go:334] "Generic (PLEG): container finished" podID="253d2f67-fdba-4a38-9b30-8544e6e54cc4" containerID="78614fb581b9ca38ffccfdb9316d7293579f666e082d3786d4b25f75d92eddaa" exitCode=0 Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.126717 4795 generic.go:334] "Generic (PLEG): container finished" podID="253d2f67-fdba-4a38-9b30-8544e6e54cc4" containerID="bf60e2985b3070dfff9d45f1b36e7c5e63972e743b6bc5b2679f84fa20f46972" exitCode=2 Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.126806 4795 generic.go:334] "Generic (PLEG): container finished" podID="253d2f67-fdba-4a38-9b30-8544e6e54cc4" containerID="1d73a882957bce93c029299c95b3bfddc0c0d2a2db28a352e94f0a5c556118f9" exitCode=0 Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.126875 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"253d2f67-fdba-4a38-9b30-8544e6e54cc4","Type":"ContainerDied","Data":"78614fb581b9ca38ffccfdb9316d7293579f666e082d3786d4b25f75d92eddaa"} Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.126955 4795 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/ceilometer-0" event={"ID":"253d2f67-fdba-4a38-9b30-8544e6e54cc4","Type":"ContainerDied","Data":"bf60e2985b3070dfff9d45f1b36e7c5e63972e743b6bc5b2679f84fa20f46972"} Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.127016 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"253d2f67-fdba-4a38-9b30-8544e6e54cc4","Type":"ContainerDied","Data":"1d73a882957bce93c029299c95b3bfddc0c0d2a2db28a352e94f0a5c556118f9"} Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.139614 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b449064d-5c14-4362-ba7b-a24ee9292789-httpd-run\") pod \"b449064d-5c14-4362-ba7b-a24ee9292789\" (UID: \"b449064d-5c14-4362-ba7b-a24ee9292789\") " Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.139658 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b449064d-5c14-4362-ba7b-a24ee9292789-logs\") pod \"b449064d-5c14-4362-ba7b-a24ee9292789\" (UID: \"b449064d-5c14-4362-ba7b-a24ee9292789\") " Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.139685 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b449064d-5c14-4362-ba7b-a24ee9292789-config-data\") pod \"b449064d-5c14-4362-ba7b-a24ee9292789\" (UID: \"b449064d-5c14-4362-ba7b-a24ee9292789\") " Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.139748 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7zp5z\" (UniqueName: \"kubernetes.io/projected/b449064d-5c14-4362-ba7b-a24ee9292789-kube-api-access-7zp5z\") pod \"b449064d-5c14-4362-ba7b-a24ee9292789\" (UID: \"b449064d-5c14-4362-ba7b-a24ee9292789\") " Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.139781 4795 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"b449064d-5c14-4362-ba7b-a24ee9292789\" (UID: \"b449064d-5c14-4362-ba7b-a24ee9292789\") " Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.139808 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b449064d-5c14-4362-ba7b-a24ee9292789-internal-tls-certs\") pod \"b449064d-5c14-4362-ba7b-a24ee9292789\" (UID: \"b449064d-5c14-4362-ba7b-a24ee9292789\") " Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.139834 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b449064d-5c14-4362-ba7b-a24ee9292789-scripts\") pod \"b449064d-5c14-4362-ba7b-a24ee9292789\" (UID: \"b449064d-5c14-4362-ba7b-a24ee9292789\") " Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.139915 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b449064d-5c14-4362-ba7b-a24ee9292789-combined-ca-bundle\") pod \"b449064d-5c14-4362-ba7b-a24ee9292789\" (UID: \"b449064d-5c14-4362-ba7b-a24ee9292789\") " Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.144233 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b449064d-5c14-4362-ba7b-a24ee9292789-logs" (OuterVolumeSpecName: "logs") pod "b449064d-5c14-4362-ba7b-a24ee9292789" (UID: "b449064d-5c14-4362-ba7b-a24ee9292789"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.144529 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b449064d-5c14-4362-ba7b-a24ee9292789-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "b449064d-5c14-4362-ba7b-a24ee9292789" (UID: "b449064d-5c14-4362-ba7b-a24ee9292789"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.145806 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b449064d-5c14-4362-ba7b-a24ee9292789-kube-api-access-7zp5z" (OuterVolumeSpecName: "kube-api-access-7zp5z") pod "b449064d-5c14-4362-ba7b-a24ee9292789" (UID: "b449064d-5c14-4362-ba7b-a24ee9292789"). InnerVolumeSpecName "kube-api-access-7zp5z". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.149087 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "b449064d-5c14-4362-ba7b-a24ee9292789" (UID: "b449064d-5c14-4362-ba7b-a24ee9292789"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.149288 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b449064d-5c14-4362-ba7b-a24ee9292789-scripts" (OuterVolumeSpecName: "scripts") pod "b449064d-5c14-4362-ba7b-a24ee9292789" (UID: "b449064d-5c14-4362-ba7b-a24ee9292789"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.160463 4795 scope.go:117] "RemoveContainer" containerID="1cce505138da629126ee8b9c2d975891bbe03cb0751f01b86b7395429a4b3b1a" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.171346 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b449064d-5c14-4362-ba7b-a24ee9292789-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b449064d-5c14-4362-ba7b-a24ee9292789" (UID: "b449064d-5c14-4362-ba7b-a24ee9292789"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.184098 4795 scope.go:117] "RemoveContainer" containerID="24b4c0022da799cfde408a58dfc57a52b4277c9601d8bb59da3e5c565074603d" Feb 19 21:47:34 crc kubenswrapper[4795]: E0219 21:47:34.184596 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24b4c0022da799cfde408a58dfc57a52b4277c9601d8bb59da3e5c565074603d\": container with ID starting with 24b4c0022da799cfde408a58dfc57a52b4277c9601d8bb59da3e5c565074603d not found: ID does not exist" containerID="24b4c0022da799cfde408a58dfc57a52b4277c9601d8bb59da3e5c565074603d" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.184644 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24b4c0022da799cfde408a58dfc57a52b4277c9601d8bb59da3e5c565074603d"} err="failed to get container status \"24b4c0022da799cfde408a58dfc57a52b4277c9601d8bb59da3e5c565074603d\": rpc error: code = NotFound desc = could not find container \"24b4c0022da799cfde408a58dfc57a52b4277c9601d8bb59da3e5c565074603d\": container with ID starting with 24b4c0022da799cfde408a58dfc57a52b4277c9601d8bb59da3e5c565074603d not found: ID does not exist" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.184669 4795 scope.go:117] 
"RemoveContainer" containerID="1cce505138da629126ee8b9c2d975891bbe03cb0751f01b86b7395429a4b3b1a" Feb 19 21:47:34 crc kubenswrapper[4795]: E0219 21:47:34.185018 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cce505138da629126ee8b9c2d975891bbe03cb0751f01b86b7395429a4b3b1a\": container with ID starting with 1cce505138da629126ee8b9c2d975891bbe03cb0751f01b86b7395429a4b3b1a not found: ID does not exist" containerID="1cce505138da629126ee8b9c2d975891bbe03cb0751f01b86b7395429a4b3b1a" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.185041 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cce505138da629126ee8b9c2d975891bbe03cb0751f01b86b7395429a4b3b1a"} err="failed to get container status \"1cce505138da629126ee8b9c2d975891bbe03cb0751f01b86b7395429a4b3b1a\": rpc error: code = NotFound desc = could not find container \"1cce505138da629126ee8b9c2d975891bbe03cb0751f01b86b7395429a4b3b1a\": container with ID starting with 1cce505138da629126ee8b9c2d975891bbe03cb0751f01b86b7395429a4b3b1a not found: ID does not exist" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.195543 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b449064d-5c14-4362-ba7b-a24ee9292789-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b449064d-5c14-4362-ba7b-a24ee9292789" (UID: "b449064d-5c14-4362-ba7b-a24ee9292789"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.197968 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b449064d-5c14-4362-ba7b-a24ee9292789-config-data" (OuterVolumeSpecName: "config-data") pod "b449064d-5c14-4362-ba7b-a24ee9292789" (UID: "b449064d-5c14-4362-ba7b-a24ee9292789"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.242543 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b449064d-5c14-4362-ba7b-a24ee9292789-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.242583 4795 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b449064d-5c14-4362-ba7b-a24ee9292789-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.242597 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b449064d-5c14-4362-ba7b-a24ee9292789-logs\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.242609 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b449064d-5c14-4362-ba7b-a24ee9292789-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.242622 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7zp5z\" (UniqueName: \"kubernetes.io/projected/b449064d-5c14-4362-ba7b-a24ee9292789-kube-api-access-7zp5z\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.242659 4795 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.242672 4795 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b449064d-5c14-4362-ba7b-a24ee9292789-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.242692 4795 reconciler_common.go:293] 
"Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b449064d-5c14-4362-ba7b-a24ee9292789-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.268503 4795 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.344315 4795 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.489529 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.500733 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.513891 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 21:47:34 crc kubenswrapper[4795]: E0219 21:47:34.514328 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b449064d-5c14-4362-ba7b-a24ee9292789" containerName="glance-log" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.514351 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="b449064d-5c14-4362-ba7b-a24ee9292789" containerName="glance-log" Feb 19 21:47:34 crc kubenswrapper[4795]: E0219 21:47:34.514366 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b449064d-5c14-4362-ba7b-a24ee9292789" containerName="glance-httpd" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.514404 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="b449064d-5c14-4362-ba7b-a24ee9292789" containerName="glance-httpd" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.514632 4795 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="b449064d-5c14-4362-ba7b-a24ee9292789" containerName="glance-httpd" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.516033 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="b449064d-5c14-4362-ba7b-a24ee9292789" containerName="glance-log" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.517689 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.519801 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.520248 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.529493 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.649127 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3697a3b0-4077-4837-bcdc-c17d8aa361f1-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3697a3b0-4077-4837-bcdc-c17d8aa361f1\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.649433 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3697a3b0-4077-4837-bcdc-c17d8aa361f1-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3697a3b0-4077-4837-bcdc-c17d8aa361f1\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.649506 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/3697a3b0-4077-4837-bcdc-c17d8aa361f1-logs\") pod \"glance-default-internal-api-0\" (UID: \"3697a3b0-4077-4837-bcdc-c17d8aa361f1\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.649554 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3697a3b0-4077-4837-bcdc-c17d8aa361f1-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3697a3b0-4077-4837-bcdc-c17d8aa361f1\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.649598 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"3697a3b0-4077-4837-bcdc-c17d8aa361f1\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.649635 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3697a3b0-4077-4837-bcdc-c17d8aa361f1-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"3697a3b0-4077-4837-bcdc-c17d8aa361f1\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.649654 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3697a3b0-4077-4837-bcdc-c17d8aa361f1-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3697a3b0-4077-4837-bcdc-c17d8aa361f1\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.649692 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-bg5wb\" (UniqueName: \"kubernetes.io/projected/3697a3b0-4077-4837-bcdc-c17d8aa361f1-kube-api-access-bg5wb\") pod \"glance-default-internal-api-0\" (UID: \"3697a3b0-4077-4837-bcdc-c17d8aa361f1\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.751653 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"3697a3b0-4077-4837-bcdc-c17d8aa361f1\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.751723 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3697a3b0-4077-4837-bcdc-c17d8aa361f1-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"3697a3b0-4077-4837-bcdc-c17d8aa361f1\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.751741 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3697a3b0-4077-4837-bcdc-c17d8aa361f1-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3697a3b0-4077-4837-bcdc-c17d8aa361f1\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.751778 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bg5wb\" (UniqueName: \"kubernetes.io/projected/3697a3b0-4077-4837-bcdc-c17d8aa361f1-kube-api-access-bg5wb\") pod \"glance-default-internal-api-0\" (UID: \"3697a3b0-4077-4837-bcdc-c17d8aa361f1\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.751805 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/3697a3b0-4077-4837-bcdc-c17d8aa361f1-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3697a3b0-4077-4837-bcdc-c17d8aa361f1\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.751833 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3697a3b0-4077-4837-bcdc-c17d8aa361f1-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3697a3b0-4077-4837-bcdc-c17d8aa361f1\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.751862 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3697a3b0-4077-4837-bcdc-c17d8aa361f1-logs\") pod \"glance-default-internal-api-0\" (UID: \"3697a3b0-4077-4837-bcdc-c17d8aa361f1\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.751909 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3697a3b0-4077-4837-bcdc-c17d8aa361f1-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3697a3b0-4077-4837-bcdc-c17d8aa361f1\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.753248 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3697a3b0-4077-4837-bcdc-c17d8aa361f1-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3697a3b0-4077-4837-bcdc-c17d8aa361f1\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.753340 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: 
\"3697a3b0-4077-4837-bcdc-c17d8aa361f1\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.753508 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3697a3b0-4077-4837-bcdc-c17d8aa361f1-logs\") pod \"glance-default-internal-api-0\" (UID: \"3697a3b0-4077-4837-bcdc-c17d8aa361f1\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.758095 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3697a3b0-4077-4837-bcdc-c17d8aa361f1-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3697a3b0-4077-4837-bcdc-c17d8aa361f1\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.758781 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3697a3b0-4077-4837-bcdc-c17d8aa361f1-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"3697a3b0-4077-4837-bcdc-c17d8aa361f1\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.758815 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3697a3b0-4077-4837-bcdc-c17d8aa361f1-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3697a3b0-4077-4837-bcdc-c17d8aa361f1\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.759321 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3697a3b0-4077-4837-bcdc-c17d8aa361f1-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3697a3b0-4077-4837-bcdc-c17d8aa361f1\") " 
pod="openstack/glance-default-internal-api-0" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.771831 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bg5wb\" (UniqueName: \"kubernetes.io/projected/3697a3b0-4077-4837-bcdc-c17d8aa361f1-kube-api-access-bg5wb\") pod \"glance-default-internal-api-0\" (UID: \"3697a3b0-4077-4837-bcdc-c17d8aa361f1\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.777500 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"3697a3b0-4077-4837-bcdc-c17d8aa361f1\") " pod="openstack/glance-default-internal-api-0" Feb 19 21:47:34 crc kubenswrapper[4795]: I0219 21:47:34.836246 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 21:47:35 crc kubenswrapper[4795]: I0219 21:47:35.148672 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9","Type":"ContainerStarted","Data":"708e4d014b2d6219a997cb5284977900cb2aafb592d9023c95962ea2c3074217"} Feb 19 21:47:35 crc kubenswrapper[4795]: I0219 21:47:35.175423 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.175403891 podStartE2EDuration="3.175403891s" podCreationTimestamp="2026-02-19 21:47:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:47:35.171852721 +0000 UTC m=+1166.364370595" watchObservedRunningTime="2026-02-19 21:47:35.175403891 +0000 UTC m=+1166.367921745" Feb 19 21:47:35 crc kubenswrapper[4795]: W0219 21:47:35.420708 4795 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3697a3b0_4077_4837_bcdc_c17d8aa361f1.slice/crio-4b0f0860894e220641b68db9b622d33a66b09008a18bfa19149efafa413199c3 WatchSource:0}: Error finding container 4b0f0860894e220641b68db9b622d33a66b09008a18bfa19149efafa413199c3: Status 404 returned error can't find the container with id 4b0f0860894e220641b68db9b622d33a66b09008a18bfa19149efafa413199c3 Feb 19 21:47:35 crc kubenswrapper[4795]: I0219 21:47:35.422445 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 21:47:35 crc kubenswrapper[4795]: I0219 21:47:35.525251 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b449064d-5c14-4362-ba7b-a24ee9292789" path="/var/lib/kubelet/pods/b449064d-5c14-4362-ba7b-a24ee9292789/volumes" Feb 19 21:47:36 crc kubenswrapper[4795]: I0219 21:47:36.160615 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3697a3b0-4077-4837-bcdc-c17d8aa361f1","Type":"ContainerStarted","Data":"e797be576a87ea7d2cd1a10d4fb93c6e0f25a6a5bebf1abb85c7b6e12aa13e38"} Feb 19 21:47:36 crc kubenswrapper[4795]: I0219 21:47:36.160965 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3697a3b0-4077-4837-bcdc-c17d8aa361f1","Type":"ContainerStarted","Data":"4b0f0860894e220641b68db9b622d33a66b09008a18bfa19149efafa413199c3"} Feb 19 21:47:36 crc kubenswrapper[4795]: I0219 21:47:36.692109 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-7ssdv"] Feb 19 21:47:36 crc kubenswrapper[4795]: I0219 21:47:36.693153 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-7ssdv" Feb 19 21:47:36 crc kubenswrapper[4795]: I0219 21:47:36.696480 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Feb 19 21:47:36 crc kubenswrapper[4795]: I0219 21:47:36.696586 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 19 21:47:36 crc kubenswrapper[4795]: I0219 21:47:36.697851 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-b4vdh" Feb 19 21:47:36 crc kubenswrapper[4795]: I0219 21:47:36.706101 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-7ssdv"] Feb 19 21:47:36 crc kubenswrapper[4795]: I0219 21:47:36.785873 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef98a0b8-d6d9-4075-ae60-e7d614a79e7f-scripts\") pod \"nova-cell0-conductor-db-sync-7ssdv\" (UID: \"ef98a0b8-d6d9-4075-ae60-e7d614a79e7f\") " pod="openstack/nova-cell0-conductor-db-sync-7ssdv" Feb 19 21:47:36 crc kubenswrapper[4795]: I0219 21:47:36.786075 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef98a0b8-d6d9-4075-ae60-e7d614a79e7f-config-data\") pod \"nova-cell0-conductor-db-sync-7ssdv\" (UID: \"ef98a0b8-d6d9-4075-ae60-e7d614a79e7f\") " pod="openstack/nova-cell0-conductor-db-sync-7ssdv" Feb 19 21:47:36 crc kubenswrapper[4795]: I0219 21:47:36.786262 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef98a0b8-d6d9-4075-ae60-e7d614a79e7f-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-7ssdv\" (UID: \"ef98a0b8-d6d9-4075-ae60-e7d614a79e7f\") " 
pod="openstack/nova-cell0-conductor-db-sync-7ssdv" Feb 19 21:47:36 crc kubenswrapper[4795]: I0219 21:47:36.786351 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhcgt\" (UniqueName: \"kubernetes.io/projected/ef98a0b8-d6d9-4075-ae60-e7d614a79e7f-kube-api-access-zhcgt\") pod \"nova-cell0-conductor-db-sync-7ssdv\" (UID: \"ef98a0b8-d6d9-4075-ae60-e7d614a79e7f\") " pod="openstack/nova-cell0-conductor-db-sync-7ssdv" Feb 19 21:47:36 crc kubenswrapper[4795]: I0219 21:47:36.888140 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef98a0b8-d6d9-4075-ae60-e7d614a79e7f-scripts\") pod \"nova-cell0-conductor-db-sync-7ssdv\" (UID: \"ef98a0b8-d6d9-4075-ae60-e7d614a79e7f\") " pod="openstack/nova-cell0-conductor-db-sync-7ssdv" Feb 19 21:47:36 crc kubenswrapper[4795]: I0219 21:47:36.888245 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef98a0b8-d6d9-4075-ae60-e7d614a79e7f-config-data\") pod \"nova-cell0-conductor-db-sync-7ssdv\" (UID: \"ef98a0b8-d6d9-4075-ae60-e7d614a79e7f\") " pod="openstack/nova-cell0-conductor-db-sync-7ssdv" Feb 19 21:47:36 crc kubenswrapper[4795]: I0219 21:47:36.888289 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef98a0b8-d6d9-4075-ae60-e7d614a79e7f-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-7ssdv\" (UID: \"ef98a0b8-d6d9-4075-ae60-e7d614a79e7f\") " pod="openstack/nova-cell0-conductor-db-sync-7ssdv" Feb 19 21:47:36 crc kubenswrapper[4795]: I0219 21:47:36.888317 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhcgt\" (UniqueName: \"kubernetes.io/projected/ef98a0b8-d6d9-4075-ae60-e7d614a79e7f-kube-api-access-zhcgt\") pod \"nova-cell0-conductor-db-sync-7ssdv\" (UID: 
\"ef98a0b8-d6d9-4075-ae60-e7d614a79e7f\") " pod="openstack/nova-cell0-conductor-db-sync-7ssdv" Feb 19 21:47:36 crc kubenswrapper[4795]: I0219 21:47:36.894990 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef98a0b8-d6d9-4075-ae60-e7d614a79e7f-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-7ssdv\" (UID: \"ef98a0b8-d6d9-4075-ae60-e7d614a79e7f\") " pod="openstack/nova-cell0-conductor-db-sync-7ssdv" Feb 19 21:47:36 crc kubenswrapper[4795]: I0219 21:47:36.895083 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef98a0b8-d6d9-4075-ae60-e7d614a79e7f-config-data\") pod \"nova-cell0-conductor-db-sync-7ssdv\" (UID: \"ef98a0b8-d6d9-4075-ae60-e7d614a79e7f\") " pod="openstack/nova-cell0-conductor-db-sync-7ssdv" Feb 19 21:47:36 crc kubenswrapper[4795]: I0219 21:47:36.902815 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhcgt\" (UniqueName: \"kubernetes.io/projected/ef98a0b8-d6d9-4075-ae60-e7d614a79e7f-kube-api-access-zhcgt\") pod \"nova-cell0-conductor-db-sync-7ssdv\" (UID: \"ef98a0b8-d6d9-4075-ae60-e7d614a79e7f\") " pod="openstack/nova-cell0-conductor-db-sync-7ssdv" Feb 19 21:47:36 crc kubenswrapper[4795]: I0219 21:47:36.904558 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef98a0b8-d6d9-4075-ae60-e7d614a79e7f-scripts\") pod \"nova-cell0-conductor-db-sync-7ssdv\" (UID: \"ef98a0b8-d6d9-4075-ae60-e7d614a79e7f\") " pod="openstack/nova-cell0-conductor-db-sync-7ssdv" Feb 19 21:47:37 crc kubenswrapper[4795]: I0219 21:47:37.007962 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-7ssdv" Feb 19 21:47:37 crc kubenswrapper[4795]: I0219 21:47:37.176274 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3697a3b0-4077-4837-bcdc-c17d8aa361f1","Type":"ContainerStarted","Data":"c6abf78f9f811ce98af3de204165d6af86666923e425da520b7a47fdc3944ee7"} Feb 19 21:47:37 crc kubenswrapper[4795]: I0219 21:47:37.202720 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.202699623 podStartE2EDuration="3.202699623s" podCreationTimestamp="2026-02-19 21:47:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:47:37.200304456 +0000 UTC m=+1168.392822350" watchObservedRunningTime="2026-02-19 21:47:37.202699623 +0000 UTC m=+1168.395217497" Feb 19 21:47:37 crc kubenswrapper[4795]: I0219 21:47:37.526747 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-7ssdv"] Feb 19 21:47:37 crc kubenswrapper[4795]: W0219 21:47:37.547450 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef98a0b8_d6d9_4075_ae60_e7d614a79e7f.slice/crio-0d7e8c41b27cf2ce2231d19e47302fd30c5de4373894fc72062d69152981c2ae WatchSource:0}: Error finding container 0d7e8c41b27cf2ce2231d19e47302fd30c5de4373894fc72062d69152981c2ae: Status 404 returned error can't find the container with id 0d7e8c41b27cf2ce2231d19e47302fd30c5de4373894fc72062d69152981c2ae Feb 19 21:47:38 crc kubenswrapper[4795]: I0219 21:47:38.188328 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-7ssdv" event={"ID":"ef98a0b8-d6d9-4075-ae60-e7d614a79e7f","Type":"ContainerStarted","Data":"0d7e8c41b27cf2ce2231d19e47302fd30c5de4373894fc72062d69152981c2ae"} Feb 
19 21:47:39 crc kubenswrapper[4795]: I0219 21:47:39.653235 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:47:39 crc kubenswrapper[4795]: I0219 21:47:39.743307 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/253d2f67-fdba-4a38-9b30-8544e6e54cc4-config-data\") pod \"253d2f67-fdba-4a38-9b30-8544e6e54cc4\" (UID: \"253d2f67-fdba-4a38-9b30-8544e6e54cc4\") " Feb 19 21:47:39 crc kubenswrapper[4795]: I0219 21:47:39.743356 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/253d2f67-fdba-4a38-9b30-8544e6e54cc4-run-httpd\") pod \"253d2f67-fdba-4a38-9b30-8544e6e54cc4\" (UID: \"253d2f67-fdba-4a38-9b30-8544e6e54cc4\") " Feb 19 21:47:39 crc kubenswrapper[4795]: I0219 21:47:39.743422 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/253d2f67-fdba-4a38-9b30-8544e6e54cc4-combined-ca-bundle\") pod \"253d2f67-fdba-4a38-9b30-8544e6e54cc4\" (UID: \"253d2f67-fdba-4a38-9b30-8544e6e54cc4\") " Feb 19 21:47:39 crc kubenswrapper[4795]: I0219 21:47:39.743447 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hqzt6\" (UniqueName: \"kubernetes.io/projected/253d2f67-fdba-4a38-9b30-8544e6e54cc4-kube-api-access-hqzt6\") pod \"253d2f67-fdba-4a38-9b30-8544e6e54cc4\" (UID: \"253d2f67-fdba-4a38-9b30-8544e6e54cc4\") " Feb 19 21:47:39 crc kubenswrapper[4795]: I0219 21:47:39.743481 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/253d2f67-fdba-4a38-9b30-8544e6e54cc4-log-httpd\") pod \"253d2f67-fdba-4a38-9b30-8544e6e54cc4\" (UID: \"253d2f67-fdba-4a38-9b30-8544e6e54cc4\") " Feb 19 21:47:39 crc kubenswrapper[4795]: I0219 21:47:39.743537 4795 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/253d2f67-fdba-4a38-9b30-8544e6e54cc4-scripts\") pod \"253d2f67-fdba-4a38-9b30-8544e6e54cc4\" (UID: \"253d2f67-fdba-4a38-9b30-8544e6e54cc4\") " Feb 19 21:47:39 crc kubenswrapper[4795]: I0219 21:47:39.743559 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/253d2f67-fdba-4a38-9b30-8544e6e54cc4-sg-core-conf-yaml\") pod \"253d2f67-fdba-4a38-9b30-8544e6e54cc4\" (UID: \"253d2f67-fdba-4a38-9b30-8544e6e54cc4\") " Feb 19 21:47:39 crc kubenswrapper[4795]: I0219 21:47:39.744023 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/253d2f67-fdba-4a38-9b30-8544e6e54cc4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "253d2f67-fdba-4a38-9b30-8544e6e54cc4" (UID: "253d2f67-fdba-4a38-9b30-8544e6e54cc4"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:47:39 crc kubenswrapper[4795]: I0219 21:47:39.744222 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/253d2f67-fdba-4a38-9b30-8544e6e54cc4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "253d2f67-fdba-4a38-9b30-8544e6e54cc4" (UID: "253d2f67-fdba-4a38-9b30-8544e6e54cc4"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:47:39 crc kubenswrapper[4795]: I0219 21:47:39.750443 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/253d2f67-fdba-4a38-9b30-8544e6e54cc4-kube-api-access-hqzt6" (OuterVolumeSpecName: "kube-api-access-hqzt6") pod "253d2f67-fdba-4a38-9b30-8544e6e54cc4" (UID: "253d2f67-fdba-4a38-9b30-8544e6e54cc4"). InnerVolumeSpecName "kube-api-access-hqzt6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:47:39 crc kubenswrapper[4795]: I0219 21:47:39.751322 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/253d2f67-fdba-4a38-9b30-8544e6e54cc4-scripts" (OuterVolumeSpecName: "scripts") pod "253d2f67-fdba-4a38-9b30-8544e6e54cc4" (UID: "253d2f67-fdba-4a38-9b30-8544e6e54cc4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:47:39 crc kubenswrapper[4795]: I0219 21:47:39.776584 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/253d2f67-fdba-4a38-9b30-8544e6e54cc4-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "253d2f67-fdba-4a38-9b30-8544e6e54cc4" (UID: "253d2f67-fdba-4a38-9b30-8544e6e54cc4"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:47:39 crc kubenswrapper[4795]: I0219 21:47:39.809628 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/253d2f67-fdba-4a38-9b30-8544e6e54cc4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "253d2f67-fdba-4a38-9b30-8544e6e54cc4" (UID: "253d2f67-fdba-4a38-9b30-8544e6e54cc4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:47:39 crc kubenswrapper[4795]: I0219 21:47:39.846363 4795 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/253d2f67-fdba-4a38-9b30-8544e6e54cc4-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:39 crc kubenswrapper[4795]: I0219 21:47:39.846392 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/253d2f67-fdba-4a38-9b30-8544e6e54cc4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:39 crc kubenswrapper[4795]: I0219 21:47:39.846403 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hqzt6\" (UniqueName: \"kubernetes.io/projected/253d2f67-fdba-4a38-9b30-8544e6e54cc4-kube-api-access-hqzt6\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:39 crc kubenswrapper[4795]: I0219 21:47:39.846417 4795 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/253d2f67-fdba-4a38-9b30-8544e6e54cc4-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:39 crc kubenswrapper[4795]: I0219 21:47:39.846425 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/253d2f67-fdba-4a38-9b30-8544e6e54cc4-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:39 crc kubenswrapper[4795]: I0219 21:47:39.846433 4795 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/253d2f67-fdba-4a38-9b30-8544e6e54cc4-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:39 crc kubenswrapper[4795]: I0219 21:47:39.848878 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/253d2f67-fdba-4a38-9b30-8544e6e54cc4-config-data" (OuterVolumeSpecName: "config-data") pod "253d2f67-fdba-4a38-9b30-8544e6e54cc4" (UID: "253d2f67-fdba-4a38-9b30-8544e6e54cc4"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:47:39 crc kubenswrapper[4795]: I0219 21:47:39.950204 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/253d2f67-fdba-4a38-9b30-8544e6e54cc4-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.209408 4795 generic.go:334] "Generic (PLEG): container finished" podID="253d2f67-fdba-4a38-9b30-8544e6e54cc4" containerID="92a2e0a3dc40a655fd29047a14680db6caca8d95bdb8b99de07de6f7f007aee5" exitCode=0 Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.209470 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"253d2f67-fdba-4a38-9b30-8544e6e54cc4","Type":"ContainerDied","Data":"92a2e0a3dc40a655fd29047a14680db6caca8d95bdb8b99de07de6f7f007aee5"} Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.209494 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"253d2f67-fdba-4a38-9b30-8544e6e54cc4","Type":"ContainerDied","Data":"84b0bcbc0062b2eec5eb90cbad2c6d5b12462c44819f0b7f936e0b7ddb57186c"} Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.209510 4795 scope.go:117] "RemoveContainer" containerID="78614fb581b9ca38ffccfdb9316d7293579f666e082d3786d4b25f75d92eddaa" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.209666 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.243945 4795 scope.go:117] "RemoveContainer" containerID="bf60e2985b3070dfff9d45f1b36e7c5e63972e743b6bc5b2679f84fa20f46972" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.244098 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.268607 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.269516 4795 scope.go:117] "RemoveContainer" containerID="1d73a882957bce93c029299c95b3bfddc0c0d2a2db28a352e94f0a5c556118f9" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.279592 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:47:40 crc kubenswrapper[4795]: E0219 21:47:40.280030 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="253d2f67-fdba-4a38-9b30-8544e6e54cc4" containerName="proxy-httpd" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.280051 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="253d2f67-fdba-4a38-9b30-8544e6e54cc4" containerName="proxy-httpd" Feb 19 21:47:40 crc kubenswrapper[4795]: E0219 21:47:40.280081 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="253d2f67-fdba-4a38-9b30-8544e6e54cc4" containerName="ceilometer-notification-agent" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.280090 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="253d2f67-fdba-4a38-9b30-8544e6e54cc4" containerName="ceilometer-notification-agent" Feb 19 21:47:40 crc kubenswrapper[4795]: E0219 21:47:40.280122 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="253d2f67-fdba-4a38-9b30-8544e6e54cc4" containerName="sg-core" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.280130 4795 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="253d2f67-fdba-4a38-9b30-8544e6e54cc4" containerName="sg-core" Feb 19 21:47:40 crc kubenswrapper[4795]: E0219 21:47:40.280140 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="253d2f67-fdba-4a38-9b30-8544e6e54cc4" containerName="ceilometer-central-agent" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.280147 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="253d2f67-fdba-4a38-9b30-8544e6e54cc4" containerName="ceilometer-central-agent" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.280387 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="253d2f67-fdba-4a38-9b30-8544e6e54cc4" containerName="ceilometer-notification-agent" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.280405 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="253d2f67-fdba-4a38-9b30-8544e6e54cc4" containerName="proxy-httpd" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.280426 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="253d2f67-fdba-4a38-9b30-8544e6e54cc4" containerName="sg-core" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.280441 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="253d2f67-fdba-4a38-9b30-8544e6e54cc4" containerName="ceilometer-central-agent" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.281915 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.284036 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.284217 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.303709 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.311758 4795 scope.go:117] "RemoveContainer" containerID="92a2e0a3dc40a655fd29047a14680db6caca8d95bdb8b99de07de6f7f007aee5" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.348081 4795 scope.go:117] "RemoveContainer" containerID="78614fb581b9ca38ffccfdb9316d7293579f666e082d3786d4b25f75d92eddaa" Feb 19 21:47:40 crc kubenswrapper[4795]: E0219 21:47:40.348552 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78614fb581b9ca38ffccfdb9316d7293579f666e082d3786d4b25f75d92eddaa\": container with ID starting with 78614fb581b9ca38ffccfdb9316d7293579f666e082d3786d4b25f75d92eddaa not found: ID does not exist" containerID="78614fb581b9ca38ffccfdb9316d7293579f666e082d3786d4b25f75d92eddaa" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.348580 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78614fb581b9ca38ffccfdb9316d7293579f666e082d3786d4b25f75d92eddaa"} err="failed to get container status \"78614fb581b9ca38ffccfdb9316d7293579f666e082d3786d4b25f75d92eddaa\": rpc error: code = NotFound desc = could not find container \"78614fb581b9ca38ffccfdb9316d7293579f666e082d3786d4b25f75d92eddaa\": container with ID starting with 78614fb581b9ca38ffccfdb9316d7293579f666e082d3786d4b25f75d92eddaa not found: ID does not exist" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 
21:47:40.348605 4795 scope.go:117] "RemoveContainer" containerID="bf60e2985b3070dfff9d45f1b36e7c5e63972e743b6bc5b2679f84fa20f46972" Feb 19 21:47:40 crc kubenswrapper[4795]: E0219 21:47:40.348853 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf60e2985b3070dfff9d45f1b36e7c5e63972e743b6bc5b2679f84fa20f46972\": container with ID starting with bf60e2985b3070dfff9d45f1b36e7c5e63972e743b6bc5b2679f84fa20f46972 not found: ID does not exist" containerID="bf60e2985b3070dfff9d45f1b36e7c5e63972e743b6bc5b2679f84fa20f46972" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.348872 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf60e2985b3070dfff9d45f1b36e7c5e63972e743b6bc5b2679f84fa20f46972"} err="failed to get container status \"bf60e2985b3070dfff9d45f1b36e7c5e63972e743b6bc5b2679f84fa20f46972\": rpc error: code = NotFound desc = could not find container \"bf60e2985b3070dfff9d45f1b36e7c5e63972e743b6bc5b2679f84fa20f46972\": container with ID starting with bf60e2985b3070dfff9d45f1b36e7c5e63972e743b6bc5b2679f84fa20f46972 not found: ID does not exist" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.348887 4795 scope.go:117] "RemoveContainer" containerID="1d73a882957bce93c029299c95b3bfddc0c0d2a2db28a352e94f0a5c556118f9" Feb 19 21:47:40 crc kubenswrapper[4795]: E0219 21:47:40.349122 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d73a882957bce93c029299c95b3bfddc0c0d2a2db28a352e94f0a5c556118f9\": container with ID starting with 1d73a882957bce93c029299c95b3bfddc0c0d2a2db28a352e94f0a5c556118f9 not found: ID does not exist" containerID="1d73a882957bce93c029299c95b3bfddc0c0d2a2db28a352e94f0a5c556118f9" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.349144 4795 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1d73a882957bce93c029299c95b3bfddc0c0d2a2db28a352e94f0a5c556118f9"} err="failed to get container status \"1d73a882957bce93c029299c95b3bfddc0c0d2a2db28a352e94f0a5c556118f9\": rpc error: code = NotFound desc = could not find container \"1d73a882957bce93c029299c95b3bfddc0c0d2a2db28a352e94f0a5c556118f9\": container with ID starting with 1d73a882957bce93c029299c95b3bfddc0c0d2a2db28a352e94f0a5c556118f9 not found: ID does not exist" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.349159 4795 scope.go:117] "RemoveContainer" containerID="92a2e0a3dc40a655fd29047a14680db6caca8d95bdb8b99de07de6f7f007aee5" Feb 19 21:47:40 crc kubenswrapper[4795]: E0219 21:47:40.350193 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92a2e0a3dc40a655fd29047a14680db6caca8d95bdb8b99de07de6f7f007aee5\": container with ID starting with 92a2e0a3dc40a655fd29047a14680db6caca8d95bdb8b99de07de6f7f007aee5 not found: ID does not exist" containerID="92a2e0a3dc40a655fd29047a14680db6caca8d95bdb8b99de07de6f7f007aee5" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.350225 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92a2e0a3dc40a655fd29047a14680db6caca8d95bdb8b99de07de6f7f007aee5"} err="failed to get container status \"92a2e0a3dc40a655fd29047a14680db6caca8d95bdb8b99de07de6f7f007aee5\": rpc error: code = NotFound desc = could not find container \"92a2e0a3dc40a655fd29047a14680db6caca8d95bdb8b99de07de6f7f007aee5\": container with ID starting with 92a2e0a3dc40a655fd29047a14680db6caca8d95bdb8b99de07de6f7f007aee5 not found: ID does not exist" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.356832 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c107007e-46bb-4d36-a899-18b499685b6c-sg-core-conf-yaml\") pod \"ceilometer-0\" 
(UID: \"c107007e-46bb-4d36-a899-18b499685b6c\") " pod="openstack/ceilometer-0" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.356891 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c107007e-46bb-4d36-a899-18b499685b6c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c107007e-46bb-4d36-a899-18b499685b6c\") " pod="openstack/ceilometer-0" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.356958 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c107007e-46bb-4d36-a899-18b499685b6c-scripts\") pod \"ceilometer-0\" (UID: \"c107007e-46bb-4d36-a899-18b499685b6c\") " pod="openstack/ceilometer-0" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.356985 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c107007e-46bb-4d36-a899-18b499685b6c-config-data\") pod \"ceilometer-0\" (UID: \"c107007e-46bb-4d36-a899-18b499685b6c\") " pod="openstack/ceilometer-0" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.357006 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c107007e-46bb-4d36-a899-18b499685b6c-log-httpd\") pod \"ceilometer-0\" (UID: \"c107007e-46bb-4d36-a899-18b499685b6c\") " pod="openstack/ceilometer-0" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.357045 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbrc4\" (UniqueName: \"kubernetes.io/projected/c107007e-46bb-4d36-a899-18b499685b6c-kube-api-access-bbrc4\") pod \"ceilometer-0\" (UID: \"c107007e-46bb-4d36-a899-18b499685b6c\") " pod="openstack/ceilometer-0" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 
21:47:40.357091 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c107007e-46bb-4d36-a899-18b499685b6c-run-httpd\") pod \"ceilometer-0\" (UID: \"c107007e-46bb-4d36-a899-18b499685b6c\") " pod="openstack/ceilometer-0" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.458694 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c107007e-46bb-4d36-a899-18b499685b6c-log-httpd\") pod \"ceilometer-0\" (UID: \"c107007e-46bb-4d36-a899-18b499685b6c\") " pod="openstack/ceilometer-0" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.458757 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbrc4\" (UniqueName: \"kubernetes.io/projected/c107007e-46bb-4d36-a899-18b499685b6c-kube-api-access-bbrc4\") pod \"ceilometer-0\" (UID: \"c107007e-46bb-4d36-a899-18b499685b6c\") " pod="openstack/ceilometer-0" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.458806 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c107007e-46bb-4d36-a899-18b499685b6c-run-httpd\") pod \"ceilometer-0\" (UID: \"c107007e-46bb-4d36-a899-18b499685b6c\") " pod="openstack/ceilometer-0" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.458864 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c107007e-46bb-4d36-a899-18b499685b6c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c107007e-46bb-4d36-a899-18b499685b6c\") " pod="openstack/ceilometer-0" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.458897 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c107007e-46bb-4d36-a899-18b499685b6c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c107007e-46bb-4d36-a899-18b499685b6c\") " pod="openstack/ceilometer-0" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.458943 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c107007e-46bb-4d36-a899-18b499685b6c-scripts\") pod \"ceilometer-0\" (UID: \"c107007e-46bb-4d36-a899-18b499685b6c\") " pod="openstack/ceilometer-0" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.458964 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c107007e-46bb-4d36-a899-18b499685b6c-config-data\") pod \"ceilometer-0\" (UID: \"c107007e-46bb-4d36-a899-18b499685b6c\") " pod="openstack/ceilometer-0" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.459269 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c107007e-46bb-4d36-a899-18b499685b6c-log-httpd\") pod \"ceilometer-0\" (UID: \"c107007e-46bb-4d36-a899-18b499685b6c\") " pod="openstack/ceilometer-0" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.462934 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c107007e-46bb-4d36-a899-18b499685b6c-scripts\") pod \"ceilometer-0\" (UID: \"c107007e-46bb-4d36-a899-18b499685b6c\") " pod="openstack/ceilometer-0" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.463074 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c107007e-46bb-4d36-a899-18b499685b6c-run-httpd\") pod \"ceilometer-0\" (UID: \"c107007e-46bb-4d36-a899-18b499685b6c\") " pod="openstack/ceilometer-0" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.463701 4795 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c107007e-46bb-4d36-a899-18b499685b6c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c107007e-46bb-4d36-a899-18b499685b6c\") " pod="openstack/ceilometer-0" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.464814 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c107007e-46bb-4d36-a899-18b499685b6c-config-data\") pod \"ceilometer-0\" (UID: \"c107007e-46bb-4d36-a899-18b499685b6c\") " pod="openstack/ceilometer-0" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.468592 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c107007e-46bb-4d36-a899-18b499685b6c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c107007e-46bb-4d36-a899-18b499685b6c\") " pod="openstack/ceilometer-0" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.478878 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbrc4\" (UniqueName: \"kubernetes.io/projected/c107007e-46bb-4d36-a899-18b499685b6c-kube-api-access-bbrc4\") pod \"ceilometer-0\" (UID: \"c107007e-46bb-4d36-a899-18b499685b6c\") " pod="openstack/ceilometer-0" Feb 19 21:47:40 crc kubenswrapper[4795]: I0219 21:47:40.606063 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:47:41 crc kubenswrapper[4795]: I0219 21:47:41.090924 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:47:41 crc kubenswrapper[4795]: I0219 21:47:41.525885 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="253d2f67-fdba-4a38-9b30-8544e6e54cc4" path="/var/lib/kubelet/pods/253d2f67-fdba-4a38-9b30-8544e6e54cc4/volumes" Feb 19 21:47:42 crc kubenswrapper[4795]: I0219 21:47:42.458970 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 19 21:47:42 crc kubenswrapper[4795]: I0219 21:47:42.459009 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 19 21:47:42 crc kubenswrapper[4795]: I0219 21:47:42.507814 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 19 21:47:42 crc kubenswrapper[4795]: I0219 21:47:42.508135 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 19 21:47:43 crc kubenswrapper[4795]: I0219 21:47:43.239098 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 19 21:47:43 crc kubenswrapper[4795]: I0219 21:47:43.239158 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 19 21:47:44 crc kubenswrapper[4795]: I0219 21:47:44.836883 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 19 21:47:44 crc kubenswrapper[4795]: I0219 21:47:44.837266 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 19 21:47:44 crc kubenswrapper[4795]: I0219 21:47:44.871744 4795 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 19 21:47:44 crc kubenswrapper[4795]: I0219 21:47:44.884593 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 19 21:47:45 crc kubenswrapper[4795]: I0219 21:47:45.101514 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 19 21:47:45 crc kubenswrapper[4795]: I0219 21:47:45.105281 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 19 21:47:45 crc kubenswrapper[4795]: I0219 21:47:45.260432 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 19 21:47:45 crc kubenswrapper[4795]: I0219 21:47:45.260748 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 19 21:47:45 crc kubenswrapper[4795]: W0219 21:47:45.560285 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc107007e_46bb_4d36_a899_18b499685b6c.slice/crio-8070c1271994839952f323f138b6727b9ea760bae63f0a2d0ecba838758ac966 WatchSource:0}: Error finding container 8070c1271994839952f323f138b6727b9ea760bae63f0a2d0ecba838758ac966: Status 404 returned error can't find the container with id 8070c1271994839952f323f138b6727b9ea760bae63f0a2d0ecba838758ac966 Feb 19 21:47:46 crc kubenswrapper[4795]: I0219 21:47:46.300925 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c107007e-46bb-4d36-a899-18b499685b6c","Type":"ContainerStarted","Data":"8070c1271994839952f323f138b6727b9ea760bae63f0a2d0ecba838758ac966"} Feb 19 21:47:47 crc kubenswrapper[4795]: I0219 21:47:47.308050 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-conductor-db-sync-7ssdv" event={"ID":"ef98a0b8-d6d9-4075-ae60-e7d614a79e7f","Type":"ContainerStarted","Data":"820a0240f9371635d4e5f03ad2ffcfa48ce070182eb86809479efd07b7626507"} Feb 19 21:47:47 crc kubenswrapper[4795]: I0219 21:47:47.310899 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c107007e-46bb-4d36-a899-18b499685b6c","Type":"ContainerStarted","Data":"96afc7d12e4b4065345a42092f89f7771e25e07a46baa0fea369d89d2158ed38"} Feb 19 21:47:47 crc kubenswrapper[4795]: I0219 21:47:47.349699 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-7ssdv" podStartSLOduration=2.856475896 podStartE2EDuration="11.349681604s" podCreationTimestamp="2026-02-19 21:47:36 +0000 UTC" firstStartedPulling="2026-02-19 21:47:37.549568268 +0000 UTC m=+1168.742086142" lastFinishedPulling="2026-02-19 21:47:46.042773986 +0000 UTC m=+1177.235291850" observedRunningTime="2026-02-19 21:47:47.341675699 +0000 UTC m=+1178.534193583" watchObservedRunningTime="2026-02-19 21:47:47.349681604 +0000 UTC m=+1178.542199478" Feb 19 21:47:47 crc kubenswrapper[4795]: I0219 21:47:47.416657 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 19 21:47:47 crc kubenswrapper[4795]: I0219 21:47:47.416877 4795 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 19 21:47:47 crc kubenswrapper[4795]: I0219 21:47:47.471499 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 19 21:47:48 crc kubenswrapper[4795]: I0219 21:47:48.320941 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c107007e-46bb-4d36-a899-18b499685b6c","Type":"ContainerStarted","Data":"23c4cdb855c62f25668dc60ba49d8d76113597549e62c5c0021a4b816d8cbb51"} Feb 19 21:47:48 crc kubenswrapper[4795]: I0219 
21:47:48.321422 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c107007e-46bb-4d36-a899-18b499685b6c","Type":"ContainerStarted","Data":"7bfc7fbdbeaba8f96d2d0eeb4f40c479fdb828069040dd1babf326b03913a7ff"} Feb 19 21:47:50 crc kubenswrapper[4795]: I0219 21:47:50.339452 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c107007e-46bb-4d36-a899-18b499685b6c","Type":"ContainerStarted","Data":"6ea888400454dbf863a80de3a0bfbd0922a19e27a8915bf3832e7882097078a1"} Feb 19 21:47:50 crc kubenswrapper[4795]: I0219 21:47:50.340271 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 19 21:47:50 crc kubenswrapper[4795]: I0219 21:47:50.376638 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=6.989882565 podStartE2EDuration="10.376618566s" podCreationTimestamp="2026-02-19 21:47:40 +0000 UTC" firstStartedPulling="2026-02-19 21:47:45.944479394 +0000 UTC m=+1177.136997258" lastFinishedPulling="2026-02-19 21:47:49.331215395 +0000 UTC m=+1180.523733259" observedRunningTime="2026-02-19 21:47:50.365386661 +0000 UTC m=+1181.557904525" watchObservedRunningTime="2026-02-19 21:47:50.376618566 +0000 UTC m=+1181.569136430" Feb 19 21:47:52 crc kubenswrapper[4795]: I0219 21:47:52.420639 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:47:52 crc kubenswrapper[4795]: I0219 21:47:52.421483 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c107007e-46bb-4d36-a899-18b499685b6c" containerName="proxy-httpd" containerID="cri-o://6ea888400454dbf863a80de3a0bfbd0922a19e27a8915bf3832e7882097078a1" gracePeriod=30 Feb 19 21:47:52 crc kubenswrapper[4795]: I0219 21:47:52.421621 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="c107007e-46bb-4d36-a899-18b499685b6c" containerName="ceilometer-central-agent" containerID="cri-o://96afc7d12e4b4065345a42092f89f7771e25e07a46baa0fea369d89d2158ed38" gracePeriod=30 Feb 19 21:47:52 crc kubenswrapper[4795]: I0219 21:47:52.421599 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c107007e-46bb-4d36-a899-18b499685b6c" containerName="ceilometer-notification-agent" containerID="cri-o://7bfc7fbdbeaba8f96d2d0eeb4f40c479fdb828069040dd1babf326b03913a7ff" gracePeriod=30 Feb 19 21:47:52 crc kubenswrapper[4795]: I0219 21:47:52.421790 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c107007e-46bb-4d36-a899-18b499685b6c" containerName="sg-core" containerID="cri-o://23c4cdb855c62f25668dc60ba49d8d76113597549e62c5c0021a4b816d8cbb51" gracePeriod=30 Feb 19 21:47:53 crc kubenswrapper[4795]: I0219 21:47:53.362056 4795 generic.go:334] "Generic (PLEG): container finished" podID="c107007e-46bb-4d36-a899-18b499685b6c" containerID="6ea888400454dbf863a80de3a0bfbd0922a19e27a8915bf3832e7882097078a1" exitCode=0 Feb 19 21:47:53 crc kubenswrapper[4795]: I0219 21:47:53.362396 4795 generic.go:334] "Generic (PLEG): container finished" podID="c107007e-46bb-4d36-a899-18b499685b6c" containerID="23c4cdb855c62f25668dc60ba49d8d76113597549e62c5c0021a4b816d8cbb51" exitCode=2 Feb 19 21:47:53 crc kubenswrapper[4795]: I0219 21:47:53.362405 4795 generic.go:334] "Generic (PLEG): container finished" podID="c107007e-46bb-4d36-a899-18b499685b6c" containerID="7bfc7fbdbeaba8f96d2d0eeb4f40c479fdb828069040dd1babf326b03913a7ff" exitCode=0 Feb 19 21:47:53 crc kubenswrapper[4795]: I0219 21:47:53.362425 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c107007e-46bb-4d36-a899-18b499685b6c","Type":"ContainerDied","Data":"6ea888400454dbf863a80de3a0bfbd0922a19e27a8915bf3832e7882097078a1"} Feb 19 21:47:53 crc kubenswrapper[4795]: I0219 
21:47:53.362452 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c107007e-46bb-4d36-a899-18b499685b6c","Type":"ContainerDied","Data":"23c4cdb855c62f25668dc60ba49d8d76113597549e62c5c0021a4b816d8cbb51"} Feb 19 21:47:53 crc kubenswrapper[4795]: I0219 21:47:53.362462 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c107007e-46bb-4d36-a899-18b499685b6c","Type":"ContainerDied","Data":"7bfc7fbdbeaba8f96d2d0eeb4f40c479fdb828069040dd1babf326b03913a7ff"} Feb 19 21:47:54 crc kubenswrapper[4795]: I0219 21:47:54.373433 4795 generic.go:334] "Generic (PLEG): container finished" podID="c107007e-46bb-4d36-a899-18b499685b6c" containerID="96afc7d12e4b4065345a42092f89f7771e25e07a46baa0fea369d89d2158ed38" exitCode=0 Feb 19 21:47:54 crc kubenswrapper[4795]: I0219 21:47:54.373526 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c107007e-46bb-4d36-a899-18b499685b6c","Type":"ContainerDied","Data":"96afc7d12e4b4065345a42092f89f7771e25e07a46baa0fea369d89d2158ed38"} Feb 19 21:47:54 crc kubenswrapper[4795]: I0219 21:47:54.484835 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:47:54 crc kubenswrapper[4795]: I0219 21:47:54.624274 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c107007e-46bb-4d36-a899-18b499685b6c-combined-ca-bundle\") pod \"c107007e-46bb-4d36-a899-18b499685b6c\" (UID: \"c107007e-46bb-4d36-a899-18b499685b6c\") " Feb 19 21:47:54 crc kubenswrapper[4795]: I0219 21:47:54.624472 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c107007e-46bb-4d36-a899-18b499685b6c-sg-core-conf-yaml\") pod \"c107007e-46bb-4d36-a899-18b499685b6c\" (UID: \"c107007e-46bb-4d36-a899-18b499685b6c\") " Feb 19 21:47:54 crc kubenswrapper[4795]: I0219 21:47:54.624511 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbrc4\" (UniqueName: \"kubernetes.io/projected/c107007e-46bb-4d36-a899-18b499685b6c-kube-api-access-bbrc4\") pod \"c107007e-46bb-4d36-a899-18b499685b6c\" (UID: \"c107007e-46bb-4d36-a899-18b499685b6c\") " Feb 19 21:47:54 crc kubenswrapper[4795]: I0219 21:47:54.624602 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c107007e-46bb-4d36-a899-18b499685b6c-log-httpd\") pod \"c107007e-46bb-4d36-a899-18b499685b6c\" (UID: \"c107007e-46bb-4d36-a899-18b499685b6c\") " Feb 19 21:47:54 crc kubenswrapper[4795]: I0219 21:47:54.624631 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c107007e-46bb-4d36-a899-18b499685b6c-scripts\") pod \"c107007e-46bb-4d36-a899-18b499685b6c\" (UID: \"c107007e-46bb-4d36-a899-18b499685b6c\") " Feb 19 21:47:54 crc kubenswrapper[4795]: I0219 21:47:54.624653 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c107007e-46bb-4d36-a899-18b499685b6c-config-data\") pod \"c107007e-46bb-4d36-a899-18b499685b6c\" (UID: \"c107007e-46bb-4d36-a899-18b499685b6c\") " Feb 19 21:47:54 crc kubenswrapper[4795]: I0219 21:47:54.624709 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c107007e-46bb-4d36-a899-18b499685b6c-run-httpd\") pod \"c107007e-46bb-4d36-a899-18b499685b6c\" (UID: \"c107007e-46bb-4d36-a899-18b499685b6c\") " Feb 19 21:47:54 crc kubenswrapper[4795]: I0219 21:47:54.625288 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c107007e-46bb-4d36-a899-18b499685b6c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c107007e-46bb-4d36-a899-18b499685b6c" (UID: "c107007e-46bb-4d36-a899-18b499685b6c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:47:54 crc kubenswrapper[4795]: I0219 21:47:54.625515 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c107007e-46bb-4d36-a899-18b499685b6c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c107007e-46bb-4d36-a899-18b499685b6c" (UID: "c107007e-46bb-4d36-a899-18b499685b6c"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:47:54 crc kubenswrapper[4795]: I0219 21:47:54.626045 4795 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c107007e-46bb-4d36-a899-18b499685b6c-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:54 crc kubenswrapper[4795]: I0219 21:47:54.626068 4795 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c107007e-46bb-4d36-a899-18b499685b6c-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:54 crc kubenswrapper[4795]: I0219 21:47:54.630335 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c107007e-46bb-4d36-a899-18b499685b6c-scripts" (OuterVolumeSpecName: "scripts") pod "c107007e-46bb-4d36-a899-18b499685b6c" (UID: "c107007e-46bb-4d36-a899-18b499685b6c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:47:54 crc kubenswrapper[4795]: I0219 21:47:54.635455 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c107007e-46bb-4d36-a899-18b499685b6c-kube-api-access-bbrc4" (OuterVolumeSpecName: "kube-api-access-bbrc4") pod "c107007e-46bb-4d36-a899-18b499685b6c" (UID: "c107007e-46bb-4d36-a899-18b499685b6c"). InnerVolumeSpecName "kube-api-access-bbrc4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:47:54 crc kubenswrapper[4795]: I0219 21:47:54.658534 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c107007e-46bb-4d36-a899-18b499685b6c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c107007e-46bb-4d36-a899-18b499685b6c" (UID: "c107007e-46bb-4d36-a899-18b499685b6c"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:47:54 crc kubenswrapper[4795]: I0219 21:47:54.704214 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c107007e-46bb-4d36-a899-18b499685b6c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c107007e-46bb-4d36-a899-18b499685b6c" (UID: "c107007e-46bb-4d36-a899-18b499685b6c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:47:54 crc kubenswrapper[4795]: I0219 21:47:54.719892 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c107007e-46bb-4d36-a899-18b499685b6c-config-data" (OuterVolumeSpecName: "config-data") pod "c107007e-46bb-4d36-a899-18b499685b6c" (UID: "c107007e-46bb-4d36-a899-18b499685b6c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:47:54 crc kubenswrapper[4795]: I0219 21:47:54.727975 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c107007e-46bb-4d36-a899-18b499685b6c-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:54 crc kubenswrapper[4795]: I0219 21:47:54.727996 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c107007e-46bb-4d36-a899-18b499685b6c-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:54 crc kubenswrapper[4795]: I0219 21:47:54.728006 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c107007e-46bb-4d36-a899-18b499685b6c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:54 crc kubenswrapper[4795]: I0219 21:47:54.728017 4795 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c107007e-46bb-4d36-a899-18b499685b6c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:54 
crc kubenswrapper[4795]: I0219 21:47:54.728025 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bbrc4\" (UniqueName: \"kubernetes.io/projected/c107007e-46bb-4d36-a899-18b499685b6c-kube-api-access-bbrc4\") on node \"crc\" DevicePath \"\"" Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.385780 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c107007e-46bb-4d36-a899-18b499685b6c","Type":"ContainerDied","Data":"8070c1271994839952f323f138b6727b9ea760bae63f0a2d0ecba838758ac966"} Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.385835 4795 scope.go:117] "RemoveContainer" containerID="6ea888400454dbf863a80de3a0bfbd0922a19e27a8915bf3832e7882097078a1" Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.385871 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.408855 4795 scope.go:117] "RemoveContainer" containerID="23c4cdb855c62f25668dc60ba49d8d76113597549e62c5c0021a4b816d8cbb51" Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.434998 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.442863 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.457514 4795 scope.go:117] "RemoveContainer" containerID="7bfc7fbdbeaba8f96d2d0eeb4f40c479fdb828069040dd1babf326b03913a7ff" Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.472515 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:47:55 crc kubenswrapper[4795]: E0219 21:47:55.477287 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c107007e-46bb-4d36-a899-18b499685b6c" containerName="sg-core" Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.477337 4795 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="c107007e-46bb-4d36-a899-18b499685b6c" containerName="sg-core" Feb 19 21:47:55 crc kubenswrapper[4795]: E0219 21:47:55.477355 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c107007e-46bb-4d36-a899-18b499685b6c" containerName="proxy-httpd" Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.477363 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="c107007e-46bb-4d36-a899-18b499685b6c" containerName="proxy-httpd" Feb 19 21:47:55 crc kubenswrapper[4795]: E0219 21:47:55.477391 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c107007e-46bb-4d36-a899-18b499685b6c" containerName="ceilometer-notification-agent" Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.477399 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="c107007e-46bb-4d36-a899-18b499685b6c" containerName="ceilometer-notification-agent" Feb 19 21:47:55 crc kubenswrapper[4795]: E0219 21:47:55.477439 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c107007e-46bb-4d36-a899-18b499685b6c" containerName="ceilometer-central-agent" Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.477447 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="c107007e-46bb-4d36-a899-18b499685b6c" containerName="ceilometer-central-agent" Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.477767 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="c107007e-46bb-4d36-a899-18b499685b6c" containerName="ceilometer-notification-agent" Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.477795 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="c107007e-46bb-4d36-a899-18b499685b6c" containerName="proxy-httpd" Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.477820 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="c107007e-46bb-4d36-a899-18b499685b6c" containerName="ceilometer-central-agent" Feb 19 21:47:55 crc kubenswrapper[4795]: 
I0219 21:47:55.477834 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="c107007e-46bb-4d36-a899-18b499685b6c" containerName="sg-core" Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.479922 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.482569 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.482774 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.482964 4795 scope.go:117] "RemoveContainer" containerID="96afc7d12e4b4065345a42092f89f7771e25e07a46baa0fea369d89d2158ed38" Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.484510 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.537399 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c107007e-46bb-4d36-a899-18b499685b6c" path="/var/lib/kubelet/pods/c107007e-46bb-4d36-a899-18b499685b6c/volumes" Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.642087 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzxr6\" (UniqueName: \"kubernetes.io/projected/60557b75-bafe-4e92-937c-e541b84aaf70-kube-api-access-gzxr6\") pod \"ceilometer-0\" (UID: \"60557b75-bafe-4e92-937c-e541b84aaf70\") " pod="openstack/ceilometer-0" Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.642141 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60557b75-bafe-4e92-937c-e541b84aaf70-scripts\") pod \"ceilometer-0\" (UID: \"60557b75-bafe-4e92-937c-e541b84aaf70\") " pod="openstack/ceilometer-0" Feb 19 21:47:55 crc 
kubenswrapper[4795]: I0219 21:47:55.642294 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/60557b75-bafe-4e92-937c-e541b84aaf70-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"60557b75-bafe-4e92-937c-e541b84aaf70\") " pod="openstack/ceilometer-0" Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.642362 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60557b75-bafe-4e92-937c-e541b84aaf70-config-data\") pod \"ceilometer-0\" (UID: \"60557b75-bafe-4e92-937c-e541b84aaf70\") " pod="openstack/ceilometer-0" Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.642406 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60557b75-bafe-4e92-937c-e541b84aaf70-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"60557b75-bafe-4e92-937c-e541b84aaf70\") " pod="openstack/ceilometer-0" Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.642437 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60557b75-bafe-4e92-937c-e541b84aaf70-run-httpd\") pod \"ceilometer-0\" (UID: \"60557b75-bafe-4e92-937c-e541b84aaf70\") " pod="openstack/ceilometer-0" Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.642468 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60557b75-bafe-4e92-937c-e541b84aaf70-log-httpd\") pod \"ceilometer-0\" (UID: \"60557b75-bafe-4e92-937c-e541b84aaf70\") " pod="openstack/ceilometer-0" Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.743991 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/60557b75-bafe-4e92-937c-e541b84aaf70-config-data\") pod \"ceilometer-0\" (UID: \"60557b75-bafe-4e92-937c-e541b84aaf70\") " pod="openstack/ceilometer-0" Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.744467 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60557b75-bafe-4e92-937c-e541b84aaf70-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"60557b75-bafe-4e92-937c-e541b84aaf70\") " pod="openstack/ceilometer-0" Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.744505 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60557b75-bafe-4e92-937c-e541b84aaf70-run-httpd\") pod \"ceilometer-0\" (UID: \"60557b75-bafe-4e92-937c-e541b84aaf70\") " pod="openstack/ceilometer-0" Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.744530 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60557b75-bafe-4e92-937c-e541b84aaf70-log-httpd\") pod \"ceilometer-0\" (UID: \"60557b75-bafe-4e92-937c-e541b84aaf70\") " pod="openstack/ceilometer-0" Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.744616 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzxr6\" (UniqueName: \"kubernetes.io/projected/60557b75-bafe-4e92-937c-e541b84aaf70-kube-api-access-gzxr6\") pod \"ceilometer-0\" (UID: \"60557b75-bafe-4e92-937c-e541b84aaf70\") " pod="openstack/ceilometer-0" Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.744640 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60557b75-bafe-4e92-937c-e541b84aaf70-scripts\") pod \"ceilometer-0\" (UID: \"60557b75-bafe-4e92-937c-e541b84aaf70\") " pod="openstack/ceilometer-0" Feb 19 21:47:55 crc kubenswrapper[4795]: 
I0219 21:47:55.744728 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/60557b75-bafe-4e92-937c-e541b84aaf70-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"60557b75-bafe-4e92-937c-e541b84aaf70\") " pod="openstack/ceilometer-0" Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.745503 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60557b75-bafe-4e92-937c-e541b84aaf70-log-httpd\") pod \"ceilometer-0\" (UID: \"60557b75-bafe-4e92-937c-e541b84aaf70\") " pod="openstack/ceilometer-0" Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.745515 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60557b75-bafe-4e92-937c-e541b84aaf70-run-httpd\") pod \"ceilometer-0\" (UID: \"60557b75-bafe-4e92-937c-e541b84aaf70\") " pod="openstack/ceilometer-0" Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.748591 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60557b75-bafe-4e92-937c-e541b84aaf70-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"60557b75-bafe-4e92-937c-e541b84aaf70\") " pod="openstack/ceilometer-0" Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.749801 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60557b75-bafe-4e92-937c-e541b84aaf70-config-data\") pod \"ceilometer-0\" (UID: \"60557b75-bafe-4e92-937c-e541b84aaf70\") " pod="openstack/ceilometer-0" Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.751138 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/60557b75-bafe-4e92-937c-e541b84aaf70-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"60557b75-bafe-4e92-937c-e541b84aaf70\") " pod="openstack/ceilometer-0" Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.761558 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60557b75-bafe-4e92-937c-e541b84aaf70-scripts\") pod \"ceilometer-0\" (UID: \"60557b75-bafe-4e92-937c-e541b84aaf70\") " pod="openstack/ceilometer-0" Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.768018 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzxr6\" (UniqueName: \"kubernetes.io/projected/60557b75-bafe-4e92-937c-e541b84aaf70-kube-api-access-gzxr6\") pod \"ceilometer-0\" (UID: \"60557b75-bafe-4e92-937c-e541b84aaf70\") " pod="openstack/ceilometer-0" Feb 19 21:47:55 crc kubenswrapper[4795]: I0219 21:47:55.805484 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:47:56 crc kubenswrapper[4795]: I0219 21:47:56.301965 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:47:56 crc kubenswrapper[4795]: I0219 21:47:56.396997 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"60557b75-bafe-4e92-937c-e541b84aaf70","Type":"ContainerStarted","Data":"2512790f4175863e7da7d55dc8c6ebb57bbf253fa486687fa567e90d3c41dca6"} Feb 19 21:47:58 crc kubenswrapper[4795]: I0219 21:47:58.438318 4795 generic.go:334] "Generic (PLEG): container finished" podID="ef98a0b8-d6d9-4075-ae60-e7d614a79e7f" containerID="820a0240f9371635d4e5f03ad2ffcfa48ce070182eb86809479efd07b7626507" exitCode=0 Feb 19 21:47:58 crc kubenswrapper[4795]: I0219 21:47:58.438400 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-7ssdv" event={"ID":"ef98a0b8-d6d9-4075-ae60-e7d614a79e7f","Type":"ContainerDied","Data":"820a0240f9371635d4e5f03ad2ffcfa48ce070182eb86809479efd07b7626507"} Feb 19 21:47:58 crc kubenswrapper[4795]: 
I0219 21:47:58.447133 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"60557b75-bafe-4e92-937c-e541b84aaf70","Type":"ContainerStarted","Data":"1631e868548b8541b4586b49303d99f204a0c9cbd4601bd40535a084257996e8"} Feb 19 21:47:58 crc kubenswrapper[4795]: I0219 21:47:58.447233 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"60557b75-bafe-4e92-937c-e541b84aaf70","Type":"ContainerStarted","Data":"8985ce1aa24fbc5f5bb8026bb427b370dba124cba19d3928ff7c3a92ca42615b"} Feb 19 21:47:59 crc kubenswrapper[4795]: I0219 21:47:59.457433 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"60557b75-bafe-4e92-937c-e541b84aaf70","Type":"ContainerStarted","Data":"bd25350fb77f428e282ee6c0811cc7be2f53b67333596fcfe27b4e218e3549da"} Feb 19 21:47:59 crc kubenswrapper[4795]: I0219 21:47:59.858808 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-7ssdv" Feb 19 21:47:59 crc kubenswrapper[4795]: I0219 21:47:59.924872 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef98a0b8-d6d9-4075-ae60-e7d614a79e7f-scripts\") pod \"ef98a0b8-d6d9-4075-ae60-e7d614a79e7f\" (UID: \"ef98a0b8-d6d9-4075-ae60-e7d614a79e7f\") " Feb 19 21:47:59 crc kubenswrapper[4795]: I0219 21:47:59.925037 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhcgt\" (UniqueName: \"kubernetes.io/projected/ef98a0b8-d6d9-4075-ae60-e7d614a79e7f-kube-api-access-zhcgt\") pod \"ef98a0b8-d6d9-4075-ae60-e7d614a79e7f\" (UID: \"ef98a0b8-d6d9-4075-ae60-e7d614a79e7f\") " Feb 19 21:47:59 crc kubenswrapper[4795]: I0219 21:47:59.925078 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ef98a0b8-d6d9-4075-ae60-e7d614a79e7f-config-data\") pod \"ef98a0b8-d6d9-4075-ae60-e7d614a79e7f\" (UID: \"ef98a0b8-d6d9-4075-ae60-e7d614a79e7f\") " Feb 19 21:47:59 crc kubenswrapper[4795]: I0219 21:47:59.925248 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef98a0b8-d6d9-4075-ae60-e7d614a79e7f-combined-ca-bundle\") pod \"ef98a0b8-d6d9-4075-ae60-e7d614a79e7f\" (UID: \"ef98a0b8-d6d9-4075-ae60-e7d614a79e7f\") " Feb 19 21:47:59 crc kubenswrapper[4795]: I0219 21:47:59.930698 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef98a0b8-d6d9-4075-ae60-e7d614a79e7f-scripts" (OuterVolumeSpecName: "scripts") pod "ef98a0b8-d6d9-4075-ae60-e7d614a79e7f" (UID: "ef98a0b8-d6d9-4075-ae60-e7d614a79e7f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:47:59 crc kubenswrapper[4795]: I0219 21:47:59.934203 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef98a0b8-d6d9-4075-ae60-e7d614a79e7f-kube-api-access-zhcgt" (OuterVolumeSpecName: "kube-api-access-zhcgt") pod "ef98a0b8-d6d9-4075-ae60-e7d614a79e7f" (UID: "ef98a0b8-d6d9-4075-ae60-e7d614a79e7f"). InnerVolumeSpecName "kube-api-access-zhcgt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:47:59 crc kubenswrapper[4795]: I0219 21:47:59.952309 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef98a0b8-d6d9-4075-ae60-e7d614a79e7f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ef98a0b8-d6d9-4075-ae60-e7d614a79e7f" (UID: "ef98a0b8-d6d9-4075-ae60-e7d614a79e7f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:47:59 crc kubenswrapper[4795]: I0219 21:47:59.957292 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef98a0b8-d6d9-4075-ae60-e7d614a79e7f-config-data" (OuterVolumeSpecName: "config-data") pod "ef98a0b8-d6d9-4075-ae60-e7d614a79e7f" (UID: "ef98a0b8-d6d9-4075-ae60-e7d614a79e7f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:48:00 crc kubenswrapper[4795]: I0219 21:48:00.027048 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef98a0b8-d6d9-4075-ae60-e7d614a79e7f-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:00 crc kubenswrapper[4795]: I0219 21:48:00.027071 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhcgt\" (UniqueName: \"kubernetes.io/projected/ef98a0b8-d6d9-4075-ae60-e7d614a79e7f-kube-api-access-zhcgt\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:00 crc kubenswrapper[4795]: I0219 21:48:00.027082 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef98a0b8-d6d9-4075-ae60-e7d614a79e7f-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:00 crc kubenswrapper[4795]: I0219 21:48:00.027091 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef98a0b8-d6d9-4075-ae60-e7d614a79e7f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:00 crc kubenswrapper[4795]: I0219 21:48:00.473363 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-7ssdv" event={"ID":"ef98a0b8-d6d9-4075-ae60-e7d614a79e7f","Type":"ContainerDied","Data":"0d7e8c41b27cf2ce2231d19e47302fd30c5de4373894fc72062d69152981c2ae"} Feb 19 21:48:00 crc kubenswrapper[4795]: I0219 21:48:00.473662 4795 pod_container_deletor.go:80] "Container not found in pod's 
containers" containerID="0d7e8c41b27cf2ce2231d19e47302fd30c5de4373894fc72062d69152981c2ae" Feb 19 21:48:00 crc kubenswrapper[4795]: I0219 21:48:00.473405 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-7ssdv" Feb 19 21:48:00 crc kubenswrapper[4795]: I0219 21:48:00.481226 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"60557b75-bafe-4e92-937c-e541b84aaf70","Type":"ContainerStarted","Data":"f916e17382481e444593150e0a833b6a6814157585865c277cdc15cc441ddb4b"} Feb 19 21:48:00 crc kubenswrapper[4795]: I0219 21:48:00.481456 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 19 21:48:00 crc kubenswrapper[4795]: I0219 21:48:00.519511 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.832063113 podStartE2EDuration="5.519493402s" podCreationTimestamp="2026-02-19 21:47:55 +0000 UTC" firstStartedPulling="2026-02-19 21:47:56.311987671 +0000 UTC m=+1187.504505535" lastFinishedPulling="2026-02-19 21:47:59.99941796 +0000 UTC m=+1191.191935824" observedRunningTime="2026-02-19 21:48:00.513105972 +0000 UTC m=+1191.705623836" watchObservedRunningTime="2026-02-19 21:48:00.519493402 +0000 UTC m=+1191.712011286" Feb 19 21:48:00 crc kubenswrapper[4795]: I0219 21:48:00.563449 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 19 21:48:00 crc kubenswrapper[4795]: E0219 21:48:00.563818 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef98a0b8-d6d9-4075-ae60-e7d614a79e7f" containerName="nova-cell0-conductor-db-sync" Feb 19 21:48:00 crc kubenswrapper[4795]: I0219 21:48:00.563836 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef98a0b8-d6d9-4075-ae60-e7d614a79e7f" containerName="nova-cell0-conductor-db-sync" Feb 19 21:48:00 crc kubenswrapper[4795]: I0219 21:48:00.563988 
4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef98a0b8-d6d9-4075-ae60-e7d614a79e7f" containerName="nova-cell0-conductor-db-sync" Feb 19 21:48:00 crc kubenswrapper[4795]: I0219 21:48:00.564537 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 19 21:48:00 crc kubenswrapper[4795]: I0219 21:48:00.566758 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 19 21:48:00 crc kubenswrapper[4795]: I0219 21:48:00.566820 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-b4vdh" Feb 19 21:48:00 crc kubenswrapper[4795]: I0219 21:48:00.581954 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 19 21:48:00 crc kubenswrapper[4795]: I0219 21:48:00.637418 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57b83043-2f7c-4b55-a2b9-66eef96f0008-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"57b83043-2f7c-4b55-a2b9-66eef96f0008\") " pod="openstack/nova-cell0-conductor-0" Feb 19 21:48:00 crc kubenswrapper[4795]: I0219 21:48:00.637487 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v44bq\" (UniqueName: \"kubernetes.io/projected/57b83043-2f7c-4b55-a2b9-66eef96f0008-kube-api-access-v44bq\") pod \"nova-cell0-conductor-0\" (UID: \"57b83043-2f7c-4b55-a2b9-66eef96f0008\") " pod="openstack/nova-cell0-conductor-0" Feb 19 21:48:00 crc kubenswrapper[4795]: I0219 21:48:00.637532 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57b83043-2f7c-4b55-a2b9-66eef96f0008-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"57b83043-2f7c-4b55-a2b9-66eef96f0008\") " 
pod="openstack/nova-cell0-conductor-0" Feb 19 21:48:00 crc kubenswrapper[4795]: I0219 21:48:00.739038 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57b83043-2f7c-4b55-a2b9-66eef96f0008-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"57b83043-2f7c-4b55-a2b9-66eef96f0008\") " pod="openstack/nova-cell0-conductor-0" Feb 19 21:48:00 crc kubenswrapper[4795]: I0219 21:48:00.739120 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v44bq\" (UniqueName: \"kubernetes.io/projected/57b83043-2f7c-4b55-a2b9-66eef96f0008-kube-api-access-v44bq\") pod \"nova-cell0-conductor-0\" (UID: \"57b83043-2f7c-4b55-a2b9-66eef96f0008\") " pod="openstack/nova-cell0-conductor-0" Feb 19 21:48:00 crc kubenswrapper[4795]: I0219 21:48:00.739189 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57b83043-2f7c-4b55-a2b9-66eef96f0008-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"57b83043-2f7c-4b55-a2b9-66eef96f0008\") " pod="openstack/nova-cell0-conductor-0" Feb 19 21:48:00 crc kubenswrapper[4795]: I0219 21:48:00.743232 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57b83043-2f7c-4b55-a2b9-66eef96f0008-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"57b83043-2f7c-4b55-a2b9-66eef96f0008\") " pod="openstack/nova-cell0-conductor-0" Feb 19 21:48:00 crc kubenswrapper[4795]: I0219 21:48:00.744699 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57b83043-2f7c-4b55-a2b9-66eef96f0008-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"57b83043-2f7c-4b55-a2b9-66eef96f0008\") " pod="openstack/nova-cell0-conductor-0" Feb 19 21:48:00 crc kubenswrapper[4795]: I0219 21:48:00.761121 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v44bq\" (UniqueName: \"kubernetes.io/projected/57b83043-2f7c-4b55-a2b9-66eef96f0008-kube-api-access-v44bq\") pod \"nova-cell0-conductor-0\" (UID: \"57b83043-2f7c-4b55-a2b9-66eef96f0008\") " pod="openstack/nova-cell0-conductor-0" Feb 19 21:48:00 crc kubenswrapper[4795]: I0219 21:48:00.888238 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 19 21:48:01 crc kubenswrapper[4795]: I0219 21:48:01.371073 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 19 21:48:01 crc kubenswrapper[4795]: I0219 21:48:01.492985 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"57b83043-2f7c-4b55-a2b9-66eef96f0008","Type":"ContainerStarted","Data":"83f719d65e236fae031c225d4f8065a2b4c198be5a5993edbe70af70bfebe600"} Feb 19 21:48:02 crc kubenswrapper[4795]: I0219 21:48:02.502643 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"57b83043-2f7c-4b55-a2b9-66eef96f0008","Type":"ContainerStarted","Data":"4f011f521b748247939cf0cf1c345e321c4287fa532812c9357da77b932cf4c6"} Feb 19 21:48:02 crc kubenswrapper[4795]: I0219 21:48:02.503107 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Feb 19 21:48:02 crc kubenswrapper[4795]: I0219 21:48:02.535224 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.535156602 podStartE2EDuration="2.535156602s" podCreationTimestamp="2026-02-19 21:48:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:48:02.524319177 +0000 UTC m=+1193.716837031" watchObservedRunningTime="2026-02-19 21:48:02.535156602 +0000 UTC 
m=+1193.727674466" Feb 19 21:48:10 crc kubenswrapper[4795]: I0219 21:48:10.938465 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.440493 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-x4mls"] Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.442346 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-x4mls" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.445268 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.445277 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.468492 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-x4mls"] Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.582218 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8cadc24-23ed-4063-8e1a-47a27c1d6ffd-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-x4mls\" (UID: \"f8cadc24-23ed-4063-8e1a-47a27c1d6ffd\") " pod="openstack/nova-cell0-cell-mapping-x4mls" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.582305 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8cadc24-23ed-4063-8e1a-47a27c1d6ffd-config-data\") pod \"nova-cell0-cell-mapping-x4mls\" (UID: \"f8cadc24-23ed-4063-8e1a-47a27c1d6ffd\") " pod="openstack/nova-cell0-cell-mapping-x4mls" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.582426 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qggxc\" (UniqueName: \"kubernetes.io/projected/f8cadc24-23ed-4063-8e1a-47a27c1d6ffd-kube-api-access-qggxc\") pod \"nova-cell0-cell-mapping-x4mls\" (UID: \"f8cadc24-23ed-4063-8e1a-47a27c1d6ffd\") " pod="openstack/nova-cell0-cell-mapping-x4mls" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.582459 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8cadc24-23ed-4063-8e1a-47a27c1d6ffd-scripts\") pod \"nova-cell0-cell-mapping-x4mls\" (UID: \"f8cadc24-23ed-4063-8e1a-47a27c1d6ffd\") " pod="openstack/nova-cell0-cell-mapping-x4mls" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.611545 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.612891 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.615903 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.637514 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.684541 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8cadc24-23ed-4063-8e1a-47a27c1d6ffd-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-x4mls\" (UID: \"f8cadc24-23ed-4063-8e1a-47a27c1d6ffd\") " pod="openstack/nova-cell0-cell-mapping-x4mls" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.684611 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1207388e-c327-4eb4-b81f-dee124375ca8-config-data\") pod \"nova-api-0\" (UID: \"1207388e-c327-4eb4-b81f-dee124375ca8\") " pod="openstack/nova-api-0" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.684646 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8cadc24-23ed-4063-8e1a-47a27c1d6ffd-config-data\") pod \"nova-cell0-cell-mapping-x4mls\" (UID: \"f8cadc24-23ed-4063-8e1a-47a27c1d6ffd\") " pod="openstack/nova-cell0-cell-mapping-x4mls" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.684732 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qggxc\" (UniqueName: \"kubernetes.io/projected/f8cadc24-23ed-4063-8e1a-47a27c1d6ffd-kube-api-access-qggxc\") pod \"nova-cell0-cell-mapping-x4mls\" (UID: \"f8cadc24-23ed-4063-8e1a-47a27c1d6ffd\") " pod="openstack/nova-cell0-cell-mapping-x4mls" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.684765 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8cadc24-23ed-4063-8e1a-47a27c1d6ffd-scripts\") pod \"nova-cell0-cell-mapping-x4mls\" (UID: \"f8cadc24-23ed-4063-8e1a-47a27c1d6ffd\") " pod="openstack/nova-cell0-cell-mapping-x4mls" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.684835 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1207388e-c327-4eb4-b81f-dee124375ca8-logs\") pod \"nova-api-0\" (UID: \"1207388e-c327-4eb4-b81f-dee124375ca8\") " pod="openstack/nova-api-0" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.684879 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8v6ct\" (UniqueName: \"kubernetes.io/projected/1207388e-c327-4eb4-b81f-dee124375ca8-kube-api-access-8v6ct\") pod \"nova-api-0\" (UID: 
\"1207388e-c327-4eb4-b81f-dee124375ca8\") " pod="openstack/nova-api-0" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.684921 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1207388e-c327-4eb4-b81f-dee124375ca8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1207388e-c327-4eb4-b81f-dee124375ca8\") " pod="openstack/nova-api-0" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.705041 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8cadc24-23ed-4063-8e1a-47a27c1d6ffd-config-data\") pod \"nova-cell0-cell-mapping-x4mls\" (UID: \"f8cadc24-23ed-4063-8e1a-47a27c1d6ffd\") " pod="openstack/nova-cell0-cell-mapping-x4mls" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.705439 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8cadc24-23ed-4063-8e1a-47a27c1d6ffd-scripts\") pod \"nova-cell0-cell-mapping-x4mls\" (UID: \"f8cadc24-23ed-4063-8e1a-47a27c1d6ffd\") " pod="openstack/nova-cell0-cell-mapping-x4mls" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.705504 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8cadc24-23ed-4063-8e1a-47a27c1d6ffd-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-x4mls\" (UID: \"f8cadc24-23ed-4063-8e1a-47a27c1d6ffd\") " pod="openstack/nova-cell0-cell-mapping-x4mls" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.715375 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.717910 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.719035 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qggxc\" (UniqueName: \"kubernetes.io/projected/f8cadc24-23ed-4063-8e1a-47a27c1d6ffd-kube-api-access-qggxc\") pod \"nova-cell0-cell-mapping-x4mls\" (UID: \"f8cadc24-23ed-4063-8e1a-47a27c1d6ffd\") " pod="openstack/nova-cell0-cell-mapping-x4mls" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.723521 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.731859 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.762604 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-x4mls" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.765247 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.766531 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.770840 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.786251 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5aecdca-84a6-4987-8b84-a95fbb0096f9-config-data\") pod \"nova-scheduler-0\" (UID: \"d5aecdca-84a6-4987-8b84-a95fbb0096f9\") " pod="openstack/nova-scheduler-0" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.786317 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5aecdca-84a6-4987-8b84-a95fbb0096f9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d5aecdca-84a6-4987-8b84-a95fbb0096f9\") " pod="openstack/nova-scheduler-0" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.786348 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1207388e-c327-4eb4-b81f-dee124375ca8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1207388e-c327-4eb4-b81f-dee124375ca8\") " pod="openstack/nova-api-0" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.786385 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1207388e-c327-4eb4-b81f-dee124375ca8-config-data\") pod \"nova-api-0\" (UID: \"1207388e-c327-4eb4-b81f-dee124375ca8\") " pod="openstack/nova-api-0" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.786453 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9jbt\" (UniqueName: \"kubernetes.io/projected/d5aecdca-84a6-4987-8b84-a95fbb0096f9-kube-api-access-w9jbt\") 
pod \"nova-scheduler-0\" (UID: \"d5aecdca-84a6-4987-8b84-a95fbb0096f9\") " pod="openstack/nova-scheduler-0" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.786517 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1207388e-c327-4eb4-b81f-dee124375ca8-logs\") pod \"nova-api-0\" (UID: \"1207388e-c327-4eb4-b81f-dee124375ca8\") " pod="openstack/nova-api-0" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.786541 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8v6ct\" (UniqueName: \"kubernetes.io/projected/1207388e-c327-4eb4-b81f-dee124375ca8-kube-api-access-8v6ct\") pod \"nova-api-0\" (UID: \"1207388e-c327-4eb4-b81f-dee124375ca8\") " pod="openstack/nova-api-0" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.799847 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1207388e-c327-4eb4-b81f-dee124375ca8-logs\") pod \"nova-api-0\" (UID: \"1207388e-c327-4eb4-b81f-dee124375ca8\") " pod="openstack/nova-api-0" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.800650 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1207388e-c327-4eb4-b81f-dee124375ca8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1207388e-c327-4eb4-b81f-dee124375ca8\") " pod="openstack/nova-api-0" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.804424 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1207388e-c327-4eb4-b81f-dee124375ca8-config-data\") pod \"nova-api-0\" (UID: \"1207388e-c327-4eb4-b81f-dee124375ca8\") " pod="openstack/nova-api-0" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.832830 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8v6ct\" (UniqueName: 
\"kubernetes.io/projected/1207388e-c327-4eb4-b81f-dee124375ca8-kube-api-access-8v6ct\") pod \"nova-api-0\" (UID: \"1207388e-c327-4eb4-b81f-dee124375ca8\") " pod="openstack/nova-api-0" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.835222 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.861629 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.863312 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.874703 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.888544 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5aecdca-84a6-4987-8b84-a95fbb0096f9-config-data\") pod \"nova-scheduler-0\" (UID: \"d5aecdca-84a6-4987-8b84-a95fbb0096f9\") " pod="openstack/nova-scheduler-0" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.888586 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5aecdca-84a6-4987-8b84-a95fbb0096f9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d5aecdca-84a6-4987-8b84-a95fbb0096f9\") " pod="openstack/nova-scheduler-0" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.888683 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dabff2c-427b-4307-b949-23fdde980292-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"4dabff2c-427b-4307-b949-23fdde980292\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 
21:48:11.888710 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9jbt\" (UniqueName: \"kubernetes.io/projected/d5aecdca-84a6-4987-8b84-a95fbb0096f9-kube-api-access-w9jbt\") pod \"nova-scheduler-0\" (UID: \"d5aecdca-84a6-4987-8b84-a95fbb0096f9\") " pod="openstack/nova-scheduler-0" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.888740 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29qhd\" (UniqueName: \"kubernetes.io/projected/4dabff2c-427b-4307-b949-23fdde980292-kube-api-access-29qhd\") pod \"nova-cell1-novncproxy-0\" (UID: \"4dabff2c-427b-4307-b949-23fdde980292\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.888792 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dabff2c-427b-4307-b949-23fdde980292-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"4dabff2c-427b-4307-b949-23fdde980292\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.907810 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5aecdca-84a6-4987-8b84-a95fbb0096f9-config-data\") pod \"nova-scheduler-0\" (UID: \"d5aecdca-84a6-4987-8b84-a95fbb0096f9\") " pod="openstack/nova-scheduler-0" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.910633 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5aecdca-84a6-4987-8b84-a95fbb0096f9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d5aecdca-84a6-4987-8b84-a95fbb0096f9\") " pod="openstack/nova-scheduler-0" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.914531 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-w9jbt\" (UniqueName: \"kubernetes.io/projected/d5aecdca-84a6-4987-8b84-a95fbb0096f9-kube-api-access-w9jbt\") pod \"nova-scheduler-0\" (UID: \"d5aecdca-84a6-4987-8b84-a95fbb0096f9\") " pod="openstack/nova-scheduler-0" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.914592 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.923423 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75ddbf7c75-f5qcb"] Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.925268 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75ddbf7c75-f5qcb" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.930528 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.940918 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75ddbf7c75-f5qcb"] Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.957545 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.989926 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lht4g\" (UniqueName: \"kubernetes.io/projected/c55a0492-b022-4514-85d5-d35d3ec46f05-kube-api-access-lht4g\") pod \"nova-metadata-0\" (UID: \"c55a0492-b022-4514-85d5-d35d3ec46f05\") " pod="openstack/nova-metadata-0" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.989964 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c55a0492-b022-4514-85d5-d35d3ec46f05-config-data\") pod \"nova-metadata-0\" (UID: \"c55a0492-b022-4514-85d5-d35d3ec46f05\") " pod="openstack/nova-metadata-0" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.989987 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c55a0492-b022-4514-85d5-d35d3ec46f05-logs\") pod \"nova-metadata-0\" (UID: \"c55a0492-b022-4514-85d5-d35d3ec46f05\") " pod="openstack/nova-metadata-0" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.990241 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/59d807ee-6555-4c2f-8598-9f264d5a95f9-dns-svc\") pod \"dnsmasq-dns-75ddbf7c75-f5qcb\" (UID: \"59d807ee-6555-4c2f-8598-9f264d5a95f9\") " pod="openstack/dnsmasq-dns-75ddbf7c75-f5qcb" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.990310 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dabff2c-427b-4307-b949-23fdde980292-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"4dabff2c-427b-4307-b949-23fdde980292\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 
21:48:11.990394 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29qhd\" (UniqueName: \"kubernetes.io/projected/4dabff2c-427b-4307-b949-23fdde980292-kube-api-access-29qhd\") pod \"nova-cell1-novncproxy-0\" (UID: \"4dabff2c-427b-4307-b949-23fdde980292\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.990468 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59d807ee-6555-4c2f-8598-9f264d5a95f9-config\") pod \"dnsmasq-dns-75ddbf7c75-f5qcb\" (UID: \"59d807ee-6555-4c2f-8598-9f264d5a95f9\") " pod="openstack/dnsmasq-dns-75ddbf7c75-f5qcb" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.990502 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/59d807ee-6555-4c2f-8598-9f264d5a95f9-ovsdbserver-sb\") pod \"dnsmasq-dns-75ddbf7c75-f5qcb\" (UID: \"59d807ee-6555-4c2f-8598-9f264d5a95f9\") " pod="openstack/dnsmasq-dns-75ddbf7c75-f5qcb" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.990604 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dabff2c-427b-4307-b949-23fdde980292-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"4dabff2c-427b-4307-b949-23fdde980292\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.990659 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/59d807ee-6555-4c2f-8598-9f264d5a95f9-ovsdbserver-nb\") pod \"dnsmasq-dns-75ddbf7c75-f5qcb\" (UID: \"59d807ee-6555-4c2f-8598-9f264d5a95f9\") " pod="openstack/dnsmasq-dns-75ddbf7c75-f5qcb" Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 
21:48:11.990708 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zrmm\" (UniqueName: \"kubernetes.io/projected/59d807ee-6555-4c2f-8598-9f264d5a95f9-kube-api-access-5zrmm\") pod \"dnsmasq-dns-75ddbf7c75-f5qcb\" (UID: \"59d807ee-6555-4c2f-8598-9f264d5a95f9\") " pod="openstack/dnsmasq-dns-75ddbf7c75-f5qcb"
Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.990732 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/59d807ee-6555-4c2f-8598-9f264d5a95f9-dns-swift-storage-0\") pod \"dnsmasq-dns-75ddbf7c75-f5qcb\" (UID: \"59d807ee-6555-4c2f-8598-9f264d5a95f9\") " pod="openstack/dnsmasq-dns-75ddbf7c75-f5qcb"
Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.990756 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c55a0492-b022-4514-85d5-d35d3ec46f05-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c55a0492-b022-4514-85d5-d35d3ec46f05\") " pod="openstack/nova-metadata-0"
Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.993870 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dabff2c-427b-4307-b949-23fdde980292-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"4dabff2c-427b-4307-b949-23fdde980292\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 19 21:48:11 crc kubenswrapper[4795]: I0219 21:48:11.995657 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dabff2c-427b-4307-b949-23fdde980292-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"4dabff2c-427b-4307-b949-23fdde980292\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.006194 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29qhd\" (UniqueName: \"kubernetes.io/projected/4dabff2c-427b-4307-b949-23fdde980292-kube-api-access-29qhd\") pod \"nova-cell1-novncproxy-0\" (UID: \"4dabff2c-427b-4307-b949-23fdde980292\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.092690 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/59d807ee-6555-4c2f-8598-9f264d5a95f9-dns-svc\") pod \"dnsmasq-dns-75ddbf7c75-f5qcb\" (UID: \"59d807ee-6555-4c2f-8598-9f264d5a95f9\") " pod="openstack/dnsmasq-dns-75ddbf7c75-f5qcb"
Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.092790 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59d807ee-6555-4c2f-8598-9f264d5a95f9-config\") pod \"dnsmasq-dns-75ddbf7c75-f5qcb\" (UID: \"59d807ee-6555-4c2f-8598-9f264d5a95f9\") " pod="openstack/dnsmasq-dns-75ddbf7c75-f5qcb"
Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.092821 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/59d807ee-6555-4c2f-8598-9f264d5a95f9-ovsdbserver-sb\") pod \"dnsmasq-dns-75ddbf7c75-f5qcb\" (UID: \"59d807ee-6555-4c2f-8598-9f264d5a95f9\") " pod="openstack/dnsmasq-dns-75ddbf7c75-f5qcb"
Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.092877 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/59d807ee-6555-4c2f-8598-9f264d5a95f9-ovsdbserver-nb\") pod \"dnsmasq-dns-75ddbf7c75-f5qcb\" (UID: \"59d807ee-6555-4c2f-8598-9f264d5a95f9\") " pod="openstack/dnsmasq-dns-75ddbf7c75-f5qcb"
Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.092903 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zrmm\" (UniqueName: \"kubernetes.io/projected/59d807ee-6555-4c2f-8598-9f264d5a95f9-kube-api-access-5zrmm\") pod \"dnsmasq-dns-75ddbf7c75-f5qcb\" (UID: \"59d807ee-6555-4c2f-8598-9f264d5a95f9\") " pod="openstack/dnsmasq-dns-75ddbf7c75-f5qcb"
Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.092930 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/59d807ee-6555-4c2f-8598-9f264d5a95f9-dns-swift-storage-0\") pod \"dnsmasq-dns-75ddbf7c75-f5qcb\" (UID: \"59d807ee-6555-4c2f-8598-9f264d5a95f9\") " pod="openstack/dnsmasq-dns-75ddbf7c75-f5qcb"
Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.092959 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c55a0492-b022-4514-85d5-d35d3ec46f05-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c55a0492-b022-4514-85d5-d35d3ec46f05\") " pod="openstack/nova-metadata-0"
Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.092983 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lht4g\" (UniqueName: \"kubernetes.io/projected/c55a0492-b022-4514-85d5-d35d3ec46f05-kube-api-access-lht4g\") pod \"nova-metadata-0\" (UID: \"c55a0492-b022-4514-85d5-d35d3ec46f05\") " pod="openstack/nova-metadata-0"
Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.092999 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c55a0492-b022-4514-85d5-d35d3ec46f05-config-data\") pod \"nova-metadata-0\" (UID: \"c55a0492-b022-4514-85d5-d35d3ec46f05\") " pod="openstack/nova-metadata-0"
Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.093033 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c55a0492-b022-4514-85d5-d35d3ec46f05-logs\") pod \"nova-metadata-0\" (UID: \"c55a0492-b022-4514-85d5-d35d3ec46f05\") " pod="openstack/nova-metadata-0"
Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.093750 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c55a0492-b022-4514-85d5-d35d3ec46f05-logs\") pod \"nova-metadata-0\" (UID: \"c55a0492-b022-4514-85d5-d35d3ec46f05\") " pod="openstack/nova-metadata-0"
Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.094501 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/59d807ee-6555-4c2f-8598-9f264d5a95f9-dns-swift-storage-0\") pod \"dnsmasq-dns-75ddbf7c75-f5qcb\" (UID: \"59d807ee-6555-4c2f-8598-9f264d5a95f9\") " pod="openstack/dnsmasq-dns-75ddbf7c75-f5qcb"
Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.095844 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59d807ee-6555-4c2f-8598-9f264d5a95f9-config\") pod \"dnsmasq-dns-75ddbf7c75-f5qcb\" (UID: \"59d807ee-6555-4c2f-8598-9f264d5a95f9\") " pod="openstack/dnsmasq-dns-75ddbf7c75-f5qcb"
Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.096302 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/59d807ee-6555-4c2f-8598-9f264d5a95f9-ovsdbserver-nb\") pod \"dnsmasq-dns-75ddbf7c75-f5qcb\" (UID: \"59d807ee-6555-4c2f-8598-9f264d5a95f9\") " pod="openstack/dnsmasq-dns-75ddbf7c75-f5qcb"
Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.096479 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/59d807ee-6555-4c2f-8598-9f264d5a95f9-ovsdbserver-sb\") pod \"dnsmasq-dns-75ddbf7c75-f5qcb\" (UID: \"59d807ee-6555-4c2f-8598-9f264d5a95f9\") " pod="openstack/dnsmasq-dns-75ddbf7c75-f5qcb"
Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.097141 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/59d807ee-6555-4c2f-8598-9f264d5a95f9-dns-svc\") pod \"dnsmasq-dns-75ddbf7c75-f5qcb\" (UID: \"59d807ee-6555-4c2f-8598-9f264d5a95f9\") " pod="openstack/dnsmasq-dns-75ddbf7c75-f5qcb"
Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.100553 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c55a0492-b022-4514-85d5-d35d3ec46f05-config-data\") pod \"nova-metadata-0\" (UID: \"c55a0492-b022-4514-85d5-d35d3ec46f05\") " pod="openstack/nova-metadata-0"
Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.101313 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c55a0492-b022-4514-85d5-d35d3ec46f05-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c55a0492-b022-4514-85d5-d35d3ec46f05\") " pod="openstack/nova-metadata-0"
Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.111852 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zrmm\" (UniqueName: \"kubernetes.io/projected/59d807ee-6555-4c2f-8598-9f264d5a95f9-kube-api-access-5zrmm\") pod \"dnsmasq-dns-75ddbf7c75-f5qcb\" (UID: \"59d807ee-6555-4c2f-8598-9f264d5a95f9\") " pod="openstack/dnsmasq-dns-75ddbf7c75-f5qcb"
Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.114248 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lht4g\" (UniqueName: \"kubernetes.io/projected/c55a0492-b022-4514-85d5-d35d3ec46f05-kube-api-access-lht4g\") pod \"nova-metadata-0\" (UID: \"c55a0492-b022-4514-85d5-d35d3ec46f05\") " pod="openstack/nova-metadata-0"
Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.272141 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.288968 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.296756 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75ddbf7c75-f5qcb"
Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.420918 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-x4mls"]
Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.535153 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.608316 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-x4mls" event={"ID":"f8cadc24-23ed-4063-8e1a-47a27c1d6ffd","Type":"ContainerStarted","Data":"88f56eb2aa26f82d9b17df46fcb912c867ec3d628b4dff9a39f509961e5df80b"}
Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.621764 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1207388e-c327-4eb4-b81f-dee124375ca8","Type":"ContainerStarted","Data":"70232baf2a46181b0fb51eefd50334fc1763134daec5b9978cd7cc19312a07a8"}
Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.626361 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-gxh8d"]
Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.627529 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-gxh8d"
Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.630379 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts"
Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.630633 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.640152 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.656997 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-gxh8d"]
Feb 19 21:48:12 crc kubenswrapper[4795]: W0219 21:48:12.657870 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5aecdca_84a6_4987_8b84_a95fbb0096f9.slice/crio-13cacc8b3b7ac3be0ed861e620fc89ce3271fa1dd8179aa143ee4013f7bace77 WatchSource:0}: Error finding container 13cacc8b3b7ac3be0ed861e620fc89ce3271fa1dd8179aa143ee4013f7bace77: Status 404 returned error can't find the container with id 13cacc8b3b7ac3be0ed861e620fc89ce3271fa1dd8179aa143ee4013f7bace77
Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.693156 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.710889 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrmdc\" (UniqueName: \"kubernetes.io/projected/fd8e89cd-b890-4f36-9008-59767ccbad91-kube-api-access-rrmdc\") pod \"nova-cell1-conductor-db-sync-gxh8d\" (UID: \"fd8e89cd-b890-4f36-9008-59767ccbad91\") " pod="openstack/nova-cell1-conductor-db-sync-gxh8d"
Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.710936 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd8e89cd-b890-4f36-9008-59767ccbad91-config-data\") pod \"nova-cell1-conductor-db-sync-gxh8d\" (UID: \"fd8e89cd-b890-4f36-9008-59767ccbad91\") " pod="openstack/nova-cell1-conductor-db-sync-gxh8d"
Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.710971 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd8e89cd-b890-4f36-9008-59767ccbad91-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-gxh8d\" (UID: \"fd8e89cd-b890-4f36-9008-59767ccbad91\") " pod="openstack/nova-cell1-conductor-db-sync-gxh8d"
Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.711009 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd8e89cd-b890-4f36-9008-59767ccbad91-scripts\") pod \"nova-cell1-conductor-db-sync-gxh8d\" (UID: \"fd8e89cd-b890-4f36-9008-59767ccbad91\") " pod="openstack/nova-cell1-conductor-db-sync-gxh8d"
Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.812964 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd8e89cd-b890-4f36-9008-59767ccbad91-scripts\") pod \"nova-cell1-conductor-db-sync-gxh8d\" (UID: \"fd8e89cd-b890-4f36-9008-59767ccbad91\") " pod="openstack/nova-cell1-conductor-db-sync-gxh8d"
Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.813123 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrmdc\" (UniqueName: \"kubernetes.io/projected/fd8e89cd-b890-4f36-9008-59767ccbad91-kube-api-access-rrmdc\") pod \"nova-cell1-conductor-db-sync-gxh8d\" (UID: \"fd8e89cd-b890-4f36-9008-59767ccbad91\") " pod="openstack/nova-cell1-conductor-db-sync-gxh8d"
Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.813182 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd8e89cd-b890-4f36-9008-59767ccbad91-config-data\") pod \"nova-cell1-conductor-db-sync-gxh8d\" (UID: \"fd8e89cd-b890-4f36-9008-59767ccbad91\") " pod="openstack/nova-cell1-conductor-db-sync-gxh8d"
Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.813229 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd8e89cd-b890-4f36-9008-59767ccbad91-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-gxh8d\" (UID: \"fd8e89cd-b890-4f36-9008-59767ccbad91\") " pod="openstack/nova-cell1-conductor-db-sync-gxh8d"
Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.822633 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd8e89cd-b890-4f36-9008-59767ccbad91-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-gxh8d\" (UID: \"fd8e89cd-b890-4f36-9008-59767ccbad91\") " pod="openstack/nova-cell1-conductor-db-sync-gxh8d"
Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.835100 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd8e89cd-b890-4f36-9008-59767ccbad91-config-data\") pod \"nova-cell1-conductor-db-sync-gxh8d\" (UID: \"fd8e89cd-b890-4f36-9008-59767ccbad91\") " pod="openstack/nova-cell1-conductor-db-sync-gxh8d"
Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.840796 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd8e89cd-b890-4f36-9008-59767ccbad91-scripts\") pod \"nova-cell1-conductor-db-sync-gxh8d\" (UID: \"fd8e89cd-b890-4f36-9008-59767ccbad91\") " pod="openstack/nova-cell1-conductor-db-sync-gxh8d"
Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.843171 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.843223 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrmdc\" (UniqueName: \"kubernetes.io/projected/fd8e89cd-b890-4f36-9008-59767ccbad91-kube-api-access-rrmdc\") pod \"nova-cell1-conductor-db-sync-gxh8d\" (UID: \"fd8e89cd-b890-4f36-9008-59767ccbad91\") " pod="openstack/nova-cell1-conductor-db-sync-gxh8d"
Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.954071 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-gxh8d"
Feb 19 21:48:12 crc kubenswrapper[4795]: I0219 21:48:12.993279 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75ddbf7c75-f5qcb"]
Feb 19 21:48:13 crc kubenswrapper[4795]: W0219 21:48:13.006503 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod59d807ee_6555_4c2f_8598_9f264d5a95f9.slice/crio-919c6d3f60ce379525cac92e740a7e9dd73e2c7e47ddc1efebc10b6306ebfc94 WatchSource:0}: Error finding container 919c6d3f60ce379525cac92e740a7e9dd73e2c7e47ddc1efebc10b6306ebfc94: Status 404 returned error can't find the container with id 919c6d3f60ce379525cac92e740a7e9dd73e2c7e47ddc1efebc10b6306ebfc94
Feb 19 21:48:13 crc kubenswrapper[4795]: I0219 21:48:13.426246 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-gxh8d"]
Feb 19 21:48:13 crc kubenswrapper[4795]: I0219 21:48:13.634028 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4dabff2c-427b-4307-b949-23fdde980292","Type":"ContainerStarted","Data":"7a24e1fe5ec311d17f9a97363d6465ddfdef8ead02652c4242800fc85c6ff620"}
Feb 19 21:48:13 crc kubenswrapper[4795]: I0219 21:48:13.636728 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-x4mls" event={"ID":"f8cadc24-23ed-4063-8e1a-47a27c1d6ffd","Type":"ContainerStarted","Data":"54158f46698b83516d68eee860a71369629e83ec491495039fcc2f5f6a5bd317"}
Feb 19 21:48:13 crc kubenswrapper[4795]: I0219 21:48:13.639413 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c55a0492-b022-4514-85d5-d35d3ec46f05","Type":"ContainerStarted","Data":"9eb4a81c900f6ce7b3c066bf56f65f67c3231408465f0d632475d3104e06db3a"}
Feb 19 21:48:13 crc kubenswrapper[4795]: I0219 21:48:13.642481 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d5aecdca-84a6-4987-8b84-a95fbb0096f9","Type":"ContainerStarted","Data":"13cacc8b3b7ac3be0ed861e620fc89ce3271fa1dd8179aa143ee4013f7bace77"}
Feb 19 21:48:13 crc kubenswrapper[4795]: I0219 21:48:13.643968 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-gxh8d" event={"ID":"fd8e89cd-b890-4f36-9008-59767ccbad91","Type":"ContainerStarted","Data":"199857e63a023ba2f3b45450f39bdc22aaf434e44f27b4974099a3d7f5c19db9"}
Feb 19 21:48:13 crc kubenswrapper[4795]: I0219 21:48:13.649519 4795 generic.go:334] "Generic (PLEG): container finished" podID="59d807ee-6555-4c2f-8598-9f264d5a95f9" containerID="c4b13b710aaac472f707d3124011993b912c3a0c284fec1b64e6081245b45895" exitCode=0
Feb 19 21:48:13 crc kubenswrapper[4795]: I0219 21:48:13.649573 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75ddbf7c75-f5qcb" event={"ID":"59d807ee-6555-4c2f-8598-9f264d5a95f9","Type":"ContainerDied","Data":"c4b13b710aaac472f707d3124011993b912c3a0c284fec1b64e6081245b45895"}
Feb 19 21:48:13 crc kubenswrapper[4795]: I0219 21:48:13.649598 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75ddbf7c75-f5qcb" event={"ID":"59d807ee-6555-4c2f-8598-9f264d5a95f9","Type":"ContainerStarted","Data":"919c6d3f60ce379525cac92e740a7e9dd73e2c7e47ddc1efebc10b6306ebfc94"}
Feb 19 21:48:13 crc kubenswrapper[4795]: I0219 21:48:13.652404 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-x4mls" podStartSLOduration=2.652389391 podStartE2EDuration="2.652389391s" podCreationTimestamp="2026-02-19 21:48:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:48:13.649564192 +0000 UTC m=+1204.842082056" watchObservedRunningTime="2026-02-19 21:48:13.652389391 +0000 UTC m=+1204.844907255"
Feb 19 21:48:14 crc kubenswrapper[4795]: I0219 21:48:14.668856 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75ddbf7c75-f5qcb" event={"ID":"59d807ee-6555-4c2f-8598-9f264d5a95f9","Type":"ContainerStarted","Data":"fd261f043799d7df60456eb5576e328975c0d9086db0abbd02ad64c7bf8efe3f"}
Feb 19 21:48:14 crc kubenswrapper[4795]: I0219 21:48:14.670231 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-75ddbf7c75-f5qcb"
Feb 19 21:48:14 crc kubenswrapper[4795]: I0219 21:48:14.675222 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-gxh8d" event={"ID":"fd8e89cd-b890-4f36-9008-59767ccbad91","Type":"ContainerStarted","Data":"971150e9373b5419fd5efc7fc3e78faafd9906f987cb7e0a9e042f04e22cb5a9"}
Feb 19 21:48:14 crc kubenswrapper[4795]: I0219 21:48:14.694040 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-75ddbf7c75-f5qcb" podStartSLOduration=3.694018026 podStartE2EDuration="3.694018026s" podCreationTimestamp="2026-02-19 21:48:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:48:14.692317528 +0000 UTC m=+1205.884835392" watchObservedRunningTime="2026-02-19 21:48:14.694018026 +0000 UTC m=+1205.886535890"
Feb 19 21:48:14 crc kubenswrapper[4795]: I0219 21:48:14.711761 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-gxh8d" podStartSLOduration=2.711745594 podStartE2EDuration="2.711745594s" podCreationTimestamp="2026-02-19 21:48:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:48:14.707907076 +0000 UTC m=+1205.900424940" watchObservedRunningTime="2026-02-19 21:48:14.711745594 +0000 UTC m=+1205.904263458"
Feb 19 21:48:15 crc kubenswrapper[4795]: I0219 21:48:15.545234 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 19 21:48:15 crc kubenswrapper[4795]: I0219 21:48:15.556204 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 19 21:48:16 crc kubenswrapper[4795]: I0219 21:48:16.694128 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4dabff2c-427b-4307-b949-23fdde980292","Type":"ContainerStarted","Data":"0985f63802fd352b3aee39c14704d73368fdf114b51ac51270903568f12ddc45"}
Feb 19 21:48:16 crc kubenswrapper[4795]: I0219 21:48:16.694690 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="4dabff2c-427b-4307-b949-23fdde980292" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://0985f63802fd352b3aee39c14704d73368fdf114b51ac51270903568f12ddc45" gracePeriod=30
Feb 19 21:48:16 crc kubenswrapper[4795]: I0219 21:48:16.696409 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c55a0492-b022-4514-85d5-d35d3ec46f05","Type":"ContainerStarted","Data":"2f6aff4ac645628fb823f2d71124e062624305f8587485413fc9b563acf03bbe"}
Feb 19 21:48:16 crc kubenswrapper[4795]: I0219 21:48:16.696458 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c55a0492-b022-4514-85d5-d35d3ec46f05","Type":"ContainerStarted","Data":"2a739b06c788943900a9e55ea75887127270b3bfda7bb2ff0dd2d55b80e6c596"}
Feb 19 21:48:16 crc kubenswrapper[4795]: I0219 21:48:16.696613 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c55a0492-b022-4514-85d5-d35d3ec46f05" containerName="nova-metadata-log" containerID="cri-o://2a739b06c788943900a9e55ea75887127270b3bfda7bb2ff0dd2d55b80e6c596" gracePeriod=30
Feb 19 21:48:16 crc kubenswrapper[4795]: I0219 21:48:16.696727 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c55a0492-b022-4514-85d5-d35d3ec46f05" containerName="nova-metadata-metadata" containerID="cri-o://2f6aff4ac645628fb823f2d71124e062624305f8587485413fc9b563acf03bbe" gracePeriod=30
Feb 19 21:48:16 crc kubenswrapper[4795]: I0219 21:48:16.699047 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d5aecdca-84a6-4987-8b84-a95fbb0096f9","Type":"ContainerStarted","Data":"6ce35faeac6fd37dfcca3afbda3833acb5b7e4a468547fda58ae378a06b641f6"}
Feb 19 21:48:16 crc kubenswrapper[4795]: I0219 21:48:16.709403 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1207388e-c327-4eb4-b81f-dee124375ca8","Type":"ContainerStarted","Data":"70ad9e55a15e5820942ade2e6910d9f4bda067bc07e673f00b4a59d72193b117"}
Feb 19 21:48:16 crc kubenswrapper[4795]: I0219 21:48:16.709434 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1207388e-c327-4eb4-b81f-dee124375ca8","Type":"ContainerStarted","Data":"2918358fad24b90cece507f40bf90974e931a773750ddc328657552d135bcffc"}
Feb 19 21:48:16 crc kubenswrapper[4795]: I0219 21:48:16.712883 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.964727427 podStartE2EDuration="5.712864747s" podCreationTimestamp="2026-02-19 21:48:11 +0000 UTC" firstStartedPulling="2026-02-19 21:48:12.827740463 +0000 UTC m=+1204.020258327" lastFinishedPulling="2026-02-19 21:48:15.575877783 +0000 UTC m=+1206.768395647" observedRunningTime="2026-02-19 21:48:16.710387857 +0000 UTC m=+1207.902905721" watchObservedRunningTime="2026-02-19 21:48:16.712864747 +0000 UTC m=+1207.905382611"
Feb 19 21:48:16 crc kubenswrapper[4795]: I0219 21:48:16.733814 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.9415001050000003 podStartE2EDuration="5.733793835s" podCreationTimestamp="2026-02-19 21:48:11 +0000 UTC" firstStartedPulling="2026-02-19 21:48:12.714366068 +0000 UTC m=+1203.906883932" lastFinishedPulling="2026-02-19 21:48:15.506659798 +0000 UTC m=+1206.699177662" observedRunningTime="2026-02-19 21:48:16.733497996 +0000 UTC m=+1207.926015860" watchObservedRunningTime="2026-02-19 21:48:16.733793835 +0000 UTC m=+1207.926311699"
Feb 19 21:48:16 crc kubenswrapper[4795]: I0219 21:48:16.757118 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.736723271 podStartE2EDuration="5.757099019s" podCreationTimestamp="2026-02-19 21:48:11 +0000 UTC" firstStartedPulling="2026-02-19 21:48:12.555628388 +0000 UTC m=+1203.748146252" lastFinishedPulling="2026-02-19 21:48:15.576004136 +0000 UTC m=+1206.768522000" observedRunningTime="2026-02-19 21:48:16.751279485 +0000 UTC m=+1207.943797369" watchObservedRunningTime="2026-02-19 21:48:16.757099019 +0000 UTC m=+1207.949616883"
Feb 19 21:48:16 crc kubenswrapper[4795]: I0219 21:48:16.770724 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.927286374 podStartE2EDuration="5.770708191s" podCreationTimestamp="2026-02-19 21:48:11 +0000 UTC" firstStartedPulling="2026-02-19 21:48:12.664442765 +0000 UTC m=+1203.856960629" lastFinishedPulling="2026-02-19 21:48:15.507864582 +0000 UTC m=+1206.700382446" observedRunningTime="2026-02-19 21:48:16.765911936 +0000 UTC m=+1207.958429800" watchObservedRunningTime="2026-02-19 21:48:16.770708191 +0000 UTC m=+1207.963226055"
Feb 19 21:48:16 crc kubenswrapper[4795]: I0219 21:48:16.958479 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.272787 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.291280 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.291343 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.345114 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.532603 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lht4g\" (UniqueName: \"kubernetes.io/projected/c55a0492-b022-4514-85d5-d35d3ec46f05-kube-api-access-lht4g\") pod \"c55a0492-b022-4514-85d5-d35d3ec46f05\" (UID: \"c55a0492-b022-4514-85d5-d35d3ec46f05\") "
Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.532679 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c55a0492-b022-4514-85d5-d35d3ec46f05-logs\") pod \"c55a0492-b022-4514-85d5-d35d3ec46f05\" (UID: \"c55a0492-b022-4514-85d5-d35d3ec46f05\") "
Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.532835 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c55a0492-b022-4514-85d5-d35d3ec46f05-config-data\") pod \"c55a0492-b022-4514-85d5-d35d3ec46f05\" (UID: \"c55a0492-b022-4514-85d5-d35d3ec46f05\") "
Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.532891 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c55a0492-b022-4514-85d5-d35d3ec46f05-combined-ca-bundle\") pod \"c55a0492-b022-4514-85d5-d35d3ec46f05\" (UID: \"c55a0492-b022-4514-85d5-d35d3ec46f05\") "
Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.533700 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c55a0492-b022-4514-85d5-d35d3ec46f05-logs" (OuterVolumeSpecName: "logs") pod "c55a0492-b022-4514-85d5-d35d3ec46f05" (UID: "c55a0492-b022-4514-85d5-d35d3ec46f05"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.553977 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c55a0492-b022-4514-85d5-d35d3ec46f05-kube-api-access-lht4g" (OuterVolumeSpecName: "kube-api-access-lht4g") pod "c55a0492-b022-4514-85d5-d35d3ec46f05" (UID: "c55a0492-b022-4514-85d5-d35d3ec46f05"). InnerVolumeSpecName "kube-api-access-lht4g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.561429 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c55a0492-b022-4514-85d5-d35d3ec46f05-config-data" (OuterVolumeSpecName: "config-data") pod "c55a0492-b022-4514-85d5-d35d3ec46f05" (UID: "c55a0492-b022-4514-85d5-d35d3ec46f05"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.570907 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c55a0492-b022-4514-85d5-d35d3ec46f05-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c55a0492-b022-4514-85d5-d35d3ec46f05" (UID: "c55a0492-b022-4514-85d5-d35d3ec46f05"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.635258 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lht4g\" (UniqueName: \"kubernetes.io/projected/c55a0492-b022-4514-85d5-d35d3ec46f05-kube-api-access-lht4g\") on node \"crc\" DevicePath \"\""
Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.635295 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c55a0492-b022-4514-85d5-d35d3ec46f05-logs\") on node \"crc\" DevicePath \"\""
Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.635308 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c55a0492-b022-4514-85d5-d35d3ec46f05-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.635322 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c55a0492-b022-4514-85d5-d35d3ec46f05-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.722381 4795 generic.go:334] "Generic (PLEG): container finished" podID="c55a0492-b022-4514-85d5-d35d3ec46f05" containerID="2f6aff4ac645628fb823f2d71124e062624305f8587485413fc9b563acf03bbe" exitCode=0
Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.722409 4795 generic.go:334] "Generic (PLEG): container finished" podID="c55a0492-b022-4514-85d5-d35d3ec46f05" containerID="2a739b06c788943900a9e55ea75887127270b3bfda7bb2ff0dd2d55b80e6c596" exitCode=143
Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.722428 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c55a0492-b022-4514-85d5-d35d3ec46f05","Type":"ContainerDied","Data":"2f6aff4ac645628fb823f2d71124e062624305f8587485413fc9b563acf03bbe"}
Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.722489 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.722506 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c55a0492-b022-4514-85d5-d35d3ec46f05","Type":"ContainerDied","Data":"2a739b06c788943900a9e55ea75887127270b3bfda7bb2ff0dd2d55b80e6c596"}
Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.722531 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c55a0492-b022-4514-85d5-d35d3ec46f05","Type":"ContainerDied","Data":"9eb4a81c900f6ce7b3c066bf56f65f67c3231408465f0d632475d3104e06db3a"}
Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.722561 4795 scope.go:117] "RemoveContainer" containerID="2f6aff4ac645628fb823f2d71124e062624305f8587485413fc9b563acf03bbe"
Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.761106 4795 scope.go:117] "RemoveContainer" containerID="2a739b06c788943900a9e55ea75887127270b3bfda7bb2ff0dd2d55b80e6c596"
Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.786771 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.788432 4795 scope.go:117] "RemoveContainer" containerID="2f6aff4ac645628fb823f2d71124e062624305f8587485413fc9b563acf03bbe"
Feb 19 21:48:17 crc kubenswrapper[4795]: E0219 21:48:17.788885 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f6aff4ac645628fb823f2d71124e062624305f8587485413fc9b563acf03bbe\": container with ID starting with 2f6aff4ac645628fb823f2d71124e062624305f8587485413fc9b563acf03bbe not found: ID does not exist" containerID="2f6aff4ac645628fb823f2d71124e062624305f8587485413fc9b563acf03bbe"
Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.788931 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f6aff4ac645628fb823f2d71124e062624305f8587485413fc9b563acf03bbe"} err="failed to get container status \"2f6aff4ac645628fb823f2d71124e062624305f8587485413fc9b563acf03bbe\": rpc error: code = NotFound desc = could not find container \"2f6aff4ac645628fb823f2d71124e062624305f8587485413fc9b563acf03bbe\": container with ID starting with 2f6aff4ac645628fb823f2d71124e062624305f8587485413fc9b563acf03bbe not found: ID does not exist"
Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.788958 4795 scope.go:117] "RemoveContainer" containerID="2a739b06c788943900a9e55ea75887127270b3bfda7bb2ff0dd2d55b80e6c596"
Feb 19 21:48:17 crc kubenswrapper[4795]: E0219 21:48:17.789377 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a739b06c788943900a9e55ea75887127270b3bfda7bb2ff0dd2d55b80e6c596\": container with ID starting with 2a739b06c788943900a9e55ea75887127270b3bfda7bb2ff0dd2d55b80e6c596 not found: ID does not exist" containerID="2a739b06c788943900a9e55ea75887127270b3bfda7bb2ff0dd2d55b80e6c596"
Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.789417 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a739b06c788943900a9e55ea75887127270b3bfda7bb2ff0dd2d55b80e6c596"} err="failed to get container status \"2a739b06c788943900a9e55ea75887127270b3bfda7bb2ff0dd2d55b80e6c596\": rpc error: code = NotFound desc = could not find container \"2a739b06c788943900a9e55ea75887127270b3bfda7bb2ff0dd2d55b80e6c596\": container with ID starting with 2a739b06c788943900a9e55ea75887127270b3bfda7bb2ff0dd2d55b80e6c596 not found: ID does not exist"
Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.789441 4795 scope.go:117] "RemoveContainer" containerID="2f6aff4ac645628fb823f2d71124e062624305f8587485413fc9b563acf03bbe"
Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.789757 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f6aff4ac645628fb823f2d71124e062624305f8587485413fc9b563acf03bbe"} err="failed to get container status \"2f6aff4ac645628fb823f2d71124e062624305f8587485413fc9b563acf03bbe\": rpc error: code = NotFound desc = could not find container \"2f6aff4ac645628fb823f2d71124e062624305f8587485413fc9b563acf03bbe\": container with ID starting with 2f6aff4ac645628fb823f2d71124e062624305f8587485413fc9b563acf03bbe not found: ID does not exist"
Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.789791 4795 scope.go:117] "RemoveContainer" containerID="2a739b06c788943900a9e55ea75887127270b3bfda7bb2ff0dd2d55b80e6c596"
Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.790156 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a739b06c788943900a9e55ea75887127270b3bfda7bb2ff0dd2d55b80e6c596"} err="failed to get container status \"2a739b06c788943900a9e55ea75887127270b3bfda7bb2ff0dd2d55b80e6c596\": rpc error: code = NotFound desc = could not find container \"2a739b06c788943900a9e55ea75887127270b3bfda7bb2ff0dd2d55b80e6c596\": container with ID starting with 2a739b06c788943900a9e55ea75887127270b3bfda7bb2ff0dd2d55b80e6c596 not found: ID does not exist"
Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.811254 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.825195 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Feb 19 21:48:17 crc kubenswrapper[4795]: E0219 21:48:17.825599 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c55a0492-b022-4514-85d5-d35d3ec46f05" containerName="nova-metadata-log"
Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.825615 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="c55a0492-b022-4514-85d5-d35d3ec46f05" containerName="nova-metadata-log"
Feb 19 21:48:17 crc kubenswrapper[4795]: E0219
21:48:17.825637 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c55a0492-b022-4514-85d5-d35d3ec46f05" containerName="nova-metadata-metadata" Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.825643 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="c55a0492-b022-4514-85d5-d35d3ec46f05" containerName="nova-metadata-metadata" Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.825797 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="c55a0492-b022-4514-85d5-d35d3ec46f05" containerName="nova-metadata-log" Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.825813 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="c55a0492-b022-4514-85d5-d35d3ec46f05" containerName="nova-metadata-metadata" Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.826720 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.829309 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.829567 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.834583 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.941185 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a3c4594-ea43-437d-8528-fd360fd4c4f9-logs\") pod \"nova-metadata-0\" (UID: \"3a3c4594-ea43-437d-8528-fd360fd4c4f9\") " pod="openstack/nova-metadata-0" Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.941229 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5hmb\" 
(UniqueName: \"kubernetes.io/projected/3a3c4594-ea43-437d-8528-fd360fd4c4f9-kube-api-access-f5hmb\") pod \"nova-metadata-0\" (UID: \"3a3c4594-ea43-437d-8528-fd360fd4c4f9\") " pod="openstack/nova-metadata-0" Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.941274 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a3c4594-ea43-437d-8528-fd360fd4c4f9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3a3c4594-ea43-437d-8528-fd360fd4c4f9\") " pod="openstack/nova-metadata-0" Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.941300 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a3c4594-ea43-437d-8528-fd360fd4c4f9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3a3c4594-ea43-437d-8528-fd360fd4c4f9\") " pod="openstack/nova-metadata-0" Feb 19 21:48:17 crc kubenswrapper[4795]: I0219 21:48:17.941345 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a3c4594-ea43-437d-8528-fd360fd4c4f9-config-data\") pod \"nova-metadata-0\" (UID: \"3a3c4594-ea43-437d-8528-fd360fd4c4f9\") " pod="openstack/nova-metadata-0" Feb 19 21:48:18 crc kubenswrapper[4795]: I0219 21:48:18.042862 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5hmb\" (UniqueName: \"kubernetes.io/projected/3a3c4594-ea43-437d-8528-fd360fd4c4f9-kube-api-access-f5hmb\") pod \"nova-metadata-0\" (UID: \"3a3c4594-ea43-437d-8528-fd360fd4c4f9\") " pod="openstack/nova-metadata-0" Feb 19 21:48:18 crc kubenswrapper[4795]: I0219 21:48:18.042933 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/3a3c4594-ea43-437d-8528-fd360fd4c4f9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3a3c4594-ea43-437d-8528-fd360fd4c4f9\") " pod="openstack/nova-metadata-0" Feb 19 21:48:18 crc kubenswrapper[4795]: I0219 21:48:18.042963 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a3c4594-ea43-437d-8528-fd360fd4c4f9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3a3c4594-ea43-437d-8528-fd360fd4c4f9\") " pod="openstack/nova-metadata-0" Feb 19 21:48:18 crc kubenswrapper[4795]: I0219 21:48:18.043004 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a3c4594-ea43-437d-8528-fd360fd4c4f9-config-data\") pod \"nova-metadata-0\" (UID: \"3a3c4594-ea43-437d-8528-fd360fd4c4f9\") " pod="openstack/nova-metadata-0" Feb 19 21:48:18 crc kubenswrapper[4795]: I0219 21:48:18.043088 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a3c4594-ea43-437d-8528-fd360fd4c4f9-logs\") pod \"nova-metadata-0\" (UID: \"3a3c4594-ea43-437d-8528-fd360fd4c4f9\") " pod="openstack/nova-metadata-0" Feb 19 21:48:18 crc kubenswrapper[4795]: I0219 21:48:18.043485 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a3c4594-ea43-437d-8528-fd360fd4c4f9-logs\") pod \"nova-metadata-0\" (UID: \"3a3c4594-ea43-437d-8528-fd360fd4c4f9\") " pod="openstack/nova-metadata-0" Feb 19 21:48:18 crc kubenswrapper[4795]: I0219 21:48:18.048557 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a3c4594-ea43-437d-8528-fd360fd4c4f9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3a3c4594-ea43-437d-8528-fd360fd4c4f9\") " pod="openstack/nova-metadata-0" Feb 19 21:48:18 crc kubenswrapper[4795]: I0219 
21:48:18.062053 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5hmb\" (UniqueName: \"kubernetes.io/projected/3a3c4594-ea43-437d-8528-fd360fd4c4f9-kube-api-access-f5hmb\") pod \"nova-metadata-0\" (UID: \"3a3c4594-ea43-437d-8528-fd360fd4c4f9\") " pod="openstack/nova-metadata-0" Feb 19 21:48:18 crc kubenswrapper[4795]: I0219 21:48:18.062589 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a3c4594-ea43-437d-8528-fd360fd4c4f9-config-data\") pod \"nova-metadata-0\" (UID: \"3a3c4594-ea43-437d-8528-fd360fd4c4f9\") " pod="openstack/nova-metadata-0" Feb 19 21:48:18 crc kubenswrapper[4795]: I0219 21:48:18.072084 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a3c4594-ea43-437d-8528-fd360fd4c4f9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3a3c4594-ea43-437d-8528-fd360fd4c4f9\") " pod="openstack/nova-metadata-0" Feb 19 21:48:18 crc kubenswrapper[4795]: I0219 21:48:18.147568 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 21:48:18 crc kubenswrapper[4795]: I0219 21:48:18.640814 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 21:48:18 crc kubenswrapper[4795]: I0219 21:48:18.736323 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3a3c4594-ea43-437d-8528-fd360fd4c4f9","Type":"ContainerStarted","Data":"79584481ab899a406532b67e6a609dc80a7856d39f0219a0cc1743be67cc18e6"} Feb 19 21:48:19 crc kubenswrapper[4795]: I0219 21:48:19.528706 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c55a0492-b022-4514-85d5-d35d3ec46f05" path="/var/lib/kubelet/pods/c55a0492-b022-4514-85d5-d35d3ec46f05/volumes" Feb 19 21:48:19 crc kubenswrapper[4795]: I0219 21:48:19.747373 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3a3c4594-ea43-437d-8528-fd360fd4c4f9","Type":"ContainerStarted","Data":"b31bf1106f4a86916e484b688b272686147f2fac787540528c3da0f773f140d1"} Feb 19 21:48:19 crc kubenswrapper[4795]: I0219 21:48:19.747429 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3a3c4594-ea43-437d-8528-fd360fd4c4f9","Type":"ContainerStarted","Data":"dd597fc70e647f4bf715223e839702f41d88ff7a121d0f3ca07e6f991bb63a3f"} Feb 19 21:48:19 crc kubenswrapper[4795]: I0219 21:48:19.790639 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.790579475 podStartE2EDuration="2.790579475s" podCreationTimestamp="2026-02-19 21:48:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:48:19.776369096 +0000 UTC m=+1210.968887000" watchObservedRunningTime="2026-02-19 21:48:19.790579475 +0000 UTC m=+1210.983097369" Feb 19 21:48:20 crc kubenswrapper[4795]: I0219 21:48:20.765133 4795 
generic.go:334] "Generic (PLEG): container finished" podID="f8cadc24-23ed-4063-8e1a-47a27c1d6ffd" containerID="54158f46698b83516d68eee860a71369629e83ec491495039fcc2f5f6a5bd317" exitCode=0 Feb 19 21:48:20 crc kubenswrapper[4795]: I0219 21:48:20.765266 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-x4mls" event={"ID":"f8cadc24-23ed-4063-8e1a-47a27c1d6ffd","Type":"ContainerDied","Data":"54158f46698b83516d68eee860a71369629e83ec491495039fcc2f5f6a5bd317"} Feb 19 21:48:20 crc kubenswrapper[4795]: I0219 21:48:20.769264 4795 generic.go:334] "Generic (PLEG): container finished" podID="fd8e89cd-b890-4f36-9008-59767ccbad91" containerID="971150e9373b5419fd5efc7fc3e78faafd9906f987cb7e0a9e042f04e22cb5a9" exitCode=0 Feb 19 21:48:20 crc kubenswrapper[4795]: I0219 21:48:20.769307 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-gxh8d" event={"ID":"fd8e89cd-b890-4f36-9008-59767ccbad91","Type":"ContainerDied","Data":"971150e9373b5419fd5efc7fc3e78faafd9906f987cb7e0a9e042f04e22cb5a9"} Feb 19 21:48:21 crc kubenswrapper[4795]: I0219 21:48:21.931923 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 19 21:48:21 crc kubenswrapper[4795]: I0219 21:48:21.931989 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 19 21:48:21 crc kubenswrapper[4795]: I0219 21:48:21.958972 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.022648 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.270814 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-x4mls" Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.279768 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-gxh8d" Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.298365 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-75ddbf7c75-f5qcb" Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.372967 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c8dc7b4d9-wgth2"] Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.373599 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6c8dc7b4d9-wgth2" podUID="df387754-5537-4d85-950b-02743c881da8" containerName="dnsmasq-dns" containerID="cri-o://70af76578d539ade05715b4f45e530828d624afd3981206c0f1b5d67746b15d9" gracePeriod=10 Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.444599 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd8e89cd-b890-4f36-9008-59767ccbad91-config-data\") pod \"fd8e89cd-b890-4f36-9008-59767ccbad91\" (UID: \"fd8e89cd-b890-4f36-9008-59767ccbad91\") " Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.444682 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrmdc\" (UniqueName: \"kubernetes.io/projected/fd8e89cd-b890-4f36-9008-59767ccbad91-kube-api-access-rrmdc\") pod \"fd8e89cd-b890-4f36-9008-59767ccbad91\" (UID: \"fd8e89cd-b890-4f36-9008-59767ccbad91\") " Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.444795 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd8e89cd-b890-4f36-9008-59767ccbad91-combined-ca-bundle\") pod \"fd8e89cd-b890-4f36-9008-59767ccbad91\" (UID: 
\"fd8e89cd-b890-4f36-9008-59767ccbad91\") " Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.444846 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qggxc\" (UniqueName: \"kubernetes.io/projected/f8cadc24-23ed-4063-8e1a-47a27c1d6ffd-kube-api-access-qggxc\") pod \"f8cadc24-23ed-4063-8e1a-47a27c1d6ffd\" (UID: \"f8cadc24-23ed-4063-8e1a-47a27c1d6ffd\") " Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.444865 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8cadc24-23ed-4063-8e1a-47a27c1d6ffd-scripts\") pod \"f8cadc24-23ed-4063-8e1a-47a27c1d6ffd\" (UID: \"f8cadc24-23ed-4063-8e1a-47a27c1d6ffd\") " Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.444885 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd8e89cd-b890-4f36-9008-59767ccbad91-scripts\") pod \"fd8e89cd-b890-4f36-9008-59767ccbad91\" (UID: \"fd8e89cd-b890-4f36-9008-59767ccbad91\") " Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.444922 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8cadc24-23ed-4063-8e1a-47a27c1d6ffd-config-data\") pod \"f8cadc24-23ed-4063-8e1a-47a27c1d6ffd\" (UID: \"f8cadc24-23ed-4063-8e1a-47a27c1d6ffd\") " Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.445031 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8cadc24-23ed-4063-8e1a-47a27c1d6ffd-combined-ca-bundle\") pod \"f8cadc24-23ed-4063-8e1a-47a27c1d6ffd\" (UID: \"f8cadc24-23ed-4063-8e1a-47a27c1d6ffd\") " Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.460921 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/fd8e89cd-b890-4f36-9008-59767ccbad91-kube-api-access-rrmdc" (OuterVolumeSpecName: "kube-api-access-rrmdc") pod "fd8e89cd-b890-4f36-9008-59767ccbad91" (UID: "fd8e89cd-b890-4f36-9008-59767ccbad91"). InnerVolumeSpecName "kube-api-access-rrmdc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.460987 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8cadc24-23ed-4063-8e1a-47a27c1d6ffd-scripts" (OuterVolumeSpecName: "scripts") pod "f8cadc24-23ed-4063-8e1a-47a27c1d6ffd" (UID: "f8cadc24-23ed-4063-8e1a-47a27c1d6ffd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.471239 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd8e89cd-b890-4f36-9008-59767ccbad91-scripts" (OuterVolumeSpecName: "scripts") pod "fd8e89cd-b890-4f36-9008-59767ccbad91" (UID: "fd8e89cd-b890-4f36-9008-59767ccbad91"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.475829 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8cadc24-23ed-4063-8e1a-47a27c1d6ffd-kube-api-access-qggxc" (OuterVolumeSpecName: "kube-api-access-qggxc") pod "f8cadc24-23ed-4063-8e1a-47a27c1d6ffd" (UID: "f8cadc24-23ed-4063-8e1a-47a27c1d6ffd"). InnerVolumeSpecName "kube-api-access-qggxc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.506002 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd8e89cd-b890-4f36-9008-59767ccbad91-config-data" (OuterVolumeSpecName: "config-data") pod "fd8e89cd-b890-4f36-9008-59767ccbad91" (UID: "fd8e89cd-b890-4f36-9008-59767ccbad91"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.509356 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8cadc24-23ed-4063-8e1a-47a27c1d6ffd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f8cadc24-23ed-4063-8e1a-47a27c1d6ffd" (UID: "f8cadc24-23ed-4063-8e1a-47a27c1d6ffd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.529450 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd8e89cd-b890-4f36-9008-59767ccbad91-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fd8e89cd-b890-4f36-9008-59767ccbad91" (UID: "fd8e89cd-b890-4f36-9008-59767ccbad91"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.541271 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8cadc24-23ed-4063-8e1a-47a27c1d6ffd-config-data" (OuterVolumeSpecName: "config-data") pod "f8cadc24-23ed-4063-8e1a-47a27c1d6ffd" (UID: "f8cadc24-23ed-4063-8e1a-47a27c1d6ffd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.548493 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd8e89cd-b890-4f36-9008-59767ccbad91-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.548540 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrmdc\" (UniqueName: \"kubernetes.io/projected/fd8e89cd-b890-4f36-9008-59767ccbad91-kube-api-access-rrmdc\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.548555 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd8e89cd-b890-4f36-9008-59767ccbad91-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.548568 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qggxc\" (UniqueName: \"kubernetes.io/projected/f8cadc24-23ed-4063-8e1a-47a27c1d6ffd-kube-api-access-qggxc\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.548578 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8cadc24-23ed-4063-8e1a-47a27c1d6ffd-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.548588 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd8e89cd-b890-4f36-9008-59767ccbad91-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.548599 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8cadc24-23ed-4063-8e1a-47a27c1d6ffd-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.548607 4795 
reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8cadc24-23ed-4063-8e1a-47a27c1d6ffd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.788285 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-x4mls" event={"ID":"f8cadc24-23ed-4063-8e1a-47a27c1d6ffd","Type":"ContainerDied","Data":"88f56eb2aa26f82d9b17df46fcb912c867ec3d628b4dff9a39f509961e5df80b"} Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.788620 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="88f56eb2aa26f82d9b17df46fcb912c867ec3d628b4dff9a39f509961e5df80b" Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.788697 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-x4mls" Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.798048 4795 generic.go:334] "Generic (PLEG): container finished" podID="df387754-5537-4d85-950b-02743c881da8" containerID="70af76578d539ade05715b4f45e530828d624afd3981206c0f1b5d67746b15d9" exitCode=0 Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.798103 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c8dc7b4d9-wgth2" event={"ID":"df387754-5537-4d85-950b-02743c881da8","Type":"ContainerDied","Data":"70af76578d539ade05715b4f45e530828d624afd3981206c0f1b5d67746b15d9"} Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.800483 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-gxh8d" Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.800569 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-gxh8d" event={"ID":"fd8e89cd-b890-4f36-9008-59767ccbad91","Type":"ContainerDied","Data":"199857e63a023ba2f3b45450f39bdc22aaf434e44f27b4974099a3d7f5c19db9"} Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.800590 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="199857e63a023ba2f3b45450f39bdc22aaf434e44f27b4974099a3d7f5c19db9" Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.858820 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.893817 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c8dc7b4d9-wgth2" Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.910279 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 19 21:48:22 crc kubenswrapper[4795]: E0219 21:48:22.910939 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df387754-5537-4d85-950b-02743c881da8" containerName="init" Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.911156 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="df387754-5537-4d85-950b-02743c881da8" containerName="init" Feb 19 21:48:22 crc kubenswrapper[4795]: E0219 21:48:22.911464 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd8e89cd-b890-4f36-9008-59767ccbad91" containerName="nova-cell1-conductor-db-sync" Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.911527 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd8e89cd-b890-4f36-9008-59767ccbad91" containerName="nova-cell1-conductor-db-sync" Feb 19 21:48:22 crc kubenswrapper[4795]: E0219 21:48:22.911605 4795 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df387754-5537-4d85-950b-02743c881da8" containerName="dnsmasq-dns" Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.911669 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="df387754-5537-4d85-950b-02743c881da8" containerName="dnsmasq-dns" Feb 19 21:48:22 crc kubenswrapper[4795]: E0219 21:48:22.911750 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8cadc24-23ed-4063-8e1a-47a27c1d6ffd" containerName="nova-manage" Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.911822 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8cadc24-23ed-4063-8e1a-47a27c1d6ffd" containerName="nova-manage" Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.912071 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8cadc24-23ed-4063-8e1a-47a27c1d6ffd" containerName="nova-manage" Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.912182 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="df387754-5537-4d85-950b-02743c881da8" containerName="dnsmasq-dns" Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.912245 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd8e89cd-b890-4f36-9008-59767ccbad91" containerName="nova-cell1-conductor-db-sync" Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.912924 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.914653 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.933277 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.990693 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.990977 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="1207388e-c327-4eb4-b81f-dee124375ca8" containerName="nova-api-log" containerID="cri-o://2918358fad24b90cece507f40bf90974e931a773750ddc328657552d135bcffc" gracePeriod=30 Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.991112 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="1207388e-c327-4eb4-b81f-dee124375ca8" containerName="nova-api-api" containerID="cri-o://70ad9e55a15e5820942ade2e6910d9f4bda067bc07e673f00b4a59d72193b117" gracePeriod=30 Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.995141 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="1207388e-c327-4eb4-b81f-dee124375ca8" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.187:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 21:48:22 crc kubenswrapper[4795]: I0219 21:48:22.995316 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="1207388e-c327-4eb4-b81f-dee124375ca8" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.187:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 
21:48:23.056718 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.056957 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3a3c4594-ea43-437d-8528-fd360fd4c4f9" containerName="nova-metadata-log" containerID="cri-o://dd597fc70e647f4bf715223e839702f41d88ff7a121d0f3ca07e6f991bb63a3f" gracePeriod=30 Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.057280 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3a3c4594-ea43-437d-8528-fd360fd4c4f9" containerName="nova-metadata-metadata" containerID="cri-o://b31bf1106f4a86916e484b688b272686147f2fac787540528c3da0f773f140d1" gracePeriod=30 Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.071303 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df387754-5537-4d85-950b-02743c881da8-ovsdbserver-sb\") pod \"df387754-5537-4d85-950b-02743c881da8\" (UID: \"df387754-5537-4d85-950b-02743c881da8\") " Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.071620 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2hb2s\" (UniqueName: \"kubernetes.io/projected/df387754-5537-4d85-950b-02743c881da8-kube-api-access-2hb2s\") pod \"df387754-5537-4d85-950b-02743c881da8\" (UID: \"df387754-5537-4d85-950b-02743c881da8\") " Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.071713 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df387754-5537-4d85-950b-02743c881da8-ovsdbserver-nb\") pod \"df387754-5537-4d85-950b-02743c881da8\" (UID: \"df387754-5537-4d85-950b-02743c881da8\") " Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.071826 4795 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df387754-5537-4d85-950b-02743c881da8-dns-svc\") pod \"df387754-5537-4d85-950b-02743c881da8\" (UID: \"df387754-5537-4d85-950b-02743c881da8\") " Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.071918 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/df387754-5537-4d85-950b-02743c881da8-dns-swift-storage-0\") pod \"df387754-5537-4d85-950b-02743c881da8\" (UID: \"df387754-5537-4d85-950b-02743c881da8\") " Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.072022 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df387754-5537-4d85-950b-02743c881da8-config\") pod \"df387754-5537-4d85-950b-02743c881da8\" (UID: \"df387754-5537-4d85-950b-02743c881da8\") " Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.072342 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtr87\" (UniqueName: \"kubernetes.io/projected/f6fd7841-2a08-4786-8e96-b2ab0f477eff-kube-api-access-qtr87\") pod \"nova-cell1-conductor-0\" (UID: \"f6fd7841-2a08-4786-8e96-b2ab0f477eff\") " pod="openstack/nova-cell1-conductor-0" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.072481 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6fd7841-2a08-4786-8e96-b2ab0f477eff-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f6fd7841-2a08-4786-8e96-b2ab0f477eff\") " pod="openstack/nova-cell1-conductor-0" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.072609 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f6fd7841-2a08-4786-8e96-b2ab0f477eff-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f6fd7841-2a08-4786-8e96-b2ab0f477eff\") " pod="openstack/nova-cell1-conductor-0" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.080531 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df387754-5537-4d85-950b-02743c881da8-kube-api-access-2hb2s" (OuterVolumeSpecName: "kube-api-access-2hb2s") pod "df387754-5537-4d85-950b-02743c881da8" (UID: "df387754-5537-4d85-950b-02743c881da8"). InnerVolumeSpecName "kube-api-access-2hb2s". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.129067 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df387754-5537-4d85-950b-02743c881da8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "df387754-5537-4d85-950b-02743c881da8" (UID: "df387754-5537-4d85-950b-02743c881da8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.129080 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df387754-5537-4d85-950b-02743c881da8-config" (OuterVolumeSpecName: "config") pod "df387754-5537-4d85-950b-02743c881da8" (UID: "df387754-5537-4d85-950b-02743c881da8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.129415 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df387754-5537-4d85-950b-02743c881da8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "df387754-5537-4d85-950b-02743c881da8" (UID: "df387754-5537-4d85-950b-02743c881da8"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.132561 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df387754-5537-4d85-950b-02743c881da8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "df387754-5537-4d85-950b-02743c881da8" (UID: "df387754-5537-4d85-950b-02743c881da8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.135894 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df387754-5537-4d85-950b-02743c881da8-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "df387754-5537-4d85-950b-02743c881da8" (UID: "df387754-5537-4d85-950b-02743c881da8"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.148670 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.148895 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.174054 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtr87\" (UniqueName: \"kubernetes.io/projected/f6fd7841-2a08-4786-8e96-b2ab0f477eff-kube-api-access-qtr87\") pod \"nova-cell1-conductor-0\" (UID: \"f6fd7841-2a08-4786-8e96-b2ab0f477eff\") " pod="openstack/nova-cell1-conductor-0" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.174144 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6fd7841-2a08-4786-8e96-b2ab0f477eff-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: 
\"f6fd7841-2a08-4786-8e96-b2ab0f477eff\") " pod="openstack/nova-cell1-conductor-0" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.174218 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6fd7841-2a08-4786-8e96-b2ab0f477eff-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f6fd7841-2a08-4786-8e96-b2ab0f477eff\") " pod="openstack/nova-cell1-conductor-0" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.174263 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df387754-5537-4d85-950b-02743c881da8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.174274 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2hb2s\" (UniqueName: \"kubernetes.io/projected/df387754-5537-4d85-950b-02743c881da8-kube-api-access-2hb2s\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.174283 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df387754-5537-4d85-950b-02743c881da8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.174291 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df387754-5537-4d85-950b-02743c881da8-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.174299 4795 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/df387754-5537-4d85-950b-02743c881da8-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.174308 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/df387754-5537-4d85-950b-02743c881da8-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.182793 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6fd7841-2a08-4786-8e96-b2ab0f477eff-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f6fd7841-2a08-4786-8e96-b2ab0f477eff\") " pod="openstack/nova-cell1-conductor-0" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.183938 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6fd7841-2a08-4786-8e96-b2ab0f477eff-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f6fd7841-2a08-4786-8e96-b2ab0f477eff\") " pod="openstack/nova-cell1-conductor-0" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.193777 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtr87\" (UniqueName: \"kubernetes.io/projected/f6fd7841-2a08-4786-8e96-b2ab0f477eff-kube-api-access-qtr87\") pod \"nova-cell1-conductor-0\" (UID: \"f6fd7841-2a08-4786-8e96-b2ab0f477eff\") " pod="openstack/nova-cell1-conductor-0" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.229254 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.402841 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.639870 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.801052 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a3c4594-ea43-437d-8528-fd360fd4c4f9-nova-metadata-tls-certs\") pod \"3a3c4594-ea43-437d-8528-fd360fd4c4f9\" (UID: \"3a3c4594-ea43-437d-8528-fd360fd4c4f9\") " Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.802385 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a3c4594-ea43-437d-8528-fd360fd4c4f9-combined-ca-bundle\") pod \"3a3c4594-ea43-437d-8528-fd360fd4c4f9\" (UID: \"3a3c4594-ea43-437d-8528-fd360fd4c4f9\") " Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.802792 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a3c4594-ea43-437d-8528-fd360fd4c4f9-config-data\") pod \"3a3c4594-ea43-437d-8528-fd360fd4c4f9\" (UID: \"3a3c4594-ea43-437d-8528-fd360fd4c4f9\") " Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.802937 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5hmb\" (UniqueName: \"kubernetes.io/projected/3a3c4594-ea43-437d-8528-fd360fd4c4f9-kube-api-access-f5hmb\") pod \"3a3c4594-ea43-437d-8528-fd360fd4c4f9\" (UID: \"3a3c4594-ea43-437d-8528-fd360fd4c4f9\") " Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.803058 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a3c4594-ea43-437d-8528-fd360fd4c4f9-logs\") pod \"3a3c4594-ea43-437d-8528-fd360fd4c4f9\" (UID: \"3a3c4594-ea43-437d-8528-fd360fd4c4f9\") " Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.803977 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/3a3c4594-ea43-437d-8528-fd360fd4c4f9-logs" (OuterVolumeSpecName: "logs") pod "3a3c4594-ea43-437d-8528-fd360fd4c4f9" (UID: "3a3c4594-ea43-437d-8528-fd360fd4c4f9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.808491 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a3c4594-ea43-437d-8528-fd360fd4c4f9-kube-api-access-f5hmb" (OuterVolumeSpecName: "kube-api-access-f5hmb") pod "3a3c4594-ea43-437d-8528-fd360fd4c4f9" (UID: "3a3c4594-ea43-437d-8528-fd360fd4c4f9"). InnerVolumeSpecName "kube-api-access-f5hmb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.813669 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c8dc7b4d9-wgth2" event={"ID":"df387754-5537-4d85-950b-02743c881da8","Type":"ContainerDied","Data":"8495c4433e798d1ac2e79ddf9c88ebf569c3cedf6bdc46579d7bb36aaf2eff72"} Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.813749 4795 scope.go:117] "RemoveContainer" containerID="70af76578d539ade05715b4f45e530828d624afd3981206c0f1b5d67746b15d9" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.813982 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c8dc7b4d9-wgth2" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.823026 4795 generic.go:334] "Generic (PLEG): container finished" podID="3a3c4594-ea43-437d-8528-fd360fd4c4f9" containerID="b31bf1106f4a86916e484b688b272686147f2fac787540528c3da0f773f140d1" exitCode=0 Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.823072 4795 generic.go:334] "Generic (PLEG): container finished" podID="3a3c4594-ea43-437d-8528-fd360fd4c4f9" containerID="dd597fc70e647f4bf715223e839702f41d88ff7a121d0f3ca07e6f991bb63a3f" exitCode=143 Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.823220 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.823384 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3a3c4594-ea43-437d-8528-fd360fd4c4f9","Type":"ContainerDied","Data":"b31bf1106f4a86916e484b688b272686147f2fac787540528c3da0f773f140d1"} Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.823446 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3a3c4594-ea43-437d-8528-fd360fd4c4f9","Type":"ContainerDied","Data":"dd597fc70e647f4bf715223e839702f41d88ff7a121d0f3ca07e6f991bb63a3f"} Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.823468 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3a3c4594-ea43-437d-8528-fd360fd4c4f9","Type":"ContainerDied","Data":"79584481ab899a406532b67e6a609dc80a7856d39f0219a0cc1743be67cc18e6"} Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.828351 4795 generic.go:334] "Generic (PLEG): container finished" podID="1207388e-c327-4eb4-b81f-dee124375ca8" containerID="2918358fad24b90cece507f40bf90974e931a773750ddc328657552d135bcffc" exitCode=143 Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.828435 4795 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1207388e-c327-4eb4-b81f-dee124375ca8","Type":"ContainerDied","Data":"2918358fad24b90cece507f40bf90974e931a773750ddc328657552d135bcffc"} Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.843275 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.850072 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a3c4594-ea43-437d-8528-fd360fd4c4f9-config-data" (OuterVolumeSpecName: "config-data") pod "3a3c4594-ea43-437d-8528-fd360fd4c4f9" (UID: "3a3c4594-ea43-437d-8528-fd360fd4c4f9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:48:23 crc kubenswrapper[4795]: W0219 21:48:23.850084 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf6fd7841_2a08_4786_8e96_b2ab0f477eff.slice/crio-d9ff2d99d6679f8062f4938a68a22506ca0912d3767e7bc32eff175a9e262c96 WatchSource:0}: Error finding container d9ff2d99d6679f8062f4938a68a22506ca0912d3767e7bc32eff175a9e262c96: Status 404 returned error can't find the container with id d9ff2d99d6679f8062f4938a68a22506ca0912d3767e7bc32eff175a9e262c96 Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.857360 4795 scope.go:117] "RemoveContainer" containerID="93a750f0a2156e5774cb54d38a230aaaa13bdcb55ce20e481877fcd04f00ee2e" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.862311 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c8dc7b4d9-wgth2"] Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.871985 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6c8dc7b4d9-wgth2"] Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.880417 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/3a3c4594-ea43-437d-8528-fd360fd4c4f9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3a3c4594-ea43-437d-8528-fd360fd4c4f9" (UID: "3a3c4594-ea43-437d-8528-fd360fd4c4f9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.882792 4795 scope.go:117] "RemoveContainer" containerID="b31bf1106f4a86916e484b688b272686147f2fac787540528c3da0f773f140d1" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.891672 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a3c4594-ea43-437d-8528-fd360fd4c4f9-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "3a3c4594-ea43-437d-8528-fd360fd4c4f9" (UID: "3a3c4594-ea43-437d-8528-fd360fd4c4f9"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.907053 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a3c4594-ea43-437d-8528-fd360fd4c4f9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.912467 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a3c4594-ea43-437d-8528-fd360fd4c4f9-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.912498 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5hmb\" (UniqueName: \"kubernetes.io/projected/3a3c4594-ea43-437d-8528-fd360fd4c4f9-kube-api-access-f5hmb\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.912510 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a3c4594-ea43-437d-8528-fd360fd4c4f9-logs\") on node \"crc\" 
DevicePath \"\"" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.912522 4795 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a3c4594-ea43-437d-8528-fd360fd4c4f9-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.915856 4795 scope.go:117] "RemoveContainer" containerID="dd597fc70e647f4bf715223e839702f41d88ff7a121d0f3ca07e6f991bb63a3f" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.941719 4795 scope.go:117] "RemoveContainer" containerID="b31bf1106f4a86916e484b688b272686147f2fac787540528c3da0f773f140d1" Feb 19 21:48:23 crc kubenswrapper[4795]: E0219 21:48:23.942501 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b31bf1106f4a86916e484b688b272686147f2fac787540528c3da0f773f140d1\": container with ID starting with b31bf1106f4a86916e484b688b272686147f2fac787540528c3da0f773f140d1 not found: ID does not exist" containerID="b31bf1106f4a86916e484b688b272686147f2fac787540528c3da0f773f140d1" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.942545 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b31bf1106f4a86916e484b688b272686147f2fac787540528c3da0f773f140d1"} err="failed to get container status \"b31bf1106f4a86916e484b688b272686147f2fac787540528c3da0f773f140d1\": rpc error: code = NotFound desc = could not find container \"b31bf1106f4a86916e484b688b272686147f2fac787540528c3da0f773f140d1\": container with ID starting with b31bf1106f4a86916e484b688b272686147f2fac787540528c3da0f773f140d1 not found: ID does not exist" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.942573 4795 scope.go:117] "RemoveContainer" containerID="dd597fc70e647f4bf715223e839702f41d88ff7a121d0f3ca07e6f991bb63a3f" Feb 19 21:48:23 crc kubenswrapper[4795]: E0219 21:48:23.942967 4795 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd597fc70e647f4bf715223e839702f41d88ff7a121d0f3ca07e6f991bb63a3f\": container with ID starting with dd597fc70e647f4bf715223e839702f41d88ff7a121d0f3ca07e6f991bb63a3f not found: ID does not exist" containerID="dd597fc70e647f4bf715223e839702f41d88ff7a121d0f3ca07e6f991bb63a3f" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.942988 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd597fc70e647f4bf715223e839702f41d88ff7a121d0f3ca07e6f991bb63a3f"} err="failed to get container status \"dd597fc70e647f4bf715223e839702f41d88ff7a121d0f3ca07e6f991bb63a3f\": rpc error: code = NotFound desc = could not find container \"dd597fc70e647f4bf715223e839702f41d88ff7a121d0f3ca07e6f991bb63a3f\": container with ID starting with dd597fc70e647f4bf715223e839702f41d88ff7a121d0f3ca07e6f991bb63a3f not found: ID does not exist" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.943002 4795 scope.go:117] "RemoveContainer" containerID="b31bf1106f4a86916e484b688b272686147f2fac787540528c3da0f773f140d1" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.943313 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b31bf1106f4a86916e484b688b272686147f2fac787540528c3da0f773f140d1"} err="failed to get container status \"b31bf1106f4a86916e484b688b272686147f2fac787540528c3da0f773f140d1\": rpc error: code = NotFound desc = could not find container \"b31bf1106f4a86916e484b688b272686147f2fac787540528c3da0f773f140d1\": container with ID starting with b31bf1106f4a86916e484b688b272686147f2fac787540528c3da0f773f140d1 not found: ID does not exist" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.943354 4795 scope.go:117] "RemoveContainer" containerID="dd597fc70e647f4bf715223e839702f41d88ff7a121d0f3ca07e6f991bb63a3f" Feb 19 21:48:23 crc kubenswrapper[4795]: I0219 21:48:23.943606 4795 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd597fc70e647f4bf715223e839702f41d88ff7a121d0f3ca07e6f991bb63a3f"} err="failed to get container status \"dd597fc70e647f4bf715223e839702f41d88ff7a121d0f3ca07e6f991bb63a3f\": rpc error: code = NotFound desc = could not find container \"dd597fc70e647f4bf715223e839702f41d88ff7a121d0f3ca07e6f991bb63a3f\": container with ID starting with dd597fc70e647f4bf715223e839702f41d88ff7a121d0f3ca07e6f991bb63a3f not found: ID does not exist" Feb 19 21:48:24 crc kubenswrapper[4795]: I0219 21:48:24.201302 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 21:48:24 crc kubenswrapper[4795]: I0219 21:48:24.210419 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 21:48:24 crc kubenswrapper[4795]: I0219 21:48:24.227722 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 19 21:48:24 crc kubenswrapper[4795]: E0219 21:48:24.228140 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a3c4594-ea43-437d-8528-fd360fd4c4f9" containerName="nova-metadata-log" Feb 19 21:48:24 crc kubenswrapper[4795]: I0219 21:48:24.228159 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a3c4594-ea43-437d-8528-fd360fd4c4f9" containerName="nova-metadata-log" Feb 19 21:48:24 crc kubenswrapper[4795]: E0219 21:48:24.228205 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a3c4594-ea43-437d-8528-fd360fd4c4f9" containerName="nova-metadata-metadata" Feb 19 21:48:24 crc kubenswrapper[4795]: I0219 21:48:24.228213 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a3c4594-ea43-437d-8528-fd360fd4c4f9" containerName="nova-metadata-metadata" Feb 19 21:48:24 crc kubenswrapper[4795]: I0219 21:48:24.228390 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a3c4594-ea43-437d-8528-fd360fd4c4f9" containerName="nova-metadata-log" Feb 19 
21:48:24 crc kubenswrapper[4795]: I0219 21:48:24.228413 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a3c4594-ea43-437d-8528-fd360fd4c4f9" containerName="nova-metadata-metadata" Feb 19 21:48:24 crc kubenswrapper[4795]: I0219 21:48:24.229342 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 21:48:24 crc kubenswrapper[4795]: I0219 21:48:24.231220 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 19 21:48:24 crc kubenswrapper[4795]: I0219 21:48:24.241737 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 19 21:48:24 crc kubenswrapper[4795]: I0219 21:48:24.246066 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 21:48:24 crc kubenswrapper[4795]: I0219 21:48:24.318065 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8-logs\") pod \"nova-metadata-0\" (UID: \"3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8\") " pod="openstack/nova-metadata-0" Feb 19 21:48:24 crc kubenswrapper[4795]: I0219 21:48:24.318154 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fd6zt\" (UniqueName: \"kubernetes.io/projected/3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8-kube-api-access-fd6zt\") pod \"nova-metadata-0\" (UID: \"3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8\") " pod="openstack/nova-metadata-0" Feb 19 21:48:24 crc kubenswrapper[4795]: I0219 21:48:24.318298 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8\") " 
pod="openstack/nova-metadata-0" Feb 19 21:48:24 crc kubenswrapper[4795]: I0219 21:48:24.318426 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8-config-data\") pod \"nova-metadata-0\" (UID: \"3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8\") " pod="openstack/nova-metadata-0" Feb 19 21:48:24 crc kubenswrapper[4795]: I0219 21:48:24.318536 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8\") " pod="openstack/nova-metadata-0" Feb 19 21:48:24 crc kubenswrapper[4795]: I0219 21:48:24.419782 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8\") " pod="openstack/nova-metadata-0" Feb 19 21:48:24 crc kubenswrapper[4795]: I0219 21:48:24.419843 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8-config-data\") pod \"nova-metadata-0\" (UID: \"3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8\") " pod="openstack/nova-metadata-0" Feb 19 21:48:24 crc kubenswrapper[4795]: I0219 21:48:24.420053 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8\") " pod="openstack/nova-metadata-0" Feb 19 21:48:24 crc kubenswrapper[4795]: I0219 21:48:24.420729 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8-logs\") pod \"nova-metadata-0\" (UID: \"3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8\") " pod="openstack/nova-metadata-0" Feb 19 21:48:24 crc kubenswrapper[4795]: I0219 21:48:24.420793 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fd6zt\" (UniqueName: \"kubernetes.io/projected/3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8-kube-api-access-fd6zt\") pod \"nova-metadata-0\" (UID: \"3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8\") " pod="openstack/nova-metadata-0" Feb 19 21:48:24 crc kubenswrapper[4795]: I0219 21:48:24.421408 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8-logs\") pod \"nova-metadata-0\" (UID: \"3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8\") " pod="openstack/nova-metadata-0" Feb 19 21:48:24 crc kubenswrapper[4795]: I0219 21:48:24.424032 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8\") " pod="openstack/nova-metadata-0" Feb 19 21:48:24 crc kubenswrapper[4795]: I0219 21:48:24.432783 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8\") " pod="openstack/nova-metadata-0" Feb 19 21:48:24 crc kubenswrapper[4795]: I0219 21:48:24.434597 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8-config-data\") pod \"nova-metadata-0\" (UID: 
\"3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8\") " pod="openstack/nova-metadata-0" Feb 19 21:48:24 crc kubenswrapper[4795]: I0219 21:48:24.442805 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fd6zt\" (UniqueName: \"kubernetes.io/projected/3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8-kube-api-access-fd6zt\") pod \"nova-metadata-0\" (UID: \"3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8\") " pod="openstack/nova-metadata-0" Feb 19 21:48:24 crc kubenswrapper[4795]: I0219 21:48:24.550113 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 21:48:24 crc kubenswrapper[4795]: I0219 21:48:24.840040 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"f6fd7841-2a08-4786-8e96-b2ab0f477eff","Type":"ContainerStarted","Data":"4f511a80129a9b198e5be1216670e966f0623d26f1c7d17cc3a99a813e69b1c3"} Feb 19 21:48:24 crc kubenswrapper[4795]: I0219 21:48:24.840389 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Feb 19 21:48:24 crc kubenswrapper[4795]: I0219 21:48:24.840405 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"f6fd7841-2a08-4786-8e96-b2ab0f477eff","Type":"ContainerStarted","Data":"d9ff2d99d6679f8062f4938a68a22506ca0912d3767e7bc32eff175a9e262c96"} Feb 19 21:48:24 crc kubenswrapper[4795]: I0219 21:48:24.846312 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="d5aecdca-84a6-4987-8b84-a95fbb0096f9" containerName="nova-scheduler-scheduler" containerID="cri-o://6ce35faeac6fd37dfcca3afbda3833acb5b7e4a468547fda58ae378a06b641f6" gracePeriod=30 Feb 19 21:48:24 crc kubenswrapper[4795]: I0219 21:48:24.858468 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.858449587 
podStartE2EDuration="2.858449587s" podCreationTimestamp="2026-02-19 21:48:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:48:24.854427914 +0000 UTC m=+1216.046945778" watchObservedRunningTime="2026-02-19 21:48:24.858449587 +0000 UTC m=+1216.050967451" Feb 19 21:48:25 crc kubenswrapper[4795]: I0219 21:48:25.046066 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 21:48:25 crc kubenswrapper[4795]: W0219 21:48:25.056534 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c0bc1cc_7985_4a3f_8ab8_26d49f7706c8.slice/crio-d8263151d90162e0d834534389009f120a5ff2fd0c5a460719918f5dbd83bc95 WatchSource:0}: Error finding container d8263151d90162e0d834534389009f120a5ff2fd0c5a460719918f5dbd83bc95: Status 404 returned error can't find the container with id d8263151d90162e0d834534389009f120a5ff2fd0c5a460719918f5dbd83bc95 Feb 19 21:48:25 crc kubenswrapper[4795]: I0219 21:48:25.526938 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a3c4594-ea43-437d-8528-fd360fd4c4f9" path="/var/lib/kubelet/pods/3a3c4594-ea43-437d-8528-fd360fd4c4f9/volumes" Feb 19 21:48:25 crc kubenswrapper[4795]: I0219 21:48:25.529026 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df387754-5537-4d85-950b-02743c881da8" path="/var/lib/kubelet/pods/df387754-5537-4d85-950b-02743c881da8/volumes" Feb 19 21:48:25 crc kubenswrapper[4795]: I0219 21:48:25.814674 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 19 21:48:25 crc kubenswrapper[4795]: I0219 21:48:25.866978 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8","Type":"ContainerStarted","Data":"a60101c154c65170b0cd0d24ec4c3534c546765af68be292c858d3e94c795294"} Feb 19 21:48:25 crc kubenswrapper[4795]: I0219 21:48:25.867044 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8","Type":"ContainerStarted","Data":"fac0b49dac38eb57e08779c1a5fd43f5502fc80faee614fd505dcf5a274e4145"} Feb 19 21:48:25 crc kubenswrapper[4795]: I0219 21:48:25.867056 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8","Type":"ContainerStarted","Data":"d8263151d90162e0d834534389009f120a5ff2fd0c5a460719918f5dbd83bc95"} Feb 19 21:48:25 crc kubenswrapper[4795]: I0219 21:48:25.901039 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.901019319 podStartE2EDuration="1.901019319s" podCreationTimestamp="2026-02-19 21:48:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:48:25.898791056 +0000 UTC m=+1217.091308920" watchObservedRunningTime="2026-02-19 21:48:25.901019319 +0000 UTC m=+1217.093537193" Feb 19 21:48:26 crc kubenswrapper[4795]: E0219 21:48:26.960408 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6ce35faeac6fd37dfcca3afbda3833acb5b7e4a468547fda58ae378a06b641f6" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 19 21:48:26 crc kubenswrapper[4795]: E0219 21:48:26.961204 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6ce35faeac6fd37dfcca3afbda3833acb5b7e4a468547fda58ae378a06b641f6 is running 
failed: container process not found" containerID="6ce35faeac6fd37dfcca3afbda3833acb5b7e4a468547fda58ae378a06b641f6" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 19 21:48:26 crc kubenswrapper[4795]: E0219 21:48:26.961430 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6ce35faeac6fd37dfcca3afbda3833acb5b7e4a468547fda58ae378a06b641f6 is running failed: container process not found" containerID="6ce35faeac6fd37dfcca3afbda3833acb5b7e4a468547fda58ae378a06b641f6" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 19 21:48:26 crc kubenswrapper[4795]: E0219 21:48:26.961462 4795 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6ce35faeac6fd37dfcca3afbda3833acb5b7e4a468547fda58ae378a06b641f6 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="d5aecdca-84a6-4987-8b84-a95fbb0096f9" containerName="nova-scheduler-scheduler" Feb 19 21:48:27 crc kubenswrapper[4795]: I0219 21:48:27.351417 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 21:48:27 crc kubenswrapper[4795]: I0219 21:48:27.390611 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9jbt\" (UniqueName: \"kubernetes.io/projected/d5aecdca-84a6-4987-8b84-a95fbb0096f9-kube-api-access-w9jbt\") pod \"d5aecdca-84a6-4987-8b84-a95fbb0096f9\" (UID: \"d5aecdca-84a6-4987-8b84-a95fbb0096f9\") " Feb 19 21:48:27 crc kubenswrapper[4795]: I0219 21:48:27.390789 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5aecdca-84a6-4987-8b84-a95fbb0096f9-config-data\") pod \"d5aecdca-84a6-4987-8b84-a95fbb0096f9\" (UID: \"d5aecdca-84a6-4987-8b84-a95fbb0096f9\") " Feb 19 21:48:27 crc kubenswrapper[4795]: I0219 21:48:27.390985 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5aecdca-84a6-4987-8b84-a95fbb0096f9-combined-ca-bundle\") pod \"d5aecdca-84a6-4987-8b84-a95fbb0096f9\" (UID: \"d5aecdca-84a6-4987-8b84-a95fbb0096f9\") " Feb 19 21:48:27 crc kubenswrapper[4795]: I0219 21:48:27.403119 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5aecdca-84a6-4987-8b84-a95fbb0096f9-kube-api-access-w9jbt" (OuterVolumeSpecName: "kube-api-access-w9jbt") pod "d5aecdca-84a6-4987-8b84-a95fbb0096f9" (UID: "d5aecdca-84a6-4987-8b84-a95fbb0096f9"). InnerVolumeSpecName "kube-api-access-w9jbt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:48:27 crc kubenswrapper[4795]: I0219 21:48:27.425513 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5aecdca-84a6-4987-8b84-a95fbb0096f9-config-data" (OuterVolumeSpecName: "config-data") pod "d5aecdca-84a6-4987-8b84-a95fbb0096f9" (UID: "d5aecdca-84a6-4987-8b84-a95fbb0096f9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:48:27 crc kubenswrapper[4795]: I0219 21:48:27.444546 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5aecdca-84a6-4987-8b84-a95fbb0096f9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d5aecdca-84a6-4987-8b84-a95fbb0096f9" (UID: "d5aecdca-84a6-4987-8b84-a95fbb0096f9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:48:27 crc kubenswrapper[4795]: I0219 21:48:27.493479 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5aecdca-84a6-4987-8b84-a95fbb0096f9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:27 crc kubenswrapper[4795]: I0219 21:48:27.493515 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9jbt\" (UniqueName: \"kubernetes.io/projected/d5aecdca-84a6-4987-8b84-a95fbb0096f9-kube-api-access-w9jbt\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:27 crc kubenswrapper[4795]: I0219 21:48:27.493531 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5aecdca-84a6-4987-8b84-a95fbb0096f9-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:27 crc kubenswrapper[4795]: I0219 21:48:27.883800 4795 generic.go:334] "Generic (PLEG): container finished" podID="d5aecdca-84a6-4987-8b84-a95fbb0096f9" containerID="6ce35faeac6fd37dfcca3afbda3833acb5b7e4a468547fda58ae378a06b641f6" exitCode=0 Feb 19 21:48:27 crc kubenswrapper[4795]: I0219 21:48:27.883844 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d5aecdca-84a6-4987-8b84-a95fbb0096f9","Type":"ContainerDied","Data":"6ce35faeac6fd37dfcca3afbda3833acb5b7e4a468547fda58ae378a06b641f6"} Feb 19 21:48:27 crc kubenswrapper[4795]: I0219 21:48:27.883873 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-scheduler-0" event={"ID":"d5aecdca-84a6-4987-8b84-a95fbb0096f9","Type":"ContainerDied","Data":"13cacc8b3b7ac3be0ed861e620fc89ce3271fa1dd8179aa143ee4013f7bace77"} Feb 19 21:48:27 crc kubenswrapper[4795]: I0219 21:48:27.883893 4795 scope.go:117] "RemoveContainer" containerID="6ce35faeac6fd37dfcca3afbda3833acb5b7e4a468547fda58ae378a06b641f6" Feb 19 21:48:27 crc kubenswrapper[4795]: I0219 21:48:27.883929 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 21:48:27 crc kubenswrapper[4795]: I0219 21:48:27.903728 4795 scope.go:117] "RemoveContainer" containerID="6ce35faeac6fd37dfcca3afbda3833acb5b7e4a468547fda58ae378a06b641f6" Feb 19 21:48:27 crc kubenswrapper[4795]: E0219 21:48:27.904110 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ce35faeac6fd37dfcca3afbda3833acb5b7e4a468547fda58ae378a06b641f6\": container with ID starting with 6ce35faeac6fd37dfcca3afbda3833acb5b7e4a468547fda58ae378a06b641f6 not found: ID does not exist" containerID="6ce35faeac6fd37dfcca3afbda3833acb5b7e4a468547fda58ae378a06b641f6" Feb 19 21:48:27 crc kubenswrapper[4795]: I0219 21:48:27.904183 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ce35faeac6fd37dfcca3afbda3833acb5b7e4a468547fda58ae378a06b641f6"} err="failed to get container status \"6ce35faeac6fd37dfcca3afbda3833acb5b7e4a468547fda58ae378a06b641f6\": rpc error: code = NotFound desc = could not find container \"6ce35faeac6fd37dfcca3afbda3833acb5b7e4a468547fda58ae378a06b641f6\": container with ID starting with 6ce35faeac6fd37dfcca3afbda3833acb5b7e4a468547fda58ae378a06b641f6 not found: ID does not exist" Feb 19 21:48:27 crc kubenswrapper[4795]: I0219 21:48:27.929444 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 21:48:27 crc kubenswrapper[4795]: I0219 21:48:27.941604 4795 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 21:48:27 crc kubenswrapper[4795]: I0219 21:48:27.953000 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 21:48:27 crc kubenswrapper[4795]: E0219 21:48:27.953660 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5aecdca-84a6-4987-8b84-a95fbb0096f9" containerName="nova-scheduler-scheduler" Feb 19 21:48:27 crc kubenswrapper[4795]: I0219 21:48:27.953694 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5aecdca-84a6-4987-8b84-a95fbb0096f9" containerName="nova-scheduler-scheduler" Feb 19 21:48:27 crc kubenswrapper[4795]: I0219 21:48:27.954023 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5aecdca-84a6-4987-8b84-a95fbb0096f9" containerName="nova-scheduler-scheduler" Feb 19 21:48:27 crc kubenswrapper[4795]: I0219 21:48:27.954913 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 21:48:27 crc kubenswrapper[4795]: I0219 21:48:27.956940 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 19 21:48:27 crc kubenswrapper[4795]: I0219 21:48:27.962937 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 21:48:28 crc kubenswrapper[4795]: I0219 21:48:28.004192 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0f49585-0601-424b-9f28-304ae06c9d93-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c0f49585-0601-424b-9f28-304ae06c9d93\") " pod="openstack/nova-scheduler-0" Feb 19 21:48:28 crc kubenswrapper[4795]: I0219 21:48:28.004257 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c0f49585-0601-424b-9f28-304ae06c9d93-config-data\") pod \"nova-scheduler-0\" (UID: \"c0f49585-0601-424b-9f28-304ae06c9d93\") " pod="openstack/nova-scheduler-0" Feb 19 21:48:28 crc kubenswrapper[4795]: I0219 21:48:28.004603 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tmq5\" (UniqueName: \"kubernetes.io/projected/c0f49585-0601-424b-9f28-304ae06c9d93-kube-api-access-8tmq5\") pod \"nova-scheduler-0\" (UID: \"c0f49585-0601-424b-9f28-304ae06c9d93\") " pod="openstack/nova-scheduler-0" Feb 19 21:48:28 crc kubenswrapper[4795]: I0219 21:48:28.105992 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tmq5\" (UniqueName: \"kubernetes.io/projected/c0f49585-0601-424b-9f28-304ae06c9d93-kube-api-access-8tmq5\") pod \"nova-scheduler-0\" (UID: \"c0f49585-0601-424b-9f28-304ae06c9d93\") " pod="openstack/nova-scheduler-0" Feb 19 21:48:28 crc kubenswrapper[4795]: I0219 21:48:28.106071 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0f49585-0601-424b-9f28-304ae06c9d93-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c0f49585-0601-424b-9f28-304ae06c9d93\") " pod="openstack/nova-scheduler-0" Feb 19 21:48:28 crc kubenswrapper[4795]: I0219 21:48:28.106129 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0f49585-0601-424b-9f28-304ae06c9d93-config-data\") pod \"nova-scheduler-0\" (UID: \"c0f49585-0601-424b-9f28-304ae06c9d93\") " pod="openstack/nova-scheduler-0" Feb 19 21:48:28 crc kubenswrapper[4795]: I0219 21:48:28.111933 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0f49585-0601-424b-9f28-304ae06c9d93-config-data\") pod \"nova-scheduler-0\" (UID: \"c0f49585-0601-424b-9f28-304ae06c9d93\") 
" pod="openstack/nova-scheduler-0" Feb 19 21:48:28 crc kubenswrapper[4795]: I0219 21:48:28.123357 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0f49585-0601-424b-9f28-304ae06c9d93-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c0f49585-0601-424b-9f28-304ae06c9d93\") " pod="openstack/nova-scheduler-0" Feb 19 21:48:28 crc kubenswrapper[4795]: I0219 21:48:28.126911 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tmq5\" (UniqueName: \"kubernetes.io/projected/c0f49585-0601-424b-9f28-304ae06c9d93-kube-api-access-8tmq5\") pod \"nova-scheduler-0\" (UID: \"c0f49585-0601-424b-9f28-304ae06c9d93\") " pod="openstack/nova-scheduler-0" Feb 19 21:48:28 crc kubenswrapper[4795]: I0219 21:48:28.271945 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 21:48:28 crc kubenswrapper[4795]: I0219 21:48:28.736771 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 21:48:28 crc kubenswrapper[4795]: I0219 21:48:28.863424 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 19 21:48:28 crc kubenswrapper[4795]: I0219 21:48:28.897584 4795 generic.go:334] "Generic (PLEG): container finished" podID="1207388e-c327-4eb4-b81f-dee124375ca8" containerID="70ad9e55a15e5820942ade2e6910d9f4bda067bc07e673f00b4a59d72193b117" exitCode=0 Feb 19 21:48:28 crc kubenswrapper[4795]: I0219 21:48:28.897650 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1207388e-c327-4eb4-b81f-dee124375ca8","Type":"ContainerDied","Data":"70ad9e55a15e5820942ade2e6910d9f4bda067bc07e673f00b4a59d72193b117"} Feb 19 21:48:28 crc kubenswrapper[4795]: I0219 21:48:28.897673 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1207388e-c327-4eb4-b81f-dee124375ca8","Type":"ContainerDied","Data":"70232baf2a46181b0fb51eefd50334fc1763134daec5b9978cd7cc19312a07a8"} Feb 19 21:48:28 crc kubenswrapper[4795]: I0219 21:48:28.897693 4795 scope.go:117] "RemoveContainer" containerID="70ad9e55a15e5820942ade2e6910d9f4bda067bc07e673f00b4a59d72193b117" Feb 19 21:48:28 crc kubenswrapper[4795]: I0219 21:48:28.897791 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 19 21:48:28 crc kubenswrapper[4795]: I0219 21:48:28.901959 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c0f49585-0601-424b-9f28-304ae06c9d93","Type":"ContainerStarted","Data":"a293322e7ade93b6e5ee39b06413dcdb22fcc62b8715355c0228e95452255ef5"} Feb 19 21:48:28 crc kubenswrapper[4795]: I0219 21:48:28.902228 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c0f49585-0601-424b-9f28-304ae06c9d93","Type":"ContainerStarted","Data":"cd865b730fa4ce805f52ae67f6b00c0275c97a6018c4d5724e64d28e7cd4b5db"} Feb 19 21:48:28 crc kubenswrapper[4795]: I0219 21:48:28.920891 4795 scope.go:117] "RemoveContainer" containerID="2918358fad24b90cece507f40bf90974e931a773750ddc328657552d135bcffc" Feb 19 21:48:28 crc kubenswrapper[4795]: I0219 21:48:28.923302 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.9232872300000001 podStartE2EDuration="1.92328723s" podCreationTimestamp="2026-02-19 21:48:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:48:28.920581364 +0000 UTC m=+1220.113099218" watchObservedRunningTime="2026-02-19 21:48:28.92328723 +0000 UTC m=+1220.115805094" Feb 19 21:48:28 crc kubenswrapper[4795]: I0219 21:48:28.946516 4795 scope.go:117] "RemoveContainer" containerID="70ad9e55a15e5820942ade2e6910d9f4bda067bc07e673f00b4a59d72193b117" Feb 19 21:48:28 crc kubenswrapper[4795]: E0219 21:48:28.946991 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70ad9e55a15e5820942ade2e6910d9f4bda067bc07e673f00b4a59d72193b117\": container with ID starting with 70ad9e55a15e5820942ade2e6910d9f4bda067bc07e673f00b4a59d72193b117 not found: ID does not exist" 
containerID="70ad9e55a15e5820942ade2e6910d9f4bda067bc07e673f00b4a59d72193b117" Feb 19 21:48:28 crc kubenswrapper[4795]: I0219 21:48:28.947045 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70ad9e55a15e5820942ade2e6910d9f4bda067bc07e673f00b4a59d72193b117"} err="failed to get container status \"70ad9e55a15e5820942ade2e6910d9f4bda067bc07e673f00b4a59d72193b117\": rpc error: code = NotFound desc = could not find container \"70ad9e55a15e5820942ade2e6910d9f4bda067bc07e673f00b4a59d72193b117\": container with ID starting with 70ad9e55a15e5820942ade2e6910d9f4bda067bc07e673f00b4a59d72193b117 not found: ID does not exist" Feb 19 21:48:28 crc kubenswrapper[4795]: I0219 21:48:28.947071 4795 scope.go:117] "RemoveContainer" containerID="2918358fad24b90cece507f40bf90974e931a773750ddc328657552d135bcffc" Feb 19 21:48:28 crc kubenswrapper[4795]: E0219 21:48:28.947462 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2918358fad24b90cece507f40bf90974e931a773750ddc328657552d135bcffc\": container with ID starting with 2918358fad24b90cece507f40bf90974e931a773750ddc328657552d135bcffc not found: ID does not exist" containerID="2918358fad24b90cece507f40bf90974e931a773750ddc328657552d135bcffc" Feb 19 21:48:28 crc kubenswrapper[4795]: I0219 21:48:28.947504 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2918358fad24b90cece507f40bf90974e931a773750ddc328657552d135bcffc"} err="failed to get container status \"2918358fad24b90cece507f40bf90974e931a773750ddc328657552d135bcffc\": rpc error: code = NotFound desc = could not find container \"2918358fad24b90cece507f40bf90974e931a773750ddc328657552d135bcffc\": container with ID starting with 2918358fad24b90cece507f40bf90974e931a773750ddc328657552d135bcffc not found: ID does not exist" Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.024493 4795 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8v6ct\" (UniqueName: \"kubernetes.io/projected/1207388e-c327-4eb4-b81f-dee124375ca8-kube-api-access-8v6ct\") pod \"1207388e-c327-4eb4-b81f-dee124375ca8\" (UID: \"1207388e-c327-4eb4-b81f-dee124375ca8\") " Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.024551 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1207388e-c327-4eb4-b81f-dee124375ca8-combined-ca-bundle\") pod \"1207388e-c327-4eb4-b81f-dee124375ca8\" (UID: \"1207388e-c327-4eb4-b81f-dee124375ca8\") " Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.024769 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1207388e-c327-4eb4-b81f-dee124375ca8-logs\") pod \"1207388e-c327-4eb4-b81f-dee124375ca8\" (UID: \"1207388e-c327-4eb4-b81f-dee124375ca8\") " Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.024796 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1207388e-c327-4eb4-b81f-dee124375ca8-config-data\") pod \"1207388e-c327-4eb4-b81f-dee124375ca8\" (UID: \"1207388e-c327-4eb4-b81f-dee124375ca8\") " Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.025521 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1207388e-c327-4eb4-b81f-dee124375ca8-logs" (OuterVolumeSpecName: "logs") pod "1207388e-c327-4eb4-b81f-dee124375ca8" (UID: "1207388e-c327-4eb4-b81f-dee124375ca8"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.029508 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1207388e-c327-4eb4-b81f-dee124375ca8-kube-api-access-8v6ct" (OuterVolumeSpecName: "kube-api-access-8v6ct") pod "1207388e-c327-4eb4-b81f-dee124375ca8" (UID: "1207388e-c327-4eb4-b81f-dee124375ca8"). InnerVolumeSpecName "kube-api-access-8v6ct". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.053471 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1207388e-c327-4eb4-b81f-dee124375ca8-config-data" (OuterVolumeSpecName: "config-data") pod "1207388e-c327-4eb4-b81f-dee124375ca8" (UID: "1207388e-c327-4eb4-b81f-dee124375ca8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.063222 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1207388e-c327-4eb4-b81f-dee124375ca8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1207388e-c327-4eb4-b81f-dee124375ca8" (UID: "1207388e-c327-4eb4-b81f-dee124375ca8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.126774 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1207388e-c327-4eb4-b81f-dee124375ca8-logs\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.126810 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1207388e-c327-4eb4-b81f-dee124375ca8-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.126821 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8v6ct\" (UniqueName: \"kubernetes.io/projected/1207388e-c327-4eb4-b81f-dee124375ca8-kube-api-access-8v6ct\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.126830 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1207388e-c327-4eb4-b81f-dee124375ca8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.230661 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.249749 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.256865 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 19 21:48:29 crc kubenswrapper[4795]: E0219 21:48:29.257284 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1207388e-c327-4eb4-b81f-dee124375ca8" containerName="nova-api-api" Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.257303 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="1207388e-c327-4eb4-b81f-dee124375ca8" containerName="nova-api-api" Feb 19 21:48:29 crc kubenswrapper[4795]: E0219 
21:48:29.257326 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1207388e-c327-4eb4-b81f-dee124375ca8" containerName="nova-api-log" Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.257335 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="1207388e-c327-4eb4-b81f-dee124375ca8" containerName="nova-api-log" Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.257519 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="1207388e-c327-4eb4-b81f-dee124375ca8" containerName="nova-api-api" Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.257551 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="1207388e-c327-4eb4-b81f-dee124375ca8" containerName="nova-api-log" Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.258591 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.266764 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.281197 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.432157 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01859e80-9d51-4db2-8a48-9ad45d901f16-logs\") pod \"nova-api-0\" (UID: \"01859e80-9d51-4db2-8a48-9ad45d901f16\") " pod="openstack/nova-api-0" Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.432232 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9krq\" (UniqueName: \"kubernetes.io/projected/01859e80-9d51-4db2-8a48-9ad45d901f16-kube-api-access-s9krq\") pod \"nova-api-0\" (UID: \"01859e80-9d51-4db2-8a48-9ad45d901f16\") " pod="openstack/nova-api-0" Feb 19 21:48:29 crc kubenswrapper[4795]: 
I0219 21:48:29.432296 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01859e80-9d51-4db2-8a48-9ad45d901f16-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"01859e80-9d51-4db2-8a48-9ad45d901f16\") " pod="openstack/nova-api-0" Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.432367 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01859e80-9d51-4db2-8a48-9ad45d901f16-config-data\") pod \"nova-api-0\" (UID: \"01859e80-9d51-4db2-8a48-9ad45d901f16\") " pod="openstack/nova-api-0" Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.521771 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1207388e-c327-4eb4-b81f-dee124375ca8" path="/var/lib/kubelet/pods/1207388e-c327-4eb4-b81f-dee124375ca8/volumes" Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.522351 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5aecdca-84a6-4987-8b84-a95fbb0096f9" path="/var/lib/kubelet/pods/d5aecdca-84a6-4987-8b84-a95fbb0096f9/volumes" Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.522829 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.522978 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="bb3b374e-f01b-4997-9ecf-fbeeb384cc2c" containerName="kube-state-metrics" containerID="cri-o://0c46b510d414f62d57f7afe61292d3a54a60ed7655ea208a609c0d72a9940824" gracePeriod=30 Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.535204 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01859e80-9d51-4db2-8a48-9ad45d901f16-logs\") pod \"nova-api-0\" (UID: 
\"01859e80-9d51-4db2-8a48-9ad45d901f16\") " pod="openstack/nova-api-0" Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.535256 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9krq\" (UniqueName: \"kubernetes.io/projected/01859e80-9d51-4db2-8a48-9ad45d901f16-kube-api-access-s9krq\") pod \"nova-api-0\" (UID: \"01859e80-9d51-4db2-8a48-9ad45d901f16\") " pod="openstack/nova-api-0" Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.535318 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01859e80-9d51-4db2-8a48-9ad45d901f16-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"01859e80-9d51-4db2-8a48-9ad45d901f16\") " pod="openstack/nova-api-0" Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.535380 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01859e80-9d51-4db2-8a48-9ad45d901f16-config-data\") pod \"nova-api-0\" (UID: \"01859e80-9d51-4db2-8a48-9ad45d901f16\") " pod="openstack/nova-api-0" Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.535798 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01859e80-9d51-4db2-8a48-9ad45d901f16-logs\") pod \"nova-api-0\" (UID: \"01859e80-9d51-4db2-8a48-9ad45d901f16\") " pod="openstack/nova-api-0" Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.540442 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01859e80-9d51-4db2-8a48-9ad45d901f16-config-data\") pod \"nova-api-0\" (UID: \"01859e80-9d51-4db2-8a48-9ad45d901f16\") " pod="openstack/nova-api-0" Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.542823 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/01859e80-9d51-4db2-8a48-9ad45d901f16-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"01859e80-9d51-4db2-8a48-9ad45d901f16\") " pod="openstack/nova-api-0" Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.551076 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.551406 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.552261 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9krq\" (UniqueName: \"kubernetes.io/projected/01859e80-9d51-4db2-8a48-9ad45d901f16-kube-api-access-s9krq\") pod \"nova-api-0\" (UID: \"01859e80-9d51-4db2-8a48-9ad45d901f16\") " pod="openstack/nova-api-0" Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.578746 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.911727 4795 generic.go:334] "Generic (PLEG): container finished" podID="bb3b374e-f01b-4997-9ecf-fbeeb384cc2c" containerID="0c46b510d414f62d57f7afe61292d3a54a60ed7655ea208a609c0d72a9940824" exitCode=2 Feb 19 21:48:29 crc kubenswrapper[4795]: I0219 21:48:29.911893 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"bb3b374e-f01b-4997-9ecf-fbeeb384cc2c","Type":"ContainerDied","Data":"0c46b510d414f62d57f7afe61292d3a54a60ed7655ea208a609c0d72a9940824"} Feb 19 21:48:30 crc kubenswrapper[4795]: I0219 21:48:30.000476 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 19 21:48:30 crc kubenswrapper[4795]: I0219 21:48:30.047027 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 19 21:48:30 crc kubenswrapper[4795]: I0219 21:48:30.145689 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qr8kk\" (UniqueName: \"kubernetes.io/projected/bb3b374e-f01b-4997-9ecf-fbeeb384cc2c-kube-api-access-qr8kk\") pod \"bb3b374e-f01b-4997-9ecf-fbeeb384cc2c\" (UID: \"bb3b374e-f01b-4997-9ecf-fbeeb384cc2c\") " Feb 19 21:48:30 crc kubenswrapper[4795]: I0219 21:48:30.150800 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb3b374e-f01b-4997-9ecf-fbeeb384cc2c-kube-api-access-qr8kk" (OuterVolumeSpecName: "kube-api-access-qr8kk") pod "bb3b374e-f01b-4997-9ecf-fbeeb384cc2c" (UID: "bb3b374e-f01b-4997-9ecf-fbeeb384cc2c"). InnerVolumeSpecName "kube-api-access-qr8kk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:48:30 crc kubenswrapper[4795]: I0219 21:48:30.247952 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qr8kk\" (UniqueName: \"kubernetes.io/projected/bb3b374e-f01b-4997-9ecf-fbeeb384cc2c-kube-api-access-qr8kk\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:30 crc kubenswrapper[4795]: I0219 21:48:30.930935 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"01859e80-9d51-4db2-8a48-9ad45d901f16","Type":"ContainerStarted","Data":"a08e7a43bb7eb9940934542dc40899cae39a62cfccafdc60ea968829591c4b59"} Feb 19 21:48:30 crc kubenswrapper[4795]: I0219 21:48:30.931474 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"01859e80-9d51-4db2-8a48-9ad45d901f16","Type":"ContainerStarted","Data":"cd6c91a8bbdff67fb129ad494141346fb0c9ffacbd675189463e7bb2a6d75d94"} Feb 19 21:48:30 crc kubenswrapper[4795]: I0219 21:48:30.931500 4795 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"01859e80-9d51-4db2-8a48-9ad45d901f16","Type":"ContainerStarted","Data":"4d23ebd426731022b2890eb99d72e3c34bf0eabb128346170d4b9e6c3457a311"} Feb 19 21:48:30 crc kubenswrapper[4795]: I0219 21:48:30.933254 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"bb3b374e-f01b-4997-9ecf-fbeeb384cc2c","Type":"ContainerDied","Data":"1c52434379a6a736b3ede83a285c9c03d25be4bf31ea4977955cffc137750ca3"} Feb 19 21:48:30 crc kubenswrapper[4795]: I0219 21:48:30.933306 4795 scope.go:117] "RemoveContainer" containerID="0c46b510d414f62d57f7afe61292d3a54a60ed7655ea208a609c0d72a9940824" Feb 19 21:48:30 crc kubenswrapper[4795]: I0219 21:48:30.933321 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 19 21:48:30 crc kubenswrapper[4795]: I0219 21:48:30.971499 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.971478244 podStartE2EDuration="1.971478244s" podCreationTimestamp="2026-02-19 21:48:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:48:30.958123439 +0000 UTC m=+1222.150641303" watchObservedRunningTime="2026-02-19 21:48:30.971478244 +0000 UTC m=+1222.163996128" Feb 19 21:48:30 crc kubenswrapper[4795]: I0219 21:48:30.983873 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 21:48:30 crc kubenswrapper[4795]: I0219 21:48:30.994129 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 21:48:31 crc kubenswrapper[4795]: I0219 21:48:31.003110 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 21:48:31 crc kubenswrapper[4795]: E0219 21:48:31.006097 4795 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="bb3b374e-f01b-4997-9ecf-fbeeb384cc2c" containerName="kube-state-metrics" Feb 19 21:48:31 crc kubenswrapper[4795]: I0219 21:48:31.006191 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb3b374e-f01b-4997-9ecf-fbeeb384cc2c" containerName="kube-state-metrics" Feb 19 21:48:31 crc kubenswrapper[4795]: I0219 21:48:31.006477 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb3b374e-f01b-4997-9ecf-fbeeb384cc2c" containerName="kube-state-metrics" Feb 19 21:48:31 crc kubenswrapper[4795]: I0219 21:48:31.007182 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 19 21:48:31 crc kubenswrapper[4795]: I0219 21:48:31.009702 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Feb 19 21:48:31 crc kubenswrapper[4795]: I0219 21:48:31.010451 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Feb 19 21:48:31 crc kubenswrapper[4795]: I0219 21:48:31.019812 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 21:48:31 crc kubenswrapper[4795]: I0219 21:48:31.165128 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/296f6b57-de45-495d-abe9-8c779c157057-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"296f6b57-de45-495d-abe9-8c779c157057\") " pod="openstack/kube-state-metrics-0" Feb 19 21:48:31 crc kubenswrapper[4795]: I0219 21:48:31.165374 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/296f6b57-de45-495d-abe9-8c779c157057-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"296f6b57-de45-495d-abe9-8c779c157057\") " 
pod="openstack/kube-state-metrics-0" Feb 19 21:48:31 crc kubenswrapper[4795]: I0219 21:48:31.165471 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/296f6b57-de45-495d-abe9-8c779c157057-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"296f6b57-de45-495d-abe9-8c779c157057\") " pod="openstack/kube-state-metrics-0" Feb 19 21:48:31 crc kubenswrapper[4795]: I0219 21:48:31.165755 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlmk9\" (UniqueName: \"kubernetes.io/projected/296f6b57-de45-495d-abe9-8c779c157057-kube-api-access-zlmk9\") pod \"kube-state-metrics-0\" (UID: \"296f6b57-de45-495d-abe9-8c779c157057\") " pod="openstack/kube-state-metrics-0" Feb 19 21:48:31 crc kubenswrapper[4795]: I0219 21:48:31.211876 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:48:31 crc kubenswrapper[4795]: I0219 21:48:31.212664 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="60557b75-bafe-4e92-937c-e541b84aaf70" containerName="ceilometer-central-agent" containerID="cri-o://8985ce1aa24fbc5f5bb8026bb427b370dba124cba19d3928ff7c3a92ca42615b" gracePeriod=30 Feb 19 21:48:31 crc kubenswrapper[4795]: I0219 21:48:31.213021 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="60557b75-bafe-4e92-937c-e541b84aaf70" containerName="proxy-httpd" containerID="cri-o://f916e17382481e444593150e0a833b6a6814157585865c277cdc15cc441ddb4b" gracePeriod=30 Feb 19 21:48:31 crc kubenswrapper[4795]: I0219 21:48:31.213260 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="60557b75-bafe-4e92-937c-e541b84aaf70" containerName="ceilometer-notification-agent" 
containerID="cri-o://1631e868548b8541b4586b49303d99f204a0c9cbd4601bd40535a084257996e8" gracePeriod=30 Feb 19 21:48:31 crc kubenswrapper[4795]: I0219 21:48:31.213322 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="60557b75-bafe-4e92-937c-e541b84aaf70" containerName="sg-core" containerID="cri-o://bd25350fb77f428e282ee6c0811cc7be2f53b67333596fcfe27b4e218e3549da" gracePeriod=30 Feb 19 21:48:31 crc kubenswrapper[4795]: I0219 21:48:31.267711 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/296f6b57-de45-495d-abe9-8c779c157057-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"296f6b57-de45-495d-abe9-8c779c157057\") " pod="openstack/kube-state-metrics-0" Feb 19 21:48:31 crc kubenswrapper[4795]: I0219 21:48:31.267790 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/296f6b57-de45-495d-abe9-8c779c157057-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"296f6b57-de45-495d-abe9-8c779c157057\") " pod="openstack/kube-state-metrics-0" Feb 19 21:48:31 crc kubenswrapper[4795]: I0219 21:48:31.267889 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlmk9\" (UniqueName: \"kubernetes.io/projected/296f6b57-de45-495d-abe9-8c779c157057-kube-api-access-zlmk9\") pod \"kube-state-metrics-0\" (UID: \"296f6b57-de45-495d-abe9-8c779c157057\") " pod="openstack/kube-state-metrics-0" Feb 19 21:48:31 crc kubenswrapper[4795]: I0219 21:48:31.267960 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/296f6b57-de45-495d-abe9-8c779c157057-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"296f6b57-de45-495d-abe9-8c779c157057\") " 
pod="openstack/kube-state-metrics-0" Feb 19 21:48:31 crc kubenswrapper[4795]: I0219 21:48:31.273095 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/296f6b57-de45-495d-abe9-8c779c157057-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"296f6b57-de45-495d-abe9-8c779c157057\") " pod="openstack/kube-state-metrics-0" Feb 19 21:48:31 crc kubenswrapper[4795]: I0219 21:48:31.279278 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/296f6b57-de45-495d-abe9-8c779c157057-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"296f6b57-de45-495d-abe9-8c779c157057\") " pod="openstack/kube-state-metrics-0" Feb 19 21:48:31 crc kubenswrapper[4795]: I0219 21:48:31.279681 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/296f6b57-de45-495d-abe9-8c779c157057-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"296f6b57-de45-495d-abe9-8c779c157057\") " pod="openstack/kube-state-metrics-0" Feb 19 21:48:31 crc kubenswrapper[4795]: I0219 21:48:31.283395 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlmk9\" (UniqueName: \"kubernetes.io/projected/296f6b57-de45-495d-abe9-8c779c157057-kube-api-access-zlmk9\") pod \"kube-state-metrics-0\" (UID: \"296f6b57-de45-495d-abe9-8c779c157057\") " pod="openstack/kube-state-metrics-0" Feb 19 21:48:31 crc kubenswrapper[4795]: I0219 21:48:31.337914 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 19 21:48:31 crc kubenswrapper[4795]: I0219 21:48:31.535485 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb3b374e-f01b-4997-9ecf-fbeeb384cc2c" path="/var/lib/kubelet/pods/bb3b374e-f01b-4997-9ecf-fbeeb384cc2c/volumes" Feb 19 21:48:31 crc kubenswrapper[4795]: I0219 21:48:31.800231 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 21:48:31 crc kubenswrapper[4795]: W0219 21:48:31.801625 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod296f6b57_de45_495d_abe9_8c779c157057.slice/crio-b8dfa3bc80470864a239275349fad5e129ee2b5e9ff5a093105f56a1aaa790d3 WatchSource:0}: Error finding container b8dfa3bc80470864a239275349fad5e129ee2b5e9ff5a093105f56a1aaa790d3: Status 404 returned error can't find the container with id b8dfa3bc80470864a239275349fad5e129ee2b5e9ff5a093105f56a1aaa790d3 Feb 19 21:48:31 crc kubenswrapper[4795]: I0219 21:48:31.946861 4795 generic.go:334] "Generic (PLEG): container finished" podID="60557b75-bafe-4e92-937c-e541b84aaf70" containerID="f916e17382481e444593150e0a833b6a6814157585865c277cdc15cc441ddb4b" exitCode=0 Feb 19 21:48:31 crc kubenswrapper[4795]: I0219 21:48:31.947096 4795 generic.go:334] "Generic (PLEG): container finished" podID="60557b75-bafe-4e92-937c-e541b84aaf70" containerID="bd25350fb77f428e282ee6c0811cc7be2f53b67333596fcfe27b4e218e3549da" exitCode=2 Feb 19 21:48:31 crc kubenswrapper[4795]: I0219 21:48:31.947199 4795 generic.go:334] "Generic (PLEG): container finished" podID="60557b75-bafe-4e92-937c-e541b84aaf70" containerID="8985ce1aa24fbc5f5bb8026bb427b370dba124cba19d3928ff7c3a92ca42615b" exitCode=0 Feb 19 21:48:31 crc kubenswrapper[4795]: I0219 21:48:31.946962 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"60557b75-bafe-4e92-937c-e541b84aaf70","Type":"ContainerDied","Data":"f916e17382481e444593150e0a833b6a6814157585865c277cdc15cc441ddb4b"} Feb 19 21:48:31 crc kubenswrapper[4795]: I0219 21:48:31.947421 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"60557b75-bafe-4e92-937c-e541b84aaf70","Type":"ContainerDied","Data":"bd25350fb77f428e282ee6c0811cc7be2f53b67333596fcfe27b4e218e3549da"} Feb 19 21:48:31 crc kubenswrapper[4795]: I0219 21:48:31.947444 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"60557b75-bafe-4e92-937c-e541b84aaf70","Type":"ContainerDied","Data":"8985ce1aa24fbc5f5bb8026bb427b370dba124cba19d3928ff7c3a92ca42615b"} Feb 19 21:48:31 crc kubenswrapper[4795]: I0219 21:48:31.951578 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"296f6b57-de45-495d-abe9-8c779c157057","Type":"ContainerStarted","Data":"b8dfa3bc80470864a239275349fad5e129ee2b5e9ff5a093105f56a1aaa790d3"} Feb 19 21:48:32 crc kubenswrapper[4795]: I0219 21:48:32.961422 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"296f6b57-de45-495d-abe9-8c779c157057","Type":"ContainerStarted","Data":"5a4765254bd1f522ba57b50f9f6989619b42ec50e10b34373400dceff9cc29f7"} Feb 19 21:48:32 crc kubenswrapper[4795]: I0219 21:48:32.961854 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 19 21:48:32 crc kubenswrapper[4795]: I0219 21:48:32.990229 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.537163662 podStartE2EDuration="2.99020322s" podCreationTimestamp="2026-02-19 21:48:30 +0000 UTC" firstStartedPulling="2026-02-19 21:48:31.803762107 +0000 UTC m=+1222.996279971" lastFinishedPulling="2026-02-19 21:48:32.256801665 +0000 UTC m=+1223.449319529" 
observedRunningTime="2026-02-19 21:48:32.977230756 +0000 UTC m=+1224.169748650" watchObservedRunningTime="2026-02-19 21:48:32.99020322 +0000 UTC m=+1224.182721124" Feb 19 21:48:33 crc kubenswrapper[4795]: I0219 21:48:33.265302 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Feb 19 21:48:33 crc kubenswrapper[4795]: I0219 21:48:33.273106 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 19 21:48:34 crc kubenswrapper[4795]: I0219 21:48:34.396543 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:48:34 crc kubenswrapper[4795]: I0219 21:48:34.534985 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60557b75-bafe-4e92-937c-e541b84aaf70-combined-ca-bundle\") pod \"60557b75-bafe-4e92-937c-e541b84aaf70\" (UID: \"60557b75-bafe-4e92-937c-e541b84aaf70\") " Feb 19 21:48:34 crc kubenswrapper[4795]: I0219 21:48:34.535036 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60557b75-bafe-4e92-937c-e541b84aaf70-scripts\") pod \"60557b75-bafe-4e92-937c-e541b84aaf70\" (UID: \"60557b75-bafe-4e92-937c-e541b84aaf70\") " Feb 19 21:48:34 crc kubenswrapper[4795]: I0219 21:48:34.535072 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzxr6\" (UniqueName: \"kubernetes.io/projected/60557b75-bafe-4e92-937c-e541b84aaf70-kube-api-access-gzxr6\") pod \"60557b75-bafe-4e92-937c-e541b84aaf70\" (UID: \"60557b75-bafe-4e92-937c-e541b84aaf70\") " Feb 19 21:48:34 crc kubenswrapper[4795]: I0219 21:48:34.535098 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60557b75-bafe-4e92-937c-e541b84aaf70-config-data\") pod 
\"60557b75-bafe-4e92-937c-e541b84aaf70\" (UID: \"60557b75-bafe-4e92-937c-e541b84aaf70\") " Feb 19 21:48:34 crc kubenswrapper[4795]: I0219 21:48:34.535141 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60557b75-bafe-4e92-937c-e541b84aaf70-run-httpd\") pod \"60557b75-bafe-4e92-937c-e541b84aaf70\" (UID: \"60557b75-bafe-4e92-937c-e541b84aaf70\") " Feb 19 21:48:34 crc kubenswrapper[4795]: I0219 21:48:34.535225 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60557b75-bafe-4e92-937c-e541b84aaf70-log-httpd\") pod \"60557b75-bafe-4e92-937c-e541b84aaf70\" (UID: \"60557b75-bafe-4e92-937c-e541b84aaf70\") " Feb 19 21:48:34 crc kubenswrapper[4795]: I0219 21:48:34.536181 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/60557b75-bafe-4e92-937c-e541b84aaf70-sg-core-conf-yaml\") pod \"60557b75-bafe-4e92-937c-e541b84aaf70\" (UID: \"60557b75-bafe-4e92-937c-e541b84aaf70\") " Feb 19 21:48:34 crc kubenswrapper[4795]: I0219 21:48:34.537558 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60557b75-bafe-4e92-937c-e541b84aaf70-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "60557b75-bafe-4e92-937c-e541b84aaf70" (UID: "60557b75-bafe-4e92-937c-e541b84aaf70"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:48:34 crc kubenswrapper[4795]: I0219 21:48:34.537974 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60557b75-bafe-4e92-937c-e541b84aaf70-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "60557b75-bafe-4e92-937c-e541b84aaf70" (UID: "60557b75-bafe-4e92-937c-e541b84aaf70"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:48:34 crc kubenswrapper[4795]: I0219 21:48:34.541947 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60557b75-bafe-4e92-937c-e541b84aaf70-kube-api-access-gzxr6" (OuterVolumeSpecName: "kube-api-access-gzxr6") pod "60557b75-bafe-4e92-937c-e541b84aaf70" (UID: "60557b75-bafe-4e92-937c-e541b84aaf70"). InnerVolumeSpecName "kube-api-access-gzxr6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:48:34 crc kubenswrapper[4795]: I0219 21:48:34.542080 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60557b75-bafe-4e92-937c-e541b84aaf70-scripts" (OuterVolumeSpecName: "scripts") pod "60557b75-bafe-4e92-937c-e541b84aaf70" (UID: "60557b75-bafe-4e92-937c-e541b84aaf70"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:48:34 crc kubenswrapper[4795]: I0219 21:48:34.551666 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 19 21:48:34 crc kubenswrapper[4795]: I0219 21:48:34.551697 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 19 21:48:34 crc kubenswrapper[4795]: I0219 21:48:34.584320 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60557b75-bafe-4e92-937c-e541b84aaf70-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "60557b75-bafe-4e92-937c-e541b84aaf70" (UID: "60557b75-bafe-4e92-937c-e541b84aaf70"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:48:34 crc kubenswrapper[4795]: I0219 21:48:34.636054 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60557b75-bafe-4e92-937c-e541b84aaf70-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "60557b75-bafe-4e92-937c-e541b84aaf70" (UID: "60557b75-bafe-4e92-937c-e541b84aaf70"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:48:34 crc kubenswrapper[4795]: I0219 21:48:34.637867 4795 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60557b75-bafe-4e92-937c-e541b84aaf70-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:34 crc kubenswrapper[4795]: I0219 21:48:34.637897 4795 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/60557b75-bafe-4e92-937c-e541b84aaf70-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:34 crc kubenswrapper[4795]: I0219 21:48:34.637907 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60557b75-bafe-4e92-937c-e541b84aaf70-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:34 crc kubenswrapper[4795]: I0219 21:48:34.637916 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60557b75-bafe-4e92-937c-e541b84aaf70-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:34 crc kubenswrapper[4795]: I0219 21:48:34.637925 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzxr6\" (UniqueName: \"kubernetes.io/projected/60557b75-bafe-4e92-937c-e541b84aaf70-kube-api-access-gzxr6\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:34 crc kubenswrapper[4795]: I0219 21:48:34.637933 4795 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/60557b75-bafe-4e92-937c-e541b84aaf70-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:34 crc kubenswrapper[4795]: I0219 21:48:34.651546 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60557b75-bafe-4e92-937c-e541b84aaf70-config-data" (OuterVolumeSpecName: "config-data") pod "60557b75-bafe-4e92-937c-e541b84aaf70" (UID: "60557b75-bafe-4e92-937c-e541b84aaf70"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:48:34 crc kubenswrapper[4795]: I0219 21:48:34.743386 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60557b75-bafe-4e92-937c-e541b84aaf70-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:34 crc kubenswrapper[4795]: I0219 21:48:34.992946 4795 generic.go:334] "Generic (PLEG): container finished" podID="60557b75-bafe-4e92-937c-e541b84aaf70" containerID="1631e868548b8541b4586b49303d99f204a0c9cbd4601bd40535a084257996e8" exitCode=0 Feb 19 21:48:34 crc kubenswrapper[4795]: I0219 21:48:34.992984 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"60557b75-bafe-4e92-937c-e541b84aaf70","Type":"ContainerDied","Data":"1631e868548b8541b4586b49303d99f204a0c9cbd4601bd40535a084257996e8"} Feb 19 21:48:34 crc kubenswrapper[4795]: I0219 21:48:34.993010 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"60557b75-bafe-4e92-937c-e541b84aaf70","Type":"ContainerDied","Data":"2512790f4175863e7da7d55dc8c6ebb57bbf253fa486687fa567e90d3c41dca6"} Feb 19 21:48:34 crc kubenswrapper[4795]: I0219 21:48:34.993026 4795 scope.go:117] "RemoveContainer" containerID="f916e17382481e444593150e0a833b6a6814157585865c277cdc15cc441ddb4b" Feb 19 21:48:34 crc kubenswrapper[4795]: I0219 21:48:34.993146 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.054338 4795 scope.go:117] "RemoveContainer" containerID="bd25350fb77f428e282ee6c0811cc7be2f53b67333596fcfe27b4e218e3549da" Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.072064 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.085286 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.094791 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:48:35 crc kubenswrapper[4795]: E0219 21:48:35.095270 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60557b75-bafe-4e92-937c-e541b84aaf70" containerName="ceilometer-central-agent" Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.095293 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="60557b75-bafe-4e92-937c-e541b84aaf70" containerName="ceilometer-central-agent" Feb 19 21:48:35 crc kubenswrapper[4795]: E0219 21:48:35.095311 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60557b75-bafe-4e92-937c-e541b84aaf70" containerName="sg-core" Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.095319 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="60557b75-bafe-4e92-937c-e541b84aaf70" containerName="sg-core" Feb 19 21:48:35 crc kubenswrapper[4795]: E0219 21:48:35.095339 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60557b75-bafe-4e92-937c-e541b84aaf70" containerName="proxy-httpd" Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.095346 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="60557b75-bafe-4e92-937c-e541b84aaf70" containerName="proxy-httpd" Feb 19 21:48:35 crc kubenswrapper[4795]: E0219 21:48:35.095366 4795 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="60557b75-bafe-4e92-937c-e541b84aaf70" containerName="ceilometer-notification-agent" Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.095373 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="60557b75-bafe-4e92-937c-e541b84aaf70" containerName="ceilometer-notification-agent" Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.095556 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="60557b75-bafe-4e92-937c-e541b84aaf70" containerName="ceilometer-notification-agent" Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.095574 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="60557b75-bafe-4e92-937c-e541b84aaf70" containerName="sg-core" Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.095586 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="60557b75-bafe-4e92-937c-e541b84aaf70" containerName="proxy-httpd" Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.095599 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="60557b75-bafe-4e92-937c-e541b84aaf70" containerName="ceilometer-central-agent" Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.097406 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0"
Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.099205 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.099551 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.099890 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.104808 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.127399 4795 scope.go:117] "RemoveContainer" containerID="1631e868548b8541b4586b49303d99f204a0c9cbd4601bd40535a084257996e8"
Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.148422 4795 scope.go:117] "RemoveContainer" containerID="8985ce1aa24fbc5f5bb8026bb427b370dba124cba19d3928ff7c3a92ca42615b"
Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.171257 4795 scope.go:117] "RemoveContainer" containerID="f916e17382481e444593150e0a833b6a6814157585865c277cdc15cc441ddb4b"
Feb 19 21:48:35 crc kubenswrapper[4795]: E0219 21:48:35.171775 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f916e17382481e444593150e0a833b6a6814157585865c277cdc15cc441ddb4b\": container with ID starting with f916e17382481e444593150e0a833b6a6814157585865c277cdc15cc441ddb4b not found: ID does not exist" containerID="f916e17382481e444593150e0a833b6a6814157585865c277cdc15cc441ddb4b"
Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.171892 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f916e17382481e444593150e0a833b6a6814157585865c277cdc15cc441ddb4b"} err="failed to get container status \"f916e17382481e444593150e0a833b6a6814157585865c277cdc15cc441ddb4b\": rpc error: code = NotFound desc = could not find container \"f916e17382481e444593150e0a833b6a6814157585865c277cdc15cc441ddb4b\": container with ID starting with f916e17382481e444593150e0a833b6a6814157585865c277cdc15cc441ddb4b not found: ID does not exist"
Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.171982 4795 scope.go:117] "RemoveContainer" containerID="bd25350fb77f428e282ee6c0811cc7be2f53b67333596fcfe27b4e218e3549da"
Feb 19 21:48:35 crc kubenswrapper[4795]: E0219 21:48:35.172403 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd25350fb77f428e282ee6c0811cc7be2f53b67333596fcfe27b4e218e3549da\": container with ID starting with bd25350fb77f428e282ee6c0811cc7be2f53b67333596fcfe27b4e218e3549da not found: ID does not exist" containerID="bd25350fb77f428e282ee6c0811cc7be2f53b67333596fcfe27b4e218e3549da"
Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.172424 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd25350fb77f428e282ee6c0811cc7be2f53b67333596fcfe27b4e218e3549da"} err="failed to get container status \"bd25350fb77f428e282ee6c0811cc7be2f53b67333596fcfe27b4e218e3549da\": rpc error: code = NotFound desc = could not find container \"bd25350fb77f428e282ee6c0811cc7be2f53b67333596fcfe27b4e218e3549da\": container with ID starting with bd25350fb77f428e282ee6c0811cc7be2f53b67333596fcfe27b4e218e3549da not found: ID does not exist"
Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.172438 4795 scope.go:117] "RemoveContainer" containerID="1631e868548b8541b4586b49303d99f204a0c9cbd4601bd40535a084257996e8"
Feb 19 21:48:35 crc kubenswrapper[4795]: E0219 21:48:35.172668 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1631e868548b8541b4586b49303d99f204a0c9cbd4601bd40535a084257996e8\": container with ID starting with 1631e868548b8541b4586b49303d99f204a0c9cbd4601bd40535a084257996e8 not found: ID does not exist" containerID="1631e868548b8541b4586b49303d99f204a0c9cbd4601bd40535a084257996e8"
Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.172748 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1631e868548b8541b4586b49303d99f204a0c9cbd4601bd40535a084257996e8"} err="failed to get container status \"1631e868548b8541b4586b49303d99f204a0c9cbd4601bd40535a084257996e8\": rpc error: code = NotFound desc = could not find container \"1631e868548b8541b4586b49303d99f204a0c9cbd4601bd40535a084257996e8\": container with ID starting with 1631e868548b8541b4586b49303d99f204a0c9cbd4601bd40535a084257996e8 not found: ID does not exist"
Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.172813 4795 scope.go:117] "RemoveContainer" containerID="8985ce1aa24fbc5f5bb8026bb427b370dba124cba19d3928ff7c3a92ca42615b"
Feb 19 21:48:35 crc kubenswrapper[4795]: E0219 21:48:35.173086 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8985ce1aa24fbc5f5bb8026bb427b370dba124cba19d3928ff7c3a92ca42615b\": container with ID starting with 8985ce1aa24fbc5f5bb8026bb427b370dba124cba19d3928ff7c3a92ca42615b not found: ID does not exist" containerID="8985ce1aa24fbc5f5bb8026bb427b370dba124cba19d3928ff7c3a92ca42615b"
Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.173157 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8985ce1aa24fbc5f5bb8026bb427b370dba124cba19d3928ff7c3a92ca42615b"} err="failed to get container status \"8985ce1aa24fbc5f5bb8026bb427b370dba124cba19d3928ff7c3a92ca42615b\": rpc error: code = NotFound desc = could not find container \"8985ce1aa24fbc5f5bb8026bb427b370dba124cba19d3928ff7c3a92ca42615b\": container with ID starting with 8985ce1aa24fbc5f5bb8026bb427b370dba124cba19d3928ff7c3a92ca42615b not found: ID does not exist"
Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.253441 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/621537d8-aeb6-42fa-842d-fb45f36c97f6-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"621537d8-aeb6-42fa-842d-fb45f36c97f6\") " pod="openstack/ceilometer-0"
Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.253737 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/621537d8-aeb6-42fa-842d-fb45f36c97f6-log-httpd\") pod \"ceilometer-0\" (UID: \"621537d8-aeb6-42fa-842d-fb45f36c97f6\") " pod="openstack/ceilometer-0"
Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.253875 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wd9mf\" (UniqueName: \"kubernetes.io/projected/621537d8-aeb6-42fa-842d-fb45f36c97f6-kube-api-access-wd9mf\") pod \"ceilometer-0\" (UID: \"621537d8-aeb6-42fa-842d-fb45f36c97f6\") " pod="openstack/ceilometer-0"
Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.253953 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/621537d8-aeb6-42fa-842d-fb45f36c97f6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"621537d8-aeb6-42fa-842d-fb45f36c97f6\") " pod="openstack/ceilometer-0"
Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.254045 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/621537d8-aeb6-42fa-842d-fb45f36c97f6-config-data\") pod \"ceilometer-0\" (UID: \"621537d8-aeb6-42fa-842d-fb45f36c97f6\") " pod="openstack/ceilometer-0"
Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.254139 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/621537d8-aeb6-42fa-842d-fb45f36c97f6-run-httpd\") pod \"ceilometer-0\" (UID: \"621537d8-aeb6-42fa-842d-fb45f36c97f6\") " pod="openstack/ceilometer-0"
Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.254238 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/621537d8-aeb6-42fa-842d-fb45f36c97f6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"621537d8-aeb6-42fa-842d-fb45f36c97f6\") " pod="openstack/ceilometer-0"
Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.254410 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/621537d8-aeb6-42fa-842d-fb45f36c97f6-scripts\") pod \"ceilometer-0\" (UID: \"621537d8-aeb6-42fa-842d-fb45f36c97f6\") " pod="openstack/ceilometer-0"
Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.356040 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/621537d8-aeb6-42fa-842d-fb45f36c97f6-log-httpd\") pod \"ceilometer-0\" (UID: \"621537d8-aeb6-42fa-842d-fb45f36c97f6\") " pod="openstack/ceilometer-0"
Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.356095 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wd9mf\" (UniqueName: \"kubernetes.io/projected/621537d8-aeb6-42fa-842d-fb45f36c97f6-kube-api-access-wd9mf\") pod \"ceilometer-0\" (UID: \"621537d8-aeb6-42fa-842d-fb45f36c97f6\") " pod="openstack/ceilometer-0"
Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.356128 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/621537d8-aeb6-42fa-842d-fb45f36c97f6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"621537d8-aeb6-42fa-842d-fb45f36c97f6\") " pod="openstack/ceilometer-0"
Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.356158 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/621537d8-aeb6-42fa-842d-fb45f36c97f6-config-data\") pod \"ceilometer-0\" (UID: \"621537d8-aeb6-42fa-842d-fb45f36c97f6\") " pod="openstack/ceilometer-0"
Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.356219 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/621537d8-aeb6-42fa-842d-fb45f36c97f6-run-httpd\") pod \"ceilometer-0\" (UID: \"621537d8-aeb6-42fa-842d-fb45f36c97f6\") " pod="openstack/ceilometer-0"
Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.356243 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/621537d8-aeb6-42fa-842d-fb45f36c97f6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"621537d8-aeb6-42fa-842d-fb45f36c97f6\") " pod="openstack/ceilometer-0"
Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.356352 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/621537d8-aeb6-42fa-842d-fb45f36c97f6-scripts\") pod \"ceilometer-0\" (UID: \"621537d8-aeb6-42fa-842d-fb45f36c97f6\") " pod="openstack/ceilometer-0"
Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.356410 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/621537d8-aeb6-42fa-842d-fb45f36c97f6-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"621537d8-aeb6-42fa-842d-fb45f36c97f6\") " pod="openstack/ceilometer-0"
Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.356647 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/621537d8-aeb6-42fa-842d-fb45f36c97f6-log-httpd\") pod \"ceilometer-0\" (UID: \"621537d8-aeb6-42fa-842d-fb45f36c97f6\") " pod="openstack/ceilometer-0"
Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.357217 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/621537d8-aeb6-42fa-842d-fb45f36c97f6-run-httpd\") pod \"ceilometer-0\" (UID: \"621537d8-aeb6-42fa-842d-fb45f36c97f6\") " pod="openstack/ceilometer-0"
Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.362006 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/621537d8-aeb6-42fa-842d-fb45f36c97f6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"621537d8-aeb6-42fa-842d-fb45f36c97f6\") " pod="openstack/ceilometer-0"
Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.362733 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/621537d8-aeb6-42fa-842d-fb45f36c97f6-scripts\") pod \"ceilometer-0\" (UID: \"621537d8-aeb6-42fa-842d-fb45f36c97f6\") " pod="openstack/ceilometer-0"
Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.363195 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/621537d8-aeb6-42fa-842d-fb45f36c97f6-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"621537d8-aeb6-42fa-842d-fb45f36c97f6\") " pod="openstack/ceilometer-0"
Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.364701 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/621537d8-aeb6-42fa-842d-fb45f36c97f6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"621537d8-aeb6-42fa-842d-fb45f36c97f6\") " pod="openstack/ceilometer-0"
Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.372204 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/621537d8-aeb6-42fa-842d-fb45f36c97f6-config-data\") pod \"ceilometer-0\" (UID: \"621537d8-aeb6-42fa-842d-fb45f36c97f6\") " pod="openstack/ceilometer-0"
Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.377568 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wd9mf\" (UniqueName: \"kubernetes.io/projected/621537d8-aeb6-42fa-842d-fb45f36c97f6-kube-api-access-wd9mf\") pod \"ceilometer-0\" (UID: \"621537d8-aeb6-42fa-842d-fb45f36c97f6\") " pod="openstack/ceilometer-0"
Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.431723 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.521936 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60557b75-bafe-4e92-937c-e541b84aaf70" path="/var/lib/kubelet/pods/60557b75-bafe-4e92-937c-e541b84aaf70/volumes"
Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.565283 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.195:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.565908 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.195:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 19 21:48:35 crc kubenswrapper[4795]: W0219 21:48:35.871227 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod621537d8_aeb6_42fa_842d_fb45f36c97f6.slice/crio-ac61b3ede19f57aaca2d6d35d68c23fd53c055f4e5a6557a9c61b51dfdcd0f58 WatchSource:0}: Error finding container ac61b3ede19f57aaca2d6d35d68c23fd53c055f4e5a6557a9c61b51dfdcd0f58: Status 404 returned error can't find the container with id ac61b3ede19f57aaca2d6d35d68c23fd53c055f4e5a6557a9c61b51dfdcd0f58
Feb 19 21:48:35 crc kubenswrapper[4795]: I0219 21:48:35.874041 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 21:48:36 crc kubenswrapper[4795]: I0219 21:48:36.006718 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"621537d8-aeb6-42fa-842d-fb45f36c97f6","Type":"ContainerStarted","Data":"ac61b3ede19f57aaca2d6d35d68c23fd53c055f4e5a6557a9c61b51dfdcd0f58"}
Feb 19 21:48:37 crc kubenswrapper[4795]: I0219 21:48:37.019606 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"621537d8-aeb6-42fa-842d-fb45f36c97f6","Type":"ContainerStarted","Data":"b1b28d1c52ce3f489b010dcfd7b837144a7cef0672ea49ce480fe45e6bf3617d"}
Feb 19 21:48:38 crc kubenswrapper[4795]: I0219 21:48:38.032101 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"621537d8-aeb6-42fa-842d-fb45f36c97f6","Type":"ContainerStarted","Data":"8fe83d7d7776c6b0f45e4b4637592d93329691b60de8268539524e1a0a6e9ef3"}
Feb 19 21:48:38 crc kubenswrapper[4795]: I0219 21:48:38.272936 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Feb 19 21:48:38 crc kubenswrapper[4795]: I0219 21:48:38.303180 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Feb 19 21:48:39 crc kubenswrapper[4795]: I0219 21:48:39.047880 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"621537d8-aeb6-42fa-842d-fb45f36c97f6","Type":"ContainerStarted","Data":"a788f3417d2afefc2a01186d42309d4971e8fc20f66a806d1ba37dc07835932b"}
Feb 19 21:48:39 crc kubenswrapper[4795]: I0219 21:48:39.077031 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Feb 19 21:48:39 crc kubenswrapper[4795]: I0219 21:48:39.579513 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 19 21:48:39 crc kubenswrapper[4795]: I0219 21:48:39.579987 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 19 21:48:40 crc kubenswrapper[4795]: I0219 21:48:40.620422 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="01859e80-9d51-4db2-8a48-9ad45d901f16" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.197:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 19 21:48:40 crc kubenswrapper[4795]: I0219 21:48:40.620456 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="01859e80-9d51-4db2-8a48-9ad45d901f16" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.197:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 19 21:48:41 crc kubenswrapper[4795]: I0219 21:48:41.079121 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"621537d8-aeb6-42fa-842d-fb45f36c97f6","Type":"ContainerStarted","Data":"be0856d1ea223fc3a885a538e8752150b14a3fdcf5a3de8464ad71a3fb5d020a"}
Feb 19 21:48:41 crc kubenswrapper[4795]: I0219 21:48:41.079423 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Feb 19 21:48:41 crc kubenswrapper[4795]: I0219 21:48:41.126077 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.095338536 podStartE2EDuration="6.126052119s" podCreationTimestamp="2026-02-19 21:48:35 +0000 UTC" firstStartedPulling="2026-02-19 21:48:35.875696669 +0000 UTC m=+1227.068214553" lastFinishedPulling="2026-02-19 21:48:39.906410262 +0000 UTC m=+1231.098928136" observedRunningTime="2026-02-19 21:48:41.108153936 +0000 UTC m=+1232.300671860" watchObservedRunningTime="2026-02-19 21:48:41.126052119 +0000 UTC m=+1232.318570003"
Feb 19 21:48:41 crc kubenswrapper[4795]: I0219 21:48:41.353465 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Feb 19 21:48:44 crc kubenswrapper[4795]: I0219 21:48:44.557202 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 19 21:48:44 crc kubenswrapper[4795]: I0219 21:48:44.559657 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 19 21:48:44 crc kubenswrapper[4795]: I0219 21:48:44.562588 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Feb 19 21:48:45 crc kubenswrapper[4795]: I0219 21:48:45.140514 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.058974 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.154928 4795 generic.go:334] "Generic (PLEG): container finished" podID="4dabff2c-427b-4307-b949-23fdde980292" containerID="0985f63802fd352b3aee39c14704d73368fdf114b51ac51270903568f12ddc45" exitCode=137
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.154968 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4dabff2c-427b-4307-b949-23fdde980292","Type":"ContainerDied","Data":"0985f63802fd352b3aee39c14704d73368fdf114b51ac51270903568f12ddc45"}
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.155000 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.155014 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4dabff2c-427b-4307-b949-23fdde980292","Type":"ContainerDied","Data":"7a24e1fe5ec311d17f9a97363d6465ddfdef8ead02652c4242800fc85c6ff620"}
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.155034 4795 scope.go:117] "RemoveContainer" containerID="0985f63802fd352b3aee39c14704d73368fdf114b51ac51270903568f12ddc45"
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.180658 4795 scope.go:117] "RemoveContainer" containerID="0985f63802fd352b3aee39c14704d73368fdf114b51ac51270903568f12ddc45"
Feb 19 21:48:47 crc kubenswrapper[4795]: E0219 21:48:47.181085 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0985f63802fd352b3aee39c14704d73368fdf114b51ac51270903568f12ddc45\": container with ID starting with 0985f63802fd352b3aee39c14704d73368fdf114b51ac51270903568f12ddc45 not found: ID does not exist" containerID="0985f63802fd352b3aee39c14704d73368fdf114b51ac51270903568f12ddc45"
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.181130 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0985f63802fd352b3aee39c14704d73368fdf114b51ac51270903568f12ddc45"} err="failed to get container status \"0985f63802fd352b3aee39c14704d73368fdf114b51ac51270903568f12ddc45\": rpc error: code = NotFound desc = could not find container \"0985f63802fd352b3aee39c14704d73368fdf114b51ac51270903568f12ddc45\": container with ID starting with 0985f63802fd352b3aee39c14704d73368fdf114b51ac51270903568f12ddc45 not found: ID does not exist"
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.186801 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29qhd\" (UniqueName: \"kubernetes.io/projected/4dabff2c-427b-4307-b949-23fdde980292-kube-api-access-29qhd\") pod \"4dabff2c-427b-4307-b949-23fdde980292\" (UID: \"4dabff2c-427b-4307-b949-23fdde980292\") "
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.187015 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dabff2c-427b-4307-b949-23fdde980292-combined-ca-bundle\") pod \"4dabff2c-427b-4307-b949-23fdde980292\" (UID: \"4dabff2c-427b-4307-b949-23fdde980292\") "
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.187173 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dabff2c-427b-4307-b949-23fdde980292-config-data\") pod \"4dabff2c-427b-4307-b949-23fdde980292\" (UID: \"4dabff2c-427b-4307-b949-23fdde980292\") "
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.194927 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4dabff2c-427b-4307-b949-23fdde980292-kube-api-access-29qhd" (OuterVolumeSpecName: "kube-api-access-29qhd") pod "4dabff2c-427b-4307-b949-23fdde980292" (UID: "4dabff2c-427b-4307-b949-23fdde980292"). InnerVolumeSpecName "kube-api-access-29qhd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.216431 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dabff2c-427b-4307-b949-23fdde980292-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4dabff2c-427b-4307-b949-23fdde980292" (UID: "4dabff2c-427b-4307-b949-23fdde980292"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.232068 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dabff2c-427b-4307-b949-23fdde980292-config-data" (OuterVolumeSpecName: "config-data") pod "4dabff2c-427b-4307-b949-23fdde980292" (UID: "4dabff2c-427b-4307-b949-23fdde980292"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.289457 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dabff2c-427b-4307-b949-23fdde980292-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.289504 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dabff2c-427b-4307-b949-23fdde980292-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.289522 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29qhd\" (UniqueName: \"kubernetes.io/projected/4dabff2c-427b-4307-b949-23fdde980292-kube-api-access-29qhd\") on node \"crc\" DevicePath \"\""
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.540557 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.540603 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.552271 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 19 21:48:47 crc kubenswrapper[4795]: E0219 21:48:47.552781 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dabff2c-427b-4307-b949-23fdde980292" containerName="nova-cell1-novncproxy-novncproxy"
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.552805 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dabff2c-427b-4307-b949-23fdde980292" containerName="nova-cell1-novncproxy-novncproxy"
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.553068 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dabff2c-427b-4307-b949-23fdde980292" containerName="nova-cell1-novncproxy-novncproxy"
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.553834 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.557684 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc"
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.564828 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.564996 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt"
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.579689 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.696770 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/0adadcd9-8949-443b-8042-d0d11191eae9-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"0adadcd9-8949-443b-8042-d0d11191eae9\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.697177 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cz5kj\" (UniqueName: \"kubernetes.io/projected/0adadcd9-8949-443b-8042-d0d11191eae9-kube-api-access-cz5kj\") pod \"nova-cell1-novncproxy-0\" (UID: \"0adadcd9-8949-443b-8042-d0d11191eae9\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.697228 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0adadcd9-8949-443b-8042-d0d11191eae9-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0adadcd9-8949-443b-8042-d0d11191eae9\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.697285 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0adadcd9-8949-443b-8042-d0d11191eae9-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0adadcd9-8949-443b-8042-d0d11191eae9\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.697316 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/0adadcd9-8949-443b-8042-d0d11191eae9-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"0adadcd9-8949-443b-8042-d0d11191eae9\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.799361 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/0adadcd9-8949-443b-8042-d0d11191eae9-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"0adadcd9-8949-443b-8042-d0d11191eae9\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.799411 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cz5kj\" (UniqueName: \"kubernetes.io/projected/0adadcd9-8949-443b-8042-d0d11191eae9-kube-api-access-cz5kj\") pod \"nova-cell1-novncproxy-0\" (UID: \"0adadcd9-8949-443b-8042-d0d11191eae9\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.799447 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0adadcd9-8949-443b-8042-d0d11191eae9-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0adadcd9-8949-443b-8042-d0d11191eae9\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.799488 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0adadcd9-8949-443b-8042-d0d11191eae9-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0adadcd9-8949-443b-8042-d0d11191eae9\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.799510 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/0adadcd9-8949-443b-8042-d0d11191eae9-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"0adadcd9-8949-443b-8042-d0d11191eae9\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.803034 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0adadcd9-8949-443b-8042-d0d11191eae9-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0adadcd9-8949-443b-8042-d0d11191eae9\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.803382 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0adadcd9-8949-443b-8042-d0d11191eae9-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0adadcd9-8949-443b-8042-d0d11191eae9\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.803481 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/0adadcd9-8949-443b-8042-d0d11191eae9-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"0adadcd9-8949-443b-8042-d0d11191eae9\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.803438 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/0adadcd9-8949-443b-8042-d0d11191eae9-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"0adadcd9-8949-443b-8042-d0d11191eae9\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.821229 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cz5kj\" (UniqueName: \"kubernetes.io/projected/0adadcd9-8949-443b-8042-d0d11191eae9-kube-api-access-cz5kj\") pod \"nova-cell1-novncproxy-0\" (UID: \"0adadcd9-8949-443b-8042-d0d11191eae9\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 19 21:48:47 crc kubenswrapper[4795]: I0219 21:48:47.904256 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 19 21:48:48 crc kubenswrapper[4795]: W0219 21:48:48.362346 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0adadcd9_8949_443b_8042_d0d11191eae9.slice/crio-6555271b3d98ead4145953a8b269ef0cca6677e13c5b918d43b6b24ff869130e WatchSource:0}: Error finding container 6555271b3d98ead4145953a8b269ef0cca6677e13c5b918d43b6b24ff869130e: Status 404 returned error can't find the container with id 6555271b3d98ead4145953a8b269ef0cca6677e13c5b918d43b6b24ff869130e
Feb 19 21:48:48 crc kubenswrapper[4795]: I0219 21:48:48.369750 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 19 21:48:49 crc kubenswrapper[4795]: I0219 21:48:49.182778 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0adadcd9-8949-443b-8042-d0d11191eae9","Type":"ContainerStarted","Data":"0f773609fa3d94e54bc6067d53960a0ec55209c713b15b0460c6c5287772d27f"}
Feb 19 21:48:49 crc kubenswrapper[4795]: I0219 21:48:49.183190 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0adadcd9-8949-443b-8042-d0d11191eae9","Type":"ContainerStarted","Data":"6555271b3d98ead4145953a8b269ef0cca6677e13c5b918d43b6b24ff869130e"}
Feb 19 21:48:49 crc kubenswrapper[4795]: I0219 21:48:49.221427 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.221396907 podStartE2EDuration="2.221396907s" podCreationTimestamp="2026-02-19 21:48:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:48:49.202346532 +0000 UTC m=+1240.394864396" watchObservedRunningTime="2026-02-19 21:48:49.221396907 +0000 UTC m=+1240.413914821"
Feb 19 21:48:49 crc kubenswrapper[4795]: I0219 21:48:49.532265 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4dabff2c-427b-4307-b949-23fdde980292" path="/var/lib/kubelet/pods/4dabff2c-427b-4307-b949-23fdde980292/volumes"
Feb 19 21:48:49 crc kubenswrapper[4795]: I0219 21:48:49.590583 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Feb 19 21:48:49 crc kubenswrapper[4795]: I0219 21:48:49.592609 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Feb 19 21:48:49 crc kubenswrapper[4795]: I0219 21:48:49.592747 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Feb 19 21:48:49 crc kubenswrapper[4795]: I0219 21:48:49.607698 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Feb 19 21:48:50 crc kubenswrapper[4795]: I0219 21:48:50.193341 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Feb 19 21:48:50 crc kubenswrapper[4795]: I0219 21:48:50.199999 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Feb 19 21:48:50 crc kubenswrapper[4795]: I0219 21:48:50.401946 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7677694455-vk4hv"]
Feb 19 21:48:50 crc kubenswrapper[4795]: I0219 21:48:50.403504 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7677694455-vk4hv"
Feb 19 21:48:50 crc kubenswrapper[4795]: I0219 21:48:50.432843 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7677694455-vk4hv"]
Feb 19 21:48:50 crc kubenswrapper[4795]: I0219 21:48:50.556466 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/74df8ac0-77f2-4e8d-aa39-d05dc6ce7190-ovsdbserver-nb\") pod \"dnsmasq-dns-7677694455-vk4hv\" (UID: \"74df8ac0-77f2-4e8d-aa39-d05dc6ce7190\") " pod="openstack/dnsmasq-dns-7677694455-vk4hv"
Feb 19 21:48:50 crc kubenswrapper[4795]: I0219 21:48:50.556521 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/74df8ac0-77f2-4e8d-aa39-d05dc6ce7190-ovsdbserver-sb\") pod \"dnsmasq-dns-7677694455-vk4hv\" (UID: \"74df8ac0-77f2-4e8d-aa39-d05dc6ce7190\") " pod="openstack/dnsmasq-dns-7677694455-vk4hv"
Feb 19 21:48:50 crc kubenswrapper[4795]: I0219 21:48:50.556596 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/74df8ac0-77f2-4e8d-aa39-d05dc6ce7190-dns-svc\") pod \"dnsmasq-dns-7677694455-vk4hv\" (UID: \"74df8ac0-77f2-4e8d-aa39-d05dc6ce7190\") " pod="openstack/dnsmasq-dns-7677694455-vk4hv"
Feb 19 21:48:50 crc kubenswrapper[4795]: I0219 21:48:50.556668 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8xm5\" (UniqueName: \"kubernetes.io/projected/74df8ac0-77f2-4e8d-aa39-d05dc6ce7190-kube-api-access-s8xm5\") pod \"dnsmasq-dns-7677694455-vk4hv\"
(UID: \"74df8ac0-77f2-4e8d-aa39-d05dc6ce7190\") " pod="openstack/dnsmasq-dns-7677694455-vk4hv" Feb 19 21:48:50 crc kubenswrapper[4795]: I0219 21:48:50.556713 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/74df8ac0-77f2-4e8d-aa39-d05dc6ce7190-dns-swift-storage-0\") pod \"dnsmasq-dns-7677694455-vk4hv\" (UID: \"74df8ac0-77f2-4e8d-aa39-d05dc6ce7190\") " pod="openstack/dnsmasq-dns-7677694455-vk4hv" Feb 19 21:48:50 crc kubenswrapper[4795]: I0219 21:48:50.556752 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74df8ac0-77f2-4e8d-aa39-d05dc6ce7190-config\") pod \"dnsmasq-dns-7677694455-vk4hv\" (UID: \"74df8ac0-77f2-4e8d-aa39-d05dc6ce7190\") " pod="openstack/dnsmasq-dns-7677694455-vk4hv" Feb 19 21:48:50 crc kubenswrapper[4795]: I0219 21:48:50.658091 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/74df8ac0-77f2-4e8d-aa39-d05dc6ce7190-ovsdbserver-nb\") pod \"dnsmasq-dns-7677694455-vk4hv\" (UID: \"74df8ac0-77f2-4e8d-aa39-d05dc6ce7190\") " pod="openstack/dnsmasq-dns-7677694455-vk4hv" Feb 19 21:48:50 crc kubenswrapper[4795]: I0219 21:48:50.658149 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/74df8ac0-77f2-4e8d-aa39-d05dc6ce7190-ovsdbserver-sb\") pod \"dnsmasq-dns-7677694455-vk4hv\" (UID: \"74df8ac0-77f2-4e8d-aa39-d05dc6ce7190\") " pod="openstack/dnsmasq-dns-7677694455-vk4hv" Feb 19 21:48:50 crc kubenswrapper[4795]: I0219 21:48:50.658266 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/74df8ac0-77f2-4e8d-aa39-d05dc6ce7190-dns-svc\") pod \"dnsmasq-dns-7677694455-vk4hv\" (UID: 
\"74df8ac0-77f2-4e8d-aa39-d05dc6ce7190\") " pod="openstack/dnsmasq-dns-7677694455-vk4hv" Feb 19 21:48:50 crc kubenswrapper[4795]: I0219 21:48:50.658345 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8xm5\" (UniqueName: \"kubernetes.io/projected/74df8ac0-77f2-4e8d-aa39-d05dc6ce7190-kube-api-access-s8xm5\") pod \"dnsmasq-dns-7677694455-vk4hv\" (UID: \"74df8ac0-77f2-4e8d-aa39-d05dc6ce7190\") " pod="openstack/dnsmasq-dns-7677694455-vk4hv" Feb 19 21:48:50 crc kubenswrapper[4795]: I0219 21:48:50.658417 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/74df8ac0-77f2-4e8d-aa39-d05dc6ce7190-dns-swift-storage-0\") pod \"dnsmasq-dns-7677694455-vk4hv\" (UID: \"74df8ac0-77f2-4e8d-aa39-d05dc6ce7190\") " pod="openstack/dnsmasq-dns-7677694455-vk4hv" Feb 19 21:48:50 crc kubenswrapper[4795]: I0219 21:48:50.658453 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74df8ac0-77f2-4e8d-aa39-d05dc6ce7190-config\") pod \"dnsmasq-dns-7677694455-vk4hv\" (UID: \"74df8ac0-77f2-4e8d-aa39-d05dc6ce7190\") " pod="openstack/dnsmasq-dns-7677694455-vk4hv" Feb 19 21:48:50 crc kubenswrapper[4795]: I0219 21:48:50.660224 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/74df8ac0-77f2-4e8d-aa39-d05dc6ce7190-ovsdbserver-nb\") pod \"dnsmasq-dns-7677694455-vk4hv\" (UID: \"74df8ac0-77f2-4e8d-aa39-d05dc6ce7190\") " pod="openstack/dnsmasq-dns-7677694455-vk4hv" Feb 19 21:48:50 crc kubenswrapper[4795]: I0219 21:48:50.660827 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/74df8ac0-77f2-4e8d-aa39-d05dc6ce7190-ovsdbserver-sb\") pod \"dnsmasq-dns-7677694455-vk4hv\" (UID: \"74df8ac0-77f2-4e8d-aa39-d05dc6ce7190\") " 
pod="openstack/dnsmasq-dns-7677694455-vk4hv" Feb 19 21:48:50 crc kubenswrapper[4795]: I0219 21:48:50.661392 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/74df8ac0-77f2-4e8d-aa39-d05dc6ce7190-dns-swift-storage-0\") pod \"dnsmasq-dns-7677694455-vk4hv\" (UID: \"74df8ac0-77f2-4e8d-aa39-d05dc6ce7190\") " pod="openstack/dnsmasq-dns-7677694455-vk4hv" Feb 19 21:48:50 crc kubenswrapper[4795]: I0219 21:48:50.661771 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74df8ac0-77f2-4e8d-aa39-d05dc6ce7190-config\") pod \"dnsmasq-dns-7677694455-vk4hv\" (UID: \"74df8ac0-77f2-4e8d-aa39-d05dc6ce7190\") " pod="openstack/dnsmasq-dns-7677694455-vk4hv" Feb 19 21:48:50 crc kubenswrapper[4795]: I0219 21:48:50.662320 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/74df8ac0-77f2-4e8d-aa39-d05dc6ce7190-dns-svc\") pod \"dnsmasq-dns-7677694455-vk4hv\" (UID: \"74df8ac0-77f2-4e8d-aa39-d05dc6ce7190\") " pod="openstack/dnsmasq-dns-7677694455-vk4hv" Feb 19 21:48:50 crc kubenswrapper[4795]: I0219 21:48:50.682142 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8xm5\" (UniqueName: \"kubernetes.io/projected/74df8ac0-77f2-4e8d-aa39-d05dc6ce7190-kube-api-access-s8xm5\") pod \"dnsmasq-dns-7677694455-vk4hv\" (UID: \"74df8ac0-77f2-4e8d-aa39-d05dc6ce7190\") " pod="openstack/dnsmasq-dns-7677694455-vk4hv" Feb 19 21:48:50 crc kubenswrapper[4795]: I0219 21:48:50.731896 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7677694455-vk4hv" Feb 19 21:48:51 crc kubenswrapper[4795]: I0219 21:48:51.191565 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7677694455-vk4hv"] Feb 19 21:48:52 crc kubenswrapper[4795]: I0219 21:48:52.092761 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:48:52 crc kubenswrapper[4795]: I0219 21:48:52.093434 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="621537d8-aeb6-42fa-842d-fb45f36c97f6" containerName="ceilometer-central-agent" containerID="cri-o://b1b28d1c52ce3f489b010dcfd7b837144a7cef0672ea49ce480fe45e6bf3617d" gracePeriod=30 Feb 19 21:48:52 crc kubenswrapper[4795]: I0219 21:48:52.093530 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="621537d8-aeb6-42fa-842d-fb45f36c97f6" containerName="ceilometer-notification-agent" containerID="cri-o://8fe83d7d7776c6b0f45e4b4637592d93329691b60de8268539524e1a0a6e9ef3" gracePeriod=30 Feb 19 21:48:52 crc kubenswrapper[4795]: I0219 21:48:52.093533 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="621537d8-aeb6-42fa-842d-fb45f36c97f6" containerName="sg-core" containerID="cri-o://a788f3417d2afefc2a01186d42309d4971e8fc20f66a806d1ba37dc07835932b" gracePeriod=30 Feb 19 21:48:52 crc kubenswrapper[4795]: I0219 21:48:52.093561 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="621537d8-aeb6-42fa-842d-fb45f36c97f6" containerName="proxy-httpd" containerID="cri-o://be0856d1ea223fc3a885a538e8752150b14a3fdcf5a3de8464ad71a3fb5d020a" gracePeriod=30 Feb 19 21:48:52 crc kubenswrapper[4795]: I0219 21:48:52.099918 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="621537d8-aeb6-42fa-842d-fb45f36c97f6" 
containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.199:3000/\": read tcp 10.217.0.2:55466->10.217.0.199:3000: read: connection reset by peer" Feb 19 21:48:52 crc kubenswrapper[4795]: I0219 21:48:52.210509 4795 generic.go:334] "Generic (PLEG): container finished" podID="74df8ac0-77f2-4e8d-aa39-d05dc6ce7190" containerID="445b39298e77e683eee2d951a286aa4f9de79beb92f370dca4d81f0dfe56d255" exitCode=0 Feb 19 21:48:52 crc kubenswrapper[4795]: I0219 21:48:52.210585 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7677694455-vk4hv" event={"ID":"74df8ac0-77f2-4e8d-aa39-d05dc6ce7190","Type":"ContainerDied","Data":"445b39298e77e683eee2d951a286aa4f9de79beb92f370dca4d81f0dfe56d255"} Feb 19 21:48:52 crc kubenswrapper[4795]: I0219 21:48:52.210638 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7677694455-vk4hv" event={"ID":"74df8ac0-77f2-4e8d-aa39-d05dc6ce7190","Type":"ContainerStarted","Data":"cdc18778f810815fa368ef0cc45dbb0e103ffbfcfa9231a83aa5be2dd6cfe1c2"} Feb 19 21:48:52 crc kubenswrapper[4795]: I0219 21:48:52.905153 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 19 21:48:53 crc kubenswrapper[4795]: I0219 21:48:53.019764 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 19 21:48:53 crc kubenswrapper[4795]: I0219 21:48:53.221916 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7677694455-vk4hv" event={"ID":"74df8ac0-77f2-4e8d-aa39-d05dc6ce7190","Type":"ContainerStarted","Data":"0b266e887f3cb16fe191b1e64cc8d8adf7464d874863c315ba4605ce8d79b678"} Feb 19 21:48:53 crc kubenswrapper[4795]: I0219 21:48:53.222084 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7677694455-vk4hv" Feb 19 21:48:53 crc kubenswrapper[4795]: I0219 21:48:53.224262 4795 generic.go:334] "Generic (PLEG): container finished" 
podID="621537d8-aeb6-42fa-842d-fb45f36c97f6" containerID="be0856d1ea223fc3a885a538e8752150b14a3fdcf5a3de8464ad71a3fb5d020a" exitCode=0 Feb 19 21:48:53 crc kubenswrapper[4795]: I0219 21:48:53.224283 4795 generic.go:334] "Generic (PLEG): container finished" podID="621537d8-aeb6-42fa-842d-fb45f36c97f6" containerID="a788f3417d2afefc2a01186d42309d4971e8fc20f66a806d1ba37dc07835932b" exitCode=2 Feb 19 21:48:53 crc kubenswrapper[4795]: I0219 21:48:53.224290 4795 generic.go:334] "Generic (PLEG): container finished" podID="621537d8-aeb6-42fa-842d-fb45f36c97f6" containerID="b1b28d1c52ce3f489b010dcfd7b837144a7cef0672ea49ce480fe45e6bf3617d" exitCode=0 Feb 19 21:48:53 crc kubenswrapper[4795]: I0219 21:48:53.224504 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="01859e80-9d51-4db2-8a48-9ad45d901f16" containerName="nova-api-log" containerID="cri-o://cd6c91a8bbdff67fb129ad494141346fb0c9ffacbd675189463e7bb2a6d75d94" gracePeriod=30 Feb 19 21:48:53 crc kubenswrapper[4795]: I0219 21:48:53.224600 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"621537d8-aeb6-42fa-842d-fb45f36c97f6","Type":"ContainerDied","Data":"be0856d1ea223fc3a885a538e8752150b14a3fdcf5a3de8464ad71a3fb5d020a"} Feb 19 21:48:53 crc kubenswrapper[4795]: I0219 21:48:53.224622 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"621537d8-aeb6-42fa-842d-fb45f36c97f6","Type":"ContainerDied","Data":"a788f3417d2afefc2a01186d42309d4971e8fc20f66a806d1ba37dc07835932b"} Feb 19 21:48:53 crc kubenswrapper[4795]: I0219 21:48:53.224649 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"621537d8-aeb6-42fa-842d-fb45f36c97f6","Type":"ContainerDied","Data":"b1b28d1c52ce3f489b010dcfd7b837144a7cef0672ea49ce480fe45e6bf3617d"} Feb 19 21:48:53 crc kubenswrapper[4795]: I0219 21:48:53.224696 4795 kuberuntime_container.go:808] "Killing container with 
a grace period" pod="openstack/nova-api-0" podUID="01859e80-9d51-4db2-8a48-9ad45d901f16" containerName="nova-api-api" containerID="cri-o://a08e7a43bb7eb9940934542dc40899cae39a62cfccafdc60ea968829591c4b59" gracePeriod=30 Feb 19 21:48:53 crc kubenswrapper[4795]: I0219 21:48:53.242252 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7677694455-vk4hv" podStartSLOduration=3.242234383 podStartE2EDuration="3.242234383s" podCreationTimestamp="2026-02-19 21:48:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:48:53.240994939 +0000 UTC m=+1244.433512803" watchObservedRunningTime="2026-02-19 21:48:53.242234383 +0000 UTC m=+1244.434752257" Feb 19 21:48:53 crc kubenswrapper[4795]: I0219 21:48:53.981153 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.132272 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/621537d8-aeb6-42fa-842d-fb45f36c97f6-combined-ca-bundle\") pod \"621537d8-aeb6-42fa-842d-fb45f36c97f6\" (UID: \"621537d8-aeb6-42fa-842d-fb45f36c97f6\") " Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.132340 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/621537d8-aeb6-42fa-842d-fb45f36c97f6-ceilometer-tls-certs\") pod \"621537d8-aeb6-42fa-842d-fb45f36c97f6\" (UID: \"621537d8-aeb6-42fa-842d-fb45f36c97f6\") " Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.132366 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/621537d8-aeb6-42fa-842d-fb45f36c97f6-scripts\") pod \"621537d8-aeb6-42fa-842d-fb45f36c97f6\" (UID: 
\"621537d8-aeb6-42fa-842d-fb45f36c97f6\") " Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.132419 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/621537d8-aeb6-42fa-842d-fb45f36c97f6-log-httpd\") pod \"621537d8-aeb6-42fa-842d-fb45f36c97f6\" (UID: \"621537d8-aeb6-42fa-842d-fb45f36c97f6\") " Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.132456 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/621537d8-aeb6-42fa-842d-fb45f36c97f6-run-httpd\") pod \"621537d8-aeb6-42fa-842d-fb45f36c97f6\" (UID: \"621537d8-aeb6-42fa-842d-fb45f36c97f6\") " Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.132506 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/621537d8-aeb6-42fa-842d-fb45f36c97f6-config-data\") pod \"621537d8-aeb6-42fa-842d-fb45f36c97f6\" (UID: \"621537d8-aeb6-42fa-842d-fb45f36c97f6\") " Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.132569 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/621537d8-aeb6-42fa-842d-fb45f36c97f6-sg-core-conf-yaml\") pod \"621537d8-aeb6-42fa-842d-fb45f36c97f6\" (UID: \"621537d8-aeb6-42fa-842d-fb45f36c97f6\") " Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.132631 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wd9mf\" (UniqueName: \"kubernetes.io/projected/621537d8-aeb6-42fa-842d-fb45f36c97f6-kube-api-access-wd9mf\") pod \"621537d8-aeb6-42fa-842d-fb45f36c97f6\" (UID: \"621537d8-aeb6-42fa-842d-fb45f36c97f6\") " Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.132917 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/621537d8-aeb6-42fa-842d-fb45f36c97f6-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "621537d8-aeb6-42fa-842d-fb45f36c97f6" (UID: "621537d8-aeb6-42fa-842d-fb45f36c97f6"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.133019 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/621537d8-aeb6-42fa-842d-fb45f36c97f6-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "621537d8-aeb6-42fa-842d-fb45f36c97f6" (UID: "621537d8-aeb6-42fa-842d-fb45f36c97f6"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.133054 4795 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/621537d8-aeb6-42fa-842d-fb45f36c97f6-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.138438 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/621537d8-aeb6-42fa-842d-fb45f36c97f6-scripts" (OuterVolumeSpecName: "scripts") pod "621537d8-aeb6-42fa-842d-fb45f36c97f6" (UID: "621537d8-aeb6-42fa-842d-fb45f36c97f6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.151379 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/621537d8-aeb6-42fa-842d-fb45f36c97f6-kube-api-access-wd9mf" (OuterVolumeSpecName: "kube-api-access-wd9mf") pod "621537d8-aeb6-42fa-842d-fb45f36c97f6" (UID: "621537d8-aeb6-42fa-842d-fb45f36c97f6"). InnerVolumeSpecName "kube-api-access-wd9mf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.169541 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/621537d8-aeb6-42fa-842d-fb45f36c97f6-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "621537d8-aeb6-42fa-842d-fb45f36c97f6" (UID: "621537d8-aeb6-42fa-842d-fb45f36c97f6"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.198457 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/621537d8-aeb6-42fa-842d-fb45f36c97f6-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "621537d8-aeb6-42fa-842d-fb45f36c97f6" (UID: "621537d8-aeb6-42fa-842d-fb45f36c97f6"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.224655 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/621537d8-aeb6-42fa-842d-fb45f36c97f6-config-data" (OuterVolumeSpecName: "config-data") pod "621537d8-aeb6-42fa-842d-fb45f36c97f6" (UID: "621537d8-aeb6-42fa-842d-fb45f36c97f6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.233914 4795 generic.go:334] "Generic (PLEG): container finished" podID="01859e80-9d51-4db2-8a48-9ad45d901f16" containerID="cd6c91a8bbdff67fb129ad494141346fb0c9ffacbd675189463e7bb2a6d75d94" exitCode=143 Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.233993 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"01859e80-9d51-4db2-8a48-9ad45d901f16","Type":"ContainerDied","Data":"cd6c91a8bbdff67fb129ad494141346fb0c9ffacbd675189463e7bb2a6d75d94"} Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.234257 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/621537d8-aeb6-42fa-842d-fb45f36c97f6-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.234280 4795 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/621537d8-aeb6-42fa-842d-fb45f36c97f6-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.234290 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wd9mf\" (UniqueName: \"kubernetes.io/projected/621537d8-aeb6-42fa-842d-fb45f36c97f6-kube-api-access-wd9mf\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.234299 4795 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/621537d8-aeb6-42fa-842d-fb45f36c97f6-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.234307 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/621537d8-aeb6-42fa-842d-fb45f36c97f6-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 
21:48:54.234316 4795 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/621537d8-aeb6-42fa-842d-fb45f36c97f6-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.236240 4795 generic.go:334] "Generic (PLEG): container finished" podID="621537d8-aeb6-42fa-842d-fb45f36c97f6" containerID="8fe83d7d7776c6b0f45e4b4637592d93329691b60de8268539524e1a0a6e9ef3" exitCode=0 Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.236268 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"621537d8-aeb6-42fa-842d-fb45f36c97f6","Type":"ContainerDied","Data":"8fe83d7d7776c6b0f45e4b4637592d93329691b60de8268539524e1a0a6e9ef3"} Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.236296 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"621537d8-aeb6-42fa-842d-fb45f36c97f6","Type":"ContainerDied","Data":"ac61b3ede19f57aaca2d6d35d68c23fd53c055f4e5a6557a9c61b51dfdcd0f58"} Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.236311 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.236314 4795 scope.go:117] "RemoveContainer" containerID="be0856d1ea223fc3a885a538e8752150b14a3fdcf5a3de8464ad71a3fb5d020a" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.253330 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/621537d8-aeb6-42fa-842d-fb45f36c97f6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "621537d8-aeb6-42fa-842d-fb45f36c97f6" (UID: "621537d8-aeb6-42fa-842d-fb45f36c97f6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.292286 4795 scope.go:117] "RemoveContainer" containerID="a788f3417d2afefc2a01186d42309d4971e8fc20f66a806d1ba37dc07835932b" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.310511 4795 scope.go:117] "RemoveContainer" containerID="8fe83d7d7776c6b0f45e4b4637592d93329691b60de8268539524e1a0a6e9ef3" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.329653 4795 scope.go:117] "RemoveContainer" containerID="b1b28d1c52ce3f489b010dcfd7b837144a7cef0672ea49ce480fe45e6bf3617d" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.340337 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/621537d8-aeb6-42fa-842d-fb45f36c97f6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.350944 4795 scope.go:117] "RemoveContainer" containerID="be0856d1ea223fc3a885a538e8752150b14a3fdcf5a3de8464ad71a3fb5d020a" Feb 19 21:48:54 crc kubenswrapper[4795]: E0219 21:48:54.351518 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be0856d1ea223fc3a885a538e8752150b14a3fdcf5a3de8464ad71a3fb5d020a\": container with ID starting with be0856d1ea223fc3a885a538e8752150b14a3fdcf5a3de8464ad71a3fb5d020a not found: ID does not exist" containerID="be0856d1ea223fc3a885a538e8752150b14a3fdcf5a3de8464ad71a3fb5d020a" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.351553 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be0856d1ea223fc3a885a538e8752150b14a3fdcf5a3de8464ad71a3fb5d020a"} err="failed to get container status \"be0856d1ea223fc3a885a538e8752150b14a3fdcf5a3de8464ad71a3fb5d020a\": rpc error: code = NotFound desc = could not find container \"be0856d1ea223fc3a885a538e8752150b14a3fdcf5a3de8464ad71a3fb5d020a\": container with ID starting 
with be0856d1ea223fc3a885a538e8752150b14a3fdcf5a3de8464ad71a3fb5d020a not found: ID does not exist" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.351584 4795 scope.go:117] "RemoveContainer" containerID="a788f3417d2afefc2a01186d42309d4971e8fc20f66a806d1ba37dc07835932b" Feb 19 21:48:54 crc kubenswrapper[4795]: E0219 21:48:54.351985 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a788f3417d2afefc2a01186d42309d4971e8fc20f66a806d1ba37dc07835932b\": container with ID starting with a788f3417d2afefc2a01186d42309d4971e8fc20f66a806d1ba37dc07835932b not found: ID does not exist" containerID="a788f3417d2afefc2a01186d42309d4971e8fc20f66a806d1ba37dc07835932b" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.352019 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a788f3417d2afefc2a01186d42309d4971e8fc20f66a806d1ba37dc07835932b"} err="failed to get container status \"a788f3417d2afefc2a01186d42309d4971e8fc20f66a806d1ba37dc07835932b\": rpc error: code = NotFound desc = could not find container \"a788f3417d2afefc2a01186d42309d4971e8fc20f66a806d1ba37dc07835932b\": container with ID starting with a788f3417d2afefc2a01186d42309d4971e8fc20f66a806d1ba37dc07835932b not found: ID does not exist" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.352038 4795 scope.go:117] "RemoveContainer" containerID="8fe83d7d7776c6b0f45e4b4637592d93329691b60de8268539524e1a0a6e9ef3" Feb 19 21:48:54 crc kubenswrapper[4795]: E0219 21:48:54.352362 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8fe83d7d7776c6b0f45e4b4637592d93329691b60de8268539524e1a0a6e9ef3\": container with ID starting with 8fe83d7d7776c6b0f45e4b4637592d93329691b60de8268539524e1a0a6e9ef3 not found: ID does not exist" containerID="8fe83d7d7776c6b0f45e4b4637592d93329691b60de8268539524e1a0a6e9ef3" Feb 19 21:48:54 
crc kubenswrapper[4795]: I0219 21:48:54.352400 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fe83d7d7776c6b0f45e4b4637592d93329691b60de8268539524e1a0a6e9ef3"} err="failed to get container status \"8fe83d7d7776c6b0f45e4b4637592d93329691b60de8268539524e1a0a6e9ef3\": rpc error: code = NotFound desc = could not find container \"8fe83d7d7776c6b0f45e4b4637592d93329691b60de8268539524e1a0a6e9ef3\": container with ID starting with 8fe83d7d7776c6b0f45e4b4637592d93329691b60de8268539524e1a0a6e9ef3 not found: ID does not exist" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.352426 4795 scope.go:117] "RemoveContainer" containerID="b1b28d1c52ce3f489b010dcfd7b837144a7cef0672ea49ce480fe45e6bf3617d" Feb 19 21:48:54 crc kubenswrapper[4795]: E0219 21:48:54.352690 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1b28d1c52ce3f489b010dcfd7b837144a7cef0672ea49ce480fe45e6bf3617d\": container with ID starting with b1b28d1c52ce3f489b010dcfd7b837144a7cef0672ea49ce480fe45e6bf3617d not found: ID does not exist" containerID="b1b28d1c52ce3f489b010dcfd7b837144a7cef0672ea49ce480fe45e6bf3617d" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.352712 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1b28d1c52ce3f489b010dcfd7b837144a7cef0672ea49ce480fe45e6bf3617d"} err="failed to get container status \"b1b28d1c52ce3f489b010dcfd7b837144a7cef0672ea49ce480fe45e6bf3617d\": rpc error: code = NotFound desc = could not find container \"b1b28d1c52ce3f489b010dcfd7b837144a7cef0672ea49ce480fe45e6bf3617d\": container with ID starting with b1b28d1c52ce3f489b010dcfd7b837144a7cef0672ea49ce480fe45e6bf3617d not found: ID does not exist" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.569093 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:48:54 crc 
kubenswrapper[4795]: I0219 21:48:54.581957 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.601455 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:48:54 crc kubenswrapper[4795]: E0219 21:48:54.601920 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="621537d8-aeb6-42fa-842d-fb45f36c97f6" containerName="ceilometer-notification-agent" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.601946 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="621537d8-aeb6-42fa-842d-fb45f36c97f6" containerName="ceilometer-notification-agent" Feb 19 21:48:54 crc kubenswrapper[4795]: E0219 21:48:54.601962 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="621537d8-aeb6-42fa-842d-fb45f36c97f6" containerName="proxy-httpd" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.601970 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="621537d8-aeb6-42fa-842d-fb45f36c97f6" containerName="proxy-httpd" Feb 19 21:48:54 crc kubenswrapper[4795]: E0219 21:48:54.601985 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="621537d8-aeb6-42fa-842d-fb45f36c97f6" containerName="sg-core" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.601993 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="621537d8-aeb6-42fa-842d-fb45f36c97f6" containerName="sg-core" Feb 19 21:48:54 crc kubenswrapper[4795]: E0219 21:48:54.602023 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="621537d8-aeb6-42fa-842d-fb45f36c97f6" containerName="ceilometer-central-agent" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.602032 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="621537d8-aeb6-42fa-842d-fb45f36c97f6" containerName="ceilometer-central-agent" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.602267 4795 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="621537d8-aeb6-42fa-842d-fb45f36c97f6" containerName="sg-core" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.602297 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="621537d8-aeb6-42fa-842d-fb45f36c97f6" containerName="ceilometer-notification-agent" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.602314 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="621537d8-aeb6-42fa-842d-fb45f36c97f6" containerName="proxy-httpd" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.602328 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="621537d8-aeb6-42fa-842d-fb45f36c97f6" containerName="ceilometer-central-agent" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.604380 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.606398 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.609131 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.609467 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.614461 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.745916 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5\") " pod="openstack/ceilometer-0" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.745961 4795 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-scripts\") pod \"ceilometer-0\" (UID: \"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5\") " pod="openstack/ceilometer-0" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.745989 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-config-data\") pod \"ceilometer-0\" (UID: \"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5\") " pod="openstack/ceilometer-0" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.746047 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5\") " pod="openstack/ceilometer-0" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.746095 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5\") " pod="openstack/ceilometer-0" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.746276 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-run-httpd\") pod \"ceilometer-0\" (UID: \"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5\") " pod="openstack/ceilometer-0" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.746628 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-246kt\" (UniqueName: 
\"kubernetes.io/projected/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-kube-api-access-246kt\") pod \"ceilometer-0\" (UID: \"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5\") " pod="openstack/ceilometer-0" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.746691 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-log-httpd\") pod \"ceilometer-0\" (UID: \"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5\") " pod="openstack/ceilometer-0" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.848420 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-246kt\" (UniqueName: \"kubernetes.io/projected/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-kube-api-access-246kt\") pod \"ceilometer-0\" (UID: \"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5\") " pod="openstack/ceilometer-0" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.848470 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-log-httpd\") pod \"ceilometer-0\" (UID: \"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5\") " pod="openstack/ceilometer-0" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.848497 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5\") " pod="openstack/ceilometer-0" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.848517 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-scripts\") pod \"ceilometer-0\" (UID: \"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5\") " pod="openstack/ceilometer-0" Feb 19 21:48:54 crc 
kubenswrapper[4795]: I0219 21:48:54.848540 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-config-data\") pod \"ceilometer-0\" (UID: \"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5\") " pod="openstack/ceilometer-0" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.848582 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5\") " pod="openstack/ceilometer-0" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.848630 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5\") " pod="openstack/ceilometer-0" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.848702 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-run-httpd\") pod \"ceilometer-0\" (UID: \"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5\") " pod="openstack/ceilometer-0" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.849352 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-run-httpd\") pod \"ceilometer-0\" (UID: \"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5\") " pod="openstack/ceilometer-0" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.849367 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-log-httpd\") pod \"ceilometer-0\" 
(UID: \"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5\") " pod="openstack/ceilometer-0" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.857727 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5\") " pod="openstack/ceilometer-0" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.857842 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5\") " pod="openstack/ceilometer-0" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.858071 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5\") " pod="openstack/ceilometer-0" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.858177 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-config-data\") pod \"ceilometer-0\" (UID: \"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5\") " pod="openstack/ceilometer-0" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.864821 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-scripts\") pod \"ceilometer-0\" (UID: \"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5\") " pod="openstack/ceilometer-0" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.866902 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-246kt\" (UniqueName: 
\"kubernetes.io/projected/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-kube-api-access-246kt\") pod \"ceilometer-0\" (UID: \"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5\") " pod="openstack/ceilometer-0" Feb 19 21:48:54 crc kubenswrapper[4795]: I0219 21:48:54.919123 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:48:55 crc kubenswrapper[4795]: I0219 21:48:55.347875 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:48:55 crc kubenswrapper[4795]: W0219 21:48:55.353122 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod01d2bcc0_aacc_413f_bc5e_36f3aa7a4ed5.slice/crio-86abd155d63f2c0bc5b785f25429ac5c2d08a30858af34af122f1895bdc9fbc8 WatchSource:0}: Error finding container 86abd155d63f2c0bc5b785f25429ac5c2d08a30858af34af122f1895bdc9fbc8: Status 404 returned error can't find the container with id 86abd155d63f2c0bc5b785f25429ac5c2d08a30858af34af122f1895bdc9fbc8 Feb 19 21:48:55 crc kubenswrapper[4795]: I0219 21:48:55.369129 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:48:55 crc kubenswrapper[4795]: I0219 21:48:55.521500 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="621537d8-aeb6-42fa-842d-fb45f36c97f6" path="/var/lib/kubelet/pods/621537d8-aeb6-42fa-842d-fb45f36c97f6/volumes" Feb 19 21:48:56 crc kubenswrapper[4795]: I0219 21:48:56.269788 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5","Type":"ContainerStarted","Data":"323b724b3b048a0396352d1a1fdda411a293b7403fd5ff6b7c5a586be082ee66"} Feb 19 21:48:56 crc kubenswrapper[4795]: I0219 21:48:56.270126 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5","Type":"ContainerStarted","Data":"86abd155d63f2c0bc5b785f25429ac5c2d08a30858af34af122f1895bdc9fbc8"} Feb 19 21:48:56 crc kubenswrapper[4795]: I0219 21:48:56.746135 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 19 21:48:56 crc kubenswrapper[4795]: I0219 21:48:56.894726 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01859e80-9d51-4db2-8a48-9ad45d901f16-logs\") pod \"01859e80-9d51-4db2-8a48-9ad45d901f16\" (UID: \"01859e80-9d51-4db2-8a48-9ad45d901f16\") " Feb 19 21:48:56 crc kubenswrapper[4795]: I0219 21:48:56.894787 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01859e80-9d51-4db2-8a48-9ad45d901f16-combined-ca-bundle\") pod \"01859e80-9d51-4db2-8a48-9ad45d901f16\" (UID: \"01859e80-9d51-4db2-8a48-9ad45d901f16\") " Feb 19 21:48:56 crc kubenswrapper[4795]: I0219 21:48:56.894972 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01859e80-9d51-4db2-8a48-9ad45d901f16-config-data\") pod \"01859e80-9d51-4db2-8a48-9ad45d901f16\" (UID: \"01859e80-9d51-4db2-8a48-9ad45d901f16\") " Feb 19 21:48:56 crc kubenswrapper[4795]: I0219 21:48:56.895030 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9krq\" (UniqueName: \"kubernetes.io/projected/01859e80-9d51-4db2-8a48-9ad45d901f16-kube-api-access-s9krq\") pod \"01859e80-9d51-4db2-8a48-9ad45d901f16\" (UID: \"01859e80-9d51-4db2-8a48-9ad45d901f16\") " Feb 19 21:48:56 crc kubenswrapper[4795]: I0219 21:48:56.895558 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01859e80-9d51-4db2-8a48-9ad45d901f16-logs" (OuterVolumeSpecName: "logs") pod 
"01859e80-9d51-4db2-8a48-9ad45d901f16" (UID: "01859e80-9d51-4db2-8a48-9ad45d901f16"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:48:56 crc kubenswrapper[4795]: I0219 21:48:56.907408 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01859e80-9d51-4db2-8a48-9ad45d901f16-kube-api-access-s9krq" (OuterVolumeSpecName: "kube-api-access-s9krq") pod "01859e80-9d51-4db2-8a48-9ad45d901f16" (UID: "01859e80-9d51-4db2-8a48-9ad45d901f16"). InnerVolumeSpecName "kube-api-access-s9krq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:48:56 crc kubenswrapper[4795]: I0219 21:48:56.930321 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01859e80-9d51-4db2-8a48-9ad45d901f16-config-data" (OuterVolumeSpecName: "config-data") pod "01859e80-9d51-4db2-8a48-9ad45d901f16" (UID: "01859e80-9d51-4db2-8a48-9ad45d901f16"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:48:56 crc kubenswrapper[4795]: I0219 21:48:56.942367 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01859e80-9d51-4db2-8a48-9ad45d901f16-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "01859e80-9d51-4db2-8a48-9ad45d901f16" (UID: "01859e80-9d51-4db2-8a48-9ad45d901f16"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:48:56 crc kubenswrapper[4795]: I0219 21:48:56.997586 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01859e80-9d51-4db2-8a48-9ad45d901f16-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:56 crc kubenswrapper[4795]: I0219 21:48:56.997625 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9krq\" (UniqueName: \"kubernetes.io/projected/01859e80-9d51-4db2-8a48-9ad45d901f16-kube-api-access-s9krq\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:56 crc kubenswrapper[4795]: I0219 21:48:56.997635 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01859e80-9d51-4db2-8a48-9ad45d901f16-logs\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:56 crc kubenswrapper[4795]: I0219 21:48:56.997644 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01859e80-9d51-4db2-8a48-9ad45d901f16-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.281219 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5","Type":"ContainerStarted","Data":"d031e57a2a8228a96f6d8c7aec2dddf131a18030a5301dfcd9286e4c94fa3f52"} Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.283242 4795 generic.go:334] "Generic (PLEG): container finished" podID="01859e80-9d51-4db2-8a48-9ad45d901f16" containerID="a08e7a43bb7eb9940934542dc40899cae39a62cfccafdc60ea968829591c4b59" exitCode=0 Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.283397 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"01859e80-9d51-4db2-8a48-9ad45d901f16","Type":"ContainerDied","Data":"a08e7a43bb7eb9940934542dc40899cae39a62cfccafdc60ea968829591c4b59"} Feb 19 21:48:57 crc 
kubenswrapper[4795]: I0219 21:48:57.283402 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.283514 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"01859e80-9d51-4db2-8a48-9ad45d901f16","Type":"ContainerDied","Data":"4d23ebd426731022b2890eb99d72e3c34bf0eabb128346170d4b9e6c3457a311"} Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.283605 4795 scope.go:117] "RemoveContainer" containerID="a08e7a43bb7eb9940934542dc40899cae39a62cfccafdc60ea968829591c4b59" Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.303205 4795 scope.go:117] "RemoveContainer" containerID="cd6c91a8bbdff67fb129ad494141346fb0c9ffacbd675189463e7bb2a6d75d94" Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.320956 4795 scope.go:117] "RemoveContainer" containerID="a08e7a43bb7eb9940934542dc40899cae39a62cfccafdc60ea968829591c4b59" Feb 19 21:48:57 crc kubenswrapper[4795]: E0219 21:48:57.321433 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a08e7a43bb7eb9940934542dc40899cae39a62cfccafdc60ea968829591c4b59\": container with ID starting with a08e7a43bb7eb9940934542dc40899cae39a62cfccafdc60ea968829591c4b59 not found: ID does not exist" containerID="a08e7a43bb7eb9940934542dc40899cae39a62cfccafdc60ea968829591c4b59" Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.321465 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a08e7a43bb7eb9940934542dc40899cae39a62cfccafdc60ea968829591c4b59"} err="failed to get container status \"a08e7a43bb7eb9940934542dc40899cae39a62cfccafdc60ea968829591c4b59\": rpc error: code = NotFound desc = could not find container \"a08e7a43bb7eb9940934542dc40899cae39a62cfccafdc60ea968829591c4b59\": container with ID starting with 
a08e7a43bb7eb9940934542dc40899cae39a62cfccafdc60ea968829591c4b59 not found: ID does not exist" Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.321483 4795 scope.go:117] "RemoveContainer" containerID="cd6c91a8bbdff67fb129ad494141346fb0c9ffacbd675189463e7bb2a6d75d94" Feb 19 21:48:57 crc kubenswrapper[4795]: E0219 21:48:57.321991 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd6c91a8bbdff67fb129ad494141346fb0c9ffacbd675189463e7bb2a6d75d94\": container with ID starting with cd6c91a8bbdff67fb129ad494141346fb0c9ffacbd675189463e7bb2a6d75d94 not found: ID does not exist" containerID="cd6c91a8bbdff67fb129ad494141346fb0c9ffacbd675189463e7bb2a6d75d94" Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.322012 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd6c91a8bbdff67fb129ad494141346fb0c9ffacbd675189463e7bb2a6d75d94"} err="failed to get container status \"cd6c91a8bbdff67fb129ad494141346fb0c9ffacbd675189463e7bb2a6d75d94\": rpc error: code = NotFound desc = could not find container \"cd6c91a8bbdff67fb129ad494141346fb0c9ffacbd675189463e7bb2a6d75d94\": container with ID starting with cd6c91a8bbdff67fb129ad494141346fb0c9ffacbd675189463e7bb2a6d75d94 not found: ID does not exist" Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.325106 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.335424 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.345468 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 19 21:48:57 crc kubenswrapper[4795]: E0219 21:48:57.345822 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01859e80-9d51-4db2-8a48-9ad45d901f16" containerName="nova-api-log" Feb 19 21:48:57 crc 
kubenswrapper[4795]: I0219 21:48:57.345840 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="01859e80-9d51-4db2-8a48-9ad45d901f16" containerName="nova-api-log" Feb 19 21:48:57 crc kubenswrapper[4795]: E0219 21:48:57.345866 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01859e80-9d51-4db2-8a48-9ad45d901f16" containerName="nova-api-api" Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.345873 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="01859e80-9d51-4db2-8a48-9ad45d901f16" containerName="nova-api-api" Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.346131 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="01859e80-9d51-4db2-8a48-9ad45d901f16" containerName="nova-api-api" Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.346155 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="01859e80-9d51-4db2-8a48-9ad45d901f16" containerName="nova-api-log" Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.349929 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.352199 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.352353 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.353937 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.356917 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.504782 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ecdad61-afa9-43fa-9321-1b58d9abf074-logs\") pod \"nova-api-0\" (UID: \"7ecdad61-afa9-43fa-9321-1b58d9abf074\") " pod="openstack/nova-api-0" Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.505285 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ecdad61-afa9-43fa-9321-1b58d9abf074-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7ecdad61-afa9-43fa-9321-1b58d9abf074\") " pod="openstack/nova-api-0" Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.505517 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ecdad61-afa9-43fa-9321-1b58d9abf074-internal-tls-certs\") pod \"nova-api-0\" (UID: \"7ecdad61-afa9-43fa-9321-1b58d9abf074\") " pod="openstack/nova-api-0" Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.505589 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7ecdad61-afa9-43fa-9321-1b58d9abf074-public-tls-certs\") pod \"nova-api-0\" (UID: \"7ecdad61-afa9-43fa-9321-1b58d9abf074\") " pod="openstack/nova-api-0" Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.505653 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhjf8\" (UniqueName: \"kubernetes.io/projected/7ecdad61-afa9-43fa-9321-1b58d9abf074-kube-api-access-qhjf8\") pod \"nova-api-0\" (UID: \"7ecdad61-afa9-43fa-9321-1b58d9abf074\") " pod="openstack/nova-api-0" Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.505698 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ecdad61-afa9-43fa-9321-1b58d9abf074-config-data\") pod \"nova-api-0\" (UID: \"7ecdad61-afa9-43fa-9321-1b58d9abf074\") " pod="openstack/nova-api-0" Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.537308 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01859e80-9d51-4db2-8a48-9ad45d901f16" path="/var/lib/kubelet/pods/01859e80-9d51-4db2-8a48-9ad45d901f16/volumes" Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.607029 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ecdad61-afa9-43fa-9321-1b58d9abf074-internal-tls-certs\") pod \"nova-api-0\" (UID: \"7ecdad61-afa9-43fa-9321-1b58d9abf074\") " pod="openstack/nova-api-0" Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.607070 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ecdad61-afa9-43fa-9321-1b58d9abf074-public-tls-certs\") pod \"nova-api-0\" (UID: \"7ecdad61-afa9-43fa-9321-1b58d9abf074\") " pod="openstack/nova-api-0" Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.607106 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-qhjf8\" (UniqueName: \"kubernetes.io/projected/7ecdad61-afa9-43fa-9321-1b58d9abf074-kube-api-access-qhjf8\") pod \"nova-api-0\" (UID: \"7ecdad61-afa9-43fa-9321-1b58d9abf074\") " pod="openstack/nova-api-0" Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.607132 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ecdad61-afa9-43fa-9321-1b58d9abf074-config-data\") pod \"nova-api-0\" (UID: \"7ecdad61-afa9-43fa-9321-1b58d9abf074\") " pod="openstack/nova-api-0" Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.607208 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ecdad61-afa9-43fa-9321-1b58d9abf074-logs\") pod \"nova-api-0\" (UID: \"7ecdad61-afa9-43fa-9321-1b58d9abf074\") " pod="openstack/nova-api-0" Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.607295 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ecdad61-afa9-43fa-9321-1b58d9abf074-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7ecdad61-afa9-43fa-9321-1b58d9abf074\") " pod="openstack/nova-api-0" Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.610337 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ecdad61-afa9-43fa-9321-1b58d9abf074-public-tls-certs\") pod \"nova-api-0\" (UID: \"7ecdad61-afa9-43fa-9321-1b58d9abf074\") " pod="openstack/nova-api-0" Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.610931 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ecdad61-afa9-43fa-9321-1b58d9abf074-logs\") pod \"nova-api-0\" (UID: \"7ecdad61-afa9-43fa-9321-1b58d9abf074\") " pod="openstack/nova-api-0" Feb 19 21:48:57 crc 
kubenswrapper[4795]: I0219 21:48:57.611267 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ecdad61-afa9-43fa-9321-1b58d9abf074-internal-tls-certs\") pod \"nova-api-0\" (UID: \"7ecdad61-afa9-43fa-9321-1b58d9abf074\") " pod="openstack/nova-api-0" Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.613285 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ecdad61-afa9-43fa-9321-1b58d9abf074-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7ecdad61-afa9-43fa-9321-1b58d9abf074\") " pod="openstack/nova-api-0" Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.615709 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ecdad61-afa9-43fa-9321-1b58d9abf074-config-data\") pod \"nova-api-0\" (UID: \"7ecdad61-afa9-43fa-9321-1b58d9abf074\") " pod="openstack/nova-api-0" Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.629117 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhjf8\" (UniqueName: \"kubernetes.io/projected/7ecdad61-afa9-43fa-9321-1b58d9abf074-kube-api-access-qhjf8\") pod \"nova-api-0\" (UID: \"7ecdad61-afa9-43fa-9321-1b58d9abf074\") " pod="openstack/nova-api-0" Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.670061 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.904441 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Feb 19 21:48:57 crc kubenswrapper[4795]: I0219 21:48:57.921422 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Feb 19 21:48:58 crc kubenswrapper[4795]: I0219 21:48:58.132097 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 19 21:48:58 crc kubenswrapper[4795]: W0219 21:48:58.132788 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ecdad61_afa9_43fa_9321_1b58d9abf074.slice/crio-ffda1e0cd8af360a3425530f0623937f6ca5f730c7420ebf643f335d5eaf911c WatchSource:0}: Error finding container ffda1e0cd8af360a3425530f0623937f6ca5f730c7420ebf643f335d5eaf911c: Status 404 returned error can't find the container with id ffda1e0cd8af360a3425530f0623937f6ca5f730c7420ebf643f335d5eaf911c Feb 19 21:48:58 crc kubenswrapper[4795]: I0219 21:48:58.315075 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7ecdad61-afa9-43fa-9321-1b58d9abf074","Type":"ContainerStarted","Data":"ffda1e0cd8af360a3425530f0623937f6ca5f730c7420ebf643f335d5eaf911c"} Feb 19 21:48:58 crc kubenswrapper[4795]: I0219 21:48:58.324719 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5","Type":"ContainerStarted","Data":"7f760caa528ce072e5792c18e9f808627474b8c013128a1c46f50177df48e444"} Feb 19 21:48:58 crc kubenswrapper[4795]: I0219 21:48:58.496196 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Feb 19 21:48:58 crc kubenswrapper[4795]: I0219 21:48:58.729591 4795 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-cell1-cell-mapping-w8x5b"] Feb 19 21:48:58 crc kubenswrapper[4795]: I0219 21:48:58.748572 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-w8x5b" Feb 19 21:48:58 crc kubenswrapper[4795]: I0219 21:48:58.749907 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-w8x5b"] Feb 19 21:48:58 crc kubenswrapper[4795]: I0219 21:48:58.755754 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Feb 19 21:48:58 crc kubenswrapper[4795]: I0219 21:48:58.756032 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Feb 19 21:48:58 crc kubenswrapper[4795]: I0219 21:48:58.841464 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1d3a710-addc-4f86-b77c-0d05dc98695f-scripts\") pod \"nova-cell1-cell-mapping-w8x5b\" (UID: \"d1d3a710-addc-4f86-b77c-0d05dc98695f\") " pod="openstack/nova-cell1-cell-mapping-w8x5b" Feb 19 21:48:58 crc kubenswrapper[4795]: I0219 21:48:58.841591 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1d3a710-addc-4f86-b77c-0d05dc98695f-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-w8x5b\" (UID: \"d1d3a710-addc-4f86-b77c-0d05dc98695f\") " pod="openstack/nova-cell1-cell-mapping-w8x5b" Feb 19 21:48:58 crc kubenswrapper[4795]: I0219 21:48:58.841626 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spfz4\" (UniqueName: \"kubernetes.io/projected/d1d3a710-addc-4f86-b77c-0d05dc98695f-kube-api-access-spfz4\") pod \"nova-cell1-cell-mapping-w8x5b\" (UID: \"d1d3a710-addc-4f86-b77c-0d05dc98695f\") " pod="openstack/nova-cell1-cell-mapping-w8x5b" Feb 19 21:48:58 crc 
kubenswrapper[4795]: I0219 21:48:58.841771 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1d3a710-addc-4f86-b77c-0d05dc98695f-config-data\") pod \"nova-cell1-cell-mapping-w8x5b\" (UID: \"d1d3a710-addc-4f86-b77c-0d05dc98695f\") " pod="openstack/nova-cell1-cell-mapping-w8x5b" Feb 19 21:48:58 crc kubenswrapper[4795]: I0219 21:48:58.943896 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1d3a710-addc-4f86-b77c-0d05dc98695f-scripts\") pod \"nova-cell1-cell-mapping-w8x5b\" (UID: \"d1d3a710-addc-4f86-b77c-0d05dc98695f\") " pod="openstack/nova-cell1-cell-mapping-w8x5b" Feb 19 21:48:58 crc kubenswrapper[4795]: I0219 21:48:58.943992 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1d3a710-addc-4f86-b77c-0d05dc98695f-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-w8x5b\" (UID: \"d1d3a710-addc-4f86-b77c-0d05dc98695f\") " pod="openstack/nova-cell1-cell-mapping-w8x5b" Feb 19 21:48:58 crc kubenswrapper[4795]: I0219 21:48:58.944027 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spfz4\" (UniqueName: \"kubernetes.io/projected/d1d3a710-addc-4f86-b77c-0d05dc98695f-kube-api-access-spfz4\") pod \"nova-cell1-cell-mapping-w8x5b\" (UID: \"d1d3a710-addc-4f86-b77c-0d05dc98695f\") " pod="openstack/nova-cell1-cell-mapping-w8x5b" Feb 19 21:48:58 crc kubenswrapper[4795]: I0219 21:48:58.944065 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1d3a710-addc-4f86-b77c-0d05dc98695f-config-data\") pod \"nova-cell1-cell-mapping-w8x5b\" (UID: \"d1d3a710-addc-4f86-b77c-0d05dc98695f\") " pod="openstack/nova-cell1-cell-mapping-w8x5b" Feb 19 21:48:58 crc kubenswrapper[4795]: I0219 
21:48:58.949092 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1d3a710-addc-4f86-b77c-0d05dc98695f-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-w8x5b\" (UID: \"d1d3a710-addc-4f86-b77c-0d05dc98695f\") " pod="openstack/nova-cell1-cell-mapping-w8x5b" Feb 19 21:48:58 crc kubenswrapper[4795]: I0219 21:48:58.951448 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1d3a710-addc-4f86-b77c-0d05dc98695f-scripts\") pod \"nova-cell1-cell-mapping-w8x5b\" (UID: \"d1d3a710-addc-4f86-b77c-0d05dc98695f\") " pod="openstack/nova-cell1-cell-mapping-w8x5b" Feb 19 21:48:58 crc kubenswrapper[4795]: I0219 21:48:58.952763 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1d3a710-addc-4f86-b77c-0d05dc98695f-config-data\") pod \"nova-cell1-cell-mapping-w8x5b\" (UID: \"d1d3a710-addc-4f86-b77c-0d05dc98695f\") " pod="openstack/nova-cell1-cell-mapping-w8x5b" Feb 19 21:48:58 crc kubenswrapper[4795]: I0219 21:48:58.963899 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spfz4\" (UniqueName: \"kubernetes.io/projected/d1d3a710-addc-4f86-b77c-0d05dc98695f-kube-api-access-spfz4\") pod \"nova-cell1-cell-mapping-w8x5b\" (UID: \"d1d3a710-addc-4f86-b77c-0d05dc98695f\") " pod="openstack/nova-cell1-cell-mapping-w8x5b" Feb 19 21:48:59 crc kubenswrapper[4795]: I0219 21:48:59.073704 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-w8x5b" Feb 19 21:48:59 crc kubenswrapper[4795]: I0219 21:48:59.347931 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7ecdad61-afa9-43fa-9321-1b58d9abf074","Type":"ContainerStarted","Data":"5d562a786717cf3ad172d6766f5c7acfd5e591e4923212ea6cb2bbdaecf8c27c"} Feb 19 21:48:59 crc kubenswrapper[4795]: I0219 21:48:59.348278 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7ecdad61-afa9-43fa-9321-1b58d9abf074","Type":"ContainerStarted","Data":"7c6fddf4caa897f831f79a6706bca2d6370ef148f42b02873b2746d104b46ecc"} Feb 19 21:48:59 crc kubenswrapper[4795]: I0219 21:48:59.369632 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.369617664 podStartE2EDuration="2.369617664s" podCreationTimestamp="2026-02-19 21:48:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:48:59.369305275 +0000 UTC m=+1250.561823149" watchObservedRunningTime="2026-02-19 21:48:59.369617664 +0000 UTC m=+1250.562135518" Feb 19 21:48:59 crc kubenswrapper[4795]: I0219 21:48:59.505638 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-w8x5b"] Feb 19 21:48:59 crc kubenswrapper[4795]: W0219 21:48:59.509542 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1d3a710_addc_4f86_b77c_0d05dc98695f.slice/crio-51b9ea4c5a1378436eccbd6511b12e6d246bc54dae9f0ee487b98ec7869d4de6 WatchSource:0}: Error finding container 51b9ea4c5a1378436eccbd6511b12e6d246bc54dae9f0ee487b98ec7869d4de6: Status 404 returned error can't find the container with id 51b9ea4c5a1378436eccbd6511b12e6d246bc54dae9f0ee487b98ec7869d4de6 Feb 19 21:49:00 crc kubenswrapper[4795]: I0219 21:49:00.356574 4795 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-w8x5b" event={"ID":"d1d3a710-addc-4f86-b77c-0d05dc98695f","Type":"ContainerStarted","Data":"2f19263c7946bfa6317a4079eb988e89329f3723a790f4116428242859b8575d"} Feb 19 21:49:00 crc kubenswrapper[4795]: I0219 21:49:00.357827 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-w8x5b" event={"ID":"d1d3a710-addc-4f86-b77c-0d05dc98695f","Type":"ContainerStarted","Data":"51b9ea4c5a1378436eccbd6511b12e6d246bc54dae9f0ee487b98ec7869d4de6"} Feb 19 21:49:00 crc kubenswrapper[4795]: I0219 21:49:00.361594 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5" containerName="ceilometer-central-agent" containerID="cri-o://323b724b3b048a0396352d1a1fdda411a293b7403fd5ff6b7c5a586be082ee66" gracePeriod=30 Feb 19 21:49:00 crc kubenswrapper[4795]: I0219 21:49:00.361855 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5" containerName="proxy-httpd" containerID="cri-o://8ac5d19e61a3b12f19dd54a9a359ff3a6b479d8b61e7d3ae285bef106f9f3f3a" gracePeriod=30 Feb 19 21:49:00 crc kubenswrapper[4795]: I0219 21:49:00.361843 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5","Type":"ContainerStarted","Data":"8ac5d19e61a3b12f19dd54a9a359ff3a6b479d8b61e7d3ae285bef106f9f3f3a"} Feb 19 21:49:00 crc kubenswrapper[4795]: I0219 21:49:00.362487 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 19 21:49:00 crc kubenswrapper[4795]: I0219 21:49:00.361930 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5" containerName="ceilometer-notification-agent" 
containerID="cri-o://d031e57a2a8228a96f6d8c7aec2dddf131a18030a5301dfcd9286e4c94fa3f52" gracePeriod=30 Feb 19 21:49:00 crc kubenswrapper[4795]: I0219 21:49:00.361883 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5" containerName="sg-core" containerID="cri-o://7f760caa528ce072e5792c18e9f808627474b8c013128a1c46f50177df48e444" gracePeriod=30 Feb 19 21:49:00 crc kubenswrapper[4795]: I0219 21:49:00.389229 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-w8x5b" podStartSLOduration=2.389204869 podStartE2EDuration="2.389204869s" podCreationTimestamp="2026-02-19 21:48:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:49:00.375136953 +0000 UTC m=+1251.567654817" watchObservedRunningTime="2026-02-19 21:49:00.389204869 +0000 UTC m=+1251.581722743" Feb 19 21:49:00 crc kubenswrapper[4795]: I0219 21:49:00.412333 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.470677727 podStartE2EDuration="6.412309648s" podCreationTimestamp="2026-02-19 21:48:54 +0000 UTC" firstStartedPulling="2026-02-19 21:48:55.35525774 +0000 UTC m=+1246.547775604" lastFinishedPulling="2026-02-19 21:48:59.296889671 +0000 UTC m=+1250.489407525" observedRunningTime="2026-02-19 21:49:00.399327443 +0000 UTC m=+1251.591845307" watchObservedRunningTime="2026-02-19 21:49:00.412309648 +0000 UTC m=+1251.604827532" Feb 19 21:49:00 crc kubenswrapper[4795]: I0219 21:49:00.733210 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7677694455-vk4hv" Feb 19 21:49:00 crc kubenswrapper[4795]: I0219 21:49:00.788933 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75ddbf7c75-f5qcb"] Feb 19 21:49:00 crc 
kubenswrapper[4795]: I0219 21:49:00.789215 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-75ddbf7c75-f5qcb" podUID="59d807ee-6555-4c2f-8598-9f264d5a95f9" containerName="dnsmasq-dns" containerID="cri-o://fd261f043799d7df60456eb5576e328975c0d9086db0abbd02ad64c7bf8efe3f" gracePeriod=10 Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.310567 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75ddbf7c75-f5qcb" Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.371391 4795 generic.go:334] "Generic (PLEG): container finished" podID="59d807ee-6555-4c2f-8598-9f264d5a95f9" containerID="fd261f043799d7df60456eb5576e328975c0d9086db0abbd02ad64c7bf8efe3f" exitCode=0 Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.371729 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75ddbf7c75-f5qcb" event={"ID":"59d807ee-6555-4c2f-8598-9f264d5a95f9","Type":"ContainerDied","Data":"fd261f043799d7df60456eb5576e328975c0d9086db0abbd02ad64c7bf8efe3f"} Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.371757 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75ddbf7c75-f5qcb" event={"ID":"59d807ee-6555-4c2f-8598-9f264d5a95f9","Type":"ContainerDied","Data":"919c6d3f60ce379525cac92e740a7e9dd73e2c7e47ddc1efebc10b6306ebfc94"} Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.371773 4795 scope.go:117] "RemoveContainer" containerID="fd261f043799d7df60456eb5576e328975c0d9086db0abbd02ad64c7bf8efe3f" Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.371890 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75ddbf7c75-f5qcb" Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.397077 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/59d807ee-6555-4c2f-8598-9f264d5a95f9-ovsdbserver-sb\") pod \"59d807ee-6555-4c2f-8598-9f264d5a95f9\" (UID: \"59d807ee-6555-4c2f-8598-9f264d5a95f9\") " Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.397132 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/59d807ee-6555-4c2f-8598-9f264d5a95f9-dns-swift-storage-0\") pod \"59d807ee-6555-4c2f-8598-9f264d5a95f9\" (UID: \"59d807ee-6555-4c2f-8598-9f264d5a95f9\") " Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.397219 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zrmm\" (UniqueName: \"kubernetes.io/projected/59d807ee-6555-4c2f-8598-9f264d5a95f9-kube-api-access-5zrmm\") pod \"59d807ee-6555-4c2f-8598-9f264d5a95f9\" (UID: \"59d807ee-6555-4c2f-8598-9f264d5a95f9\") " Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.397257 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/59d807ee-6555-4c2f-8598-9f264d5a95f9-dns-svc\") pod \"59d807ee-6555-4c2f-8598-9f264d5a95f9\" (UID: \"59d807ee-6555-4c2f-8598-9f264d5a95f9\") " Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.397297 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/59d807ee-6555-4c2f-8598-9f264d5a95f9-ovsdbserver-nb\") pod \"59d807ee-6555-4c2f-8598-9f264d5a95f9\" (UID: \"59d807ee-6555-4c2f-8598-9f264d5a95f9\") " Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.397397 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/59d807ee-6555-4c2f-8598-9f264d5a95f9-config\") pod \"59d807ee-6555-4c2f-8598-9f264d5a95f9\" (UID: \"59d807ee-6555-4c2f-8598-9f264d5a95f9\") " Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.399374 4795 generic.go:334] "Generic (PLEG): container finished" podID="01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5" containerID="8ac5d19e61a3b12f19dd54a9a359ff3a6b479d8b61e7d3ae285bef106f9f3f3a" exitCode=0 Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.399430 4795 generic.go:334] "Generic (PLEG): container finished" podID="01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5" containerID="7f760caa528ce072e5792c18e9f808627474b8c013128a1c46f50177df48e444" exitCode=2 Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.399439 4795 generic.go:334] "Generic (PLEG): container finished" podID="01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5" containerID="d031e57a2a8228a96f6d8c7aec2dddf131a18030a5301dfcd9286e4c94fa3f52" exitCode=0 Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.400181 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5","Type":"ContainerDied","Data":"8ac5d19e61a3b12f19dd54a9a359ff3a6b479d8b61e7d3ae285bef106f9f3f3a"} Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.400233 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5","Type":"ContainerDied","Data":"7f760caa528ce072e5792c18e9f808627474b8c013128a1c46f50177df48e444"} Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.400245 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5","Type":"ContainerDied","Data":"d031e57a2a8228a96f6d8c7aec2dddf131a18030a5301dfcd9286e4c94fa3f52"} Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.407337 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/59d807ee-6555-4c2f-8598-9f264d5a95f9-kube-api-access-5zrmm" (OuterVolumeSpecName: "kube-api-access-5zrmm") pod "59d807ee-6555-4c2f-8598-9f264d5a95f9" (UID: "59d807ee-6555-4c2f-8598-9f264d5a95f9"). InnerVolumeSpecName "kube-api-access-5zrmm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.449322 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59d807ee-6555-4c2f-8598-9f264d5a95f9-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "59d807ee-6555-4c2f-8598-9f264d5a95f9" (UID: "59d807ee-6555-4c2f-8598-9f264d5a95f9"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.451710 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59d807ee-6555-4c2f-8598-9f264d5a95f9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "59d807ee-6555-4c2f-8598-9f264d5a95f9" (UID: "59d807ee-6555-4c2f-8598-9f264d5a95f9"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.454917 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59d807ee-6555-4c2f-8598-9f264d5a95f9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "59d807ee-6555-4c2f-8598-9f264d5a95f9" (UID: "59d807ee-6555-4c2f-8598-9f264d5a95f9"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.461944 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59d807ee-6555-4c2f-8598-9f264d5a95f9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "59d807ee-6555-4c2f-8598-9f264d5a95f9" (UID: "59d807ee-6555-4c2f-8598-9f264d5a95f9"). 
InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.474251 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59d807ee-6555-4c2f-8598-9f264d5a95f9-config" (OuterVolumeSpecName: "config") pod "59d807ee-6555-4c2f-8598-9f264d5a95f9" (UID: "59d807ee-6555-4c2f-8598-9f264d5a95f9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.503331 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zrmm\" (UniqueName: \"kubernetes.io/projected/59d807ee-6555-4c2f-8598-9f264d5a95f9-kube-api-access-5zrmm\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.503363 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/59d807ee-6555-4c2f-8598-9f264d5a95f9-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.503375 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/59d807ee-6555-4c2f-8598-9f264d5a95f9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.503384 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59d807ee-6555-4c2f-8598-9f264d5a95f9-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.503393 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/59d807ee-6555-4c2f-8598-9f264d5a95f9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.503403 4795 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/59d807ee-6555-4c2f-8598-9f264d5a95f9-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.595946 4795 scope.go:117] "RemoveContainer" containerID="c4b13b710aaac472f707d3124011993b912c3a0c284fec1b64e6081245b45895" Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.619423 4795 scope.go:117] "RemoveContainer" containerID="fd261f043799d7df60456eb5576e328975c0d9086db0abbd02ad64c7bf8efe3f" Feb 19 21:49:01 crc kubenswrapper[4795]: E0219 21:49:01.625896 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd261f043799d7df60456eb5576e328975c0d9086db0abbd02ad64c7bf8efe3f\": container with ID starting with fd261f043799d7df60456eb5576e328975c0d9086db0abbd02ad64c7bf8efe3f not found: ID does not exist" containerID="fd261f043799d7df60456eb5576e328975c0d9086db0abbd02ad64c7bf8efe3f" Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.625926 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd261f043799d7df60456eb5576e328975c0d9086db0abbd02ad64c7bf8efe3f"} err="failed to get container status \"fd261f043799d7df60456eb5576e328975c0d9086db0abbd02ad64c7bf8efe3f\": rpc error: code = NotFound desc = could not find container \"fd261f043799d7df60456eb5576e328975c0d9086db0abbd02ad64c7bf8efe3f\": container with ID starting with fd261f043799d7df60456eb5576e328975c0d9086db0abbd02ad64c7bf8efe3f not found: ID does not exist" Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.625949 4795 scope.go:117] "RemoveContainer" containerID="c4b13b710aaac472f707d3124011993b912c3a0c284fec1b64e6081245b45895" Feb 19 21:49:01 crc kubenswrapper[4795]: E0219 21:49:01.626443 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4b13b710aaac472f707d3124011993b912c3a0c284fec1b64e6081245b45895\": container with ID starting 
with c4b13b710aaac472f707d3124011993b912c3a0c284fec1b64e6081245b45895 not found: ID does not exist" containerID="c4b13b710aaac472f707d3124011993b912c3a0c284fec1b64e6081245b45895" Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.626524 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4b13b710aaac472f707d3124011993b912c3a0c284fec1b64e6081245b45895"} err="failed to get container status \"c4b13b710aaac472f707d3124011993b912c3a0c284fec1b64e6081245b45895\": rpc error: code = NotFound desc = could not find container \"c4b13b710aaac472f707d3124011993b912c3a0c284fec1b64e6081245b45895\": container with ID starting with c4b13b710aaac472f707d3124011993b912c3a0c284fec1b64e6081245b45895 not found: ID does not exist" Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.683201 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.702285 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75ddbf7c75-f5qcb"] Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.719063 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75ddbf7c75-f5qcb"] Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.821136 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-246kt\" (UniqueName: \"kubernetes.io/projected/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-kube-api-access-246kt\") pod \"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5\" (UID: \"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5\") " Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.821218 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-log-httpd\") pod \"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5\" (UID: \"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5\") " Feb 19 
21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.821294 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-sg-core-conf-yaml\") pod \"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5\" (UID: \"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5\") " Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.821386 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-run-httpd\") pod \"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5\" (UID: \"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5\") " Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.821411 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-config-data\") pod \"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5\" (UID: \"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5\") " Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.821428 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-scripts\") pod \"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5\" (UID: \"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5\") " Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.821458 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-combined-ca-bundle\") pod \"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5\" (UID: \"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5\") " Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.821506 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-ceilometer-tls-certs\") pod \"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5\" (UID: \"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5\") " Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.822025 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5" (UID: "01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.822084 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5" (UID: "01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.827472 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-scripts" (OuterVolumeSpecName: "scripts") pod "01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5" (UID: "01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.827613 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-kube-api-access-246kt" (OuterVolumeSpecName: "kube-api-access-246kt") pod "01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5" (UID: "01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5"). InnerVolumeSpecName "kube-api-access-246kt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.864916 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5" (UID: "01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.892049 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5" (UID: "01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.924034 4795 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.924083 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-246kt\" (UniqueName: \"kubernetes.io/projected/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-kube-api-access-246kt\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.924102 4795 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.924119 4795 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.924134 4795 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.924149 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.938148 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5" (UID: "01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:01 crc kubenswrapper[4795]: I0219 21:49:01.962681 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-config-data" (OuterVolumeSpecName: "config-data") pod "01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5" (UID: "01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.025370 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.025403 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.410497 4795 generic.go:334] "Generic (PLEG): container finished" podID="01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5" containerID="323b724b3b048a0396352d1a1fdda411a293b7403fd5ff6b7c5a586be082ee66" exitCode=0 Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.410556 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.410571 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5","Type":"ContainerDied","Data":"323b724b3b048a0396352d1a1fdda411a293b7403fd5ff6b7c5a586be082ee66"} Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.411703 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5","Type":"ContainerDied","Data":"86abd155d63f2c0bc5b785f25429ac5c2d08a30858af34af122f1895bdc9fbc8"} Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.411722 4795 scope.go:117] "RemoveContainer" containerID="8ac5d19e61a3b12f19dd54a9a359ff3a6b479d8b61e7d3ae285bef106f9f3f3a" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.440332 4795 scope.go:117] "RemoveContainer" containerID="7f760caa528ce072e5792c18e9f808627474b8c013128a1c46f50177df48e444" Feb 19 21:49:02 crc 
kubenswrapper[4795]: I0219 21:49:02.448271 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.458540 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.471598 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:49:02 crc kubenswrapper[4795]: E0219 21:49:02.471980 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5" containerName="proxy-httpd" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.471997 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5" containerName="proxy-httpd" Feb 19 21:49:02 crc kubenswrapper[4795]: E0219 21:49:02.472018 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5" containerName="ceilometer-central-agent" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.472027 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5" containerName="ceilometer-central-agent" Feb 19 21:49:02 crc kubenswrapper[4795]: E0219 21:49:02.472041 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5" containerName="sg-core" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.472046 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5" containerName="sg-core" Feb 19 21:49:02 crc kubenswrapper[4795]: E0219 21:49:02.472057 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59d807ee-6555-4c2f-8598-9f264d5a95f9" containerName="init" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.472064 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="59d807ee-6555-4c2f-8598-9f264d5a95f9" containerName="init" Feb 19 21:49:02 
crc kubenswrapper[4795]: E0219 21:49:02.472079 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59d807ee-6555-4c2f-8598-9f264d5a95f9" containerName="dnsmasq-dns" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.472085 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="59d807ee-6555-4c2f-8598-9f264d5a95f9" containerName="dnsmasq-dns" Feb 19 21:49:02 crc kubenswrapper[4795]: E0219 21:49:02.472103 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5" containerName="ceilometer-notification-agent" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.472109 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5" containerName="ceilometer-notification-agent" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.472278 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="59d807ee-6555-4c2f-8598-9f264d5a95f9" containerName="dnsmasq-dns" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.472287 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5" containerName="proxy-httpd" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.472297 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5" containerName="sg-core" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.472310 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5" containerName="ceilometer-notification-agent" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.472328 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5" containerName="ceilometer-central-agent" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.473945 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.475858 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.477352 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.477535 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.478997 4795 scope.go:117] "RemoveContainer" containerID="d031e57a2a8228a96f6d8c7aec2dddf131a18030a5301dfcd9286e4c94fa3f52" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.500461 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.523994 4795 scope.go:117] "RemoveContainer" containerID="323b724b3b048a0396352d1a1fdda411a293b7403fd5ff6b7c5a586be082ee66" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.541159 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e956453d-551f-44b4-8125-8656b3155402-log-httpd\") pod \"ceilometer-0\" (UID: \"e956453d-551f-44b4-8125-8656b3155402\") " pod="openstack/ceilometer-0" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.541382 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e956453d-551f-44b4-8125-8656b3155402-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e956453d-551f-44b4-8125-8656b3155402\") " pod="openstack/ceilometer-0" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.541415 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" 
(UniqueName: \"kubernetes.io/secret/e956453d-551f-44b4-8125-8656b3155402-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e956453d-551f-44b4-8125-8656b3155402\") " pod="openstack/ceilometer-0" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.541431 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e956453d-551f-44b4-8125-8656b3155402-config-data\") pod \"ceilometer-0\" (UID: \"e956453d-551f-44b4-8125-8656b3155402\") " pod="openstack/ceilometer-0" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.541488 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e956453d-551f-44b4-8125-8656b3155402-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e956453d-551f-44b4-8125-8656b3155402\") " pod="openstack/ceilometer-0" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.541566 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-829s7\" (UniqueName: \"kubernetes.io/projected/e956453d-551f-44b4-8125-8656b3155402-kube-api-access-829s7\") pod \"ceilometer-0\" (UID: \"e956453d-551f-44b4-8125-8656b3155402\") " pod="openstack/ceilometer-0" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.541587 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e956453d-551f-44b4-8125-8656b3155402-run-httpd\") pod \"ceilometer-0\" (UID: \"e956453d-551f-44b4-8125-8656b3155402\") " pod="openstack/ceilometer-0" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.541624 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e956453d-551f-44b4-8125-8656b3155402-scripts\") pod \"ceilometer-0\" (UID: 
\"e956453d-551f-44b4-8125-8656b3155402\") " pod="openstack/ceilometer-0" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.578396 4795 scope.go:117] "RemoveContainer" containerID="8ac5d19e61a3b12f19dd54a9a359ff3a6b479d8b61e7d3ae285bef106f9f3f3a" Feb 19 21:49:02 crc kubenswrapper[4795]: E0219 21:49:02.579580 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ac5d19e61a3b12f19dd54a9a359ff3a6b479d8b61e7d3ae285bef106f9f3f3a\": container with ID starting with 8ac5d19e61a3b12f19dd54a9a359ff3a6b479d8b61e7d3ae285bef106f9f3f3a not found: ID does not exist" containerID="8ac5d19e61a3b12f19dd54a9a359ff3a6b479d8b61e7d3ae285bef106f9f3f3a" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.579628 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ac5d19e61a3b12f19dd54a9a359ff3a6b479d8b61e7d3ae285bef106f9f3f3a"} err="failed to get container status \"8ac5d19e61a3b12f19dd54a9a359ff3a6b479d8b61e7d3ae285bef106f9f3f3a\": rpc error: code = NotFound desc = could not find container \"8ac5d19e61a3b12f19dd54a9a359ff3a6b479d8b61e7d3ae285bef106f9f3f3a\": container with ID starting with 8ac5d19e61a3b12f19dd54a9a359ff3a6b479d8b61e7d3ae285bef106f9f3f3a not found: ID does not exist" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.579650 4795 scope.go:117] "RemoveContainer" containerID="7f760caa528ce072e5792c18e9f808627474b8c013128a1c46f50177df48e444" Feb 19 21:49:02 crc kubenswrapper[4795]: E0219 21:49:02.579936 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f760caa528ce072e5792c18e9f808627474b8c013128a1c46f50177df48e444\": container with ID starting with 7f760caa528ce072e5792c18e9f808627474b8c013128a1c46f50177df48e444 not found: ID does not exist" containerID="7f760caa528ce072e5792c18e9f808627474b8c013128a1c46f50177df48e444" Feb 19 21:49:02 crc kubenswrapper[4795]: 
I0219 21:49:02.580015 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f760caa528ce072e5792c18e9f808627474b8c013128a1c46f50177df48e444"} err="failed to get container status \"7f760caa528ce072e5792c18e9f808627474b8c013128a1c46f50177df48e444\": rpc error: code = NotFound desc = could not find container \"7f760caa528ce072e5792c18e9f808627474b8c013128a1c46f50177df48e444\": container with ID starting with 7f760caa528ce072e5792c18e9f808627474b8c013128a1c46f50177df48e444 not found: ID does not exist" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.580091 4795 scope.go:117] "RemoveContainer" containerID="d031e57a2a8228a96f6d8c7aec2dddf131a18030a5301dfcd9286e4c94fa3f52" Feb 19 21:49:02 crc kubenswrapper[4795]: E0219 21:49:02.580327 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d031e57a2a8228a96f6d8c7aec2dddf131a18030a5301dfcd9286e4c94fa3f52\": container with ID starting with d031e57a2a8228a96f6d8c7aec2dddf131a18030a5301dfcd9286e4c94fa3f52 not found: ID does not exist" containerID="d031e57a2a8228a96f6d8c7aec2dddf131a18030a5301dfcd9286e4c94fa3f52" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.580351 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d031e57a2a8228a96f6d8c7aec2dddf131a18030a5301dfcd9286e4c94fa3f52"} err="failed to get container status \"d031e57a2a8228a96f6d8c7aec2dddf131a18030a5301dfcd9286e4c94fa3f52\": rpc error: code = NotFound desc = could not find container \"d031e57a2a8228a96f6d8c7aec2dddf131a18030a5301dfcd9286e4c94fa3f52\": container with ID starting with d031e57a2a8228a96f6d8c7aec2dddf131a18030a5301dfcd9286e4c94fa3f52 not found: ID does not exist" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.580365 4795 scope.go:117] "RemoveContainer" containerID="323b724b3b048a0396352d1a1fdda411a293b7403fd5ff6b7c5a586be082ee66" Feb 19 21:49:02 crc 
kubenswrapper[4795]: E0219 21:49:02.581339 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"323b724b3b048a0396352d1a1fdda411a293b7403fd5ff6b7c5a586be082ee66\": container with ID starting with 323b724b3b048a0396352d1a1fdda411a293b7403fd5ff6b7c5a586be082ee66 not found: ID does not exist" containerID="323b724b3b048a0396352d1a1fdda411a293b7403fd5ff6b7c5a586be082ee66" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.581365 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"323b724b3b048a0396352d1a1fdda411a293b7403fd5ff6b7c5a586be082ee66"} err="failed to get container status \"323b724b3b048a0396352d1a1fdda411a293b7403fd5ff6b7c5a586be082ee66\": rpc error: code = NotFound desc = could not find container \"323b724b3b048a0396352d1a1fdda411a293b7403fd5ff6b7c5a586be082ee66\": container with ID starting with 323b724b3b048a0396352d1a1fdda411a293b7403fd5ff6b7c5a586be082ee66 not found: ID does not exist" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.643942 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-829s7\" (UniqueName: \"kubernetes.io/projected/e956453d-551f-44b4-8125-8656b3155402-kube-api-access-829s7\") pod \"ceilometer-0\" (UID: \"e956453d-551f-44b4-8125-8656b3155402\") " pod="openstack/ceilometer-0" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.643993 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e956453d-551f-44b4-8125-8656b3155402-run-httpd\") pod \"ceilometer-0\" (UID: \"e956453d-551f-44b4-8125-8656b3155402\") " pod="openstack/ceilometer-0" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.644043 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e956453d-551f-44b4-8125-8656b3155402-scripts\") pod 
\"ceilometer-0\" (UID: \"e956453d-551f-44b4-8125-8656b3155402\") " pod="openstack/ceilometer-0" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.644095 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e956453d-551f-44b4-8125-8656b3155402-log-httpd\") pod \"ceilometer-0\" (UID: \"e956453d-551f-44b4-8125-8656b3155402\") " pod="openstack/ceilometer-0" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.644158 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e956453d-551f-44b4-8125-8656b3155402-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e956453d-551f-44b4-8125-8656b3155402\") " pod="openstack/ceilometer-0" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.644216 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e956453d-551f-44b4-8125-8656b3155402-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e956453d-551f-44b4-8125-8656b3155402\") " pod="openstack/ceilometer-0" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.644239 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e956453d-551f-44b4-8125-8656b3155402-config-data\") pod \"ceilometer-0\" (UID: \"e956453d-551f-44b4-8125-8656b3155402\") " pod="openstack/ceilometer-0" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.644314 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e956453d-551f-44b4-8125-8656b3155402-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e956453d-551f-44b4-8125-8656b3155402\") " pod="openstack/ceilometer-0" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.646035 4795 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e956453d-551f-44b4-8125-8656b3155402-log-httpd\") pod \"ceilometer-0\" (UID: \"e956453d-551f-44b4-8125-8656b3155402\") " pod="openstack/ceilometer-0" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.647539 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e956453d-551f-44b4-8125-8656b3155402-run-httpd\") pod \"ceilometer-0\" (UID: \"e956453d-551f-44b4-8125-8656b3155402\") " pod="openstack/ceilometer-0" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.649536 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e956453d-551f-44b4-8125-8656b3155402-scripts\") pod \"ceilometer-0\" (UID: \"e956453d-551f-44b4-8125-8656b3155402\") " pod="openstack/ceilometer-0" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.650414 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e956453d-551f-44b4-8125-8656b3155402-config-data\") pod \"ceilometer-0\" (UID: \"e956453d-551f-44b4-8125-8656b3155402\") " pod="openstack/ceilometer-0" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.652520 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e956453d-551f-44b4-8125-8656b3155402-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e956453d-551f-44b4-8125-8656b3155402\") " pod="openstack/ceilometer-0" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.654798 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e956453d-551f-44b4-8125-8656b3155402-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e956453d-551f-44b4-8125-8656b3155402\") " pod="openstack/ceilometer-0" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 
21:49:02.660177 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e956453d-551f-44b4-8125-8656b3155402-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e956453d-551f-44b4-8125-8656b3155402\") " pod="openstack/ceilometer-0" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.662305 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-829s7\" (UniqueName: \"kubernetes.io/projected/e956453d-551f-44b4-8125-8656b3155402-kube-api-access-829s7\") pod \"ceilometer-0\" (UID: \"e956453d-551f-44b4-8125-8656b3155402\") " pod="openstack/ceilometer-0" Feb 19 21:49:02 crc kubenswrapper[4795]: I0219 21:49:02.797798 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:49:03 crc kubenswrapper[4795]: I0219 21:49:03.222695 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:49:03 crc kubenswrapper[4795]: I0219 21:49:03.425493 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e956453d-551f-44b4-8125-8656b3155402","Type":"ContainerStarted","Data":"9d09fb4d826d8602127fabff658a8440e51f38b0c8a942f510e29c6808527ef7"} Feb 19 21:49:03 crc kubenswrapper[4795]: I0219 21:49:03.534925 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5" path="/var/lib/kubelet/pods/01d2bcc0-aacc-413f-bc5e-36f3aa7a4ed5/volumes" Feb 19 21:49:03 crc kubenswrapper[4795]: I0219 21:49:03.536271 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59d807ee-6555-4c2f-8598-9f264d5a95f9" path="/var/lib/kubelet/pods/59d807ee-6555-4c2f-8598-9f264d5a95f9/volumes" Feb 19 21:49:04 crc kubenswrapper[4795]: I0219 21:49:04.436537 4795 generic.go:334] "Generic (PLEG): container finished" podID="d1d3a710-addc-4f86-b77c-0d05dc98695f" 
containerID="2f19263c7946bfa6317a4079eb988e89329f3723a790f4116428242859b8575d" exitCode=0 Feb 19 21:49:04 crc kubenswrapper[4795]: I0219 21:49:04.436662 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-w8x5b" event={"ID":"d1d3a710-addc-4f86-b77c-0d05dc98695f","Type":"ContainerDied","Data":"2f19263c7946bfa6317a4079eb988e89329f3723a790f4116428242859b8575d"} Feb 19 21:49:04 crc kubenswrapper[4795]: I0219 21:49:04.441396 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e956453d-551f-44b4-8125-8656b3155402","Type":"ContainerStarted","Data":"a502298e81b556f379f6fb3a9a80c1cf56ac3cccd10f1bd463a7436d04a56552"} Feb 19 21:49:05 crc kubenswrapper[4795]: I0219 21:49:05.458065 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e956453d-551f-44b4-8125-8656b3155402","Type":"ContainerStarted","Data":"98c35a6c89e932ad9cd3cce6a6d12a0653f350e3b0c1cefa51b67ce7752f0734"} Feb 19 21:49:05 crc kubenswrapper[4795]: I0219 21:49:05.813124 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-w8x5b" Feb 19 21:49:05 crc kubenswrapper[4795]: I0219 21:49:05.907999 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1d3a710-addc-4f86-b77c-0d05dc98695f-combined-ca-bundle\") pod \"d1d3a710-addc-4f86-b77c-0d05dc98695f\" (UID: \"d1d3a710-addc-4f86-b77c-0d05dc98695f\") " Feb 19 21:49:05 crc kubenswrapper[4795]: I0219 21:49:05.908195 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1d3a710-addc-4f86-b77c-0d05dc98695f-scripts\") pod \"d1d3a710-addc-4f86-b77c-0d05dc98695f\" (UID: \"d1d3a710-addc-4f86-b77c-0d05dc98695f\") " Feb 19 21:49:05 crc kubenswrapper[4795]: I0219 21:49:05.908261 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-spfz4\" (UniqueName: \"kubernetes.io/projected/d1d3a710-addc-4f86-b77c-0d05dc98695f-kube-api-access-spfz4\") pod \"d1d3a710-addc-4f86-b77c-0d05dc98695f\" (UID: \"d1d3a710-addc-4f86-b77c-0d05dc98695f\") " Feb 19 21:49:05 crc kubenswrapper[4795]: I0219 21:49:05.908316 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1d3a710-addc-4f86-b77c-0d05dc98695f-config-data\") pod \"d1d3a710-addc-4f86-b77c-0d05dc98695f\" (UID: \"d1d3a710-addc-4f86-b77c-0d05dc98695f\") " Feb 19 21:49:05 crc kubenswrapper[4795]: I0219 21:49:05.915232 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1d3a710-addc-4f86-b77c-0d05dc98695f-kube-api-access-spfz4" (OuterVolumeSpecName: "kube-api-access-spfz4") pod "d1d3a710-addc-4f86-b77c-0d05dc98695f" (UID: "d1d3a710-addc-4f86-b77c-0d05dc98695f"). InnerVolumeSpecName "kube-api-access-spfz4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:49:05 crc kubenswrapper[4795]: I0219 21:49:05.921298 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1d3a710-addc-4f86-b77c-0d05dc98695f-scripts" (OuterVolumeSpecName: "scripts") pod "d1d3a710-addc-4f86-b77c-0d05dc98695f" (UID: "d1d3a710-addc-4f86-b77c-0d05dc98695f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:05 crc kubenswrapper[4795]: I0219 21:49:05.938019 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1d3a710-addc-4f86-b77c-0d05dc98695f-config-data" (OuterVolumeSpecName: "config-data") pod "d1d3a710-addc-4f86-b77c-0d05dc98695f" (UID: "d1d3a710-addc-4f86-b77c-0d05dc98695f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:05 crc kubenswrapper[4795]: I0219 21:49:05.940416 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1d3a710-addc-4f86-b77c-0d05dc98695f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d1d3a710-addc-4f86-b77c-0d05dc98695f" (UID: "d1d3a710-addc-4f86-b77c-0d05dc98695f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:06 crc kubenswrapper[4795]: I0219 21:49:06.010401 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1d3a710-addc-4f86-b77c-0d05dc98695f-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:06 crc kubenswrapper[4795]: I0219 21:49:06.010444 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-spfz4\" (UniqueName: \"kubernetes.io/projected/d1d3a710-addc-4f86-b77c-0d05dc98695f-kube-api-access-spfz4\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:06 crc kubenswrapper[4795]: I0219 21:49:06.010460 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1d3a710-addc-4f86-b77c-0d05dc98695f-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:06 crc kubenswrapper[4795]: I0219 21:49:06.010472 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1d3a710-addc-4f86-b77c-0d05dc98695f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:06 crc kubenswrapper[4795]: I0219 21:49:06.473111 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e956453d-551f-44b4-8125-8656b3155402","Type":"ContainerStarted","Data":"21486f4b00e940629fe56d483d784412d9ef80990cf1527d454dfa0b33b31ddb"} Feb 19 21:49:06 crc kubenswrapper[4795]: I0219 21:49:06.476254 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-w8x5b" event={"ID":"d1d3a710-addc-4f86-b77c-0d05dc98695f","Type":"ContainerDied","Data":"51b9ea4c5a1378436eccbd6511b12e6d246bc54dae9f0ee487b98ec7869d4de6"} Feb 19 21:49:06 crc kubenswrapper[4795]: I0219 21:49:06.476286 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="51b9ea4c5a1378436eccbd6511b12e6d246bc54dae9f0ee487b98ec7869d4de6" Feb 19 21:49:06 crc kubenswrapper[4795]: I0219 
21:49:06.476420 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-w8x5b" Feb 19 21:49:06 crc kubenswrapper[4795]: I0219 21:49:06.634459 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 19 21:49:06 crc kubenswrapper[4795]: I0219 21:49:06.634726 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="7ecdad61-afa9-43fa-9321-1b58d9abf074" containerName="nova-api-log" containerID="cri-o://7c6fddf4caa897f831f79a6706bca2d6370ef148f42b02873b2746d104b46ecc" gracePeriod=30 Feb 19 21:49:06 crc kubenswrapper[4795]: I0219 21:49:06.634864 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="7ecdad61-afa9-43fa-9321-1b58d9abf074" containerName="nova-api-api" containerID="cri-o://5d562a786717cf3ad172d6766f5c7acfd5e591e4923212ea6cb2bbdaecf8c27c" gracePeriod=30 Feb 19 21:49:06 crc kubenswrapper[4795]: I0219 21:49:06.648042 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 21:49:06 crc kubenswrapper[4795]: I0219 21:49:06.648285 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="c0f49585-0601-424b-9f28-304ae06c9d93" containerName="nova-scheduler-scheduler" containerID="cri-o://a293322e7ade93b6e5ee39b06413dcdb22fcc62b8715355c0228e95452255ef5" gracePeriod=30 Feb 19 21:49:06 crc kubenswrapper[4795]: I0219 21:49:06.695668 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 21:49:06 crc kubenswrapper[4795]: I0219 21:49:06.696059 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8" containerName="nova-metadata-log" containerID="cri-o://fac0b49dac38eb57e08779c1a5fd43f5502fc80faee614fd505dcf5a274e4145" gracePeriod=30 Feb 19 
21:49:06 crc kubenswrapper[4795]: I0219 21:49:06.696241 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8" containerName="nova-metadata-metadata" containerID="cri-o://a60101c154c65170b0cd0d24ec4c3534c546765af68be292c858d3e94c795294" gracePeriod=30 Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.152664 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.272717 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qhjf8\" (UniqueName: \"kubernetes.io/projected/7ecdad61-afa9-43fa-9321-1b58d9abf074-kube-api-access-qhjf8\") pod \"7ecdad61-afa9-43fa-9321-1b58d9abf074\" (UID: \"7ecdad61-afa9-43fa-9321-1b58d9abf074\") " Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.272846 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ecdad61-afa9-43fa-9321-1b58d9abf074-config-data\") pod \"7ecdad61-afa9-43fa-9321-1b58d9abf074\" (UID: \"7ecdad61-afa9-43fa-9321-1b58d9abf074\") " Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.272897 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ecdad61-afa9-43fa-9321-1b58d9abf074-combined-ca-bundle\") pod \"7ecdad61-afa9-43fa-9321-1b58d9abf074\" (UID: \"7ecdad61-afa9-43fa-9321-1b58d9abf074\") " Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.272963 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ecdad61-afa9-43fa-9321-1b58d9abf074-public-tls-certs\") pod \"7ecdad61-afa9-43fa-9321-1b58d9abf074\" (UID: \"7ecdad61-afa9-43fa-9321-1b58d9abf074\") " Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 
21:49:07.273038 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ecdad61-afa9-43fa-9321-1b58d9abf074-logs\") pod \"7ecdad61-afa9-43fa-9321-1b58d9abf074\" (UID: \"7ecdad61-afa9-43fa-9321-1b58d9abf074\") " Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.273063 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ecdad61-afa9-43fa-9321-1b58d9abf074-internal-tls-certs\") pod \"7ecdad61-afa9-43fa-9321-1b58d9abf074\" (UID: \"7ecdad61-afa9-43fa-9321-1b58d9abf074\") " Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.273963 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ecdad61-afa9-43fa-9321-1b58d9abf074-logs" (OuterVolumeSpecName: "logs") pod "7ecdad61-afa9-43fa-9321-1b58d9abf074" (UID: "7ecdad61-afa9-43fa-9321-1b58d9abf074"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.276891 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ecdad61-afa9-43fa-9321-1b58d9abf074-kube-api-access-qhjf8" (OuterVolumeSpecName: "kube-api-access-qhjf8") pod "7ecdad61-afa9-43fa-9321-1b58d9abf074" (UID: "7ecdad61-afa9-43fa-9321-1b58d9abf074"). InnerVolumeSpecName "kube-api-access-qhjf8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.306931 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ecdad61-afa9-43fa-9321-1b58d9abf074-config-data" (OuterVolumeSpecName: "config-data") pod "7ecdad61-afa9-43fa-9321-1b58d9abf074" (UID: "7ecdad61-afa9-43fa-9321-1b58d9abf074"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.313236 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ecdad61-afa9-43fa-9321-1b58d9abf074-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7ecdad61-afa9-43fa-9321-1b58d9abf074" (UID: "7ecdad61-afa9-43fa-9321-1b58d9abf074"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.327476 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ecdad61-afa9-43fa-9321-1b58d9abf074-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "7ecdad61-afa9-43fa-9321-1b58d9abf074" (UID: "7ecdad61-afa9-43fa-9321-1b58d9abf074"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.339859 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ecdad61-afa9-43fa-9321-1b58d9abf074-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7ecdad61-afa9-43fa-9321-1b58d9abf074" (UID: "7ecdad61-afa9-43fa-9321-1b58d9abf074"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.374899 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ecdad61-afa9-43fa-9321-1b58d9abf074-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.374933 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ecdad61-afa9-43fa-9321-1b58d9abf074-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.374946 4795 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ecdad61-afa9-43fa-9321-1b58d9abf074-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.374956 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ecdad61-afa9-43fa-9321-1b58d9abf074-logs\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.374964 4795 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ecdad61-afa9-43fa-9321-1b58d9abf074-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.374972 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qhjf8\" (UniqueName: \"kubernetes.io/projected/7ecdad61-afa9-43fa-9321-1b58d9abf074-kube-api-access-qhjf8\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.486549 4795 generic.go:334] "Generic (PLEG): container finished" podID="7ecdad61-afa9-43fa-9321-1b58d9abf074" containerID="5d562a786717cf3ad172d6766f5c7acfd5e591e4923212ea6cb2bbdaecf8c27c" exitCode=0 Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.486593 4795 generic.go:334] 
"Generic (PLEG): container finished" podID="7ecdad61-afa9-43fa-9321-1b58d9abf074" containerID="7c6fddf4caa897f831f79a6706bca2d6370ef148f42b02873b2746d104b46ecc" exitCode=143 Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.486612 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.486639 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7ecdad61-afa9-43fa-9321-1b58d9abf074","Type":"ContainerDied","Data":"5d562a786717cf3ad172d6766f5c7acfd5e591e4923212ea6cb2bbdaecf8c27c"} Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.487056 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7ecdad61-afa9-43fa-9321-1b58d9abf074","Type":"ContainerDied","Data":"7c6fddf4caa897f831f79a6706bca2d6370ef148f42b02873b2746d104b46ecc"} Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.487083 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7ecdad61-afa9-43fa-9321-1b58d9abf074","Type":"ContainerDied","Data":"ffda1e0cd8af360a3425530f0623937f6ca5f730c7420ebf643f335d5eaf911c"} Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.487104 4795 scope.go:117] "RemoveContainer" containerID="5d562a786717cf3ad172d6766f5c7acfd5e591e4923212ea6cb2bbdaecf8c27c" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.494100 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e956453d-551f-44b4-8125-8656b3155402","Type":"ContainerStarted","Data":"c677d54567313f232f88b9a873fec54a2577075c7b806ea88ad32df8cb36cb26"} Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.496448 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.500179 4795 generic.go:334] "Generic (PLEG): container finished" 
podID="3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8" containerID="fac0b49dac38eb57e08779c1a5fd43f5502fc80faee614fd505dcf5a274e4145" exitCode=143 Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.500230 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8","Type":"ContainerDied","Data":"fac0b49dac38eb57e08779c1a5fd43f5502fc80faee614fd505dcf5a274e4145"} Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.531750 4795 scope.go:117] "RemoveContainer" containerID="7c6fddf4caa897f831f79a6706bca2d6370ef148f42b02873b2746d104b46ecc" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.543803 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.7829972299999999 podStartE2EDuration="5.543784058s" podCreationTimestamp="2026-02-19 21:49:02 +0000 UTC" firstStartedPulling="2026-02-19 21:49:03.226061122 +0000 UTC m=+1254.418579036" lastFinishedPulling="2026-02-19 21:49:06.986848 +0000 UTC m=+1258.179365864" observedRunningTime="2026-02-19 21:49:07.519757163 +0000 UTC m=+1258.712275027" watchObservedRunningTime="2026-02-19 21:49:07.543784058 +0000 UTC m=+1258.736301922" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.547149 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.555529 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.562394 4795 scope.go:117] "RemoveContainer" containerID="5d562a786717cf3ad172d6766f5c7acfd5e591e4923212ea6cb2bbdaecf8c27c" Feb 19 21:49:07 crc kubenswrapper[4795]: E0219 21:49:07.562836 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d562a786717cf3ad172d6766f5c7acfd5e591e4923212ea6cb2bbdaecf8c27c\": container with ID starting with 
5d562a786717cf3ad172d6766f5c7acfd5e591e4923212ea6cb2bbdaecf8c27c not found: ID does not exist" containerID="5d562a786717cf3ad172d6766f5c7acfd5e591e4923212ea6cb2bbdaecf8c27c" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.562879 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d562a786717cf3ad172d6766f5c7acfd5e591e4923212ea6cb2bbdaecf8c27c"} err="failed to get container status \"5d562a786717cf3ad172d6766f5c7acfd5e591e4923212ea6cb2bbdaecf8c27c\": rpc error: code = NotFound desc = could not find container \"5d562a786717cf3ad172d6766f5c7acfd5e591e4923212ea6cb2bbdaecf8c27c\": container with ID starting with 5d562a786717cf3ad172d6766f5c7acfd5e591e4923212ea6cb2bbdaecf8c27c not found: ID does not exist" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.562910 4795 scope.go:117] "RemoveContainer" containerID="7c6fddf4caa897f831f79a6706bca2d6370ef148f42b02873b2746d104b46ecc" Feb 19 21:49:07 crc kubenswrapper[4795]: E0219 21:49:07.563273 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c6fddf4caa897f831f79a6706bca2d6370ef148f42b02873b2746d104b46ecc\": container with ID starting with 7c6fddf4caa897f831f79a6706bca2d6370ef148f42b02873b2746d104b46ecc not found: ID does not exist" containerID="7c6fddf4caa897f831f79a6706bca2d6370ef148f42b02873b2746d104b46ecc" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.563316 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c6fddf4caa897f831f79a6706bca2d6370ef148f42b02873b2746d104b46ecc"} err="failed to get container status \"7c6fddf4caa897f831f79a6706bca2d6370ef148f42b02873b2746d104b46ecc\": rpc error: code = NotFound desc = could not find container \"7c6fddf4caa897f831f79a6706bca2d6370ef148f42b02873b2746d104b46ecc\": container with ID starting with 7c6fddf4caa897f831f79a6706bca2d6370ef148f42b02873b2746d104b46ecc not found: ID does not 
exist" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.563342 4795 scope.go:117] "RemoveContainer" containerID="5d562a786717cf3ad172d6766f5c7acfd5e591e4923212ea6cb2bbdaecf8c27c" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.563612 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d562a786717cf3ad172d6766f5c7acfd5e591e4923212ea6cb2bbdaecf8c27c"} err="failed to get container status \"5d562a786717cf3ad172d6766f5c7acfd5e591e4923212ea6cb2bbdaecf8c27c\": rpc error: code = NotFound desc = could not find container \"5d562a786717cf3ad172d6766f5c7acfd5e591e4923212ea6cb2bbdaecf8c27c\": container with ID starting with 5d562a786717cf3ad172d6766f5c7acfd5e591e4923212ea6cb2bbdaecf8c27c not found: ID does not exist" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.563640 4795 scope.go:117] "RemoveContainer" containerID="7c6fddf4caa897f831f79a6706bca2d6370ef148f42b02873b2746d104b46ecc" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.563926 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c6fddf4caa897f831f79a6706bca2d6370ef148f42b02873b2746d104b46ecc"} err="failed to get container status \"7c6fddf4caa897f831f79a6706bca2d6370ef148f42b02873b2746d104b46ecc\": rpc error: code = NotFound desc = could not find container \"7c6fddf4caa897f831f79a6706bca2d6370ef148f42b02873b2746d104b46ecc\": container with ID starting with 7c6fddf4caa897f831f79a6706bca2d6370ef148f42b02873b2746d104b46ecc not found: ID does not exist" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.571878 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 19 21:49:07 crc kubenswrapper[4795]: E0219 21:49:07.572364 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ecdad61-afa9-43fa-9321-1b58d9abf074" containerName="nova-api-api" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.572383 4795 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="7ecdad61-afa9-43fa-9321-1b58d9abf074" containerName="nova-api-api" Feb 19 21:49:07 crc kubenswrapper[4795]: E0219 21:49:07.572398 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1d3a710-addc-4f86-b77c-0d05dc98695f" containerName="nova-manage" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.572404 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1d3a710-addc-4f86-b77c-0d05dc98695f" containerName="nova-manage" Feb 19 21:49:07 crc kubenswrapper[4795]: E0219 21:49:07.572417 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ecdad61-afa9-43fa-9321-1b58d9abf074" containerName="nova-api-log" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.572424 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ecdad61-afa9-43fa-9321-1b58d9abf074" containerName="nova-api-log" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.572593 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ecdad61-afa9-43fa-9321-1b58d9abf074" containerName="nova-api-api" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.572615 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1d3a710-addc-4f86-b77c-0d05dc98695f" containerName="nova-manage" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.572626 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ecdad61-afa9-43fa-9321-1b58d9abf074" containerName="nova-api-log" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.573597 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.575968 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.577547 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.578520 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.600854 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.684823 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/793bbadc-8b53-4084-a63a-0b76b37284df-logs\") pod \"nova-api-0\" (UID: \"793bbadc-8b53-4084-a63a-0b76b37284df\") " pod="openstack/nova-api-0" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.684900 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/793bbadc-8b53-4084-a63a-0b76b37284df-internal-tls-certs\") pod \"nova-api-0\" (UID: \"793bbadc-8b53-4084-a63a-0b76b37284df\") " pod="openstack/nova-api-0" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.684936 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/793bbadc-8b53-4084-a63a-0b76b37284df-public-tls-certs\") pod \"nova-api-0\" (UID: \"793bbadc-8b53-4084-a63a-0b76b37284df\") " pod="openstack/nova-api-0" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.684969 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/793bbadc-8b53-4084-a63a-0b76b37284df-config-data\") pod \"nova-api-0\" (UID: \"793bbadc-8b53-4084-a63a-0b76b37284df\") " pod="openstack/nova-api-0" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.685001 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sbjg\" (UniqueName: \"kubernetes.io/projected/793bbadc-8b53-4084-a63a-0b76b37284df-kube-api-access-5sbjg\") pod \"nova-api-0\" (UID: \"793bbadc-8b53-4084-a63a-0b76b37284df\") " pod="openstack/nova-api-0" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.685061 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/793bbadc-8b53-4084-a63a-0b76b37284df-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"793bbadc-8b53-4084-a63a-0b76b37284df\") " pod="openstack/nova-api-0" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.786489 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/793bbadc-8b53-4084-a63a-0b76b37284df-logs\") pod \"nova-api-0\" (UID: \"793bbadc-8b53-4084-a63a-0b76b37284df\") " pod="openstack/nova-api-0" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.786589 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/793bbadc-8b53-4084-a63a-0b76b37284df-internal-tls-certs\") pod \"nova-api-0\" (UID: \"793bbadc-8b53-4084-a63a-0b76b37284df\") " pod="openstack/nova-api-0" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.786624 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/793bbadc-8b53-4084-a63a-0b76b37284df-public-tls-certs\") pod \"nova-api-0\" (UID: \"793bbadc-8b53-4084-a63a-0b76b37284df\") " pod="openstack/nova-api-0" Feb 19 21:49:07 crc 
kubenswrapper[4795]: I0219 21:49:07.786665 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/793bbadc-8b53-4084-a63a-0b76b37284df-config-data\") pod \"nova-api-0\" (UID: \"793bbadc-8b53-4084-a63a-0b76b37284df\") " pod="openstack/nova-api-0" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.786708 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sbjg\" (UniqueName: \"kubernetes.io/projected/793bbadc-8b53-4084-a63a-0b76b37284df-kube-api-access-5sbjg\") pod \"nova-api-0\" (UID: \"793bbadc-8b53-4084-a63a-0b76b37284df\") " pod="openstack/nova-api-0" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.786788 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/793bbadc-8b53-4084-a63a-0b76b37284df-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"793bbadc-8b53-4084-a63a-0b76b37284df\") " pod="openstack/nova-api-0" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.787124 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/793bbadc-8b53-4084-a63a-0b76b37284df-logs\") pod \"nova-api-0\" (UID: \"793bbadc-8b53-4084-a63a-0b76b37284df\") " pod="openstack/nova-api-0" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.791560 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/793bbadc-8b53-4084-a63a-0b76b37284df-internal-tls-certs\") pod \"nova-api-0\" (UID: \"793bbadc-8b53-4084-a63a-0b76b37284df\") " pod="openstack/nova-api-0" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.791816 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/793bbadc-8b53-4084-a63a-0b76b37284df-combined-ca-bundle\") pod \"nova-api-0\" 
(UID: \"793bbadc-8b53-4084-a63a-0b76b37284df\") " pod="openstack/nova-api-0" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.794142 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/793bbadc-8b53-4084-a63a-0b76b37284df-config-data\") pod \"nova-api-0\" (UID: \"793bbadc-8b53-4084-a63a-0b76b37284df\") " pod="openstack/nova-api-0" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.794957 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/793bbadc-8b53-4084-a63a-0b76b37284df-public-tls-certs\") pod \"nova-api-0\" (UID: \"793bbadc-8b53-4084-a63a-0b76b37284df\") " pod="openstack/nova-api-0" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.805849 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sbjg\" (UniqueName: \"kubernetes.io/projected/793bbadc-8b53-4084-a63a-0b76b37284df-kube-api-access-5sbjg\") pod \"nova-api-0\" (UID: \"793bbadc-8b53-4084-a63a-0b76b37284df\") " pod="openstack/nova-api-0" Feb 19 21:49:07 crc kubenswrapper[4795]: I0219 21:49:07.904237 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 19 21:49:08 crc kubenswrapper[4795]: E0219 21:49:08.274442 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a293322e7ade93b6e5ee39b06413dcdb22fcc62b8715355c0228e95452255ef5" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 19 21:49:08 crc kubenswrapper[4795]: E0219 21:49:08.276394 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a293322e7ade93b6e5ee39b06413dcdb22fcc62b8715355c0228e95452255ef5" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 19 21:49:08 crc kubenswrapper[4795]: E0219 21:49:08.278956 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a293322e7ade93b6e5ee39b06413dcdb22fcc62b8715355c0228e95452255ef5" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 19 21:49:08 crc kubenswrapper[4795]: E0219 21:49:08.279015 4795 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="c0f49585-0601-424b-9f28-304ae06c9d93" containerName="nova-scheduler-scheduler" Feb 19 21:49:08 crc kubenswrapper[4795]: I0219 21:49:08.340285 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 19 21:49:08 crc kubenswrapper[4795]: I0219 21:49:08.515983 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"793bbadc-8b53-4084-a63a-0b76b37284df","Type":"ContainerStarted","Data":"14cc3bd11f9f9a1b0b976a11e87f576616488b4c4d4dfa8a49e1d97fcc43ddfd"} Feb 19 21:49:09 crc kubenswrapper[4795]: I0219 21:49:09.526453 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ecdad61-afa9-43fa-9321-1b58d9abf074" path="/var/lib/kubelet/pods/7ecdad61-afa9-43fa-9321-1b58d9abf074/volumes" Feb 19 21:49:09 crc kubenswrapper[4795]: I0219 21:49:09.536951 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"793bbadc-8b53-4084-a63a-0b76b37284df","Type":"ContainerStarted","Data":"5b12dfe428f5ad6bc0c6c39c5790008897c140f405635c55ad4d8157a86e3f70"} Feb 19 21:49:09 crc kubenswrapper[4795]: I0219 21:49:09.536991 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"793bbadc-8b53-4084-a63a-0b76b37284df","Type":"ContainerStarted","Data":"4a55d98b95e49937bea845358fb6e967ce1fb52ec526e4696c5d439369df7681"} Feb 19 21:49:09 crc kubenswrapper[4795]: I0219 21:49:09.567011 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.566988611 podStartE2EDuration="2.566988611s" podCreationTimestamp="2026-02-19 21:49:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:49:09.562396212 +0000 UTC m=+1260.754914156" watchObservedRunningTime="2026-02-19 21:49:09.566988611 +0000 UTC m=+1260.759506495" Feb 19 21:49:09 crc kubenswrapper[4795]: I0219 21:49:09.834780 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.195:8775/\": read tcp 10.217.0.2:60428->10.217.0.195:8775: read: connection reset by peer" Feb 19 21:49:09 crc kubenswrapper[4795]: I0219 21:49:09.835224 4795 
prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.195:8775/\": read tcp 10.217.0.2:60426->10.217.0.195:8775: read: connection reset by peer" Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.295767 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.438711 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8-config-data\") pod \"3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8\" (UID: \"3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8\") " Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.438770 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8-combined-ca-bundle\") pod \"3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8\" (UID: \"3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8\") " Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.438863 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8-nova-metadata-tls-certs\") pod \"3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8\" (UID: \"3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8\") " Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.439010 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fd6zt\" (UniqueName: \"kubernetes.io/projected/3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8-kube-api-access-fd6zt\") pod \"3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8\" (UID: \"3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8\") " Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.439096 4795 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8-logs\") pod \"3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8\" (UID: \"3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8\") " Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.440210 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8-logs" (OuterVolumeSpecName: "logs") pod "3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8" (UID: "3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.450487 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8-kube-api-access-fd6zt" (OuterVolumeSpecName: "kube-api-access-fd6zt") pod "3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8" (UID: "3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8"). InnerVolumeSpecName "kube-api-access-fd6zt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.477551 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8" (UID: "3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.479622 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8-config-data" (OuterVolumeSpecName: "config-data") pod "3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8" (UID: "3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.505337 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8" (UID: "3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.541235 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fd6zt\" (UniqueName: \"kubernetes.io/projected/3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8-kube-api-access-fd6zt\") on node \"crc\" DevicePath \"\""
Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.541266 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8-logs\") on node \"crc\" DevicePath \"\""
Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.541279 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.541293 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.541304 4795 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.548881 4795 generic.go:334] "Generic (PLEG): container finished" podID="3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8" containerID="a60101c154c65170b0cd0d24ec4c3534c546765af68be292c858d3e94c795294" exitCode=0
Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.548947 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.548953 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8","Type":"ContainerDied","Data":"a60101c154c65170b0cd0d24ec4c3534c546765af68be292c858d3e94c795294"}
Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.549043 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8","Type":"ContainerDied","Data":"d8263151d90162e0d834534389009f120a5ff2fd0c5a460719918f5dbd83bc95"}
Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.549063 4795 scope.go:117] "RemoveContainer" containerID="a60101c154c65170b0cd0d24ec4c3534c546765af68be292c858d3e94c795294"
Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.599355 4795 scope.go:117] "RemoveContainer" containerID="fac0b49dac38eb57e08779c1a5fd43f5502fc80faee614fd505dcf5a274e4145"
Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.637741 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.652379 4795 scope.go:117] "RemoveContainer" containerID="a60101c154c65170b0cd0d24ec4c3534c546765af68be292c858d3e94c795294"
Feb 19 21:49:10 crc kubenswrapper[4795]: E0219 21:49:10.654906 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a60101c154c65170b0cd0d24ec4c3534c546765af68be292c858d3e94c795294\": container with ID starting with a60101c154c65170b0cd0d24ec4c3534c546765af68be292c858d3e94c795294 not found: ID does not exist" containerID="a60101c154c65170b0cd0d24ec4c3534c546765af68be292c858d3e94c795294"
Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.654946 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a60101c154c65170b0cd0d24ec4c3534c546765af68be292c858d3e94c795294"} err="failed to get container status \"a60101c154c65170b0cd0d24ec4c3534c546765af68be292c858d3e94c795294\": rpc error: code = NotFound desc = could not find container \"a60101c154c65170b0cd0d24ec4c3534c546765af68be292c858d3e94c795294\": container with ID starting with a60101c154c65170b0cd0d24ec4c3534c546765af68be292c858d3e94c795294 not found: ID does not exist"
Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.654973 4795 scope.go:117] "RemoveContainer" containerID="fac0b49dac38eb57e08779c1a5fd43f5502fc80faee614fd505dcf5a274e4145"
Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.655055 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Feb 19 21:49:10 crc kubenswrapper[4795]: E0219 21:49:10.655430 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fac0b49dac38eb57e08779c1a5fd43f5502fc80faee614fd505dcf5a274e4145\": container with ID starting with fac0b49dac38eb57e08779c1a5fd43f5502fc80faee614fd505dcf5a274e4145 not found: ID does not exist" containerID="fac0b49dac38eb57e08779c1a5fd43f5502fc80faee614fd505dcf5a274e4145"
Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.655450 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fac0b49dac38eb57e08779c1a5fd43f5502fc80faee614fd505dcf5a274e4145"} err="failed to get container status \"fac0b49dac38eb57e08779c1a5fd43f5502fc80faee614fd505dcf5a274e4145\": rpc error: code = NotFound desc = could not find container \"fac0b49dac38eb57e08779c1a5fd43f5502fc80faee614fd505dcf5a274e4145\": container with ID starting with fac0b49dac38eb57e08779c1a5fd43f5502fc80faee614fd505dcf5a274e4145 not found: ID does not exist"
Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.663756 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Feb 19 21:49:10 crc kubenswrapper[4795]: E0219 21:49:10.664194 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8" containerName="nova-metadata-metadata"
Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.664206 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8" containerName="nova-metadata-metadata"
Feb 19 21:49:10 crc kubenswrapper[4795]: E0219 21:49:10.664223 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8" containerName="nova-metadata-log"
Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.664229 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8" containerName="nova-metadata-log"
Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.664575 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8" containerName="nova-metadata-log"
Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.664674 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8" containerName="nova-metadata-metadata"
Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.665592 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.668930 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.669286 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.674647 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.745430 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxzt2\" (UniqueName: \"kubernetes.io/projected/e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107-kube-api-access-hxzt2\") pod \"nova-metadata-0\" (UID: \"e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107\") " pod="openstack/nova-metadata-0"
Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.745497 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107-config-data\") pod \"nova-metadata-0\" (UID: \"e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107\") " pod="openstack/nova-metadata-0"
Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.745684 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107\") " pod="openstack/nova-metadata-0"
Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.745821 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107\") " pod="openstack/nova-metadata-0"
Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.745955 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107-logs\") pod \"nova-metadata-0\" (UID: \"e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107\") " pod="openstack/nova-metadata-0"
Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.847962 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107-logs\") pod \"nova-metadata-0\" (UID: \"e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107\") " pod="openstack/nova-metadata-0"
Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.848019 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxzt2\" (UniqueName: \"kubernetes.io/projected/e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107-kube-api-access-hxzt2\") pod \"nova-metadata-0\" (UID: \"e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107\") " pod="openstack/nova-metadata-0"
Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.848080 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107-config-data\") pod \"nova-metadata-0\" (UID: \"e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107\") " pod="openstack/nova-metadata-0"
Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.848157 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107\") " pod="openstack/nova-metadata-0"
Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.848244 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107\") " pod="openstack/nova-metadata-0"
Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.849453 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107-logs\") pod \"nova-metadata-0\" (UID: \"e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107\") " pod="openstack/nova-metadata-0"
Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.853080 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107\") " pod="openstack/nova-metadata-0"
Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.855833 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107-config-data\") pod \"nova-metadata-0\" (UID: \"e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107\") " pod="openstack/nova-metadata-0"
Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.856086 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107\") " pod="openstack/nova-metadata-0"
Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.873730 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxzt2\" (UniqueName: \"kubernetes.io/projected/e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107-kube-api-access-hxzt2\") pod \"nova-metadata-0\" (UID: \"e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107\") " pod="openstack/nova-metadata-0"
Feb 19 21:49:10 crc kubenswrapper[4795]: I0219 21:49:10.985830 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 19 21:49:11 crc kubenswrapper[4795]: I0219 21:49:11.463317 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 19 21:49:11 crc kubenswrapper[4795]: W0219 21:49:11.471718 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode1e160dc_ca4c_45d8_ab73_5ddd1a7d2107.slice/crio-d49ebe84c757eba1bd2bd142f49a19380b1d9884de4a65242a0e36933f808c52 WatchSource:0}: Error finding container d49ebe84c757eba1bd2bd142f49a19380b1d9884de4a65242a0e36933f808c52: Status 404 returned error can't find the container with id d49ebe84c757eba1bd2bd142f49a19380b1d9884de4a65242a0e36933f808c52
Feb 19 21:49:11 crc kubenswrapper[4795]: I0219 21:49:11.524014 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8" path="/var/lib/kubelet/pods/3c0bc1cc-7985-4a3f-8ab8-26d49f7706c8/volumes"
Feb 19 21:49:11 crc kubenswrapper[4795]: I0219 21:49:11.573715 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107","Type":"ContainerStarted","Data":"d49ebe84c757eba1bd2bd142f49a19380b1d9884de4a65242a0e36933f808c52"}
Feb 19 21:49:12 crc kubenswrapper[4795]: I0219 21:49:12.119869 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 19 21:49:12 crc kubenswrapper[4795]: I0219 21:49:12.172788 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0f49585-0601-424b-9f28-304ae06c9d93-config-data\") pod \"c0f49585-0601-424b-9f28-304ae06c9d93\" (UID: \"c0f49585-0601-424b-9f28-304ae06c9d93\") "
Feb 19 21:49:12 crc kubenswrapper[4795]: I0219 21:49:12.172934 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0f49585-0601-424b-9f28-304ae06c9d93-combined-ca-bundle\") pod \"c0f49585-0601-424b-9f28-304ae06c9d93\" (UID: \"c0f49585-0601-424b-9f28-304ae06c9d93\") "
Feb 19 21:49:12 crc kubenswrapper[4795]: I0219 21:49:12.172981 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tmq5\" (UniqueName: \"kubernetes.io/projected/c0f49585-0601-424b-9f28-304ae06c9d93-kube-api-access-8tmq5\") pod \"c0f49585-0601-424b-9f28-304ae06c9d93\" (UID: \"c0f49585-0601-424b-9f28-304ae06c9d93\") "
Feb 19 21:49:12 crc kubenswrapper[4795]: I0219 21:49:12.179362 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0f49585-0601-424b-9f28-304ae06c9d93-kube-api-access-8tmq5" (OuterVolumeSpecName: "kube-api-access-8tmq5") pod "c0f49585-0601-424b-9f28-304ae06c9d93" (UID: "c0f49585-0601-424b-9f28-304ae06c9d93"). InnerVolumeSpecName "kube-api-access-8tmq5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:49:12 crc kubenswrapper[4795]: I0219 21:49:12.206390 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0f49585-0601-424b-9f28-304ae06c9d93-config-data" (OuterVolumeSpecName: "config-data") pod "c0f49585-0601-424b-9f28-304ae06c9d93" (UID: "c0f49585-0601-424b-9f28-304ae06c9d93"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:49:12 crc kubenswrapper[4795]: I0219 21:49:12.212097 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0f49585-0601-424b-9f28-304ae06c9d93-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c0f49585-0601-424b-9f28-304ae06c9d93" (UID: "c0f49585-0601-424b-9f28-304ae06c9d93"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:49:12 crc kubenswrapper[4795]: I0219 21:49:12.274936 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0f49585-0601-424b-9f28-304ae06c9d93-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 21:49:12 crc kubenswrapper[4795]: I0219 21:49:12.274979 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0f49585-0601-424b-9f28-304ae06c9d93-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 21:49:12 crc kubenswrapper[4795]: I0219 21:49:12.274993 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tmq5\" (UniqueName: \"kubernetes.io/projected/c0f49585-0601-424b-9f28-304ae06c9d93-kube-api-access-8tmq5\") on node \"crc\" DevicePath \"\""
Feb 19 21:49:12 crc kubenswrapper[4795]: I0219 21:49:12.582725 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107","Type":"ContainerStarted","Data":"11d0871b61a89a3e5545d5527d867256cc43829525a32af590e7a61ae24edec5"}
Feb 19 21:49:12 crc kubenswrapper[4795]: I0219 21:49:12.582773 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107","Type":"ContainerStarted","Data":"a92c3489fb5665772be143ca85a31a0e43e2db5654864e03970155d992501b37"}
Feb 19 21:49:12 crc kubenswrapper[4795]: I0219 21:49:12.584715 4795 generic.go:334] "Generic (PLEG): container finished" podID="c0f49585-0601-424b-9f28-304ae06c9d93" containerID="a293322e7ade93b6e5ee39b06413dcdb22fcc62b8715355c0228e95452255ef5" exitCode=0
Feb 19 21:49:12 crc kubenswrapper[4795]: I0219 21:49:12.584779 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 19 21:49:12 crc kubenswrapper[4795]: I0219 21:49:12.584775 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c0f49585-0601-424b-9f28-304ae06c9d93","Type":"ContainerDied","Data":"a293322e7ade93b6e5ee39b06413dcdb22fcc62b8715355c0228e95452255ef5"}
Feb 19 21:49:12 crc kubenswrapper[4795]: I0219 21:49:12.584884 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c0f49585-0601-424b-9f28-304ae06c9d93","Type":"ContainerDied","Data":"cd865b730fa4ce805f52ae67f6b00c0275c97a6018c4d5724e64d28e7cd4b5db"}
Feb 19 21:49:12 crc kubenswrapper[4795]: I0219 21:49:12.584906 4795 scope.go:117] "RemoveContainer" containerID="a293322e7ade93b6e5ee39b06413dcdb22fcc62b8715355c0228e95452255ef5"
Feb 19 21:49:12 crc kubenswrapper[4795]: I0219 21:49:12.608627 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.608606975 podStartE2EDuration="2.608606975s" podCreationTimestamp="2026-02-19 21:49:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:49:12.606744813 +0000 UTC m=+1263.799262697" watchObservedRunningTime="2026-02-19 21:49:12.608606975 +0000 UTC m=+1263.801124879"
Feb 19 21:49:12 crc kubenswrapper[4795]: I0219 21:49:12.634086 4795 scope.go:117] "RemoveContainer" containerID="a293322e7ade93b6e5ee39b06413dcdb22fcc62b8715355c0228e95452255ef5"
Feb 19 21:49:12 crc kubenswrapper[4795]: E0219 21:49:12.637012 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a293322e7ade93b6e5ee39b06413dcdb22fcc62b8715355c0228e95452255ef5\": container with ID starting with a293322e7ade93b6e5ee39b06413dcdb22fcc62b8715355c0228e95452255ef5 not found: ID does not exist" containerID="a293322e7ade93b6e5ee39b06413dcdb22fcc62b8715355c0228e95452255ef5"
Feb 19 21:49:12 crc kubenswrapper[4795]: I0219 21:49:12.637060 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a293322e7ade93b6e5ee39b06413dcdb22fcc62b8715355c0228e95452255ef5"} err="failed to get container status \"a293322e7ade93b6e5ee39b06413dcdb22fcc62b8715355c0228e95452255ef5\": rpc error: code = NotFound desc = could not find container \"a293322e7ade93b6e5ee39b06413dcdb22fcc62b8715355c0228e95452255ef5\": container with ID starting with a293322e7ade93b6e5ee39b06413dcdb22fcc62b8715355c0228e95452255ef5 not found: ID does not exist"
Feb 19 21:49:12 crc kubenswrapper[4795]: I0219 21:49:12.652488 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 19 21:49:12 crc kubenswrapper[4795]: I0219 21:49:12.664745 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 19 21:49:12 crc kubenswrapper[4795]: I0219 21:49:12.692269 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Feb 19 21:49:12 crc kubenswrapper[4795]: E0219 21:49:12.692719 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0f49585-0601-424b-9f28-304ae06c9d93" containerName="nova-scheduler-scheduler"
Feb 19 21:49:12 crc kubenswrapper[4795]: I0219 21:49:12.692738 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0f49585-0601-424b-9f28-304ae06c9d93" containerName="nova-scheduler-scheduler"
Feb 19 21:49:12 crc kubenswrapper[4795]: I0219 21:49:12.692958 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0f49585-0601-424b-9f28-304ae06c9d93" containerName="nova-scheduler-scheduler"
Feb 19 21:49:12 crc kubenswrapper[4795]: I0219 21:49:12.693729 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 19 21:49:12 crc kubenswrapper[4795]: I0219 21:49:12.698788 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Feb 19 21:49:12 crc kubenswrapper[4795]: I0219 21:49:12.705084 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 19 21:49:12 crc kubenswrapper[4795]: I0219 21:49:12.783587 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2710b23-7a5c-44cb-b916-9e08edc59636-config-data\") pod \"nova-scheduler-0\" (UID: \"f2710b23-7a5c-44cb-b916-9e08edc59636\") " pod="openstack/nova-scheduler-0"
Feb 19 21:49:12 crc kubenswrapper[4795]: I0219 21:49:12.783647 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5wq7\" (UniqueName: \"kubernetes.io/projected/f2710b23-7a5c-44cb-b916-9e08edc59636-kube-api-access-t5wq7\") pod \"nova-scheduler-0\" (UID: \"f2710b23-7a5c-44cb-b916-9e08edc59636\") " pod="openstack/nova-scheduler-0"
Feb 19 21:49:12 crc kubenswrapper[4795]: I0219 21:49:12.783808 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2710b23-7a5c-44cb-b916-9e08edc59636-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f2710b23-7a5c-44cb-b916-9e08edc59636\") " pod="openstack/nova-scheduler-0"
Feb 19 21:49:12 crc kubenswrapper[4795]: I0219 21:49:12.885515 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2710b23-7a5c-44cb-b916-9e08edc59636-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f2710b23-7a5c-44cb-b916-9e08edc59636\") " pod="openstack/nova-scheduler-0"
Feb 19 21:49:12 crc kubenswrapper[4795]: I0219 21:49:12.885812 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2710b23-7a5c-44cb-b916-9e08edc59636-config-data\") pod \"nova-scheduler-0\" (UID: \"f2710b23-7a5c-44cb-b916-9e08edc59636\") " pod="openstack/nova-scheduler-0"
Feb 19 21:49:12 crc kubenswrapper[4795]: I0219 21:49:12.885929 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5wq7\" (UniqueName: \"kubernetes.io/projected/f2710b23-7a5c-44cb-b916-9e08edc59636-kube-api-access-t5wq7\") pod \"nova-scheduler-0\" (UID: \"f2710b23-7a5c-44cb-b916-9e08edc59636\") " pod="openstack/nova-scheduler-0"
Feb 19 21:49:12 crc kubenswrapper[4795]: I0219 21:49:12.890092 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2710b23-7a5c-44cb-b916-9e08edc59636-config-data\") pod \"nova-scheduler-0\" (UID: \"f2710b23-7a5c-44cb-b916-9e08edc59636\") " pod="openstack/nova-scheduler-0"
Feb 19 21:49:12 crc kubenswrapper[4795]: I0219 21:49:12.890559 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2710b23-7a5c-44cb-b916-9e08edc59636-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f2710b23-7a5c-44cb-b916-9e08edc59636\") " pod="openstack/nova-scheduler-0"
Feb 19 21:49:12 crc kubenswrapper[4795]: I0219 21:49:12.918242 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5wq7\" (UniqueName: \"kubernetes.io/projected/f2710b23-7a5c-44cb-b916-9e08edc59636-kube-api-access-t5wq7\") pod \"nova-scheduler-0\" (UID: \"f2710b23-7a5c-44cb-b916-9e08edc59636\") " pod="openstack/nova-scheduler-0"
Feb 19 21:49:13 crc kubenswrapper[4795]: I0219 21:49:13.010604 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 19 21:49:13 crc kubenswrapper[4795]: I0219 21:49:13.473289 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 19 21:49:13 crc kubenswrapper[4795]: I0219 21:49:13.524690 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0f49585-0601-424b-9f28-304ae06c9d93" path="/var/lib/kubelet/pods/c0f49585-0601-424b-9f28-304ae06c9d93/volumes"
Feb 19 21:49:13 crc kubenswrapper[4795]: I0219 21:49:13.594717 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f2710b23-7a5c-44cb-b916-9e08edc59636","Type":"ContainerStarted","Data":"18a795e7f80bb780eadeb9ae01b9659d15da8639c51f358e1baf726a07014084"}
Feb 19 21:49:14 crc kubenswrapper[4795]: I0219 21:49:14.604650 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f2710b23-7a5c-44cb-b916-9e08edc59636","Type":"ContainerStarted","Data":"f6bd473cfa8f83b5b9a223a373e83766b77d9b66d6e769678f44f8ca837142b6"}
Feb 19 21:49:14 crc kubenswrapper[4795]: I0219 21:49:14.626684 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.6266681419999998 podStartE2EDuration="2.626668142s" podCreationTimestamp="2026-02-19 21:49:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 21:49:14.617567007 +0000 UTC m=+1265.810084871" watchObservedRunningTime="2026-02-19 21:49:14.626668142 +0000 UTC m=+1265.819186006"
Feb 19 21:49:15 crc kubenswrapper[4795]: I0219 21:49:15.986679 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 19 21:49:15 crc kubenswrapper[4795]: I0219 21:49:15.987101 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 19 21:49:18 crc kubenswrapper[4795]: I0219 21:49:18.225655 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Feb 19 21:49:18 crc kubenswrapper[4795]: I0219 21:49:18.228055 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 19 21:49:18 crc kubenswrapper[4795]: I0219 21:49:18.229202 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 19 21:49:19 crc kubenswrapper[4795]: I0219 21:49:19.311399 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="793bbadc-8b53-4084-a63a-0b76b37284df" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.206:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 19 21:49:19 crc kubenswrapper[4795]: I0219 21:49:19.311523 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="793bbadc-8b53-4084-a63a-0b76b37284df" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.206:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 19 21:49:20 crc kubenswrapper[4795]: I0219 21:49:20.986436 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Feb 19 21:49:20 crc kubenswrapper[4795]: I0219 21:49:20.986956 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Feb 19 21:49:22 crc kubenswrapper[4795]: I0219 21:49:22.005275 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.207:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 19 21:49:22 crc kubenswrapper[4795]: I0219 21:49:22.005510 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.207:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 19 21:49:23 crc kubenswrapper[4795]: I0219 21:49:23.011085 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Feb 19 21:49:23 crc kubenswrapper[4795]: I0219 21:49:23.052466 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Feb 19 21:49:23 crc kubenswrapper[4795]: I0219 21:49:23.359036 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Feb 19 21:49:27 crc kubenswrapper[4795]: I0219 21:49:27.918963 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Feb 19 21:49:27 crc kubenswrapper[4795]: I0219 21:49:27.920845 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Feb 19 21:49:27 crc kubenswrapper[4795]: I0219 21:49:27.921049 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Feb 19 21:49:27 crc kubenswrapper[4795]: I0219 21:49:27.932128 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Feb 19 21:49:28 crc kubenswrapper[4795]: I0219 21:49:28.412225 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Feb 19 21:49:28 crc kubenswrapper[4795]: I0219 21:49:28.420399 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Feb 19 21:49:28 crc kubenswrapper[4795]: I0219 21:49:28.427875 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 21:49:28 crc kubenswrapper[4795]: I0219 21:49:28.427937 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 21:49:30 crc kubenswrapper[4795]: I0219 21:49:30.992968 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 19 21:49:30 crc kubenswrapper[4795]: I0219 21:49:30.993768 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 19 21:49:30 crc kubenswrapper[4795]: I0219 21:49:30.998915 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Feb 19 21:49:31 crc kubenswrapper[4795]: I0219 21:49:31.000865 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Feb 19 21:49:32 crc kubenswrapper[4795]: I0219 21:49:32.807093 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Feb 19 21:49:53 crc kubenswrapper[4795]: I0219 21:49:53.729901 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"]
Feb 19 21:49:53 crc kubenswrapper[4795]: I0219 21:49:53.730538 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="336beec4-e534-448f-8367-78645b53650e" containerName="openstackclient" containerID="cri-o://6f5fd7fb0db869022abdd3586a7debaf90502df56c7437b86541c8afbbd3687a" gracePeriod=2
Feb 19 21:49:53 crc kubenswrapper[4795]: I0219 21:49:53.750106 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"]
Feb 19 21:49:53 crc kubenswrapper[4795]: I0219 21:49:53.794736 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-ktl2b"]
Feb 19 21:49:53 crc kubenswrapper[4795]: E0219 21:49:53.798670 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="336beec4-e534-448f-8367-78645b53650e" containerName="openstackclient"
Feb 19 21:49:53 crc kubenswrapper[4795]: I0219 21:49:53.798777 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="336beec4-e534-448f-8367-78645b53650e" containerName="openstackclient"
Feb 19 21:49:53 crc kubenswrapper[4795]: I0219 21:49:53.799062 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="336beec4-e534-448f-8367-78645b53650e" containerName="openstackclient"
Feb 19 21:49:53 crc kubenswrapper[4795]: I0219 21:49:53.799689 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-ktl2b"
Feb 19 21:49:53 crc kubenswrapper[4795]: I0219 21:49:53.809624 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret"
Feb 19 21:49:53 crc kubenswrapper[4795]: I0219 21:49:53.828511 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-ktl2b"]
Feb 19 21:49:53 crc kubenswrapper[4795]: I0219 21:49:53.900087 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-d602-account-create-update-lcd8k"]
Feb 19 21:49:53 crc kubenswrapper[4795]: I0219 21:49:53.901313 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-d602-account-create-update-lcd8k"
Feb 19 21:49:53 crc kubenswrapper[4795]: I0219 21:49:53.905788 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret"
Feb 19 21:49:53 crc kubenswrapper[4795]: I0219 21:49:53.943389 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-jr6xc"]
Feb 19 21:49:53 crc kubenswrapper[4795]: I0219 21:49:53.962229 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-jr6xc"]
Feb 19 21:49:53 crc kubenswrapper[4795]: I0219 21:49:53.970426 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-d602-account-create-update-lcd8k"]
Feb 19 21:49:53 crc kubenswrapper[4795]: I0219 21:49:53.980098 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kll6\" (UniqueName: \"kubernetes.io/projected/2164f9d1-1d8b-486b-beca-0d3a5172b302-kube-api-access-9kll6\") pod \"root-account-create-update-ktl2b\" (UID: \"2164f9d1-1d8b-486b-beca-0d3a5172b302\") " pod="openstack/root-account-create-update-ktl2b"
Feb 19 21:49:53 crc kubenswrapper[4795]: I0219 21:49:53.980210 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2164f9d1-1d8b-486b-beca-0d3a5172b302-operator-scripts\") pod \"root-account-create-update-ktl2b\" (UID: \"2164f9d1-1d8b-486b-beca-0d3a5172b302\") " pod="openstack/root-account-create-update-ktl2b"
Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.003347 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-d602-account-create-update-mc6fv"]
Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.003878 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-d602-account-create-update-mc6fv"]
Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.085125 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w65w5\" (UniqueName: \"kubernetes.io/projected/10e13a52-b0f3-447a-b47e-2c4dd50d6400-kube-api-access-w65w5\") pod \"barbican-d602-account-create-update-lcd8k\" (UID: \"10e13a52-b0f3-447a-b47e-2c4dd50d6400\") " pod="openstack/barbican-d602-account-create-update-lcd8k"
Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.085214 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kll6\" (UniqueName: \"kubernetes.io/projected/2164f9d1-1d8b-486b-beca-0d3a5172b302-kube-api-access-9kll6\") pod \"root-account-create-update-ktl2b\" (UID: \"2164f9d1-1d8b-486b-beca-0d3a5172b302\") " pod="openstack/root-account-create-update-ktl2b"
Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.085245 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/10e13a52-b0f3-447a-b47e-2c4dd50d6400-operator-scripts\") pod \"barbican-d602-account-create-update-lcd8k\" (UID: \"10e13a52-b0f3-447a-b47e-2c4dd50d6400\") " pod="openstack/barbican-d602-account-create-update-lcd8k"
Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.085294 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2164f9d1-1d8b-486b-beca-0d3a5172b302-operator-scripts\") pod \"root-account-create-update-ktl2b\" (UID: \"2164f9d1-1d8b-486b-beca-0d3a5172b302\") " pod="openstack/root-account-create-update-ktl2b"
Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.086075 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2164f9d1-1d8b-486b-beca-0d3a5172b302-operator-scripts\") pod \"root-account-create-update-ktl2b\" (UID:
\"2164f9d1-1d8b-486b-beca-0d3a5172b302\") " pod="openstack/root-account-create-update-ktl2b" Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.186501 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/10e13a52-b0f3-447a-b47e-2c4dd50d6400-operator-scripts\") pod \"barbican-d602-account-create-update-lcd8k\" (UID: \"10e13a52-b0f3-447a-b47e-2c4dd50d6400\") " pod="openstack/barbican-d602-account-create-update-lcd8k" Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.186650 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w65w5\" (UniqueName: \"kubernetes.io/projected/10e13a52-b0f3-447a-b47e-2c4dd50d6400-kube-api-access-w65w5\") pod \"barbican-d602-account-create-update-lcd8k\" (UID: \"10e13a52-b0f3-447a-b47e-2c4dd50d6400\") " pod="openstack/barbican-d602-account-create-update-lcd8k" Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.187229 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/10e13a52-b0f3-447a-b47e-2c4dd50d6400-operator-scripts\") pod \"barbican-d602-account-create-update-lcd8k\" (UID: \"10e13a52-b0f3-447a-b47e-2c4dd50d6400\") " pod="openstack/barbican-d602-account-create-update-lcd8k" Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.191679 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kll6\" (UniqueName: \"kubernetes.io/projected/2164f9d1-1d8b-486b-beca-0d3a5172b302-kube-api-access-9kll6\") pod \"root-account-create-update-ktl2b\" (UID: \"2164f9d1-1d8b-486b-beca-0d3a5172b302\") " pod="openstack/root-account-create-update-ktl2b" Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.199931 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-c741-account-create-update-26ljt"] Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.201098 4795 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-c741-account-create-update-26ljt" Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.210249 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.237119 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w65w5\" (UniqueName: \"kubernetes.io/projected/10e13a52-b0f3-447a-b47e-2c4dd50d6400-kube-api-access-w65w5\") pod \"barbican-d602-account-create-update-lcd8k\" (UID: \"10e13a52-b0f3-447a-b47e-2c4dd50d6400\") " pod="openstack/barbican-d602-account-create-update-lcd8k" Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.291952 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-c741-account-create-update-26ljt"] Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.314288 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.314527 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="fe2e673c-4ce5-4df9-b9e6-0b7cea99727c" containerName="ovn-northd" containerID="cri-o://2cccec2e02e2c16f7cff73b24c7a62191291d809702ff698745b7a193cbe2c2f" gracePeriod=30 Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.314571 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="fe2e673c-4ce5-4df9-b9e6-0b7cea99727c" containerName="openstack-network-exporter" containerID="cri-o://1947803c21d9e4b80d454677a036fe260f1c04cd37249f1f066a15e8b611b1c3" gracePeriod=30 Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.347841 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.398348 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e65bdd0-b6ac-406d-bc79-ade76397295e-operator-scripts\") pod \"placement-c741-account-create-update-26ljt\" (UID: \"3e65bdd0-b6ac-406d-bc79-ade76397295e\") " pod="openstack/placement-c741-account-create-update-26ljt" Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.398442 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzgh4\" (UniqueName: \"kubernetes.io/projected/3e65bdd0-b6ac-406d-bc79-ade76397295e-kube-api-access-mzgh4\") pod \"placement-c741-account-create-update-26ljt\" (UID: \"3e65bdd0-b6ac-406d-bc79-ade76397295e\") " pod="openstack/placement-c741-account-create-update-26ljt" Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.405228 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-9f51-account-create-update-z87p8"] Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.406633 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-9f51-account-create-update-z87p8" Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.418104 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.439353 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-f769-account-create-update-8k7r2"] Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.440621 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-f769-account-create-update-8k7r2" Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.452701 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-ktl2b" Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.453213 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.481490 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-9f51-account-create-update-z87p8"] Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.499857 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-f769-account-create-update-8k7r2"] Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.502034 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e65bdd0-b6ac-406d-bc79-ade76397295e-operator-scripts\") pod \"placement-c741-account-create-update-26ljt\" (UID: \"3e65bdd0-b6ac-406d-bc79-ade76397295e\") " pod="openstack/placement-c741-account-create-update-26ljt" Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.502120 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzgh4\" (UniqueName: \"kubernetes.io/projected/3e65bdd0-b6ac-406d-bc79-ade76397295e-kube-api-access-mzgh4\") pod \"placement-c741-account-create-update-26ljt\" (UID: \"3e65bdd0-b6ac-406d-bc79-ade76397295e\") " pod="openstack/placement-c741-account-create-update-26ljt" Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.503872 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e65bdd0-b6ac-406d-bc79-ade76397295e-operator-scripts\") pod \"placement-c741-account-create-update-26ljt\" (UID: \"3e65bdd0-b6ac-406d-bc79-ade76397295e\") " pod="openstack/placement-c741-account-create-update-26ljt" Feb 19 21:49:54 crc kubenswrapper[4795]: E0219 21:49:54.503891 4795 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: 
configmap "rabbitmq-cell1-config-data" not found Feb 19 21:49:54 crc kubenswrapper[4795]: E0219 21:49:54.503946 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-config-data podName:ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d nodeName:}" failed. No retries permitted until 2026-02-19 21:49:55.003929475 +0000 UTC m=+1306.196447339 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-config-data") pod "rabbitmq-cell1-server-0" (UID: "ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d") : configmap "rabbitmq-cell1-config-data" not found Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.528546 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-b4bcd"] Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.529547 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-d602-account-create-update-lcd8k" Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.535904 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-b4bcd"] Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.544754 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-c741-account-create-update-hdlzx"] Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.575188 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzgh4\" (UniqueName: \"kubernetes.io/projected/3e65bdd0-b6ac-406d-bc79-ade76397295e-kube-api-access-mzgh4\") pod \"placement-c741-account-create-update-26ljt\" (UID: \"3e65bdd0-b6ac-406d-bc79-ade76397295e\") " pod="openstack/placement-c741-account-create-update-26ljt" Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.593479 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-c741-account-create-update-hdlzx"] Feb 19 
21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.601036 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-c741-account-create-update-26ljt" Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.604305 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0500ca0-0cef-4b76-9c78-cb2189b520ff-operator-scripts\") pod \"cinder-9f51-account-create-update-z87p8\" (UID: \"b0500ca0-0cef-4b76-9c78-cb2189b520ff\") " pod="openstack/cinder-9f51-account-create-update-z87p8" Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.605736 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pc6xc\" (UniqueName: \"kubernetes.io/projected/b0500ca0-0cef-4b76-9c78-cb2189b520ff-kube-api-access-pc6xc\") pod \"cinder-9f51-account-create-update-z87p8\" (UID: \"b0500ca0-0cef-4b76-9c78-cb2189b520ff\") " pod="openstack/cinder-9f51-account-create-update-z87p8" Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.605944 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hkps\" (UniqueName: \"kubernetes.io/projected/8eaa69df-d563-4dc0-8a78-40413946cbca-kube-api-access-5hkps\") pod \"glance-f769-account-create-update-8k7r2\" (UID: \"8eaa69df-d563-4dc0-8a78-40413946cbca\") " pod="openstack/glance-f769-account-create-update-8k7r2" Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.606156 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8eaa69df-d563-4dc0-8a78-40413946cbca-operator-scripts\") pod \"glance-f769-account-create-update-8k7r2\" (UID: \"8eaa69df-d563-4dc0-8a78-40413946cbca\") " pod="openstack/glance-f769-account-create-update-8k7r2" Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 
21:49:54.660332 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-9f51-account-create-update-n57zq"] Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.678553 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-9f51-account-create-update-n57zq"] Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.715285 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8eaa69df-d563-4dc0-8a78-40413946cbca-operator-scripts\") pod \"glance-f769-account-create-update-8k7r2\" (UID: \"8eaa69df-d563-4dc0-8a78-40413946cbca\") " pod="openstack/glance-f769-account-create-update-8k7r2" Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.715389 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0500ca0-0cef-4b76-9c78-cb2189b520ff-operator-scripts\") pod \"cinder-9f51-account-create-update-z87p8\" (UID: \"b0500ca0-0cef-4b76-9c78-cb2189b520ff\") " pod="openstack/cinder-9f51-account-create-update-z87p8" Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.715554 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pc6xc\" (UniqueName: \"kubernetes.io/projected/b0500ca0-0cef-4b76-9c78-cb2189b520ff-kube-api-access-pc6xc\") pod \"cinder-9f51-account-create-update-z87p8\" (UID: \"b0500ca0-0cef-4b76-9c78-cb2189b520ff\") " pod="openstack/cinder-9f51-account-create-update-z87p8" Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.715622 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hkps\" (UniqueName: \"kubernetes.io/projected/8eaa69df-d563-4dc0-8a78-40413946cbca-kube-api-access-5hkps\") pod \"glance-f769-account-create-update-8k7r2\" (UID: \"8eaa69df-d563-4dc0-8a78-40413946cbca\") " pod="openstack/glance-f769-account-create-update-8k7r2" Feb 19 21:49:54 
crc kubenswrapper[4795]: I0219 21:49:54.716283 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0500ca0-0cef-4b76-9c78-cb2189b520ff-operator-scripts\") pod \"cinder-9f51-account-create-update-z87p8\" (UID: \"b0500ca0-0cef-4b76-9c78-cb2189b520ff\") " pod="openstack/cinder-9f51-account-create-update-z87p8" Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.766295 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8eaa69df-d563-4dc0-8a78-40413946cbca-operator-scripts\") pod \"glance-f769-account-create-update-8k7r2\" (UID: \"8eaa69df-d563-4dc0-8a78-40413946cbca\") " pod="openstack/glance-f769-account-create-update-8k7r2" Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.767843 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.768718 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="3c5a8678-8ce2-4bee-9160-37b1dea9f897" containerName="openstack-network-exporter" containerID="cri-o://9afa6d2d9c1c3abf2795f79e56d0f2c85700d5b1b8b011dec52725e8bbc63d6b" gracePeriod=300 Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.851325 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hkps\" (UniqueName: \"kubernetes.io/projected/8eaa69df-d563-4dc0-8a78-40413946cbca-kube-api-access-5hkps\") pod \"glance-f769-account-create-update-8k7r2\" (UID: \"8eaa69df-d563-4dc0-8a78-40413946cbca\") " pod="openstack/glance-f769-account-create-update-8k7r2" Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.877084 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pc6xc\" (UniqueName: \"kubernetes.io/projected/b0500ca0-0cef-4b76-9c78-cb2189b520ff-kube-api-access-pc6xc\") pod 
\"cinder-9f51-account-create-update-z87p8\" (UID: \"b0500ca0-0cef-4b76-9c78-cb2189b520ff\") " pod="openstack/cinder-9f51-account-create-update-z87p8" Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.905518 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-f769-account-create-update-25m5x"] Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.905663 4795 generic.go:334] "Generic (PLEG): container finished" podID="fe2e673c-4ce5-4df9-b9e6-0b7cea99727c" containerID="1947803c21d9e4b80d454677a036fe260f1c04cd37249f1f066a15e8b611b1c3" exitCode=2 Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.905690 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c","Type":"ContainerDied","Data":"1947803c21d9e4b80d454677a036fe260f1c04cd37249f1f066a15e8b611b1c3"} Feb 19 21:49:54 crc kubenswrapper[4795]: E0219 21:49:54.944203 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2cccec2e02e2c16f7cff73b24c7a62191291d809702ff698745b7a193cbe2c2f" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.944243 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-f769-account-create-update-25m5x"] Feb 19 21:49:54 crc kubenswrapper[4795]: E0219 21:49:54.972336 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2cccec2e02e2c16f7cff73b24c7a62191291d809702ff698745b7a193cbe2c2f" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Feb 19 21:49:54 crc kubenswrapper[4795]: E0219 21:49:54.978463 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown 
desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2cccec2e02e2c16f7cff73b24c7a62191291d809702ff698745b7a193cbe2c2f" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Feb 19 21:49:54 crc kubenswrapper[4795]: E0219 21:49:54.978527 4795 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="fe2e673c-4ce5-4df9-b9e6-0b7cea99727c" containerName="ovn-northd" Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.995338 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="3c5a8678-8ce2-4bee-9160-37b1dea9f897" containerName="ovsdbserver-nb" containerID="cri-o://146b7776ad30bb62917c430a4cb976669fc9f5db740f7f370033e5be6e16f033" gracePeriod=300 Feb 19 21:49:54 crc kubenswrapper[4795]: I0219 21:49:54.997875 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-9f51-account-create-update-z87p8" Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.020529 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.021263 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="3c2bcb9c-07d3-4d71-924b-aacd537e3430" containerName="openstack-network-exporter" containerID="cri-o://e932ce1114c0c3a49c8f6332f06a4a9aadb5f1382200346f37f0e3e9fe2d3373" gracePeriod=300 Feb 19 21:49:55 crc kubenswrapper[4795]: E0219 21:49:55.049035 4795 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Feb 19 21:49:55 crc kubenswrapper[4795]: E0219 21:49:55.049118 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-config-data podName:ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d nodeName:}" failed. No retries permitted until 2026-02-19 21:49:56.04909626 +0000 UTC m=+1307.241614124 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-config-data") pod "rabbitmq-cell1-server-0" (UID: "ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d") : configmap "rabbitmq-cell1-config-data" not found Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.061648 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-2d62-account-create-update-4vs7v"] Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.066854 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-2d62-account-create-update-4vs7v" Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.068395 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-f769-account-create-update-8k7r2" Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.069753 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.091506 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-ttz5x"] Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.132222 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-2d62-account-create-update-jrx2c"] Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.169342 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xc68\" (UniqueName: \"kubernetes.io/projected/299d8d1c-c181-4c7b-b95f-9f3c62ddb102-kube-api-access-4xc68\") pod \"nova-api-2d62-account-create-update-4vs7v\" (UID: \"299d8d1c-c181-4c7b-b95f-9f3c62ddb102\") " pod="openstack/nova-api-2d62-account-create-update-4vs7v" Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.169525 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/299d8d1c-c181-4c7b-b95f-9f3c62ddb102-operator-scripts\") pod \"nova-api-2d62-account-create-update-4vs7v\" (UID: \"299d8d1c-c181-4c7b-b95f-9f3c62ddb102\") " pod="openstack/nova-api-2d62-account-create-update-4vs7v" Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.184986 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-2d62-account-create-update-jrx2c"] Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.232683 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-ttz5x"] Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.255994 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 
21:49:55.272191 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/299d8d1c-c181-4c7b-b95f-9f3c62ddb102-operator-scripts\") pod \"nova-api-2d62-account-create-update-4vs7v\" (UID: \"299d8d1c-c181-4c7b-b95f-9f3c62ddb102\") " pod="openstack/nova-api-2d62-account-create-update-4vs7v" Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.272336 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xc68\" (UniqueName: \"kubernetes.io/projected/299d8d1c-c181-4c7b-b95f-9f3c62ddb102-kube-api-access-4xc68\") pod \"nova-api-2d62-account-create-update-4vs7v\" (UID: \"299d8d1c-c181-4c7b-b95f-9f3c62ddb102\") " pod="openstack/nova-api-2d62-account-create-update-4vs7v" Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.273102 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/299d8d1c-c181-4c7b-b95f-9f3c62ddb102-operator-scripts\") pod \"nova-api-2d62-account-create-update-4vs7v\" (UID: \"299d8d1c-c181-4c7b-b95f-9f3c62ddb102\") " pod="openstack/nova-api-2d62-account-create-update-4vs7v" Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.276565 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-2d62-account-create-update-4vs7v"] Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.297390 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7677694455-vk4hv"] Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.297644 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7677694455-vk4hv" podUID="74df8ac0-77f2-4e8d-aa39-d05dc6ce7190" containerName="dnsmasq-dns" containerID="cri-o://0b266e887f3cb16fe191b1e64cc8d8adf7464d874863c315ba4605ce8d79b678" gracePeriod=10 Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.310009 4795 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="3c2bcb9c-07d3-4d71-924b-aacd537e3430" containerName="ovsdbserver-sb" containerID="cri-o://fc2aa4e6ca186f0a3259cef3b73108248f8b30394d6e7f6ee3356a21235bd96b" gracePeriod=300 Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.330141 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xc68\" (UniqueName: \"kubernetes.io/projected/299d8d1c-c181-4c7b-b95f-9f3c62ddb102-kube-api-access-4xc68\") pod \"nova-api-2d62-account-create-update-4vs7v\" (UID: \"299d8d1c-c181-4c7b-b95f-9f3c62ddb102\") " pod="openstack/nova-api-2d62-account-create-update-4vs7v" Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.353454 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-jkspq"] Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.373990 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-jkspq"] Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.399622 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-zjbsw"] Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.423348 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-zjbsw"] Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.454229 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-1922-account-create-update-gzkhl"] Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.455382 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-1922-account-create-update-gzkhl" Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.463323 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-2d62-account-create-update-4vs7v" Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.464792 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Feb 19 21:49:55 crc kubenswrapper[4795]: E0219 21:49:55.483675 4795 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Feb 19 21:49:55 crc kubenswrapper[4795]: E0219 21:49:55.493308 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7b096325-542d-4ac6-8d16-8aa0937013b2-config-data podName:7b096325-542d-4ac6-8d16-8aa0937013b2 nodeName:}" failed. No retries permitted until 2026-02-19 21:49:55.993274627 +0000 UTC m=+1307.185792491 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/7b096325-542d-4ac6-8d16-8aa0937013b2-config-data") pod "rabbitmq-server-0" (UID: "7b096325-542d-4ac6-8d16-8aa0937013b2") : configmap "rabbitmq-config-data" not found Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.496856 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-e48f-account-create-update-n77q8"] Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.498152 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-e48f-account-create-update-n77q8" Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.500774 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.510545 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-1922-account-create-update-gzkhl"] Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.586573 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee3fde95-91bf-4f6a-9753-f879d56fedbb-operator-scripts\") pod \"nova-cell0-1922-account-create-update-gzkhl\" (UID: \"ee3fde95-91bf-4f6a-9753-f879d56fedbb\") " pod="openstack/nova-cell0-1922-account-create-update-gzkhl" Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.586623 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vq9sz\" (UniqueName: \"kubernetes.io/projected/ee3fde95-91bf-4f6a-9753-f879d56fedbb-kube-api-access-vq9sz\") pod \"nova-cell0-1922-account-create-update-gzkhl\" (UID: \"ee3fde95-91bf-4f6a-9753-f879d56fedbb\") " pod="openstack/nova-cell0-1922-account-create-update-gzkhl" Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.699539 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8sbw\" (UniqueName: \"kubernetes.io/projected/7400eda6-e731-4942-b002-c81dd9a87e6a-kube-api-access-z8sbw\") pod \"nova-cell1-e48f-account-create-update-n77q8\" (UID: \"7400eda6-e731-4942-b002-c81dd9a87e6a\") " pod="openstack/nova-cell1-e48f-account-create-update-n77q8" Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.699670 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/ee3fde95-91bf-4f6a-9753-f879d56fedbb-operator-scripts\") pod \"nova-cell0-1922-account-create-update-gzkhl\" (UID: \"ee3fde95-91bf-4f6a-9753-f879d56fedbb\") " pod="openstack/nova-cell0-1922-account-create-update-gzkhl" Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.699717 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vq9sz\" (UniqueName: \"kubernetes.io/projected/ee3fde95-91bf-4f6a-9753-f879d56fedbb-kube-api-access-vq9sz\") pod \"nova-cell0-1922-account-create-update-gzkhl\" (UID: \"ee3fde95-91bf-4f6a-9753-f879d56fedbb\") " pod="openstack/nova-cell0-1922-account-create-update-gzkhl" Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.699735 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7400eda6-e731-4942-b002-c81dd9a87e6a-operator-scripts\") pod \"nova-cell1-e48f-account-create-update-n77q8\" (UID: \"7400eda6-e731-4942-b002-c81dd9a87e6a\") " pod="openstack/nova-cell1-e48f-account-create-update-n77q8" Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.701348 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee3fde95-91bf-4f6a-9753-f879d56fedbb-operator-scripts\") pod \"nova-cell0-1922-account-create-update-gzkhl\" (UID: \"ee3fde95-91bf-4f6a-9753-f879d56fedbb\") " pod="openstack/nova-cell0-1922-account-create-update-gzkhl" Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.719186 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vq9sz\" (UniqueName: \"kubernetes.io/projected/ee3fde95-91bf-4f6a-9753-f879d56fedbb-kube-api-access-vq9sz\") pod \"nova-cell0-1922-account-create-update-gzkhl\" (UID: \"ee3fde95-91bf-4f6a-9753-f879d56fedbb\") " pod="openstack/nova-cell0-1922-account-create-update-gzkhl" Feb 19 21:49:55 crc 
kubenswrapper[4795]: I0219 21:49:55.796300 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec" path="/var/lib/kubelet/pods/1c0c6e8d-ad10-4dd4-8b01-2be1968f4bec/volumes" Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.798197 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="282595b2-0eaa-4deb-9af4-288241817325" path="/var/lib/kubelet/pods/282595b2-0eaa-4deb-9af4-288241817325/volumes" Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.798767 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="454af6b2-4c9e-4706-a537-b3e3d468353d" path="/var/lib/kubelet/pods/454af6b2-4c9e-4706-a537-b3e3d468353d/volumes" Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.801647 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8sbw\" (UniqueName: \"kubernetes.io/projected/7400eda6-e731-4942-b002-c81dd9a87e6a-kube-api-access-z8sbw\") pod \"nova-cell1-e48f-account-create-update-n77q8\" (UID: \"7400eda6-e731-4942-b002-c81dd9a87e6a\") " pod="openstack/nova-cell1-e48f-account-create-update-n77q8" Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.801794 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7400eda6-e731-4942-b002-c81dd9a87e6a-operator-scripts\") pod \"nova-cell1-e48f-account-create-update-n77q8\" (UID: \"7400eda6-e731-4942-b002-c81dd9a87e6a\") " pod="openstack/nova-cell1-e48f-account-create-update-n77q8" Feb 19 21:49:55 crc kubenswrapper[4795]: E0219 21:49:55.801917 4795 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Feb 19 21:49:55 crc kubenswrapper[4795]: E0219 21:49:55.801981 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7400eda6-e731-4942-b002-c81dd9a87e6a-operator-scripts 
podName:7400eda6-e731-4942-b002-c81dd9a87e6a nodeName:}" failed. No retries permitted until 2026-02-19 21:49:56.301965383 +0000 UTC m=+1307.494483247 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/7400eda6-e731-4942-b002-c81dd9a87e6a-operator-scripts") pod "nova-cell1-e48f-account-create-update-n77q8" (UID: "7400eda6-e731-4942-b002-c81dd9a87e6a") : configmap "openstack-cell1-scripts" not found Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.803814 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5084e7b9-4923-449e-b0d7-28c602faeff0" path="/var/lib/kubelet/pods/5084e7b9-4923-449e-b0d7-28c602faeff0/volumes" Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.804842 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="541fd524-94f2-4149-b16b-ab11a716ff95" path="/var/lib/kubelet/pods/541fd524-94f2-4149-b16b-ab11a716ff95/volumes" Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.805505 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57961551-d4f8-4586-b255-8810fbdb499a" path="/var/lib/kubelet/pods/57961551-d4f8-4586-b255-8810fbdb499a/volumes" Feb 19 21:49:55 crc kubenswrapper[4795]: E0219 21:49:55.805838 4795 projected.go:194] Error preparing data for projected volume kube-api-access-z8sbw for pod openstack/nova-cell1-e48f-account-create-update-n77q8: failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Feb 19 21:49:55 crc kubenswrapper[4795]: E0219 21:49:55.805869 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7400eda6-e731-4942-b002-c81dd9a87e6a-kube-api-access-z8sbw podName:7400eda6-e731-4942-b002-c81dd9a87e6a nodeName:}" failed. No retries permitted until 2026-02-19 21:49:56.305858143 +0000 UTC m=+1307.498376007 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-z8sbw" (UniqueName: "kubernetes.io/projected/7400eda6-e731-4942-b002-c81dd9a87e6a-kube-api-access-z8sbw") pod "nova-cell1-e48f-account-create-update-n77q8" (UID: "7400eda6-e731-4942-b002-c81dd9a87e6a") : failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.812191 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b15ba11-a170-4fac-bac1-15ecf9de7379" path="/var/lib/kubelet/pods/5b15ba11-a170-4fac-bac1-15ecf9de7379/volumes" Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.812723 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d152069-2c3d-4cf4-94e8-3068e24def9f" path="/var/lib/kubelet/pods/7d152069-2c3d-4cf4-94e8-3068e24def9f/volumes" Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.813467 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="890a044b-0060-4feb-866b-9a9e80bfa706" path="/var/lib/kubelet/pods/890a044b-0060-4feb-866b-9a9e80bfa706/volumes" Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.814029 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e61f40e0-d6c3-49f7-a93f-d9956f086d4b" path="/var/lib/kubelet/pods/e61f40e0-d6c3-49f7-a93f-d9956f086d4b/volumes" Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.834738 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-e48f-account-create-update-n77q8"] Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.834772 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-2wbff"] Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.834788 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-2wbff"] Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.834807 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-tl5hf"] Feb 19 
21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.834826 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-w9fbs"] Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.834853 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-1922-account-create-update-bnqt2"] Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.834870 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-1922-account-create-update-bnqt2"] Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.834881 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-e48f-account-create-update-48v7f"] Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.834893 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-e48f-account-create-update-48v7f"] Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.834912 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-p9cs4"] Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.834924 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-576c65f985-r97z7"] Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.834940 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-ktl2b"] Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.834961 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.834973 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-6df95dfbd4-ftf6x"] Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.834984 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-xmbg2"] Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.834995 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-xmbg2"] Feb 
19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.835320 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-6df95dfbd4-ftf6x" podUID="b01bcd5b-435a-4702-b0a4-8dfe8f553c23" containerName="placement-log" containerID="cri-o://16d1434f729c32f7f5af098d1664dafb8ed3d4636079462bf6c45b1454ee08ef" gracePeriod=30 Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.835517 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-6df95dfbd4-ftf6x" podUID="b01bcd5b-435a-4702-b0a4-8dfe8f553c23" containerName="placement-api" containerID="cri-o://9d2e881624f7800e5e6e027e5487084188713e96e14cab5c281c53ae829851a2" gracePeriod=30 Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.841475 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="account-server" containerID="cri-o://51c55baa52f08bcb95276c0f7a67a7ef348b9bd02a9fc401f50e679f37e0c117" gracePeriod=30 Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.841595 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="swift-recon-cron" containerID="cri-o://955740cbd5ac4eda735378957980240001d5c0ce0905f2fca18b4155f3fb6c98" gracePeriod=30 Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.841668 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="rsync" containerID="cri-o://bb960073c3b2955d7aa2d18d3eb2e0958e7e98f4cd499d7077f5064d1e43a05e" gracePeriod=30 Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.841747 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" 
containerName="object-expirer" containerID="cri-o://8d3a569ab5140e595996d2c82fd170ed28aa9420de4fdae36e9b5854b2e0bd5e" gracePeriod=30 Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.841804 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="object-updater" containerID="cri-o://2ce7f7a343a6c79fd57a6b4c7cea8f6f21ccfbc5ea5261b4c592c5cc2035910e" gracePeriod=30 Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.841866 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="object-auditor" containerID="cri-o://cb0176a835c07bb843ea9834f19b5792b6d9700c5cd61a140ec8b99a66854f5f" gracePeriod=30 Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.841917 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="object-replicator" containerID="cri-o://c8d438309dbe4ee742eb9d7a2b93e755a74c9ba2dd39409dcf7caf84dee6405a" gracePeriod=30 Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.841993 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="object-server" containerID="cri-o://c5d0640985105b2140d43ecf956f5621f6e82eb5a3d40d95fb3d09d303406c84" gracePeriod=30 Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.841980 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-metrics-p9cs4" podUID="eee0ea5d-4b43-4421-b23e-555c5eac3564" containerName="openstack-network-exporter" containerID="cri-o://3d2f2fb944954b1447f44f88d21764d52d00b1e2faecd4fc556d6cdb1c4d63f4" gracePeriod=30 Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.842052 4795 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="container-updater" containerID="cri-o://f2b55f40d9ab92e06fdc09f65e72764f7f9c63fbb1f126ede2058624236d001f" gracePeriod=30 Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.842095 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="container-auditor" containerID="cri-o://5019fca913d092cd1d004e058553c364ac08007bafeae027b681e3bb6eb59026" gracePeriod=30 Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.842188 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="container-replicator" containerID="cri-o://be1c394688c447ed772b4929317159025a1e97491b40b847644ed369351532b5" gracePeriod=30 Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.842234 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-576c65f985-r97z7" podUID="30f2c894-7a7a-4e5a-a090-a28ab50c766a" containerName="neutron-api" containerID="cri-o://b4418cb3b4a933e20130924748ce8ea933d7929e7a32c146e7e7ce593a7c13a0" gracePeriod=30 Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.842244 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="container-server" containerID="cri-o://a86db1a02ddab5086097179b35e6f17d71c32f36157625ee70912b23839603d4" gracePeriod=30 Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.842293 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="account-reaper" 
containerID="cri-o://d2281de4777acfa86c800d06ed4c2e0ac8613cf4008b8449cd7089d057ee51ec" gracePeriod=30 Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.842336 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="account-auditor" containerID="cri-o://22441008e17545864d7c6366d4ab4fa8333a1c04e36b9961e8fdfdfaeec8b1b6" gracePeriod=30 Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.842386 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="account-replicator" containerID="cri-o://b4b0474dd3a4192273fa0b2dc273a792e955cd0dde33e24a1afa65bb56656eaa" gracePeriod=30 Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.842499 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-576c65f985-r97z7" podUID="30f2c894-7a7a-4e5a-a090-a28ab50c766a" containerName="neutron-httpd" containerID="cri-o://fe9cf6c1b6f422c771781be2cfb21deabc54071e241ab6b5c24d3d83414ee4ec" gracePeriod=30 Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.885537 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-1922-account-create-update-gzkhl" Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.939794 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-x4mls"] Feb 19 21:49:55 crc kubenswrapper[4795]: I0219 21:49:55.957712 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-8jt8c"] Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.015464 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-x4mls"] Feb 19 21:49:56 crc kubenswrapper[4795]: E0219 21:49:56.023091 4795 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Feb 19 21:49:56 crc kubenswrapper[4795]: E0219 21:49:56.023188 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7b096325-542d-4ac6-8d16-8aa0937013b2-config-data podName:7b096325-542d-4ac6-8d16-8aa0937013b2 nodeName:}" failed. No retries permitted until 2026-02-19 21:49:57.023142641 +0000 UTC m=+1308.215660505 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/7b096325-542d-4ac6-8d16-8aa0937013b2-config-data") pod "rabbitmq-server-0" (UID: "7b096325-542d-4ac6-8d16-8aa0937013b2") : configmap "rabbitmq-config-data" not found Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.024072 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-8jt8c"] Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.029228 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-ktl2b" event={"ID":"2164f9d1-1d8b-486b-beca-0d3a5172b302","Type":"ContainerStarted","Data":"0bc8d13f4092138cc363d9e77ad1f35f49f21dad6c940b0ffcd7de9f24d779fb"} Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.052446 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_3c2bcb9c-07d3-4d71-924b-aacd537e3430/ovsdbserver-sb/0.log" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.052501 4795 generic.go:334] "Generic (PLEG): container finished" podID="3c2bcb9c-07d3-4d71-924b-aacd537e3430" containerID="e932ce1114c0c3a49c8f6332f06a4a9aadb5f1382200346f37f0e3e9fe2d3373" exitCode=2 Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.052520 4795 generic.go:334] "Generic (PLEG): container finished" podID="3c2bcb9c-07d3-4d71-924b-aacd537e3430" containerID="fc2aa4e6ca186f0a3259cef3b73108248f8b30394d6e7f6ee3356a21235bd96b" exitCode=143 Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.053360 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-w8x5b"] Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.053393 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"3c2bcb9c-07d3-4d71-924b-aacd537e3430","Type":"ContainerDied","Data":"e932ce1114c0c3a49c8f6332f06a4a9aadb5f1382200346f37f0e3e9fe2d3373"} Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.053414 4795 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"3c2bcb9c-07d3-4d71-924b-aacd537e3430","Type":"ContainerDied","Data":"fc2aa4e6ca186f0a3259cef3b73108248f8b30394d6e7f6ee3356a21235bd96b"} Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.062894 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-w8x5b"] Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.068583 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_3c5a8678-8ce2-4bee-9160-37b1dea9f897/ovsdbserver-nb/0.log" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.068658 4795 generic.go:334] "Generic (PLEG): container finished" podID="3c5a8678-8ce2-4bee-9160-37b1dea9f897" containerID="9afa6d2d9c1c3abf2795f79e56d0f2c85700d5b1b8b011dec52725e8bbc63d6b" exitCode=2 Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.068678 4795 generic.go:334] "Generic (PLEG): container finished" podID="3c5a8678-8ce2-4bee-9160-37b1dea9f897" containerID="146b7776ad30bb62917c430a4cb976669fc9f5db740f7f370033e5be6e16f033" exitCode=143 Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.068907 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"3c5a8678-8ce2-4bee-9160-37b1dea9f897","Type":"ContainerDied","Data":"9afa6d2d9c1c3abf2795f79e56d0f2c85700d5b1b8b011dec52725e8bbc63d6b"} Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.068937 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"3c5a8678-8ce2-4bee-9160-37b1dea9f897","Type":"ContainerDied","Data":"146b7776ad30bb62917c430a4cb976669fc9f5db740f7f370033e5be6e16f033"} Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.077364 4795 generic.go:334] "Generic (PLEG): container finished" podID="74df8ac0-77f2-4e8d-aa39-d05dc6ce7190" containerID="0b266e887f3cb16fe191b1e64cc8d8adf7464d874863c315ba4605ce8d79b678" exitCode=0 Feb 19 21:49:56 crc 
kubenswrapper[4795]: I0219 21:49:56.077415 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7677694455-vk4hv" event={"ID":"74df8ac0-77f2-4e8d-aa39-d05dc6ce7190","Type":"ContainerDied","Data":"0b266e887f3cb16fe191b1e64cc8d8adf7464d874863c315ba4605ce8d79b678"} Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.079323 4795 generic.go:334] "Generic (PLEG): container finished" podID="336beec4-e534-448f-8367-78645b53650e" containerID="6f5fd7fb0db869022abdd3586a7debaf90502df56c7437b86541c8afbbd3687a" exitCode=137 Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.079421 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7677694455-vk4hv" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.128987 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74df8ac0-77f2-4e8d-aa39-d05dc6ce7190-config\") pod \"74df8ac0-77f2-4e8d-aa39-d05dc6ce7190\" (UID: \"74df8ac0-77f2-4e8d-aa39-d05dc6ce7190\") " Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.129071 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/74df8ac0-77f2-4e8d-aa39-d05dc6ce7190-dns-swift-storage-0\") pod \"74df8ac0-77f2-4e8d-aa39-d05dc6ce7190\" (UID: \"74df8ac0-77f2-4e8d-aa39-d05dc6ce7190\") " Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.129091 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8xm5\" (UniqueName: \"kubernetes.io/projected/74df8ac0-77f2-4e8d-aa39-d05dc6ce7190-kube-api-access-s8xm5\") pod \"74df8ac0-77f2-4e8d-aa39-d05dc6ce7190\" (UID: \"74df8ac0-77f2-4e8d-aa39-d05dc6ce7190\") " Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.138744 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/74df8ac0-77f2-4e8d-aa39-d05dc6ce7190-dns-svc\") pod \"74df8ac0-77f2-4e8d-aa39-d05dc6ce7190\" (UID: \"74df8ac0-77f2-4e8d-aa39-d05dc6ce7190\") " Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.138791 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/74df8ac0-77f2-4e8d-aa39-d05dc6ce7190-ovsdbserver-sb\") pod \"74df8ac0-77f2-4e8d-aa39-d05dc6ce7190\" (UID: \"74df8ac0-77f2-4e8d-aa39-d05dc6ce7190\") " Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.138819 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/74df8ac0-77f2-4e8d-aa39-d05dc6ce7190-ovsdbserver-nb\") pod \"74df8ac0-77f2-4e8d-aa39-d05dc6ce7190\" (UID: \"74df8ac0-77f2-4e8d-aa39-d05dc6ce7190\") " Feb 19 21:49:56 crc kubenswrapper[4795]: E0219 21:49:56.139624 4795 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Feb 19 21:49:56 crc kubenswrapper[4795]: E0219 21:49:56.139675 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-config-data podName:ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d nodeName:}" failed. No retries permitted until 2026-02-19 21:49:58.139659587 +0000 UTC m=+1309.332177451 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-config-data") pod "rabbitmq-cell1-server-0" (UID: "ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d") : configmap "rabbitmq-cell1-config-data" not found Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.170254 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74df8ac0-77f2-4e8d-aa39-d05dc6ce7190-kube-api-access-s8xm5" (OuterVolumeSpecName: "kube-api-access-s8xm5") pod "74df8ac0-77f2-4e8d-aa39-d05dc6ce7190" (UID: "74df8ac0-77f2-4e8d-aa39-d05dc6ce7190"). InnerVolumeSpecName "kube-api-access-s8xm5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.244272 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.246919 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="d2561f4e-0a01-4927-96f8-ee7bef69f561" containerName="cinder-api-log" containerID="cri-o://46487241a29d4cc3bff33a03b2f13ce2e328740a30388d55d6c987233cf2d399" gracePeriod=30 Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.247115 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s8xm5\" (UniqueName: \"kubernetes.io/projected/74df8ac0-77f2-4e8d-aa39-d05dc6ce7190-kube-api-access-s8xm5\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.247286 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="d2561f4e-0a01-4927-96f8-ee7bef69f561" containerName="cinder-api" containerID="cri-o://067a3784bb90d910b6b73dca0dc993d5c4844e46f49fddffcbfe6f467e1645d3" gracePeriod=30 Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.256461 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/neutron-7c93-account-create-update-ptrqq"] Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.265124 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-7c93-account-create-update-ptrqq"] Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.274951 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.275841 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="c54f77a4-1095-4ff1-bc74-b845cde659d9" containerName="cinder-scheduler" containerID="cri-o://5256ff7cce2d8a61bc1b79dbf58379aff2f08fd19479fafdd317965f24177cc8" gracePeriod=30 Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.276633 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="c54f77a4-1095-4ff1-bc74-b845cde659d9" containerName="probe" containerID="cri-o://b84203bdd91f64f0aed89ae5d8334985c117eb72cc298fbd0865d60f726218c1" gracePeriod=30 Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.303638 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-vlmnn"] Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.309177 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74df8ac0-77f2-4e8d-aa39-d05dc6ce7190-config" (OuterVolumeSpecName: "config") pod "74df8ac0-77f2-4e8d-aa39-d05dc6ce7190" (UID: "74df8ac0-77f2-4e8d-aa39-d05dc6ce7190"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.326881 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-vlmnn"] Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.331845 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.352457 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-d602-account-create-update-lcd8k"] Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.354324 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.354416 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.356312 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7400eda6-e731-4942-b002-c81dd9a87e6a-operator-scripts\") pod \"nova-cell1-e48f-account-create-update-n77q8\" (UID: \"7400eda6-e731-4942-b002-c81dd9a87e6a\") " pod="openstack/nova-cell1-e48f-account-create-update-n77q8" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.356473 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8sbw\" (UniqueName: \"kubernetes.io/projected/7400eda6-e731-4942-b002-c81dd9a87e6a-kube-api-access-z8sbw\") pod \"nova-cell1-e48f-account-create-update-n77q8\" (UID: \"7400eda6-e731-4942-b002-c81dd9a87e6a\") " pod="openstack/nova-cell1-e48f-account-create-update-n77q8" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.348825 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74df8ac0-77f2-4e8d-aa39-d05dc6ce7190-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod 
"74df8ac0-77f2-4e8d-aa39-d05dc6ce7190" (UID: "74df8ac0-77f2-4e8d-aa39-d05dc6ce7190"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.354750 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="3697a3b0-4077-4837-bcdc-c17d8aa361f1" containerName="glance-log" containerID="cri-o://e797be576a87ea7d2cd1a10d4fb93c6e0f25a6a5bebf1abb85c7b6e12aa13e38" gracePeriod=30 Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.354767 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="3697a3b0-4077-4837-bcdc-c17d8aa361f1" containerName="glance-httpd" containerID="cri-o://c6abf78f9f811ce98af3de204165d6af86666923e425da520b7a47fdc3944ee7" gracePeriod=30 Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.353891 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9" containerName="glance-httpd" containerID="cri-o://708e4d014b2d6219a997cb5284977900cb2aafb592d9023c95962ea2c3074217" gracePeriod=30 Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.353518 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9" containerName="glance-log" containerID="cri-o://46cb4f812b6ae6e3f71ffef829cc67ce3250106de6851d56a2c280735844fd0f" gracePeriod=30 Feb 19 21:49:56 crc kubenswrapper[4795]: E0219 21:49:56.357420 4795 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Feb 19 21:49:56 crc kubenswrapper[4795]: E0219 21:49:56.358511 4795 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/7400eda6-e731-4942-b002-c81dd9a87e6a-operator-scripts podName:7400eda6-e731-4942-b002-c81dd9a87e6a nodeName:}" failed. No retries permitted until 2026-02-19 21:49:57.358488629 +0000 UTC m=+1308.551006493 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/7400eda6-e731-4942-b002-c81dd9a87e6a-operator-scripts") pod "nova-cell1-e48f-account-create-update-n77q8" (UID: "7400eda6-e731-4942-b002-c81dd9a87e6a") : configmap "openstack-cell1-scripts" not found Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.358788 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74df8ac0-77f2-4e8d-aa39-d05dc6ce7190-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.358844 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/74df8ac0-77f2-4e8d-aa39-d05dc6ce7190-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:56 crc kubenswrapper[4795]: E0219 21:49:56.359996 4795 projected.go:194] Error preparing data for projected volume kube-api-access-z8sbw for pod openstack/nova-cell1-e48f-account-create-update-n77q8: failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Feb 19 21:49:56 crc kubenswrapper[4795]: E0219 21:49:56.360106 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7400eda6-e731-4942-b002-c81dd9a87e6a-kube-api-access-z8sbw podName:7400eda6-e731-4942-b002-c81dd9a87e6a nodeName:}" failed. No retries permitted until 2026-02-19 21:49:57.360092134 +0000 UTC m=+1308.552609998 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-z8sbw" (UniqueName: "kubernetes.io/projected/7400eda6-e731-4942-b002-c81dd9a87e6a-kube-api-access-z8sbw") pod "nova-cell1-e48f-account-create-update-n77q8" (UID: "7400eda6-e731-4942-b002-c81dd9a87e6a") : failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.362958 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-lqp9l"] Feb 19 21:49:56 crc kubenswrapper[4795]: E0219 21:49:56.391433 4795 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 19 21:49:56 crc kubenswrapper[4795]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:0f7943e02fbdd3daec1d3db72fa9396bf37ad3fdd6b0f3119c90e29629e095ed,Command:[/bin/sh -c #!/bin/bash Feb 19 21:49:56 crc kubenswrapper[4795]: Feb 19 21:49:56 crc kubenswrapper[4795]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Feb 19 21:49:56 crc kubenswrapper[4795]: Feb 19 21:49:56 crc kubenswrapper[4795]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Feb 19 21:49:56 crc kubenswrapper[4795]: Feb 19 21:49:56 crc kubenswrapper[4795]: MYSQL_CMD="mysql -h -u root -P 3306" Feb 19 21:49:56 crc kubenswrapper[4795]: Feb 19 21:49:56 crc kubenswrapper[4795]: if [ -n "barbican" ]; then Feb 19 21:49:56 crc kubenswrapper[4795]: GRANT_DATABASE="barbican" Feb 19 21:49:56 crc kubenswrapper[4795]: else Feb 19 21:49:56 crc kubenswrapper[4795]: GRANT_DATABASE="*" Feb 19 21:49:56 crc kubenswrapper[4795]: fi Feb 19 21:49:56 crc kubenswrapper[4795]: Feb 19 21:49:56 crc kubenswrapper[4795]: # going for maximum compatibility here: Feb 19 21:49:56 crc kubenswrapper[4795]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Feb 19 21:49:56 crc kubenswrapper[4795]: # 2. 
MariaDB has "CREATE OR REPLACE", but MySQL does not Feb 19 21:49:56 crc kubenswrapper[4795]: # 3. create user with CREATE but then do all password and TLS with ALTER to Feb 19 21:49:56 crc kubenswrapper[4795]: # support updates Feb 19 21:49:56 crc kubenswrapper[4795]: Feb 19 21:49:56 crc kubenswrapper[4795]: $MYSQL_CMD < logger="UnhandledError" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.391498 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-lqp9l"] Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.393014 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74df8ac0-77f2-4e8d-aa39-d05dc6ce7190-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "74df8ac0-77f2-4e8d-aa39-d05dc6ce7190" (UID: "74df8ac0-77f2-4e8d-aa39-d05dc6ce7190"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:49:56 crc kubenswrapper[4795]: E0219 21:49:56.393210 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"barbican-db-secret\\\" not found\"" pod="openstack/barbican-d602-account-create-update-lcd8k" podUID="10e13a52-b0f3-447a-b47e-2c4dd50d6400" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.431804 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74df8ac0-77f2-4e8d-aa39-d05dc6ce7190-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "74df8ac0-77f2-4e8d-aa39-d05dc6ce7190" (UID: "74df8ac0-77f2-4e8d-aa39-d05dc6ce7190"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.456667 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-c741-account-create-update-26ljt"] Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.460147 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/74df8ac0-77f2-4e8d-aa39-d05dc6ce7190-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.460188 4795 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/74df8ac0-77f2-4e8d-aa39-d05dc6ce7190-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.472725 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-snb69"] Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.472802 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d" containerName="rabbitmq" containerID="cri-o://5a6b19520891e7087129c9dfe002592444a956d213365be36d88dc721e7adc6e" gracePeriod=604800 Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.482515 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-7cd95cf589-2gw48"] Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.482903 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-7cd95cf589-2gw48" podUID="250e9cae-06d9-44da-88af-239d15356a3c" containerName="barbican-worker-log" containerID="cri-o://0d1ad96e830846d9034790e753be1811679f1bbf15af5f12e70081c7ae374cfd" gracePeriod=30 Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.483842 4795 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/barbican-worker-7cd95cf589-2gw48" podUID="250e9cae-06d9-44da-88af-239d15356a3c" containerName="barbican-worker" containerID="cri-o://f1227cc577cca2da8d7067560947f94ac089651b67e263368202e928196a7bc8" gracePeriod=30 Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.496425 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-snb69"] Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.503391 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-9f51-account-create-update-z87p8"] Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.510336 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74df8ac0-77f2-4e8d-aa39-d05dc6ce7190-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "74df8ac0-77f2-4e8d-aa39-d05dc6ce7190" (UID: "74df8ac0-77f2-4e8d-aa39-d05dc6ce7190"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.516144 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6f98bf9994-pr48x"] Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.516711 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6f98bf9994-pr48x" podUID="e4a6a069-904a-4072-b98c-346f67f22def" containerName="barbican-api-log" containerID="cri-o://d4e7caf152ae88ec057977570b5950355e310fd918c2c78401ccea557c71b632" gracePeriod=30 Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.517187 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6f98bf9994-pr48x" podUID="e4a6a069-904a-4072-b98c-346f67f22def" containerName="barbican-api" containerID="cri-o://a6cf86c6cb492d1373f389e62b5d8e8687a30a53e6a19fc785851a2a8a42b429" gracePeriod=30 Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.529385 4795 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/glance-db-create-7ktnd"] Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.565322 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/74df8ac0-77f2-4e8d-aa39-d05dc6ce7190-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.570803 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-7ktnd"] Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.576094 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_3c5a8678-8ce2-4bee-9160-37b1dea9f897/ovsdbserver-nb/0.log" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.576150 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.600945 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_3c2bcb9c-07d3-4d71-924b-aacd537e3430/ovsdbserver-sb/0.log" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.601004 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.606982 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-tl5hf" podUID="9a19676c-9314-43a3-a2f8-bcf56d6b5ce3" containerName="ovs-vswitchd" containerID="cri-o://ebc73fca22ccd7aa64a424b1eb2e6d8556b809c91bf36bb6c2bbe679d1c5f3bf" gracePeriod=30 Feb 19 21:49:56 crc kubenswrapper[4795]: E0219 21:49:56.608398 4795 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 19 21:49:56 crc kubenswrapper[4795]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:0f7943e02fbdd3daec1d3db72fa9396bf37ad3fdd6b0f3119c90e29629e095ed,Command:[/bin/sh -c #!/bin/bash Feb 19 21:49:56 crc kubenswrapper[4795]: Feb 19 21:49:56 crc kubenswrapper[4795]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Feb 19 21:49:56 crc kubenswrapper[4795]: Feb 19 21:49:56 crc kubenswrapper[4795]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Feb 19 21:49:56 crc kubenswrapper[4795]: Feb 19 21:49:56 crc kubenswrapper[4795]: MYSQL_CMD="mysql -h -u root -P 3306" Feb 19 21:49:56 crc kubenswrapper[4795]: Feb 19 21:49:56 crc kubenswrapper[4795]: if [ -n "placement" ]; then Feb 19 21:49:56 crc kubenswrapper[4795]: GRANT_DATABASE="placement" Feb 19 21:49:56 crc kubenswrapper[4795]: else Feb 19 21:49:56 crc kubenswrapper[4795]: GRANT_DATABASE="*" Feb 19 21:49:56 crc kubenswrapper[4795]: fi Feb 19 21:49:56 crc kubenswrapper[4795]: Feb 19 21:49:56 crc kubenswrapper[4795]: # going for maximum compatibility here: Feb 19 21:49:56 crc kubenswrapper[4795]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Feb 19 21:49:56 crc kubenswrapper[4795]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Feb 19 21:49:56 crc kubenswrapper[4795]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Feb 19 21:49:56 crc kubenswrapper[4795]: # support updates Feb 19 21:49:56 crc kubenswrapper[4795]: Feb 19 21:49:56 crc kubenswrapper[4795]: $MYSQL_CMD < logger="UnhandledError" Feb 19 21:49:56 crc kubenswrapper[4795]: E0219 21:49:56.612887 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"placement-db-secret\\\" not found\"" pod="openstack/placement-c741-account-create-update-26ljt" podUID="3e65bdd0-b6ac-406d-bc79-ade76397295e" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.614217 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-6b6d98fbd4-svzc8"] Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.614506 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-6b6d98fbd4-svzc8" podUID="4532c069-4eb7-48ab-b575-b6a130e2b438" containerName="barbican-keystone-listener-log" containerID="cri-o://374356bcc0d8632ce08f62d54e2492534b8e8fc2a40b07f92483295295d884ef" gracePeriod=30 Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.614633 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-6b6d98fbd4-svzc8" podUID="4532c069-4eb7-48ab-b575-b6a130e2b438" containerName="barbican-keystone-listener" containerID="cri-o://b139002ae58e716fbddbfc8a4fe687ac4f1df52f0c26c4b6314da64bed2d0075" gracePeriod=30 Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.620771 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.661651 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-f769-account-create-update-8k7r2"] Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.669750 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.670074 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107" containerName="nova-metadata-log" containerID="cri-o://a92c3489fb5665772be143ca85a31a0e43e2db5654864e03970155d992501b37" gracePeriod=30 Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.670247 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107" containerName="nova-metadata-metadata" containerID="cri-o://11d0871b61a89a3e5545d5527d867256cc43829525a32af590e7a61ae24edec5" gracePeriod=30 Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.693333 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-2d62-account-create-update-4vs7v"] Feb 19 21:49:56 crc kubenswrapper[4795]: E0219 21:49:56.696509 4795 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Feb 19 21:49:56 crc kubenswrapper[4795]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Feb 19 21:49:56 crc kubenswrapper[4795]: + source /usr/local/bin/container-scripts/functions Feb 19 21:49:56 crc kubenswrapper[4795]: ++ OVNBridge=br-int Feb 19 21:49:56 crc kubenswrapper[4795]: ++ OVNRemote=tcp:localhost:6642 Feb 19 21:49:56 crc kubenswrapper[4795]: ++ OVNEncapType=geneve Feb 19 21:49:56 crc kubenswrapper[4795]: ++ OVNAvailabilityZones= Feb 19 21:49:56 crc kubenswrapper[4795]: ++ 
EnableChassisAsGateway=true Feb 19 21:49:56 crc kubenswrapper[4795]: ++ PhysicalNetworks= Feb 19 21:49:56 crc kubenswrapper[4795]: ++ OVNHostName= Feb 19 21:49:56 crc kubenswrapper[4795]: ++ DB_FILE=/etc/openvswitch/conf.db Feb 19 21:49:56 crc kubenswrapper[4795]: ++ ovs_dir=/var/lib/openvswitch Feb 19 21:49:56 crc kubenswrapper[4795]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Feb 19 21:49:56 crc kubenswrapper[4795]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Feb 19 21:49:56 crc kubenswrapper[4795]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Feb 19 21:49:56 crc kubenswrapper[4795]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 19 21:49:56 crc kubenswrapper[4795]: + sleep 0.5 Feb 19 21:49:56 crc kubenswrapper[4795]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 19 21:49:56 crc kubenswrapper[4795]: + cleanup_ovsdb_server_semaphore Feb 19 21:49:56 crc kubenswrapper[4795]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Feb 19 21:49:56 crc kubenswrapper[4795]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Feb 19 21:49:56 crc kubenswrapper[4795]: > execCommand=["/usr/local/bin/container-scripts/stop-ovsdb-server.sh"] containerName="ovsdb-server" pod="openstack/ovn-controller-ovs-tl5hf" message=< Feb 19 21:49:56 crc kubenswrapper[4795]: Exiting ovsdb-server (5) [ OK ] Feb 19 21:49:56 crc kubenswrapper[4795]: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Feb 19 21:49:56 crc kubenswrapper[4795]: + source /usr/local/bin/container-scripts/functions Feb 19 21:49:56 crc kubenswrapper[4795]: ++ OVNBridge=br-int Feb 19 21:49:56 crc kubenswrapper[4795]: ++ OVNRemote=tcp:localhost:6642 Feb 19 21:49:56 crc kubenswrapper[4795]: ++ OVNEncapType=geneve Feb 19 21:49:56 crc kubenswrapper[4795]: ++ OVNAvailabilityZones= Feb 19 21:49:56 crc kubenswrapper[4795]: ++ EnableChassisAsGateway=true Feb 19 21:49:56 crc 
kubenswrapper[4795]: ++ PhysicalNetworks= Feb 19 21:49:56 crc kubenswrapper[4795]: ++ OVNHostName= Feb 19 21:49:56 crc kubenswrapper[4795]: ++ DB_FILE=/etc/openvswitch/conf.db Feb 19 21:49:56 crc kubenswrapper[4795]: ++ ovs_dir=/var/lib/openvswitch Feb 19 21:49:56 crc kubenswrapper[4795]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Feb 19 21:49:56 crc kubenswrapper[4795]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Feb 19 21:49:56 crc kubenswrapper[4795]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Feb 19 21:49:56 crc kubenswrapper[4795]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 19 21:49:56 crc kubenswrapper[4795]: + sleep 0.5 Feb 19 21:49:56 crc kubenswrapper[4795]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 19 21:49:56 crc kubenswrapper[4795]: + cleanup_ovsdb_server_semaphore Feb 19 21:49:56 crc kubenswrapper[4795]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Feb 19 21:49:56 crc kubenswrapper[4795]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Feb 19 21:49:56 crc kubenswrapper[4795]: > Feb 19 21:49:56 crc kubenswrapper[4795]: E0219 21:49:56.696543 4795 kuberuntime_container.go:691] "PreStop hook failed" err=< Feb 19 21:49:56 crc kubenswrapper[4795]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Feb 19 21:49:56 crc kubenswrapper[4795]: + source /usr/local/bin/container-scripts/functions Feb 19 21:49:56 crc kubenswrapper[4795]: ++ OVNBridge=br-int Feb 19 21:49:56 crc kubenswrapper[4795]: ++ OVNRemote=tcp:localhost:6642 Feb 19 21:49:56 crc kubenswrapper[4795]: ++ OVNEncapType=geneve Feb 19 21:49:56 crc kubenswrapper[4795]: ++ OVNAvailabilityZones= Feb 19 21:49:56 crc kubenswrapper[4795]: ++ EnableChassisAsGateway=true Feb 19 21:49:56 crc kubenswrapper[4795]: ++ PhysicalNetworks= Feb 19 21:49:56 crc 
kubenswrapper[4795]: ++ OVNHostName= Feb 19 21:49:56 crc kubenswrapper[4795]: ++ DB_FILE=/etc/openvswitch/conf.db Feb 19 21:49:56 crc kubenswrapper[4795]: ++ ovs_dir=/var/lib/openvswitch Feb 19 21:49:56 crc kubenswrapper[4795]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Feb 19 21:49:56 crc kubenswrapper[4795]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Feb 19 21:49:56 crc kubenswrapper[4795]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Feb 19 21:49:56 crc kubenswrapper[4795]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 19 21:49:56 crc kubenswrapper[4795]: + sleep 0.5 Feb 19 21:49:56 crc kubenswrapper[4795]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 19 21:49:56 crc kubenswrapper[4795]: + cleanup_ovsdb_server_semaphore Feb 19 21:49:56 crc kubenswrapper[4795]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Feb 19 21:49:56 crc kubenswrapper[4795]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Feb 19 21:49:56 crc kubenswrapper[4795]: > pod="openstack/ovn-controller-ovs-tl5hf" podUID="9a19676c-9314-43a3-a2f8-bcf56d6b5ce3" containerName="ovsdb-server" containerID="cri-o://e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.696575 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-tl5hf" podUID="9a19676c-9314-43a3-a2f8-bcf56d6b5ce3" containerName="ovsdb-server" containerID="cri-o://e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23" gracePeriod=30 Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.705883 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.718683 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 19 21:49:56 crc 
kubenswrapper[4795]: I0219 21:49:56.718960 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="793bbadc-8b53-4084-a63a-0b76b37284df" containerName="nova-api-log" containerID="cri-o://4a55d98b95e49937bea845358fb6e967ce1fb52ec526e4696c5d439369df7681" gracePeriod=30 Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.719631 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="793bbadc-8b53-4084-a63a-0b76b37284df" containerName="nova-api-api" containerID="cri-o://5b12dfe428f5ad6bc0c6c39c5790008897c140f405635c55ad4d8157a86e3f70" gracePeriod=30 Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.741108 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-h72xz"] Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.756781 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-1922-account-create-update-gzkhl"] Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.768895 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3c5a8678-8ce2-4bee-9160-37b1dea9f897-ovsdb-rundir\") pod \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\" (UID: \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\") " Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.769646 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c5a8678-8ce2-4bee-9160-37b1dea9f897-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "3c5a8678-8ce2-4bee-9160-37b1dea9f897" (UID: "3c5a8678-8ce2-4bee-9160-37b1dea9f897"). InnerVolumeSpecName "ovsdb-rundir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.768942 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c5a8678-8ce2-4bee-9160-37b1dea9f897-ovsdbserver-nb-tls-certs\") pod \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\" (UID: \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\") " Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.769928 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c5a8678-8ce2-4bee-9160-37b1dea9f897-config\") pod \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\" (UID: \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\") " Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.769955 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbtv2\" (UniqueName: \"kubernetes.io/projected/336beec4-e534-448f-8367-78645b53650e-kube-api-access-bbtv2\") pod \"336beec4-e534-448f-8367-78645b53650e\" (UID: \"336beec4-e534-448f-8367-78645b53650e\") " Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.770035 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c5a8678-8ce2-4bee-9160-37b1dea9f897-metrics-certs-tls-certs\") pod \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\" (UID: \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\") " Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.770099 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c5a8678-8ce2-4bee-9160-37b1dea9f897-combined-ca-bundle\") pod \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\" (UID: \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\") " Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.770126 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3c2bcb9c-07d3-4d71-924b-aacd537e3430-ovsdb-rundir\") pod \"3c2bcb9c-07d3-4d71-924b-aacd537e3430\" (UID: \"3c2bcb9c-07d3-4d71-924b-aacd537e3430\") " Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.770155 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czm58\" (UniqueName: \"kubernetes.io/projected/3c5a8678-8ce2-4bee-9160-37b1dea9f897-kube-api-access-czm58\") pod \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\" (UID: \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\") " Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.770280 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c2bcb9c-07d3-4d71-924b-aacd537e3430-metrics-certs-tls-certs\") pod \"3c2bcb9c-07d3-4d71-924b-aacd537e3430\" (UID: \"3c2bcb9c-07d3-4d71-924b-aacd537e3430\") " Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.770302 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\" (UID: \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\") " Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.770329 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/336beec4-e534-448f-8367-78645b53650e-openstack-config\") pod \"336beec4-e534-448f-8367-78645b53650e\" (UID: \"336beec4-e534-448f-8367-78645b53650e\") " Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.770367 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c2bcb9c-07d3-4d71-924b-aacd537e3430-config\") pod \"3c2bcb9c-07d3-4d71-924b-aacd537e3430\" (UID: \"3c2bcb9c-07d3-4d71-924b-aacd537e3430\") " Feb 19 21:49:56 crc 
kubenswrapper[4795]: I0219 21:49:56.770388 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3c2bcb9c-07d3-4d71-924b-aacd537e3430-scripts\") pod \"3c2bcb9c-07d3-4d71-924b-aacd537e3430\" (UID: \"3c2bcb9c-07d3-4d71-924b-aacd537e3430\") " Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.770440 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"3c2bcb9c-07d3-4d71-924b-aacd537e3430\" (UID: \"3c2bcb9c-07d3-4d71-924b-aacd537e3430\") " Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.770477 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/336beec4-e534-448f-8367-78645b53650e-combined-ca-bundle\") pod \"336beec4-e534-448f-8367-78645b53650e\" (UID: \"336beec4-e534-448f-8367-78645b53650e\") " Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.770507 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c2bcb9c-07d3-4d71-924b-aacd537e3430-combined-ca-bundle\") pod \"3c2bcb9c-07d3-4d71-924b-aacd537e3430\" (UID: \"3c2bcb9c-07d3-4d71-924b-aacd537e3430\") " Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.770523 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c2bcb9c-07d3-4d71-924b-aacd537e3430-ovsdbserver-sb-tls-certs\") pod \"3c2bcb9c-07d3-4d71-924b-aacd537e3430\" (UID: \"3c2bcb9c-07d3-4d71-924b-aacd537e3430\") " Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.770551 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwcsv\" (UniqueName: 
\"kubernetes.io/projected/3c2bcb9c-07d3-4d71-924b-aacd537e3430-kube-api-access-fwcsv\") pod \"3c2bcb9c-07d3-4d71-924b-aacd537e3430\" (UID: \"3c2bcb9c-07d3-4d71-924b-aacd537e3430\") " Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.770573 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/336beec4-e534-448f-8367-78645b53650e-openstack-config-secret\") pod \"336beec4-e534-448f-8367-78645b53650e\" (UID: \"336beec4-e534-448f-8367-78645b53650e\") " Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.770598 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3c5a8678-8ce2-4bee-9160-37b1dea9f897-scripts\") pod \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\" (UID: \"3c5a8678-8ce2-4bee-9160-37b1dea9f897\") " Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.770996 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c5a8678-8ce2-4bee-9160-37b1dea9f897-config" (OuterVolumeSpecName: "config") pod "3c5a8678-8ce2-4bee-9160-37b1dea9f897" (UID: "3c5a8678-8ce2-4bee-9160-37b1dea9f897"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.772271 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3c5a8678-8ce2-4bee-9160-37b1dea9f897-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.772309 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c5a8678-8ce2-4bee-9160-37b1dea9f897-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.774443 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c2bcb9c-07d3-4d71-924b-aacd537e3430-config" (OuterVolumeSpecName: "config") pod "3c2bcb9c-07d3-4d71-924b-aacd537e3430" (UID: "3c2bcb9c-07d3-4d71-924b-aacd537e3430"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.776407 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-h72xz"] Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.811730 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c2bcb9c-07d3-4d71-924b-aacd537e3430-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "3c2bcb9c-07d3-4d71-924b-aacd537e3430" (UID: "3c2bcb9c-07d3-4d71-924b-aacd537e3430"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.820212 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c2bcb9c-07d3-4d71-924b-aacd537e3430-kube-api-access-fwcsv" (OuterVolumeSpecName: "kube-api-access-fwcsv") pod "3c2bcb9c-07d3-4d71-924b-aacd537e3430" (UID: "3c2bcb9c-07d3-4d71-924b-aacd537e3430"). InnerVolumeSpecName "kube-api-access-fwcsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.828951 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c2bcb9c-07d3-4d71-924b-aacd537e3430-scripts" (OuterVolumeSpecName: "scripts") pod "3c2bcb9c-07d3-4d71-924b-aacd537e3430" (UID: "3c2bcb9c-07d3-4d71-924b-aacd537e3430"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.829322 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-p7s8n"] Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.828617 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "3c2bcb9c-07d3-4d71-924b-aacd537e3430" (UID: "3c2bcb9c-07d3-4d71-924b-aacd537e3430"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.830805 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c5a8678-8ce2-4bee-9160-37b1dea9f897-scripts" (OuterVolumeSpecName: "scripts") pod "3c5a8678-8ce2-4bee-9160-37b1dea9f897" (UID: "3c5a8678-8ce2-4bee-9160-37b1dea9f897"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.878288 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c2bcb9c-07d3-4d71-924b-aacd537e3430-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.878345 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3c2bcb9c-07d3-4d71-924b-aacd537e3430-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.878371 4795 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.878401 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwcsv\" (UniqueName: \"kubernetes.io/projected/3c2bcb9c-07d3-4d71-924b-aacd537e3430-kube-api-access-fwcsv\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.878412 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3c5a8678-8ce2-4bee-9160-37b1dea9f897-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.878421 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3c2bcb9c-07d3-4d71-924b-aacd537e3430-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.882910 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "3c5a8678-8ce2-4bee-9160-37b1dea9f897" (UID: "3c5a8678-8ce2-4bee-9160-37b1dea9f897"). InnerVolumeSpecName "local-storage12-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.883061 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c5a8678-8ce2-4bee-9160-37b1dea9f897-kube-api-access-czm58" (OuterVolumeSpecName: "kube-api-access-czm58") pod "3c5a8678-8ce2-4bee-9160-37b1dea9f897" (UID: "3c5a8678-8ce2-4bee-9160-37b1dea9f897"). InnerVolumeSpecName "kube-api-access-czm58". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.883153 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/336beec4-e534-448f-8367-78645b53650e-kube-api-access-bbtv2" (OuterVolumeSpecName: "kube-api-access-bbtv2") pod "336beec4-e534-448f-8367-78645b53650e" (UID: "336beec4-e534-448f-8367-78645b53650e"). InnerVolumeSpecName "kube-api-access-bbtv2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.931647 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c2bcb9c-07d3-4d71-924b-aacd537e3430-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3c2bcb9c-07d3-4d71-924b-aacd537e3430" (UID: "3c2bcb9c-07d3-4d71-924b-aacd537e3430"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.932538 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-e48f-account-create-update-n77q8"] Feb 19 21:49:56 crc kubenswrapper[4795]: E0219 21:49:56.933282 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-z8sbw operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/nova-cell1-e48f-account-create-update-n77q8" podUID="7400eda6-e731-4942-b002-c81dd9a87e6a" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.939764 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/336beec4-e534-448f-8367-78645b53650e-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "336beec4-e534-448f-8367-78645b53650e" (UID: "336beec4-e534-448f-8367-78645b53650e"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.947091 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-r8v4f"] Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.951426 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-p7s8n"] Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.957381 4795 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.965211 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-r8v4f"] Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.968338 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c5a8678-8ce2-4bee-9160-37b1dea9f897-combined-ca-bundle" (OuterVolumeSpecName: 
"combined-ca-bundle") pod "3c5a8678-8ce2-4bee-9160-37b1dea9f897" (UID: "3c5a8678-8ce2-4bee-9160-37b1dea9f897"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.971955 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-d602-account-create-update-lcd8k"] Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.972052 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/336beec4-e534-448f-8367-78645b53650e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "336beec4-e534-448f-8367-78645b53650e" (UID: "336beec4-e534-448f-8367-78645b53650e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.978555 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.978765 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="0adadcd9-8949-443b-8042-d0d11191eae9" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://0f773609fa3d94e54bc6067d53960a0ec55209c713b15b0460c6c5287772d27f" gracePeriod=30 Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.980414 4795 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.980433 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/336beec4-e534-448f-8367-78645b53650e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.980442 4795 reconciler_common.go:293] "Volume detached for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c2bcb9c-07d3-4d71-924b-aacd537e3430-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.980450 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bbtv2\" (UniqueName: \"kubernetes.io/projected/336beec4-e534-448f-8367-78645b53650e-kube-api-access-bbtv2\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.980458 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c5a8678-8ce2-4bee-9160-37b1dea9f897-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.980466 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czm58\" (UniqueName: \"kubernetes.io/projected/3c5a8678-8ce2-4bee-9160-37b1dea9f897-kube-api-access-czm58\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.980492 4795 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.980503 4795 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/336beec4-e534-448f-8367-78645b53650e-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:56 crc kubenswrapper[4795]: I0219 21:49:56.986981 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:56.995892 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-c741-account-create-update-26ljt"] Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.011628 4795 operation_generator.go:917] UnmountDevice succeeded for volume 
"local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.023943 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-p9cs4_eee0ea5d-4b43-4421-b23e-555c5eac3564/openstack-network-exporter/0.log" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.024007 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-p9cs4" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.025319 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/336beec4-e534-448f-8367-78645b53650e-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "336beec4-e534-448f-8367-78645b53650e" (UID: "336beec4-e534-448f-8367-78645b53650e"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.027922 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c2bcb9c-07d3-4d71-924b-aacd537e3430-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "3c2bcb9c-07d3-4d71-924b-aacd537e3430" (UID: "3c2bcb9c-07d3-4d71-924b-aacd537e3430"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.055344 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="7b096325-542d-4ac6-8d16-8aa0937013b2" containerName="rabbitmq" containerID="cri-o://65d80d81718a8cacc6c1500df8f269c37262bad204c9b2124ccbee151d27a66a" gracePeriod=604800 Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.055897 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-gxh8d"] Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.060006 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c5a8678-8ce2-4bee-9160-37b1dea9f897-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "3c5a8678-8ce2-4bee-9160-37b1dea9f897" (UID: "3c5a8678-8ce2-4bee-9160-37b1dea9f897"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:57 crc kubenswrapper[4795]: E0219 21:49:57.063639 4795 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 19 21:49:57 crc kubenswrapper[4795]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:0f7943e02fbdd3daec1d3db72fa9396bf37ad3fdd6b0f3119c90e29629e095ed,Command:[/bin/sh -c #!/bin/bash Feb 19 21:49:57 crc kubenswrapper[4795]: Feb 19 21:49:57 crc kubenswrapper[4795]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Feb 19 21:49:57 crc kubenswrapper[4795]: Feb 19 21:49:57 crc kubenswrapper[4795]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Feb 19 21:49:57 crc kubenswrapper[4795]: Feb 19 21:49:57 crc kubenswrapper[4795]: MYSQL_CMD="mysql -h -u root -P 3306" Feb 19 21:49:57 crc kubenswrapper[4795]: Feb 19 21:49:57 crc kubenswrapper[4795]: if [ -n "cinder" ]; then Feb 19 21:49:57 crc 
kubenswrapper[4795]: GRANT_DATABASE="cinder" Feb 19 21:49:57 crc kubenswrapper[4795]: else Feb 19 21:49:57 crc kubenswrapper[4795]: GRANT_DATABASE="*" Feb 19 21:49:57 crc kubenswrapper[4795]: fi Feb 19 21:49:57 crc kubenswrapper[4795]: Feb 19 21:49:57 crc kubenswrapper[4795]: # going for maximum compatibility here: Feb 19 21:49:57 crc kubenswrapper[4795]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Feb 19 21:49:57 crc kubenswrapper[4795]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Feb 19 21:49:57 crc kubenswrapper[4795]: # 3. create user with CREATE but then do all password and TLS with ALTER to Feb 19 21:49:57 crc kubenswrapper[4795]: # support updates Feb 19 21:49:57 crc kubenswrapper[4795]: Feb 19 21:49:57 crc kubenswrapper[4795]: $MYSQL_CMD < logger="UnhandledError" Feb 19 21:49:57 crc kubenswrapper[4795]: E0219 21:49:57.064853 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"cinder-db-secret\\\" not found\"" pod="openstack/cinder-9f51-account-create-update-z87p8" podUID="b0500ca0-0cef-4b76-9c78-cb2189b520ff" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.066861 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.067450 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="f6fd7841-2a08-4786-8e96-b2ab0f477eff" containerName="nova-cell1-conductor-conductor" containerID="cri-o://4f511a80129a9b198e5be1216670e966f0623d26f1c7d17cc3a99a813e69b1c3" gracePeriod=30 Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.074466 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-gxh8d"] Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.076925 4795 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c5a8678-8ce2-4bee-9160-37b1dea9f897-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "3c5a8678-8ce2-4bee-9160-37b1dea9f897" (UID: "3c5a8678-8ce2-4bee-9160-37b1dea9f897"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.082383 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.082616 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="57b83043-2f7c-4b55-a2b9-66eef96f0008" containerName="nova-cell0-conductor-conductor" containerID="cri-o://4f011f521b748247939cf0cf1c345e321c4287fa532812c9357da77b932cf4c6" gracePeriod=30 Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.086600 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eee0ea5d-4b43-4421-b23e-555c5eac3564-combined-ca-bundle\") pod \"eee0ea5d-4b43-4421-b23e-555c5eac3564\" (UID: \"eee0ea5d-4b43-4421-b23e-555c5eac3564\") " Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.086659 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/eee0ea5d-4b43-4421-b23e-555c5eac3564-metrics-certs-tls-certs\") pod \"eee0ea5d-4b43-4421-b23e-555c5eac3564\" (UID: \"eee0ea5d-4b43-4421-b23e-555c5eac3564\") " Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.086682 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eee0ea5d-4b43-4421-b23e-555c5eac3564-config\") pod \"eee0ea5d-4b43-4421-b23e-555c5eac3564\" (UID: \"eee0ea5d-4b43-4421-b23e-555c5eac3564\") " Feb 19 21:49:57 crc 
kubenswrapper[4795]: I0219 21:49:57.086748 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/eee0ea5d-4b43-4421-b23e-555c5eac3564-ovs-rundir\") pod \"eee0ea5d-4b43-4421-b23e-555c5eac3564\" (UID: \"eee0ea5d-4b43-4421-b23e-555c5eac3564\") " Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.086805 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/eee0ea5d-4b43-4421-b23e-555c5eac3564-ovn-rundir\") pod \"eee0ea5d-4b43-4421-b23e-555c5eac3564\" (UID: \"eee0ea5d-4b43-4421-b23e-555c5eac3564\") " Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.086912 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6wdlj\" (UniqueName: \"kubernetes.io/projected/eee0ea5d-4b43-4421-b23e-555c5eac3564-kube-api-access-6wdlj\") pod \"eee0ea5d-4b43-4421-b23e-555c5eac3564\" (UID: \"eee0ea5d-4b43-4421-b23e-555c5eac3564\") " Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.087368 4795 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.087388 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c2bcb9c-07d3-4d71-924b-aacd537e3430-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.087399 4795 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/336beec4-e534-448f-8367-78645b53650e-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.087407 4795 reconciler_common.go:293] "Volume detached for volume 
\"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c5a8678-8ce2-4bee-9160-37b1dea9f897-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.087416 4795 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c5a8678-8ce2-4bee-9160-37b1dea9f897-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:57 crc kubenswrapper[4795]: E0219 21:49:57.087471 4795 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Feb 19 21:49:57 crc kubenswrapper[4795]: E0219 21:49:57.087506 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7b096325-542d-4ac6-8d16-8aa0937013b2-config-data podName:7b096325-542d-4ac6-8d16-8aa0937013b2 nodeName:}" failed. No retries permitted until 2026-02-19 21:49:59.087492859 +0000 UTC m=+1310.280010723 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/7b096325-542d-4ac6-8d16-8aa0937013b2-config-data") pod "rabbitmq-server-0" (UID: "7b096325-542d-4ac6-8d16-8aa0937013b2") : configmap "rabbitmq-config-data" not found Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.087702 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-7ssdv"] Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.087847 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eee0ea5d-4b43-4421-b23e-555c5eac3564-ovs-rundir" (OuterVolumeSpecName: "ovs-rundir") pod "eee0ea5d-4b43-4421-b23e-555c5eac3564" (UID: "eee0ea5d-4b43-4421-b23e-555c5eac3564"). InnerVolumeSpecName "ovs-rundir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.092418 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eee0ea5d-4b43-4421-b23e-555c5eac3564-config" (OuterVolumeSpecName: "config") pod "eee0ea5d-4b43-4421-b23e-555c5eac3564" (UID: "eee0ea5d-4b43-4421-b23e-555c5eac3564"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.092851 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eee0ea5d-4b43-4421-b23e-555c5eac3564-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "eee0ea5d-4b43-4421-b23e-555c5eac3564" (UID: "eee0ea5d-4b43-4421-b23e-555c5eac3564"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.097000 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-7ssdv"] Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.103871 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-9f51-account-create-update-z87p8"] Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.107444 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eee0ea5d-4b43-4421-b23e-555c5eac3564-kube-api-access-6wdlj" (OuterVolumeSpecName: "kube-api-access-6wdlj") pod "eee0ea5d-4b43-4421-b23e-555c5eac3564" (UID: "eee0ea5d-4b43-4421-b23e-555c5eac3564"). InnerVolumeSpecName "kube-api-access-6wdlj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.110636 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-9f51-account-create-update-z87p8" event={"ID":"b0500ca0-0cef-4b76-9c78-cb2189b520ff","Type":"ContainerStarted","Data":"72ececc667f06320402ffe76c00f9ed550ee45b4a8e93c91bfe4b2261f921f68"} Feb 19 21:49:57 crc kubenswrapper[4795]: E0219 21:49:57.139736 4795 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 19 21:49:57 crc kubenswrapper[4795]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:0f7943e02fbdd3daec1d3db72fa9396bf37ad3fdd6b0f3119c90e29629e095ed,Command:[/bin/sh -c #!/bin/bash Feb 19 21:49:57 crc kubenswrapper[4795]: Feb 19 21:49:57 crc kubenswrapper[4795]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Feb 19 21:49:57 crc kubenswrapper[4795]: Feb 19 21:49:57 crc kubenswrapper[4795]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Feb 19 21:49:57 crc kubenswrapper[4795]: Feb 19 21:49:57 crc kubenswrapper[4795]: MYSQL_CMD="mysql -h -u root -P 3306" Feb 19 21:49:57 crc kubenswrapper[4795]: Feb 19 21:49:57 crc kubenswrapper[4795]: if [ -n "glance" ]; then Feb 19 21:49:57 crc kubenswrapper[4795]: GRANT_DATABASE="glance" Feb 19 21:49:57 crc kubenswrapper[4795]: else Feb 19 21:49:57 crc kubenswrapper[4795]: GRANT_DATABASE="*" Feb 19 21:49:57 crc kubenswrapper[4795]: fi Feb 19 21:49:57 crc kubenswrapper[4795]: Feb 19 21:49:57 crc kubenswrapper[4795]: # going for maximum compatibility here: Feb 19 21:49:57 crc kubenswrapper[4795]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Feb 19 21:49:57 crc kubenswrapper[4795]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Feb 19 21:49:57 crc kubenswrapper[4795]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Feb 19 21:49:57 crc kubenswrapper[4795]: # support updates Feb 19 21:49:57 crc kubenswrapper[4795]: Feb 19 21:49:57 crc kubenswrapper[4795]: $MYSQL_CMD < logger="UnhandledError" Feb 19 21:49:57 crc kubenswrapper[4795]: E0219 21:49:57.140885 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"glance-db-secret\\\" not found\"" pod="openstack/glance-f769-account-create-update-8k7r2" podUID="8eaa69df-d563-4dc0-8a78-40413946cbca" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.144703 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.144942 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="f2710b23-7a5c-44cb-b916-9e08edc59636" containerName="nova-scheduler-scheduler" containerID="cri-o://f6bd473cfa8f83b5b9a223a373e83766b77d9b66d6e769678f44f8ca837142b6" gracePeriod=30 Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.145882 4795 generic.go:334] "Generic (PLEG): container finished" podID="e4a6a069-904a-4072-b98c-346f67f22def" containerID="d4e7caf152ae88ec057977570b5950355e310fd918c2c78401ccea557c71b632" exitCode=143 Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.145982 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6f98bf9994-pr48x" event={"ID":"e4a6a069-904a-4072-b98c-346f67f22def","Type":"ContainerDied","Data":"d4e7caf152ae88ec057977570b5950355e310fd918c2c78401ccea557c71b632"} Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.149382 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c2bcb9c-07d3-4d71-924b-aacd537e3430-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod 
"3c2bcb9c-07d3-4d71-924b-aacd537e3430" (UID: "3c2bcb9c-07d3-4d71-924b-aacd537e3430"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.149486 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eee0ea5d-4b43-4421-b23e-555c5eac3564-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eee0ea5d-4b43-4421-b23e-555c5eac3564" (UID: "eee0ea5d-4b43-4421-b23e-555c5eac3564"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.162957 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-f769-account-create-update-8k7r2"] Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.167272 4795 generic.go:334] "Generic (PLEG): container finished" podID="30f2c894-7a7a-4e5a-a090-a28ab50c766a" containerID="fe9cf6c1b6f422c771781be2cfb21deabc54071e241ab6b5c24d3d83414ee4ec" exitCode=0 Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.167339 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-576c65f985-r97z7" event={"ID":"30f2c894-7a7a-4e5a-a090-a28ab50c766a","Type":"ContainerDied","Data":"fe9cf6c1b6f422c771781be2cfb21deabc54071e241ab6b5c24d3d83414ee4ec"} Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.181889 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d" containerName="galera" containerID="cri-o://ae770d1785f3d5deaa5b1b98adf315fd8a61a2cbce434ecb3970ab496579b196" gracePeriod=30 Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.192622 4795 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c2bcb9c-07d3-4d71-924b-aacd537e3430-metrics-certs-tls-certs\") on node \"crc\" DevicePath 
\"\"" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.192649 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6wdlj\" (UniqueName: \"kubernetes.io/projected/eee0ea5d-4b43-4421-b23e-555c5eac3564-kube-api-access-6wdlj\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.192658 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eee0ea5d-4b43-4421-b23e-555c5eac3564-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.192667 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eee0ea5d-4b43-4421-b23e-555c5eac3564-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.192675 4795 reconciler_common.go:293] "Volume detached for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/eee0ea5d-4b43-4421-b23e-555c5eac3564-ovs-rundir\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.192683 4795 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/eee0ea5d-4b43-4421-b23e-555c5eac3564-ovn-rundir\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.193474 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_3c5a8678-8ce2-4bee-9160-37b1dea9f897/ovsdbserver-nb/0.log" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.193555 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"3c5a8678-8ce2-4bee-9160-37b1dea9f897","Type":"ContainerDied","Data":"f3cf06abd80bf98f00707db38827bef51395b42a731dbd8362aaf2be900fcb70"} Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.193592 4795 scope.go:117] "RemoveContainer" 
containerID="9afa6d2d9c1c3abf2795f79e56d0f2c85700d5b1b8b011dec52725e8bbc63d6b" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.193709 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.222634 4795 generic.go:334] "Generic (PLEG): container finished" podID="250e9cae-06d9-44da-88af-239d15356a3c" containerID="f1227cc577cca2da8d7067560947f94ac089651b67e263368202e928196a7bc8" exitCode=0 Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.222665 4795 generic.go:334] "Generic (PLEG): container finished" podID="250e9cae-06d9-44da-88af-239d15356a3c" containerID="0d1ad96e830846d9034790e753be1811679f1bbf15af5f12e70081c7ae374cfd" exitCode=143 Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.222745 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7cd95cf589-2gw48" event={"ID":"250e9cae-06d9-44da-88af-239d15356a3c","Type":"ContainerDied","Data":"f1227cc577cca2da8d7067560947f94ac089651b67e263368202e928196a7bc8"} Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.222773 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7cd95cf589-2gw48" event={"ID":"250e9cae-06d9-44da-88af-239d15356a3c","Type":"ContainerDied","Data":"0d1ad96e830846d9034790e753be1811679f1bbf15af5f12e70081c7ae374cfd"} Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.225526 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-p9cs4_eee0ea5d-4b43-4421-b23e-555c5eac3564/openstack-network-exporter/0.log" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.225664 4795 generic.go:334] "Generic (PLEG): container finished" podID="eee0ea5d-4b43-4421-b23e-555c5eac3564" containerID="3d2f2fb944954b1447f44f88d21764d52d00b1e2faecd4fc556d6cdb1c4d63f4" exitCode=2 Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.225815 4795 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/ovn-controller-metrics-p9cs4" event={"ID":"eee0ea5d-4b43-4421-b23e-555c5eac3564","Type":"ContainerDied","Data":"3d2f2fb944954b1447f44f88d21764d52d00b1e2faecd4fc556d6cdb1c4d63f4"} Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.225932 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-p9cs4" event={"ID":"eee0ea5d-4b43-4421-b23e-555c5eac3564","Type":"ContainerDied","Data":"eaba90113d6ff0b858d733af82b8a4a862659df0d41e63fdc645db66d9298341"} Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.226053 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-p9cs4" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.227639 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eee0ea5d-4b43-4421-b23e-555c5eac3564-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "eee0ea5d-4b43-4421-b23e-555c5eac3564" (UID: "eee0ea5d-4b43-4421-b23e-555c5eac3564"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.230969 4795 generic.go:334] "Generic (PLEG): container finished" podID="d2561f4e-0a01-4927-96f8-ee7bef69f561" containerID="46487241a29d4cc3bff33a03b2f13ce2e328740a30388d55d6c987233cf2d399" exitCode=143 Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.231294 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d2561f4e-0a01-4927-96f8-ee7bef69f561","Type":"ContainerDied","Data":"46487241a29d4cc3bff33a03b2f13ce2e328740a30388d55d6c987233cf2d399"} Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.233642 4795 generic.go:334] "Generic (PLEG): container finished" podID="4532c069-4eb7-48ab-b575-b6a130e2b438" containerID="374356bcc0d8632ce08f62d54e2492534b8e8fc2a40b07f92483295295d884ef" exitCode=143 Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.233681 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6b6d98fbd4-svzc8" event={"ID":"4532c069-4eb7-48ab-b575-b6a130e2b438","Type":"ContainerDied","Data":"374356bcc0d8632ce08f62d54e2492534b8e8fc2a40b07f92483295295d884ef"} Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.234802 4795 generic.go:334] "Generic (PLEG): container finished" podID="e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107" containerID="a92c3489fb5665772be143ca85a31a0e43e2db5654864e03970155d992501b37" exitCode=143 Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.234871 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107","Type":"ContainerDied","Data":"a92c3489fb5665772be143ca85a31a0e43e2db5654864e03970155d992501b37"} Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.238691 4795 generic.go:334] "Generic (PLEG): container finished" podID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerID="bb960073c3b2955d7aa2d18d3eb2e0958e7e98f4cd499d7077f5064d1e43a05e" 
exitCode=0 Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.238714 4795 generic.go:334] "Generic (PLEG): container finished" podID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerID="8d3a569ab5140e595996d2c82fd170ed28aa9420de4fdae36e9b5854b2e0bd5e" exitCode=0 Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.238779 4795 generic.go:334] "Generic (PLEG): container finished" podID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerID="2ce7f7a343a6c79fd57a6b4c7cea8f6f21ccfbc5ea5261b4c592c5cc2035910e" exitCode=0 Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.238789 4795 generic.go:334] "Generic (PLEG): container finished" podID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerID="cb0176a835c07bb843ea9834f19b5792b6d9700c5cd61a140ec8b99a66854f5f" exitCode=0 Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.238796 4795 generic.go:334] "Generic (PLEG): container finished" podID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerID="c8d438309dbe4ee742eb9d7a2b93e755a74c9ba2dd39409dcf7caf84dee6405a" exitCode=0 Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.238802 4795 generic.go:334] "Generic (PLEG): container finished" podID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerID="c5d0640985105b2140d43ecf956f5621f6e82eb5a3d40d95fb3d09d303406c84" exitCode=0 Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.238808 4795 generic.go:334] "Generic (PLEG): container finished" podID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerID="f2b55f40d9ab92e06fdc09f65e72764f7f9c63fbb1f126ede2058624236d001f" exitCode=0 Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.238815 4795 generic.go:334] "Generic (PLEG): container finished" podID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerID="5019fca913d092cd1d004e058553c364ac08007bafeae027b681e3bb6eb59026" exitCode=0 Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.238821 4795 generic.go:334] "Generic (PLEG): container finished" podID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" 
containerID="be1c394688c447ed772b4929317159025a1e97491b40b847644ed369351532b5" exitCode=0 Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.238826 4795 generic.go:334] "Generic (PLEG): container finished" podID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerID="a86db1a02ddab5086097179b35e6f17d71c32f36157625ee70912b23839603d4" exitCode=0 Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.238833 4795 generic.go:334] "Generic (PLEG): container finished" podID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerID="d2281de4777acfa86c800d06ed4c2e0ac8613cf4008b8449cd7089d057ee51ec" exitCode=0 Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.238839 4795 generic.go:334] "Generic (PLEG): container finished" podID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerID="22441008e17545864d7c6366d4ab4fa8333a1c04e36b9961e8fdfdfaeec8b1b6" exitCode=0 Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.238844 4795 generic.go:334] "Generic (PLEG): container finished" podID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerID="b4b0474dd3a4192273fa0b2dc273a792e955cd0dde33e24a1afa65bb56656eaa" exitCode=0 Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.238850 4795 generic.go:334] "Generic (PLEG): container finished" podID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerID="51c55baa52f08bcb95276c0f7a67a7ef348b9bd02a9fc401f50e679f37e0c117" exitCode=0 Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.238881 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6c773ec2-a400-42a9-8784-ed9c295c3bb4","Type":"ContainerDied","Data":"bb960073c3b2955d7aa2d18d3eb2e0958e7e98f4cd499d7077f5064d1e43a05e"} Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.238895 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6c773ec2-a400-42a9-8784-ed9c295c3bb4","Type":"ContainerDied","Data":"8d3a569ab5140e595996d2c82fd170ed28aa9420de4fdae36e9b5854b2e0bd5e"} Feb 19 21:49:57 crc 
kubenswrapper[4795]: I0219 21:49:57.238905 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6c773ec2-a400-42a9-8784-ed9c295c3bb4","Type":"ContainerDied","Data":"2ce7f7a343a6c79fd57a6b4c7cea8f6f21ccfbc5ea5261b4c592c5cc2035910e"} Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.238913 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6c773ec2-a400-42a9-8784-ed9c295c3bb4","Type":"ContainerDied","Data":"cb0176a835c07bb843ea9834f19b5792b6d9700c5cd61a140ec8b99a66854f5f"} Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.238921 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6c773ec2-a400-42a9-8784-ed9c295c3bb4","Type":"ContainerDied","Data":"c8d438309dbe4ee742eb9d7a2b93e755a74c9ba2dd39409dcf7caf84dee6405a"} Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.238929 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6c773ec2-a400-42a9-8784-ed9c295c3bb4","Type":"ContainerDied","Data":"c5d0640985105b2140d43ecf956f5621f6e82eb5a3d40d95fb3d09d303406c84"} Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.238938 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6c773ec2-a400-42a9-8784-ed9c295c3bb4","Type":"ContainerDied","Data":"f2b55f40d9ab92e06fdc09f65e72764f7f9c63fbb1f126ede2058624236d001f"} Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.238946 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6c773ec2-a400-42a9-8784-ed9c295c3bb4","Type":"ContainerDied","Data":"5019fca913d092cd1d004e058553c364ac08007bafeae027b681e3bb6eb59026"} Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.238953 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"6c773ec2-a400-42a9-8784-ed9c295c3bb4","Type":"ContainerDied","Data":"be1c394688c447ed772b4929317159025a1e97491b40b847644ed369351532b5"} Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.238961 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6c773ec2-a400-42a9-8784-ed9c295c3bb4","Type":"ContainerDied","Data":"a86db1a02ddab5086097179b35e6f17d71c32f36157625ee70912b23839603d4"} Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.238969 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6c773ec2-a400-42a9-8784-ed9c295c3bb4","Type":"ContainerDied","Data":"d2281de4777acfa86c800d06ed4c2e0ac8613cf4008b8449cd7089d057ee51ec"} Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.238977 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6c773ec2-a400-42a9-8784-ed9c295c3bb4","Type":"ContainerDied","Data":"22441008e17545864d7c6366d4ab4fa8333a1c04e36b9961e8fdfdfaeec8b1b6"} Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.238984 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6c773ec2-a400-42a9-8784-ed9c295c3bb4","Type":"ContainerDied","Data":"b4b0474dd3a4192273fa0b2dc273a792e955cd0dde33e24a1afa65bb56656eaa"} Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.238992 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6c773ec2-a400-42a9-8784-ed9c295c3bb4","Type":"ContainerDied","Data":"51c55baa52f08bcb95276c0f7a67a7ef348b9bd02a9fc401f50e679f37e0c117"} Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.243143 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_3c2bcb9c-07d3-4d71-924b-aacd537e3430/ovsdbserver-sb/0.log" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.243391 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.243526 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"3c2bcb9c-07d3-4d71-924b-aacd537e3430","Type":"ContainerDied","Data":"0753bcc18c087ec61d4625b239ed921fd6b476f148310ba726f57a4cfa8d345c"} Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.251300 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c741-account-create-update-26ljt" event={"ID":"3e65bdd0-b6ac-406d-bc79-ade76397295e","Type":"ContainerStarted","Data":"75197e9e2a0c3d83dd9ff3b61a9f77e4a300ce7b88d838cd67a1b818ba9ef069"} Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.270802 4795 generic.go:334] "Generic (PLEG): container finished" podID="9a19676c-9314-43a3-a2f8-bcf56d6b5ce3" containerID="e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23" exitCode=0 Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.271059 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tl5hf" event={"ID":"9a19676c-9314-43a3-a2f8-bcf56d6b5ce3","Type":"ContainerDied","Data":"e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23"} Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.281376 4795 generic.go:334] "Generic (PLEG): container finished" podID="b01bcd5b-435a-4702-b0a4-8dfe8f553c23" containerID="16d1434f729c32f7f5af098d1664dafb8ed3d4636079462bf6c45b1454ee08ef" exitCode=143 Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.281473 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6df95dfbd4-ftf6x" event={"ID":"b01bcd5b-435a-4702-b0a4-8dfe8f553c23","Type":"ContainerDied","Data":"16d1434f729c32f7f5af098d1664dafb8ed3d4636079462bf6c45b1454ee08ef"} Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.294964 4795 generic.go:334] "Generic (PLEG): container finished" podID="793bbadc-8b53-4084-a63a-0b76b37284df" 
containerID="4a55d98b95e49937bea845358fb6e967ce1fb52ec526e4696c5d439369df7681" exitCode=143 Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.295209 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"793bbadc-8b53-4084-a63a-0b76b37284df","Type":"ContainerDied","Data":"4a55d98b95e49937bea845358fb6e967ce1fb52ec526e4696c5d439369df7681"} Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.295350 4795 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/eee0ea5d-4b43-4421-b23e-555c5eac3564-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.296500 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-1922-account-create-update-gzkhl"] Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.298867 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7677694455-vk4hv" event={"ID":"74df8ac0-77f2-4e8d-aa39-d05dc6ce7190","Type":"ContainerDied","Data":"cdc18778f810815fa368ef0cc45dbb0e103ffbfcfa9231a83aa5be2dd6cfe1c2"} Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.298966 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7677694455-vk4hv" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.323551 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-2d62-account-create-update-4vs7v"] Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.343006 4795 generic.go:334] "Generic (PLEG): container finished" podID="a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9" containerID="46cb4f812b6ae6e3f71ffef829cc67ce3250106de6851d56a2c280735844fd0f" exitCode=143 Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.343076 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9","Type":"ContainerDied","Data":"46cb4f812b6ae6e3f71ffef829cc67ce3250106de6851d56a2c280735844fd0f"} Feb 19 21:49:57 crc kubenswrapper[4795]: E0219 21:49:57.343307 4795 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 19 21:49:57 crc kubenswrapper[4795]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:0f7943e02fbdd3daec1d3db72fa9396bf37ad3fdd6b0f3119c90e29629e095ed,Command:[/bin/sh -c #!/bin/bash Feb 19 21:49:57 crc kubenswrapper[4795]: Feb 19 21:49:57 crc kubenswrapper[4795]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Feb 19 21:49:57 crc kubenswrapper[4795]: Feb 19 21:49:57 crc kubenswrapper[4795]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Feb 19 21:49:57 crc kubenswrapper[4795]: Feb 19 21:49:57 crc kubenswrapper[4795]: MYSQL_CMD="mysql -h -u root -P 3306" Feb 19 21:49:57 crc kubenswrapper[4795]: Feb 19 21:49:57 crc kubenswrapper[4795]: if [ -n "nova_cell0" ]; then Feb 19 21:49:57 crc kubenswrapper[4795]: GRANT_DATABASE="nova_cell0" Feb 19 21:49:57 crc kubenswrapper[4795]: else Feb 19 21:49:57 crc kubenswrapper[4795]: GRANT_DATABASE="*" Feb 19 21:49:57 crc kubenswrapper[4795]: fi Feb 19 21:49:57 crc 
kubenswrapper[4795]: Feb 19 21:49:57 crc kubenswrapper[4795]: # going for maximum compatibility here: Feb 19 21:49:57 crc kubenswrapper[4795]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Feb 19 21:49:57 crc kubenswrapper[4795]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Feb 19 21:49:57 crc kubenswrapper[4795]: # 3. create user with CREATE but then do all password and TLS with ALTER to Feb 19 21:49:57 crc kubenswrapper[4795]: # support updates Feb 19 21:49:57 crc kubenswrapper[4795]: Feb 19 21:49:57 crc kubenswrapper[4795]: $MYSQL_CMD < logger="UnhandledError" Feb 19 21:49:57 crc kubenswrapper[4795]: E0219 21:49:57.344890 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-cell0-db-secret\\\" not found\"" pod="openstack/nova-cell0-1922-account-create-update-gzkhl" podUID="ee3fde95-91bf-4f6a-9753-f879d56fedbb" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.347657 4795 generic.go:334] "Generic (PLEG): container finished" podID="2164f9d1-1d8b-486b-beca-0d3a5172b302" containerID="e72a80485fc6a5dd8d829ffc139130a167468a52119185e5d82a547efb9020bb" exitCode=1 Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.347846 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-ktl2b" event={"ID":"2164f9d1-1d8b-486b-beca-0d3a5172b302","Type":"ContainerDied","Data":"e72a80485fc6a5dd8d829ffc139130a167468a52119185e5d82a547efb9020bb"} Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.348122 4795 scope.go:117] "RemoveContainer" containerID="e72a80485fc6a5dd8d829ffc139130a167468a52119185e5d82a547efb9020bb" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.358449 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-d602-account-create-update-lcd8k" 
event={"ID":"10e13a52-b0f3-447a-b47e-2c4dd50d6400","Type":"ContainerStarted","Data":"6e320135d03de930dfd2e7664584c07e81ae7f341dfa862b733a714182521b02"} Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.367759 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.389318 4795 generic.go:334] "Generic (PLEG): container finished" podID="3697a3b0-4077-4837-bcdc-c17d8aa361f1" containerID="e797be576a87ea7d2cd1a10d4fb93c6e0f25a6a5bebf1abb85c7b6e12aa13e38" exitCode=143 Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.389395 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-e48f-account-create-update-n77q8" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.389392 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3697a3b0-4077-4837-bcdc-c17d8aa361f1","Type":"ContainerDied","Data":"e797be576a87ea7d2cd1a10d4fb93c6e0f25a6a5bebf1abb85c7b6e12aa13e38"} Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.400492 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7400eda6-e731-4942-b002-c81dd9a87e6a-operator-scripts\") pod \"nova-cell1-e48f-account-create-update-n77q8\" (UID: \"7400eda6-e731-4942-b002-c81dd9a87e6a\") " pod="openstack/nova-cell1-e48f-account-create-update-n77q8" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.400606 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8sbw\" (UniqueName: \"kubernetes.io/projected/7400eda6-e731-4942-b002-c81dd9a87e6a-kube-api-access-z8sbw\") pod \"nova-cell1-e48f-account-create-update-n77q8\" (UID: \"7400eda6-e731-4942-b002-c81dd9a87e6a\") " pod="openstack/nova-cell1-e48f-account-create-update-n77q8" Feb 19 21:49:57 crc kubenswrapper[4795]: 
E0219 21:49:57.400670 4795 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Feb 19 21:49:57 crc kubenswrapper[4795]: E0219 21:49:57.400744 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7400eda6-e731-4942-b002-c81dd9a87e6a-operator-scripts podName:7400eda6-e731-4942-b002-c81dd9a87e6a nodeName:}" failed. No retries permitted until 2026-02-19 21:49:59.400724863 +0000 UTC m=+1310.593242727 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/7400eda6-e731-4942-b002-c81dd9a87e6a-operator-scripts") pod "nova-cell1-e48f-account-create-update-n77q8" (UID: "7400eda6-e731-4942-b002-c81dd9a87e6a") : configmap "openstack-cell1-scripts" not found Feb 19 21:49:57 crc kubenswrapper[4795]: E0219 21:49:57.401477 4795 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 19 21:49:57 crc kubenswrapper[4795]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:0f7943e02fbdd3daec1d3db72fa9396bf37ad3fdd6b0f3119c90e29629e095ed,Command:[/bin/sh -c #!/bin/bash Feb 19 21:49:57 crc kubenswrapper[4795]: Feb 19 21:49:57 crc kubenswrapper[4795]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Feb 19 21:49:57 crc kubenswrapper[4795]: Feb 19 21:49:57 crc kubenswrapper[4795]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Feb 19 21:49:57 crc kubenswrapper[4795]: Feb 19 21:49:57 crc kubenswrapper[4795]: MYSQL_CMD="mysql -h -u root -P 3306" Feb 19 21:49:57 crc kubenswrapper[4795]: Feb 19 21:49:57 crc kubenswrapper[4795]: if [ -n "nova_api" ]; then Feb 19 21:49:57 crc kubenswrapper[4795]: GRANT_DATABASE="nova_api" Feb 19 21:49:57 crc kubenswrapper[4795]: else Feb 19 21:49:57 crc kubenswrapper[4795]: GRANT_DATABASE="*" Feb 19 21:49:57 crc kubenswrapper[4795]: fi Feb 19 21:49:57 
crc kubenswrapper[4795]: Feb 19 21:49:57 crc kubenswrapper[4795]: # going for maximum compatibility here: Feb 19 21:49:57 crc kubenswrapper[4795]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Feb 19 21:49:57 crc kubenswrapper[4795]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Feb 19 21:49:57 crc kubenswrapper[4795]: # 3. create user with CREATE but then do all password and TLS with ALTER to Feb 19 21:49:57 crc kubenswrapper[4795]: # support updates Feb 19 21:49:57 crc kubenswrapper[4795]: Feb 19 21:49:57 crc kubenswrapper[4795]: $MYSQL_CMD < logger="UnhandledError" Feb 19 21:49:57 crc kubenswrapper[4795]: E0219 21:49:57.402618 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-api-db-secret\\\" not found\"" pod="openstack/nova-api-2d62-account-create-update-4vs7v" podUID="299d8d1c-c181-4c7b-b95f-9f3c62ddb102" Feb 19 21:49:57 crc kubenswrapper[4795]: E0219 21:49:57.408031 4795 projected.go:194] Error preparing data for projected volume kube-api-access-z8sbw for pod openstack/nova-cell1-e48f-account-create-update-n77q8: failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Feb 19 21:49:57 crc kubenswrapper[4795]: E0219 21:49:57.408098 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7400eda6-e731-4942-b002-c81dd9a87e6a-kube-api-access-z8sbw podName:7400eda6-e731-4942-b002-c81dd9a87e6a nodeName:}" failed. No retries permitted until 2026-02-19 21:49:59.4080801 +0000 UTC m=+1310.600597964 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-z8sbw" (UniqueName: "kubernetes.io/projected/7400eda6-e731-4942-b002-c81dd9a87e6a-kube-api-access-z8sbw") pod "nova-cell1-e48f-account-create-update-n77q8" (UID: "7400eda6-e731-4942-b002-c81dd9a87e6a") : failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.441536 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="7b096325-542d-4ac6-8d16-8aa0937013b2" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.101:5671: connect: connection refused" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.527763 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7677694455-vk4hv"] Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.553572 4795 scope.go:117] "RemoveContainer" containerID="146b7776ad30bb62917c430a4cb976669fc9f5db740f7f370033e5be6e16f033" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.553784 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-e48f-account-create-update-n77q8" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.557879 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7677694455-vk4hv"] Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.587241 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-9f51-account-create-update-z87p8" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.616258 4795 scope.go:117] "RemoveContainer" containerID="3d2f2fb944954b1447f44f88d21764d52d00b1e2faecd4fc556d6cdb1c4d63f4" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.633687 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18561896-d336-4962-8e9e-4ccf748f8605" path="/var/lib/kubelet/pods/18561896-d336-4962-8e9e-4ccf748f8605/volumes" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.634541 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1946f4fd-5254-4e66-8739-5a51af23e963" path="/var/lib/kubelet/pods/1946f4fd-5254-4e66-8739-5a51af23e963/volumes" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.636728 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29d85454-a8db-47bc-b616-bbdb4f6d8920" path="/var/lib/kubelet/pods/29d85454-a8db-47bc-b616-bbdb4f6d8920/volumes" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.637517 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="336beec4-e534-448f-8367-78645b53650e" path="/var/lib/kubelet/pods/336beec4-e534-448f-8367-78645b53650e/volumes" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.638808 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3778f66e-fd7f-4af5-ae3e-2a7c272785a0" path="/var/lib/kubelet/pods/3778f66e-fd7f-4af5-ae3e-2a7c272785a0/volumes" Feb 19 21:49:57 crc kubenswrapper[4795]: E0219 21:49:57.639299 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23 is running failed: container process not found" containerID="e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 19 
21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.640051 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cb700b2-4c29-4deb-a379-d18f2695dcaf" path="/var/lib/kubelet/pods/4cb700b2-4c29-4deb-a379-d18f2695dcaf/volumes" Feb 19 21:49:57 crc kubenswrapper[4795]: E0219 21:49:57.640327 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23 is running failed: container process not found" containerID="e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 19 21:49:57 crc kubenswrapper[4795]: E0219 21:49:57.640436 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ebc73fca22ccd7aa64a424b1eb2e6d8556b809c91bf36bb6c2bbe679d1c5f3bf" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.640718 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="573a7aa5-43d9-4523-8eea-4c1a36da49fb" path="/var/lib/kubelet/pods/573a7aa5-43d9-4523-8eea-4c1a36da49fb/volumes" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.641316 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6952c796-d85e-49b3-b931-60966311a0c0" path="/var/lib/kubelet/pods/6952c796-d85e-49b3-b931-60966311a0c0/volumes" Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.643226 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b9165f4-18c2-4003-a3e6-2ed6e0fbdd3c" path="/var/lib/kubelet/pods/6b9165f4-18c2-4003-a3e6-2ed6e0fbdd3c/volumes" Feb 19 21:49:57 crc kubenswrapper[4795]: E0219 21:49:57.643703 4795 log.go:32] "ExecSync cmd from runtime service failed" 
err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ebc73fca22ccd7aa64a424b1eb2e6d8556b809c91bf36bb6c2bbe679d1c5f3bf" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 19 21:49:57 crc kubenswrapper[4795]: E0219 21:49:57.643851 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23 is running failed: container process not found" containerID="e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 19 21:49:57 crc kubenswrapper[4795]: E0219 21:49:57.643880 4795 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-tl5hf" podUID="9a19676c-9314-43a3-a2f8-bcf56d6b5ce3" containerName="ovsdb-server" Feb 19 21:49:57 crc kubenswrapper[4795]: E0219 21:49:57.647648 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ebc73fca22ccd7aa64a424b1eb2e6d8556b809c91bf36bb6c2bbe679d1c5f3bf" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 19 21:49:57 crc kubenswrapper[4795]: E0219 21:49:57.647713 4795 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-tl5hf" podUID="9a19676c-9314-43a3-a2f8-bcf56d6b5ce3" containerName="ovs-vswitchd" Feb 19 21:49:57 
crc kubenswrapper[4795]: I0219 21:49:57.648315 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73f01f44-1467-442f-b91f-ac1765626a3d" path="/var/lib/kubelet/pods/73f01f44-1467-442f-b91f-ac1765626a3d/volumes"
Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.649149 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74df8ac0-77f2-4e8d-aa39-d05dc6ce7190" path="/var/lib/kubelet/pods/74df8ac0-77f2-4e8d-aa39-d05dc6ce7190/volumes"
Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.649866 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2c0e289-4e3b-4b5a-93db-d38621a870ec" path="/var/lib/kubelet/pods/a2c0e289-4e3b-4b5a-93db-d38621a870ec/volumes"
Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.653409 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8" path="/var/lib/kubelet/pods/ccb5e07b-c4f8-4d10-82bc-f780c9de6ff8/volumes"
Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.658259 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1d3a710-addc-4f86-b77c-0d05dc98695f" path="/var/lib/kubelet/pods/d1d3a710-addc-4f86-b77c-0d05dc98695f/volumes"
Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.664400 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddebb2b6-7bc0-45af-ba68-ae108b0d91fd" path="/var/lib/kubelet/pods/ddebb2b6-7bc0-45af-ba68-ae108b0d91fd/volumes"
Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.665288 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef98a0b8-d6d9-4075-ae60-e7d614a79e7f" path="/var/lib/kubelet/pods/ef98a0b8-d6d9-4075-ae60-e7d614a79e7f/volumes"
Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.666020 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8945f31-b1d9-4c65-9f8c-2619f87d4237" path="/var/lib/kubelet/pods/f8945f31-b1d9-4c65-9f8c-2619f87d4237/volumes"
Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.666782 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8cadc24-23ed-4063-8e1a-47a27c1d6ffd" path="/var/lib/kubelet/pods/f8cadc24-23ed-4063-8e1a-47a27c1d6ffd/volumes"
Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.667940 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd8e89cd-b890-4f36-9008-59767ccbad91" path="/var/lib/kubelet/pods/fd8e89cd-b890-4f36-9008-59767ccbad91/volumes"
Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.671745 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.671780 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.707399 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0500ca0-0cef-4b76-9c78-cb2189b520ff-operator-scripts\") pod \"b0500ca0-0cef-4b76-9c78-cb2189b520ff\" (UID: \"b0500ca0-0cef-4b76-9c78-cb2189b520ff\") "
Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.707536 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pc6xc\" (UniqueName: \"kubernetes.io/projected/b0500ca0-0cef-4b76-9c78-cb2189b520ff-kube-api-access-pc6xc\") pod \"b0500ca0-0cef-4b76-9c78-cb2189b520ff\" (UID: \"b0500ca0-0cef-4b76-9c78-cb2189b520ff\") "
Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.710687 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0500ca0-0cef-4b76-9c78-cb2189b520ff-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b0500ca0-0cef-4b76-9c78-cb2189b520ff" (UID: "b0500ca0-0cef-4b76-9c78-cb2189b520ff"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.718860 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0500ca0-0cef-4b76-9c78-cb2189b520ff-kube-api-access-pc6xc" (OuterVolumeSpecName: "kube-api-access-pc6xc") pod "b0500ca0-0cef-4b76-9c78-cb2189b520ff" (UID: "b0500ca0-0cef-4b76-9c78-cb2189b520ff"). InnerVolumeSpecName "kube-api-access-pc6xc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.752104 4795 scope.go:117] "RemoveContainer" containerID="3d2f2fb944954b1447f44f88d21764d52d00b1e2faecd4fc556d6cdb1c4d63f4"
Feb 19 21:49:57 crc kubenswrapper[4795]: E0219 21:49:57.765495 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d2f2fb944954b1447f44f88d21764d52d00b1e2faecd4fc556d6cdb1c4d63f4\": container with ID starting with 3d2f2fb944954b1447f44f88d21764d52d00b1e2faecd4fc556d6cdb1c4d63f4 not found: ID does not exist" containerID="3d2f2fb944954b1447f44f88d21764d52d00b1e2faecd4fc556d6cdb1c4d63f4"
Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.765541 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d2f2fb944954b1447f44f88d21764d52d00b1e2faecd4fc556d6cdb1c4d63f4"} err="failed to get container status \"3d2f2fb944954b1447f44f88d21764d52d00b1e2faecd4fc556d6cdb1c4d63f4\": rpc error: code = NotFound desc = could not find container \"3d2f2fb944954b1447f44f88d21764d52d00b1e2faecd4fc556d6cdb1c4d63f4\": container with ID starting with 3d2f2fb944954b1447f44f88d21764d52d00b1e2faecd4fc556d6cdb1c4d63f4 not found: ID does not exist"
Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.765576 4795 scope.go:117] "RemoveContainer" containerID="e932ce1114c0c3a49c8f6332f06a4a9aadb5f1382200346f37f0e3e9fe2d3373"
Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.805918 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.816007 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pc6xc\" (UniqueName: \"kubernetes.io/projected/b0500ca0-0cef-4b76-9c78-cb2189b520ff-kube-api-access-pc6xc\") on node \"crc\" DevicePath \"\""
Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.816045 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0500ca0-0cef-4b76-9c78-cb2189b520ff-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.897535 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-c741-account-create-update-26ljt"
Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.898717 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.903836 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7cd95cf589-2gw48"
Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.935561 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.102:5671: connect: connection refused"
Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.947000 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-p9cs4"]
Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.948232 4795 scope.go:117] "RemoveContainer" containerID="fc2aa4e6ca186f0a3259cef3b73108248f8b30394d6e7f6ee3356a21235bd96b"
Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.963232 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-metrics-p9cs4"]
Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.970879 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-d602-account-create-update-lcd8k"
Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.973407 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-858c4dcd57-whkj2"]
Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.973651 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-858c4dcd57-whkj2" podUID="da2e3f89-bf0b-4371-8e5b-a0037f266c70" containerName="proxy-httpd" containerID="cri-o://38273143a291266be6dd29c71788a99ae4aa366ccd575844722fcc6687631e66" gracePeriod=30
Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.973781 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-858c4dcd57-whkj2" podUID="da2e3f89-bf0b-4371-8e5b-a0037f266c70" containerName="proxy-server" containerID="cri-o://812bd35c706001e18c0da5e2c3dd17a059f42e984bdd2e12b92f8fd91195b1e2" gracePeriod=30
Feb 19 21:49:57 crc kubenswrapper[4795]: I0219 21:49:57.990006 4795 scope.go:117] "RemoveContainer" containerID="0b266e887f3cb16fe191b1e64cc8d8adf7464d874863c315ba4605ce8d79b678"
Feb 19 21:49:58 crc kubenswrapper[4795]: E0219 21:49:58.014629 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f6bd473cfa8f83b5b9a223a373e83766b77d9b66d6e769678f44f8ca837142b6" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Feb 19 21:49:58 crc kubenswrapper[4795]: E0219 21:49:58.016762 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f6bd473cfa8f83b5b9a223a373e83766b77d9b66d6e769678f44f8ca837142b6" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Feb 19 21:49:58 crc kubenswrapper[4795]: E0219 21:49:58.018391 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f6bd473cfa8f83b5b9a223a373e83766b77d9b66d6e769678f44f8ca837142b6" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Feb 19 21:49:58 crc kubenswrapper[4795]: E0219 21:49:58.018462 4795 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="f2710b23-7a5c-44cb-b916-9e08edc59636" containerName="nova-scheduler-scheduler"
Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.022613 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e65bdd0-b6ac-406d-bc79-ade76397295e-operator-scripts\") pod \"3e65bdd0-b6ac-406d-bc79-ade76397295e\" (UID: \"3e65bdd0-b6ac-406d-bc79-ade76397295e\") "
Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.022674 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/250e9cae-06d9-44da-88af-239d15356a3c-config-data\") pod \"250e9cae-06d9-44da-88af-239d15356a3c\" (UID: \"250e9cae-06d9-44da-88af-239d15356a3c\") "
Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.022709 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/250e9cae-06d9-44da-88af-239d15356a3c-combined-ca-bundle\") pod \"250e9cae-06d9-44da-88af-239d15356a3c\" (UID: \"250e9cae-06d9-44da-88af-239d15356a3c\") "
Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.022760 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w65w5\" (UniqueName: \"kubernetes.io/projected/10e13a52-b0f3-447a-b47e-2c4dd50d6400-kube-api-access-w65w5\") pod \"10e13a52-b0f3-447a-b47e-2c4dd50d6400\" (UID: \"10e13a52-b0f3-447a-b47e-2c4dd50d6400\") "
Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.022786 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/10e13a52-b0f3-447a-b47e-2c4dd50d6400-operator-scripts\") pod \"10e13a52-b0f3-447a-b47e-2c4dd50d6400\" (UID: \"10e13a52-b0f3-447a-b47e-2c4dd50d6400\") "
Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.022880 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/250e9cae-06d9-44da-88af-239d15356a3c-logs\") pod \"250e9cae-06d9-44da-88af-239d15356a3c\" (UID: \"250e9cae-06d9-44da-88af-239d15356a3c\") "
Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.023272 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mzgh4\" (UniqueName: \"kubernetes.io/projected/3e65bdd0-b6ac-406d-bc79-ade76397295e-kube-api-access-mzgh4\") pod \"3e65bdd0-b6ac-406d-bc79-ade76397295e\" (UID: \"3e65bdd0-b6ac-406d-bc79-ade76397295e\") "
Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.023459 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6npn5\" (UniqueName: \"kubernetes.io/projected/250e9cae-06d9-44da-88af-239d15356a3c-kube-api-access-6npn5\") pod \"250e9cae-06d9-44da-88af-239d15356a3c\" (UID: \"250e9cae-06d9-44da-88af-239d15356a3c\") "
Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.023533 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/250e9cae-06d9-44da-88af-239d15356a3c-config-data-custom\") pod \"250e9cae-06d9-44da-88af-239d15356a3c\" (UID: \"250e9cae-06d9-44da-88af-239d15356a3c\") "
Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.024543 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/250e9cae-06d9-44da-88af-239d15356a3c-logs" (OuterVolumeSpecName: "logs") pod "250e9cae-06d9-44da-88af-239d15356a3c" (UID: "250e9cae-06d9-44da-88af-239d15356a3c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.029343 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e65bdd0-b6ac-406d-bc79-ade76397295e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3e65bdd0-b6ac-406d-bc79-ade76397295e" (UID: "3e65bdd0-b6ac-406d-bc79-ade76397295e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.029640 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10e13a52-b0f3-447a-b47e-2c4dd50d6400-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "10e13a52-b0f3-447a-b47e-2c4dd50d6400" (UID: "10e13a52-b0f3-447a-b47e-2c4dd50d6400"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.029690 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10e13a52-b0f3-447a-b47e-2c4dd50d6400-kube-api-access-w65w5" (OuterVolumeSpecName: "kube-api-access-w65w5") pod "10e13a52-b0f3-447a-b47e-2c4dd50d6400" (UID: "10e13a52-b0f3-447a-b47e-2c4dd50d6400"). InnerVolumeSpecName "kube-api-access-w65w5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.029750 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/250e9cae-06d9-44da-88af-239d15356a3c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "250e9cae-06d9-44da-88af-239d15356a3c" (UID: "250e9cae-06d9-44da-88af-239d15356a3c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.031655 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e65bdd0-b6ac-406d-bc79-ade76397295e-kube-api-access-mzgh4" (OuterVolumeSpecName: "kube-api-access-mzgh4") pod "3e65bdd0-b6ac-406d-bc79-ade76397295e" (UID: "3e65bdd0-b6ac-406d-bc79-ade76397295e"). InnerVolumeSpecName "kube-api-access-mzgh4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.032132 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/250e9cae-06d9-44da-88af-239d15356a3c-kube-api-access-6npn5" (OuterVolumeSpecName: "kube-api-access-6npn5") pod "250e9cae-06d9-44da-88af-239d15356a3c" (UID: "250e9cae-06d9-44da-88af-239d15356a3c"). InnerVolumeSpecName "kube-api-access-6npn5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.032616 4795 scope.go:117] "RemoveContainer" containerID="445b39298e77e683eee2d951a286aa4f9de79beb92f370dca4d81f0dfe56d255"
Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.057921 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/250e9cae-06d9-44da-88af-239d15356a3c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "250e9cae-06d9-44da-88af-239d15356a3c" (UID: "250e9cae-06d9-44da-88af-239d15356a3c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.091373 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/250e9cae-06d9-44da-88af-239d15356a3c-config-data" (OuterVolumeSpecName: "config-data") pod "250e9cae-06d9-44da-88af-239d15356a3c" (UID: "250e9cae-06d9-44da-88af-239d15356a3c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.115665 4795 scope.go:117] "RemoveContainer" containerID="6f5fd7fb0db869022abdd3586a7debaf90502df56c7437b86541c8afbbd3687a"
Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.126269 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e65bdd0-b6ac-406d-bc79-ade76397295e-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.126300 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/250e9cae-06d9-44da-88af-239d15356a3c-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.126309 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/250e9cae-06d9-44da-88af-239d15356a3c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.126318 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w65w5\" (UniqueName: \"kubernetes.io/projected/10e13a52-b0f3-447a-b47e-2c4dd50d6400-kube-api-access-w65w5\") on node \"crc\" DevicePath \"\""
Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.126328 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/10e13a52-b0f3-447a-b47e-2c4dd50d6400-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.126335 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/250e9cae-06d9-44da-88af-239d15356a3c-logs\") on node \"crc\" DevicePath \"\""
Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.126343 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mzgh4\" (UniqueName: \"kubernetes.io/projected/3e65bdd0-b6ac-406d-bc79-ade76397295e-kube-api-access-mzgh4\") on node \"crc\" DevicePath \"\""
Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.126352 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6npn5\" (UniqueName: \"kubernetes.io/projected/250e9cae-06d9-44da-88af-239d15356a3c-kube-api-access-6npn5\") on node \"crc\" DevicePath \"\""
Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.126406 4795 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/250e9cae-06d9-44da-88af-239d15356a3c-config-data-custom\") on node \"crc\" DevicePath \"\""
Feb 19 21:49:58 crc kubenswrapper[4795]: E0219 21:49:58.228057 4795 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found
Feb 19 21:49:58 crc kubenswrapper[4795]: E0219 21:49:58.228503 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-config-data podName:ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d nodeName:}" failed. No retries permitted until 2026-02-19 21:50:02.228238412 +0000 UTC m=+1313.420756276 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-config-data") pod "rabbitmq-cell1-server-0" (UID: "ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d") : configmap "rabbitmq-cell1-config-data" not found
Feb 19 21:49:58 crc kubenswrapper[4795]: E0219 21:49:58.234091 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4f511a80129a9b198e5be1216670e966f0623d26f1c7d17cc3a99a813e69b1c3 is running failed: container process not found" containerID="4f511a80129a9b198e5be1216670e966f0623d26f1c7d17cc3a99a813e69b1c3" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Feb 19 21:49:58 crc kubenswrapper[4795]: E0219 21:49:58.234580 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4f511a80129a9b198e5be1216670e966f0623d26f1c7d17cc3a99a813e69b1c3 is running failed: container process not found" containerID="4f511a80129a9b198e5be1216670e966f0623d26f1c7d17cc3a99a813e69b1c3" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Feb 19 21:49:58 crc kubenswrapper[4795]: E0219 21:49:58.235762 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4f511a80129a9b198e5be1216670e966f0623d26f1c7d17cc3a99a813e69b1c3 is running failed: container process not found" containerID="4f511a80129a9b198e5be1216670e966f0623d26f1c7d17cc3a99a813e69b1c3" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Feb 19 21:49:58 crc kubenswrapper[4795]: E0219 21:49:58.235791 4795 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4f511a80129a9b198e5be1216670e966f0623d26f1c7d17cc3a99a813e69b1c3 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="f6fd7841-2a08-4786-8e96-b2ab0f477eff" containerName="nova-cell1-conductor-conductor"
Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.345382 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.408529 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-9f51-account-create-update-z87p8" event={"ID":"b0500ca0-0cef-4b76-9c78-cb2189b520ff","Type":"ContainerDied","Data":"72ececc667f06320402ffe76c00f9ed550ee45b4a8e93c91bfe4b2261f921f68"}
Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.408616 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-9f51-account-create-update-z87p8"
Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.415580 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c741-account-create-update-26ljt" event={"ID":"3e65bdd0-b6ac-406d-bc79-ade76397295e","Type":"ContainerDied","Data":"75197e9e2a0c3d83dd9ff3b61a9f77e4a300ce7b88d838cd67a1b818ba9ef069"}
Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.415675 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-c741-account-create-update-26ljt"
Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.423015 4795 generic.go:334] "Generic (PLEG): container finished" podID="c54f77a4-1095-4ff1-bc74-b845cde659d9" containerID="b84203bdd91f64f0aed89ae5d8334985c117eb72cc298fbd0865d60f726218c1" exitCode=0
Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.423088 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c54f77a4-1095-4ff1-bc74-b845cde659d9","Type":"ContainerDied","Data":"b84203bdd91f64f0aed89ae5d8334985c117eb72cc298fbd0865d60f726218c1"}
Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.426448 4795 generic.go:334] "Generic (PLEG): container finished" podID="2164f9d1-1d8b-486b-beca-0d3a5172b302" containerID="37b5ade45a3dd017a14fe70f806fa2f9b2438f7ec74aeb16d3207b0a96e2916a" exitCode=1
Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.427106 4795 scope.go:117] "RemoveContainer" containerID="37b5ade45a3dd017a14fe70f806fa2f9b2438f7ec74aeb16d3207b0a96e2916a"
Feb 19 21:49:58 crc kubenswrapper[4795]: E0219 21:49:58.427552 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CrashLoopBackOff: \"back-off 10s restarting failed container=mariadb-account-create-update pod=root-account-create-update-ktl2b_openstack(2164f9d1-1d8b-486b-beca-0d3a5172b302)\"" pod="openstack/root-account-create-update-ktl2b" podUID="2164f9d1-1d8b-486b-beca-0d3a5172b302"
Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.427778 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-ktl2b" event={"ID":"2164f9d1-1d8b-486b-beca-0d3a5172b302","Type":"ContainerDied","Data":"37b5ade45a3dd017a14fe70f806fa2f9b2438f7ec74aeb16d3207b0a96e2916a"}
Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.427804 4795 scope.go:117] "RemoveContainer" containerID="e72a80485fc6a5dd8d829ffc139130a167468a52119185e5d82a547efb9020bb"
Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.429995 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.430039 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.430516 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/0adadcd9-8949-443b-8042-d0d11191eae9-vencrypt-tls-certs\") pod \"0adadcd9-8949-443b-8042-d0d11191eae9\" (UID: \"0adadcd9-8949-443b-8042-d0d11191eae9\") "
Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.430576 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/0adadcd9-8949-443b-8042-d0d11191eae9-nova-novncproxy-tls-certs\") pod \"0adadcd9-8949-443b-8042-d0d11191eae9\" (UID: \"0adadcd9-8949-443b-8042-d0d11191eae9\") "
Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.430610 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0adadcd9-8949-443b-8042-d0d11191eae9-combined-ca-bundle\") pod \"0adadcd9-8949-443b-8042-d0d11191eae9\" (UID: \"0adadcd9-8949-443b-8042-d0d11191eae9\") "
Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.430665 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0adadcd9-8949-443b-8042-d0d11191eae9-config-data\") pod \"0adadcd9-8949-443b-8042-d0d11191eae9\" (UID: \"0adadcd9-8949-443b-8042-d0d11191eae9\") "
Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.430693 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cz5kj\" (UniqueName: \"kubernetes.io/projected/0adadcd9-8949-443b-8042-d0d11191eae9-kube-api-access-cz5kj\") pod \"0adadcd9-8949-443b-8042-d0d11191eae9\" (UID: \"0adadcd9-8949-443b-8042-d0d11191eae9\") "
Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.443321 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0adadcd9-8949-443b-8042-d0d11191eae9-kube-api-access-cz5kj" (OuterVolumeSpecName: "kube-api-access-cz5kj") pod "0adadcd9-8949-443b-8042-d0d11191eae9" (UID: "0adadcd9-8949-443b-8042-d0d11191eae9"). InnerVolumeSpecName "kube-api-access-cz5kj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.462373 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-d602-account-create-update-lcd8k" event={"ID":"10e13a52-b0f3-447a-b47e-2c4dd50d6400","Type":"ContainerDied","Data":"6e320135d03de930dfd2e7664584c07e81ae7f341dfa862b733a714182521b02"}
Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.462449 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-d602-account-create-update-lcd8k"
Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.469218 4795 generic.go:334] "Generic (PLEG): container finished" podID="0adadcd9-8949-443b-8042-d0d11191eae9" containerID="0f773609fa3d94e54bc6067d53960a0ec55209c713b15b0460c6c5287772d27f" exitCode=0
Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.469805 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0adadcd9-8949-443b-8042-d0d11191eae9","Type":"ContainerDied","Data":"0f773609fa3d94e54bc6067d53960a0ec55209c713b15b0460c6c5287772d27f"}
Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.469910 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0adadcd9-8949-443b-8042-d0d11191eae9","Type":"ContainerDied","Data":"6555271b3d98ead4145953a8b269ef0cca6677e13c5b918d43b6b24ff869130e"}
Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.470319 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.491704 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0adadcd9-8949-443b-8042-d0d11191eae9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0adadcd9-8949-443b-8042-d0d11191eae9" (UID: "0adadcd9-8949-443b-8042-d0d11191eae9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.510899 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-1922-account-create-update-gzkhl" event={"ID":"ee3fde95-91bf-4f6a-9753-f879d56fedbb","Type":"ContainerStarted","Data":"8f247d4d74ea4937a4f6282c8b5e4ccafc361e919f81dbaf677caedc499822b7"}
Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.537045 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0adadcd9-8949-443b-8042-d0d11191eae9-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.537071 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cz5kj\" (UniqueName: \"kubernetes.io/projected/0adadcd9-8949-443b-8042-d0d11191eae9-kube-api-access-cz5kj\") on node \"crc\" DevicePath \"\""
Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.549043 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0adadcd9-8949-443b-8042-d0d11191eae9-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "0adadcd9-8949-443b-8042-d0d11191eae9" (UID: "0adadcd9-8949-443b-8042-d0d11191eae9"). InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.556208 4795 generic.go:334] "Generic (PLEG): container finished" podID="2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d" containerID="ae770d1785f3d5deaa5b1b98adf315fd8a61a2cbce434ecb3970ab496579b196" exitCode=0
Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.556288 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d","Type":"ContainerDied","Data":"ae770d1785f3d5deaa5b1b98adf315fd8a61a2cbce434ecb3970ab496579b196"}
Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.567712 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7cd95cf589-2gw48" event={"ID":"250e9cae-06d9-44da-88af-239d15356a3c","Type":"ContainerDied","Data":"26ff50c7b1851e9704bfa4221d66176820b3417a16cff032c2c82bc2945df7a8"}
Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.567828 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7cd95cf589-2gw48"
Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.580986 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-2d62-account-create-update-4vs7v" event={"ID":"299d8d1c-c181-4c7b-b95f-9f3c62ddb102","Type":"ContainerStarted","Data":"a6e6e439c0ca9f9cb3832a5b2b0c274f01e0f16a2f8f82d2544c41499347ea51"}
Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.592367 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0adadcd9-8949-443b-8042-d0d11191eae9-config-data" (OuterVolumeSpecName: "config-data") pod "0adadcd9-8949-443b-8042-d0d11191eae9" (UID: "0adadcd9-8949-443b-8042-d0d11191eae9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.599724 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0adadcd9-8949-443b-8042-d0d11191eae9-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "0adadcd9-8949-443b-8042-d0d11191eae9" (UID: "0adadcd9-8949-443b-8042-d0d11191eae9"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.601204 4795 generic.go:334] "Generic (PLEG): container finished" podID="f6fd7841-2a08-4786-8e96-b2ab0f477eff" containerID="4f511a80129a9b198e5be1216670e966f0623d26f1c7d17cc3a99a813e69b1c3" exitCode=0
Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.601514 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"f6fd7841-2a08-4786-8e96-b2ab0f477eff","Type":"ContainerDied","Data":"4f511a80129a9b198e5be1216670e966f0623d26f1c7d17cc3a99a813e69b1c3"}
Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.617729 4795 generic.go:334] "Generic (PLEG): container finished" podID="da2e3f89-bf0b-4371-8e5b-a0037f266c70" containerID="38273143a291266be6dd29c71788a99ae4aa366ccd575844722fcc6687631e66" exitCode=0
Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.617794 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-858c4dcd57-whkj2" event={"ID":"da2e3f89-bf0b-4371-8e5b-a0037f266c70","Type":"ContainerDied","Data":"38273143a291266be6dd29c71788a99ae4aa366ccd575844722fcc6687631e66"}
Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.627823 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-e48f-account-create-update-n77q8"
Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.628936 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f769-account-create-update-8k7r2" event={"ID":"8eaa69df-d563-4dc0-8a78-40413946cbca","Type":"ContainerStarted","Data":"7d6e2a5c198e5bc708e1811f1860c10bf53d61ca572544f39bca6ba9c9264ae5"}
Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.638565 4795 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/0adadcd9-8949-443b-8042-d0d11191eae9-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.638592 4795 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/0adadcd9-8949-443b-8042-d0d11191eae9-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.638603 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0adadcd9-8949-443b-8042-d0d11191eae9-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.661128 4795 scope.go:117] "RemoveContainer" containerID="0f773609fa3d94e54bc6067d53960a0ec55209c713b15b0460c6c5287772d27f"
Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.662257 4795 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.695916 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-c741-account-create-update-26ljt"] Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.700852 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-c741-account-create-update-26ljt"] Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.718005 4795 scope.go:117] "RemoveContainer" containerID="0f773609fa3d94e54bc6067d53960a0ec55209c713b15b0460c6c5287772d27f" Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.720275 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-e48f-account-create-update-n77q8"] Feb 19 21:49:58 crc kubenswrapper[4795]: E0219 21:49:58.723785 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f773609fa3d94e54bc6067d53960a0ec55209c713b15b0460c6c5287772d27f\": container with ID starting with 0f773609fa3d94e54bc6067d53960a0ec55209c713b15b0460c6c5287772d27f not found: ID does not exist" containerID="0f773609fa3d94e54bc6067d53960a0ec55209c713b15b0460c6c5287772d27f" Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.723955 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f773609fa3d94e54bc6067d53960a0ec55209c713b15b0460c6c5287772d27f"} err="failed to get container status \"0f773609fa3d94e54bc6067d53960a0ec55209c713b15b0460c6c5287772d27f\": rpc error: code = NotFound desc = could not find container \"0f773609fa3d94e54bc6067d53960a0ec55209c713b15b0460c6c5287772d27f\": container with ID starting with 0f773609fa3d94e54bc6067d53960a0ec55209c713b15b0460c6c5287772d27f not found: ID does not exist" Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.724037 4795 scope.go:117] "RemoveContainer" 
containerID="f1227cc577cca2da8d7067560947f94ac089651b67e263368202e928196a7bc8" Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.732331 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-e48f-account-create-update-n77q8"] Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.739344 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6fd7841-2a08-4786-8e96-b2ab0f477eff-combined-ca-bundle\") pod \"f6fd7841-2a08-4786-8e96-b2ab0f477eff\" (UID: \"f6fd7841-2a08-4786-8e96-b2ab0f477eff\") " Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.739433 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6fd7841-2a08-4786-8e96-b2ab0f477eff-config-data\") pod \"f6fd7841-2a08-4786-8e96-b2ab0f477eff\" (UID: \"f6fd7841-2a08-4786-8e96-b2ab0f477eff\") " Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.739453 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qtr87\" (UniqueName: \"kubernetes.io/projected/f6fd7841-2a08-4786-8e96-b2ab0f477eff-kube-api-access-qtr87\") pod \"f6fd7841-2a08-4786-8e96-b2ab0f477eff\" (UID: \"f6fd7841-2a08-4786-8e96-b2ab0f477eff\") " Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.743370 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6fd7841-2a08-4786-8e96-b2ab0f477eff-kube-api-access-qtr87" (OuterVolumeSpecName: "kube-api-access-qtr87") pod "f6fd7841-2a08-4786-8e96-b2ab0f477eff" (UID: "f6fd7841-2a08-4786-8e96-b2ab0f477eff"). InnerVolumeSpecName "kube-api-access-qtr87". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.745738 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-9f51-account-create-update-z87p8"] Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.754282 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-9f51-account-create-update-z87p8"] Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.796070 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-d602-account-create-update-lcd8k"] Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.798742 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6fd7841-2a08-4786-8e96-b2ab0f477eff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f6fd7841-2a08-4786-8e96-b2ab0f477eff" (UID: "f6fd7841-2a08-4786-8e96-b2ab0f477eff"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.801421 4795 scope.go:117] "RemoveContainer" containerID="0d1ad96e830846d9034790e753be1811679f1bbf15af5f12e70081c7ae374cfd" Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.802469 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6fd7841-2a08-4786-8e96-b2ab0f477eff-config-data" (OuterVolumeSpecName: "config-data") pod "f6fd7841-2a08-4786-8e96-b2ab0f477eff" (UID: "f6fd7841-2a08-4786-8e96-b2ab0f477eff"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.825819 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-d602-account-create-update-lcd8k"] Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.836498 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-7cd95cf589-2gw48"] Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.841758 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6fd7841-2a08-4786-8e96-b2ab0f477eff-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.841791 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6fd7841-2a08-4786-8e96-b2ab0f477eff-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.841801 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qtr87\" (UniqueName: \"kubernetes.io/projected/f6fd7841-2a08-4786-8e96-b2ab0f477eff-kube-api-access-qtr87\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.841812 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7400eda6-e731-4942-b002-c81dd9a87e6a-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.841822 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8sbw\" (UniqueName: \"kubernetes.io/projected/7400eda6-e731-4942-b002-c81dd9a87e6a-kube-api-access-z8sbw\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.849381 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-7cd95cf589-2gw48"] Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.859248 
4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6b6d98fbd4-svzc8" Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.863249 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.881789 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.953941 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4532c069-4eb7-48ab-b575-b6a130e2b438-config-data\") pod \"4532c069-4eb7-48ab-b575-b6a130e2b438\" (UID: \"4532c069-4eb7-48ab-b575-b6a130e2b438\") " Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.953993 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4532c069-4eb7-48ab-b575-b6a130e2b438-combined-ca-bundle\") pod \"4532c069-4eb7-48ab-b575-b6a130e2b438\" (UID: \"4532c069-4eb7-48ab-b575-b6a130e2b438\") " Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.954017 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4532c069-4eb7-48ab-b575-b6a130e2b438-config-data-custom\") pod \"4532c069-4eb7-48ab-b575-b6a130e2b438\" (UID: \"4532c069-4eb7-48ab-b575-b6a130e2b438\") " Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.954129 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4hjh\" (UniqueName: \"kubernetes.io/projected/4532c069-4eb7-48ab-b575-b6a130e2b438-kube-api-access-v4hjh\") pod \"4532c069-4eb7-48ab-b575-b6a130e2b438\" (UID: \"4532c069-4eb7-48ab-b575-b6a130e2b438\") " Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.954178 4795 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4532c069-4eb7-48ab-b575-b6a130e2b438-logs\") pod \"4532c069-4eb7-48ab-b575-b6a130e2b438\" (UID: \"4532c069-4eb7-48ab-b575-b6a130e2b438\") " Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.955005 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4532c069-4eb7-48ab-b575-b6a130e2b438-logs" (OuterVolumeSpecName: "logs") pod "4532c069-4eb7-48ab-b575-b6a130e2b438" (UID: "4532c069-4eb7-48ab-b575-b6a130e2b438"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.960596 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.960891 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e956453d-551f-44b4-8125-8656b3155402" containerName="ceilometer-central-agent" containerID="cri-o://a502298e81b556f379f6fb3a9a80c1cf56ac3cccd10f1bd463a7436d04a56552" gracePeriod=30 Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.961281 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e956453d-551f-44b4-8125-8656b3155402" containerName="proxy-httpd" containerID="cri-o://c677d54567313f232f88b9a873fec54a2577075c7b806ea88ad32df8cb36cb26" gracePeriod=30 Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.961323 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e956453d-551f-44b4-8125-8656b3155402" containerName="sg-core" containerID="cri-o://21486f4b00e940629fe56d483d784412d9ef80990cf1527d454dfa0b33b31ddb" gracePeriod=30 Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.961354 4795 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="e956453d-551f-44b4-8125-8656b3155402" containerName="ceilometer-notification-agent" containerID="cri-o://98c35a6c89e932ad9cd3cce6a6d12a0653f350e3b0c1cefa51b67ce7752f0734" gracePeriod=30 Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.969528 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4532c069-4eb7-48ab-b575-b6a130e2b438-kube-api-access-v4hjh" (OuterVolumeSpecName: "kube-api-access-v4hjh") pod "4532c069-4eb7-48ab-b575-b6a130e2b438" (UID: "4532c069-4eb7-48ab-b575-b6a130e2b438"). InnerVolumeSpecName "kube-api-access-v4hjh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:49:58 crc kubenswrapper[4795]: I0219 21:49:58.986794 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4532c069-4eb7-48ab-b575-b6a130e2b438-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "4532c069-4eb7-48ab-b575-b6a130e2b438" (UID: "4532c069-4eb7-48ab-b575-b6a130e2b438"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.000664 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.000868 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="296f6b57-de45-495d-abe9-8c779c157057" containerName="kube-state-metrics" containerID="cri-o://5a4765254bd1f522ba57b50f9f6989619b42ec50e10b34373400dceff9cc29f7" gracePeriod=30 Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.002334 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4532c069-4eb7-48ab-b575-b6a130e2b438-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4532c069-4eb7-48ab-b575-b6a130e2b438" (UID: "4532c069-4eb7-48ab-b575-b6a130e2b438"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.070339 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-1922-account-create-update-gzkhl" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.097132 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-2d62-account-create-update-4vs7v" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.126428 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-41fb-account-create-update-ntc9w"] Feb 19 21:49:59 crc kubenswrapper[4795]: E0219 21:49:59.141464 4795 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Feb 19 21:49:59 crc kubenswrapper[4795]: E0219 21:49:59.141539 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7b096325-542d-4ac6-8d16-8aa0937013b2-config-data podName:7b096325-542d-4ac6-8d16-8aa0937013b2 nodeName:}" failed. No retries permitted until 2026-02-19 21:50:03.141516561 +0000 UTC m=+1314.334034425 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/7b096325-542d-4ac6-8d16-8aa0937013b2-config-data") pod "rabbitmq-server-0" (UID: "7b096325-542d-4ac6-8d16-8aa0937013b2") : configmap "rabbitmq-config-data" not found Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.146159 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4532c069-4eb7-48ab-b575-b6a130e2b438-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.146280 4795 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4532c069-4eb7-48ab-b575-b6a130e2b438-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.146358 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4hjh\" (UniqueName: \"kubernetes.io/projected/4532c069-4eb7-48ab-b575-b6a130e2b438-kube-api-access-v4hjh\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.146477 4795 
reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4532c069-4eb7-48ab-b575-b6a130e2b438-logs\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.190213 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-41fb-account-create-update-ntc9w"] Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.203136 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.209159 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.209440 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/memcached-0" podUID="42e81a78-17fd-4ed3-b072-dd6a20cbe5d3" containerName="memcached" containerID="cri-o://db08b177837bd80146274836c0977b72219e93219854a2c6c5e81a24922a33fe" gracePeriod=30 Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.248644 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xc68\" (UniqueName: \"kubernetes.io/projected/299d8d1c-c181-4c7b-b95f-9f3c62ddb102-kube-api-access-4xc68\") pod \"299d8d1c-c181-4c7b-b95f-9f3c62ddb102\" (UID: \"299d8d1c-c181-4c7b-b95f-9f3c62ddb102\") " Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.248691 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/299d8d1c-c181-4c7b-b95f-9f3c62ddb102-operator-scripts\") pod \"299d8d1c-c181-4c7b-b95f-9f3c62ddb102\" (UID: \"299d8d1c-c181-4c7b-b95f-9f3c62ddb102\") " Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.248876 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vq9sz\" (UniqueName: 
\"kubernetes.io/projected/ee3fde95-91bf-4f6a-9753-f879d56fedbb-kube-api-access-vq9sz\") pod \"ee3fde95-91bf-4f6a-9753-f879d56fedbb\" (UID: \"ee3fde95-91bf-4f6a-9753-f879d56fedbb\") " Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.248900 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee3fde95-91bf-4f6a-9753-f879d56fedbb-operator-scripts\") pod \"ee3fde95-91bf-4f6a-9753-f879d56fedbb\" (UID: \"ee3fde95-91bf-4f6a-9753-f879d56fedbb\") " Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.257920 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/299d8d1c-c181-4c7b-b95f-9f3c62ddb102-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "299d8d1c-c181-4c7b-b95f-9f3c62ddb102" (UID: "299d8d1c-c181-4c7b-b95f-9f3c62ddb102"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.259241 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee3fde95-91bf-4f6a-9753-f879d56fedbb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ee3fde95-91bf-4f6a-9753-f879d56fedbb" (UID: "ee3fde95-91bf-4f6a-9753-f879d56fedbb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.270234 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee3fde95-91bf-4f6a-9753-f879d56fedbb-kube-api-access-vq9sz" (OuterVolumeSpecName: "kube-api-access-vq9sz") pod "ee3fde95-91bf-4f6a-9753-f879d56fedbb" (UID: "ee3fde95-91bf-4f6a-9753-f879d56fedbb"). InnerVolumeSpecName "kube-api-access-vq9sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.283300 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/299d8d1c-c181-4c7b-b95f-9f3c62ddb102-kube-api-access-4xc68" (OuterVolumeSpecName: "kube-api-access-4xc68") pod "299d8d1c-c181-4c7b-b95f-9f3c62ddb102" (UID: "299d8d1c-c181-4c7b-b95f-9f3c62ddb102"). InnerVolumeSpecName "kube-api-access-4xc68". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.299678 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-41fb-account-create-update-h4ql2"] Feb 19 21:49:59 crc kubenswrapper[4795]: E0219 21:49:59.300030 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c5a8678-8ce2-4bee-9160-37b1dea9f897" containerName="ovsdbserver-nb" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.300047 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c5a8678-8ce2-4bee-9160-37b1dea9f897" containerName="ovsdbserver-nb" Feb 19 21:49:59 crc kubenswrapper[4795]: E0219 21:49:59.300057 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0adadcd9-8949-443b-8042-d0d11191eae9" containerName="nova-cell1-novncproxy-novncproxy" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.300063 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="0adadcd9-8949-443b-8042-d0d11191eae9" containerName="nova-cell1-novncproxy-novncproxy" Feb 19 21:49:59 crc kubenswrapper[4795]: E0219 21:49:59.300074 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c2bcb9c-07d3-4d71-924b-aacd537e3430" containerName="ovsdbserver-sb" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.300080 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c2bcb9c-07d3-4d71-924b-aacd537e3430" containerName="ovsdbserver-sb" Feb 19 21:49:59 crc kubenswrapper[4795]: E0219 21:49:59.300088 4795 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d" containerName="mysql-bootstrap" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.300094 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d" containerName="mysql-bootstrap" Feb 19 21:49:59 crc kubenswrapper[4795]: E0219 21:49:59.300106 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c5a8678-8ce2-4bee-9160-37b1dea9f897" containerName="openstack-network-exporter" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.300112 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c5a8678-8ce2-4bee-9160-37b1dea9f897" containerName="openstack-network-exporter" Feb 19 21:49:59 crc kubenswrapper[4795]: E0219 21:49:59.300122 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6fd7841-2a08-4786-8e96-b2ab0f477eff" containerName="nova-cell1-conductor-conductor" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.300127 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6fd7841-2a08-4786-8e96-b2ab0f477eff" containerName="nova-cell1-conductor-conductor" Feb 19 21:49:59 crc kubenswrapper[4795]: E0219 21:49:59.300135 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="250e9cae-06d9-44da-88af-239d15356a3c" containerName="barbican-worker-log" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.300141 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="250e9cae-06d9-44da-88af-239d15356a3c" containerName="barbican-worker-log" Feb 19 21:49:59 crc kubenswrapper[4795]: E0219 21:49:59.300152 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c2bcb9c-07d3-4d71-924b-aacd537e3430" containerName="openstack-network-exporter" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.300158 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c2bcb9c-07d3-4d71-924b-aacd537e3430" containerName="openstack-network-exporter" 
Feb 19 21:49:59 crc kubenswrapper[4795]: E0219 21:49:59.300180 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="250e9cae-06d9-44da-88af-239d15356a3c" containerName="barbican-worker" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.300186 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="250e9cae-06d9-44da-88af-239d15356a3c" containerName="barbican-worker" Feb 19 21:49:59 crc kubenswrapper[4795]: E0219 21:49:59.300198 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4532c069-4eb7-48ab-b575-b6a130e2b438" containerName="barbican-keystone-listener" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.300203 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="4532c069-4eb7-48ab-b575-b6a130e2b438" containerName="barbican-keystone-listener" Feb 19 21:49:59 crc kubenswrapper[4795]: E0219 21:49:59.300217 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74df8ac0-77f2-4e8d-aa39-d05dc6ce7190" containerName="dnsmasq-dns" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.300223 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="74df8ac0-77f2-4e8d-aa39-d05dc6ce7190" containerName="dnsmasq-dns" Feb 19 21:49:59 crc kubenswrapper[4795]: E0219 21:49:59.300231 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74df8ac0-77f2-4e8d-aa39-d05dc6ce7190" containerName="init" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.300236 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="74df8ac0-77f2-4e8d-aa39-d05dc6ce7190" containerName="init" Feb 19 21:49:59 crc kubenswrapper[4795]: E0219 21:49:59.300248 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4532c069-4eb7-48ab-b575-b6a130e2b438" containerName="barbican-keystone-listener-log" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.300254 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="4532c069-4eb7-48ab-b575-b6a130e2b438" 
containerName="barbican-keystone-listener-log" Feb 19 21:49:59 crc kubenswrapper[4795]: E0219 21:49:59.300263 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d" containerName="galera" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.300269 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d" containerName="galera" Feb 19 21:49:59 crc kubenswrapper[4795]: E0219 21:49:59.300277 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eee0ea5d-4b43-4421-b23e-555c5eac3564" containerName="openstack-network-exporter" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.300282 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="eee0ea5d-4b43-4421-b23e-555c5eac3564" containerName="openstack-network-exporter" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.300424 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="250e9cae-06d9-44da-88af-239d15356a3c" containerName="barbican-worker-log" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.300437 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="eee0ea5d-4b43-4421-b23e-555c5eac3564" containerName="openstack-network-exporter" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.300447 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c2bcb9c-07d3-4d71-924b-aacd537e3430" containerName="openstack-network-exporter" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.300456 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c5a8678-8ce2-4bee-9160-37b1dea9f897" containerName="ovsdbserver-nb" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.300464 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="74df8ac0-77f2-4e8d-aa39-d05dc6ce7190" containerName="dnsmasq-dns" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.300475 4795 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d" containerName="galera" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.300485 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="0adadcd9-8949-443b-8042-d0d11191eae9" containerName="nova-cell1-novncproxy-novncproxy" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.300495 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c5a8678-8ce2-4bee-9160-37b1dea9f897" containerName="openstack-network-exporter" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.300506 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="4532c069-4eb7-48ab-b575-b6a130e2b438" containerName="barbican-keystone-listener" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.300513 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c2bcb9c-07d3-4d71-924b-aacd537e3430" containerName="ovsdbserver-sb" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.300521 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="250e9cae-06d9-44da-88af-239d15356a3c" containerName="barbican-worker" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.300530 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6fd7841-2a08-4786-8e96-b2ab0f477eff" containerName="nova-cell1-conductor-conductor" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.300536 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="4532c069-4eb7-48ab-b575-b6a130e2b438" containerName="barbican-keystone-listener-log" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.301211 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-41fb-account-create-update-h4ql2" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.304504 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.306345 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4532c069-4eb7-48ab-b575-b6a130e2b438-config-data" (OuterVolumeSpecName: "config-data") pod "4532c069-4eb7-48ab-b575-b6a130e2b438" (UID: "4532c069-4eb7-48ab-b575-b6a130e2b438"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.328273 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-41fb-account-create-update-h4ql2"] Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.347219 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-dxql7"] Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.347263 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-dxql7"] Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.356685 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-kolla-config\") pod \"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\" (UID: \"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\") " Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.356729 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-operator-scripts\") pod \"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\" (UID: \"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\") " Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.356781 4795 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-kwqnm\" (UniqueName: \"kubernetes.io/projected/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-kube-api-access-kwqnm\") pod \"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\" (UID: \"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\") " Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.356802 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\" (UID: \"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\") " Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.356850 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-config-data-default\") pod \"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\" (UID: \"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\") " Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.356959 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-combined-ca-bundle\") pod \"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\" (UID: \"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\") " Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.357012 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-galera-tls-certs\") pod \"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\" (UID: \"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\") " Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.357041 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-config-data-generated\") pod \"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\" (UID: 
\"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d\") " Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.357668 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d" (UID: "2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.359817 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d" (UID: "2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.360329 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d" (UID: "2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.366817 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9-operator-scripts\") pod \"keystone-41fb-account-create-update-h4ql2\" (UID: \"38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9\") " pod="openstack/keystone-41fb-account-create-update-h4ql2" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.366951 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcb7p\" (UniqueName: \"kubernetes.io/projected/38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9-kube-api-access-qcb7p\") pod \"keystone-41fb-account-create-update-h4ql2\" (UID: \"38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9\") " pod="openstack/keystone-41fb-account-create-update-h4ql2" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.367141 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee3fde95-91bf-4f6a-9753-f879d56fedbb-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.367157 4795 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-config-data-generated\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.367179 4795 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-kolla-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.367188 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-operator-scripts\") 
on node \"crc\" DevicePath \"\"" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.367196 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xc68\" (UniqueName: \"kubernetes.io/projected/299d8d1c-c181-4c7b-b95f-9f3c62ddb102-kube-api-access-4xc68\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.367205 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/299d8d1c-c181-4c7b-b95f-9f3c62ddb102-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.367214 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4532c069-4eb7-48ab-b575-b6a130e2b438-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.367222 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vq9sz\" (UniqueName: \"kubernetes.io/projected/ee3fde95-91bf-4f6a-9753-f879d56fedbb-kube-api-access-vq9sz\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.367253 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-nffrq"] Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.372783 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d" (UID: "2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d"). InnerVolumeSpecName "config-data-default". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.374502 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-nffrq"] Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.378511 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-kube-api-access-kwqnm" (OuterVolumeSpecName: "kube-api-access-kwqnm") pod "2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d" (UID: "2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d"). InnerVolumeSpecName "kube-api-access-kwqnm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.384230 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-6945f64f65-rnq2b"] Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.384474 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/keystone-6945f64f65-rnq2b" podUID="4b928260-ac65-479d-bd4b-f14b48d24ddb" containerName="keystone-api" containerID="cri-o://77a5c881bfb4b3162733203d6d06eed351c1cde8f2967461288b978f94eeb5ba" gracePeriod=30 Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.401778 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "mysql-db") pod "2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d" (UID: "2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d"). InnerVolumeSpecName "local-storage04-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.414468 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-41fb-account-create-update-h4ql2"] Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.422593 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.433464 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-q8t82"] Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.470830 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9-operator-scripts\") pod \"keystone-41fb-account-create-update-h4ql2\" (UID: \"38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9\") " pod="openstack/keystone-41fb-account-create-update-h4ql2" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.470902 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcb7p\" (UniqueName: \"kubernetes.io/projected/38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9-kube-api-access-qcb7p\") pod \"keystone-41fb-account-create-update-h4ql2\" (UID: \"38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9\") " pod="openstack/keystone-41fb-account-create-update-h4ql2" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.470982 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwqnm\" (UniqueName: \"kubernetes.io/projected/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-kube-api-access-kwqnm\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.471001 4795 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.471012 
4795 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-config-data-default\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:59 crc kubenswrapper[4795]: E0219 21:49:59.471961 4795 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Feb 19 21:49:59 crc kubenswrapper[4795]: E0219 21:49:59.472017 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9-operator-scripts podName:38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9 nodeName:}" failed. No retries permitted until 2026-02-19 21:49:59.972002501 +0000 UTC m=+1311.164520355 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9-operator-scripts") pod "keystone-41fb-account-create-update-h4ql2" (UID: "38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9") : configmap "openstack-scripts" not found Feb 19 21:49:59 crc kubenswrapper[4795]: E0219 21:49:59.474964 4795 projected.go:194] Error preparing data for projected volume kube-api-access-qcb7p for pod openstack/keystone-41fb-account-create-update-h4ql2: failed to fetch token: serviceaccounts "galera-openstack" not found Feb 19 21:49:59 crc kubenswrapper[4795]: E0219 21:49:59.475033 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9-kube-api-access-qcb7p podName:38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9 nodeName:}" failed. No retries permitted until 2026-02-19 21:49:59.975012996 +0000 UTC m=+1311.167530860 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-qcb7p" (UniqueName: "kubernetes.io/projected/38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9-kube-api-access-qcb7p") pod "keystone-41fb-account-create-update-h4ql2" (UID: "38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9") : failed to fetch token: serviceaccounts "galera-openstack" not found Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.482867 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-q8t82"] Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.496830 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-ktl2b"] Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.504409 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d" (UID: "2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.508296 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d" (UID: "2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d"). InnerVolumeSpecName "galera-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.547697 4795 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.558082 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0adadcd9-8949-443b-8042-d0d11191eae9" path="/var/lib/kubelet/pods/0adadcd9-8949-443b-8042-d0d11191eae9/volumes" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.560848 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10e13a52-b0f3-447a-b47e-2c4dd50d6400" path="/var/lib/kubelet/pods/10e13a52-b0f3-447a-b47e-2c4dd50d6400/volumes" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.562413 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="250e9cae-06d9-44da-88af-239d15356a3c" path="/var/lib/kubelet/pods/250e9cae-06d9-44da-88af-239d15356a3c/volumes" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.563187 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c2bcb9c-07d3-4d71-924b-aacd537e3430" path="/var/lib/kubelet/pods/3c2bcb9c-07d3-4d71-924b-aacd537e3430/volumes" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.566582 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c5a8678-8ce2-4bee-9160-37b1dea9f897" path="/var/lib/kubelet/pods/3c5a8678-8ce2-4bee-9160-37b1dea9f897/volumes" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.567748 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e65bdd0-b6ac-406d-bc79-ade76397295e" path="/var/lib/kubelet/pods/3e65bdd0-b6ac-406d-bc79-ade76397295e/volumes" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.568217 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65449c45-b8f9-445e-80e7-6e3c8541c62c" 
path="/var/lib/kubelet/pods/65449c45-b8f9-445e-80e7-6e3c8541c62c/volumes" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.568705 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7400eda6-e731-4942-b002-c81dd9a87e6a" path="/var/lib/kubelet/pods/7400eda6-e731-4942-b002-c81dd9a87e6a/volumes" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.576257 4795 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.576279 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.576289 4795 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.576503 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2a7b298-40b6-43b3-9099-ec74f2f0bfad" path="/var/lib/kubelet/pods/a2a7b298-40b6-43b3-9099-ec74f2f0bfad/volumes" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.577004 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0500ca0-0cef-4b76-9c78-cb2189b520ff" path="/var/lib/kubelet/pods/b0500ca0-0cef-4b76-9c78-cb2189b520ff/volumes" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.577522 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23" path="/var/lib/kubelet/pods/cc70b8f2-4f1b-4b6e-b657-66aac1cbfa23/volumes" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.578249 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="eee0ea5d-4b43-4421-b23e-555c5eac3564" path="/var/lib/kubelet/pods/eee0ea5d-4b43-4421-b23e-555c5eac3564/volumes" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.579429 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb1c9562-0143-4fa4-86d3-f1ed93f3fa31" path="/var/lib/kubelet/pods/fb1c9562-0143-4fa4-86d3-f1ed93f3fa31/volumes" Feb 19 21:49:59 crc kubenswrapper[4795]: E0219 21:49:59.613219 4795 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb01bcd5b_435a_4702_b0a4_8dfe8f553c23.slice/crio-conmon-9d2e881624f7800e5e6e027e5487084188713e96e14cab5c281c53ae829851a2.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb01bcd5b_435a_4702_b0a4_8dfe8f553c23.slice/crio-9d2e881624f7800e5e6e027e5487084188713e96e14cab5c281c53ae829851a2.scope\": RecentStats: unable to find data in memory cache]" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.640833 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-1922-account-create-update-gzkhl" event={"ID":"ee3fde95-91bf-4f6a-9753-f879d56fedbb","Type":"ContainerDied","Data":"8f247d4d74ea4937a4f6282c8b5e4ccafc361e919f81dbaf677caedc499822b7"} Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.640933 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-1922-account-create-update-gzkhl" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.651072 4795 generic.go:334] "Generic (PLEG): container finished" podID="da2e3f89-bf0b-4371-8e5b-a0037f266c70" containerID="812bd35c706001e18c0da5e2c3dd17a059f42e984bdd2e12b92f8fd91195b1e2" exitCode=0 Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.651447 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-858c4dcd57-whkj2" event={"ID":"da2e3f89-bf0b-4371-8e5b-a0037f266c70","Type":"ContainerDied","Data":"812bd35c706001e18c0da5e2c3dd17a059f42e984bdd2e12b92f8fd91195b1e2"} Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.652922 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-2d62-account-create-update-4vs7v" event={"ID":"299d8d1c-c181-4c7b-b95f-9f3c62ddb102","Type":"ContainerDied","Data":"a6e6e439c0ca9f9cb3832a5b2b0c274f01e0f16a2f8f82d2544c41499347ea51"} Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.653109 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-2d62-account-create-update-4vs7v" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.667613 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="0bbc6c00-2fc9-42cb-9c5a-9a160903ae99" containerName="galera" containerID="cri-o://0f7b958a502b531c3fff207837d630d66468893d907fbb748e2bd58503327153" gracePeriod=30 Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.670005 4795 generic.go:334] "Generic (PLEG): container finished" podID="296f6b57-de45-495d-abe9-8c779c157057" containerID="5a4765254bd1f522ba57b50f9f6989619b42ec50e10b34373400dceff9cc29f7" exitCode=2 Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.670066 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"296f6b57-de45-495d-abe9-8c779c157057","Type":"ContainerDied","Data":"5a4765254bd1f522ba57b50f9f6989619b42ec50e10b34373400dceff9cc29f7"} Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.676138 4795 generic.go:334] "Generic (PLEG): container finished" podID="b01bcd5b-435a-4702-b0a4-8dfe8f553c23" containerID="9d2e881624f7800e5e6e027e5487084188713e96e14cab5c281c53ae829851a2" exitCode=0 Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.676215 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6df95dfbd4-ftf6x" event={"ID":"b01bcd5b-435a-4702-b0a4-8dfe8f553c23","Type":"ContainerDied","Data":"9d2e881624f7800e5e6e027e5487084188713e96e14cab5c281c53ae829851a2"} Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.681585 4795 generic.go:334] "Generic (PLEG): container finished" podID="e956453d-551f-44b4-8125-8656b3155402" containerID="c677d54567313f232f88b9a873fec54a2577075c7b806ea88ad32df8cb36cb26" exitCode=0 Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.681622 4795 generic.go:334] "Generic (PLEG): container finished" podID="e956453d-551f-44b4-8125-8656b3155402" 
containerID="21486f4b00e940629fe56d483d784412d9ef80990cf1527d454dfa0b33b31ddb" exitCode=2 Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.681629 4795 generic.go:334] "Generic (PLEG): container finished" podID="e956453d-551f-44b4-8125-8656b3155402" containerID="a502298e81b556f379f6fb3a9a80c1cf56ac3cccd10f1bd463a7436d04a56552" exitCode=0 Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.681674 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e956453d-551f-44b4-8125-8656b3155402","Type":"ContainerDied","Data":"c677d54567313f232f88b9a873fec54a2577075c7b806ea88ad32df8cb36cb26"} Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.681701 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e956453d-551f-44b4-8125-8656b3155402","Type":"ContainerDied","Data":"21486f4b00e940629fe56d483d784412d9ef80990cf1527d454dfa0b33b31ddb"} Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.681709 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e956453d-551f-44b4-8125-8656b3155402","Type":"ContainerDied","Data":"a502298e81b556f379f6fb3a9a80c1cf56ac3cccd10f1bd463a7436d04a56552"} Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.684703 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d","Type":"ContainerDied","Data":"fa781215a57c5a384dc9196151cb9d88b19a59e6ec4219a4b6443b0c5d96ab8f"} Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.684740 4795 scope.go:117] "RemoveContainer" containerID="ae770d1785f3d5deaa5b1b98adf315fd8a61a2cbce434ecb3970ab496579b196" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.684858 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.698008 4795 generic.go:334] "Generic (PLEG): container finished" podID="d2561f4e-0a01-4927-96f8-ee7bef69f561" containerID="067a3784bb90d910b6b73dca0dc993d5c4844e46f49fddffcbfe6f467e1645d3" exitCode=0 Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.698066 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d2561f4e-0a01-4927-96f8-ee7bef69f561","Type":"ContainerDied","Data":"067a3784bb90d910b6b73dca0dc993d5c4844e46f49fddffcbfe6f467e1645d3"} Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.700387 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6f98bf9994-pr48x" podUID="e4a6a069-904a-4072-b98c-346f67f22def" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.164:9311/healthcheck\": read tcp 10.217.0.2:49366->10.217.0.164:9311: read: connection reset by peer" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.700393 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6f98bf9994-pr48x" podUID="e4a6a069-904a-4072-b98c-346f67f22def" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.164:9311/healthcheck\": read tcp 10.217.0.2:49372->10.217.0.164:9311: read: connection reset by peer" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.704150 4795 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack/root-account-create-update-ktl2b" secret="" err="secret \"galera-openstack-dockercfg-snswc\" not found" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.704222 4795 scope.go:117] "RemoveContainer" containerID="37b5ade45a3dd017a14fe70f806fa2f9b2438f7ec74aeb16d3207b0a96e2916a" Feb 19 21:49:59 crc kubenswrapper[4795]: E0219 21:49:59.704490 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CrashLoopBackOff: \"back-off 10s restarting failed container=mariadb-account-create-update pod=root-account-create-update-ktl2b_openstack(2164f9d1-1d8b-486b-beca-0d3a5172b302)\"" pod="openstack/root-account-create-update-ktl2b" podUID="2164f9d1-1d8b-486b-beca-0d3a5172b302" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.707945 4795 generic.go:334] "Generic (PLEG): container finished" podID="4532c069-4eb7-48ab-b575-b6a130e2b438" containerID="b139002ae58e716fbddbfc8a4fe687ac4f1df52f0c26c4b6314da64bed2d0075" exitCode=0 Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.707999 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6b6d98fbd4-svzc8" event={"ID":"4532c069-4eb7-48ab-b575-b6a130e2b438","Type":"ContainerDied","Data":"b139002ae58e716fbddbfc8a4fe687ac4f1df52f0c26c4b6314da64bed2d0075"} Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.708024 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6b6d98fbd4-svzc8" event={"ID":"4532c069-4eb7-48ab-b575-b6a130e2b438","Type":"ContainerDied","Data":"c0b18fd78cd092c133f6dd779fd8c2b41870a6c99e45b8bcd625ff594cb4d9de"} Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.708099 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-6b6d98fbd4-svzc8" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.712476 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.712525 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"f6fd7841-2a08-4786-8e96-b2ab0f477eff","Type":"ContainerDied","Data":"d9ff2d99d6679f8062f4938a68a22506ca0912d3767e7bc32eff175a9e262c96"} Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.734634 4795 scope.go:117] "RemoveContainer" containerID="026030d64ab176967289d43803e975c62df885e093dcaebf727c13a2d67da5ff" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.735122 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3697a3b0-4077-4837-bcdc-c17d8aa361f1","Type":"ContainerDied","Data":"c6abf78f9f811ce98af3de204165d6af86666923e425da520b7a47fdc3944ee7"} Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.735361 4795 generic.go:334] "Generic (PLEG): container finished" podID="3697a3b0-4077-4837-bcdc-c17d8aa361f1" containerID="c6abf78f9f811ce98af3de204165d6af86666923e425da520b7a47fdc3944ee7" exitCode=0 Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.741208 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f769-account-create-update-8k7r2" event={"ID":"8eaa69df-d563-4dc0-8a78-40413946cbca","Type":"ContainerDied","Data":"7d6e2a5c198e5bc708e1811f1860c10bf53d61ca572544f39bca6ba9c9264ae5"} Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.741244 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d6e2a5c198e5bc708e1811f1860c10bf53d61ca572544f39bca6ba9c9264ae5" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.778085 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-f769-account-create-update-8k7r2" Feb 19 21:49:59 crc kubenswrapper[4795]: E0219 21:49:59.779823 4795 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Feb 19 21:49:59 crc kubenswrapper[4795]: E0219 21:49:59.779871 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2164f9d1-1d8b-486b-beca-0d3a5172b302-operator-scripts podName:2164f9d1-1d8b-486b-beca-0d3a5172b302 nodeName:}" failed. No retries permitted until 2026-02-19 21:50:00.279857894 +0000 UTC m=+1311.472375758 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/2164f9d1-1d8b-486b-beca-0d3a5172b302-operator-scripts") pod "root-account-create-update-ktl2b" (UID: "2164f9d1-1d8b-486b-beca-0d3a5172b302") : configmap "openstack-scripts" not found Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.782061 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-1922-account-create-update-gzkhl"] Feb 19 21:49:59 crc kubenswrapper[4795]: E0219 21:49:59.784813 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-qcb7p operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/keystone-41fb-account-create-update-h4ql2" podUID="38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.793143 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-858c4dcd57-whkj2" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.795144 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-1922-account-create-update-gzkhl"] Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.808896 4795 scope.go:117] "RemoveContainer" containerID="b139002ae58e716fbddbfc8a4fe687ac4f1df52f0c26c4b6314da64bed2d0075" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.831320 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-2d62-account-create-update-4vs7v"] Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.842496 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-2d62-account-create-update-4vs7v"] Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.857107 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-6b6d98fbd4-svzc8"] Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.866569 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-6b6d98fbd4-svzc8"] Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.867017 4795 scope.go:117] "RemoveContainer" containerID="374356bcc0d8632ce08f62d54e2492534b8e8fc2a40b07f92483295295d884ef" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.893785 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6df95dfbd4-ftf6x" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.909960 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.915182 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.941672 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.946877 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.990456 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da2e3f89-bf0b-4371-8e5b-a0037f266c70-config-data\") pod \"da2e3f89-bf0b-4371-8e5b-a0037f266c70\" (UID: \"da2e3f89-bf0b-4371-8e5b-a0037f266c70\") " Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.990496 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wfxq\" (UniqueName: \"kubernetes.io/projected/da2e3f89-bf0b-4371-8e5b-a0037f266c70-kube-api-access-2wfxq\") pod \"da2e3f89-bf0b-4371-8e5b-a0037f266c70\" (UID: \"da2e3f89-bf0b-4371-8e5b-a0037f266c70\") " Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.990529 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/da2e3f89-bf0b-4371-8e5b-a0037f266c70-internal-tls-certs\") pod \"da2e3f89-bf0b-4371-8e5b-a0037f266c70\" (UID: \"da2e3f89-bf0b-4371-8e5b-a0037f266c70\") " Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.990555 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/da2e3f89-bf0b-4371-8e5b-a0037f266c70-etc-swift\") pod \"da2e3f89-bf0b-4371-8e5b-a0037f266c70\" (UID: \"da2e3f89-bf0b-4371-8e5b-a0037f266c70\") " Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.990584 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da2e3f89-bf0b-4371-8e5b-a0037f266c70-run-httpd\") pod \"da2e3f89-bf0b-4371-8e5b-a0037f266c70\" (UID: \"da2e3f89-bf0b-4371-8e5b-a0037f266c70\") " Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.990619 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/da2e3f89-bf0b-4371-8e5b-a0037f266c70-public-tls-certs\") pod \"da2e3f89-bf0b-4371-8e5b-a0037f266c70\" (UID: \"da2e3f89-bf0b-4371-8e5b-a0037f266c70\") " Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.990654 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da2e3f89-bf0b-4371-8e5b-a0037f266c70-log-httpd\") pod \"da2e3f89-bf0b-4371-8e5b-a0037f266c70\" (UID: \"da2e3f89-bf0b-4371-8e5b-a0037f266c70\") " Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.990714 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8eaa69df-d563-4dc0-8a78-40413946cbca-operator-scripts\") pod \"8eaa69df-d563-4dc0-8a78-40413946cbca\" (UID: \"8eaa69df-d563-4dc0-8a78-40413946cbca\") " Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.990830 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hkps\" (UniqueName: \"kubernetes.io/projected/8eaa69df-d563-4dc0-8a78-40413946cbca-kube-api-access-5hkps\") pod \"8eaa69df-d563-4dc0-8a78-40413946cbca\" (UID: \"8eaa69df-d563-4dc0-8a78-40413946cbca\") " Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 
21:49:59.990851 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da2e3f89-bf0b-4371-8e5b-a0037f266c70-combined-ca-bundle\") pod \"da2e3f89-bf0b-4371-8e5b-a0037f266c70\" (UID: \"da2e3f89-bf0b-4371-8e5b-a0037f266c70\") " Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.991047 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9-operator-scripts\") pod \"keystone-41fb-account-create-update-h4ql2\" (UID: \"38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9\") " pod="openstack/keystone-41fb-account-create-update-h4ql2" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.991116 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcb7p\" (UniqueName: \"kubernetes.io/projected/38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9-kube-api-access-qcb7p\") pod \"keystone-41fb-account-create-update-h4ql2\" (UID: \"38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9\") " pod="openstack/keystone-41fb-account-create-update-h4ql2" Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.995290 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8eaa69df-d563-4dc0-8a78-40413946cbca-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8eaa69df-d563-4dc0-8a78-40413946cbca" (UID: "8eaa69df-d563-4dc0-8a78-40413946cbca"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:49:59 crc kubenswrapper[4795]: E0219 21:49:59.995485 4795 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Feb 19 21:49:59 crc kubenswrapper[4795]: E0219 21:49:59.995533 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9-operator-scripts podName:38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9 nodeName:}" failed. No retries permitted until 2026-02-19 21:50:00.995518176 +0000 UTC m=+1312.188036040 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9-operator-scripts") pod "keystone-41fb-account-create-update-h4ql2" (UID: "38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9") : configmap "openstack-scripts" not found Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.997544 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da2e3f89-bf0b-4371-8e5b-a0037f266c70-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "da2e3f89-bf0b-4371-8e5b-a0037f266c70" (UID: "da2e3f89-bf0b-4371-8e5b-a0037f266c70"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:49:59 crc kubenswrapper[4795]: E0219 21:49:59.998484 4795 projected.go:194] Error preparing data for projected volume kube-api-access-qcb7p for pod openstack/keystone-41fb-account-create-update-h4ql2: failed to fetch token: serviceaccounts "galera-openstack" not found Feb 19 21:49:59 crc kubenswrapper[4795]: E0219 21:49:59.998578 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9-kube-api-access-qcb7p podName:38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9 nodeName:}" failed. No retries permitted until 2026-02-19 21:50:00.998567222 +0000 UTC m=+1312.191085086 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-qcb7p" (UniqueName: "kubernetes.io/projected/38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9-kube-api-access-qcb7p") pod "keystone-41fb-account-create-update-h4ql2" (UID: "38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9") : failed to fetch token: serviceaccounts "galera-openstack" not found Feb 19 21:49:59 crc kubenswrapper[4795]: I0219 21:49:59.998766 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da2e3f89-bf0b-4371-8e5b-a0037f266c70-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "da2e3f89-bf0b-4371-8e5b-a0037f266c70" (UID: "da2e3f89-bf0b-4371-8e5b-a0037f266c70"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.005634 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8eaa69df-d563-4dc0-8a78-40413946cbca-kube-api-access-5hkps" (OuterVolumeSpecName: "kube-api-access-5hkps") pod "8eaa69df-d563-4dc0-8a78-40413946cbca" (UID: "8eaa69df-d563-4dc0-8a78-40413946cbca"). InnerVolumeSpecName "kube-api-access-5hkps". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.012637 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da2e3f89-bf0b-4371-8e5b-a0037f266c70-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "da2e3f89-bf0b-4371-8e5b-a0037f266c70" (UID: "da2e3f89-bf0b-4371-8e5b-a0037f266c70"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.014323 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da2e3f89-bf0b-4371-8e5b-a0037f266c70-kube-api-access-2wfxq" (OuterVolumeSpecName: "kube-api-access-2wfxq") pod "da2e3f89-bf0b-4371-8e5b-a0037f266c70" (UID: "da2e3f89-bf0b-4371-8e5b-a0037f266c70"). InnerVolumeSpecName "kube-api-access-2wfxq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.025372 4795 scope.go:117] "RemoveContainer" containerID="b139002ae58e716fbddbfc8a4fe687ac4f1df52f0c26c4b6314da64bed2d0075" Feb 19 21:50:00 crc kubenswrapper[4795]: E0219 21:50:00.033263 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b139002ae58e716fbddbfc8a4fe687ac4f1df52f0c26c4b6314da64bed2d0075\": container with ID starting with b139002ae58e716fbddbfc8a4fe687ac4f1df52f0c26c4b6314da64bed2d0075 not found: ID does not exist" containerID="b139002ae58e716fbddbfc8a4fe687ac4f1df52f0c26c4b6314da64bed2d0075" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.033330 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b139002ae58e716fbddbfc8a4fe687ac4f1df52f0c26c4b6314da64bed2d0075"} err="failed to get container status \"b139002ae58e716fbddbfc8a4fe687ac4f1df52f0c26c4b6314da64bed2d0075\": rpc error: code = NotFound desc = could not find container \"b139002ae58e716fbddbfc8a4fe687ac4f1df52f0c26c4b6314da64bed2d0075\": container with ID starting with b139002ae58e716fbddbfc8a4fe687ac4f1df52f0c26c4b6314da64bed2d0075 not found: ID does not exist" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.033359 4795 scope.go:117] "RemoveContainer" containerID="374356bcc0d8632ce08f62d54e2492534b8e8fc2a40b07f92483295295d884ef" Feb 19 21:50:00 crc kubenswrapper[4795]: E0219 21:50:00.034287 
4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"374356bcc0d8632ce08f62d54e2492534b8e8fc2a40b07f92483295295d884ef\": container with ID starting with 374356bcc0d8632ce08f62d54e2492534b8e8fc2a40b07f92483295295d884ef not found: ID does not exist" containerID="374356bcc0d8632ce08f62d54e2492534b8e8fc2a40b07f92483295295d884ef" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.034431 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"374356bcc0d8632ce08f62d54e2492534b8e8fc2a40b07f92483295295d884ef"} err="failed to get container status \"374356bcc0d8632ce08f62d54e2492534b8e8fc2a40b07f92483295295d884ef\": rpc error: code = NotFound desc = could not find container \"374356bcc0d8632ce08f62d54e2492534b8e8fc2a40b07f92483295295d884ef\": container with ID starting with 374356bcc0d8632ce08f62d54e2492534b8e8fc2a40b07f92483295295d884ef not found: ID does not exist" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.034452 4795 scope.go:117] "RemoveContainer" containerID="4f511a80129a9b198e5be1216670e966f0623d26f1c7d17cc3a99a813e69b1c3" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.079113 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da2e3f89-bf0b-4371-8e5b-a0037f266c70-config-data" (OuterVolumeSpecName: "config-data") pod "da2e3f89-bf0b-4371-8e5b-a0037f266c70" (UID: "da2e3f89-bf0b-4371-8e5b-a0037f266c70"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.080967 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da2e3f89-bf0b-4371-8e5b-a0037f266c70-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "da2e3f89-bf0b-4371-8e5b-a0037f266c70" (UID: "da2e3f89-bf0b-4371-8e5b-a0037f266c70"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.089185 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da2e3f89-bf0b-4371-8e5b-a0037f266c70-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "da2e3f89-bf0b-4371-8e5b-a0037f266c70" (UID: "da2e3f89-bf0b-4371-8e5b-a0037f266c70"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.095151 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-public-tls-certs\") pod \"b01bcd5b-435a-4702-b0a4-8dfe8f553c23\" (UID: \"b01bcd5b-435a-4702-b0a4-8dfe8f553c23\") " Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.095210 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-scripts\") pod \"b01bcd5b-435a-4702-b0a4-8dfe8f553c23\" (UID: \"b01bcd5b-435a-4702-b0a4-8dfe8f553c23\") " Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.095245 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-logs\") pod \"b01bcd5b-435a-4702-b0a4-8dfe8f553c23\" (UID: \"b01bcd5b-435a-4702-b0a4-8dfe8f553c23\") " Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.095314 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-combined-ca-bundle\") pod \"b01bcd5b-435a-4702-b0a4-8dfe8f553c23\" (UID: \"b01bcd5b-435a-4702-b0a4-8dfe8f553c23\") " Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.095393 4795 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-config-data\") pod \"b01bcd5b-435a-4702-b0a4-8dfe8f553c23\" (UID: \"b01bcd5b-435a-4702-b0a4-8dfe8f553c23\") " Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.095468 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-internal-tls-certs\") pod \"b01bcd5b-435a-4702-b0a4-8dfe8f553c23\" (UID: \"b01bcd5b-435a-4702-b0a4-8dfe8f553c23\") " Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.095487 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84vz2\" (UniqueName: \"kubernetes.io/projected/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-kube-api-access-84vz2\") pod \"b01bcd5b-435a-4702-b0a4-8dfe8f553c23\" (UID: \"b01bcd5b-435a-4702-b0a4-8dfe8f553c23\") " Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.095849 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5hkps\" (UniqueName: \"kubernetes.io/projected/8eaa69df-d563-4dc0-8a78-40413946cbca-kube-api-access-5hkps\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.095862 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da2e3f89-bf0b-4371-8e5b-a0037f266c70-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.095871 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wfxq\" (UniqueName: \"kubernetes.io/projected/da2e3f89-bf0b-4371-8e5b-a0037f266c70-kube-api-access-2wfxq\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.095879 4795 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/da2e3f89-bf0b-4371-8e5b-a0037f266c70-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.095887 4795 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/da2e3f89-bf0b-4371-8e5b-a0037f266c70-etc-swift\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.095895 4795 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da2e3f89-bf0b-4371-8e5b-a0037f266c70-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.095902 4795 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/da2e3f89-bf0b-4371-8e5b-a0037f266c70-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.095910 4795 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da2e3f89-bf0b-4371-8e5b-a0037f266c70-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.095918 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8eaa69df-d563-4dc0-8a78-40413946cbca-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.098752 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-logs" (OuterVolumeSpecName: "logs") pod "b01bcd5b-435a-4702-b0a4-8dfe8f553c23" (UID: "b01bcd5b-435a-4702-b0a4-8dfe8f553c23"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.108640 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-kube-api-access-84vz2" (OuterVolumeSpecName: "kube-api-access-84vz2") pod "b01bcd5b-435a-4702-b0a4-8dfe8f553c23" (UID: "b01bcd5b-435a-4702-b0a4-8dfe8f553c23"). InnerVolumeSpecName "kube-api-access-84vz2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.118020 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-scripts" (OuterVolumeSpecName: "scripts") pod "b01bcd5b-435a-4702-b0a4-8dfe8f553c23" (UID: "b01bcd5b-435a-4702-b0a4-8dfe8f553c23"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.130255 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da2e3f89-bf0b-4371-8e5b-a0037f266c70-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "da2e3f89-bf0b-4371-8e5b-a0037f266c70" (UID: "da2e3f89-bf0b-4371-8e5b-a0037f266c70"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.197006 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da2e3f89-bf0b-4371-8e5b-a0037f266c70-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.197033 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84vz2\" (UniqueName: \"kubernetes.io/projected/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-kube-api-access-84vz2\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.197043 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.197051 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-logs\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.202192 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.291291 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b01bcd5b-435a-4702-b0a4-8dfe8f553c23" (UID: "b01bcd5b-435a-4702-b0a4-8dfe8f553c23"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.297629 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-config-data" (OuterVolumeSpecName: "config-data") pod "b01bcd5b-435a-4702-b0a4-8dfe8f553c23" (UID: "b01bcd5b-435a-4702-b0a4-8dfe8f553c23"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.297771 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"3697a3b0-4077-4837-bcdc-c17d8aa361f1\" (UID: \"3697a3b0-4077-4837-bcdc-c17d8aa361f1\") " Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.297908 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3697a3b0-4077-4837-bcdc-c17d8aa361f1-logs\") pod \"3697a3b0-4077-4837-bcdc-c17d8aa361f1\" (UID: \"3697a3b0-4077-4837-bcdc-c17d8aa361f1\") " Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.298017 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bg5wb\" (UniqueName: \"kubernetes.io/projected/3697a3b0-4077-4837-bcdc-c17d8aa361f1-kube-api-access-bg5wb\") pod \"3697a3b0-4077-4837-bcdc-c17d8aa361f1\" (UID: \"3697a3b0-4077-4837-bcdc-c17d8aa361f1\") " Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.298056 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3697a3b0-4077-4837-bcdc-c17d8aa361f1-httpd-run\") pod \"3697a3b0-4077-4837-bcdc-c17d8aa361f1\" (UID: \"3697a3b0-4077-4837-bcdc-c17d8aa361f1\") " Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.298138 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/3697a3b0-4077-4837-bcdc-c17d8aa361f1-config-data\") pod \"3697a3b0-4077-4837-bcdc-c17d8aa361f1\" (UID: \"3697a3b0-4077-4837-bcdc-c17d8aa361f1\") " Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.298482 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3697a3b0-4077-4837-bcdc-c17d8aa361f1-combined-ca-bundle\") pod \"3697a3b0-4077-4837-bcdc-c17d8aa361f1\" (UID: \"3697a3b0-4077-4837-bcdc-c17d8aa361f1\") " Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.298552 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3697a3b0-4077-4837-bcdc-c17d8aa361f1-scripts\") pod \"3697a3b0-4077-4837-bcdc-c17d8aa361f1\" (UID: \"3697a3b0-4077-4837-bcdc-c17d8aa361f1\") " Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.298608 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3697a3b0-4077-4837-bcdc-c17d8aa361f1-internal-tls-certs\") pod \"3697a3b0-4077-4837-bcdc-c17d8aa361f1\" (UID: \"3697a3b0-4077-4837-bcdc-c17d8aa361f1\") " Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.300510 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3697a3b0-4077-4837-bcdc-c17d8aa361f1-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "3697a3b0-4077-4837-bcdc-c17d8aa361f1" (UID: "3697a3b0-4077-4837-bcdc-c17d8aa361f1"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:50:00 crc kubenswrapper[4795]: E0219 21:50:00.302532 4795 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Feb 19 21:50:00 crc kubenswrapper[4795]: E0219 21:50:00.303615 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2164f9d1-1d8b-486b-beca-0d3a5172b302-operator-scripts podName:2164f9d1-1d8b-486b-beca-0d3a5172b302 nodeName:}" failed. No retries permitted until 2026-02-19 21:50:01.303447441 +0000 UTC m=+1312.495965305 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/2164f9d1-1d8b-486b-beca-0d3a5172b302-operator-scripts") pod "root-account-create-update-ktl2b" (UID: "2164f9d1-1d8b-486b-beca-0d3a5172b302") : configmap "openstack-scripts" not found Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.303305 4795 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3697a3b0-4077-4837-bcdc-c17d8aa361f1-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.303891 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.304014 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.304886 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3697a3b0-4077-4837-bcdc-c17d8aa361f1-kube-api-access-bg5wb" (OuterVolumeSpecName: "kube-api-access-bg5wb") pod 
"3697a3b0-4077-4837-bcdc-c17d8aa361f1" (UID: "3697a3b0-4077-4837-bcdc-c17d8aa361f1"). InnerVolumeSpecName "kube-api-access-bg5wb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.306419 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3697a3b0-4077-4837-bcdc-c17d8aa361f1-logs" (OuterVolumeSpecName: "logs") pod "3697a3b0-4077-4837-bcdc-c17d8aa361f1" (UID: "3697a3b0-4077-4837-bcdc-c17d8aa361f1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.320302 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "3697a3b0-4077-4837-bcdc-c17d8aa361f1" (UID: "3697a3b0-4077-4837-bcdc-c17d8aa361f1"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.325732 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3697a3b0-4077-4837-bcdc-c17d8aa361f1-scripts" (OuterVolumeSpecName: "scripts") pod "3697a3b0-4077-4837-bcdc-c17d8aa361f1" (UID: "3697a3b0-4077-4837-bcdc-c17d8aa361f1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.342363 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3697a3b0-4077-4837-bcdc-c17d8aa361f1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3697a3b0-4077-4837-bcdc-c17d8aa361f1" (UID: "3697a3b0-4077-4837-bcdc-c17d8aa361f1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.369250 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b01bcd5b-435a-4702-b0a4-8dfe8f553c23" (UID: "b01bcd5b-435a-4702-b0a4-8dfe8f553c23"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.369351 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b01bcd5b-435a-4702-b0a4-8dfe8f553c23" (UID: "b01bcd5b-435a-4702-b0a4-8dfe8f553c23"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.370628 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3697a3b0-4077-4837-bcdc-c17d8aa361f1-config-data" (OuterVolumeSpecName: "config-data") pod "3697a3b0-4077-4837-bcdc-c17d8aa361f1" (UID: "3697a3b0-4077-4837-bcdc-c17d8aa361f1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.386921 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3697a3b0-4077-4837-bcdc-c17d8aa361f1-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "3697a3b0-4077-4837-bcdc-c17d8aa361f1" (UID: "3697a3b0-4077-4837-bcdc-c17d8aa361f1"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.405524 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bg5wb\" (UniqueName: \"kubernetes.io/projected/3697a3b0-4077-4837-bcdc-c17d8aa361f1-kube-api-access-bg5wb\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.405559 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3697a3b0-4077-4837-bcdc-c17d8aa361f1-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.405572 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3697a3b0-4077-4837-bcdc-c17d8aa361f1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.405583 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3697a3b0-4077-4837-bcdc-c17d8aa361f1-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.405595 4795 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3697a3b0-4077-4837-bcdc-c17d8aa361f1-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.405620 4795 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.405631 4795 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.405641 4795 
reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3697a3b0-4077-4837-bcdc-c17d8aa361f1-logs\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.405650 4795 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b01bcd5b-435a-4702-b0a4-8dfe8f553c23-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.423955 4795 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.507471 4795 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.563871 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.571946 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.618307 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6f98bf9994-pr48x" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.693394 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.695329 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.700034 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.710689 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gj2cq\" (UniqueName: \"kubernetes.io/projected/d2561f4e-0a01-4927-96f8-ee7bef69f561-kube-api-access-gj2cq\") pod \"d2561f4e-0a01-4927-96f8-ee7bef69f561\" (UID: \"d2561f4e-0a01-4927-96f8-ee7bef69f561\") " Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.710772 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2561f4e-0a01-4927-96f8-ee7bef69f561-internal-tls-certs\") pod \"d2561f4e-0a01-4927-96f8-ee7bef69f561\" (UID: \"d2561f4e-0a01-4927-96f8-ee7bef69f561\") " Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.711131 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2561f4e-0a01-4927-96f8-ee7bef69f561-logs\") pod \"d2561f4e-0a01-4927-96f8-ee7bef69f561\" (UID: \"d2561f4e-0a01-4927-96f8-ee7bef69f561\") " Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.711225 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2561f4e-0a01-4927-96f8-ee7bef69f561-public-tls-certs\") pod \"d2561f4e-0a01-4927-96f8-ee7bef69f561\" (UID: \"d2561f4e-0a01-4927-96f8-ee7bef69f561\") " Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.711251 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/296f6b57-de45-495d-abe9-8c779c157057-kube-state-metrics-tls-certs\") pod \"296f6b57-de45-495d-abe9-8c779c157057\" (UID: \"296f6b57-de45-495d-abe9-8c779c157057\") " Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.711329 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d2561f4e-0a01-4927-96f8-ee7bef69f561-etc-machine-id\") pod \"d2561f4e-0a01-4927-96f8-ee7bef69f561\" (UID: \"d2561f4e-0a01-4927-96f8-ee7bef69f561\") " Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.711355 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/296f6b57-de45-495d-abe9-8c779c157057-kube-state-metrics-tls-config\") pod \"296f6b57-de45-495d-abe9-8c779c157057\" (UID: \"296f6b57-de45-495d-abe9-8c779c157057\") " Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.711375 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/296f6b57-de45-495d-abe9-8c779c157057-combined-ca-bundle\") pod \"296f6b57-de45-495d-abe9-8c779c157057\" (UID: \"296f6b57-de45-495d-abe9-8c779c157057\") " Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.711421 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d2561f4e-0a01-4927-96f8-ee7bef69f561-config-data-custom\") pod \"d2561f4e-0a01-4927-96f8-ee7bef69f561\" (UID: \"d2561f4e-0a01-4927-96f8-ee7bef69f561\") " Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.711445 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zlmk9\" (UniqueName: \"kubernetes.io/projected/296f6b57-de45-495d-abe9-8c779c157057-kube-api-access-zlmk9\") pod \"296f6b57-de45-495d-abe9-8c779c157057\" (UID: \"296f6b57-de45-495d-abe9-8c779c157057\") " Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.711464 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2561f4e-0a01-4927-96f8-ee7bef69f561-scripts\") pod \"d2561f4e-0a01-4927-96f8-ee7bef69f561\" (UID: 
\"d2561f4e-0a01-4927-96f8-ee7bef69f561\") " Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.711493 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2561f4e-0a01-4927-96f8-ee7bef69f561-combined-ca-bundle\") pod \"d2561f4e-0a01-4927-96f8-ee7bef69f561\" (UID: \"d2561f4e-0a01-4927-96f8-ee7bef69f561\") " Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.711512 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2561f4e-0a01-4927-96f8-ee7bef69f561-config-data\") pod \"d2561f4e-0a01-4927-96f8-ee7bef69f561\" (UID: \"d2561f4e-0a01-4927-96f8-ee7bef69f561\") " Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.718182 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2561f4e-0a01-4927-96f8-ee7bef69f561-logs" (OuterVolumeSpecName: "logs") pod "d2561f4e-0a01-4927-96f8-ee7bef69f561" (UID: "d2561f4e-0a01-4927-96f8-ee7bef69f561"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.718520 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d2561f4e-0a01-4927-96f8-ee7bef69f561-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "d2561f4e-0a01-4927-96f8-ee7bef69f561" (UID: "d2561f4e-0a01-4927-96f8-ee7bef69f561"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.719130 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2561f4e-0a01-4927-96f8-ee7bef69f561-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d2561f4e-0a01-4927-96f8-ee7bef69f561" (UID: "d2561f4e-0a01-4927-96f8-ee7bef69f561"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.722106 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2561f4e-0a01-4927-96f8-ee7bef69f561-scripts" (OuterVolumeSpecName: "scripts") pod "d2561f4e-0a01-4927-96f8-ee7bef69f561" (UID: "d2561f4e-0a01-4927-96f8-ee7bef69f561"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.728214 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2561f4e-0a01-4927-96f8-ee7bef69f561-kube-api-access-gj2cq" (OuterVolumeSpecName: "kube-api-access-gj2cq") pod "d2561f4e-0a01-4927-96f8-ee7bef69f561" (UID: "d2561f4e-0a01-4927-96f8-ee7bef69f561"). InnerVolumeSpecName "kube-api-access-gj2cq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.734260 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7677694455-vk4hv" podUID="74df8ac0-77f2-4e8d-aa39-d05dc6ce7190" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.201:5353: i/o timeout" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.754456 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3697a3b0-4077-4837-bcdc-c17d8aa361f1","Type":"ContainerDied","Data":"4b0f0860894e220641b68db9b622d33a66b09008a18bfa19149efafa413199c3"} Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.754510 4795 scope.go:117] "RemoveContainer" containerID="c6abf78f9f811ce98af3de204165d6af86666923e425da520b7a47fdc3944ee7" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.754607 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.766805 4795 generic.go:334] "Generic (PLEG): container finished" podID="42e81a78-17fd-4ed3-b072-dd6a20cbe5d3" containerID="db08b177837bd80146274836c0977b72219e93219854a2c6c5e81a24922a33fe" exitCode=0 Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.766898 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"42e81a78-17fd-4ed3-b072-dd6a20cbe5d3","Type":"ContainerDied","Data":"db08b177837bd80146274836c0977b72219e93219854a2c6c5e81a24922a33fe"} Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.768785 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6df95dfbd4-ftf6x" event={"ID":"b01bcd5b-435a-4702-b0a4-8dfe8f553c23","Type":"ContainerDied","Data":"47caa9a7519cc7b778b03d7e938c02973816e703da61178a9af0e7d1bdc77812"} Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.768867 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6df95dfbd4-ftf6x" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.779266 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/296f6b57-de45-495d-abe9-8c779c157057-kube-api-access-zlmk9" (OuterVolumeSpecName: "kube-api-access-zlmk9") pod "296f6b57-de45-495d-abe9-8c779c157057" (UID: "296f6b57-de45-495d-abe9-8c779c157057"). InnerVolumeSpecName "kube-api-access-zlmk9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.784045 4795 generic.go:334] "Generic (PLEG): container finished" podID="793bbadc-8b53-4084-a63a-0b76b37284df" containerID="5b12dfe428f5ad6bc0c6c39c5790008897c140f405635c55ad4d8157a86e3f70" exitCode=0 Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.784108 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"793bbadc-8b53-4084-a63a-0b76b37284df","Type":"ContainerDied","Data":"5b12dfe428f5ad6bc0c6c39c5790008897c140f405635c55ad4d8157a86e3f70"} Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.784129 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"793bbadc-8b53-4084-a63a-0b76b37284df","Type":"ContainerDied","Data":"14cc3bd11f9f9a1b0b976a11e87f576616488b4c4d4dfa8a49e1d97fcc43ddfd"} Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.784192 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.794128 4795 scope.go:117] "RemoveContainer" containerID="e797be576a87ea7d2cd1a10d4fb93c6e0f25a6a5bebf1abb85c7b6e12aa13e38" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.795459 4795 generic.go:334] "Generic (PLEG): container finished" podID="a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9" containerID="708e4d014b2d6219a997cb5284977900cb2aafb592d9023c95962ea2c3074217" exitCode=0 Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.795521 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.795567 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9","Type":"ContainerDied","Data":"708e4d014b2d6219a997cb5284977900cb2aafb592d9023c95962ea2c3074217"} Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.795590 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9","Type":"ContainerDied","Data":"6d8d28f68ae7a05b3b24448d485df065e39bc0509b04817346db5c0af58598b8"} Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.809909 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.810000 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"296f6b57-de45-495d-abe9-8c779c157057","Type":"ContainerDied","Data":"b8dfa3bc80470864a239275349fad5e129ee2b5e9ff5a093105f56a1aaa790d3"} Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.811999 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.812885 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/793bbadc-8b53-4084-a63a-0b76b37284df-logs\") pod \"793bbadc-8b53-4084-a63a-0b76b37284df\" (UID: \"793bbadc-8b53-4084-a63a-0b76b37284df\") " Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.812948 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107-config-data\") pod \"e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107\" (UID: 
\"e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107\") " Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.812986 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/793bbadc-8b53-4084-a63a-0b76b37284df-public-tls-certs\") pod \"793bbadc-8b53-4084-a63a-0b76b37284df\" (UID: \"793bbadc-8b53-4084-a63a-0b76b37284df\") " Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.813016 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4a6a069-904a-4072-b98c-346f67f22def-internal-tls-certs\") pod \"e4a6a069-904a-4072-b98c-346f67f22def\" (UID: \"e4a6a069-904a-4072-b98c-346f67f22def\") " Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.813057 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-public-tls-certs\") pod \"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\" (UID: \"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\") " Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.813122 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-config-data\") pod \"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\" (UID: \"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\") " Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.813185 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6pv9\" (UniqueName: \"kubernetes.io/projected/e4a6a069-904a-4072-b98c-346f67f22def-kube-api-access-h6pv9\") pod \"e4a6a069-904a-4072-b98c-346f67f22def\" (UID: \"e4a6a069-904a-4072-b98c-346f67f22def\") " Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.813209 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8fsc\" 
(UniqueName: \"kubernetes.io/projected/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-kube-api-access-x8fsc\") pod \"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\" (UID: \"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\") " Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.813252 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5sbjg\" (UniqueName: \"kubernetes.io/projected/793bbadc-8b53-4084-a63a-0b76b37284df-kube-api-access-5sbjg\") pod \"793bbadc-8b53-4084-a63a-0b76b37284df\" (UID: \"793bbadc-8b53-4084-a63a-0b76b37284df\") " Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.813282 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-scripts\") pod \"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\" (UID: \"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\") " Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.813301 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-logs\") pod \"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\" (UID: \"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\") " Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.813334 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4a6a069-904a-4072-b98c-346f67f22def-combined-ca-bundle\") pod \"e4a6a069-904a-4072-b98c-346f67f22def\" (UID: \"e4a6a069-904a-4072-b98c-346f67f22def\") " Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.813354 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107-nova-metadata-tls-certs\") pod \"e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107\" (UID: \"e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107\") " Feb 19 21:50:00 crc 
kubenswrapper[4795]: I0219 21:50:00.813369 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107-combined-ca-bundle\") pod \"e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107\" (UID: \"e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107\") " Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.813385 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4a6a069-904a-4072-b98c-346f67f22def-config-data\") pod \"e4a6a069-904a-4072-b98c-346f67f22def\" (UID: \"e4a6a069-904a-4072-b98c-346f67f22def\") " Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.813402 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/793bbadc-8b53-4084-a63a-0b76b37284df-combined-ca-bundle\") pod \"793bbadc-8b53-4084-a63a-0b76b37284df\" (UID: \"793bbadc-8b53-4084-a63a-0b76b37284df\") " Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.813427 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-httpd-run\") pod \"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\" (UID: \"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\") " Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.813448 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4a6a069-904a-4072-b98c-346f67f22def-public-tls-certs\") pod \"e4a6a069-904a-4072-b98c-346f67f22def\" (UID: \"e4a6a069-904a-4072-b98c-346f67f22def\") " Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.813463 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/793bbadc-8b53-4084-a63a-0b76b37284df-internal-tls-certs\") pod \"793bbadc-8b53-4084-a63a-0b76b37284df\" (UID: \"793bbadc-8b53-4084-a63a-0b76b37284df\") " Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.813509 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxzt2\" (UniqueName: \"kubernetes.io/projected/e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107-kube-api-access-hxzt2\") pod \"e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107\" (UID: \"e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107\") " Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.813535 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107-logs\") pod \"e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107\" (UID: \"e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107\") " Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.813556 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\" (UID: \"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\") " Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.813575 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e4a6a069-904a-4072-b98c-346f67f22def-config-data-custom\") pod \"e4a6a069-904a-4072-b98c-346f67f22def\" (UID: \"e4a6a069-904a-4072-b98c-346f67f22def\") " Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.813592 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/793bbadc-8b53-4084-a63a-0b76b37284df-config-data\") pod \"793bbadc-8b53-4084-a63a-0b76b37284df\" (UID: \"793bbadc-8b53-4084-a63a-0b76b37284df\") " Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.813608 4795 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-combined-ca-bundle\") pod \"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\" (UID: \"a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9\") " Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.813629 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4a6a069-904a-4072-b98c-346f67f22def-logs\") pod \"e4a6a069-904a-4072-b98c-346f67f22def\" (UID: \"e4a6a069-904a-4072-b98c-346f67f22def\") " Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.813944 4795 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d2561f4e-0a01-4927-96f8-ee7bef69f561-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.813957 4795 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d2561f4e-0a01-4927-96f8-ee7bef69f561-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.813969 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zlmk9\" (UniqueName: \"kubernetes.io/projected/296f6b57-de45-495d-abe9-8c779c157057-kube-api-access-zlmk9\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.813978 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2561f4e-0a01-4927-96f8-ee7bef69f561-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.814092 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gj2cq\" (UniqueName: \"kubernetes.io/projected/d2561f4e-0a01-4927-96f8-ee7bef69f561-kube-api-access-gj2cq\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:00 crc kubenswrapper[4795]: 
I0219 21:50:00.814106 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2561f4e-0a01-4927-96f8-ee7bef69f561-logs\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.814620 4795 generic.go:334] "Generic (PLEG): container finished" podID="57b83043-2f7c-4b55-a2b9-66eef96f0008" containerID="4f011f521b748247939cf0cf1c345e321c4287fa532812c9357da77b932cf4c6" exitCode=0 Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.814728 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4a6a069-904a-4072-b98c-346f67f22def-logs" (OuterVolumeSpecName: "logs") pod "e4a6a069-904a-4072-b98c-346f67f22def" (UID: "e4a6a069-904a-4072-b98c-346f67f22def"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.814728 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"57b83043-2f7c-4b55-a2b9-66eef96f0008","Type":"ContainerDied","Data":"4f011f521b748247939cf0cf1c345e321c4287fa532812c9357da77b932cf4c6"} Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.815250 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/793bbadc-8b53-4084-a63a-0b76b37284df-logs" (OuterVolumeSpecName: "logs") pod "793bbadc-8b53-4084-a63a-0b76b37284df" (UID: "793bbadc-8b53-4084-a63a-0b76b37284df"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.816231 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107-logs" (OuterVolumeSpecName: "logs") pod "e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107" (UID: "e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.816449 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9" (UID: "a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.817392 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-logs" (OuterVolumeSpecName: "logs") pod "a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9" (UID: "a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.818117 4795 generic.go:334] "Generic (PLEG): container finished" podID="e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107" containerID="11d0871b61a89a3e5545d5527d867256cc43829525a32af590e7a61ae24edec5" exitCode=0 Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.818191 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107","Type":"ContainerDied","Data":"11d0871b61a89a3e5545d5527d867256cc43829525a32af590e7a61ae24edec5"} Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.818213 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107","Type":"ContainerDied","Data":"d49ebe84c757eba1bd2bd142f49a19380b1d9884de4a65242a0e36933f808c52"} Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.818276 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.821404 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d2561f4e-0a01-4927-96f8-ee7bef69f561","Type":"ContainerDied","Data":"72a722e4ebd20de4e2ab880d4812af758e115c7dc2dbe4b6fadf7ad0adda880d"} Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.821441 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.823299 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.834864 4795 generic.go:334] "Generic (PLEG): container finished" podID="e4a6a069-904a-4072-b98c-346f67f22def" containerID="a6cf86c6cb492d1373f389e62b5d8e8687a30a53e6a19fc785851a2a8a42b429" exitCode=0 Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.834912 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6f98bf9994-pr48x" event={"ID":"e4a6a069-904a-4072-b98c-346f67f22def","Type":"ContainerDied","Data":"a6cf86c6cb492d1373f389e62b5d8e8687a30a53e6a19fc785851a2a8a42b429"} Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.834937 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6f98bf9994-pr48x" event={"ID":"e4a6a069-904a-4072-b98c-346f67f22def","Type":"ContainerDied","Data":"95eca73f9943de18ca7dd19f1ef5d95e39ab42d81563dce332afbfa7377d20f4"} Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.834987 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6f98bf9994-pr48x" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.837614 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/793bbadc-8b53-4084-a63a-0b76b37284df-kube-api-access-5sbjg" (OuterVolumeSpecName: "kube-api-access-5sbjg") pod "793bbadc-8b53-4084-a63a-0b76b37284df" (UID: "793bbadc-8b53-4084-a63a-0b76b37284df"). InnerVolumeSpecName "kube-api-access-5sbjg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.838257 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.840577 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-41fb-account-create-update-h4ql2" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.841019 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-858c4dcd57-whkj2" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.841327 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-f769-account-create-update-8k7r2" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.842310 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-kube-api-access-x8fsc" (OuterVolumeSpecName: "kube-api-access-x8fsc") pod "a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9" (UID: "a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9"). InnerVolumeSpecName "kube-api-access-x8fsc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.842409 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-858c4dcd57-whkj2" event={"ID":"da2e3f89-bf0b-4371-8e5b-a0037f266c70","Type":"ContainerDied","Data":"fd10d4f85e04ded895f7718dd53443f09a3be089bf6f4718e6d017852d997436"} Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.845643 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9" (UID: "a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.860303 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4a6a069-904a-4072-b98c-346f67f22def-kube-api-access-h6pv9" (OuterVolumeSpecName: "kube-api-access-h6pv9") pod "e4a6a069-904a-4072-b98c-346f67f22def" (UID: "e4a6a069-904a-4072-b98c-346f67f22def"). InnerVolumeSpecName "kube-api-access-h6pv9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.862936 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107-kube-api-access-hxzt2" (OuterVolumeSpecName: "kube-api-access-hxzt2") pod "e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107" (UID: "e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107"). InnerVolumeSpecName "kube-api-access-hxzt2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.864362 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4a6a069-904a-4072-b98c-346f67f22def-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e4a6a069-904a-4072-b98c-346f67f22def" (UID: "e4a6a069-904a-4072-b98c-346f67f22def"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.887858 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-scripts" (OuterVolumeSpecName: "scripts") pod "a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9" (UID: "a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:00 crc kubenswrapper[4795]: E0219 21:50:00.892730 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4f011f521b748247939cf0cf1c345e321c4287fa532812c9357da77b932cf4c6 is running failed: container process not found" containerID="4f011f521b748247939cf0cf1c345e321c4287fa532812c9357da77b932cf4c6" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 19 21:50:00 crc kubenswrapper[4795]: E0219 21:50:00.906730 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4f011f521b748247939cf0cf1c345e321c4287fa532812c9357da77b932cf4c6 is running failed: container process not found" containerID="4f011f521b748247939cf0cf1c345e321c4287fa532812c9357da77b932cf4c6" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 19 21:50:00 crc kubenswrapper[4795]: E0219 21:50:00.907508 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc 
= container is not created or running: checking if PID of 4f011f521b748247939cf0cf1c345e321c4287fa532812c9357da77b932cf4c6 is running failed: container process not found" containerID="4f011f521b748247939cf0cf1c345e321c4287fa532812c9357da77b932cf4c6" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 19 21:50:00 crc kubenswrapper[4795]: E0219 21:50:00.907537 4795 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4f011f521b748247939cf0cf1c345e321c4287fa532812c9357da77b932cf4c6 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="57b83043-2f7c-4b55-a2b9-66eef96f0008" containerName="nova-cell0-conductor-conductor" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.912660 4795 scope.go:117] "RemoveContainer" containerID="9d2e881624f7800e5e6e027e5487084188713e96e14cab5c281c53ae829851a2" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.917403 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5sbjg\" (UniqueName: \"kubernetes.io/projected/793bbadc-8b53-4084-a63a-0b76b37284df-kube-api-access-5sbjg\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.917428 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.917438 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-logs\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.917446 4795 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:00 
crc kubenswrapper[4795]: I0219 21:50:00.917455 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxzt2\" (UniqueName: \"kubernetes.io/projected/e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107-kube-api-access-hxzt2\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.917465 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107-logs\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.917484 4795 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.917493 4795 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e4a6a069-904a-4072-b98c-346f67f22def-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.917502 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4a6a069-904a-4072-b98c-346f67f22def-logs\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.917510 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/793bbadc-8b53-4084-a63a-0b76b37284df-logs\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.917519 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6pv9\" (UniqueName: \"kubernetes.io/projected/e4a6a069-904a-4072-b98c-346f67f22def-kube-api-access-h6pv9\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.917527 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x8fsc\" (UniqueName: 
\"kubernetes.io/projected/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-kube-api-access-x8fsc\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.917747 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-6df95dfbd4-ftf6x"] Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.925384 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/296f6b57-de45-495d-abe9-8c779c157057-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "296f6b57-de45-495d-abe9-8c779c157057" (UID: "296f6b57-de45-495d-abe9-8c779c157057"). InnerVolumeSpecName "kube-state-metrics-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.932492 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-41fb-account-create-update-h4ql2" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.933666 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-6df95dfbd4-ftf6x"] Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.941328 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/296f6b57-de45-495d-abe9-8c779c157057-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "296f6b57-de45-495d-abe9-8c779c157057" (UID: "296f6b57-de45-495d-abe9-8c779c157057"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.965097 4795 scope.go:117] "RemoveContainer" containerID="16d1434f729c32f7f5af098d1664dafb8ed3d4636079462bf6c45b1454ee08ef" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.974044 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2561f4e-0a01-4927-96f8-ee7bef69f561-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d2561f4e-0a01-4927-96f8-ee7bef69f561" (UID: "d2561f4e-0a01-4927-96f8-ee7bef69f561"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.982002 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-858c4dcd57-whkj2"] Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.983621 4795 scope.go:117] "RemoveContainer" containerID="5b12dfe428f5ad6bc0c6c39c5790008897c140f405635c55ad4d8157a86e3f70" Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.988779 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-858c4dcd57-whkj2"] Feb 19 21:50:00 crc kubenswrapper[4795]: I0219 21:50:00.994636 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/296f6b57-de45-495d-abe9-8c779c157057-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "296f6b57-de45-495d-abe9-8c779c157057" (UID: "296f6b57-de45-495d-abe9-8c779c157057"). InnerVolumeSpecName "kube-state-metrics-tls-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.006256 4795 scope.go:117] "RemoveContainer" containerID="4a55d98b95e49937bea845358fb6e967ce1fb52ec526e4696c5d439369df7681" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.019257 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/42e81a78-17fd-4ed3-b072-dd6a20cbe5d3-memcached-tls-certs\") pod \"42e81a78-17fd-4ed3-b072-dd6a20cbe5d3\" (UID: \"42e81a78-17fd-4ed3-b072-dd6a20cbe5d3\") " Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.019333 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-csp2v\" (UniqueName: \"kubernetes.io/projected/42e81a78-17fd-4ed3-b072-dd6a20cbe5d3-kube-api-access-csp2v\") pod \"42e81a78-17fd-4ed3-b072-dd6a20cbe5d3\" (UID: \"42e81a78-17fd-4ed3-b072-dd6a20cbe5d3\") " Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.019474 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42e81a78-17fd-4ed3-b072-dd6a20cbe5d3-combined-ca-bundle\") pod \"42e81a78-17fd-4ed3-b072-dd6a20cbe5d3\" (UID: \"42e81a78-17fd-4ed3-b072-dd6a20cbe5d3\") " Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.019605 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/42e81a78-17fd-4ed3-b072-dd6a20cbe5d3-config-data\") pod \"42e81a78-17fd-4ed3-b072-dd6a20cbe5d3\" (UID: \"42e81a78-17fd-4ed3-b072-dd6a20cbe5d3\") " Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.019644 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/42e81a78-17fd-4ed3-b072-dd6a20cbe5d3-kolla-config\") pod \"42e81a78-17fd-4ed3-b072-dd6a20cbe5d3\" (UID: 
\"42e81a78-17fd-4ed3-b072-dd6a20cbe5d3\") " Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.019869 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9-operator-scripts\") pod \"keystone-41fb-account-create-update-h4ql2\" (UID: \"38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9\") " pod="openstack/keystone-41fb-account-create-update-h4ql2" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.019927 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcb7p\" (UniqueName: \"kubernetes.io/projected/38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9-kube-api-access-qcb7p\") pod \"keystone-41fb-account-create-update-h4ql2\" (UID: \"38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9\") " pod="openstack/keystone-41fb-account-create-update-h4ql2" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.020015 4795 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/296f6b57-de45-495d-abe9-8c779c157057-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.020030 4795 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/296f6b57-de45-495d-abe9-8c779c157057-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.020040 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/296f6b57-de45-495d-abe9-8c779c157057-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.020051 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2561f4e-0a01-4927-96f8-ee7bef69f561-combined-ca-bundle\") on 
node \"crc\" DevicePath \"\"" Feb 19 21:50:01 crc kubenswrapper[4795]: E0219 21:50:01.020886 4795 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Feb 19 21:50:01 crc kubenswrapper[4795]: E0219 21:50:01.020984 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9-operator-scripts podName:38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9 nodeName:}" failed. No retries permitted until 2026-02-19 21:50:03.020961447 +0000 UTC m=+1314.213479311 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9-operator-scripts") pod "keystone-41fb-account-create-update-h4ql2" (UID: "38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9") : configmap "openstack-scripts" not found Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.023411 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42e81a78-17fd-4ed3-b072-dd6a20cbe5d3-config-data" (OuterVolumeSpecName: "config-data") pod "42e81a78-17fd-4ed3-b072-dd6a20cbe5d3" (UID: "42e81a78-17fd-4ed3-b072-dd6a20cbe5d3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.023822 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42e81a78-17fd-4ed3-b072-dd6a20cbe5d3-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "42e81a78-17fd-4ed3-b072-dd6a20cbe5d3" (UID: "42e81a78-17fd-4ed3-b072-dd6a20cbe5d3"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:50:01 crc kubenswrapper[4795]: E0219 21:50:01.023838 4795 projected.go:194] Error preparing data for projected volume kube-api-access-qcb7p for pod openstack/keystone-41fb-account-create-update-h4ql2: failed to fetch token: serviceaccounts "galera-openstack" not found Feb 19 21:50:01 crc kubenswrapper[4795]: E0219 21:50:01.024733 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9-kube-api-access-qcb7p podName:38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9 nodeName:}" failed. No retries permitted until 2026-02-19 21:50:03.02423617 +0000 UTC m=+1314.216754034 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-qcb7p" (UniqueName: "kubernetes.io/projected/38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9-kube-api-access-qcb7p") pod "keystone-41fb-account-create-update-h4ql2" (UID: "38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9") : failed to fetch token: serviceaccounts "galera-openstack" not found Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.029409 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-f769-account-create-update-8k7r2"] Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.034948 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2561f4e-0a01-4927-96f8-ee7bef69f561-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "d2561f4e-0a01-4927-96f8-ee7bef69f561" (UID: "d2561f4e-0a01-4927-96f8-ee7bef69f561"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.036699 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42e81a78-17fd-4ed3-b072-dd6a20cbe5d3-kube-api-access-csp2v" (OuterVolumeSpecName: "kube-api-access-csp2v") pod "42e81a78-17fd-4ed3-b072-dd6a20cbe5d3" (UID: "42e81a78-17fd-4ed3-b072-dd6a20cbe5d3"). InnerVolumeSpecName "kube-api-access-csp2v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.043179 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-f769-account-create-update-8k7r2"] Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.049864 4795 scope.go:117] "RemoveContainer" containerID="5b12dfe428f5ad6bc0c6c39c5790008897c140f405635c55ad4d8157a86e3f70" Feb 19 21:50:01 crc kubenswrapper[4795]: E0219 21:50:01.050326 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b12dfe428f5ad6bc0c6c39c5790008897c140f405635c55ad4d8157a86e3f70\": container with ID starting with 5b12dfe428f5ad6bc0c6c39c5790008897c140f405635c55ad4d8157a86e3f70 not found: ID does not exist" containerID="5b12dfe428f5ad6bc0c6c39c5790008897c140f405635c55ad4d8157a86e3f70" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.050360 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b12dfe428f5ad6bc0c6c39c5790008897c140f405635c55ad4d8157a86e3f70"} err="failed to get container status \"5b12dfe428f5ad6bc0c6c39c5790008897c140f405635c55ad4d8157a86e3f70\": rpc error: code = NotFound desc = could not find container \"5b12dfe428f5ad6bc0c6c39c5790008897c140f405635c55ad4d8157a86e3f70\": container with ID starting with 5b12dfe428f5ad6bc0c6c39c5790008897c140f405635c55ad4d8157a86e3f70 not found: ID does not exist" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.050379 4795 
scope.go:117] "RemoveContainer" containerID="4a55d98b95e49937bea845358fb6e967ce1fb52ec526e4696c5d439369df7681" Feb 19 21:50:01 crc kubenswrapper[4795]: E0219 21:50:01.050737 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a55d98b95e49937bea845358fb6e967ce1fb52ec526e4696c5d439369df7681\": container with ID starting with 4a55d98b95e49937bea845358fb6e967ce1fb52ec526e4696c5d439369df7681 not found: ID does not exist" containerID="4a55d98b95e49937bea845358fb6e967ce1fb52ec526e4696c5d439369df7681" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.050777 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a55d98b95e49937bea845358fb6e967ce1fb52ec526e4696c5d439369df7681"} err="failed to get container status \"4a55d98b95e49937bea845358fb6e967ce1fb52ec526e4696c5d439369df7681\": rpc error: code = NotFound desc = could not find container \"4a55d98b95e49937bea845358fb6e967ce1fb52ec526e4696c5d439369df7681\": container with ID starting with 4a55d98b95e49937bea845358fb6e967ce1fb52ec526e4696c5d439369df7681 not found: ID does not exist" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.050807 4795 scope.go:117] "RemoveContainer" containerID="708e4d014b2d6219a997cb5284977900cb2aafb592d9023c95962ea2c3074217" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.069316 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107" (UID: "e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.077004 4795 scope.go:117] "RemoveContainer" containerID="46cb4f812b6ae6e3f71ffef829cc67ce3250106de6851d56a2c280735844fd0f" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.090247 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2561f4e-0a01-4927-96f8-ee7bef69f561-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "d2561f4e-0a01-4927-96f8-ee7bef69f561" (UID: "d2561f4e-0a01-4927-96f8-ee7bef69f561"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.095049 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/793bbadc-8b53-4084-a63a-0b76b37284df-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "793bbadc-8b53-4084-a63a-0b76b37284df" (UID: "793bbadc-8b53-4084-a63a-0b76b37284df"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.098425 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107-config-data" (OuterVolumeSpecName: "config-data") pod "e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107" (UID: "e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.100305 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/793bbadc-8b53-4084-a63a-0b76b37284df-config-data" (OuterVolumeSpecName: "config-data") pod "793bbadc-8b53-4084-a63a-0b76b37284df" (UID: "793bbadc-8b53-4084-a63a-0b76b37284df"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.120405 4795 scope.go:117] "RemoveContainer" containerID="708e4d014b2d6219a997cb5284977900cb2aafb592d9023c95962ea2c3074217" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.122724 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/793bbadc-8b53-4084-a63a-0b76b37284df-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.122742 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-csp2v\" (UniqueName: \"kubernetes.io/projected/42e81a78-17fd-4ed3-b072-dd6a20cbe5d3-kube-api-access-csp2v\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.122751 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.122760 4795 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2561f4e-0a01-4927-96f8-ee7bef69f561-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.122772 4795 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2561f4e-0a01-4927-96f8-ee7bef69f561-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.122782 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/42e81a78-17fd-4ed3-b072-dd6a20cbe5d3-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.122790 4795 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/42e81a78-17fd-4ed3-b072-dd6a20cbe5d3-kolla-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.122799 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.122810 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/793bbadc-8b53-4084-a63a-0b76b37284df-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:01 crc kubenswrapper[4795]: E0219 21:50:01.126261 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"708e4d014b2d6219a997cb5284977900cb2aafb592d9023c95962ea2c3074217\": container with ID starting with 708e4d014b2d6219a997cb5284977900cb2aafb592d9023c95962ea2c3074217 not found: ID does not exist" containerID="708e4d014b2d6219a997cb5284977900cb2aafb592d9023c95962ea2c3074217" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.126290 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"708e4d014b2d6219a997cb5284977900cb2aafb592d9023c95962ea2c3074217"} err="failed to get container status \"708e4d014b2d6219a997cb5284977900cb2aafb592d9023c95962ea2c3074217\": rpc error: code = NotFound desc = could not find container \"708e4d014b2d6219a997cb5284977900cb2aafb592d9023c95962ea2c3074217\": container with ID starting with 708e4d014b2d6219a997cb5284977900cb2aafb592d9023c95962ea2c3074217 not found: ID does not exist" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.126319 4795 scope.go:117] "RemoveContainer" containerID="46cb4f812b6ae6e3f71ffef829cc67ce3250106de6851d56a2c280735844fd0f" Feb 19 21:50:01 crc kubenswrapper[4795]: E0219 21:50:01.127471 4795 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46cb4f812b6ae6e3f71ffef829cc67ce3250106de6851d56a2c280735844fd0f\": container with ID starting with 46cb4f812b6ae6e3f71ffef829cc67ce3250106de6851d56a2c280735844fd0f not found: ID does not exist" containerID="46cb4f812b6ae6e3f71ffef829cc67ce3250106de6851d56a2c280735844fd0f"
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.127517 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46cb4f812b6ae6e3f71ffef829cc67ce3250106de6851d56a2c280735844fd0f"} err="failed to get container status \"46cb4f812b6ae6e3f71ffef829cc67ce3250106de6851d56a2c280735844fd0f\": rpc error: code = NotFound desc = could not find container \"46cb4f812b6ae6e3f71ffef829cc67ce3250106de6851d56a2c280735844fd0f\": container with ID starting with 46cb4f812b6ae6e3f71ffef829cc67ce3250106de6851d56a2c280735844fd0f not found: ID does not exist"
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.127542 4795 scope.go:117] "RemoveContainer" containerID="5a4765254bd1f522ba57b50f9f6989619b42ec50e10b34373400dceff9cc29f7"
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.147385 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4a6a069-904a-4072-b98c-346f67f22def-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e4a6a069-904a-4072-b98c-346f67f22def" (UID: "e4a6a069-904a-4072-b98c-346f67f22def"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.147796 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.171483 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.174785 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4a6a069-904a-4072-b98c-346f67f22def-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e4a6a069-904a-4072-b98c-346f67f22def" (UID: "e4a6a069-904a-4072-b98c-346f67f22def"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.175154 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/793bbadc-8b53-4084-a63a-0b76b37284df-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "793bbadc-8b53-4084-a63a-0b76b37284df" (UID: "793bbadc-8b53-4084-a63a-0b76b37284df"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.180593 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"]
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.186334 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107" (UID: "e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.192595 4795 scope.go:117] "RemoveContainer" containerID="11d0871b61a89a3e5545d5527d867256cc43829525a32af590e7a61ae24edec5"
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.207413 4795 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc"
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.208929 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/793bbadc-8b53-4084-a63a-0b76b37284df-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "793bbadc-8b53-4084-a63a-0b76b37284df" (UID: "793bbadc-8b53-4084-a63a-0b76b37284df"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.217920 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2561f4e-0a01-4927-96f8-ee7bef69f561-config-data" (OuterVolumeSpecName: "config-data") pod "d2561f4e-0a01-4927-96f8-ee7bef69f561" (UID: "d2561f4e-0a01-4927-96f8-ee7bef69f561"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.218318 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4a6a069-904a-4072-b98c-346f67f22def-config-data" (OuterVolumeSpecName: "config-data") pod "e4a6a069-904a-4072-b98c-346f67f22def" (UID: "e4a6a069-904a-4072-b98c-346f67f22def"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.219053 4795 scope.go:117] "RemoveContainer" containerID="a92c3489fb5665772be143ca85a31a0e43e2db5654864e03970155d992501b37"
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.220545 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-ktl2b"
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.223789 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2561f4e-0a01-4927-96f8-ee7bef69f561-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.223812 4795 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/793bbadc-8b53-4084-a63a-0b76b37284df-public-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.223821 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4a6a069-904a-4072-b98c-346f67f22def-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.223830 4795 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.223838 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4a6a069-904a-4072-b98c-346f67f22def-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.223846 4795 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4a6a069-904a-4072-b98c-346f67f22def-public-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.223854 4795 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/793bbadc-8b53-4084-a63a-0b76b37284df-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.223862 4795 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\""
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.257348 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9" (UID: "a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.279790 4795 scope.go:117] "RemoveContainer" containerID="11d0871b61a89a3e5545d5527d867256cc43829525a32af590e7a61ae24edec5"
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.280418 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42e81a78-17fd-4ed3-b072-dd6a20cbe5d3-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "42e81a78-17fd-4ed3-b072-dd6a20cbe5d3" (UID: "42e81a78-17fd-4ed3-b072-dd6a20cbe5d3"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:50:01 crc kubenswrapper[4795]: E0219 21:50:01.280438 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11d0871b61a89a3e5545d5527d867256cc43829525a32af590e7a61ae24edec5\": container with ID starting with 11d0871b61a89a3e5545d5527d867256cc43829525a32af590e7a61ae24edec5 not found: ID does not exist" containerID="11d0871b61a89a3e5545d5527d867256cc43829525a32af590e7a61ae24edec5"
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.280490 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11d0871b61a89a3e5545d5527d867256cc43829525a32af590e7a61ae24edec5"} err="failed to get container status \"11d0871b61a89a3e5545d5527d867256cc43829525a32af590e7a61ae24edec5\": rpc error: code = NotFound desc = could not find container \"11d0871b61a89a3e5545d5527d867256cc43829525a32af590e7a61ae24edec5\": container with ID starting with 11d0871b61a89a3e5545d5527d867256cc43829525a32af590e7a61ae24edec5 not found: ID does not exist"
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.280521 4795 scope.go:117] "RemoveContainer" containerID="a92c3489fb5665772be143ca85a31a0e43e2db5654864e03970155d992501b37"
Feb 19 21:50:01 crc kubenswrapper[4795]: E0219 21:50:01.281195 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a92c3489fb5665772be143ca85a31a0e43e2db5654864e03970155d992501b37\": container with ID starting with a92c3489fb5665772be143ca85a31a0e43e2db5654864e03970155d992501b37 not found: ID does not exist" containerID="a92c3489fb5665772be143ca85a31a0e43e2db5654864e03970155d992501b37"
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.281235 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a92c3489fb5665772be143ca85a31a0e43e2db5654864e03970155d992501b37"} err="failed to get container status \"a92c3489fb5665772be143ca85a31a0e43e2db5654864e03970155d992501b37\": rpc error: code = NotFound desc = could not find container \"a92c3489fb5665772be143ca85a31a0e43e2db5654864e03970155d992501b37\": container with ID starting with a92c3489fb5665772be143ca85a31a0e43e2db5654864e03970155d992501b37 not found: ID does not exist"
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.281263 4795 scope.go:117] "RemoveContainer" containerID="067a3784bb90d910b6b73dca0dc993d5c4844e46f49fddffcbfe6f467e1645d3"
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.300946 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42e81a78-17fd-4ed3-b072-dd6a20cbe5d3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "42e81a78-17fd-4ed3-b072-dd6a20cbe5d3" (UID: "42e81a78-17fd-4ed3-b072-dd6a20cbe5d3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.303185 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-config-data" (OuterVolumeSpecName: "config-data") pod "a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9" (UID: "a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.317069 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4a6a069-904a-4072-b98c-346f67f22def-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e4a6a069-904a-4072-b98c-346f67f22def" (UID: "e4a6a069-904a-4072-b98c-346f67f22def"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.323413 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9" (UID: "a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.331250 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9kll6\" (UniqueName: \"kubernetes.io/projected/2164f9d1-1d8b-486b-beca-0d3a5172b302-kube-api-access-9kll6\") pod \"2164f9d1-1d8b-486b-beca-0d3a5172b302\" (UID: \"2164f9d1-1d8b-486b-beca-0d3a5172b302\") "
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.331299 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2164f9d1-1d8b-486b-beca-0d3a5172b302-operator-scripts\") pod \"2164f9d1-1d8b-486b-beca-0d3a5172b302\" (UID: \"2164f9d1-1d8b-486b-beca-0d3a5172b302\") "
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.331321 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57b83043-2f7c-4b55-a2b9-66eef96f0008-combined-ca-bundle\") pod \"57b83043-2f7c-4b55-a2b9-66eef96f0008\" (UID: \"57b83043-2f7c-4b55-a2b9-66eef96f0008\") "
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.331385 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57b83043-2f7c-4b55-a2b9-66eef96f0008-config-data\") pod \"57b83043-2f7c-4b55-a2b9-66eef96f0008\" (UID: \"57b83043-2f7c-4b55-a2b9-66eef96f0008\") "
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.331430 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v44bq\" (UniqueName: \"kubernetes.io/projected/57b83043-2f7c-4b55-a2b9-66eef96f0008-kube-api-access-v44bq\") pod \"57b83043-2f7c-4b55-a2b9-66eef96f0008\" (UID: \"57b83043-2f7c-4b55-a2b9-66eef96f0008\") "
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.331661 4795 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/42e81a78-17fd-4ed3-b072-dd6a20cbe5d3-memcached-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.331676 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.331689 4795 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4a6a069-904a-4072-b98c-346f67f22def-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.331697 4795 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-public-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.331705 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.331713 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42e81a78-17fd-4ed3-b072-dd6a20cbe5d3-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.334701 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2164f9d1-1d8b-486b-beca-0d3a5172b302-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2164f9d1-1d8b-486b-beca-0d3a5172b302" (UID: "2164f9d1-1d8b-486b-beca-0d3a5172b302"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.335867 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57b83043-2f7c-4b55-a2b9-66eef96f0008-kube-api-access-v44bq" (OuterVolumeSpecName: "kube-api-access-v44bq") pod "57b83043-2f7c-4b55-a2b9-66eef96f0008" (UID: "57b83043-2f7c-4b55-a2b9-66eef96f0008"). InnerVolumeSpecName "kube-api-access-v44bq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.338728 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2164f9d1-1d8b-486b-beca-0d3a5172b302-kube-api-access-9kll6" (OuterVolumeSpecName: "kube-api-access-9kll6") pod "2164f9d1-1d8b-486b-beca-0d3a5172b302" (UID: "2164f9d1-1d8b-486b-beca-0d3a5172b302"). InnerVolumeSpecName "kube-api-access-9kll6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.364279 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57b83043-2f7c-4b55-a2b9-66eef96f0008-config-data" (OuterVolumeSpecName: "config-data") pod "57b83043-2f7c-4b55-a2b9-66eef96f0008" (UID: "57b83043-2f7c-4b55-a2b9-66eef96f0008"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.367370 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57b83043-2f7c-4b55-a2b9-66eef96f0008-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "57b83043-2f7c-4b55-a2b9-66eef96f0008" (UID: "57b83043-2f7c-4b55-a2b9-66eef96f0008"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.433248 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57b83043-2f7c-4b55-a2b9-66eef96f0008-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.433678 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v44bq\" (UniqueName: \"kubernetes.io/projected/57b83043-2f7c-4b55-a2b9-66eef96f0008-kube-api-access-v44bq\") on node \"crc\" DevicePath \"\""
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.433690 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9kll6\" (UniqueName: \"kubernetes.io/projected/2164f9d1-1d8b-486b-beca-0d3a5172b302-kube-api-access-9kll6\") on node \"crc\" DevicePath \"\""
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.433702 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2164f9d1-1d8b-486b-beca-0d3a5172b302-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.433711 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57b83043-2f7c-4b55-a2b9-66eef96f0008-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.523909 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="296f6b57-de45-495d-abe9-8c779c157057" path="/var/lib/kubelet/pods/296f6b57-de45-495d-abe9-8c779c157057/volumes"
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.524527 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="299d8d1c-c181-4c7b-b95f-9f3c62ddb102" path="/var/lib/kubelet/pods/299d8d1c-c181-4c7b-b95f-9f3c62ddb102/volumes"
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.525768 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d" path="/var/lib/kubelet/pods/2ff8e7ea-b1fa-4de5-ba32-d6c4b1d67a6d/volumes"
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.527048 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3697a3b0-4077-4837-bcdc-c17d8aa361f1" path="/var/lib/kubelet/pods/3697a3b0-4077-4837-bcdc-c17d8aa361f1/volumes"
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.527718 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4532c069-4eb7-48ab-b575-b6a130e2b438" path="/var/lib/kubelet/pods/4532c069-4eb7-48ab-b575-b6a130e2b438/volumes"
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.530891 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8eaa69df-d563-4dc0-8a78-40413946cbca" path="/var/lib/kubelet/pods/8eaa69df-d563-4dc0-8a78-40413946cbca/volumes"
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.531404 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b01bcd5b-435a-4702-b0a4-8dfe8f553c23" path="/var/lib/kubelet/pods/b01bcd5b-435a-4702-b0a4-8dfe8f553c23/volumes"
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.537610 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da2e3f89-bf0b-4371-8e5b-a0037f266c70" path="/var/lib/kubelet/pods/da2e3f89-bf0b-4371-8e5b-a0037f266c70/volumes"
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.538554 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee3fde95-91bf-4f6a-9753-f879d56fedbb" path="/var/lib/kubelet/pods/ee3fde95-91bf-4f6a-9753-f879d56fedbb/volumes"
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.539135 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6fd7841-2a08-4786-8e96-b2ab0f477eff" path="/var/lib/kubelet/pods/f6fd7841-2a08-4786-8e96-b2ab0f477eff/volumes"
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.570312 4795 scope.go:117] "RemoveContainer" containerID="46487241a29d4cc3bff33a03b2f13ce2e328740a30388d55d6c987233cf2d399"
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.614508 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_fe2e673c-4ce5-4df9-b9e6-0b7cea99727c/ovn-northd/0.log"
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.614585 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.626616 4795 scope.go:117] "RemoveContainer" containerID="a6cf86c6cb492d1373f389e62b5d8e8687a30a53e6a19fc785851a2a8a42b429"
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.637972 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxwxt\" (UniqueName: \"kubernetes.io/projected/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-kube-api-access-xxwxt\") pod \"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c\" (UID: \"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c\") "
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.638216 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-config\") pod \"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c\" (UID: \"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c\") "
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.638334 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-metrics-certs-tls-certs\") pod \"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c\" (UID: \"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c\") "
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.638494 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-ovn-rundir\") pod \"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c\" (UID: \"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c\") "
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.638521 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-scripts\") pod \"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c\" (UID: \"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c\") "
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.639838 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-scripts" (OuterVolumeSpecName: "scripts") pod "fe2e673c-4ce5-4df9-b9e6-0b7cea99727c" (UID: "fe2e673c-4ce5-4df9-b9e6-0b7cea99727c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.640281 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-config" (OuterVolumeSpecName: "config") pod "fe2e673c-4ce5-4df9-b9e6-0b7cea99727c" (UID: "fe2e673c-4ce5-4df9-b9e6-0b7cea99727c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.640311 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.641654 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "fe2e673c-4ce5-4df9-b9e6-0b7cea99727c" (UID: "fe2e673c-4ce5-4df9-b9e6-0b7cea99727c"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.646934 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-kube-api-access-xxwxt" (OuterVolumeSpecName: "kube-api-access-xxwxt") pod "fe2e673c-4ce5-4df9-b9e6-0b7cea99727c" (UID: "fe2e673c-4ce5-4df9-b9e6-0b7cea99727c"). InnerVolumeSpecName "kube-api-access-xxwxt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.741947 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-combined-ca-bundle\") pod \"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c\" (UID: \"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c\") "
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.742061 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-ovn-northd-tls-certs\") pod \"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c\" (UID: \"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c\") "
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.742663 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxwxt\" (UniqueName: \"kubernetes.io/projected/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-kube-api-access-xxwxt\") on node \"crc\" DevicePath \"\""
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.742689 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-config\") on node \"crc\" DevicePath \"\""
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.742703 4795 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-ovn-rundir\") on node \"crc\" DevicePath \"\""
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.751634 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "fe2e673c-4ce5-4df9-b9e6-0b7cea99727c" (UID: "fe2e673c-4ce5-4df9-b9e6-0b7cea99727c"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.766363 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fe2e673c-4ce5-4df9-b9e6-0b7cea99727c" (UID: "fe2e673c-4ce5-4df9-b9e6-0b7cea99727c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.796852 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.801263 4795 scope.go:117] "RemoveContainer" containerID="d4e7caf152ae88ec057977570b5950355e310fd918c2c78401ccea557c71b632"
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.819708 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.825799 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.841468 4795 scope.go:117] "RemoveContainer" containerID="a6cf86c6cb492d1373f389e62b5d8e8687a30a53e6a19fc785851a2a8a42b429"
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.842870 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-combined-ca-bundle\") pod \"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\" (UID: \"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\") "
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.842928 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-config-data-generated\") pod \"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\" (UID: \"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\") "
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.842946 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-galera-tls-certs\") pod \"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\" (UID: \"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\") "
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.843090 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-operator-scripts\") pod \"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\" (UID: \"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\") "
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.843121 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-kolla-config\") pod \"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\" (UID: \"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\") "
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.843139 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gtmqc\" (UniqueName: \"kubernetes.io/projected/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-kube-api-access-gtmqc\") pod \"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\" (UID: \"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\") "
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.843206 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-config-data-default\") pod \"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\" (UID: \"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\") "
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.843232 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\" (UID: \"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99\") "
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.843471 4795 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.843482 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.848772 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "0bbc6c00-2fc9-42cb-9c5a-9a160903ae99" (UID: "0bbc6c00-2fc9-42cb-9c5a-9a160903ae99"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 21:50:01 crc kubenswrapper[4795]: E0219 21:50:01.850384 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6cf86c6cb492d1373f389e62b5d8e8687a30a53e6a19fc785851a2a8a42b429\": container with ID starting with a6cf86c6cb492d1373f389e62b5d8e8687a30a53e6a19fc785851a2a8a42b429 not found: ID does not exist" containerID="a6cf86c6cb492d1373f389e62b5d8e8687a30a53e6a19fc785851a2a8a42b429"
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.850566 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6cf86c6cb492d1373f389e62b5d8e8687a30a53e6a19fc785851a2a8a42b429"} err="failed to get container status \"a6cf86c6cb492d1373f389e62b5d8e8687a30a53e6a19fc785851a2a8a42b429\": rpc error: code = NotFound desc = could not find container \"a6cf86c6cb492d1373f389e62b5d8e8687a30a53e6a19fc785851a2a8a42b429\": container with ID starting with a6cf86c6cb492d1373f389e62b5d8e8687a30a53e6a19fc785851a2a8a42b429 not found: ID does not exist"
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.850590 4795 scope.go:117] "RemoveContainer" containerID="d4e7caf152ae88ec057977570b5950355e310fd918c2c78401ccea557c71b632"
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.850744 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "0bbc6c00-2fc9-42cb-9c5a-9a160903ae99" (UID: "0bbc6c00-2fc9-42cb-9c5a-9a160903ae99"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.852560 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "0bbc6c00-2fc9-42cb-9c5a-9a160903ae99" (UID: "0bbc6c00-2fc9-42cb-9c5a-9a160903ae99"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 21:50:01 crc kubenswrapper[4795]: E0219 21:50:01.858794 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4e7caf152ae88ec057977570b5950355e310fd918c2c78401ccea557c71b632\": container with ID starting with d4e7caf152ae88ec057977570b5950355e310fd918c2c78401ccea557c71b632 not found: ID does not exist" containerID="d4e7caf152ae88ec057977570b5950355e310fd918c2c78401ccea557c71b632"
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.859500 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4e7caf152ae88ec057977570b5950355e310fd918c2c78401ccea557c71b632"} err="failed to get container status \"d4e7caf152ae88ec057977570b5950355e310fd918c2c78401ccea557c71b632\": rpc error: code = NotFound desc = could not find container \"d4e7caf152ae88ec057977570b5950355e310fd918c2c78401ccea557c71b632\": container with ID starting with d4e7caf152ae88ec057977570b5950355e310fd918c2c78401ccea557c71b632 not found: ID does not exist"
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.859537 4795 scope.go:117] "RemoveContainer" containerID="812bd35c706001e18c0da5e2c3dd17a059f42e984bdd2e12b92f8fd91195b1e2"
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.861504 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-kube-api-access-gtmqc" (OuterVolumeSpecName: "kube-api-access-gtmqc") pod "0bbc6c00-2fc9-42cb-9c5a-9a160903ae99" (UID: "0bbc6c00-2fc9-42cb-9c5a-9a160903ae99"). InnerVolumeSpecName "kube-api-access-gtmqc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.862349 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0bbc6c00-2fc9-42cb-9c5a-9a160903ae99" (UID: "0bbc6c00-2fc9-42cb-9c5a-9a160903ae99"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.869445 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "mysql-db") pod "0bbc6c00-2fc9-42cb-9c5a-9a160903ae99" (UID: "0bbc6c00-2fc9-42cb-9c5a-9a160903ae99"). InnerVolumeSpecName "local-storage09-crc".
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.875058 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_fe2e673c-4ce5-4df9-b9e6-0b7cea99727c/ovn-northd/0.log" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.875160 4795 generic.go:334] "Generic (PLEG): container finished" podID="fe2e673c-4ce5-4df9-b9e6-0b7cea99727c" containerID="2cccec2e02e2c16f7cff73b24c7a62191291d809702ff698745b7a193cbe2c2f" exitCode=139 Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.875353 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c","Type":"ContainerDied","Data":"2cccec2e02e2c16f7cff73b24c7a62191291d809702ff698745b7a193cbe2c2f"} Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.875378 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"fe2e673c-4ce5-4df9-b9e6-0b7cea99727c","Type":"ContainerDied","Data":"dbab531f1a8f22d58c44dcac6c6209fda329451de2d8664028adcfc876aa2507"} Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.875437 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.877299 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "fe2e673c-4ce5-4df9-b9e6-0b7cea99727c" (UID: "fe2e673c-4ce5-4df9-b9e6-0b7cea99727c"). InnerVolumeSpecName "ovn-northd-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.884717 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"57b83043-2f7c-4b55-a2b9-66eef96f0008","Type":"ContainerDied","Data":"83f719d65e236fae031c225d4f8065a2b4c198be5a5993edbe70af70bfebe600"} Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.884835 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.885657 4795 scope.go:117] "RemoveContainer" containerID="38273143a291266be6dd29c71788a99ae4aa366ccd575844722fcc6687631e66" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.885787 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0bbc6c00-2fc9-42cb-9c5a-9a160903ae99" (UID: "0bbc6c00-2fc9-42cb-9c5a-9a160903ae99"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.891093 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.900512 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-ktl2b" event={"ID":"2164f9d1-1d8b-486b-beca-0d3a5172b302","Type":"ContainerDied","Data":"0bc8d13f4092138cc363d9e77ad1f35f49f21dad6c940b0ffcd7de9f24d779fb"} Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.900586 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-ktl2b" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.913222 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "0bbc6c00-2fc9-42cb-9c5a-9a160903ae99" (UID: "0bbc6c00-2fc9-42cb-9c5a-9a160903ae99"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.919531 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"42e81a78-17fd-4ed3-b072-dd6a20cbe5d3","Type":"ContainerDied","Data":"aada954b6c8106a5c25613b1c4b96d76ce41049aa7128aa357d9511f84c5abf0"} Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.919613 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.926107 4795 generic.go:334] "Generic (PLEG): container finished" podID="0bbc6c00-2fc9-42cb-9c5a-9a160903ae99" containerID="0f7b958a502b531c3fff207837d630d66468893d907fbb748e2bd58503327153" exitCode=0 Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.926190 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-41fb-account-create-update-h4ql2" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.926629 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.931328 4795 scope.go:117] "RemoveContainer" containerID="1947803c21d9e4b80d454677a036fe260f1c04cd37249f1f066a15e8b611b1c3" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.931353 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99","Type":"ContainerDied","Data":"0f7b958a502b531c3fff207837d630d66468893d907fbb748e2bd58503327153"} Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.931394 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"0bbc6c00-2fc9-42cb-9c5a-9a160903ae99","Type":"ContainerDied","Data":"050b9a153d584bbd1ba63be9e7a93c951075127827418493ce2ba5e1d8a7ed20"} Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.931407 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.935175 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.950955 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.950986 4795 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-kolla-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.951339 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gtmqc\" (UniqueName: \"kubernetes.io/projected/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-kube-api-access-gtmqc\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:01 crc 
kubenswrapper[4795]: I0219 21:50:01.951354 4795 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.951365 4795 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-config-data-default\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.951397 4795 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.951407 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.951417 4795 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-config-data-generated\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.951426 4795 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.958916 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.972743 4795 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on 
node "crc" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.973419 4795 scope.go:117] "RemoveContainer" containerID="2cccec2e02e2c16f7cff73b24c7a62191291d809702ff698745b7a193cbe2c2f" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.975844 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6f98bf9994-pr48x"] Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.987333 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-6f98bf9994-pr48x"] Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.995300 4795 scope.go:117] "RemoveContainer" containerID="1947803c21d9e4b80d454677a036fe260f1c04cd37249f1f066a15e8b611b1c3" Feb 19 21:50:01 crc kubenswrapper[4795]: E0219 21:50:01.996368 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1947803c21d9e4b80d454677a036fe260f1c04cd37249f1f066a15e8b611b1c3\": container with ID starting with 1947803c21d9e4b80d454677a036fe260f1c04cd37249f1f066a15e8b611b1c3 not found: ID does not exist" containerID="1947803c21d9e4b80d454677a036fe260f1c04cd37249f1f066a15e8b611b1c3" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.996400 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1947803c21d9e4b80d454677a036fe260f1c04cd37249f1f066a15e8b611b1c3"} err="failed to get container status \"1947803c21d9e4b80d454677a036fe260f1c04cd37249f1f066a15e8b611b1c3\": rpc error: code = NotFound desc = could not find container \"1947803c21d9e4b80d454677a036fe260f1c04cd37249f1f066a15e8b611b1c3\": container with ID starting with 1947803c21d9e4b80d454677a036fe260f1c04cd37249f1f066a15e8b611b1c3 not found: ID does not exist" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.996421 4795 scope.go:117] "RemoveContainer" containerID="2cccec2e02e2c16f7cff73b24c7a62191291d809702ff698745b7a193cbe2c2f" Feb 19 21:50:01 crc kubenswrapper[4795]: E0219 21:50:01.996769 
4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cccec2e02e2c16f7cff73b24c7a62191291d809702ff698745b7a193cbe2c2f\": container with ID starting with 2cccec2e02e2c16f7cff73b24c7a62191291d809702ff698745b7a193cbe2c2f not found: ID does not exist" containerID="2cccec2e02e2c16f7cff73b24c7a62191291d809702ff698745b7a193cbe2c2f" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.996796 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cccec2e02e2c16f7cff73b24c7a62191291d809702ff698745b7a193cbe2c2f"} err="failed to get container status \"2cccec2e02e2c16f7cff73b24c7a62191291d809702ff698745b7a193cbe2c2f\": rpc error: code = NotFound desc = could not find container \"2cccec2e02e2c16f7cff73b24c7a62191291d809702ff698745b7a193cbe2c2f\": container with ID starting with 2cccec2e02e2c16f7cff73b24c7a62191291d809702ff698745b7a193cbe2c2f not found: ID does not exist" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.996821 4795 scope.go:117] "RemoveContainer" containerID="4f011f521b748247939cf0cf1c345e321c4287fa532812c9357da77b932cf4c6" Feb 19 21:50:01 crc kubenswrapper[4795]: I0219 21:50:01.997801 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.004356 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.010355 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.013203 4795 scope.go:117] "RemoveContainer" containerID="37b5ade45a3dd017a14fe70f806fa2f9b2438f7ec74aeb16d3207b0a96e2916a" Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.013905 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 19 21:50:02 crc 
kubenswrapper[4795]: I0219 21:50:02.019570 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-ktl2b"] Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.025359 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-ktl2b"] Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.030927 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.033983 4795 scope.go:117] "RemoveContainer" containerID="db08b177837bd80146274836c0977b72219e93219854a2c6c5e81a24922a33fe" Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.035391 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/memcached-0"] Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.063085 4795 scope.go:117] "RemoveContainer" containerID="0f7b958a502b531c3fff207837d630d66468893d907fbb748e2bd58503327153" Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.066100 4795 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.068600 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-41fb-account-create-update-h4ql2"] Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.078059 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-41fb-account-create-update-h4ql2"] Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.080322 4795 scope.go:117] "RemoveContainer" containerID="106dec830838772106b6621ed4bf25acb79616ec219d016a3ce4bb8a514abaf0" Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.082755 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.088601 4795 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/openstack-galera-0"] Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.101775 4795 scope.go:117] "RemoveContainer" containerID="0f7b958a502b531c3fff207837d630d66468893d907fbb748e2bd58503327153" Feb 19 21:50:02 crc kubenswrapper[4795]: E0219 21:50:02.105744 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f7b958a502b531c3fff207837d630d66468893d907fbb748e2bd58503327153\": container with ID starting with 0f7b958a502b531c3fff207837d630d66468893d907fbb748e2bd58503327153 not found: ID does not exist" containerID="0f7b958a502b531c3fff207837d630d66468893d907fbb748e2bd58503327153" Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.105776 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f7b958a502b531c3fff207837d630d66468893d907fbb748e2bd58503327153"} err="failed to get container status \"0f7b958a502b531c3fff207837d630d66468893d907fbb748e2bd58503327153\": rpc error: code = NotFound desc = could not find container \"0f7b958a502b531c3fff207837d630d66468893d907fbb748e2bd58503327153\": container with ID starting with 0f7b958a502b531c3fff207837d630d66468893d907fbb748e2bd58503327153 not found: ID does not exist" Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.105798 4795 scope.go:117] "RemoveContainer" containerID="106dec830838772106b6621ed4bf25acb79616ec219d016a3ce4bb8a514abaf0" Feb 19 21:50:02 crc kubenswrapper[4795]: E0219 21:50:02.106211 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"106dec830838772106b6621ed4bf25acb79616ec219d016a3ce4bb8a514abaf0\": container with ID starting with 106dec830838772106b6621ed4bf25acb79616ec219d016a3ce4bb8a514abaf0 not found: ID does not exist" containerID="106dec830838772106b6621ed4bf25acb79616ec219d016a3ce4bb8a514abaf0" Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 
21:50:02.106265 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"106dec830838772106b6621ed4bf25acb79616ec219d016a3ce4bb8a514abaf0"} err="failed to get container status \"106dec830838772106b6621ed4bf25acb79616ec219d016a3ce4bb8a514abaf0\": rpc error: code = NotFound desc = could not find container \"106dec830838772106b6621ed4bf25acb79616ec219d016a3ce4bb8a514abaf0\": container with ID starting with 106dec830838772106b6621ed4bf25acb79616ec219d016a3ce4bb8a514abaf0 not found: ID does not exist" Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.171371 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qcb7p\" (UniqueName: \"kubernetes.io/projected/38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9-kube-api-access-qcb7p\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.209100 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.216417 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-northd-0"] Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.273087 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:02 crc kubenswrapper[4795]: E0219 21:50:02.273140 4795 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Feb 19 21:50:02 crc kubenswrapper[4795]: E0219 21:50:02.273226 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-config-data podName:ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d nodeName:}" failed. No retries permitted until 2026-02-19 21:50:10.273207406 +0000 UTC m=+1321.465725270 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-config-data") pod "rabbitmq-cell1-server-0" (UID: "ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d") : configmap "rabbitmq-cell1-config-data" not found Feb 19 21:50:02 crc kubenswrapper[4795]: E0219 21:50:02.628063 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23 is running failed: container process not found" containerID="e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 19 21:50:02 crc kubenswrapper[4795]: E0219 21:50:02.628494 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23 is running failed: container process not found" containerID="e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 19 21:50:02 crc kubenswrapper[4795]: E0219 21:50:02.629022 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23 is running failed: container process not found" containerID="e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 19 21:50:02 crc kubenswrapper[4795]: E0219 21:50:02.629047 4795 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23 is running failed: container process 
not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-tl5hf" podUID="9a19676c-9314-43a3-a2f8-bcf56d6b5ce3" containerName="ovsdb-server" Feb 19 21:50:02 crc kubenswrapper[4795]: E0219 21:50:02.630723 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ebc73fca22ccd7aa64a424b1eb2e6d8556b809c91bf36bb6c2bbe679d1c5f3bf" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 19 21:50:02 crc kubenswrapper[4795]: E0219 21:50:02.633252 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ebc73fca22ccd7aa64a424b1eb2e6d8556b809c91bf36bb6c2bbe679d1c5f3bf" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 19 21:50:02 crc kubenswrapper[4795]: E0219 21:50:02.634993 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ebc73fca22ccd7aa64a424b1eb2e6d8556b809c91bf36bb6c2bbe679d1c5f3bf" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 19 21:50:02 crc kubenswrapper[4795]: E0219 21:50:02.635027 4795 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-tl5hf" podUID="9a19676c-9314-43a3-a2f8-bcf56d6b5ce3" containerName="ovs-vswitchd" Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.800469 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="e956453d-551f-44b4-8125-8656b3155402" containerName="proxy-httpd" probeResult="failure" output="Get 
\"https://10.217.0.205:3000/\": dial tcp 10.217.0.205:3000: connect: connection refused" Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.829891 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-w9fbs" podUID="c30e8522-2d7f-4f10-a0b4-a7cfc351d093" containerName="ovn-controller" probeResult="failure" output="command timed out" Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.864058 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-w9fbs" podUID="c30e8522-2d7f-4f10-a0b4-a7cfc351d093" containerName="ovn-controller" probeResult="failure" output=< Feb 19 21:50:02 crc kubenswrapper[4795]: ERROR - Failed to get connection status from ovn-controller, ovn-appctl exit status: 0 Feb 19 21:50:02 crc kubenswrapper[4795]: > Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.906022 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-cell1-novncproxy-0" podUID="0adadcd9-8949-443b-8042-d0d11191eae9" containerName="nova-cell1-novncproxy-novncproxy" probeResult="failure" output="Get \"https://10.217.0.200:6080/vnc_lite.html\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.949606 4795 generic.go:334] "Generic (PLEG): container finished" podID="ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d" containerID="5a6b19520891e7087129c9dfe002592444a956d213365be36d88dc721e7adc6e" exitCode=0 Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.949682 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d","Type":"ContainerDied","Data":"5a6b19520891e7087129c9dfe002592444a956d213365be36d88dc721e7adc6e"} Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.958060 4795 generic.go:334] "Generic (PLEG): container finished" podID="4b928260-ac65-479d-bd4b-f14b48d24ddb" 
containerID="77a5c881bfb4b3162733203d6d06eed351c1cde8f2967461288b978f94eeb5ba" exitCode=0 Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.958152 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6945f64f65-rnq2b" event={"ID":"4b928260-ac65-479d-bd4b-f14b48d24ddb","Type":"ContainerDied","Data":"77a5c881bfb4b3162733203d6d06eed351c1cde8f2967461288b978f94eeb5ba"} Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.958222 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6945f64f65-rnq2b" event={"ID":"4b928260-ac65-479d-bd4b-f14b48d24ddb","Type":"ContainerDied","Data":"bdfd7065946391c6d353cb9168c7226ec5dc670a459303e629e9266fcea79077"} Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.958233 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bdfd7065946391c6d353cb9168c7226ec5dc670a459303e629e9266fcea79077" Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.967562 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-6945f64f65-rnq2b" Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.987908 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-scripts\") pod \"4b928260-ac65-479d-bd4b-f14b48d24ddb\" (UID: \"4b928260-ac65-479d-bd4b-f14b48d24ddb\") " Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.987956 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-combined-ca-bundle\") pod \"4b928260-ac65-479d-bd4b-f14b48d24ddb\" (UID: \"4b928260-ac65-479d-bd4b-f14b48d24ddb\") " Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.988002 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-public-tls-certs\") pod \"4b928260-ac65-479d-bd4b-f14b48d24ddb\" (UID: \"4b928260-ac65-479d-bd4b-f14b48d24ddb\") " Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.988031 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-internal-tls-certs\") pod \"4b928260-ac65-479d-bd4b-f14b48d24ddb\" (UID: \"4b928260-ac65-479d-bd4b-f14b48d24ddb\") " Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.988062 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8hb5\" (UniqueName: \"kubernetes.io/projected/4b928260-ac65-479d-bd4b-f14b48d24ddb-kube-api-access-v8hb5\") pod \"4b928260-ac65-479d-bd4b-f14b48d24ddb\" (UID: \"4b928260-ac65-479d-bd4b-f14b48d24ddb\") " Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.988086 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-fernet-keys\") pod \"4b928260-ac65-479d-bd4b-f14b48d24ddb\" (UID: \"4b928260-ac65-479d-bd4b-f14b48d24ddb\") " Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.988130 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-credential-keys\") pod \"4b928260-ac65-479d-bd4b-f14b48d24ddb\" (UID: \"4b928260-ac65-479d-bd4b-f14b48d24ddb\") " Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.988158 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-config-data\") pod \"4b928260-ac65-479d-bd4b-f14b48d24ddb\" (UID: \"4b928260-ac65-479d-bd4b-f14b48d24ddb\") " Feb 19 21:50:02 crc kubenswrapper[4795]: I0219 21:50:02.996095 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "4b928260-ac65-479d-bd4b-f14b48d24ddb" (UID: "4b928260-ac65-479d-bd4b-f14b48d24ddb"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.000606 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "4b928260-ac65-479d-bd4b-f14b48d24ddb" (UID: "4b928260-ac65-479d-bd4b-f14b48d24ddb"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.000878 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b928260-ac65-479d-bd4b-f14b48d24ddb-kube-api-access-v8hb5" (OuterVolumeSpecName: "kube-api-access-v8hb5") pod "4b928260-ac65-479d-bd4b-f14b48d24ddb" (UID: "4b928260-ac65-479d-bd4b-f14b48d24ddb"). InnerVolumeSpecName "kube-api-access-v8hb5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.008507 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-scripts" (OuterVolumeSpecName: "scripts") pod "4b928260-ac65-479d-bd4b-f14b48d24ddb" (UID: "4b928260-ac65-479d-bd4b-f14b48d24ddb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:03 crc kubenswrapper[4795]: E0219 21:50:03.014794 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f6bd473cfa8f83b5b9a223a373e83766b77d9b66d6e769678f44f8ca837142b6" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.015072 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-config-data" (OuterVolumeSpecName: "config-data") pod "4b928260-ac65-479d-bd4b-f14b48d24ddb" (UID: "4b928260-ac65-479d-bd4b-f14b48d24ddb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:03 crc kubenswrapper[4795]: E0219 21:50:03.018074 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f6bd473cfa8f83b5b9a223a373e83766b77d9b66d6e769678f44f8ca837142b6" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.020940 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4b928260-ac65-479d-bd4b-f14b48d24ddb" (UID: "4b928260-ac65-479d-bd4b-f14b48d24ddb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:03 crc kubenswrapper[4795]: E0219 21:50:03.021176 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f6bd473cfa8f83b5b9a223a373e83766b77d9b66d6e769678f44f8ca837142b6" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 19 21:50:03 crc kubenswrapper[4795]: E0219 21:50:03.021214 4795 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="f2710b23-7a5c-44cb-b916-9e08edc59636" containerName="nova-scheduler-scheduler" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.024112 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.035139 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "4b928260-ac65-479d-bd4b-f14b48d24ddb" (UID: "4b928260-ac65-479d-bd4b-f14b48d24ddb"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.041422 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "4b928260-ac65-479d-bd4b-f14b48d24ddb" (UID: "4b928260-ac65-479d-bd4b-f14b48d24ddb"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.088760 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-rabbitmq-erlang-cookie\") pod \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.088799 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-rabbitmq-confd\") pod \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.088854 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7pjq4\" (UniqueName: \"kubernetes.io/projected/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-kube-api-access-7pjq4\") pod \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\" 
(UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.088871 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-erlang-cookie-secret\") pod \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.088893 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-config-data\") pod \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.088912 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-server-conf\") pod \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.088950 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-rabbitmq-plugins\") pod \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.089406 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-pod-info\") pod \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.089430 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage11-crc\") pod \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.089448 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-rabbitmq-tls\") pod \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.089473 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-plugins-conf\") pod \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\" (UID: \"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d\") " Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.089675 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.089686 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.089695 4795 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.089703 4795 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.089711 4795 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v8hb5\" (UniqueName: \"kubernetes.io/projected/4b928260-ac65-479d-bd4b-f14b48d24ddb-kube-api-access-v8hb5\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.090204 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d" (UID: "ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.090503 4795 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.090517 4795 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.090525 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b928260-ac65-479d-bd4b-f14b48d24ddb-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.090651 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d" (UID: "ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.090745 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d" (UID: "ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.091704 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-kube-api-access-7pjq4" (OuterVolumeSpecName: "kube-api-access-7pjq4") pod "ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d" (UID: "ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d"). InnerVolumeSpecName "kube-api-access-7pjq4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.091727 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d" (UID: "ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.092389 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-pod-info" (OuterVolumeSpecName: "pod-info") pod "ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d" (UID: "ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.093086 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "persistence") pod "ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d" (UID: "ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.093783 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d" (UID: "ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.114158 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-config-data" (OuterVolumeSpecName: "config-data") pod "ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d" (UID: "ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.123134 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-server-conf" (OuterVolumeSpecName: "server-conf") pod "ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d" (UID: "ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.164178 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d" (UID: "ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.191530 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7pjq4\" (UniqueName: \"kubernetes.io/projected/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-kube-api-access-7pjq4\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.191564 4795 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.191575 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:03 crc kubenswrapper[4795]: E0219 21:50:03.191570 4795 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Feb 19 21:50:03 crc kubenswrapper[4795]: E0219 21:50:03.191664 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7b096325-542d-4ac6-8d16-8aa0937013b2-config-data podName:7b096325-542d-4ac6-8d16-8aa0937013b2 nodeName:}" failed. No retries permitted until 2026-02-19 21:50:11.191642569 +0000 UTC m=+1322.384160453 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/7b096325-542d-4ac6-8d16-8aa0937013b2-config-data") pod "rabbitmq-server-0" (UID: "7b096325-542d-4ac6-8d16-8aa0937013b2") : configmap "rabbitmq-config-data" not found Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.191584 4795 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-server-conf\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.192007 4795 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.192024 4795 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-pod-info\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.192052 4795 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.192065 4795 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.192078 4795 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.192091 4795 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.192104 4795 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.211754 4795 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.293339 4795 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.522297 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0bbc6c00-2fc9-42cb-9c5a-9a160903ae99" path="/var/lib/kubelet/pods/0bbc6c00-2fc9-42cb-9c5a-9a160903ae99/volumes" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.523651 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2164f9d1-1d8b-486b-beca-0d3a5172b302" path="/var/lib/kubelet/pods/2164f9d1-1d8b-486b-beca-0d3a5172b302/volumes" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.524450 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9" path="/var/lib/kubelet/pods/38d073aa-4cb9-4ed8-9a1e-2a30ff8cc3b9/volumes" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.525454 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42e81a78-17fd-4ed3-b072-dd6a20cbe5d3" path="/var/lib/kubelet/pods/42e81a78-17fd-4ed3-b072-dd6a20cbe5d3/volumes" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.527327 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="57b83043-2f7c-4b55-a2b9-66eef96f0008" path="/var/lib/kubelet/pods/57b83043-2f7c-4b55-a2b9-66eef96f0008/volumes" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.528212 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="793bbadc-8b53-4084-a63a-0b76b37284df" path="/var/lib/kubelet/pods/793bbadc-8b53-4084-a63a-0b76b37284df/volumes" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.530950 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9" path="/var/lib/kubelet/pods/a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9/volumes" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.532938 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2561f4e-0a01-4927-96f8-ee7bef69f561" path="/var/lib/kubelet/pods/d2561f4e-0a01-4927-96f8-ee7bef69f561/volumes" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.534821 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107" path="/var/lib/kubelet/pods/e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107/volumes" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.535989 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4a6a069-904a-4072-b98c-346f67f22def" path="/var/lib/kubelet/pods/e4a6a069-904a-4072-b98c-346f67f22def/volumes" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.537569 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe2e673c-4ce5-4df9-b9e6-0b7cea99727c" path="/var/lib/kubelet/pods/fe2e673c-4ce5-4df9-b9e6-0b7cea99727c/volumes" Feb 19 21:50:03 crc kubenswrapper[4795]: E0219 21:50:03.574993 4795 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Feb 19 21:50:03 crc kubenswrapper[4795]: command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: 2026-02-19T21:49:57Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Feb 19 21:50:03 crc 
kubenswrapper[4795]: /etc/init.d/functions: line 589: 386 Alarm clock "$@" Feb 19 21:50:03 crc kubenswrapper[4795]: > execCommand=["/usr/share/ovn/scripts/ovn-ctl","stop_controller"] containerName="ovn-controller" pod="openstack/ovn-controller-w9fbs" message=< Feb 19 21:50:03 crc kubenswrapper[4795]: Exiting ovn-controller (1) [FAILED] Feb 19 21:50:03 crc kubenswrapper[4795]: Killing ovn-controller (1) [ OK ] Feb 19 21:50:03 crc kubenswrapper[4795]: 2026-02-19T21:49:57Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Feb 19 21:50:03 crc kubenswrapper[4795]: /etc/init.d/functions: line 589: 386 Alarm clock "$@" Feb 19 21:50:03 crc kubenswrapper[4795]: > Feb 19 21:50:03 crc kubenswrapper[4795]: E0219 21:50:03.575065 4795 kuberuntime_container.go:691] "PreStop hook failed" err=< Feb 19 21:50:03 crc kubenswrapper[4795]: command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: 2026-02-19T21:49:57Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Feb 19 21:50:03 crc kubenswrapper[4795]: /etc/init.d/functions: line 589: 386 Alarm clock "$@" Feb 19 21:50:03 crc kubenswrapper[4795]: > pod="openstack/ovn-controller-w9fbs" podUID="c30e8522-2d7f-4f10-a0b4-a7cfc351d093" containerName="ovn-controller" containerID="cri-o://e6a343acb1888ef7b6cc47d70fee6b276aa66bddee8c2a595c2dde665a4a1a27" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.575156 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-w9fbs" podUID="c30e8522-2d7f-4f10-a0b4-a7cfc351d093" containerName="ovn-controller" containerID="cri-o://e6a343acb1888ef7b6cc47d70fee6b276aa66bddee8c2a595c2dde665a4a1a27" gracePeriod=23 Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.688242 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.699727 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"7b096325-542d-4ac6-8d16-8aa0937013b2\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.699772 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7b096325-542d-4ac6-8d16-8aa0937013b2-plugins-conf\") pod \"7b096325-542d-4ac6-8d16-8aa0937013b2\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.699799 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7b096325-542d-4ac6-8d16-8aa0937013b2-rabbitmq-erlang-cookie\") pod \"7b096325-542d-4ac6-8d16-8aa0937013b2\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.699827 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7cwvp\" (UniqueName: \"kubernetes.io/projected/7b096325-542d-4ac6-8d16-8aa0937013b2-kube-api-access-7cwvp\") pod \"7b096325-542d-4ac6-8d16-8aa0937013b2\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.699899 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7b096325-542d-4ac6-8d16-8aa0937013b2-erlang-cookie-secret\") pod \"7b096325-542d-4ac6-8d16-8aa0937013b2\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.699930 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" 
(UniqueName: \"kubernetes.io/empty-dir/7b096325-542d-4ac6-8d16-8aa0937013b2-rabbitmq-plugins\") pod \"7b096325-542d-4ac6-8d16-8aa0937013b2\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.699952 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7b096325-542d-4ac6-8d16-8aa0937013b2-server-conf\") pod \"7b096325-542d-4ac6-8d16-8aa0937013b2\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.699981 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7b096325-542d-4ac6-8d16-8aa0937013b2-rabbitmq-tls\") pod \"7b096325-542d-4ac6-8d16-8aa0937013b2\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.700017 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7b096325-542d-4ac6-8d16-8aa0937013b2-config-data\") pod \"7b096325-542d-4ac6-8d16-8aa0937013b2\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.700045 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7b096325-542d-4ac6-8d16-8aa0937013b2-pod-info\") pod \"7b096325-542d-4ac6-8d16-8aa0937013b2\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.700097 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7b096325-542d-4ac6-8d16-8aa0937013b2-rabbitmq-confd\") pod \"7b096325-542d-4ac6-8d16-8aa0937013b2\" (UID: \"7b096325-542d-4ac6-8d16-8aa0937013b2\") " Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.701872 
4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b096325-542d-4ac6-8d16-8aa0937013b2-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "7b096325-542d-4ac6-8d16-8aa0937013b2" (UID: "7b096325-542d-4ac6-8d16-8aa0937013b2"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.702239 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b096325-542d-4ac6-8d16-8aa0937013b2-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "7b096325-542d-4ac6-8d16-8aa0937013b2" (UID: "7b096325-542d-4ac6-8d16-8aa0937013b2"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.702625 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b096325-542d-4ac6-8d16-8aa0937013b2-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "7b096325-542d-4ac6-8d16-8aa0937013b2" (UID: "7b096325-542d-4ac6-8d16-8aa0937013b2"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.740890 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b096325-542d-4ac6-8d16-8aa0937013b2-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "7b096325-542d-4ac6-8d16-8aa0937013b2" (UID: "7b096325-542d-4ac6-8d16-8aa0937013b2"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.741853 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/7b096325-542d-4ac6-8d16-8aa0937013b2-pod-info" (OuterVolumeSpecName: "pod-info") pod "7b096325-542d-4ac6-8d16-8aa0937013b2" (UID: "7b096325-542d-4ac6-8d16-8aa0937013b2"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.750376 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b096325-542d-4ac6-8d16-8aa0937013b2-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "7b096325-542d-4ac6-8d16-8aa0937013b2" (UID: "7b096325-542d-4ac6-8d16-8aa0937013b2"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.757669 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "7b096325-542d-4ac6-8d16-8aa0937013b2" (UID: "7b096325-542d-4ac6-8d16-8aa0937013b2"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.758957 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b096325-542d-4ac6-8d16-8aa0937013b2-kube-api-access-7cwvp" (OuterVolumeSpecName: "kube-api-access-7cwvp") pod "7b096325-542d-4ac6-8d16-8aa0937013b2" (UID: "7b096325-542d-4ac6-8d16-8aa0937013b2"). InnerVolumeSpecName "kube-api-access-7cwvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.769148 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b096325-542d-4ac6-8d16-8aa0937013b2-config-data" (OuterVolumeSpecName: "config-data") pod "7b096325-542d-4ac6-8d16-8aa0937013b2" (UID: "7b096325-542d-4ac6-8d16-8aa0937013b2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.773610 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b096325-542d-4ac6-8d16-8aa0937013b2-server-conf" (OuterVolumeSpecName: "server-conf") pod "7b096325-542d-4ac6-8d16-8aa0937013b2" (UID: "7b096325-542d-4ac6-8d16-8aa0937013b2"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.801234 4795 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.801264 4795 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7b096325-542d-4ac6-8d16-8aa0937013b2-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.801278 4795 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7b096325-542d-4ac6-8d16-8aa0937013b2-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.801290 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7cwvp\" (UniqueName: \"kubernetes.io/projected/7b096325-542d-4ac6-8d16-8aa0937013b2-kube-api-access-7cwvp\") on node \"crc\" DevicePath 
\"\"" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.801301 4795 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7b096325-542d-4ac6-8d16-8aa0937013b2-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.801311 4795 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7b096325-542d-4ac6-8d16-8aa0937013b2-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.801321 4795 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7b096325-542d-4ac6-8d16-8aa0937013b2-server-conf\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.801331 4795 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7b096325-542d-4ac6-8d16-8aa0937013b2-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.801340 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7b096325-542d-4ac6-8d16-8aa0937013b2-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.801350 4795 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7b096325-542d-4ac6-8d16-8aa0937013b2-pod-info\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.817924 4795 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.819146 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/7b096325-542d-4ac6-8d16-8aa0937013b2-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "7b096325-542d-4ac6-8d16-8aa0937013b2" (UID: "7b096325-542d-4ac6-8d16-8aa0937013b2"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.902391 4795 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7b096325-542d-4ac6-8d16-8aa0937013b2-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.902422 4795 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.975610 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d","Type":"ContainerDied","Data":"12b91da897daae78f76b09af510ceca04ac8909ff3967813b7e6274bf414c6a5"} Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.975641 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.975659 4795 scope.go:117] "RemoveContainer" containerID="5a6b19520891e7087129c9dfe002592444a956d213365be36d88dc721e7adc6e" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.979327 4795 generic.go:334] "Generic (PLEG): container finished" podID="7b096325-542d-4ac6-8d16-8aa0937013b2" containerID="65d80d81718a8cacc6c1500df8f269c37262bad204c9b2124ccbee151d27a66a" exitCode=0 Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.979392 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7b096325-542d-4ac6-8d16-8aa0937013b2","Type":"ContainerDied","Data":"65d80d81718a8cacc6c1500df8f269c37262bad204c9b2124ccbee151d27a66a"} Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.979417 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7b096325-542d-4ac6-8d16-8aa0937013b2","Type":"ContainerDied","Data":"43ff46f740a6f7a342639c9893e1a10e76310ef799a0ad928eb028dabd7dd840"} Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.979411 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.983549 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-w9fbs_c30e8522-2d7f-4f10-a0b4-a7cfc351d093/ovn-controller/0.log" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.983584 4795 generic.go:334] "Generic (PLEG): container finished" podID="c30e8522-2d7f-4f10-a0b4-a7cfc351d093" containerID="e6a343acb1888ef7b6cc47d70fee6b276aa66bddee8c2a595c2dde665a4a1a27" exitCode=139 Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.983641 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-6945f64f65-rnq2b" Feb 19 21:50:03 crc kubenswrapper[4795]: I0219 21:50:03.983633 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-w9fbs" event={"ID":"c30e8522-2d7f-4f10-a0b4-a7cfc351d093","Type":"ContainerDied","Data":"e6a343acb1888ef7b6cc47d70fee6b276aa66bddee8c2a595c2dde665a4a1a27"} Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.015396 4795 scope.go:117] "RemoveContainer" containerID="f5dc53fcd687359370a9224413921410e03027d27bb1e741143948af4422db6c" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.018473 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.025723 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.053274 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-6945f64f65-rnq2b"] Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.061945 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-6945f64f65-rnq2b"] Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.075221 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.078310 4795 scope.go:117] "RemoveContainer" containerID="65d80d81718a8cacc6c1500df8f269c37262bad204c9b2124ccbee151d27a66a" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.080539 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.105210 4795 scope.go:117] "RemoveContainer" containerID="90e58bd420672619d861e1841541403fdbfcac1af5a896cc071e58d6af65cf55" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.127859 4795 scope.go:117] "RemoveContainer" 
containerID="65d80d81718a8cacc6c1500df8f269c37262bad204c9b2124ccbee151d27a66a" Feb 19 21:50:04 crc kubenswrapper[4795]: E0219 21:50:04.128527 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65d80d81718a8cacc6c1500df8f269c37262bad204c9b2124ccbee151d27a66a\": container with ID starting with 65d80d81718a8cacc6c1500df8f269c37262bad204c9b2124ccbee151d27a66a not found: ID does not exist" containerID="65d80d81718a8cacc6c1500df8f269c37262bad204c9b2124ccbee151d27a66a" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.128579 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65d80d81718a8cacc6c1500df8f269c37262bad204c9b2124ccbee151d27a66a"} err="failed to get container status \"65d80d81718a8cacc6c1500df8f269c37262bad204c9b2124ccbee151d27a66a\": rpc error: code = NotFound desc = could not find container \"65d80d81718a8cacc6c1500df8f269c37262bad204c9b2124ccbee151d27a66a\": container with ID starting with 65d80d81718a8cacc6c1500df8f269c37262bad204c9b2124ccbee151d27a66a not found: ID does not exist" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.128607 4795 scope.go:117] "RemoveContainer" containerID="90e58bd420672619d861e1841541403fdbfcac1af5a896cc071e58d6af65cf55" Feb 19 21:50:04 crc kubenswrapper[4795]: E0219 21:50:04.128873 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90e58bd420672619d861e1841541403fdbfcac1af5a896cc071e58d6af65cf55\": container with ID starting with 90e58bd420672619d861e1841541403fdbfcac1af5a896cc071e58d6af65cf55 not found: ID does not exist" containerID="90e58bd420672619d861e1841541403fdbfcac1af5a896cc071e58d6af65cf55" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.128917 4795 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"90e58bd420672619d861e1841541403fdbfcac1af5a896cc071e58d6af65cf55"} err="failed to get container status \"90e58bd420672619d861e1841541403fdbfcac1af5a896cc071e58d6af65cf55\": rpc error: code = NotFound desc = could not find container \"90e58bd420672619d861e1841541403fdbfcac1af5a896cc071e58d6af65cf55\": container with ID starting with 90e58bd420672619d861e1841541403fdbfcac1af5a896cc071e58d6af65cf55 not found: ID does not exist" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.600664 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-w9fbs_c30e8522-2d7f-4f10-a0b4-a7cfc351d093/ovn-controller/0.log" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.600744 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-w9fbs" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.613953 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bkzbj\" (UniqueName: \"kubernetes.io/projected/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-kube-api-access-bkzbj\") pod \"c30e8522-2d7f-4f10-a0b4-a7cfc351d093\" (UID: \"c30e8522-2d7f-4f10-a0b4-a7cfc351d093\") " Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.613995 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-combined-ca-bundle\") pod \"c30e8522-2d7f-4f10-a0b4-a7cfc351d093\" (UID: \"c30e8522-2d7f-4f10-a0b4-a7cfc351d093\") " Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.614034 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-scripts\") pod \"c30e8522-2d7f-4f10-a0b4-a7cfc351d093\" (UID: \"c30e8522-2d7f-4f10-a0b4-a7cfc351d093\") " Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.614076 4795 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-ovn-controller-tls-certs\") pod \"c30e8522-2d7f-4f10-a0b4-a7cfc351d093\" (UID: \"c30e8522-2d7f-4f10-a0b4-a7cfc351d093\") " Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.614094 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-var-run\") pod \"c30e8522-2d7f-4f10-a0b4-a7cfc351d093\" (UID: \"c30e8522-2d7f-4f10-a0b4-a7cfc351d093\") " Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.614143 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-var-run-ovn\") pod \"c30e8522-2d7f-4f10-a0b4-a7cfc351d093\" (UID: \"c30e8522-2d7f-4f10-a0b4-a7cfc351d093\") " Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.614159 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-var-log-ovn\") pod \"c30e8522-2d7f-4f10-a0b4-a7cfc351d093\" (UID: \"c30e8522-2d7f-4f10-a0b4-a7cfc351d093\") " Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.614368 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "c30e8522-2d7f-4f10-a0b4-a7cfc351d093" (UID: "c30e8522-2d7f-4f10-a0b4-a7cfc351d093"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.615153 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-scripts" (OuterVolumeSpecName: "scripts") pod "c30e8522-2d7f-4f10-a0b4-a7cfc351d093" (UID: "c30e8522-2d7f-4f10-a0b4-a7cfc351d093"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.615230 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-var-run" (OuterVolumeSpecName: "var-run") pod "c30e8522-2d7f-4f10-a0b4-a7cfc351d093" (UID: "c30e8522-2d7f-4f10-a0b4-a7cfc351d093"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.615252 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "c30e8522-2d7f-4f10-a0b4-a7cfc351d093" (UID: "c30e8522-2d7f-4f10-a0b4-a7cfc351d093"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.620740 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-kube-api-access-bkzbj" (OuterVolumeSpecName: "kube-api-access-bkzbj") pod "c30e8522-2d7f-4f10-a0b4-a7cfc351d093" (UID: "c30e8522-2d7f-4f10-a0b4-a7cfc351d093"). InnerVolumeSpecName "kube-api-access-bkzbj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.644423 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c30e8522-2d7f-4f10-a0b4-a7cfc351d093" (UID: "c30e8522-2d7f-4f10-a0b4-a7cfc351d093"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.698908 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-ovn-controller-tls-certs" (OuterVolumeSpecName: "ovn-controller-tls-certs") pod "c30e8522-2d7f-4f10-a0b4-a7cfc351d093" (UID: "c30e8522-2d7f-4f10-a0b4-a7cfc351d093"). InnerVolumeSpecName "ovn-controller-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.705889 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.715550 4795 reconciler_common.go:293] "Volume detached for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-ovn-controller-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.715600 4795 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-var-run\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.715610 4795 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-var-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.715622 4795 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-var-log-ovn\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.715630 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bkzbj\" (UniqueName: \"kubernetes.io/projected/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-kube-api-access-bkzbj\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.715638 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.715646 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c30e8522-2d7f-4f10-a0b4-a7cfc351d093-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.816274 4795 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e956453d-551f-44b4-8125-8656b3155402-config-data\") pod \"e956453d-551f-44b4-8125-8656b3155402\" (UID: \"e956453d-551f-44b4-8125-8656b3155402\") " Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.816371 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e956453d-551f-44b4-8125-8656b3155402-sg-core-conf-yaml\") pod \"e956453d-551f-44b4-8125-8656b3155402\" (UID: \"e956453d-551f-44b4-8125-8656b3155402\") " Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.816409 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e956453d-551f-44b4-8125-8656b3155402-scripts\") pod \"e956453d-551f-44b4-8125-8656b3155402\" (UID: \"e956453d-551f-44b4-8125-8656b3155402\") " Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.816441 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e956453d-551f-44b4-8125-8656b3155402-combined-ca-bundle\") pod \"e956453d-551f-44b4-8125-8656b3155402\" (UID: \"e956453d-551f-44b4-8125-8656b3155402\") " Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.816470 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e956453d-551f-44b4-8125-8656b3155402-ceilometer-tls-certs\") pod \"e956453d-551f-44b4-8125-8656b3155402\" (UID: \"e956453d-551f-44b4-8125-8656b3155402\") " Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.816510 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-829s7\" (UniqueName: \"kubernetes.io/projected/e956453d-551f-44b4-8125-8656b3155402-kube-api-access-829s7\") pod 
\"e956453d-551f-44b4-8125-8656b3155402\" (UID: \"e956453d-551f-44b4-8125-8656b3155402\") " Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.816546 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e956453d-551f-44b4-8125-8656b3155402-run-httpd\") pod \"e956453d-551f-44b4-8125-8656b3155402\" (UID: \"e956453d-551f-44b4-8125-8656b3155402\") " Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.816609 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e956453d-551f-44b4-8125-8656b3155402-log-httpd\") pod \"e956453d-551f-44b4-8125-8656b3155402\" (UID: \"e956453d-551f-44b4-8125-8656b3155402\") " Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.817895 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e956453d-551f-44b4-8125-8656b3155402-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e956453d-551f-44b4-8125-8656b3155402" (UID: "e956453d-551f-44b4-8125-8656b3155402"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.818045 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e956453d-551f-44b4-8125-8656b3155402-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e956453d-551f-44b4-8125-8656b3155402" (UID: "e956453d-551f-44b4-8125-8656b3155402"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.820678 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e956453d-551f-44b4-8125-8656b3155402-scripts" (OuterVolumeSpecName: "scripts") pod "e956453d-551f-44b4-8125-8656b3155402" (UID: "e956453d-551f-44b4-8125-8656b3155402"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.823613 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e956453d-551f-44b4-8125-8656b3155402-kube-api-access-829s7" (OuterVolumeSpecName: "kube-api-access-829s7") pod "e956453d-551f-44b4-8125-8656b3155402" (UID: "e956453d-551f-44b4-8125-8656b3155402"). InnerVolumeSpecName "kube-api-access-829s7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.840799 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e956453d-551f-44b4-8125-8656b3155402-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e956453d-551f-44b4-8125-8656b3155402" (UID: "e956453d-551f-44b4-8125-8656b3155402"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.860085 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e956453d-551f-44b4-8125-8656b3155402-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "e956453d-551f-44b4-8125-8656b3155402" (UID: "e956453d-551f-44b4-8125-8656b3155402"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.889297 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e956453d-551f-44b4-8125-8656b3155402-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e956453d-551f-44b4-8125-8656b3155402" (UID: "e956453d-551f-44b4-8125-8656b3155402"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.896483 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e956453d-551f-44b4-8125-8656b3155402-config-data" (OuterVolumeSpecName: "config-data") pod "e956453d-551f-44b4-8125-8656b3155402" (UID: "e956453d-551f-44b4-8125-8656b3155402"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.907544 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.917605 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2710b23-7a5c-44cb-b916-9e08edc59636-combined-ca-bundle\") pod \"f2710b23-7a5c-44cb-b916-9e08edc59636\" (UID: \"f2710b23-7a5c-44cb-b916-9e08edc59636\") " Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.917712 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5wq7\" (UniqueName: \"kubernetes.io/projected/f2710b23-7a5c-44cb-b916-9e08edc59636-kube-api-access-t5wq7\") pod \"f2710b23-7a5c-44cb-b916-9e08edc59636\" (UID: \"f2710b23-7a5c-44cb-b916-9e08edc59636\") " Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.917789 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2710b23-7a5c-44cb-b916-9e08edc59636-config-data\") pod \"f2710b23-7a5c-44cb-b916-9e08edc59636\" (UID: \"f2710b23-7a5c-44cb-b916-9e08edc59636\") " Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.917999 4795 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e956453d-551f-44b4-8125-8656b3155402-ceilometer-tls-certs\") on node \"crc\" 
DevicePath \"\"" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.918013 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-829s7\" (UniqueName: \"kubernetes.io/projected/e956453d-551f-44b4-8125-8656b3155402-kube-api-access-829s7\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.918025 4795 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e956453d-551f-44b4-8125-8656b3155402-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.918035 4795 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e956453d-551f-44b4-8125-8656b3155402-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.918044 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e956453d-551f-44b4-8125-8656b3155402-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.918054 4795 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e956453d-551f-44b4-8125-8656b3155402-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.918063 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e956453d-551f-44b4-8125-8656b3155402-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.918076 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e956453d-551f-44b4-8125-8656b3155402-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.948803 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/secret/f2710b23-7a5c-44cb-b916-9e08edc59636-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f2710b23-7a5c-44cb-b916-9e08edc59636" (UID: "f2710b23-7a5c-44cb-b916-9e08edc59636"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.949409 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2710b23-7a5c-44cb-b916-9e08edc59636-kube-api-access-t5wq7" (OuterVolumeSpecName: "kube-api-access-t5wq7") pod "f2710b23-7a5c-44cb-b916-9e08edc59636" (UID: "f2710b23-7a5c-44cb-b916-9e08edc59636"). InnerVolumeSpecName "kube-api-access-t5wq7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.958929 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2710b23-7a5c-44cb-b916-9e08edc59636-config-data" (OuterVolumeSpecName: "config-data") pod "f2710b23-7a5c-44cb-b916-9e08edc59636" (UID: "f2710b23-7a5c-44cb-b916-9e08edc59636"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.992370 4795 generic.go:334] "Generic (PLEG): container finished" podID="f2710b23-7a5c-44cb-b916-9e08edc59636" containerID="f6bd473cfa8f83b5b9a223a373e83766b77d9b66d6e769678f44f8ca837142b6" exitCode=0 Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.992457 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.992472 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f2710b23-7a5c-44cb-b916-9e08edc59636","Type":"ContainerDied","Data":"f6bd473cfa8f83b5b9a223a373e83766b77d9b66d6e769678f44f8ca837142b6"} Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.992502 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f2710b23-7a5c-44cb-b916-9e08edc59636","Type":"ContainerDied","Data":"18a795e7f80bb780eadeb9ae01b9659d15da8639c51f358e1baf726a07014084"} Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.992517 4795 scope.go:117] "RemoveContainer" containerID="f6bd473cfa8f83b5b9a223a373e83766b77d9b66d6e769678f44f8ca837142b6" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.997661 4795 generic.go:334] "Generic (PLEG): container finished" podID="e956453d-551f-44b4-8125-8656b3155402" containerID="98c35a6c89e932ad9cd3cce6a6d12a0653f350e3b0c1cefa51b67ce7752f0734" exitCode=0 Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.997699 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e956453d-551f-44b4-8125-8656b3155402","Type":"ContainerDied","Data":"98c35a6c89e932ad9cd3cce6a6d12a0653f350e3b0c1cefa51b67ce7752f0734"} Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.997720 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 21:50:04 crc kubenswrapper[4795]: I0219 21:50:04.997731 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e956453d-551f-44b4-8125-8656b3155402","Type":"ContainerDied","Data":"9d09fb4d826d8602127fabff658a8440e51f38b0c8a942f510e29c6808527ef7"} Feb 19 21:50:05 crc kubenswrapper[4795]: I0219 21:50:05.001509 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-w9fbs_c30e8522-2d7f-4f10-a0b4-a7cfc351d093/ovn-controller/0.log" Feb 19 21:50:05 crc kubenswrapper[4795]: I0219 21:50:05.001621 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-w9fbs" Feb 19 21:50:05 crc kubenswrapper[4795]: I0219 21:50:05.001731 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-w9fbs" event={"ID":"c30e8522-2d7f-4f10-a0b4-a7cfc351d093","Type":"ContainerDied","Data":"dc18420d588bd541d274269ae096f1224bb6a914c81107d9d0d3602a4e7a25d2"} Feb 19 21:50:05 crc kubenswrapper[4795]: I0219 21:50:05.019291 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2710b23-7a5c-44cb-b916-9e08edc59636-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:05 crc kubenswrapper[4795]: I0219 21:50:05.019319 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2710b23-7a5c-44cb-b916-9e08edc59636-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:05 crc kubenswrapper[4795]: I0219 21:50:05.019329 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5wq7\" (UniqueName: \"kubernetes.io/projected/f2710b23-7a5c-44cb-b916-9e08edc59636-kube-api-access-t5wq7\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:05 crc kubenswrapper[4795]: I0219 21:50:05.028138 4795 scope.go:117] "RemoveContainer" 
containerID="f6bd473cfa8f83b5b9a223a373e83766b77d9b66d6e769678f44f8ca837142b6" Feb 19 21:50:05 crc kubenswrapper[4795]: E0219 21:50:05.028535 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6bd473cfa8f83b5b9a223a373e83766b77d9b66d6e769678f44f8ca837142b6\": container with ID starting with f6bd473cfa8f83b5b9a223a373e83766b77d9b66d6e769678f44f8ca837142b6 not found: ID does not exist" containerID="f6bd473cfa8f83b5b9a223a373e83766b77d9b66d6e769678f44f8ca837142b6" Feb 19 21:50:05 crc kubenswrapper[4795]: I0219 21:50:05.028567 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6bd473cfa8f83b5b9a223a373e83766b77d9b66d6e769678f44f8ca837142b6"} err="failed to get container status \"f6bd473cfa8f83b5b9a223a373e83766b77d9b66d6e769678f44f8ca837142b6\": rpc error: code = NotFound desc = could not find container \"f6bd473cfa8f83b5b9a223a373e83766b77d9b66d6e769678f44f8ca837142b6\": container with ID starting with f6bd473cfa8f83b5b9a223a373e83766b77d9b66d6e769678f44f8ca837142b6 not found: ID does not exist" Feb 19 21:50:05 crc kubenswrapper[4795]: I0219 21:50:05.028585 4795 scope.go:117] "RemoveContainer" containerID="c677d54567313f232f88b9a873fec54a2577075c7b806ea88ad32df8cb36cb26" Feb 19 21:50:05 crc kubenswrapper[4795]: I0219 21:50:05.038445 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 21:50:05 crc kubenswrapper[4795]: I0219 21:50:05.052446 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 21:50:05 crc kubenswrapper[4795]: I0219 21:50:05.059722 4795 scope.go:117] "RemoveContainer" containerID="21486f4b00e940629fe56d483d784412d9ef80990cf1527d454dfa0b33b31ddb" Feb 19 21:50:05 crc kubenswrapper[4795]: I0219 21:50:05.060836 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:50:05 crc kubenswrapper[4795]: I0219 
21:50:05.066360 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 19 21:50:05 crc kubenswrapper[4795]: I0219 21:50:05.071081 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-w9fbs"] Feb 19 21:50:05 crc kubenswrapper[4795]: I0219 21:50:05.075347 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-w9fbs"] Feb 19 21:50:05 crc kubenswrapper[4795]: I0219 21:50:05.076270 4795 scope.go:117] "RemoveContainer" containerID="98c35a6c89e932ad9cd3cce6a6d12a0653f350e3b0c1cefa51b67ce7752f0734" Feb 19 21:50:05 crc kubenswrapper[4795]: I0219 21:50:05.093715 4795 scope.go:117] "RemoveContainer" containerID="a502298e81b556f379f6fb3a9a80c1cf56ac3cccd10f1bd463a7436d04a56552" Feb 19 21:50:05 crc kubenswrapper[4795]: I0219 21:50:05.113948 4795 scope.go:117] "RemoveContainer" containerID="c677d54567313f232f88b9a873fec54a2577075c7b806ea88ad32df8cb36cb26" Feb 19 21:50:05 crc kubenswrapper[4795]: E0219 21:50:05.114547 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c677d54567313f232f88b9a873fec54a2577075c7b806ea88ad32df8cb36cb26\": container with ID starting with c677d54567313f232f88b9a873fec54a2577075c7b806ea88ad32df8cb36cb26 not found: ID does not exist" containerID="c677d54567313f232f88b9a873fec54a2577075c7b806ea88ad32df8cb36cb26" Feb 19 21:50:05 crc kubenswrapper[4795]: I0219 21:50:05.114575 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c677d54567313f232f88b9a873fec54a2577075c7b806ea88ad32df8cb36cb26"} err="failed to get container status \"c677d54567313f232f88b9a873fec54a2577075c7b806ea88ad32df8cb36cb26\": rpc error: code = NotFound desc = could not find container \"c677d54567313f232f88b9a873fec54a2577075c7b806ea88ad32df8cb36cb26\": container with ID starting with c677d54567313f232f88b9a873fec54a2577075c7b806ea88ad32df8cb36cb26 not found: ID does 
not exist" Feb 19 21:50:05 crc kubenswrapper[4795]: I0219 21:50:05.114599 4795 scope.go:117] "RemoveContainer" containerID="21486f4b00e940629fe56d483d784412d9ef80990cf1527d454dfa0b33b31ddb" Feb 19 21:50:05 crc kubenswrapper[4795]: E0219 21:50:05.115446 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21486f4b00e940629fe56d483d784412d9ef80990cf1527d454dfa0b33b31ddb\": container with ID starting with 21486f4b00e940629fe56d483d784412d9ef80990cf1527d454dfa0b33b31ddb not found: ID does not exist" containerID="21486f4b00e940629fe56d483d784412d9ef80990cf1527d454dfa0b33b31ddb" Feb 19 21:50:05 crc kubenswrapper[4795]: I0219 21:50:05.115477 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21486f4b00e940629fe56d483d784412d9ef80990cf1527d454dfa0b33b31ddb"} err="failed to get container status \"21486f4b00e940629fe56d483d784412d9ef80990cf1527d454dfa0b33b31ddb\": rpc error: code = NotFound desc = could not find container \"21486f4b00e940629fe56d483d784412d9ef80990cf1527d454dfa0b33b31ddb\": container with ID starting with 21486f4b00e940629fe56d483d784412d9ef80990cf1527d454dfa0b33b31ddb not found: ID does not exist" Feb 19 21:50:05 crc kubenswrapper[4795]: I0219 21:50:05.115497 4795 scope.go:117] "RemoveContainer" containerID="98c35a6c89e932ad9cd3cce6a6d12a0653f350e3b0c1cefa51b67ce7752f0734" Feb 19 21:50:05 crc kubenswrapper[4795]: E0219 21:50:05.115720 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98c35a6c89e932ad9cd3cce6a6d12a0653f350e3b0c1cefa51b67ce7752f0734\": container with ID starting with 98c35a6c89e932ad9cd3cce6a6d12a0653f350e3b0c1cefa51b67ce7752f0734 not found: ID does not exist" containerID="98c35a6c89e932ad9cd3cce6a6d12a0653f350e3b0c1cefa51b67ce7752f0734" Feb 19 21:50:05 crc kubenswrapper[4795]: I0219 21:50:05.115741 4795 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98c35a6c89e932ad9cd3cce6a6d12a0653f350e3b0c1cefa51b67ce7752f0734"} err="failed to get container status \"98c35a6c89e932ad9cd3cce6a6d12a0653f350e3b0c1cefa51b67ce7752f0734\": rpc error: code = NotFound desc = could not find container \"98c35a6c89e932ad9cd3cce6a6d12a0653f350e3b0c1cefa51b67ce7752f0734\": container with ID starting with 98c35a6c89e932ad9cd3cce6a6d12a0653f350e3b0c1cefa51b67ce7752f0734 not found: ID does not exist" Feb 19 21:50:05 crc kubenswrapper[4795]: I0219 21:50:05.115754 4795 scope.go:117] "RemoveContainer" containerID="a502298e81b556f379f6fb3a9a80c1cf56ac3cccd10f1bd463a7436d04a56552" Feb 19 21:50:05 crc kubenswrapper[4795]: E0219 21:50:05.116133 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a502298e81b556f379f6fb3a9a80c1cf56ac3cccd10f1bd463a7436d04a56552\": container with ID starting with a502298e81b556f379f6fb3a9a80c1cf56ac3cccd10f1bd463a7436d04a56552 not found: ID does not exist" containerID="a502298e81b556f379f6fb3a9a80c1cf56ac3cccd10f1bd463a7436d04a56552" Feb 19 21:50:05 crc kubenswrapper[4795]: I0219 21:50:05.116241 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a502298e81b556f379f6fb3a9a80c1cf56ac3cccd10f1bd463a7436d04a56552"} err="failed to get container status \"a502298e81b556f379f6fb3a9a80c1cf56ac3cccd10f1bd463a7436d04a56552\": rpc error: code = NotFound desc = could not find container \"a502298e81b556f379f6fb3a9a80c1cf56ac3cccd10f1bd463a7436d04a56552\": container with ID starting with a502298e81b556f379f6fb3a9a80c1cf56ac3cccd10f1bd463a7436d04a56552 not found: ID does not exist" Feb 19 21:50:05 crc kubenswrapper[4795]: I0219 21:50:05.116311 4795 scope.go:117] "RemoveContainer" containerID="e6a343acb1888ef7b6cc47d70fee6b276aa66bddee8c2a595c2dde665a4a1a27" Feb 19 21:50:05 crc kubenswrapper[4795]: I0219 21:50:05.520748 4795 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b928260-ac65-479d-bd4b-f14b48d24ddb" path="/var/lib/kubelet/pods/4b928260-ac65-479d-bd4b-f14b48d24ddb/volumes" Feb 19 21:50:05 crc kubenswrapper[4795]: I0219 21:50:05.521674 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b096325-542d-4ac6-8d16-8aa0937013b2" path="/var/lib/kubelet/pods/7b096325-542d-4ac6-8d16-8aa0937013b2/volumes" Feb 19 21:50:05 crc kubenswrapper[4795]: I0219 21:50:05.522654 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d" path="/var/lib/kubelet/pods/ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d/volumes" Feb 19 21:50:05 crc kubenswrapper[4795]: I0219 21:50:05.524026 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c30e8522-2d7f-4f10-a0b4-a7cfc351d093" path="/var/lib/kubelet/pods/c30e8522-2d7f-4f10-a0b4-a7cfc351d093/volumes" Feb 19 21:50:05 crc kubenswrapper[4795]: I0219 21:50:05.524760 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e956453d-551f-44b4-8125-8656b3155402" path="/var/lib/kubelet/pods/e956453d-551f-44b4-8125-8656b3155402/volumes" Feb 19 21:50:05 crc kubenswrapper[4795]: I0219 21:50:05.526235 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2710b23-7a5c-44cb-b916-9e08edc59636" path="/var/lib/kubelet/pods/f2710b23-7a5c-44cb-b916-9e08edc59636/volumes" Feb 19 21:50:07 crc kubenswrapper[4795]: E0219 21:50:07.628550 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23 is running failed: container process not found" containerID="e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 19 21:50:07 crc kubenswrapper[4795]: E0219 21:50:07.629532 4795 log.go:32] "ExecSync 
cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23 is running failed: container process not found" containerID="e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 19 21:50:07 crc kubenswrapper[4795]: E0219 21:50:07.629831 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23 is running failed: container process not found" containerID="e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 19 21:50:07 crc kubenswrapper[4795]: E0219 21:50:07.629870 4795 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-tl5hf" podUID="9a19676c-9314-43a3-a2f8-bcf56d6b5ce3" containerName="ovsdb-server" Feb 19 21:50:07 crc kubenswrapper[4795]: E0219 21:50:07.630419 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ebc73fca22ccd7aa64a424b1eb2e6d8556b809c91bf36bb6c2bbe679d1c5f3bf" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 19 21:50:07 crc kubenswrapper[4795]: E0219 21:50:07.632659 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="ebc73fca22ccd7aa64a424b1eb2e6d8556b809c91bf36bb6c2bbe679d1c5f3bf" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 19 21:50:07 crc kubenswrapper[4795]: E0219 21:50:07.636857 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ebc73fca22ccd7aa64a424b1eb2e6d8556b809c91bf36bb6c2bbe679d1c5f3bf" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 19 21:50:07 crc kubenswrapper[4795]: E0219 21:50:07.636956 4795 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-tl5hf" podUID="9a19676c-9314-43a3-a2f8-bcf56d6b5ce3" containerName="ovs-vswitchd" Feb 19 21:50:10 crc kubenswrapper[4795]: I0219 21:50:10.024447 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-576c65f985-r97z7" podUID="30f2c894-7a7a-4e5a-a090-a28ab50c766a" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.162:9696/\": dial tcp 10.217.0.162:9696: connect: connection refused" Feb 19 21:50:11 crc kubenswrapper[4795]: I0219 21:50:11.829746 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-576c65f985-r97z7" Feb 19 21:50:11 crc kubenswrapper[4795]: I0219 21:50:11.937752 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dln52\" (UniqueName: \"kubernetes.io/projected/30f2c894-7a7a-4e5a-a090-a28ab50c766a-kube-api-access-dln52\") pod \"30f2c894-7a7a-4e5a-a090-a28ab50c766a\" (UID: \"30f2c894-7a7a-4e5a-a090-a28ab50c766a\") " Feb 19 21:50:11 crc kubenswrapper[4795]: I0219 21:50:11.937862 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/30f2c894-7a7a-4e5a-a090-a28ab50c766a-public-tls-certs\") pod \"30f2c894-7a7a-4e5a-a090-a28ab50c766a\" (UID: \"30f2c894-7a7a-4e5a-a090-a28ab50c766a\") " Feb 19 21:50:11 crc kubenswrapper[4795]: I0219 21:50:11.937998 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30f2c894-7a7a-4e5a-a090-a28ab50c766a-combined-ca-bundle\") pod \"30f2c894-7a7a-4e5a-a090-a28ab50c766a\" (UID: \"30f2c894-7a7a-4e5a-a090-a28ab50c766a\") " Feb 19 21:50:11 crc kubenswrapper[4795]: I0219 21:50:11.938056 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/30f2c894-7a7a-4e5a-a090-a28ab50c766a-internal-tls-certs\") pod \"30f2c894-7a7a-4e5a-a090-a28ab50c766a\" (UID: \"30f2c894-7a7a-4e5a-a090-a28ab50c766a\") " Feb 19 21:50:11 crc kubenswrapper[4795]: I0219 21:50:11.938138 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/30f2c894-7a7a-4e5a-a090-a28ab50c766a-ovndb-tls-certs\") pod \"30f2c894-7a7a-4e5a-a090-a28ab50c766a\" (UID: \"30f2c894-7a7a-4e5a-a090-a28ab50c766a\") " Feb 19 21:50:11 crc kubenswrapper[4795]: I0219 21:50:11.938196 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/30f2c894-7a7a-4e5a-a090-a28ab50c766a-httpd-config\") pod \"30f2c894-7a7a-4e5a-a090-a28ab50c766a\" (UID: \"30f2c894-7a7a-4e5a-a090-a28ab50c766a\") " Feb 19 21:50:11 crc kubenswrapper[4795]: I0219 21:50:11.938244 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/30f2c894-7a7a-4e5a-a090-a28ab50c766a-config\") pod \"30f2c894-7a7a-4e5a-a090-a28ab50c766a\" (UID: \"30f2c894-7a7a-4e5a-a090-a28ab50c766a\") " Feb 19 21:50:11 crc kubenswrapper[4795]: I0219 21:50:11.943919 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30f2c894-7a7a-4e5a-a090-a28ab50c766a-kube-api-access-dln52" (OuterVolumeSpecName: "kube-api-access-dln52") pod "30f2c894-7a7a-4e5a-a090-a28ab50c766a" (UID: "30f2c894-7a7a-4e5a-a090-a28ab50c766a"). InnerVolumeSpecName "kube-api-access-dln52". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:50:11 crc kubenswrapper[4795]: I0219 21:50:11.944347 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30f2c894-7a7a-4e5a-a090-a28ab50c766a-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "30f2c894-7a7a-4e5a-a090-a28ab50c766a" (UID: "30f2c894-7a7a-4e5a-a090-a28ab50c766a"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:11 crc kubenswrapper[4795]: I0219 21:50:11.976304 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30f2c894-7a7a-4e5a-a090-a28ab50c766a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "30f2c894-7a7a-4e5a-a090-a28ab50c766a" (UID: "30f2c894-7a7a-4e5a-a090-a28ab50c766a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:11 crc kubenswrapper[4795]: I0219 21:50:11.979707 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30f2c894-7a7a-4e5a-a090-a28ab50c766a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "30f2c894-7a7a-4e5a-a090-a28ab50c766a" (UID: "30f2c894-7a7a-4e5a-a090-a28ab50c766a"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:11 crc kubenswrapper[4795]: I0219 21:50:11.986361 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30f2c894-7a7a-4e5a-a090-a28ab50c766a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "30f2c894-7a7a-4e5a-a090-a28ab50c766a" (UID: "30f2c894-7a7a-4e5a-a090-a28ab50c766a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:11 crc kubenswrapper[4795]: I0219 21:50:11.994053 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30f2c894-7a7a-4e5a-a090-a28ab50c766a-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "30f2c894-7a7a-4e5a-a090-a28ab50c766a" (UID: "30f2c894-7a7a-4e5a-a090-a28ab50c766a"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:12 crc kubenswrapper[4795]: I0219 21:50:12.011359 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30f2c894-7a7a-4e5a-a090-a28ab50c766a-config" (OuterVolumeSpecName: "config") pod "30f2c894-7a7a-4e5a-a090-a28ab50c766a" (UID: "30f2c894-7a7a-4e5a-a090-a28ab50c766a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:12 crc kubenswrapper[4795]: I0219 21:50:12.040113 4795 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/30f2c894-7a7a-4e5a-a090-a28ab50c766a-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:12 crc kubenswrapper[4795]: I0219 21:50:12.040386 4795 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/30f2c894-7a7a-4e5a-a090-a28ab50c766a-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:12 crc kubenswrapper[4795]: I0219 21:50:12.040442 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/30f2c894-7a7a-4e5a-a090-a28ab50c766a-config\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:12 crc kubenswrapper[4795]: I0219 21:50:12.040506 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dln52\" (UniqueName: \"kubernetes.io/projected/30f2c894-7a7a-4e5a-a090-a28ab50c766a-kube-api-access-dln52\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:12 crc kubenswrapper[4795]: I0219 21:50:12.040559 4795 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/30f2c894-7a7a-4e5a-a090-a28ab50c766a-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:12 crc kubenswrapper[4795]: I0219 21:50:12.040608 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30f2c894-7a7a-4e5a-a090-a28ab50c766a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:12 crc kubenswrapper[4795]: I0219 21:50:12.040656 4795 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/30f2c894-7a7a-4e5a-a090-a28ab50c766a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:12 crc kubenswrapper[4795]: I0219 21:50:12.090762 4795 
generic.go:334] "Generic (PLEG): container finished" podID="30f2c894-7a7a-4e5a-a090-a28ab50c766a" containerID="b4418cb3b4a933e20130924748ce8ea933d7929e7a32c146e7e7ce593a7c13a0" exitCode=0 Feb 19 21:50:12 crc kubenswrapper[4795]: I0219 21:50:12.090831 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-576c65f985-r97z7" event={"ID":"30f2c894-7a7a-4e5a-a090-a28ab50c766a","Type":"ContainerDied","Data":"b4418cb3b4a933e20130924748ce8ea933d7929e7a32c146e7e7ce593a7c13a0"} Feb 19 21:50:12 crc kubenswrapper[4795]: I0219 21:50:12.090899 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-576c65f985-r97z7" Feb 19 21:50:12 crc kubenswrapper[4795]: I0219 21:50:12.091216 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-576c65f985-r97z7" event={"ID":"30f2c894-7a7a-4e5a-a090-a28ab50c766a","Type":"ContainerDied","Data":"cc411b717439dc2d51f309775cfcf3728048016bc68869b8b28221a90840d6fb"} Feb 19 21:50:12 crc kubenswrapper[4795]: I0219 21:50:12.091266 4795 scope.go:117] "RemoveContainer" containerID="fe9cf6c1b6f422c771781be2cfb21deabc54071e241ab6b5c24d3d83414ee4ec" Feb 19 21:50:12 crc kubenswrapper[4795]: I0219 21:50:12.113983 4795 scope.go:117] "RemoveContainer" containerID="b4418cb3b4a933e20130924748ce8ea933d7929e7a32c146e7e7ce593a7c13a0" Feb 19 21:50:12 crc kubenswrapper[4795]: I0219 21:50:12.129500 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-576c65f985-r97z7"] Feb 19 21:50:12 crc kubenswrapper[4795]: I0219 21:50:12.133975 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-576c65f985-r97z7"] Feb 19 21:50:12 crc kubenswrapper[4795]: I0219 21:50:12.152489 4795 scope.go:117] "RemoveContainer" containerID="fe9cf6c1b6f422c771781be2cfb21deabc54071e241ab6b5c24d3d83414ee4ec" Feb 19 21:50:12 crc kubenswrapper[4795]: E0219 21:50:12.153144 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"fe9cf6c1b6f422c771781be2cfb21deabc54071e241ab6b5c24d3d83414ee4ec\": container with ID starting with fe9cf6c1b6f422c771781be2cfb21deabc54071e241ab6b5c24d3d83414ee4ec not found: ID does not exist" containerID="fe9cf6c1b6f422c771781be2cfb21deabc54071e241ab6b5c24d3d83414ee4ec" Feb 19 21:50:12 crc kubenswrapper[4795]: I0219 21:50:12.153204 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe9cf6c1b6f422c771781be2cfb21deabc54071e241ab6b5c24d3d83414ee4ec"} err="failed to get container status \"fe9cf6c1b6f422c771781be2cfb21deabc54071e241ab6b5c24d3d83414ee4ec\": rpc error: code = NotFound desc = could not find container \"fe9cf6c1b6f422c771781be2cfb21deabc54071e241ab6b5c24d3d83414ee4ec\": container with ID starting with fe9cf6c1b6f422c771781be2cfb21deabc54071e241ab6b5c24d3d83414ee4ec not found: ID does not exist" Feb 19 21:50:12 crc kubenswrapper[4795]: I0219 21:50:12.153235 4795 scope.go:117] "RemoveContainer" containerID="b4418cb3b4a933e20130924748ce8ea933d7929e7a32c146e7e7ce593a7c13a0" Feb 19 21:50:12 crc kubenswrapper[4795]: E0219 21:50:12.153884 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4418cb3b4a933e20130924748ce8ea933d7929e7a32c146e7e7ce593a7c13a0\": container with ID starting with b4418cb3b4a933e20130924748ce8ea933d7929e7a32c146e7e7ce593a7c13a0 not found: ID does not exist" containerID="b4418cb3b4a933e20130924748ce8ea933d7929e7a32c146e7e7ce593a7c13a0" Feb 19 21:50:12 crc kubenswrapper[4795]: I0219 21:50:12.153958 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4418cb3b4a933e20130924748ce8ea933d7929e7a32c146e7e7ce593a7c13a0"} err="failed to get container status \"b4418cb3b4a933e20130924748ce8ea933d7929e7a32c146e7e7ce593a7c13a0\": rpc error: code = NotFound desc = could not find container 
\"b4418cb3b4a933e20130924748ce8ea933d7929e7a32c146e7e7ce593a7c13a0\": container with ID starting with b4418cb3b4a933e20130924748ce8ea933d7929e7a32c146e7e7ce593a7c13a0 not found: ID does not exist" Feb 19 21:50:12 crc kubenswrapper[4795]: E0219 21:50:12.628667 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23 is running failed: container process not found" containerID="e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 19 21:50:12 crc kubenswrapper[4795]: E0219 21:50:12.632446 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ebc73fca22ccd7aa64a424b1eb2e6d8556b809c91bf36bb6c2bbe679d1c5f3bf" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 19 21:50:12 crc kubenswrapper[4795]: E0219 21:50:12.635716 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ebc73fca22ccd7aa64a424b1eb2e6d8556b809c91bf36bb6c2bbe679d1c5f3bf" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 19 21:50:12 crc kubenswrapper[4795]: E0219 21:50:12.637732 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ebc73fca22ccd7aa64a424b1eb2e6d8556b809c91bf36bb6c2bbe679d1c5f3bf" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 19 21:50:12 crc kubenswrapper[4795]: E0219 21:50:12.637786 4795 prober.go:104] "Probe errored" err="rpc error: 
code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-tl5hf" podUID="9a19676c-9314-43a3-a2f8-bcf56d6b5ce3" containerName="ovs-vswitchd" Feb 19 21:50:12 crc kubenswrapper[4795]: E0219 21:50:12.632979 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23 is running failed: container process not found" containerID="e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 19 21:50:12 crc kubenswrapper[4795]: E0219 21:50:12.642095 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23 is running failed: container process not found" containerID="e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 19 21:50:12 crc kubenswrapper[4795]: E0219 21:50:12.642389 4795 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-tl5hf" podUID="9a19676c-9314-43a3-a2f8-bcf56d6b5ce3" containerName="ovsdb-server" Feb 19 21:50:13 crc kubenswrapper[4795]: I0219 21:50:13.528957 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30f2c894-7a7a-4e5a-a090-a28ab50c766a" path="/var/lib/kubelet/pods/30f2c894-7a7a-4e5a-a090-a28ab50c766a/volumes" Feb 19 21:50:17 crc kubenswrapper[4795]: E0219 21:50:17.630563 4795 log.go:32] "ExecSync 
cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23 is running failed: container process not found" containerID="e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 19 21:50:17 crc kubenswrapper[4795]: E0219 21:50:17.632267 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23 is running failed: container process not found" containerID="e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 19 21:50:17 crc kubenswrapper[4795]: E0219 21:50:17.632334 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ebc73fca22ccd7aa64a424b1eb2e6d8556b809c91bf36bb6c2bbe679d1c5f3bf" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 19 21:50:17 crc kubenswrapper[4795]: E0219 21:50:17.638125 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23 is running failed: container process not found" containerID="e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 19 21:50:17 crc kubenswrapper[4795]: E0219 21:50:17.638225 4795 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-tl5hf" podUID="9a19676c-9314-43a3-a2f8-bcf56d6b5ce3" containerName="ovsdb-server" Feb 19 21:50:17 crc kubenswrapper[4795]: E0219 21:50:17.639413 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ebc73fca22ccd7aa64a424b1eb2e6d8556b809c91bf36bb6c2bbe679d1c5f3bf" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 19 21:50:17 crc kubenswrapper[4795]: E0219 21:50:17.641115 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ebc73fca22ccd7aa64a424b1eb2e6d8556b809c91bf36bb6c2bbe679d1c5f3bf" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 19 21:50:17 crc kubenswrapper[4795]: E0219 21:50:17.641215 4795 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-tl5hf" podUID="9a19676c-9314-43a3-a2f8-bcf56d6b5ce3" containerName="ovs-vswitchd" Feb 19 21:50:22 crc kubenswrapper[4795]: E0219 21:50:22.628017 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23 is running failed: container process not found" containerID="e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 19 21:50:22 crc kubenswrapper[4795]: E0219 21:50:22.629273 4795 log.go:32] "ExecSync 
cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ebc73fca22ccd7aa64a424b1eb2e6d8556b809c91bf36bb6c2bbe679d1c5f3bf" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 19 21:50:22 crc kubenswrapper[4795]: E0219 21:50:22.630006 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23 is running failed: container process not found" containerID="e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 19 21:50:22 crc kubenswrapper[4795]: E0219 21:50:22.630725 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23 is running failed: container process not found" containerID="e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 19 21:50:22 crc kubenswrapper[4795]: E0219 21:50:22.630795 4795 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-tl5hf" podUID="9a19676c-9314-43a3-a2f8-bcf56d6b5ce3" containerName="ovsdb-server" Feb 19 21:50:22 crc kubenswrapper[4795]: E0219 21:50:22.632997 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="ebc73fca22ccd7aa64a424b1eb2e6d8556b809c91bf36bb6c2bbe679d1c5f3bf" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 19 21:50:22 crc kubenswrapper[4795]: E0219 21:50:22.635240 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ebc73fca22ccd7aa64a424b1eb2e6d8556b809c91bf36bb6c2bbe679d1c5f3bf" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 19 21:50:22 crc kubenswrapper[4795]: E0219 21:50:22.635309 4795 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-tl5hf" podUID="9a19676c-9314-43a3-a2f8-bcf56d6b5ce3" containerName="ovs-vswitchd" Feb 19 21:50:26 crc kubenswrapper[4795]: I0219 21:50:26.249061 4795 generic.go:334] "Generic (PLEG): container finished" podID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerID="955740cbd5ac4eda735378957980240001d5c0ce0905f2fca18b4155f3fb6c98" exitCode=137 Feb 19 21:50:26 crc kubenswrapper[4795]: I0219 21:50:26.249180 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6c773ec2-a400-42a9-8784-ed9c295c3bb4","Type":"ContainerDied","Data":"955740cbd5ac4eda735378957980240001d5c0ce0905f2fca18b4155f3fb6c98"} Feb 19 21:50:26 crc kubenswrapper[4795]: I0219 21:50:26.511434 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Feb 19 21:50:26 crc kubenswrapper[4795]: I0219 21:50:26.668872 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/6c773ec2-a400-42a9-8784-ed9c295c3bb4-cache\") pod \"6c773ec2-a400-42a9-8784-ed9c295c3bb4\" (UID: \"6c773ec2-a400-42a9-8784-ed9c295c3bb4\") " Feb 19 21:50:26 crc kubenswrapper[4795]: I0219 21:50:26.668941 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/6c773ec2-a400-42a9-8784-ed9c295c3bb4-lock\") pod \"6c773ec2-a400-42a9-8784-ed9c295c3bb4\" (UID: \"6c773ec2-a400-42a9-8784-ed9c295c3bb4\") " Feb 19 21:50:26 crc kubenswrapper[4795]: I0219 21:50:26.668968 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"6c773ec2-a400-42a9-8784-ed9c295c3bb4\" (UID: \"6c773ec2-a400-42a9-8784-ed9c295c3bb4\") " Feb 19 21:50:26 crc kubenswrapper[4795]: I0219 21:50:26.669019 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6c773ec2-a400-42a9-8784-ed9c295c3bb4-etc-swift\") pod \"6c773ec2-a400-42a9-8784-ed9c295c3bb4\" (UID: \"6c773ec2-a400-42a9-8784-ed9c295c3bb4\") " Feb 19 21:50:26 crc kubenswrapper[4795]: I0219 21:50:26.669037 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7bh4z\" (UniqueName: \"kubernetes.io/projected/6c773ec2-a400-42a9-8784-ed9c295c3bb4-kube-api-access-7bh4z\") pod \"6c773ec2-a400-42a9-8784-ed9c295c3bb4\" (UID: \"6c773ec2-a400-42a9-8784-ed9c295c3bb4\") " Feb 19 21:50:26 crc kubenswrapper[4795]: I0219 21:50:26.669062 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6c773ec2-a400-42a9-8784-ed9c295c3bb4-combined-ca-bundle\") pod \"6c773ec2-a400-42a9-8784-ed9c295c3bb4\" (UID: \"6c773ec2-a400-42a9-8784-ed9c295c3bb4\") " Feb 19 21:50:26 crc kubenswrapper[4795]: I0219 21:50:26.670113 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c773ec2-a400-42a9-8784-ed9c295c3bb4-cache" (OuterVolumeSpecName: "cache") pod "6c773ec2-a400-42a9-8784-ed9c295c3bb4" (UID: "6c773ec2-a400-42a9-8784-ed9c295c3bb4"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:50:26 crc kubenswrapper[4795]: I0219 21:50:26.670480 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c773ec2-a400-42a9-8784-ed9c295c3bb4-lock" (OuterVolumeSpecName: "lock") pod "6c773ec2-a400-42a9-8784-ed9c295c3bb4" (UID: "6c773ec2-a400-42a9-8784-ed9c295c3bb4"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:50:26 crc kubenswrapper[4795]: I0219 21:50:26.675750 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "swift") pod "6c773ec2-a400-42a9-8784-ed9c295c3bb4" (UID: "6c773ec2-a400-42a9-8784-ed9c295c3bb4"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 19 21:50:26 crc kubenswrapper[4795]: I0219 21:50:26.676133 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c773ec2-a400-42a9-8784-ed9c295c3bb4-kube-api-access-7bh4z" (OuterVolumeSpecName: "kube-api-access-7bh4z") pod "6c773ec2-a400-42a9-8784-ed9c295c3bb4" (UID: "6c773ec2-a400-42a9-8784-ed9c295c3bb4"). InnerVolumeSpecName "kube-api-access-7bh4z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:50:26 crc kubenswrapper[4795]: I0219 21:50:26.676437 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c773ec2-a400-42a9-8784-ed9c295c3bb4-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "6c773ec2-a400-42a9-8784-ed9c295c3bb4" (UID: "6c773ec2-a400-42a9-8784-ed9c295c3bb4"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:50:26 crc kubenswrapper[4795]: I0219 21:50:26.719048 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 19 21:50:26 crc kubenswrapper[4795]: I0219 21:50:26.770565 4795 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/6c773ec2-a400-42a9-8784-ed9c295c3bb4-cache\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:26 crc kubenswrapper[4795]: I0219 21:50:26.770624 4795 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/6c773ec2-a400-42a9-8784-ed9c295c3bb4-lock\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:26 crc kubenswrapper[4795]: I0219 21:50:26.770653 4795 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Feb 19 21:50:26 crc kubenswrapper[4795]: I0219 21:50:26.770668 4795 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6c773ec2-a400-42a9-8784-ed9c295c3bb4-etc-swift\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:26 crc kubenswrapper[4795]: I0219 21:50:26.770680 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7bh4z\" (UniqueName: \"kubernetes.io/projected/6c773ec2-a400-42a9-8784-ed9c295c3bb4-kube-api-access-7bh4z\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:26 crc kubenswrapper[4795]: 
I0219 21:50:26.788911 4795 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Feb 19 21:50:26 crc kubenswrapper[4795]: I0219 21:50:26.871305 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c54f77a4-1095-4ff1-bc74-b845cde659d9-config-data-custom\") pod \"c54f77a4-1095-4ff1-bc74-b845cde659d9\" (UID: \"c54f77a4-1095-4ff1-bc74-b845cde659d9\") " Feb 19 21:50:26 crc kubenswrapper[4795]: I0219 21:50:26.871461 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c54f77a4-1095-4ff1-bc74-b845cde659d9-scripts\") pod \"c54f77a4-1095-4ff1-bc74-b845cde659d9\" (UID: \"c54f77a4-1095-4ff1-bc74-b845cde659d9\") " Feb 19 21:50:26 crc kubenswrapper[4795]: I0219 21:50:26.871517 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c54f77a4-1095-4ff1-bc74-b845cde659d9-config-data\") pod \"c54f77a4-1095-4ff1-bc74-b845cde659d9\" (UID: \"c54f77a4-1095-4ff1-bc74-b845cde659d9\") " Feb 19 21:50:26 crc kubenswrapper[4795]: I0219 21:50:26.871611 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b4bdc\" (UniqueName: \"kubernetes.io/projected/c54f77a4-1095-4ff1-bc74-b845cde659d9-kube-api-access-b4bdc\") pod \"c54f77a4-1095-4ff1-bc74-b845cde659d9\" (UID: \"c54f77a4-1095-4ff1-bc74-b845cde659d9\") " Feb 19 21:50:26 crc kubenswrapper[4795]: I0219 21:50:26.871647 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c54f77a4-1095-4ff1-bc74-b845cde659d9-etc-machine-id\") pod \"c54f77a4-1095-4ff1-bc74-b845cde659d9\" (UID: \"c54f77a4-1095-4ff1-bc74-b845cde659d9\") " Feb 19 21:50:26 crc kubenswrapper[4795]: I0219 
21:50:26.871704 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c54f77a4-1095-4ff1-bc74-b845cde659d9-combined-ca-bundle\") pod \"c54f77a4-1095-4ff1-bc74-b845cde659d9\" (UID: \"c54f77a4-1095-4ff1-bc74-b845cde659d9\") " Feb 19 21:50:26 crc kubenswrapper[4795]: I0219 21:50:26.872215 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c54f77a4-1095-4ff1-bc74-b845cde659d9-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "c54f77a4-1095-4ff1-bc74-b845cde659d9" (UID: "c54f77a4-1095-4ff1-bc74-b845cde659d9"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:50:26 crc kubenswrapper[4795]: I0219 21:50:26.875498 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c54f77a4-1095-4ff1-bc74-b845cde659d9-scripts" (OuterVolumeSpecName: "scripts") pod "c54f77a4-1095-4ff1-bc74-b845cde659d9" (UID: "c54f77a4-1095-4ff1-bc74-b845cde659d9"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:26 crc kubenswrapper[4795]: I0219 21:50:26.881521 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c54f77a4-1095-4ff1-bc74-b845cde659d9-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:26 crc kubenswrapper[4795]: I0219 21:50:26.881563 4795 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:26 crc kubenswrapper[4795]: I0219 21:50:26.881574 4795 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c54f77a4-1095-4ff1-bc74-b845cde659d9-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:26 crc kubenswrapper[4795]: I0219 21:50:26.885411 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c54f77a4-1095-4ff1-bc74-b845cde659d9-kube-api-access-b4bdc" (OuterVolumeSpecName: "kube-api-access-b4bdc") pod "c54f77a4-1095-4ff1-bc74-b845cde659d9" (UID: "c54f77a4-1095-4ff1-bc74-b845cde659d9"). InnerVolumeSpecName "kube-api-access-b4bdc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:50:26 crc kubenswrapper[4795]: I0219 21:50:26.900354 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c54f77a4-1095-4ff1-bc74-b845cde659d9-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c54f77a4-1095-4ff1-bc74-b845cde659d9" (UID: "c54f77a4-1095-4ff1-bc74-b845cde659d9"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:26 crc kubenswrapper[4795]: I0219 21:50:26.950243 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c54f77a4-1095-4ff1-bc74-b845cde659d9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c54f77a4-1095-4ff1-bc74-b845cde659d9" (UID: "c54f77a4-1095-4ff1-bc74-b845cde659d9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:26 crc kubenswrapper[4795]: I0219 21:50:26.977465 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c54f77a4-1095-4ff1-bc74-b845cde659d9-config-data" (OuterVolumeSpecName: "config-data") pod "c54f77a4-1095-4ff1-bc74-b845cde659d9" (UID: "c54f77a4-1095-4ff1-bc74-b845cde659d9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:26 crc kubenswrapper[4795]: I0219 21:50:26.983691 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b4bdc\" (UniqueName: \"kubernetes.io/projected/c54f77a4-1095-4ff1-bc74-b845cde659d9-kube-api-access-b4bdc\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:26 crc kubenswrapper[4795]: I0219 21:50:26.983713 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c54f77a4-1095-4ff1-bc74-b845cde659d9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:26 crc kubenswrapper[4795]: I0219 21:50:26.983724 4795 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c54f77a4-1095-4ff1-bc74-b845cde659d9-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:26 crc kubenswrapper[4795]: I0219 21:50:26.983735 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c54f77a4-1095-4ff1-bc74-b845cde659d9-config-data\") on node \"crc\" 
DevicePath \"\"" Feb 19 21:50:26 crc kubenswrapper[4795]: I0219 21:50:26.998409 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c773ec2-a400-42a9-8784-ed9c295c3bb4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6c773ec2-a400-42a9-8784-ed9c295c3bb4" (UID: "6c773ec2-a400-42a9-8784-ed9c295c3bb4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.078402 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-tl5hf_9a19676c-9314-43a3-a2f8-bcf56d6b5ce3/ovs-vswitchd/0.log" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.079350 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-tl5hf" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.085615 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c773ec2-a400-42a9-8784-ed9c295c3bb4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.186200 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rp572\" (UniqueName: \"kubernetes.io/projected/9a19676c-9314-43a3-a2f8-bcf56d6b5ce3-kube-api-access-rp572\") pod \"9a19676c-9314-43a3-a2f8-bcf56d6b5ce3\" (UID: \"9a19676c-9314-43a3-a2f8-bcf56d6b5ce3\") " Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.186276 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/9a19676c-9314-43a3-a2f8-bcf56d6b5ce3-var-lib\") pod \"9a19676c-9314-43a3-a2f8-bcf56d6b5ce3\" (UID: \"9a19676c-9314-43a3-a2f8-bcf56d6b5ce3\") " Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.186307 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"var-run\" (UniqueName: \"kubernetes.io/host-path/9a19676c-9314-43a3-a2f8-bcf56d6b5ce3-var-run\") pod \"9a19676c-9314-43a3-a2f8-bcf56d6b5ce3\" (UID: \"9a19676c-9314-43a3-a2f8-bcf56d6b5ce3\") " Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.186347 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/9a19676c-9314-43a3-a2f8-bcf56d6b5ce3-var-log\") pod \"9a19676c-9314-43a3-a2f8-bcf56d6b5ce3\" (UID: \"9a19676c-9314-43a3-a2f8-bcf56d6b5ce3\") " Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.186411 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9a19676c-9314-43a3-a2f8-bcf56d6b5ce3-var-lib" (OuterVolumeSpecName: "var-lib") pod "9a19676c-9314-43a3-a2f8-bcf56d6b5ce3" (UID: "9a19676c-9314-43a3-a2f8-bcf56d6b5ce3"). InnerVolumeSpecName "var-lib". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.186429 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9a19676c-9314-43a3-a2f8-bcf56d6b5ce3-scripts\") pod \"9a19676c-9314-43a3-a2f8-bcf56d6b5ce3\" (UID: \"9a19676c-9314-43a3-a2f8-bcf56d6b5ce3\") " Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.186480 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9a19676c-9314-43a3-a2f8-bcf56d6b5ce3-var-log" (OuterVolumeSpecName: "var-log") pod "9a19676c-9314-43a3-a2f8-bcf56d6b5ce3" (UID: "9a19676c-9314-43a3-a2f8-bcf56d6b5ce3"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.186488 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9a19676c-9314-43a3-a2f8-bcf56d6b5ce3-var-run" (OuterVolumeSpecName: "var-run") pod "9a19676c-9314-43a3-a2f8-bcf56d6b5ce3" (UID: "9a19676c-9314-43a3-a2f8-bcf56d6b5ce3"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.186551 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/9a19676c-9314-43a3-a2f8-bcf56d6b5ce3-etc-ovs\") pod \"9a19676c-9314-43a3-a2f8-bcf56d6b5ce3\" (UID: \"9a19676c-9314-43a3-a2f8-bcf56d6b5ce3\") " Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.186581 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9a19676c-9314-43a3-a2f8-bcf56d6b5ce3-etc-ovs" (OuterVolumeSpecName: "etc-ovs") pod "9a19676c-9314-43a3-a2f8-bcf56d6b5ce3" (UID: "9a19676c-9314-43a3-a2f8-bcf56d6b5ce3"). InnerVolumeSpecName "etc-ovs". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.187186 4795 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9a19676c-9314-43a3-a2f8-bcf56d6b5ce3-var-run\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.187214 4795 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/9a19676c-9314-43a3-a2f8-bcf56d6b5ce3-var-log\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.187227 4795 reconciler_common.go:293] "Volume detached for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/9a19676c-9314-43a3-a2f8-bcf56d6b5ce3-etc-ovs\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.187239 4795 reconciler_common.go:293] "Volume detached for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/9a19676c-9314-43a3-a2f8-bcf56d6b5ce3-var-lib\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.187574 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a19676c-9314-43a3-a2f8-bcf56d6b5ce3-scripts" (OuterVolumeSpecName: "scripts") pod "9a19676c-9314-43a3-a2f8-bcf56d6b5ce3" (UID: "9a19676c-9314-43a3-a2f8-bcf56d6b5ce3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.190705 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a19676c-9314-43a3-a2f8-bcf56d6b5ce3-kube-api-access-rp572" (OuterVolumeSpecName: "kube-api-access-rp572") pod "9a19676c-9314-43a3-a2f8-bcf56d6b5ce3" (UID: "9a19676c-9314-43a3-a2f8-bcf56d6b5ce3"). InnerVolumeSpecName "kube-api-access-rp572". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.266235 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6c773ec2-a400-42a9-8784-ed9c295c3bb4","Type":"ContainerDied","Data":"5f46001ee34a633ce141975121f3de2f61c9a83244b699ec2a130f6af5efbc36"} Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.266297 4795 scope.go:117] "RemoveContainer" containerID="955740cbd5ac4eda735378957980240001d5c0ce0905f2fca18b4155f3fb6c98" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.266345 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.268958 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-tl5hf_9a19676c-9314-43a3-a2f8-bcf56d6b5ce3/ovs-vswitchd/0.log" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.269927 4795 generic.go:334] "Generic (PLEG): container finished" podID="9a19676c-9314-43a3-a2f8-bcf56d6b5ce3" containerID="ebc73fca22ccd7aa64a424b1eb2e6d8556b809c91bf36bb6c2bbe679d1c5f3bf" exitCode=137 Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.270029 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tl5hf" event={"ID":"9a19676c-9314-43a3-a2f8-bcf56d6b5ce3","Type":"ContainerDied","Data":"ebc73fca22ccd7aa64a424b1eb2e6d8556b809c91bf36bb6c2bbe679d1c5f3bf"} Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.270105 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tl5hf" event={"ID":"9a19676c-9314-43a3-a2f8-bcf56d6b5ce3","Type":"ContainerDied","Data":"79efa7732f14d953bfab30c856bec07a21e8fbe6e42ebb6b0b9f4b604f334bb3"} Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.270283 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-tl5hf" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.273863 4795 generic.go:334] "Generic (PLEG): container finished" podID="c54f77a4-1095-4ff1-bc74-b845cde659d9" containerID="5256ff7cce2d8a61bc1b79dbf58379aff2f08fd19479fafdd317965f24177cc8" exitCode=137 Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.274237 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c54f77a4-1095-4ff1-bc74-b845cde659d9","Type":"ContainerDied","Data":"5256ff7cce2d8a61bc1b79dbf58379aff2f08fd19479fafdd317965f24177cc8"} Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.274269 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c54f77a4-1095-4ff1-bc74-b845cde659d9","Type":"ContainerDied","Data":"edafb0a067aa15654ff7c76ec822cff24dc2310ed79e9d7093d41cbc935fc540"} Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.274358 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.288191 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9a19676c-9314-43a3-a2f8-bcf56d6b5ce3-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.288230 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rp572\" (UniqueName: \"kubernetes.io/projected/9a19676c-9314-43a3-a2f8-bcf56d6b5ce3-kube-api-access-rp572\") on node \"crc\" DevicePath \"\"" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.295062 4795 scope.go:117] "RemoveContainer" containerID="bb960073c3b2955d7aa2d18d3eb2e0958e7e98f4cd499d7077f5064d1e43a05e" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.312817 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.331029 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-storage-0"] Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.333337 4795 scope.go:117] "RemoveContainer" containerID="8d3a569ab5140e595996d2c82fd170ed28aa9420de4fdae36e9b5854b2e0bd5e" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.335679 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-tl5hf"] Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.343296 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ovs-tl5hf"] Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.352810 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.359084 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.367388 4795 scope.go:117] "RemoveContainer" 
containerID="2ce7f7a343a6c79fd57a6b4c7cea8f6f21ccfbc5ea5261b4c592c5cc2035910e" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.388195 4795 scope.go:117] "RemoveContainer" containerID="cb0176a835c07bb843ea9834f19b5792b6d9700c5cd61a140ec8b99a66854f5f" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.407897 4795 scope.go:117] "RemoveContainer" containerID="c8d438309dbe4ee742eb9d7a2b93e755a74c9ba2dd39409dcf7caf84dee6405a" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.425745 4795 scope.go:117] "RemoveContainer" containerID="c5d0640985105b2140d43ecf956f5621f6e82eb5a3d40d95fb3d09d303406c84" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.451069 4795 scope.go:117] "RemoveContainer" containerID="f2b55f40d9ab92e06fdc09f65e72764f7f9c63fbb1f126ede2058624236d001f" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.472942 4795 scope.go:117] "RemoveContainer" containerID="5019fca913d092cd1d004e058553c364ac08007bafeae027b681e3bb6eb59026" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.492884 4795 scope.go:117] "RemoveContainer" containerID="be1c394688c447ed772b4929317159025a1e97491b40b847644ed369351532b5" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.513374 4795 scope.go:117] "RemoveContainer" containerID="a86db1a02ddab5086097179b35e6f17d71c32f36157625ee70912b23839603d4" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.525154 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" path="/var/lib/kubelet/pods/6c773ec2-a400-42a9-8784-ed9c295c3bb4/volumes" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.529020 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a19676c-9314-43a3-a2f8-bcf56d6b5ce3" path="/var/lib/kubelet/pods/9a19676c-9314-43a3-a2f8-bcf56d6b5ce3/volumes" Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.530244 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="c54f77a4-1095-4ff1-bc74-b845cde659d9" path="/var/lib/kubelet/pods/c54f77a4-1095-4ff1-bc74-b845cde659d9/volumes"
Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.533856 4795 scope.go:117] "RemoveContainer" containerID="d2281de4777acfa86c800d06ed4c2e0ac8613cf4008b8449cd7089d057ee51ec"
Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.555485 4795 scope.go:117] "RemoveContainer" containerID="22441008e17545864d7c6366d4ab4fa8333a1c04e36b9961e8fdfdfaeec8b1b6"
Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.574812 4795 scope.go:117] "RemoveContainer" containerID="b4b0474dd3a4192273fa0b2dc273a792e955cd0dde33e24a1afa65bb56656eaa"
Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.592496 4795 scope.go:117] "RemoveContainer" containerID="51c55baa52f08bcb95276c0f7a67a7ef348b9bd02a9fc401f50e679f37e0c117"
Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.611206 4795 scope.go:117] "RemoveContainer" containerID="ebc73fca22ccd7aa64a424b1eb2e6d8556b809c91bf36bb6c2bbe679d1c5f3bf"
Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.637865 4795 scope.go:117] "RemoveContainer" containerID="e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23"
Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.666456 4795 scope.go:117] "RemoveContainer" containerID="000996048fc152f2f7ff89617b797c3b6c97a3bedd94f9b3f95b461725418fde"
Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.702721 4795 scope.go:117] "RemoveContainer" containerID="ebc73fca22ccd7aa64a424b1eb2e6d8556b809c91bf36bb6c2bbe679d1c5f3bf"
Feb 19 21:50:27 crc kubenswrapper[4795]: E0219 21:50:27.703475 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebc73fca22ccd7aa64a424b1eb2e6d8556b809c91bf36bb6c2bbe679d1c5f3bf\": container with ID starting with ebc73fca22ccd7aa64a424b1eb2e6d8556b809c91bf36bb6c2bbe679d1c5f3bf not found: ID does not exist" containerID="ebc73fca22ccd7aa64a424b1eb2e6d8556b809c91bf36bb6c2bbe679d1c5f3bf"
Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.703543 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebc73fca22ccd7aa64a424b1eb2e6d8556b809c91bf36bb6c2bbe679d1c5f3bf"} err="failed to get container status \"ebc73fca22ccd7aa64a424b1eb2e6d8556b809c91bf36bb6c2bbe679d1c5f3bf\": rpc error: code = NotFound desc = could not find container \"ebc73fca22ccd7aa64a424b1eb2e6d8556b809c91bf36bb6c2bbe679d1c5f3bf\": container with ID starting with ebc73fca22ccd7aa64a424b1eb2e6d8556b809c91bf36bb6c2bbe679d1c5f3bf not found: ID does not exist"
Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.703582 4795 scope.go:117] "RemoveContainer" containerID="e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23"
Feb 19 21:50:27 crc kubenswrapper[4795]: E0219 21:50:27.704102 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23\": container with ID starting with e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23 not found: ID does not exist" containerID="e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23"
Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.704234 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23"} err="failed to get container status \"e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23\": rpc error: code = NotFound desc = could not find container \"e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23\": container with ID starting with e774980b86800956d8b51a9102b686bb1dd5fb0880204942df0776d88ce26c23 not found: ID does not exist"
Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.704283 4795 scope.go:117] "RemoveContainer" containerID="000996048fc152f2f7ff89617b797c3b6c97a3bedd94f9b3f95b461725418fde"
Feb 19 21:50:27 crc kubenswrapper[4795]: E0219 21:50:27.705534 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"000996048fc152f2f7ff89617b797c3b6c97a3bedd94f9b3f95b461725418fde\": container with ID starting with 000996048fc152f2f7ff89617b797c3b6c97a3bedd94f9b3f95b461725418fde not found: ID does not exist" containerID="000996048fc152f2f7ff89617b797c3b6c97a3bedd94f9b3f95b461725418fde"
Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.705578 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"000996048fc152f2f7ff89617b797c3b6c97a3bedd94f9b3f95b461725418fde"} err="failed to get container status \"000996048fc152f2f7ff89617b797c3b6c97a3bedd94f9b3f95b461725418fde\": rpc error: code = NotFound desc = could not find container \"000996048fc152f2f7ff89617b797c3b6c97a3bedd94f9b3f95b461725418fde\": container with ID starting with 000996048fc152f2f7ff89617b797c3b6c97a3bedd94f9b3f95b461725418fde not found: ID does not exist"
Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.705604 4795 scope.go:117] "RemoveContainer" containerID="b84203bdd91f64f0aed89ae5d8334985c117eb72cc298fbd0865d60f726218c1"
Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.745717 4795 scope.go:117] "RemoveContainer" containerID="5256ff7cce2d8a61bc1b79dbf58379aff2f08fd19479fafdd317965f24177cc8"
Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.767427 4795 scope.go:117] "RemoveContainer" containerID="b84203bdd91f64f0aed89ae5d8334985c117eb72cc298fbd0865d60f726218c1"
Feb 19 21:50:27 crc kubenswrapper[4795]: E0219 21:50:27.767956 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b84203bdd91f64f0aed89ae5d8334985c117eb72cc298fbd0865d60f726218c1\": container with ID starting with b84203bdd91f64f0aed89ae5d8334985c117eb72cc298fbd0865d60f726218c1 not found: ID does not exist" containerID="b84203bdd91f64f0aed89ae5d8334985c117eb72cc298fbd0865d60f726218c1"
Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.768001 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b84203bdd91f64f0aed89ae5d8334985c117eb72cc298fbd0865d60f726218c1"} err="failed to get container status \"b84203bdd91f64f0aed89ae5d8334985c117eb72cc298fbd0865d60f726218c1\": rpc error: code = NotFound desc = could not find container \"b84203bdd91f64f0aed89ae5d8334985c117eb72cc298fbd0865d60f726218c1\": container with ID starting with b84203bdd91f64f0aed89ae5d8334985c117eb72cc298fbd0865d60f726218c1 not found: ID does not exist"
Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.768027 4795 scope.go:117] "RemoveContainer" containerID="5256ff7cce2d8a61bc1b79dbf58379aff2f08fd19479fafdd317965f24177cc8"
Feb 19 21:50:27 crc kubenswrapper[4795]: E0219 21:50:27.768440 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5256ff7cce2d8a61bc1b79dbf58379aff2f08fd19479fafdd317965f24177cc8\": container with ID starting with 5256ff7cce2d8a61bc1b79dbf58379aff2f08fd19479fafdd317965f24177cc8 not found: ID does not exist" containerID="5256ff7cce2d8a61bc1b79dbf58379aff2f08fd19479fafdd317965f24177cc8"
Feb 19 21:50:27 crc kubenswrapper[4795]: I0219 21:50:27.768470 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5256ff7cce2d8a61bc1b79dbf58379aff2f08fd19479fafdd317965f24177cc8"} err="failed to get container status \"5256ff7cce2d8a61bc1b79dbf58379aff2f08fd19479fafdd317965f24177cc8\": rpc error: code = NotFound desc = could not find container \"5256ff7cce2d8a61bc1b79dbf58379aff2f08fd19479fafdd317965f24177cc8\": container with ID starting with 5256ff7cce2d8a61bc1b79dbf58379aff2f08fd19479fafdd317965f24177cc8 not found: ID does not exist"
Feb 19 21:50:28 crc kubenswrapper[4795]: I0219 21:50:28.427719 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 21:50:28 crc kubenswrapper[4795]: I0219 21:50:28.427763 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 21:50:28 crc kubenswrapper[4795]: I0219 21:50:28.427799 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d"
Feb 19 21:50:28 crc kubenswrapper[4795]: I0219 21:50:28.428313 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d1fe8d148e55484c7e32b6632ef0256602ab969af6bc815e1058a95087794811"} pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 19 21:50:28 crc kubenswrapper[4795]: I0219 21:50:28.428357 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" containerID="cri-o://d1fe8d148e55484c7e32b6632ef0256602ab969af6bc815e1058a95087794811" gracePeriod=600
Feb 19 21:50:29 crc kubenswrapper[4795]: I0219 21:50:29.303928 4795 generic.go:334] "Generic (PLEG): container finished" podID="7591bc58-96f5-486a-8653-0ad93938b019" containerID="d1fe8d148e55484c7e32b6632ef0256602ab969af6bc815e1058a95087794811" exitCode=0
Feb 19 21:50:29 crc kubenswrapper[4795]: I0219 21:50:29.304001 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerDied","Data":"d1fe8d148e55484c7e32b6632ef0256602ab969af6bc815e1058a95087794811"}
Feb 19 21:50:29 crc kubenswrapper[4795]: I0219 21:50:29.304359 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerStarted","Data":"89a8591a1a6c5a44263990a6f7f07e1dd5fbf859d704de1710031664de85cae8"}
Feb 19 21:50:29 crc kubenswrapper[4795]: I0219 21:50:29.304381 4795 scope.go:117] "RemoveContainer" containerID="26a524895f2f97a5543d7713a3a6fad00cc54e588f3f27507aef436c0255d593"
Feb 19 21:52:12 crc kubenswrapper[4795]: I0219 21:52:12.118093 4795 scope.go:117] "RemoveContainer" containerID="088f63c71340a869532178f76fc11ab9c0cbbdc5c1aeacd6e851cf5d40a46fa0"
Feb 19 21:52:12 crc kubenswrapper[4795]: I0219 21:52:12.175441 4795 scope.go:117] "RemoveContainer" containerID="6b0c6da228cd7d133c73866bc31005f08b881ce2ffda7a1dc8905fe2cbf580b7"
Feb 19 21:52:12 crc kubenswrapper[4795]: I0219 21:52:12.198598 4795 scope.go:117] "RemoveContainer" containerID="9ae6b68ead4d625967dc9f8e67a83bcadf072cd6d6b5d617f89d10bb543678b8"
Feb 19 21:52:12 crc kubenswrapper[4795]: I0219 21:52:12.252100 4795 scope.go:117] "RemoveContainer" containerID="83bc6dd6efe6f9962ae47d6fb7eb26b3205715ecf2651537919f42966aa1ce32"
Feb 19 21:52:12 crc kubenswrapper[4795]: I0219 21:52:12.273188 4795 scope.go:117] "RemoveContainer" containerID="d90138519ffcbc0102f0a7e9dc5bfa3f8f09f718ea4e3fd3a5f7e537cfb53122"
Feb 19 21:52:12 crc kubenswrapper[4795]: I0219 21:52:12.291538 4795 scope.go:117] "RemoveContainer" containerID="464384efe0cd85358add102341feabf48351464a71bfc2ad9ed2ce3ea45a4a94"
Feb 19 21:52:12 crc kubenswrapper[4795]: I0219 21:52:12.323258 4795 scope.go:117] "RemoveContainer" containerID="c3d76989d78df6e0877ea687626ca27c833a6485113f8ba0b36de864c9998267"
Feb 19 21:52:12 crc kubenswrapper[4795]: I0219 21:52:12.340866 4795 scope.go:117] "RemoveContainer" containerID="83550b0e412e3c0f2147b2389e1c3220efde13a4057f9e67e612e0461d3172fc"
Feb 19 21:52:12 crc kubenswrapper[4795]: I0219 21:52:12.357932 4795 scope.go:117] "RemoveContainer" containerID="b0effc166b07beb1020b33c2901a97075e3341db6cb3398e72144f393ccb6850"
Feb 19 21:52:12 crc kubenswrapper[4795]: I0219 21:52:12.399657 4795 scope.go:117] "RemoveContainer" containerID="122378ef931bbdce5fbd1c31b593170eeae5e5fed8a74b40fbb0d536de3a3d22"
Feb 19 21:52:12 crc kubenswrapper[4795]: I0219 21:52:12.454698 4795 scope.go:117] "RemoveContainer" containerID="27f5e555e30e338ab9e5ae7facd6ba963cf6336bbb4bec245c099c4afa8bb6a3"
Feb 19 21:52:12 crc kubenswrapper[4795]: I0219 21:52:12.484514 4795 scope.go:117] "RemoveContainer" containerID="f13d7ab12cdcba95909b27dcd1c1e77cc3443dbab3be6042ac8015c2168e9280"
Feb 19 21:52:12 crc kubenswrapper[4795]: I0219 21:52:12.531376 4795 scope.go:117] "RemoveContainer" containerID="00ef13a6fc1812f0a18c09ccfd124e49c733a0a626dc7ae23746169010b14516"
Feb 19 21:52:12 crc kubenswrapper[4795]: I0219 21:52:12.554165 4795 scope.go:117] "RemoveContainer" containerID="88bfd25384ba94c2658b1d015dafd73e32d146937b8caf9b1891261850b11f22"
Feb 19 21:52:12 crc kubenswrapper[4795]: I0219 21:52:12.567848 4795 scope.go:117] "RemoveContainer" containerID="7c32d427d23010f8ec7644388fc5c6ced9ee00bbf0a6575354abad70dc15498d"
Feb 19 21:52:12 crc kubenswrapper[4795]: I0219 21:52:12.593046 4795 scope.go:117] "RemoveContainer" containerID="2c21dc2092bc9760ef50273deb040b6251a319d32c5e2c0fdd3bf1678ba55094"
Feb 19 21:52:12 crc kubenswrapper[4795]: I0219 21:52:12.613320 4795 scope.go:117] "RemoveContainer" containerID="62e8bfd862eff78d929edc90f98ab271d6cbd53192119320b7d77f0dddb03767"
Feb 19 21:52:12 crc kubenswrapper[4795]: I0219 21:52:12.633978 4795 scope.go:117] "RemoveContainer" containerID="95c63f1640980c95f161952724dcddb5ac630545c512a2aa1ea3882e47e48df9"
Feb 19 21:52:12 crc kubenswrapper[4795]: I0219 21:52:12.649399 4795 scope.go:117] "RemoveContainer" containerID="f3e360ac25dc639935c61b2baa9937c7f07b88ebec95aede182ab25a4907955c"
Feb 19 21:52:12 crc kubenswrapper[4795]: I0219 21:52:12.676066 4795 scope.go:117] "RemoveContainer" containerID="0c92f3d0df6fb4ffa1967f7b15d462ab0538b07666cad5a1fe2cc6d118293fe8"
Feb 19 21:52:28 crc kubenswrapper[4795]: I0219 21:52:28.427394 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 21:52:28 crc kubenswrapper[4795]: I0219 21:52:28.427927 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 21:52:58 crc kubenswrapper[4795]: I0219 21:52:58.427459 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 21:52:58 crc kubenswrapper[4795]: I0219 21:52:58.428054 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 21:53:12 crc kubenswrapper[4795]: I0219 21:53:12.930212 4795 scope.go:117] "RemoveContainer" containerID="76c63f1533e7c3fa5624057c7772474f1729c45e568e970f6a98d3e337ab74ae"
Feb 19 21:53:12 crc kubenswrapper[4795]: I0219 21:53:12.969142 4795 scope.go:117] "RemoveContainer" containerID="0852bee2926e9dad886249433c004e0f4241817093b65fb2480708d4c6c502c0"
Feb 19 21:53:13 crc kubenswrapper[4795]: I0219 21:53:13.013027 4795 scope.go:117] "RemoveContainer" containerID="a55b1033c9d990cf35d1c92321ef2ffff86d5c602754349685623befa9e70257"
Feb 19 21:53:13 crc kubenswrapper[4795]: I0219 21:53:13.045156 4795 scope.go:117] "RemoveContainer" containerID="da621ffa210adf5451b6eb9d78a9cbed2608c7b5e2b1a9ce00eba1c1e65c6ec4"
Feb 19 21:53:13 crc kubenswrapper[4795]: I0219 21:53:13.093756 4795 scope.go:117] "RemoveContainer" containerID="77a5c881bfb4b3162733203d6d06eed351c1cde8f2967461288b978f94eeb5ba"
Feb 19 21:53:28 crc kubenswrapper[4795]: I0219 21:53:28.427445 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 21:53:28 crc kubenswrapper[4795]: I0219 21:53:28.428036 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 21:53:28 crc kubenswrapper[4795]: I0219 21:53:28.428098 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d"
Feb 19 21:53:28 crc kubenswrapper[4795]: I0219 21:53:28.428821 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"89a8591a1a6c5a44263990a6f7f07e1dd5fbf859d704de1710031664de85cae8"} pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 19 21:53:28 crc kubenswrapper[4795]: I0219 21:53:28.428893 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" containerID="cri-o://89a8591a1a6c5a44263990a6f7f07e1dd5fbf859d704de1710031664de85cae8" gracePeriod=600
Feb 19 21:53:28 crc kubenswrapper[4795]: E0219 21:53:28.557104 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 21:53:28 crc kubenswrapper[4795]: I0219 21:53:28.845233 4795 generic.go:334] "Generic (PLEG): container finished" podID="7591bc58-96f5-486a-8653-0ad93938b019" containerID="89a8591a1a6c5a44263990a6f7f07e1dd5fbf859d704de1710031664de85cae8" exitCode=0
Feb 19 21:53:28 crc kubenswrapper[4795]: I0219 21:53:28.845322 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerDied","Data":"89a8591a1a6c5a44263990a6f7f07e1dd5fbf859d704de1710031664de85cae8"}
Feb 19 21:53:28 crc kubenswrapper[4795]: I0219 21:53:28.845421 4795 scope.go:117] "RemoveContainer" containerID="d1fe8d148e55484c7e32b6632ef0256602ab969af6bc815e1058a95087794811"
Feb 19 21:53:28 crc kubenswrapper[4795]: I0219 21:53:28.845954 4795 scope.go:117] "RemoveContainer" containerID="89a8591a1a6c5a44263990a6f7f07e1dd5fbf859d704de1710031664de85cae8"
Feb 19 21:53:28 crc kubenswrapper[4795]: E0219 21:53:28.846151 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 21:53:42 crc kubenswrapper[4795]: I0219 21:53:42.511259 4795 scope.go:117] "RemoveContainer" containerID="89a8591a1a6c5a44263990a6f7f07e1dd5fbf859d704de1710031664de85cae8"
Feb 19 21:53:42 crc kubenswrapper[4795]: E0219 21:53:42.511994 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.031510 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-crhfj"]
Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.031955 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="account-replicator"
Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.031983 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="account-replicator"
Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.032009 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe2e673c-4ce5-4df9-b9e6-0b7cea99727c" containerName="ovn-northd"
Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.032021 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe2e673c-4ce5-4df9-b9e6-0b7cea99727c" containerName="ovn-northd"
Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.032042 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="container-replicator"
Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.032055 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="container-replicator"
Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.032076 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="object-expirer"
Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.032102 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="object-expirer"
Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.032125 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="swift-recon-cron"
Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.032137 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="swift-recon-cron"
Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.032157 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3697a3b0-4077-4837-bcdc-c17d8aa361f1" containerName="glance-httpd"
Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.032192 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="3697a3b0-4077-4837-bcdc-c17d8aa361f1" containerName="glance-httpd"
Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.032222 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b096325-542d-4ac6-8d16-8aa0937013b2" containerName="setup-container"
Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.032233 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b096325-542d-4ac6-8d16-8aa0937013b2" containerName="setup-container"
Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.032254 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b01bcd5b-435a-4702-b0a4-8dfe8f553c23" containerName="placement-api"
Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.032277 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="b01bcd5b-435a-4702-b0a4-8dfe8f553c23" containerName="placement-api"
Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.032297 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe2e673c-4ce5-4df9-b9e6-0b7cea99727c" containerName="openstack-network-exporter"
Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.032311 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe2e673c-4ce5-4df9-b9e6-0b7cea99727c" containerName="openstack-network-exporter"
Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.032326 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9" containerName="glance-httpd"
Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.032338 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9" containerName="glance-httpd"
Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.032364 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d" containerName="rabbitmq"
Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.032377 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d" containerName="rabbitmq"
Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.032395 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42e81a78-17fd-4ed3-b072-dd6a20cbe5d3" containerName="memcached"
Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.032407 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="42e81a78-17fd-4ed3-b072-dd6a20cbe5d3" containerName="memcached"
Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.032422 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4a6a069-904a-4072-b98c-346f67f22def" containerName="barbican-api-log"
Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.032434 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4a6a069-904a-4072-b98c-346f67f22def" containerName="barbican-api-log"
Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.032456 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="container-auditor"
Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.032468 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="container-auditor"
Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.032482 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c30e8522-2d7f-4f10-a0b4-a7cfc351d093" containerName="ovn-controller"
Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.032493 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="c30e8522-2d7f-4f10-a0b4-a7cfc351d093" containerName="ovn-controller"
Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.032514 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="account-auditor"
Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.032527 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="account-auditor"
Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.032550 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="object-server"
Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.032562 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="object-server"
Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.032584 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2561f4e-0a01-4927-96f8-ee7bef69f561" containerName="cinder-api"
Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.032595 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2561f4e-0a01-4927-96f8-ee7bef69f561" containerName="cinder-api"
Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.032613 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="account-reaper"
Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.032625 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="account-reaper"
Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.032638 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="rsync"
Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.032650 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="rsync"
Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.032665 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="793bbadc-8b53-4084-a63a-0b76b37284df" containerName="nova-api-api"
Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.032678 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="793bbadc-8b53-4084-a63a-0b76b37284df" containerName="nova-api-api"
Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.032698 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bbc6c00-2fc9-42cb-9c5a-9a160903ae99" containerName="mysql-bootstrap"
Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.032710 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bbc6c00-2fc9-42cb-9c5a-9a160903ae99" containerName="mysql-bootstrap"
Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.032730 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bbc6c00-2fc9-42cb-9c5a-9a160903ae99" containerName="galera"
Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.032742 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bbc6c00-2fc9-42cb-9c5a-9a160903ae99" containerName="galera"
Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.032761 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30f2c894-7a7a-4e5a-a090-a28ab50c766a" containerName="neutron-httpd"
Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.032773 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="30f2c894-7a7a-4e5a-a090-a28ab50c766a" containerName="neutron-httpd"
Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.032788 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d" containerName="setup-container"
Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.032800 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d" containerName="setup-container"
Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.032813 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2164f9d1-1d8b-486b-beca-0d3a5172b302" containerName="mariadb-account-create-update"
Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.032825 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2164f9d1-1d8b-486b-beca-0d3a5172b302" containerName="mariadb-account-create-update"
Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.032843 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="296f6b57-de45-495d-abe9-8c779c157057" containerName="kube-state-metrics"
Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.032857 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="296f6b57-de45-495d-abe9-8c779c157057" containerName="kube-state-metrics"
Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.032874 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e956453d-551f-44b4-8125-8656b3155402" containerName="proxy-httpd"
Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.032887 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e956453d-551f-44b4-8125-8656b3155402" containerName="proxy-httpd"
Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.032907 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2710b23-7a5c-44cb-b916-9e08edc59636" containerName="nova-scheduler-scheduler"
Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.032919 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2710b23-7a5c-44cb-b916-9e08edc59636" containerName="nova-scheduler-scheduler"
Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.032935 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="793bbadc-8b53-4084-a63a-0b76b37284df" containerName="nova-api-log"
Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.032947 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="793bbadc-8b53-4084-a63a-0b76b37284df" containerName="nova-api-log"
Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.032960 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="account-server"
Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.032971 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="account-server"
Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.032991 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9" containerName="glance-log"
Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.033003 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9" containerName="glance-log"
Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.033026 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="container-server"
Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.033037 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="container-server"
Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.033059 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b928260-ac65-479d-bd4b-f14b48d24ddb" containerName="keystone-api"
Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.033070 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b928260-ac65-479d-bd4b-f14b48d24ddb" containerName="keystone-api"
Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.033092 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2561f4e-0a01-4927-96f8-ee7bef69f561" containerName="cinder-api-log"
Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.033104 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2561f4e-0a01-4927-96f8-ee7bef69f561" containerName="cinder-api-log"
Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.033128 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da2e3f89-bf0b-4371-8e5b-a0037f266c70" containerName="proxy-server"
Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.033140 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="da2e3f89-bf0b-4371-8e5b-a0037f266c70" containerName="proxy-server"
Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.033153 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3697a3b0-4077-4837-bcdc-c17d8aa361f1" containerName="glance-log"
Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.033195 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="3697a3b0-4077-4837-bcdc-c17d8aa361f1" containerName="glance-log"
Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.033212 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107" containerName="nova-metadata-log"
Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.033224 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107" containerName="nova-metadata-log"
Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.033244 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e956453d-551f-44b4-8125-8656b3155402" containerName="ceilometer-notification-agent"
Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.033256 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e956453d-551f-44b4-8125-8656b3155402" containerName="ceilometer-notification-agent"
Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.033279 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30f2c894-7a7a-4e5a-a090-a28ab50c766a" containerName="neutron-api"
Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.033293 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="30f2c894-7a7a-4e5a-a090-a28ab50c766a" containerName="neutron-api"
Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.033316 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="object-updater"
Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.033328 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="object-updater"
Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.033349 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="object-auditor"
Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.033361 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="object-auditor"
Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.033378 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="object-replicator"
Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.033390 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="object-replicator"
Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.033411 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c54f77a4-1095-4ff1-bc74-b845cde659d9" containerName="probe"
Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.033423 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="c54f77a4-1095-4ff1-bc74-b845cde659d9" containerName="probe"
Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.033442 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b096325-542d-4ac6-8d16-8aa0937013b2" containerName="rabbitmq"
Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.033454 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b096325-542d-4ac6-8d16-8aa0937013b2" containerName="rabbitmq"
Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.033494 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da2e3f89-bf0b-4371-8e5b-a0037f266c70" containerName="proxy-httpd"
Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.033509 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="da2e3f89-bf0b-4371-8e5b-a0037f266c70" containerName="proxy-httpd"
Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.033536 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="container-updater"
Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.033549 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="container-updater"
Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.033569 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e956453d-551f-44b4-8125-8656b3155402" containerName="sg-core"
Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.033581 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e956453d-551f-44b4-8125-8656b3155402" containerName="sg-core"
Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.033598 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b01bcd5b-435a-4702-b0a4-8dfe8f553c23" containerName="placement-log"
Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.033609 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="b01bcd5b-435a-4702-b0a4-8dfe8f553c23" containerName="placement-log"
Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.033626 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4a6a069-904a-4072-b98c-346f67f22def" containerName="barbican-api"
Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.033638 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4a6a069-904a-4072-b98c-346f67f22def" containerName="barbican-api"
Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.033653 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c54f77a4-1095-4ff1-bc74-b845cde659d9" containerName="cinder-scheduler"
Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.033664 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="c54f77a4-1095-4ff1-bc74-b845cde659d9" containerName="cinder-scheduler"
Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.033685 4795 cpu_manager.go:410] "RemoveStaleState: removing container"
podUID="9a19676c-9314-43a3-a2f8-bcf56d6b5ce3" containerName="ovsdb-server-init" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.033697 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a19676c-9314-43a3-a2f8-bcf56d6b5ce3" containerName="ovsdb-server-init" Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.033717 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a19676c-9314-43a3-a2f8-bcf56d6b5ce3" containerName="ovsdb-server" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.033729 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a19676c-9314-43a3-a2f8-bcf56d6b5ce3" containerName="ovsdb-server" Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.033744 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e956453d-551f-44b4-8125-8656b3155402" containerName="ceilometer-central-agent" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.033756 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e956453d-551f-44b4-8125-8656b3155402" containerName="ceilometer-central-agent" Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.033772 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107" containerName="nova-metadata-metadata" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.033784 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107" containerName="nova-metadata-metadata" Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.033808 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57b83043-2f7c-4b55-a2b9-66eef96f0008" containerName="nova-cell0-conductor-conductor" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.033822 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="57b83043-2f7c-4b55-a2b9-66eef96f0008" containerName="nova-cell0-conductor-conductor" Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.033838 4795 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a19676c-9314-43a3-a2f8-bcf56d6b5ce3" containerName="ovs-vswitchd" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.033849 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a19676c-9314-43a3-a2f8-bcf56d6b5ce3" containerName="ovs-vswitchd" Feb 19 21:53:43 crc kubenswrapper[4795]: E0219 21:53:43.033865 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2164f9d1-1d8b-486b-beca-0d3a5172b302" containerName="mariadb-account-create-update" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.033877 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2164f9d1-1d8b-486b-beca-0d3a5172b302" containerName="mariadb-account-create-update" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034115 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2710b23-7a5c-44cb-b916-9e08edc59636" containerName="nova-scheduler-scheduler" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034134 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="2164f9d1-1d8b-486b-beca-0d3a5172b302" containerName="mariadb-account-create-update" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034151 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="42e81a78-17fd-4ed3-b072-dd6a20cbe5d3" containerName="memcached" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034201 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2561f4e-0a01-4927-96f8-ee7bef69f561" containerName="cinder-api" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034224 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4a6a069-904a-4072-b98c-346f67f22def" containerName="barbican-api" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034249 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="da2e3f89-bf0b-4371-8e5b-a0037f266c70" containerName="proxy-server" Feb 19 21:53:43 crc 
kubenswrapper[4795]: I0219 21:53:43.034272 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="57b83043-2f7c-4b55-a2b9-66eef96f0008" containerName="nova-cell0-conductor-conductor" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034297 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="b01bcd5b-435a-4702-b0a4-8dfe8f553c23" containerName="placement-api" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034322 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="account-replicator" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034343 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="account-auditor" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034363 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107" containerName="nova-metadata-log" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034384 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="3697a3b0-4077-4837-bcdc-c17d8aa361f1" containerName="glance-log" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034398 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="b01bcd5b-435a-4702-b0a4-8dfe8f553c23" containerName="placement-log" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034411 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="object-expirer" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034432 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b928260-ac65-479d-bd4b-f14b48d24ddb" containerName="keystone-api" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034451 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="793bbadc-8b53-4084-a63a-0b76b37284df" 
containerName="nova-api-api" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034467 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a19676c-9314-43a3-a2f8-bcf56d6b5ce3" containerName="ovsdb-server" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034480 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="container-auditor" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034496 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="container-updater" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034514 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="da2e3f89-bf0b-4371-8e5b-a0037f266c70" containerName="proxy-httpd" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034531 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe2e673c-4ce5-4df9-b9e6-0b7cea99727c" containerName="openstack-network-exporter" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034543 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="account-reaper" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034565 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bbc6c00-2fc9-42cb-9c5a-9a160903ae99" containerName="galera" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034577 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9" containerName="glance-log" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034591 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="object-server" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034607 4795 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="fe2e673c-4ce5-4df9-b9e6-0b7cea99727c" containerName="ovn-northd" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034619 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="account-server" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034634 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="e956453d-551f-44b4-8125-8656b3155402" containerName="proxy-httpd" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034651 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="e956453d-551f-44b4-8125-8656b3155402" containerName="ceilometer-notification-agent" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034664 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="3697a3b0-4077-4837-bcdc-c17d8aa361f1" containerName="glance-httpd" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034678 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="e956453d-551f-44b4-8125-8656b3155402" containerName="ceilometer-central-agent" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034700 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="c30e8522-2d7f-4f10-a0b4-a7cfc351d093" containerName="ovn-controller" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034718 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="object-updater" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034733 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="30f2c894-7a7a-4e5a-a090-a28ab50c766a" containerName="neutron-httpd" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034748 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a19676c-9314-43a3-a2f8-bcf56d6b5ce3" containerName="ovs-vswitchd" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034767 4795 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="rsync" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034786 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2561f4e-0a01-4927-96f8-ee7bef69f561" containerName="cinder-api-log" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034801 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="793bbadc-8b53-4084-a63a-0b76b37284df" containerName="nova-api-log" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034822 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="e956453d-551f-44b4-8125-8656b3155402" containerName="sg-core" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034840 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab3f54ce-9f9e-46c5-9d36-f0b273e4b84d" containerName="rabbitmq" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034852 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="object-replicator" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034870 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b096325-542d-4ac6-8d16-8aa0937013b2" containerName="rabbitmq" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034885 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="c54f77a4-1095-4ff1-bc74-b845cde659d9" containerName="probe" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034903 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0f5adf9-b672-4dc7-8bee-a2e1c8de6cf9" containerName="glance-httpd" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034922 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="container-replicator" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034935 4795 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="e1e160dc-ca4c-45d8-ab73-5ddd1a7d2107" containerName="nova-metadata-metadata" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034950 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="object-auditor" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034966 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="30f2c894-7a7a-4e5a-a090-a28ab50c766a" containerName="neutron-api" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.034985 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="296f6b57-de45-495d-abe9-8c779c157057" containerName="kube-state-metrics" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.035004 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="c54f77a4-1095-4ff1-bc74-b845cde659d9" containerName="cinder-scheduler" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.035022 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="container-server" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.035035 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c773ec2-a400-42a9-8784-ed9c295c3bb4" containerName="swift-recon-cron" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.035050 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4a6a069-904a-4072-b98c-346f67f22def" containerName="barbican-api-log" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.035625 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="2164f9d1-1d8b-486b-beca-0d3a5172b302" containerName="mariadb-account-create-update" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.036823 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-crhfj" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.064110 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-crhfj"] Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.153588 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39607a76-451a-4cd6-806b-c14c6a94b5ae-catalog-content\") pod \"certified-operators-crhfj\" (UID: \"39607a76-451a-4cd6-806b-c14c6a94b5ae\") " pod="openshift-marketplace/certified-operators-crhfj" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.153720 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39607a76-451a-4cd6-806b-c14c6a94b5ae-utilities\") pod \"certified-operators-crhfj\" (UID: \"39607a76-451a-4cd6-806b-c14c6a94b5ae\") " pod="openshift-marketplace/certified-operators-crhfj" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.153750 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v68jd\" (UniqueName: \"kubernetes.io/projected/39607a76-451a-4cd6-806b-c14c6a94b5ae-kube-api-access-v68jd\") pod \"certified-operators-crhfj\" (UID: \"39607a76-451a-4cd6-806b-c14c6a94b5ae\") " pod="openshift-marketplace/certified-operators-crhfj" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.255080 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39607a76-451a-4cd6-806b-c14c6a94b5ae-utilities\") pod \"certified-operators-crhfj\" (UID: \"39607a76-451a-4cd6-806b-c14c6a94b5ae\") " pod="openshift-marketplace/certified-operators-crhfj" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.255574 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-v68jd\" (UniqueName: \"kubernetes.io/projected/39607a76-451a-4cd6-806b-c14c6a94b5ae-kube-api-access-v68jd\") pod \"certified-operators-crhfj\" (UID: \"39607a76-451a-4cd6-806b-c14c6a94b5ae\") " pod="openshift-marketplace/certified-operators-crhfj" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.255715 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39607a76-451a-4cd6-806b-c14c6a94b5ae-catalog-content\") pod \"certified-operators-crhfj\" (UID: \"39607a76-451a-4cd6-806b-c14c6a94b5ae\") " pod="openshift-marketplace/certified-operators-crhfj" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.255845 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39607a76-451a-4cd6-806b-c14c6a94b5ae-utilities\") pod \"certified-operators-crhfj\" (UID: \"39607a76-451a-4cd6-806b-c14c6a94b5ae\") " pod="openshift-marketplace/certified-operators-crhfj" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.256500 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39607a76-451a-4cd6-806b-c14c6a94b5ae-catalog-content\") pod \"certified-operators-crhfj\" (UID: \"39607a76-451a-4cd6-806b-c14c6a94b5ae\") " pod="openshift-marketplace/certified-operators-crhfj" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.288494 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v68jd\" (UniqueName: \"kubernetes.io/projected/39607a76-451a-4cd6-806b-c14c6a94b5ae-kube-api-access-v68jd\") pod \"certified-operators-crhfj\" (UID: \"39607a76-451a-4cd6-806b-c14c6a94b5ae\") " pod="openshift-marketplace/certified-operators-crhfj" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.372353 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-crhfj" Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.871907 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-crhfj"] Feb 19 21:53:43 crc kubenswrapper[4795]: I0219 21:53:43.966758 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-crhfj" event={"ID":"39607a76-451a-4cd6-806b-c14c6a94b5ae","Type":"ContainerStarted","Data":"29c905448e4c99f2240081e71d0b997093c5980cb0276302ef057ea183c10166"} Feb 19 21:53:44 crc kubenswrapper[4795]: I0219 21:53:44.975769 4795 generic.go:334] "Generic (PLEG): container finished" podID="39607a76-451a-4cd6-806b-c14c6a94b5ae" containerID="1ac1040e8221e1eae8c74ba52013471edf1256a7d756330020a1e0f282c811cb" exitCode=0 Feb 19 21:53:44 crc kubenswrapper[4795]: I0219 21:53:44.975855 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-crhfj" event={"ID":"39607a76-451a-4cd6-806b-c14c6a94b5ae","Type":"ContainerDied","Data":"1ac1040e8221e1eae8c74ba52013471edf1256a7d756330020a1e0f282c811cb"} Feb 19 21:53:44 crc kubenswrapper[4795]: I0219 21:53:44.978136 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 21:53:45 crc kubenswrapper[4795]: I0219 21:53:45.984362 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-crhfj" event={"ID":"39607a76-451a-4cd6-806b-c14c6a94b5ae","Type":"ContainerStarted","Data":"8be6ee752fd8728485a58c86ac7f0f5fc78541f658433bfe18cdea07b3e39e04"} Feb 19 21:53:46 crc kubenswrapper[4795]: I0219 21:53:46.997265 4795 generic.go:334] "Generic (PLEG): container finished" podID="39607a76-451a-4cd6-806b-c14c6a94b5ae" containerID="8be6ee752fd8728485a58c86ac7f0f5fc78541f658433bfe18cdea07b3e39e04" exitCode=0 Feb 19 21:53:46 crc kubenswrapper[4795]: I0219 21:53:46.997313 4795 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-crhfj" event={"ID":"39607a76-451a-4cd6-806b-c14c6a94b5ae","Type":"ContainerDied","Data":"8be6ee752fd8728485a58c86ac7f0f5fc78541f658433bfe18cdea07b3e39e04"} Feb 19 21:53:46 crc kubenswrapper[4795]: I0219 21:53:46.997345 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-crhfj" event={"ID":"39607a76-451a-4cd6-806b-c14c6a94b5ae","Type":"ContainerStarted","Data":"0765581b2730bc1923027a88691a1aecae8cf92bf227d9f2c680a8b520ad6841"} Feb 19 21:53:47 crc kubenswrapper[4795]: I0219 21:53:47.036627 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-crhfj" podStartSLOduration=2.606228411 podStartE2EDuration="4.036604796s" podCreationTimestamp="2026-02-19 21:53:43 +0000 UTC" firstStartedPulling="2026-02-19 21:53:44.977780752 +0000 UTC m=+1536.170298646" lastFinishedPulling="2026-02-19 21:53:46.408157137 +0000 UTC m=+1537.600675031" observedRunningTime="2026-02-19 21:53:47.03105751 +0000 UTC m=+1538.223575414" watchObservedRunningTime="2026-02-19 21:53:47.036604796 +0000 UTC m=+1538.229122670" Feb 19 21:53:53 crc kubenswrapper[4795]: I0219 21:53:53.373575 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-crhfj" Feb 19 21:53:53 crc kubenswrapper[4795]: I0219 21:53:53.374325 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-crhfj" Feb 19 21:53:53 crc kubenswrapper[4795]: I0219 21:53:53.426277 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-crhfj" Feb 19 21:53:54 crc kubenswrapper[4795]: I0219 21:53:54.113616 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-crhfj" Feb 19 21:53:54 crc kubenswrapper[4795]: I0219 21:53:54.182001 
4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-crhfj"] Feb 19 21:53:56 crc kubenswrapper[4795]: I0219 21:53:56.083076 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-crhfj" podUID="39607a76-451a-4cd6-806b-c14c6a94b5ae" containerName="registry-server" containerID="cri-o://0765581b2730bc1923027a88691a1aecae8cf92bf227d9f2c680a8b520ad6841" gracePeriod=2 Feb 19 21:53:56 crc kubenswrapper[4795]: I0219 21:53:56.560930 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-crhfj" Feb 19 21:53:56 crc kubenswrapper[4795]: I0219 21:53:56.661947 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v68jd\" (UniqueName: \"kubernetes.io/projected/39607a76-451a-4cd6-806b-c14c6a94b5ae-kube-api-access-v68jd\") pod \"39607a76-451a-4cd6-806b-c14c6a94b5ae\" (UID: \"39607a76-451a-4cd6-806b-c14c6a94b5ae\") " Feb 19 21:53:56 crc kubenswrapper[4795]: I0219 21:53:56.662004 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39607a76-451a-4cd6-806b-c14c6a94b5ae-utilities\") pod \"39607a76-451a-4cd6-806b-c14c6a94b5ae\" (UID: \"39607a76-451a-4cd6-806b-c14c6a94b5ae\") " Feb 19 21:53:56 crc kubenswrapper[4795]: I0219 21:53:56.662023 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39607a76-451a-4cd6-806b-c14c6a94b5ae-catalog-content\") pod \"39607a76-451a-4cd6-806b-c14c6a94b5ae\" (UID: \"39607a76-451a-4cd6-806b-c14c6a94b5ae\") " Feb 19 21:53:56 crc kubenswrapper[4795]: I0219 21:53:56.663225 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39607a76-451a-4cd6-806b-c14c6a94b5ae-utilities" (OuterVolumeSpecName: "utilities") pod 
"39607a76-451a-4cd6-806b-c14c6a94b5ae" (UID: "39607a76-451a-4cd6-806b-c14c6a94b5ae"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:53:56 crc kubenswrapper[4795]: I0219 21:53:56.678057 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39607a76-451a-4cd6-806b-c14c6a94b5ae-kube-api-access-v68jd" (OuterVolumeSpecName: "kube-api-access-v68jd") pod "39607a76-451a-4cd6-806b-c14c6a94b5ae" (UID: "39607a76-451a-4cd6-806b-c14c6a94b5ae"). InnerVolumeSpecName "kube-api-access-v68jd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 21:53:56 crc kubenswrapper[4795]: I0219 21:53:56.717287 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39607a76-451a-4cd6-806b-c14c6a94b5ae-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "39607a76-451a-4cd6-806b-c14c6a94b5ae" (UID: "39607a76-451a-4cd6-806b-c14c6a94b5ae"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 21:53:56 crc kubenswrapper[4795]: I0219 21:53:56.763741 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v68jd\" (UniqueName: \"kubernetes.io/projected/39607a76-451a-4cd6-806b-c14c6a94b5ae-kube-api-access-v68jd\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:56 crc kubenswrapper[4795]: I0219 21:53:56.763770 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39607a76-451a-4cd6-806b-c14c6a94b5ae-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:56 crc kubenswrapper[4795]: I0219 21:53:56.763780 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39607a76-451a-4cd6-806b-c14c6a94b5ae-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 21:53:57 crc kubenswrapper[4795]: I0219 21:53:57.093535 4795 generic.go:334] "Generic (PLEG): container finished" podID="39607a76-451a-4cd6-806b-c14c6a94b5ae" containerID="0765581b2730bc1923027a88691a1aecae8cf92bf227d9f2c680a8b520ad6841" exitCode=0 Feb 19 21:53:57 crc kubenswrapper[4795]: I0219 21:53:57.093584 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-crhfj" event={"ID":"39607a76-451a-4cd6-806b-c14c6a94b5ae","Type":"ContainerDied","Data":"0765581b2730bc1923027a88691a1aecae8cf92bf227d9f2c680a8b520ad6841"} Feb 19 21:53:57 crc kubenswrapper[4795]: I0219 21:53:57.093661 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-crhfj" event={"ID":"39607a76-451a-4cd6-806b-c14c6a94b5ae","Type":"ContainerDied","Data":"29c905448e4c99f2240081e71d0b997093c5980cb0276302ef057ea183c10166"} Feb 19 21:53:57 crc kubenswrapper[4795]: I0219 21:53:57.093692 4795 scope.go:117] "RemoveContainer" containerID="0765581b2730bc1923027a88691a1aecae8cf92bf227d9f2c680a8b520ad6841" Feb 19 21:53:57 crc kubenswrapper[4795]: I0219 
21:53:57.093612 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-crhfj" Feb 19 21:53:57 crc kubenswrapper[4795]: I0219 21:53:57.126712 4795 scope.go:117] "RemoveContainer" containerID="8be6ee752fd8728485a58c86ac7f0f5fc78541f658433bfe18cdea07b3e39e04" Feb 19 21:53:57 crc kubenswrapper[4795]: I0219 21:53:57.138658 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-crhfj"] Feb 19 21:53:57 crc kubenswrapper[4795]: I0219 21:53:57.145831 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-crhfj"] Feb 19 21:53:57 crc kubenswrapper[4795]: I0219 21:53:57.151859 4795 scope.go:117] "RemoveContainer" containerID="1ac1040e8221e1eae8c74ba52013471edf1256a7d756330020a1e0f282c811cb" Feb 19 21:53:57 crc kubenswrapper[4795]: I0219 21:53:57.176308 4795 scope.go:117] "RemoveContainer" containerID="0765581b2730bc1923027a88691a1aecae8cf92bf227d9f2c680a8b520ad6841" Feb 19 21:53:57 crc kubenswrapper[4795]: E0219 21:53:57.176922 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0765581b2730bc1923027a88691a1aecae8cf92bf227d9f2c680a8b520ad6841\": container with ID starting with 0765581b2730bc1923027a88691a1aecae8cf92bf227d9f2c680a8b520ad6841 not found: ID does not exist" containerID="0765581b2730bc1923027a88691a1aecae8cf92bf227d9f2c680a8b520ad6841" Feb 19 21:53:57 crc kubenswrapper[4795]: I0219 21:53:57.176972 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0765581b2730bc1923027a88691a1aecae8cf92bf227d9f2c680a8b520ad6841"} err="failed to get container status \"0765581b2730bc1923027a88691a1aecae8cf92bf227d9f2c680a8b520ad6841\": rpc error: code = NotFound desc = could not find container \"0765581b2730bc1923027a88691a1aecae8cf92bf227d9f2c680a8b520ad6841\": container with ID starting with 
0765581b2730bc1923027a88691a1aecae8cf92bf227d9f2c680a8b520ad6841 not found: ID does not exist" Feb 19 21:53:57 crc kubenswrapper[4795]: I0219 21:53:57.177002 4795 scope.go:117] "RemoveContainer" containerID="8be6ee752fd8728485a58c86ac7f0f5fc78541f658433bfe18cdea07b3e39e04" Feb 19 21:53:57 crc kubenswrapper[4795]: E0219 21:53:57.177302 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8be6ee752fd8728485a58c86ac7f0f5fc78541f658433bfe18cdea07b3e39e04\": container with ID starting with 8be6ee752fd8728485a58c86ac7f0f5fc78541f658433bfe18cdea07b3e39e04 not found: ID does not exist" containerID="8be6ee752fd8728485a58c86ac7f0f5fc78541f658433bfe18cdea07b3e39e04" Feb 19 21:53:57 crc kubenswrapper[4795]: I0219 21:53:57.177335 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8be6ee752fd8728485a58c86ac7f0f5fc78541f658433bfe18cdea07b3e39e04"} err="failed to get container status \"8be6ee752fd8728485a58c86ac7f0f5fc78541f658433bfe18cdea07b3e39e04\": rpc error: code = NotFound desc = could not find container \"8be6ee752fd8728485a58c86ac7f0f5fc78541f658433bfe18cdea07b3e39e04\": container with ID starting with 8be6ee752fd8728485a58c86ac7f0f5fc78541f658433bfe18cdea07b3e39e04 not found: ID does not exist" Feb 19 21:53:57 crc kubenswrapper[4795]: I0219 21:53:57.177353 4795 scope.go:117] "RemoveContainer" containerID="1ac1040e8221e1eae8c74ba52013471edf1256a7d756330020a1e0f282c811cb" Feb 19 21:53:57 crc kubenswrapper[4795]: E0219 21:53:57.177631 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ac1040e8221e1eae8c74ba52013471edf1256a7d756330020a1e0f282c811cb\": container with ID starting with 1ac1040e8221e1eae8c74ba52013471edf1256a7d756330020a1e0f282c811cb not found: ID does not exist" containerID="1ac1040e8221e1eae8c74ba52013471edf1256a7d756330020a1e0f282c811cb" Feb 19 21:53:57 crc 
kubenswrapper[4795]: I0219 21:53:57.177660 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ac1040e8221e1eae8c74ba52013471edf1256a7d756330020a1e0f282c811cb"} err="failed to get container status \"1ac1040e8221e1eae8c74ba52013471edf1256a7d756330020a1e0f282c811cb\": rpc error: code = NotFound desc = could not find container \"1ac1040e8221e1eae8c74ba52013471edf1256a7d756330020a1e0f282c811cb\": container with ID starting with 1ac1040e8221e1eae8c74ba52013471edf1256a7d756330020a1e0f282c811cb not found: ID does not exist" Feb 19 21:53:57 crc kubenswrapper[4795]: I0219 21:53:57.512565 4795 scope.go:117] "RemoveContainer" containerID="89a8591a1a6c5a44263990a6f7f07e1dd5fbf859d704de1710031664de85cae8" Feb 19 21:53:57 crc kubenswrapper[4795]: E0219 21:53:57.512946 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 21:53:57 crc kubenswrapper[4795]: I0219 21:53:57.522863 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39607a76-451a-4cd6-806b-c14c6a94b5ae" path="/var/lib/kubelet/pods/39607a76-451a-4cd6-806b-c14c6a94b5ae/volumes" Feb 19 21:54:09 crc kubenswrapper[4795]: I0219 21:54:09.515675 4795 scope.go:117] "RemoveContainer" containerID="89a8591a1a6c5a44263990a6f7f07e1dd5fbf859d704de1710031664de85cae8" Feb 19 21:54:09 crc kubenswrapper[4795]: E0219 21:54:09.516483 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 21:54:13 crc kubenswrapper[4795]: I0219 21:54:13.207932 4795 scope.go:117] "RemoveContainer" containerID="820a0240f9371635d4e5f03ad2ffcfa48ce070182eb86809479efd07b7626507" Feb 19 21:54:13 crc kubenswrapper[4795]: I0219 21:54:13.278278 4795 scope.go:117] "RemoveContainer" containerID="e324028f2d7a41155077f14e0d48b2d58c21ebdbf00e4a7e4bd8a8141187b6bd" Feb 19 21:54:13 crc kubenswrapper[4795]: I0219 21:54:13.303769 4795 scope.go:117] "RemoveContainer" containerID="c82c86becfd7208e347af805b27fa40c1a5698022b6509cd540c7832c74ea578" Feb 19 21:54:13 crc kubenswrapper[4795]: I0219 21:54:13.339956 4795 scope.go:117] "RemoveContainer" containerID="5c8a8280c91cc71f3e85d0b7e361ab76b49d5efc392b87ec1bb737d4336102fe" Feb 19 21:54:13 crc kubenswrapper[4795]: I0219 21:54:13.366787 4795 scope.go:117] "RemoveContainer" containerID="54158f46698b83516d68eee860a71369629e83ec491495039fcc2f5f6a5bd317" Feb 19 21:54:13 crc kubenswrapper[4795]: I0219 21:54:13.434872 4795 scope.go:117] "RemoveContainer" containerID="df968256162833d5078e440bc65555f6f0195c60776c40f5962f3cbd9e8c0552" Feb 19 21:54:13 crc kubenswrapper[4795]: I0219 21:54:13.464581 4795 scope.go:117] "RemoveContainer" containerID="a0f7c3f34c80b9a269fbc6418536c8bd187965bff2a033810a317f2452cc049c" Feb 19 21:54:13 crc kubenswrapper[4795]: I0219 21:54:13.496845 4795 scope.go:117] "RemoveContainer" containerID="a5072d3cb490beed9fbcf82cc94732aed7e308348a7a93463173543a97f32666" Feb 19 21:54:23 crc kubenswrapper[4795]: I0219 21:54:23.512628 4795 scope.go:117] "RemoveContainer" containerID="89a8591a1a6c5a44263990a6f7f07e1dd5fbf859d704de1710031664de85cae8" Feb 19 21:54:23 crc kubenswrapper[4795]: E0219 21:54:23.513707 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 21:54:36 crc kubenswrapper[4795]: I0219 21:54:36.511505 4795 scope.go:117] "RemoveContainer" containerID="89a8591a1a6c5a44263990a6f7f07e1dd5fbf859d704de1710031664de85cae8" Feb 19 21:54:36 crc kubenswrapper[4795]: E0219 21:54:36.512332 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 21:54:49 crc kubenswrapper[4795]: I0219 21:54:49.520835 4795 scope.go:117] "RemoveContainer" containerID="89a8591a1a6c5a44263990a6f7f07e1dd5fbf859d704de1710031664de85cae8" Feb 19 21:54:49 crc kubenswrapper[4795]: E0219 21:54:49.522144 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 21:55:03 crc kubenswrapper[4795]: I0219 21:55:03.512215 4795 scope.go:117] "RemoveContainer" containerID="89a8591a1a6c5a44263990a6f7f07e1dd5fbf859d704de1710031664de85cae8" Feb 19 21:55:03 crc kubenswrapper[4795]: E0219 21:55:03.513343 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 21:55:13 crc kubenswrapper[4795]: I0219 21:55:13.685742 4795 scope.go:117] "RemoveContainer" containerID="2f19263c7946bfa6317a4079eb988e89329f3723a790f4116428242859b8575d" Feb 19 21:55:13 crc kubenswrapper[4795]: I0219 21:55:13.730485 4795 scope.go:117] "RemoveContainer" containerID="971150e9373b5419fd5efc7fc3e78faafd9906f987cb7e0a9e042f04e22cb5a9" Feb 19 21:55:17 crc kubenswrapper[4795]: I0219 21:55:17.511124 4795 scope.go:117] "RemoveContainer" containerID="89a8591a1a6c5a44263990a6f7f07e1dd5fbf859d704de1710031664de85cae8" Feb 19 21:55:17 crc kubenswrapper[4795]: E0219 21:55:17.511717 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 21:55:29 crc kubenswrapper[4795]: I0219 21:55:29.521351 4795 scope.go:117] "RemoveContainer" containerID="89a8591a1a6c5a44263990a6f7f07e1dd5fbf859d704de1710031664de85cae8" Feb 19 21:55:29 crc kubenswrapper[4795]: E0219 21:55:29.522789 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 21:55:42 crc kubenswrapper[4795]: I0219 21:55:42.511799 4795 scope.go:117] "RemoveContainer" containerID="89a8591a1a6c5a44263990a6f7f07e1dd5fbf859d704de1710031664de85cae8" Feb 19 21:55:42 crc kubenswrapper[4795]: E0219 21:55:42.512801 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 21:55:54 crc kubenswrapper[4795]: I0219 21:55:54.511591 4795 scope.go:117] "RemoveContainer" containerID="89a8591a1a6c5a44263990a6f7f07e1dd5fbf859d704de1710031664de85cae8" Feb 19 21:55:54 crc kubenswrapper[4795]: E0219 21:55:54.512899 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 21:56:08 crc kubenswrapper[4795]: I0219 21:56:08.512071 4795 scope.go:117] "RemoveContainer" containerID="89a8591a1a6c5a44263990a6f7f07e1dd5fbf859d704de1710031664de85cae8" Feb 19 21:56:08 crc kubenswrapper[4795]: E0219 21:56:08.513141 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 21:56:23 crc kubenswrapper[4795]: I0219 21:56:23.514332 4795 scope.go:117] "RemoveContainer" containerID="89a8591a1a6c5a44263990a6f7f07e1dd5fbf859d704de1710031664de85cae8" Feb 19 21:56:23 crc kubenswrapper[4795]: E0219 21:56:23.515057 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 21:56:36 crc kubenswrapper[4795]: I0219 21:56:36.511334 4795 scope.go:117] "RemoveContainer" containerID="89a8591a1a6c5a44263990a6f7f07e1dd5fbf859d704de1710031664de85cae8" Feb 19 21:56:36 crc kubenswrapper[4795]: E0219 21:56:36.512500 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 21:56:47 crc kubenswrapper[4795]: I0219 21:56:47.511580 4795 scope.go:117] "RemoveContainer" containerID="89a8591a1a6c5a44263990a6f7f07e1dd5fbf859d704de1710031664de85cae8" Feb 19 21:56:47 crc kubenswrapper[4795]: E0219 21:56:47.512645 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 21:56:59 crc kubenswrapper[4795]: I0219 21:56:59.514958 4795 scope.go:117] "RemoveContainer" containerID="89a8591a1a6c5a44263990a6f7f07e1dd5fbf859d704de1710031664de85cae8" Feb 19 21:56:59 crc kubenswrapper[4795]: E0219 21:56:59.515681 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 21:57:14 crc kubenswrapper[4795]: I0219 21:57:14.511899 4795 scope.go:117] "RemoveContainer" containerID="89a8591a1a6c5a44263990a6f7f07e1dd5fbf859d704de1710031664de85cae8" Feb 19 21:57:14 crc kubenswrapper[4795]: E0219 21:57:14.512987 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 21:57:26 crc kubenswrapper[4795]: I0219 21:57:26.516192 4795 scope.go:117] "RemoveContainer" containerID="89a8591a1a6c5a44263990a6f7f07e1dd5fbf859d704de1710031664de85cae8" Feb 19 21:57:26 crc kubenswrapper[4795]: E0219 21:57:26.516961 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 21:57:37 crc kubenswrapper[4795]: I0219 21:57:37.512018 4795 scope.go:117] "RemoveContainer" containerID="89a8591a1a6c5a44263990a6f7f07e1dd5fbf859d704de1710031664de85cae8" Feb 19 21:57:37 crc kubenswrapper[4795]: E0219 21:57:37.512955 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 21:57:51 crc kubenswrapper[4795]: I0219 21:57:51.512607 4795 scope.go:117] "RemoveContainer" containerID="89a8591a1a6c5a44263990a6f7f07e1dd5fbf859d704de1710031664de85cae8" Feb 19 21:57:51 crc kubenswrapper[4795]: E0219 21:57:51.513931 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 21:58:05 crc kubenswrapper[4795]: I0219 21:58:05.512240 4795 scope.go:117] "RemoveContainer" containerID="89a8591a1a6c5a44263990a6f7f07e1dd5fbf859d704de1710031664de85cae8" Feb 19 21:58:05 crc kubenswrapper[4795]: E0219 21:58:05.513015 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 21:58:20 crc kubenswrapper[4795]: I0219 21:58:20.511523 4795 scope.go:117] "RemoveContainer" containerID="89a8591a1a6c5a44263990a6f7f07e1dd5fbf859d704de1710031664de85cae8" Feb 19 21:58:20 crc kubenswrapper[4795]: E0219 21:58:20.512376 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 21:58:31 crc kubenswrapper[4795]: I0219 21:58:31.513873 4795 scope.go:117] "RemoveContainer" containerID="89a8591a1a6c5a44263990a6f7f07e1dd5fbf859d704de1710031664de85cae8" Feb 19 21:58:32 crc kubenswrapper[4795]: I0219 21:58:32.420960 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerStarted","Data":"6c01aaf6e3d8e8204ba05a7b4dc2e4ab1c48baf63fa06a4eaffcb4f1cd336768"} Feb 19 21:59:54 crc kubenswrapper[4795]: I0219 21:59:54.068195 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xkgwx"] Feb 19 21:59:54 crc kubenswrapper[4795]: E0219 21:59:54.069049 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39607a76-451a-4cd6-806b-c14c6a94b5ae" containerName="registry-server" Feb 19 21:59:54 crc kubenswrapper[4795]: I0219 21:59:54.069063 4795 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="39607a76-451a-4cd6-806b-c14c6a94b5ae" containerName="registry-server" Feb 19 21:59:54 crc kubenswrapper[4795]: E0219 21:59:54.069078 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39607a76-451a-4cd6-806b-c14c6a94b5ae" containerName="extract-content" Feb 19 21:59:54 crc kubenswrapper[4795]: I0219 21:59:54.069083 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="39607a76-451a-4cd6-806b-c14c6a94b5ae" containerName="extract-content" Feb 19 21:59:54 crc kubenswrapper[4795]: E0219 21:59:54.069099 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39607a76-451a-4cd6-806b-c14c6a94b5ae" containerName="extract-utilities" Feb 19 21:59:54 crc kubenswrapper[4795]: I0219 21:59:54.069105 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="39607a76-451a-4cd6-806b-c14c6a94b5ae" containerName="extract-utilities" Feb 19 21:59:54 crc kubenswrapper[4795]: I0219 21:59:54.069261 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="39607a76-451a-4cd6-806b-c14c6a94b5ae" containerName="registry-server" Feb 19 21:59:54 crc kubenswrapper[4795]: I0219 21:59:54.070227 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xkgwx" Feb 19 21:59:54 crc kubenswrapper[4795]: I0219 21:59:54.093866 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xkgwx"] Feb 19 21:59:54 crc kubenswrapper[4795]: I0219 21:59:54.155005 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwx5l\" (UniqueName: \"kubernetes.io/projected/4814500d-f15e-4457-ad1b-24ae2f076b47-kube-api-access-gwx5l\") pod \"redhat-operators-xkgwx\" (UID: \"4814500d-f15e-4457-ad1b-24ae2f076b47\") " pod="openshift-marketplace/redhat-operators-xkgwx" Feb 19 21:59:54 crc kubenswrapper[4795]: I0219 21:59:54.155056 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4814500d-f15e-4457-ad1b-24ae2f076b47-catalog-content\") pod \"redhat-operators-xkgwx\" (UID: \"4814500d-f15e-4457-ad1b-24ae2f076b47\") " pod="openshift-marketplace/redhat-operators-xkgwx" Feb 19 21:59:54 crc kubenswrapper[4795]: I0219 21:59:54.155115 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4814500d-f15e-4457-ad1b-24ae2f076b47-utilities\") pod \"redhat-operators-xkgwx\" (UID: \"4814500d-f15e-4457-ad1b-24ae2f076b47\") " pod="openshift-marketplace/redhat-operators-xkgwx" Feb 19 21:59:54 crc kubenswrapper[4795]: I0219 21:59:54.256595 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwx5l\" (UniqueName: \"kubernetes.io/projected/4814500d-f15e-4457-ad1b-24ae2f076b47-kube-api-access-gwx5l\") pod \"redhat-operators-xkgwx\" (UID: \"4814500d-f15e-4457-ad1b-24ae2f076b47\") " pod="openshift-marketplace/redhat-operators-xkgwx" Feb 19 21:59:54 crc kubenswrapper[4795]: I0219 21:59:54.256647 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4814500d-f15e-4457-ad1b-24ae2f076b47-catalog-content\") pod \"redhat-operators-xkgwx\" (UID: \"4814500d-f15e-4457-ad1b-24ae2f076b47\") " pod="openshift-marketplace/redhat-operators-xkgwx" Feb 19 21:59:54 crc kubenswrapper[4795]: I0219 21:59:54.256704 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4814500d-f15e-4457-ad1b-24ae2f076b47-utilities\") pod \"redhat-operators-xkgwx\" (UID: \"4814500d-f15e-4457-ad1b-24ae2f076b47\") " pod="openshift-marketplace/redhat-operators-xkgwx" Feb 19 21:59:54 crc kubenswrapper[4795]: I0219 21:59:54.257496 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4814500d-f15e-4457-ad1b-24ae2f076b47-utilities\") pod \"redhat-operators-xkgwx\" (UID: \"4814500d-f15e-4457-ad1b-24ae2f076b47\") " pod="openshift-marketplace/redhat-operators-xkgwx" Feb 19 21:59:54 crc kubenswrapper[4795]: I0219 21:59:54.257646 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4814500d-f15e-4457-ad1b-24ae2f076b47-catalog-content\") pod \"redhat-operators-xkgwx\" (UID: \"4814500d-f15e-4457-ad1b-24ae2f076b47\") " pod="openshift-marketplace/redhat-operators-xkgwx" Feb 19 21:59:54 crc kubenswrapper[4795]: I0219 21:59:54.275227 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwx5l\" (UniqueName: \"kubernetes.io/projected/4814500d-f15e-4457-ad1b-24ae2f076b47-kube-api-access-gwx5l\") pod \"redhat-operators-xkgwx\" (UID: \"4814500d-f15e-4457-ad1b-24ae2f076b47\") " pod="openshift-marketplace/redhat-operators-xkgwx" Feb 19 21:59:54 crc kubenswrapper[4795]: I0219 21:59:54.389750 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xkgwx" Feb 19 21:59:54 crc kubenswrapper[4795]: I0219 21:59:54.872132 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xkgwx"] Feb 19 21:59:54 crc kubenswrapper[4795]: W0219 21:59:54.873600 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4814500d_f15e_4457_ad1b_24ae2f076b47.slice/crio-af917032b74b739b028f2d7dd43f2a3a18d1a2be6f96141ba995f6845233db95 WatchSource:0}: Error finding container af917032b74b739b028f2d7dd43f2a3a18d1a2be6f96141ba995f6845233db95: Status 404 returned error can't find the container with id af917032b74b739b028f2d7dd43f2a3a18d1a2be6f96141ba995f6845233db95 Feb 19 21:59:55 crc kubenswrapper[4795]: I0219 21:59:55.110181 4795 generic.go:334] "Generic (PLEG): container finished" podID="4814500d-f15e-4457-ad1b-24ae2f076b47" containerID="e5fb53341d7b3ce79711e4a20d06ba6ff9336ef38668b0ff7fb9994f9fff1772" exitCode=0 Feb 19 21:59:55 crc kubenswrapper[4795]: I0219 21:59:55.110241 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xkgwx" event={"ID":"4814500d-f15e-4457-ad1b-24ae2f076b47","Type":"ContainerDied","Data":"e5fb53341d7b3ce79711e4a20d06ba6ff9336ef38668b0ff7fb9994f9fff1772"} Feb 19 21:59:55 crc kubenswrapper[4795]: I0219 21:59:55.110460 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xkgwx" event={"ID":"4814500d-f15e-4457-ad1b-24ae2f076b47","Type":"ContainerStarted","Data":"af917032b74b739b028f2d7dd43f2a3a18d1a2be6f96141ba995f6845233db95"} Feb 19 21:59:55 crc kubenswrapper[4795]: I0219 21:59:55.112409 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 21:59:57 crc kubenswrapper[4795]: I0219 21:59:57.134084 4795 generic.go:334] "Generic (PLEG): container finished" 
podID="4814500d-f15e-4457-ad1b-24ae2f076b47" containerID="7c2c45c677e30510633a93f2ad571caf052ed8a68f47e0d48c10e86c9f5c5898" exitCode=0 Feb 19 21:59:57 crc kubenswrapper[4795]: I0219 21:59:57.134131 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xkgwx" event={"ID":"4814500d-f15e-4457-ad1b-24ae2f076b47","Type":"ContainerDied","Data":"7c2c45c677e30510633a93f2ad571caf052ed8a68f47e0d48c10e86c9f5c5898"} Feb 19 21:59:58 crc kubenswrapper[4795]: I0219 21:59:58.144728 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xkgwx" event={"ID":"4814500d-f15e-4457-ad1b-24ae2f076b47","Type":"ContainerStarted","Data":"e2352e213a74129ddfe1ba1875c8613d2ec7a14ae0ec2f089d5460d87f2543db"} Feb 19 21:59:58 crc kubenswrapper[4795]: I0219 21:59:58.171066 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xkgwx" podStartSLOduration=1.7317850890000002 podStartE2EDuration="4.171031281s" podCreationTimestamp="2026-02-19 21:59:54 +0000 UTC" firstStartedPulling="2026-02-19 21:59:55.112191625 +0000 UTC m=+1906.304709479" lastFinishedPulling="2026-02-19 21:59:57.551437797 +0000 UTC m=+1908.743955671" observedRunningTime="2026-02-19 21:59:58.16603065 +0000 UTC m=+1909.358548584" watchObservedRunningTime="2026-02-19 21:59:58.171031281 +0000 UTC m=+1909.363549155" Feb 19 22:00:00 crc kubenswrapper[4795]: I0219 22:00:00.146272 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525640-x6xrt"] Feb 19 22:00:00 crc kubenswrapper[4795]: I0219 22:00:00.147691 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525640-x6xrt" Feb 19 22:00:00 crc kubenswrapper[4795]: I0219 22:00:00.150099 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 19 22:00:00 crc kubenswrapper[4795]: I0219 22:00:00.152495 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 19 22:00:00 crc kubenswrapper[4795]: I0219 22:00:00.188470 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525640-x6xrt"] Feb 19 22:00:00 crc kubenswrapper[4795]: I0219 22:00:00.237966 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6bba469-9e7c-4517-bc8d-2d5a5308edef-config-volume\") pod \"collect-profiles-29525640-x6xrt\" (UID: \"b6bba469-9e7c-4517-bc8d-2d5a5308edef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525640-x6xrt" Feb 19 22:00:00 crc kubenswrapper[4795]: I0219 22:00:00.238073 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gv7tf\" (UniqueName: \"kubernetes.io/projected/b6bba469-9e7c-4517-bc8d-2d5a5308edef-kube-api-access-gv7tf\") pod \"collect-profiles-29525640-x6xrt\" (UID: \"b6bba469-9e7c-4517-bc8d-2d5a5308edef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525640-x6xrt" Feb 19 22:00:00 crc kubenswrapper[4795]: I0219 22:00:00.238105 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b6bba469-9e7c-4517-bc8d-2d5a5308edef-secret-volume\") pod \"collect-profiles-29525640-x6xrt\" (UID: \"b6bba469-9e7c-4517-bc8d-2d5a5308edef\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29525640-x6xrt" Feb 19 22:00:00 crc kubenswrapper[4795]: I0219 22:00:00.339607 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6bba469-9e7c-4517-bc8d-2d5a5308edef-config-volume\") pod \"collect-profiles-29525640-x6xrt\" (UID: \"b6bba469-9e7c-4517-bc8d-2d5a5308edef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525640-x6xrt" Feb 19 22:00:00 crc kubenswrapper[4795]: I0219 22:00:00.339758 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gv7tf\" (UniqueName: \"kubernetes.io/projected/b6bba469-9e7c-4517-bc8d-2d5a5308edef-kube-api-access-gv7tf\") pod \"collect-profiles-29525640-x6xrt\" (UID: \"b6bba469-9e7c-4517-bc8d-2d5a5308edef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525640-x6xrt" Feb 19 22:00:00 crc kubenswrapper[4795]: I0219 22:00:00.339808 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b6bba469-9e7c-4517-bc8d-2d5a5308edef-secret-volume\") pod \"collect-profiles-29525640-x6xrt\" (UID: \"b6bba469-9e7c-4517-bc8d-2d5a5308edef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525640-x6xrt" Feb 19 22:00:00 crc kubenswrapper[4795]: I0219 22:00:00.340848 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6bba469-9e7c-4517-bc8d-2d5a5308edef-config-volume\") pod \"collect-profiles-29525640-x6xrt\" (UID: \"b6bba469-9e7c-4517-bc8d-2d5a5308edef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525640-x6xrt" Feb 19 22:00:00 crc kubenswrapper[4795]: I0219 22:00:00.346463 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/b6bba469-9e7c-4517-bc8d-2d5a5308edef-secret-volume\") pod \"collect-profiles-29525640-x6xrt\" (UID: \"b6bba469-9e7c-4517-bc8d-2d5a5308edef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525640-x6xrt" Feb 19 22:00:00 crc kubenswrapper[4795]: I0219 22:00:00.357325 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gv7tf\" (UniqueName: \"kubernetes.io/projected/b6bba469-9e7c-4517-bc8d-2d5a5308edef-kube-api-access-gv7tf\") pod \"collect-profiles-29525640-x6xrt\" (UID: \"b6bba469-9e7c-4517-bc8d-2d5a5308edef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525640-x6xrt" Feb 19 22:00:00 crc kubenswrapper[4795]: I0219 22:00:00.471889 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525640-x6xrt" Feb 19 22:00:00 crc kubenswrapper[4795]: I0219 22:00:00.910332 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525640-x6xrt"] Feb 19 22:00:01 crc kubenswrapper[4795]: I0219 22:00:01.166142 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525640-x6xrt" event={"ID":"b6bba469-9e7c-4517-bc8d-2d5a5308edef","Type":"ContainerStarted","Data":"32bd4a96cb785a6511c7d0788e6684b8612ce255e3204d11b76077aa3fe75418"} Feb 19 22:00:01 crc kubenswrapper[4795]: I0219 22:00:01.178479 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525640-x6xrt" event={"ID":"b6bba469-9e7c-4517-bc8d-2d5a5308edef","Type":"ContainerStarted","Data":"ec59d844647e5cfdac2fe8c8c762677b385f33bbd1b22bc0652c9d827737d3fd"} Feb 19 22:00:01 crc kubenswrapper[4795]: I0219 22:00:01.202899 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29525640-x6xrt" 
podStartSLOduration=1.202879485 podStartE2EDuration="1.202879485s" podCreationTimestamp="2026-02-19 22:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:00:01.198028129 +0000 UTC m=+1912.390546003" watchObservedRunningTime="2026-02-19 22:00:01.202879485 +0000 UTC m=+1912.395397369" Feb 19 22:00:01 crc kubenswrapper[4795]: E0219 22:00:01.322261 4795 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6bba469_9e7c_4517_bc8d_2d5a5308edef.slice/crio-32bd4a96cb785a6511c7d0788e6684b8612ce255e3204d11b76077aa3fe75418.scope\": RecentStats: unable to find data in memory cache]" Feb 19 22:00:02 crc kubenswrapper[4795]: I0219 22:00:02.175966 4795 generic.go:334] "Generic (PLEG): container finished" podID="b6bba469-9e7c-4517-bc8d-2d5a5308edef" containerID="32bd4a96cb785a6511c7d0788e6684b8612ce255e3204d11b76077aa3fe75418" exitCode=0 Feb 19 22:00:02 crc kubenswrapper[4795]: I0219 22:00:02.176026 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525640-x6xrt" event={"ID":"b6bba469-9e7c-4517-bc8d-2d5a5308edef","Type":"ContainerDied","Data":"32bd4a96cb785a6511c7d0788e6684b8612ce255e3204d11b76077aa3fe75418"} Feb 19 22:00:03 crc kubenswrapper[4795]: I0219 22:00:03.445215 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525640-x6xrt" Feb 19 22:00:03 crc kubenswrapper[4795]: I0219 22:00:03.583835 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gv7tf\" (UniqueName: \"kubernetes.io/projected/b6bba469-9e7c-4517-bc8d-2d5a5308edef-kube-api-access-gv7tf\") pod \"b6bba469-9e7c-4517-bc8d-2d5a5308edef\" (UID: \"b6bba469-9e7c-4517-bc8d-2d5a5308edef\") " Feb 19 22:00:03 crc kubenswrapper[4795]: I0219 22:00:03.584008 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b6bba469-9e7c-4517-bc8d-2d5a5308edef-secret-volume\") pod \"b6bba469-9e7c-4517-bc8d-2d5a5308edef\" (UID: \"b6bba469-9e7c-4517-bc8d-2d5a5308edef\") " Feb 19 22:00:03 crc kubenswrapper[4795]: I0219 22:00:03.584716 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6bba469-9e7c-4517-bc8d-2d5a5308edef-config-volume\") pod \"b6bba469-9e7c-4517-bc8d-2d5a5308edef\" (UID: \"b6bba469-9e7c-4517-bc8d-2d5a5308edef\") " Feb 19 22:00:03 crc kubenswrapper[4795]: I0219 22:00:03.585321 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6bba469-9e7c-4517-bc8d-2d5a5308edef-config-volume" (OuterVolumeSpecName: "config-volume") pod "b6bba469-9e7c-4517-bc8d-2d5a5308edef" (UID: "b6bba469-9e7c-4517-bc8d-2d5a5308edef"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:00:03 crc kubenswrapper[4795]: I0219 22:00:03.588995 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6bba469-9e7c-4517-bc8d-2d5a5308edef-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b6bba469-9e7c-4517-bc8d-2d5a5308edef" (UID: "b6bba469-9e7c-4517-bc8d-2d5a5308edef"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:00:03 crc kubenswrapper[4795]: I0219 22:00:03.589472 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6bba469-9e7c-4517-bc8d-2d5a5308edef-kube-api-access-gv7tf" (OuterVolumeSpecName: "kube-api-access-gv7tf") pod "b6bba469-9e7c-4517-bc8d-2d5a5308edef" (UID: "b6bba469-9e7c-4517-bc8d-2d5a5308edef"). InnerVolumeSpecName "kube-api-access-gv7tf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:00:03 crc kubenswrapper[4795]: I0219 22:00:03.686408 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gv7tf\" (UniqueName: \"kubernetes.io/projected/b6bba469-9e7c-4517-bc8d-2d5a5308edef-kube-api-access-gv7tf\") on node \"crc\" DevicePath \"\"" Feb 19 22:00:03 crc kubenswrapper[4795]: I0219 22:00:03.686459 4795 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b6bba469-9e7c-4517-bc8d-2d5a5308edef-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 19 22:00:03 crc kubenswrapper[4795]: I0219 22:00:03.686474 4795 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6bba469-9e7c-4517-bc8d-2d5a5308edef-config-volume\") on node \"crc\" DevicePath \"\"" Feb 19 22:00:04 crc kubenswrapper[4795]: I0219 22:00:04.191272 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525640-x6xrt" event={"ID":"b6bba469-9e7c-4517-bc8d-2d5a5308edef","Type":"ContainerDied","Data":"ec59d844647e5cfdac2fe8c8c762677b385f33bbd1b22bc0652c9d827737d3fd"} Feb 19 22:00:04 crc kubenswrapper[4795]: I0219 22:00:04.191328 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec59d844647e5cfdac2fe8c8c762677b385f33bbd1b22bc0652c9d827737d3fd" Feb 19 22:00:04 crc kubenswrapper[4795]: I0219 22:00:04.191626 4795 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525640-x6xrt" Feb 19 22:00:04 crc kubenswrapper[4795]: I0219 22:00:04.390897 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xkgwx" Feb 19 22:00:04 crc kubenswrapper[4795]: I0219 22:00:04.390955 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xkgwx" Feb 19 22:00:04 crc kubenswrapper[4795]: I0219 22:00:04.435143 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xkgwx" Feb 19 22:00:05 crc kubenswrapper[4795]: I0219 22:00:05.246671 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xkgwx" Feb 19 22:00:06 crc kubenswrapper[4795]: I0219 22:00:06.236026 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xkgwx"] Feb 19 22:00:07 crc kubenswrapper[4795]: I0219 22:00:07.215078 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xkgwx" podUID="4814500d-f15e-4457-ad1b-24ae2f076b47" containerName="registry-server" containerID="cri-o://e2352e213a74129ddfe1ba1875c8613d2ec7a14ae0ec2f089d5460d87f2543db" gracePeriod=2 Feb 19 22:00:09 crc kubenswrapper[4795]: I0219 22:00:09.238364 4795 generic.go:334] "Generic (PLEG): container finished" podID="4814500d-f15e-4457-ad1b-24ae2f076b47" containerID="e2352e213a74129ddfe1ba1875c8613d2ec7a14ae0ec2f089d5460d87f2543db" exitCode=0 Feb 19 22:00:09 crc kubenswrapper[4795]: I0219 22:00:09.238450 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xkgwx" event={"ID":"4814500d-f15e-4457-ad1b-24ae2f076b47","Type":"ContainerDied","Data":"e2352e213a74129ddfe1ba1875c8613d2ec7a14ae0ec2f089d5460d87f2543db"} Feb 19 22:00:09 crc 
kubenswrapper[4795]: I0219 22:00:09.504327 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xkgwx" Feb 19 22:00:09 crc kubenswrapper[4795]: I0219 22:00:09.583030 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwx5l\" (UniqueName: \"kubernetes.io/projected/4814500d-f15e-4457-ad1b-24ae2f076b47-kube-api-access-gwx5l\") pod \"4814500d-f15e-4457-ad1b-24ae2f076b47\" (UID: \"4814500d-f15e-4457-ad1b-24ae2f076b47\") " Feb 19 22:00:09 crc kubenswrapper[4795]: I0219 22:00:09.583135 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4814500d-f15e-4457-ad1b-24ae2f076b47-catalog-content\") pod \"4814500d-f15e-4457-ad1b-24ae2f076b47\" (UID: \"4814500d-f15e-4457-ad1b-24ae2f076b47\") " Feb 19 22:00:09 crc kubenswrapper[4795]: I0219 22:00:09.583286 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4814500d-f15e-4457-ad1b-24ae2f076b47-utilities\") pod \"4814500d-f15e-4457-ad1b-24ae2f076b47\" (UID: \"4814500d-f15e-4457-ad1b-24ae2f076b47\") " Feb 19 22:00:09 crc kubenswrapper[4795]: I0219 22:00:09.585267 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4814500d-f15e-4457-ad1b-24ae2f076b47-utilities" (OuterVolumeSpecName: "utilities") pod "4814500d-f15e-4457-ad1b-24ae2f076b47" (UID: "4814500d-f15e-4457-ad1b-24ae2f076b47"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:00:09 crc kubenswrapper[4795]: I0219 22:00:09.595467 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4814500d-f15e-4457-ad1b-24ae2f076b47-kube-api-access-gwx5l" (OuterVolumeSpecName: "kube-api-access-gwx5l") pod "4814500d-f15e-4457-ad1b-24ae2f076b47" (UID: "4814500d-f15e-4457-ad1b-24ae2f076b47"). InnerVolumeSpecName "kube-api-access-gwx5l". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:00:09 crc kubenswrapper[4795]: I0219 22:00:09.687428 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4814500d-f15e-4457-ad1b-24ae2f076b47-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 22:00:09 crc kubenswrapper[4795]: I0219 22:00:09.687490 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwx5l\" (UniqueName: \"kubernetes.io/projected/4814500d-f15e-4457-ad1b-24ae2f076b47-kube-api-access-gwx5l\") on node \"crc\" DevicePath \"\"" Feb 19 22:00:09 crc kubenswrapper[4795]: I0219 22:00:09.708511 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4814500d-f15e-4457-ad1b-24ae2f076b47-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4814500d-f15e-4457-ad1b-24ae2f076b47" (UID: "4814500d-f15e-4457-ad1b-24ae2f076b47"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:00:09 crc kubenswrapper[4795]: I0219 22:00:09.788666 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4814500d-f15e-4457-ad1b-24ae2f076b47-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 22:00:10 crc kubenswrapper[4795]: I0219 22:00:10.247867 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xkgwx" event={"ID":"4814500d-f15e-4457-ad1b-24ae2f076b47","Type":"ContainerDied","Data":"af917032b74b739b028f2d7dd43f2a3a18d1a2be6f96141ba995f6845233db95"} Feb 19 22:00:10 crc kubenswrapper[4795]: I0219 22:00:10.247910 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xkgwx" Feb 19 22:00:10 crc kubenswrapper[4795]: I0219 22:00:10.247923 4795 scope.go:117] "RemoveContainer" containerID="e2352e213a74129ddfe1ba1875c8613d2ec7a14ae0ec2f089d5460d87f2543db" Feb 19 22:00:10 crc kubenswrapper[4795]: I0219 22:00:10.287944 4795 scope.go:117] "RemoveContainer" containerID="7c2c45c677e30510633a93f2ad571caf052ed8a68f47e0d48c10e86c9f5c5898" Feb 19 22:00:10 crc kubenswrapper[4795]: I0219 22:00:10.292826 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xkgwx"] Feb 19 22:00:10 crc kubenswrapper[4795]: I0219 22:00:10.303619 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xkgwx"] Feb 19 22:00:10 crc kubenswrapper[4795]: I0219 22:00:10.313127 4795 scope.go:117] "RemoveContainer" containerID="e5fb53341d7b3ce79711e4a20d06ba6ff9336ef38668b0ff7fb9994f9fff1772" Feb 19 22:00:11 crc kubenswrapper[4795]: I0219 22:00:11.519151 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4814500d-f15e-4457-ad1b-24ae2f076b47" path="/var/lib/kubelet/pods/4814500d-f15e-4457-ad1b-24ae2f076b47/volumes" Feb 19 22:00:31 crc 
kubenswrapper[4795]: I0219 22:00:31.140361 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nth7v"] Feb 19 22:00:31 crc kubenswrapper[4795]: E0219 22:00:31.141247 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4814500d-f15e-4457-ad1b-24ae2f076b47" containerName="registry-server" Feb 19 22:00:31 crc kubenswrapper[4795]: I0219 22:00:31.141264 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="4814500d-f15e-4457-ad1b-24ae2f076b47" containerName="registry-server" Feb 19 22:00:31 crc kubenswrapper[4795]: E0219 22:00:31.141277 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4814500d-f15e-4457-ad1b-24ae2f076b47" containerName="extract-content" Feb 19 22:00:31 crc kubenswrapper[4795]: I0219 22:00:31.141284 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="4814500d-f15e-4457-ad1b-24ae2f076b47" containerName="extract-content" Feb 19 22:00:31 crc kubenswrapper[4795]: E0219 22:00:31.141295 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4814500d-f15e-4457-ad1b-24ae2f076b47" containerName="extract-utilities" Feb 19 22:00:31 crc kubenswrapper[4795]: I0219 22:00:31.141305 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="4814500d-f15e-4457-ad1b-24ae2f076b47" containerName="extract-utilities" Feb 19 22:00:31 crc kubenswrapper[4795]: E0219 22:00:31.141315 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6bba469-9e7c-4517-bc8d-2d5a5308edef" containerName="collect-profiles" Feb 19 22:00:31 crc kubenswrapper[4795]: I0219 22:00:31.141322 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6bba469-9e7c-4517-bc8d-2d5a5308edef" containerName="collect-profiles" Feb 19 22:00:31 crc kubenswrapper[4795]: I0219 22:00:31.141498 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="4814500d-f15e-4457-ad1b-24ae2f076b47" containerName="registry-server" Feb 19 22:00:31 crc kubenswrapper[4795]: I0219 
22:00:31.141519 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6bba469-9e7c-4517-bc8d-2d5a5308edef" containerName="collect-profiles" Feb 19 22:00:31 crc kubenswrapper[4795]: I0219 22:00:31.142840 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nth7v" Feb 19 22:00:31 crc kubenswrapper[4795]: I0219 22:00:31.147053 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nth7v"] Feb 19 22:00:31 crc kubenswrapper[4795]: I0219 22:00:31.293506 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lt6rq\" (UniqueName: \"kubernetes.io/projected/e21c86a5-45d5-4a66-ab91-2c5f63ed9560-kube-api-access-lt6rq\") pod \"community-operators-nth7v\" (UID: \"e21c86a5-45d5-4a66-ab91-2c5f63ed9560\") " pod="openshift-marketplace/community-operators-nth7v" Feb 19 22:00:31 crc kubenswrapper[4795]: I0219 22:00:31.293848 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e21c86a5-45d5-4a66-ab91-2c5f63ed9560-catalog-content\") pod \"community-operators-nth7v\" (UID: \"e21c86a5-45d5-4a66-ab91-2c5f63ed9560\") " pod="openshift-marketplace/community-operators-nth7v" Feb 19 22:00:31 crc kubenswrapper[4795]: I0219 22:00:31.293958 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e21c86a5-45d5-4a66-ab91-2c5f63ed9560-utilities\") pod \"community-operators-nth7v\" (UID: \"e21c86a5-45d5-4a66-ab91-2c5f63ed9560\") " pod="openshift-marketplace/community-operators-nth7v" Feb 19 22:00:31 crc kubenswrapper[4795]: I0219 22:00:31.394941 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lt6rq\" (UniqueName: 
\"kubernetes.io/projected/e21c86a5-45d5-4a66-ab91-2c5f63ed9560-kube-api-access-lt6rq\") pod \"community-operators-nth7v\" (UID: \"e21c86a5-45d5-4a66-ab91-2c5f63ed9560\") " pod="openshift-marketplace/community-operators-nth7v" Feb 19 22:00:31 crc kubenswrapper[4795]: I0219 22:00:31.395010 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e21c86a5-45d5-4a66-ab91-2c5f63ed9560-catalog-content\") pod \"community-operators-nth7v\" (UID: \"e21c86a5-45d5-4a66-ab91-2c5f63ed9560\") " pod="openshift-marketplace/community-operators-nth7v" Feb 19 22:00:31 crc kubenswrapper[4795]: I0219 22:00:31.395039 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e21c86a5-45d5-4a66-ab91-2c5f63ed9560-utilities\") pod \"community-operators-nth7v\" (UID: \"e21c86a5-45d5-4a66-ab91-2c5f63ed9560\") " pod="openshift-marketplace/community-operators-nth7v" Feb 19 22:00:31 crc kubenswrapper[4795]: I0219 22:00:31.395592 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e21c86a5-45d5-4a66-ab91-2c5f63ed9560-catalog-content\") pod \"community-operators-nth7v\" (UID: \"e21c86a5-45d5-4a66-ab91-2c5f63ed9560\") " pod="openshift-marketplace/community-operators-nth7v" Feb 19 22:00:31 crc kubenswrapper[4795]: I0219 22:00:31.395643 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e21c86a5-45d5-4a66-ab91-2c5f63ed9560-utilities\") pod \"community-operators-nth7v\" (UID: \"e21c86a5-45d5-4a66-ab91-2c5f63ed9560\") " pod="openshift-marketplace/community-operators-nth7v" Feb 19 22:00:31 crc kubenswrapper[4795]: I0219 22:00:31.417459 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lt6rq\" (UniqueName: 
\"kubernetes.io/projected/e21c86a5-45d5-4a66-ab91-2c5f63ed9560-kube-api-access-lt6rq\") pod \"community-operators-nth7v\" (UID: \"e21c86a5-45d5-4a66-ab91-2c5f63ed9560\") " pod="openshift-marketplace/community-operators-nth7v" Feb 19 22:00:31 crc kubenswrapper[4795]: I0219 22:00:31.465551 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nth7v" Feb 19 22:00:31 crc kubenswrapper[4795]: I0219 22:00:31.928961 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-lgk8z"] Feb 19 22:00:31 crc kubenswrapper[4795]: I0219 22:00:31.930606 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lgk8z" Feb 19 22:00:31 crc kubenswrapper[4795]: I0219 22:00:31.959901 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lgk8z"] Feb 19 22:00:32 crc kubenswrapper[4795]: I0219 22:00:32.004457 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttm5g\" (UniqueName: \"kubernetes.io/projected/81fba4b2-8284-4218-848a-d969914c88d4-kube-api-access-ttm5g\") pod \"redhat-marketplace-lgk8z\" (UID: \"81fba4b2-8284-4218-848a-d969914c88d4\") " pod="openshift-marketplace/redhat-marketplace-lgk8z" Feb 19 22:00:32 crc kubenswrapper[4795]: I0219 22:00:32.004560 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81fba4b2-8284-4218-848a-d969914c88d4-catalog-content\") pod \"redhat-marketplace-lgk8z\" (UID: \"81fba4b2-8284-4218-848a-d969914c88d4\") " pod="openshift-marketplace/redhat-marketplace-lgk8z" Feb 19 22:00:32 crc kubenswrapper[4795]: I0219 22:00:32.004587 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/81fba4b2-8284-4218-848a-d969914c88d4-utilities\") pod \"redhat-marketplace-lgk8z\" (UID: \"81fba4b2-8284-4218-848a-d969914c88d4\") " pod="openshift-marketplace/redhat-marketplace-lgk8z" Feb 19 22:00:32 crc kubenswrapper[4795]: I0219 22:00:32.004669 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nth7v"] Feb 19 22:00:32 crc kubenswrapper[4795]: I0219 22:00:32.105426 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81fba4b2-8284-4218-848a-d969914c88d4-catalog-content\") pod \"redhat-marketplace-lgk8z\" (UID: \"81fba4b2-8284-4218-848a-d969914c88d4\") " pod="openshift-marketplace/redhat-marketplace-lgk8z" Feb 19 22:00:32 crc kubenswrapper[4795]: I0219 22:00:32.105466 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81fba4b2-8284-4218-848a-d969914c88d4-utilities\") pod \"redhat-marketplace-lgk8z\" (UID: \"81fba4b2-8284-4218-848a-d969914c88d4\") " pod="openshift-marketplace/redhat-marketplace-lgk8z" Feb 19 22:00:32 crc kubenswrapper[4795]: I0219 22:00:32.105526 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttm5g\" (UniqueName: \"kubernetes.io/projected/81fba4b2-8284-4218-848a-d969914c88d4-kube-api-access-ttm5g\") pod \"redhat-marketplace-lgk8z\" (UID: \"81fba4b2-8284-4218-848a-d969914c88d4\") " pod="openshift-marketplace/redhat-marketplace-lgk8z" Feb 19 22:00:32 crc kubenswrapper[4795]: I0219 22:00:32.105948 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81fba4b2-8284-4218-848a-d969914c88d4-catalog-content\") pod \"redhat-marketplace-lgk8z\" (UID: \"81fba4b2-8284-4218-848a-d969914c88d4\") " pod="openshift-marketplace/redhat-marketplace-lgk8z" Feb 19 22:00:32 crc kubenswrapper[4795]: 
I0219 22:00:32.106063 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81fba4b2-8284-4218-848a-d969914c88d4-utilities\") pod \"redhat-marketplace-lgk8z\" (UID: \"81fba4b2-8284-4218-848a-d969914c88d4\") " pod="openshift-marketplace/redhat-marketplace-lgk8z" Feb 19 22:00:32 crc kubenswrapper[4795]: I0219 22:00:32.125390 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttm5g\" (UniqueName: \"kubernetes.io/projected/81fba4b2-8284-4218-848a-d969914c88d4-kube-api-access-ttm5g\") pod \"redhat-marketplace-lgk8z\" (UID: \"81fba4b2-8284-4218-848a-d969914c88d4\") " pod="openshift-marketplace/redhat-marketplace-lgk8z" Feb 19 22:00:32 crc kubenswrapper[4795]: I0219 22:00:32.280746 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lgk8z" Feb 19 22:00:32 crc kubenswrapper[4795]: I0219 22:00:32.530601 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lgk8z"] Feb 19 22:00:32 crc kubenswrapper[4795]: W0219 22:00:32.534409 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81fba4b2_8284_4218_848a_d969914c88d4.slice/crio-c6acdb6aba45ef8ff16e31b1bbf6b17397cea11d384ef65816c436f0d3dba5b7 WatchSource:0}: Error finding container c6acdb6aba45ef8ff16e31b1bbf6b17397cea11d384ef65816c436f0d3dba5b7: Status 404 returned error can't find the container with id c6acdb6aba45ef8ff16e31b1bbf6b17397cea11d384ef65816c436f0d3dba5b7 Feb 19 22:00:32 crc kubenswrapper[4795]: I0219 22:00:32.607040 4795 generic.go:334] "Generic (PLEG): container finished" podID="e21c86a5-45d5-4a66-ab91-2c5f63ed9560" containerID="1eb7b71306343f2aaf19d2656ab650c032b78bd6319b8c34e6b6bb82e121576d" exitCode=0 Feb 19 22:00:32 crc kubenswrapper[4795]: I0219 22:00:32.607096 4795 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-nth7v" event={"ID":"e21c86a5-45d5-4a66-ab91-2c5f63ed9560","Type":"ContainerDied","Data":"1eb7b71306343f2aaf19d2656ab650c032b78bd6319b8c34e6b6bb82e121576d"} Feb 19 22:00:32 crc kubenswrapper[4795]: I0219 22:00:32.607408 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nth7v" event={"ID":"e21c86a5-45d5-4a66-ab91-2c5f63ed9560","Type":"ContainerStarted","Data":"67faa7fa182c32f61a7ee9cd36ad3cb7441940baaa79652676306e2045363915"} Feb 19 22:00:32 crc kubenswrapper[4795]: I0219 22:00:32.609983 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lgk8z" event={"ID":"81fba4b2-8284-4218-848a-d969914c88d4","Type":"ContainerStarted","Data":"c6acdb6aba45ef8ff16e31b1bbf6b17397cea11d384ef65816c436f0d3dba5b7"} Feb 19 22:00:33 crc kubenswrapper[4795]: I0219 22:00:33.618581 4795 generic.go:334] "Generic (PLEG): container finished" podID="e21c86a5-45d5-4a66-ab91-2c5f63ed9560" containerID="0d72bd7cfa5eaabd5dc9ed776be344a38d94fc2622735fd266c9e88dd232bb60" exitCode=0 Feb 19 22:00:33 crc kubenswrapper[4795]: I0219 22:00:33.618663 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nth7v" event={"ID":"e21c86a5-45d5-4a66-ab91-2c5f63ed9560","Type":"ContainerDied","Data":"0d72bd7cfa5eaabd5dc9ed776be344a38d94fc2622735fd266c9e88dd232bb60"} Feb 19 22:00:33 crc kubenswrapper[4795]: I0219 22:00:33.620159 4795 generic.go:334] "Generic (PLEG): container finished" podID="81fba4b2-8284-4218-848a-d969914c88d4" containerID="d2312f72c72ef688566d86b7fdb387a6daba57278a8dbfb214017e427ccccef3" exitCode=0 Feb 19 22:00:33 crc kubenswrapper[4795]: I0219 22:00:33.620248 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lgk8z" 
event={"ID":"81fba4b2-8284-4218-848a-d969914c88d4","Type":"ContainerDied","Data":"d2312f72c72ef688566d86b7fdb387a6daba57278a8dbfb214017e427ccccef3"} Feb 19 22:00:34 crc kubenswrapper[4795]: I0219 22:00:34.628624 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nth7v" event={"ID":"e21c86a5-45d5-4a66-ab91-2c5f63ed9560","Type":"ContainerStarted","Data":"5e21c4ae1f94887d1d9db8a6574b2bc29b068ff8a0821dd49fcbb44217347837"} Feb 19 22:00:34 crc kubenswrapper[4795]: I0219 22:00:34.630117 4795 generic.go:334] "Generic (PLEG): container finished" podID="81fba4b2-8284-4218-848a-d969914c88d4" containerID="21993cb96709bea6b105a0765c593584826bed846b84570a65a933931c1dcaa0" exitCode=0 Feb 19 22:00:34 crc kubenswrapper[4795]: I0219 22:00:34.630178 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lgk8z" event={"ID":"81fba4b2-8284-4218-848a-d969914c88d4","Type":"ContainerDied","Data":"21993cb96709bea6b105a0765c593584826bed846b84570a65a933931c1dcaa0"} Feb 19 22:00:34 crc kubenswrapper[4795]: I0219 22:00:34.663806 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nth7v" podStartSLOduration=2.166059846 podStartE2EDuration="3.663789099s" podCreationTimestamp="2026-02-19 22:00:31 +0000 UTC" firstStartedPulling="2026-02-19 22:00:32.608866156 +0000 UTC m=+1943.801384020" lastFinishedPulling="2026-02-19 22:00:34.106595389 +0000 UTC m=+1945.299113273" observedRunningTime="2026-02-19 22:00:34.655608903 +0000 UTC m=+1945.848126767" watchObservedRunningTime="2026-02-19 22:00:34.663789099 +0000 UTC m=+1945.856306963" Feb 19 22:00:35 crc kubenswrapper[4795]: I0219 22:00:35.638150 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lgk8z" 
event={"ID":"81fba4b2-8284-4218-848a-d969914c88d4","Type":"ContainerStarted","Data":"2844950df5dd099841ece43f4f4f6bf715e39808291749840a5ed01ccbf8f97d"} Feb 19 22:00:35 crc kubenswrapper[4795]: I0219 22:00:35.656940 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lgk8z" podStartSLOduration=3.263142377 podStartE2EDuration="4.656923198s" podCreationTimestamp="2026-02-19 22:00:31 +0000 UTC" firstStartedPulling="2026-02-19 22:00:33.621426771 +0000 UTC m=+1944.813944635" lastFinishedPulling="2026-02-19 22:00:35.015207472 +0000 UTC m=+1946.207725456" observedRunningTime="2026-02-19 22:00:35.653478783 +0000 UTC m=+1946.845996647" watchObservedRunningTime="2026-02-19 22:00:35.656923198 +0000 UTC m=+1946.849441062" Feb 19 22:00:41 crc kubenswrapper[4795]: I0219 22:00:41.465860 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nth7v" Feb 19 22:00:41 crc kubenswrapper[4795]: I0219 22:00:41.466342 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nth7v" Feb 19 22:00:41 crc kubenswrapper[4795]: I0219 22:00:41.520953 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nth7v" Feb 19 22:00:41 crc kubenswrapper[4795]: I0219 22:00:41.713414 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nth7v" Feb 19 22:00:41 crc kubenswrapper[4795]: I0219 22:00:41.755933 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nth7v"] Feb 19 22:00:42 crc kubenswrapper[4795]: I0219 22:00:42.280995 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lgk8z" Feb 19 22:00:42 crc kubenswrapper[4795]: I0219 22:00:42.281352 4795 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-lgk8z" Feb 19 22:00:42 crc kubenswrapper[4795]: I0219 22:00:42.330537 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-lgk8z" Feb 19 22:00:42 crc kubenswrapper[4795]: I0219 22:00:42.717874 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-lgk8z" Feb 19 22:00:43 crc kubenswrapper[4795]: I0219 22:00:43.681446 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nth7v" podUID="e21c86a5-45d5-4a66-ab91-2c5f63ed9560" containerName="registry-server" containerID="cri-o://5e21c4ae1f94887d1d9db8a6574b2bc29b068ff8a0821dd49fcbb44217347837" gracePeriod=2 Feb 19 22:00:44 crc kubenswrapper[4795]: I0219 22:00:44.142240 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nth7v" Feb 19 22:00:44 crc kubenswrapper[4795]: I0219 22:00:44.152241 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lgk8z"] Feb 19 22:00:44 crc kubenswrapper[4795]: I0219 22:00:44.313508 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e21c86a5-45d5-4a66-ab91-2c5f63ed9560-catalog-content\") pod \"e21c86a5-45d5-4a66-ab91-2c5f63ed9560\" (UID: \"e21c86a5-45d5-4a66-ab91-2c5f63ed9560\") " Feb 19 22:00:44 crc kubenswrapper[4795]: I0219 22:00:44.313611 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lt6rq\" (UniqueName: \"kubernetes.io/projected/e21c86a5-45d5-4a66-ab91-2c5f63ed9560-kube-api-access-lt6rq\") pod \"e21c86a5-45d5-4a66-ab91-2c5f63ed9560\" (UID: \"e21c86a5-45d5-4a66-ab91-2c5f63ed9560\") " Feb 19 22:00:44 crc kubenswrapper[4795]: I0219 
22:00:44.313768 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e21c86a5-45d5-4a66-ab91-2c5f63ed9560-utilities\") pod \"e21c86a5-45d5-4a66-ab91-2c5f63ed9560\" (UID: \"e21c86a5-45d5-4a66-ab91-2c5f63ed9560\") " Feb 19 22:00:44 crc kubenswrapper[4795]: I0219 22:00:44.314755 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e21c86a5-45d5-4a66-ab91-2c5f63ed9560-utilities" (OuterVolumeSpecName: "utilities") pod "e21c86a5-45d5-4a66-ab91-2c5f63ed9560" (UID: "e21c86a5-45d5-4a66-ab91-2c5f63ed9560"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:00:44 crc kubenswrapper[4795]: I0219 22:00:44.327388 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e21c86a5-45d5-4a66-ab91-2c5f63ed9560-kube-api-access-lt6rq" (OuterVolumeSpecName: "kube-api-access-lt6rq") pod "e21c86a5-45d5-4a66-ab91-2c5f63ed9560" (UID: "e21c86a5-45d5-4a66-ab91-2c5f63ed9560"). InnerVolumeSpecName "kube-api-access-lt6rq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:00:44 crc kubenswrapper[4795]: I0219 22:00:44.415908 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e21c86a5-45d5-4a66-ab91-2c5f63ed9560-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 22:00:44 crc kubenswrapper[4795]: I0219 22:00:44.415946 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lt6rq\" (UniqueName: \"kubernetes.io/projected/e21c86a5-45d5-4a66-ab91-2c5f63ed9560-kube-api-access-lt6rq\") on node \"crc\" DevicePath \"\"" Feb 19 22:00:44 crc kubenswrapper[4795]: I0219 22:00:44.582401 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e21c86a5-45d5-4a66-ab91-2c5f63ed9560-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e21c86a5-45d5-4a66-ab91-2c5f63ed9560" (UID: "e21c86a5-45d5-4a66-ab91-2c5f63ed9560"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:00:44 crc kubenswrapper[4795]: I0219 22:00:44.619124 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e21c86a5-45d5-4a66-ab91-2c5f63ed9560-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 22:00:44 crc kubenswrapper[4795]: I0219 22:00:44.694428 4795 generic.go:334] "Generic (PLEG): container finished" podID="e21c86a5-45d5-4a66-ab91-2c5f63ed9560" containerID="5e21c4ae1f94887d1d9db8a6574b2bc29b068ff8a0821dd49fcbb44217347837" exitCode=0 Feb 19 22:00:44 crc kubenswrapper[4795]: I0219 22:00:44.694504 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nth7v" Feb 19 22:00:44 crc kubenswrapper[4795]: I0219 22:00:44.694588 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nth7v" event={"ID":"e21c86a5-45d5-4a66-ab91-2c5f63ed9560","Type":"ContainerDied","Data":"5e21c4ae1f94887d1d9db8a6574b2bc29b068ff8a0821dd49fcbb44217347837"} Feb 19 22:00:44 crc kubenswrapper[4795]: I0219 22:00:44.694706 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nth7v" event={"ID":"e21c86a5-45d5-4a66-ab91-2c5f63ed9560","Type":"ContainerDied","Data":"67faa7fa182c32f61a7ee9cd36ad3cb7441940baaa79652676306e2045363915"} Feb 19 22:00:44 crc kubenswrapper[4795]: I0219 22:00:44.694747 4795 scope.go:117] "RemoveContainer" containerID="5e21c4ae1f94887d1d9db8a6574b2bc29b068ff8a0821dd49fcbb44217347837" Feb 19 22:00:44 crc kubenswrapper[4795]: I0219 22:00:44.722677 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nth7v"] Feb 19 22:00:44 crc kubenswrapper[4795]: I0219 22:00:44.727944 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nth7v"] Feb 19 22:00:44 crc kubenswrapper[4795]: I0219 22:00:44.739658 4795 scope.go:117] "RemoveContainer" containerID="0d72bd7cfa5eaabd5dc9ed776be344a38d94fc2622735fd266c9e88dd232bb60" Feb 19 22:00:44 crc kubenswrapper[4795]: I0219 22:00:44.770462 4795 scope.go:117] "RemoveContainer" containerID="1eb7b71306343f2aaf19d2656ab650c032b78bd6319b8c34e6b6bb82e121576d" Feb 19 22:00:44 crc kubenswrapper[4795]: I0219 22:00:44.795689 4795 scope.go:117] "RemoveContainer" containerID="5e21c4ae1f94887d1d9db8a6574b2bc29b068ff8a0821dd49fcbb44217347837" Feb 19 22:00:44 crc kubenswrapper[4795]: E0219 22:00:44.796293 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"5e21c4ae1f94887d1d9db8a6574b2bc29b068ff8a0821dd49fcbb44217347837\": container with ID starting with 5e21c4ae1f94887d1d9db8a6574b2bc29b068ff8a0821dd49fcbb44217347837 not found: ID does not exist" containerID="5e21c4ae1f94887d1d9db8a6574b2bc29b068ff8a0821dd49fcbb44217347837" Feb 19 22:00:44 crc kubenswrapper[4795]: I0219 22:00:44.796341 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e21c4ae1f94887d1d9db8a6574b2bc29b068ff8a0821dd49fcbb44217347837"} err="failed to get container status \"5e21c4ae1f94887d1d9db8a6574b2bc29b068ff8a0821dd49fcbb44217347837\": rpc error: code = NotFound desc = could not find container \"5e21c4ae1f94887d1d9db8a6574b2bc29b068ff8a0821dd49fcbb44217347837\": container with ID starting with 5e21c4ae1f94887d1d9db8a6574b2bc29b068ff8a0821dd49fcbb44217347837 not found: ID does not exist" Feb 19 22:00:44 crc kubenswrapper[4795]: I0219 22:00:44.796369 4795 scope.go:117] "RemoveContainer" containerID="0d72bd7cfa5eaabd5dc9ed776be344a38d94fc2622735fd266c9e88dd232bb60" Feb 19 22:00:44 crc kubenswrapper[4795]: E0219 22:00:44.796892 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d72bd7cfa5eaabd5dc9ed776be344a38d94fc2622735fd266c9e88dd232bb60\": container with ID starting with 0d72bd7cfa5eaabd5dc9ed776be344a38d94fc2622735fd266c9e88dd232bb60 not found: ID does not exist" containerID="0d72bd7cfa5eaabd5dc9ed776be344a38d94fc2622735fd266c9e88dd232bb60" Feb 19 22:00:44 crc kubenswrapper[4795]: I0219 22:00:44.796922 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d72bd7cfa5eaabd5dc9ed776be344a38d94fc2622735fd266c9e88dd232bb60"} err="failed to get container status \"0d72bd7cfa5eaabd5dc9ed776be344a38d94fc2622735fd266c9e88dd232bb60\": rpc error: code = NotFound desc = could not find container \"0d72bd7cfa5eaabd5dc9ed776be344a38d94fc2622735fd266c9e88dd232bb60\": container with ID 
starting with 0d72bd7cfa5eaabd5dc9ed776be344a38d94fc2622735fd266c9e88dd232bb60 not found: ID does not exist" Feb 19 22:00:44 crc kubenswrapper[4795]: I0219 22:00:44.796940 4795 scope.go:117] "RemoveContainer" containerID="1eb7b71306343f2aaf19d2656ab650c032b78bd6319b8c34e6b6bb82e121576d" Feb 19 22:00:44 crc kubenswrapper[4795]: E0219 22:00:44.797532 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1eb7b71306343f2aaf19d2656ab650c032b78bd6319b8c34e6b6bb82e121576d\": container with ID starting with 1eb7b71306343f2aaf19d2656ab650c032b78bd6319b8c34e6b6bb82e121576d not found: ID does not exist" containerID="1eb7b71306343f2aaf19d2656ab650c032b78bd6319b8c34e6b6bb82e121576d" Feb 19 22:00:44 crc kubenswrapper[4795]: I0219 22:00:44.797559 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1eb7b71306343f2aaf19d2656ab650c032b78bd6319b8c34e6b6bb82e121576d"} err="failed to get container status \"1eb7b71306343f2aaf19d2656ab650c032b78bd6319b8c34e6b6bb82e121576d\": rpc error: code = NotFound desc = could not find container \"1eb7b71306343f2aaf19d2656ab650c032b78bd6319b8c34e6b6bb82e121576d\": container with ID starting with 1eb7b71306343f2aaf19d2656ab650c032b78bd6319b8c34e6b6bb82e121576d not found: ID does not exist" Feb 19 22:00:45 crc kubenswrapper[4795]: I0219 22:00:45.527060 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e21c86a5-45d5-4a66-ab91-2c5f63ed9560" path="/var/lib/kubelet/pods/e21c86a5-45d5-4a66-ab91-2c5f63ed9560/volumes" Feb 19 22:00:45 crc kubenswrapper[4795]: I0219 22:00:45.702782 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-lgk8z" podUID="81fba4b2-8284-4218-848a-d969914c88d4" containerName="registry-server" containerID="cri-o://2844950df5dd099841ece43f4f4f6bf715e39808291749840a5ed01ccbf8f97d" gracePeriod=2 Feb 19 22:00:46 crc 
kubenswrapper[4795]: I0219 22:00:46.143307 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lgk8z" Feb 19 22:00:46 crc kubenswrapper[4795]: I0219 22:00:46.247813 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81fba4b2-8284-4218-848a-d969914c88d4-utilities\") pod \"81fba4b2-8284-4218-848a-d969914c88d4\" (UID: \"81fba4b2-8284-4218-848a-d969914c88d4\") " Feb 19 22:00:46 crc kubenswrapper[4795]: I0219 22:00:46.247874 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttm5g\" (UniqueName: \"kubernetes.io/projected/81fba4b2-8284-4218-848a-d969914c88d4-kube-api-access-ttm5g\") pod \"81fba4b2-8284-4218-848a-d969914c88d4\" (UID: \"81fba4b2-8284-4218-848a-d969914c88d4\") " Feb 19 22:00:46 crc kubenswrapper[4795]: I0219 22:00:46.247984 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81fba4b2-8284-4218-848a-d969914c88d4-catalog-content\") pod \"81fba4b2-8284-4218-848a-d969914c88d4\" (UID: \"81fba4b2-8284-4218-848a-d969914c88d4\") " Feb 19 22:00:46 crc kubenswrapper[4795]: I0219 22:00:46.249462 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81fba4b2-8284-4218-848a-d969914c88d4-utilities" (OuterVolumeSpecName: "utilities") pod "81fba4b2-8284-4218-848a-d969914c88d4" (UID: "81fba4b2-8284-4218-848a-d969914c88d4"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:00:46 crc kubenswrapper[4795]: I0219 22:00:46.255364 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81fba4b2-8284-4218-848a-d969914c88d4-kube-api-access-ttm5g" (OuterVolumeSpecName: "kube-api-access-ttm5g") pod "81fba4b2-8284-4218-848a-d969914c88d4" (UID: "81fba4b2-8284-4218-848a-d969914c88d4"). InnerVolumeSpecName "kube-api-access-ttm5g". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:00:46 crc kubenswrapper[4795]: I0219 22:00:46.280579 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81fba4b2-8284-4218-848a-d969914c88d4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "81fba4b2-8284-4218-848a-d969914c88d4" (UID: "81fba4b2-8284-4218-848a-d969914c88d4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:00:46 crc kubenswrapper[4795]: I0219 22:00:46.350271 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81fba4b2-8284-4218-848a-d969914c88d4-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 22:00:46 crc kubenswrapper[4795]: I0219 22:00:46.350306 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ttm5g\" (UniqueName: \"kubernetes.io/projected/81fba4b2-8284-4218-848a-d969914c88d4-kube-api-access-ttm5g\") on node \"crc\" DevicePath \"\"" Feb 19 22:00:46 crc kubenswrapper[4795]: I0219 22:00:46.350319 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81fba4b2-8284-4218-848a-d969914c88d4-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 22:00:46 crc kubenswrapper[4795]: I0219 22:00:46.711053 4795 generic.go:334] "Generic (PLEG): container finished" podID="81fba4b2-8284-4218-848a-d969914c88d4" 
containerID="2844950df5dd099841ece43f4f4f6bf715e39808291749840a5ed01ccbf8f97d" exitCode=0 Feb 19 22:00:46 crc kubenswrapper[4795]: I0219 22:00:46.711093 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lgk8z" event={"ID":"81fba4b2-8284-4218-848a-d969914c88d4","Type":"ContainerDied","Data":"2844950df5dd099841ece43f4f4f6bf715e39808291749840a5ed01ccbf8f97d"} Feb 19 22:00:46 crc kubenswrapper[4795]: I0219 22:00:46.711119 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lgk8z" Feb 19 22:00:46 crc kubenswrapper[4795]: I0219 22:00:46.711138 4795 scope.go:117] "RemoveContainer" containerID="2844950df5dd099841ece43f4f4f6bf715e39808291749840a5ed01ccbf8f97d" Feb 19 22:00:46 crc kubenswrapper[4795]: I0219 22:00:46.711126 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lgk8z" event={"ID":"81fba4b2-8284-4218-848a-d969914c88d4","Type":"ContainerDied","Data":"c6acdb6aba45ef8ff16e31b1bbf6b17397cea11d384ef65816c436f0d3dba5b7"} Feb 19 22:00:46 crc kubenswrapper[4795]: I0219 22:00:46.726010 4795 scope.go:117] "RemoveContainer" containerID="21993cb96709bea6b105a0765c593584826bed846b84570a65a933931c1dcaa0" Feb 19 22:00:46 crc kubenswrapper[4795]: I0219 22:00:46.743877 4795 scope.go:117] "RemoveContainer" containerID="d2312f72c72ef688566d86b7fdb387a6daba57278a8dbfb214017e427ccccef3" Feb 19 22:00:46 crc kubenswrapper[4795]: I0219 22:00:46.761561 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lgk8z"] Feb 19 22:00:46 crc kubenswrapper[4795]: I0219 22:00:46.769270 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-lgk8z"] Feb 19 22:00:46 crc kubenswrapper[4795]: I0219 22:00:46.775559 4795 scope.go:117] "RemoveContainer" containerID="2844950df5dd099841ece43f4f4f6bf715e39808291749840a5ed01ccbf8f97d" Feb 19 
22:00:46 crc kubenswrapper[4795]: E0219 22:00:46.778565 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2844950df5dd099841ece43f4f4f6bf715e39808291749840a5ed01ccbf8f97d\": container with ID starting with 2844950df5dd099841ece43f4f4f6bf715e39808291749840a5ed01ccbf8f97d not found: ID does not exist" containerID="2844950df5dd099841ece43f4f4f6bf715e39808291749840a5ed01ccbf8f97d" Feb 19 22:00:46 crc kubenswrapper[4795]: I0219 22:00:46.778594 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2844950df5dd099841ece43f4f4f6bf715e39808291749840a5ed01ccbf8f97d"} err="failed to get container status \"2844950df5dd099841ece43f4f4f6bf715e39808291749840a5ed01ccbf8f97d\": rpc error: code = NotFound desc = could not find container \"2844950df5dd099841ece43f4f4f6bf715e39808291749840a5ed01ccbf8f97d\": container with ID starting with 2844950df5dd099841ece43f4f4f6bf715e39808291749840a5ed01ccbf8f97d not found: ID does not exist" Feb 19 22:00:46 crc kubenswrapper[4795]: I0219 22:00:46.778616 4795 scope.go:117] "RemoveContainer" containerID="21993cb96709bea6b105a0765c593584826bed846b84570a65a933931c1dcaa0" Feb 19 22:00:46 crc kubenswrapper[4795]: E0219 22:00:46.778922 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21993cb96709bea6b105a0765c593584826bed846b84570a65a933931c1dcaa0\": container with ID starting with 21993cb96709bea6b105a0765c593584826bed846b84570a65a933931c1dcaa0 not found: ID does not exist" containerID="21993cb96709bea6b105a0765c593584826bed846b84570a65a933931c1dcaa0" Feb 19 22:00:46 crc kubenswrapper[4795]: I0219 22:00:46.778943 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21993cb96709bea6b105a0765c593584826bed846b84570a65a933931c1dcaa0"} err="failed to get container status 
\"21993cb96709bea6b105a0765c593584826bed846b84570a65a933931c1dcaa0\": rpc error: code = NotFound desc = could not find container \"21993cb96709bea6b105a0765c593584826bed846b84570a65a933931c1dcaa0\": container with ID starting with 21993cb96709bea6b105a0765c593584826bed846b84570a65a933931c1dcaa0 not found: ID does not exist" Feb 19 22:00:46 crc kubenswrapper[4795]: I0219 22:00:46.778957 4795 scope.go:117] "RemoveContainer" containerID="d2312f72c72ef688566d86b7fdb387a6daba57278a8dbfb214017e427ccccef3" Feb 19 22:00:46 crc kubenswrapper[4795]: E0219 22:00:46.779206 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2312f72c72ef688566d86b7fdb387a6daba57278a8dbfb214017e427ccccef3\": container with ID starting with d2312f72c72ef688566d86b7fdb387a6daba57278a8dbfb214017e427ccccef3 not found: ID does not exist" containerID="d2312f72c72ef688566d86b7fdb387a6daba57278a8dbfb214017e427ccccef3" Feb 19 22:00:46 crc kubenswrapper[4795]: I0219 22:00:46.779230 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2312f72c72ef688566d86b7fdb387a6daba57278a8dbfb214017e427ccccef3"} err="failed to get container status \"d2312f72c72ef688566d86b7fdb387a6daba57278a8dbfb214017e427ccccef3\": rpc error: code = NotFound desc = could not find container \"d2312f72c72ef688566d86b7fdb387a6daba57278a8dbfb214017e427ccccef3\": container with ID starting with d2312f72c72ef688566d86b7fdb387a6daba57278a8dbfb214017e427ccccef3 not found: ID does not exist" Feb 19 22:00:47 crc kubenswrapper[4795]: I0219 22:00:47.522881 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81fba4b2-8284-4218-848a-d969914c88d4" path="/var/lib/kubelet/pods/81fba4b2-8284-4218-848a-d969914c88d4/volumes" Feb 19 22:00:58 crc kubenswrapper[4795]: I0219 22:00:58.428044 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 22:00:58 crc kubenswrapper[4795]: I0219 22:00:58.428708 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 22:01:28 crc kubenswrapper[4795]: I0219 22:01:28.427948 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 22:01:28 crc kubenswrapper[4795]: I0219 22:01:28.428512 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 22:01:58 crc kubenswrapper[4795]: I0219 22:01:58.427720 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 22:01:58 crc kubenswrapper[4795]: I0219 22:01:58.428267 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 22:01:58 crc kubenswrapper[4795]: I0219 22:01:58.428312 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" Feb 19 22:01:58 crc kubenswrapper[4795]: I0219 22:01:58.428922 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6c01aaf6e3d8e8204ba05a7b4dc2e4ab1c48baf63fa06a4eaffcb4f1cd336768"} pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 22:01:58 crc kubenswrapper[4795]: I0219 22:01:58.428981 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" containerID="cri-o://6c01aaf6e3d8e8204ba05a7b4dc2e4ab1c48baf63fa06a4eaffcb4f1cd336768" gracePeriod=600 Feb 19 22:01:59 crc kubenswrapper[4795]: I0219 22:01:59.222318 4795 generic.go:334] "Generic (PLEG): container finished" podID="7591bc58-96f5-486a-8653-0ad93938b019" containerID="6c01aaf6e3d8e8204ba05a7b4dc2e4ab1c48baf63fa06a4eaffcb4f1cd336768" exitCode=0 Feb 19 22:01:59 crc kubenswrapper[4795]: I0219 22:01:59.222425 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerDied","Data":"6c01aaf6e3d8e8204ba05a7b4dc2e4ab1c48baf63fa06a4eaffcb4f1cd336768"} Feb 19 22:01:59 crc kubenswrapper[4795]: I0219 22:01:59.222930 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" 
event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerStarted","Data":"329fea1ee73173abce61c9c393865b560be0cf33332fe7ef2554dba15a0850d0"} Feb 19 22:01:59 crc kubenswrapper[4795]: I0219 22:01:59.222951 4795 scope.go:117] "RemoveContainer" containerID="89a8591a1a6c5a44263990a6f7f07e1dd5fbf859d704de1710031664de85cae8" Feb 19 22:03:58 crc kubenswrapper[4795]: I0219 22:03:58.428288 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 22:03:58 crc kubenswrapper[4795]: I0219 22:03:58.429267 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 22:04:22 crc kubenswrapper[4795]: I0219 22:04:22.117549 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-m84kt"] Feb 19 22:04:22 crc kubenswrapper[4795]: E0219 22:04:22.118566 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e21c86a5-45d5-4a66-ab91-2c5f63ed9560" containerName="registry-server" Feb 19 22:04:22 crc kubenswrapper[4795]: I0219 22:04:22.118584 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e21c86a5-45d5-4a66-ab91-2c5f63ed9560" containerName="registry-server" Feb 19 22:04:22 crc kubenswrapper[4795]: E0219 22:04:22.118600 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81fba4b2-8284-4218-848a-d969914c88d4" containerName="extract-content" Feb 19 22:04:22 crc kubenswrapper[4795]: I0219 22:04:22.118607 4795 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="81fba4b2-8284-4218-848a-d969914c88d4" containerName="extract-content" Feb 19 22:04:22 crc kubenswrapper[4795]: E0219 22:04:22.118619 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e21c86a5-45d5-4a66-ab91-2c5f63ed9560" containerName="extract-utilities" Feb 19 22:04:22 crc kubenswrapper[4795]: I0219 22:04:22.118626 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e21c86a5-45d5-4a66-ab91-2c5f63ed9560" containerName="extract-utilities" Feb 19 22:04:22 crc kubenswrapper[4795]: E0219 22:04:22.118641 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e21c86a5-45d5-4a66-ab91-2c5f63ed9560" containerName="extract-content" Feb 19 22:04:22 crc kubenswrapper[4795]: I0219 22:04:22.118648 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e21c86a5-45d5-4a66-ab91-2c5f63ed9560" containerName="extract-content" Feb 19 22:04:22 crc kubenswrapper[4795]: E0219 22:04:22.118663 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81fba4b2-8284-4218-848a-d969914c88d4" containerName="registry-server" Feb 19 22:04:22 crc kubenswrapper[4795]: I0219 22:04:22.118671 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="81fba4b2-8284-4218-848a-d969914c88d4" containerName="registry-server" Feb 19 22:04:22 crc kubenswrapper[4795]: E0219 22:04:22.118682 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81fba4b2-8284-4218-848a-d969914c88d4" containerName="extract-utilities" Feb 19 22:04:22 crc kubenswrapper[4795]: I0219 22:04:22.118689 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="81fba4b2-8284-4218-848a-d969914c88d4" containerName="extract-utilities" Feb 19 22:04:22 crc kubenswrapper[4795]: I0219 22:04:22.118874 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="81fba4b2-8284-4218-848a-d969914c88d4" containerName="registry-server" Feb 19 22:04:22 crc kubenswrapper[4795]: I0219 22:04:22.118900 4795 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="e21c86a5-45d5-4a66-ab91-2c5f63ed9560" containerName="registry-server" Feb 19 22:04:22 crc kubenswrapper[4795]: I0219 22:04:22.120108 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m84kt" Feb 19 22:04:22 crc kubenswrapper[4795]: I0219 22:04:22.142752 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-m84kt"] Feb 19 22:04:22 crc kubenswrapper[4795]: I0219 22:04:22.252765 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5cf90d2b-fcf4-4082-8bb2-892e23a85cfd-utilities\") pod \"certified-operators-m84kt\" (UID: \"5cf90d2b-fcf4-4082-8bb2-892e23a85cfd\") " pod="openshift-marketplace/certified-operators-m84kt" Feb 19 22:04:22 crc kubenswrapper[4795]: I0219 22:04:22.252842 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5cf90d2b-fcf4-4082-8bb2-892e23a85cfd-catalog-content\") pod \"certified-operators-m84kt\" (UID: \"5cf90d2b-fcf4-4082-8bb2-892e23a85cfd\") " pod="openshift-marketplace/certified-operators-m84kt" Feb 19 22:04:22 crc kubenswrapper[4795]: I0219 22:04:22.252926 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bc4r6\" (UniqueName: \"kubernetes.io/projected/5cf90d2b-fcf4-4082-8bb2-892e23a85cfd-kube-api-access-bc4r6\") pod \"certified-operators-m84kt\" (UID: \"5cf90d2b-fcf4-4082-8bb2-892e23a85cfd\") " pod="openshift-marketplace/certified-operators-m84kt" Feb 19 22:04:22 crc kubenswrapper[4795]: I0219 22:04:22.358168 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5cf90d2b-fcf4-4082-8bb2-892e23a85cfd-utilities\") pod \"certified-operators-m84kt\" (UID: 
\"5cf90d2b-fcf4-4082-8bb2-892e23a85cfd\") " pod="openshift-marketplace/certified-operators-m84kt" Feb 19 22:04:22 crc kubenswrapper[4795]: I0219 22:04:22.358602 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5cf90d2b-fcf4-4082-8bb2-892e23a85cfd-catalog-content\") pod \"certified-operators-m84kt\" (UID: \"5cf90d2b-fcf4-4082-8bb2-892e23a85cfd\") " pod="openshift-marketplace/certified-operators-m84kt" Feb 19 22:04:22 crc kubenswrapper[4795]: I0219 22:04:22.358654 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bc4r6\" (UniqueName: \"kubernetes.io/projected/5cf90d2b-fcf4-4082-8bb2-892e23a85cfd-kube-api-access-bc4r6\") pod \"certified-operators-m84kt\" (UID: \"5cf90d2b-fcf4-4082-8bb2-892e23a85cfd\") " pod="openshift-marketplace/certified-operators-m84kt" Feb 19 22:04:22 crc kubenswrapper[4795]: I0219 22:04:22.358730 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5cf90d2b-fcf4-4082-8bb2-892e23a85cfd-utilities\") pod \"certified-operators-m84kt\" (UID: \"5cf90d2b-fcf4-4082-8bb2-892e23a85cfd\") " pod="openshift-marketplace/certified-operators-m84kt" Feb 19 22:04:22 crc kubenswrapper[4795]: I0219 22:04:22.359031 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5cf90d2b-fcf4-4082-8bb2-892e23a85cfd-catalog-content\") pod \"certified-operators-m84kt\" (UID: \"5cf90d2b-fcf4-4082-8bb2-892e23a85cfd\") " pod="openshift-marketplace/certified-operators-m84kt" Feb 19 22:04:22 crc kubenswrapper[4795]: I0219 22:04:22.384855 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bc4r6\" (UniqueName: \"kubernetes.io/projected/5cf90d2b-fcf4-4082-8bb2-892e23a85cfd-kube-api-access-bc4r6\") pod \"certified-operators-m84kt\" (UID: 
\"5cf90d2b-fcf4-4082-8bb2-892e23a85cfd\") " pod="openshift-marketplace/certified-operators-m84kt" Feb 19 22:04:22 crc kubenswrapper[4795]: I0219 22:04:22.456399 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m84kt" Feb 19 22:04:22 crc kubenswrapper[4795]: I0219 22:04:22.908434 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-m84kt"] Feb 19 22:04:23 crc kubenswrapper[4795]: I0219 22:04:23.399717 4795 generic.go:334] "Generic (PLEG): container finished" podID="5cf90d2b-fcf4-4082-8bb2-892e23a85cfd" containerID="0c995342e1a665caa9965f136f59b3e69da438e060c8e9efffdf6e2b986ba8b7" exitCode=0 Feb 19 22:04:23 crc kubenswrapper[4795]: I0219 22:04:23.399777 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m84kt" event={"ID":"5cf90d2b-fcf4-4082-8bb2-892e23a85cfd","Type":"ContainerDied","Data":"0c995342e1a665caa9965f136f59b3e69da438e060c8e9efffdf6e2b986ba8b7"} Feb 19 22:04:23 crc kubenswrapper[4795]: I0219 22:04:23.399816 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m84kt" event={"ID":"5cf90d2b-fcf4-4082-8bb2-892e23a85cfd","Type":"ContainerStarted","Data":"bd7076d5f999cd2836934242f282bd72d95dbb6690558499c8b183f067011a71"} Feb 19 22:04:24 crc kubenswrapper[4795]: I0219 22:04:24.431321 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m84kt" event={"ID":"5cf90d2b-fcf4-4082-8bb2-892e23a85cfd","Type":"ContainerStarted","Data":"007dfd5b6719160dec98fc1f3e0143ffe486e3ffcc4e053f3ae8f4c4d442d571"} Feb 19 22:04:25 crc kubenswrapper[4795]: I0219 22:04:25.440472 4795 generic.go:334] "Generic (PLEG): container finished" podID="5cf90d2b-fcf4-4082-8bb2-892e23a85cfd" containerID="007dfd5b6719160dec98fc1f3e0143ffe486e3ffcc4e053f3ae8f4c4d442d571" exitCode=0 Feb 19 22:04:25 crc kubenswrapper[4795]: I0219 
22:04:25.440523 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m84kt" event={"ID":"5cf90d2b-fcf4-4082-8bb2-892e23a85cfd","Type":"ContainerDied","Data":"007dfd5b6719160dec98fc1f3e0143ffe486e3ffcc4e053f3ae8f4c4d442d571"} Feb 19 22:04:26 crc kubenswrapper[4795]: I0219 22:04:26.449918 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m84kt" event={"ID":"5cf90d2b-fcf4-4082-8bb2-892e23a85cfd","Type":"ContainerStarted","Data":"14fb0f3a25fe9c61e1fe3b83081602db8ffdc377387580ff45e90f34bd7a9df2"} Feb 19 22:04:26 crc kubenswrapper[4795]: I0219 22:04:26.471801 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-m84kt" podStartSLOduration=2.023590685 podStartE2EDuration="4.471779389s" podCreationTimestamp="2026-02-19 22:04:22 +0000 UTC" firstStartedPulling="2026-02-19 22:04:23.401931813 +0000 UTC m=+2174.594449717" lastFinishedPulling="2026-02-19 22:04:25.850120547 +0000 UTC m=+2177.042638421" observedRunningTime="2026-02-19 22:04:26.468012565 +0000 UTC m=+2177.660530439" watchObservedRunningTime="2026-02-19 22:04:26.471779389 +0000 UTC m=+2177.664297263" Feb 19 22:04:28 crc kubenswrapper[4795]: I0219 22:04:28.427815 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 22:04:28 crc kubenswrapper[4795]: I0219 22:04:28.428118 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 22:04:32 crc 
kubenswrapper[4795]: I0219 22:04:32.456852 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-m84kt" Feb 19 22:04:32 crc kubenswrapper[4795]: I0219 22:04:32.457298 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-m84kt" Feb 19 22:04:32 crc kubenswrapper[4795]: I0219 22:04:32.510701 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-m84kt" Feb 19 22:04:32 crc kubenswrapper[4795]: I0219 22:04:32.568150 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-m84kt" Feb 19 22:04:32 crc kubenswrapper[4795]: I0219 22:04:32.751393 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-m84kt"] Feb 19 22:04:34 crc kubenswrapper[4795]: I0219 22:04:34.524608 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-m84kt" podUID="5cf90d2b-fcf4-4082-8bb2-892e23a85cfd" containerName="registry-server" containerID="cri-o://14fb0f3a25fe9c61e1fe3b83081602db8ffdc377387580ff45e90f34bd7a9df2" gracePeriod=2 Feb 19 22:04:34 crc kubenswrapper[4795]: I0219 22:04:34.931310 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-m84kt" Feb 19 22:04:35 crc kubenswrapper[4795]: I0219 22:04:35.050367 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bc4r6\" (UniqueName: \"kubernetes.io/projected/5cf90d2b-fcf4-4082-8bb2-892e23a85cfd-kube-api-access-bc4r6\") pod \"5cf90d2b-fcf4-4082-8bb2-892e23a85cfd\" (UID: \"5cf90d2b-fcf4-4082-8bb2-892e23a85cfd\") " Feb 19 22:04:35 crc kubenswrapper[4795]: I0219 22:04:35.050414 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5cf90d2b-fcf4-4082-8bb2-892e23a85cfd-utilities\") pod \"5cf90d2b-fcf4-4082-8bb2-892e23a85cfd\" (UID: \"5cf90d2b-fcf4-4082-8bb2-892e23a85cfd\") " Feb 19 22:04:35 crc kubenswrapper[4795]: I0219 22:04:35.050464 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5cf90d2b-fcf4-4082-8bb2-892e23a85cfd-catalog-content\") pod \"5cf90d2b-fcf4-4082-8bb2-892e23a85cfd\" (UID: \"5cf90d2b-fcf4-4082-8bb2-892e23a85cfd\") " Feb 19 22:04:35 crc kubenswrapper[4795]: I0219 22:04:35.051700 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5cf90d2b-fcf4-4082-8bb2-892e23a85cfd-utilities" (OuterVolumeSpecName: "utilities") pod "5cf90d2b-fcf4-4082-8bb2-892e23a85cfd" (UID: "5cf90d2b-fcf4-4082-8bb2-892e23a85cfd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:04:35 crc kubenswrapper[4795]: I0219 22:04:35.070205 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5cf90d2b-fcf4-4082-8bb2-892e23a85cfd-kube-api-access-bc4r6" (OuterVolumeSpecName: "kube-api-access-bc4r6") pod "5cf90d2b-fcf4-4082-8bb2-892e23a85cfd" (UID: "5cf90d2b-fcf4-4082-8bb2-892e23a85cfd"). InnerVolumeSpecName "kube-api-access-bc4r6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:04:35 crc kubenswrapper[4795]: I0219 22:04:35.152241 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bc4r6\" (UniqueName: \"kubernetes.io/projected/5cf90d2b-fcf4-4082-8bb2-892e23a85cfd-kube-api-access-bc4r6\") on node \"crc\" DevicePath \"\"" Feb 19 22:04:35 crc kubenswrapper[4795]: I0219 22:04:35.152511 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5cf90d2b-fcf4-4082-8bb2-892e23a85cfd-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 22:04:35 crc kubenswrapper[4795]: I0219 22:04:35.410087 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5cf90d2b-fcf4-4082-8bb2-892e23a85cfd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5cf90d2b-fcf4-4082-8bb2-892e23a85cfd" (UID: "5cf90d2b-fcf4-4082-8bb2-892e23a85cfd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:04:35 crc kubenswrapper[4795]: I0219 22:04:35.456461 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5cf90d2b-fcf4-4082-8bb2-892e23a85cfd-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 22:04:35 crc kubenswrapper[4795]: I0219 22:04:35.536086 4795 generic.go:334] "Generic (PLEG): container finished" podID="5cf90d2b-fcf4-4082-8bb2-892e23a85cfd" containerID="14fb0f3a25fe9c61e1fe3b83081602db8ffdc377387580ff45e90f34bd7a9df2" exitCode=0 Feb 19 22:04:35 crc kubenswrapper[4795]: I0219 22:04:35.536149 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m84kt" event={"ID":"5cf90d2b-fcf4-4082-8bb2-892e23a85cfd","Type":"ContainerDied","Data":"14fb0f3a25fe9c61e1fe3b83081602db8ffdc377387580ff45e90f34bd7a9df2"} Feb 19 22:04:35 crc kubenswrapper[4795]: I0219 22:04:35.536207 4795 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-m84kt" event={"ID":"5cf90d2b-fcf4-4082-8bb2-892e23a85cfd","Type":"ContainerDied","Data":"bd7076d5f999cd2836934242f282bd72d95dbb6690558499c8b183f067011a71"} Feb 19 22:04:35 crc kubenswrapper[4795]: I0219 22:04:35.536234 4795 scope.go:117] "RemoveContainer" containerID="14fb0f3a25fe9c61e1fe3b83081602db8ffdc377387580ff45e90f34bd7a9df2" Feb 19 22:04:35 crc kubenswrapper[4795]: I0219 22:04:35.536117 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m84kt" Feb 19 22:04:35 crc kubenswrapper[4795]: I0219 22:04:35.567130 4795 scope.go:117] "RemoveContainer" containerID="007dfd5b6719160dec98fc1f3e0143ffe486e3ffcc4e053f3ae8f4c4d442d571" Feb 19 22:04:35 crc kubenswrapper[4795]: I0219 22:04:35.568081 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-m84kt"] Feb 19 22:04:35 crc kubenswrapper[4795]: I0219 22:04:35.580952 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-m84kt"] Feb 19 22:04:35 crc kubenswrapper[4795]: I0219 22:04:35.600555 4795 scope.go:117] "RemoveContainer" containerID="0c995342e1a665caa9965f136f59b3e69da438e060c8e9efffdf6e2b986ba8b7" Feb 19 22:04:35 crc kubenswrapper[4795]: I0219 22:04:35.625080 4795 scope.go:117] "RemoveContainer" containerID="14fb0f3a25fe9c61e1fe3b83081602db8ffdc377387580ff45e90f34bd7a9df2" Feb 19 22:04:35 crc kubenswrapper[4795]: E0219 22:04:35.625680 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14fb0f3a25fe9c61e1fe3b83081602db8ffdc377387580ff45e90f34bd7a9df2\": container with ID starting with 14fb0f3a25fe9c61e1fe3b83081602db8ffdc377387580ff45e90f34bd7a9df2 not found: ID does not exist" containerID="14fb0f3a25fe9c61e1fe3b83081602db8ffdc377387580ff45e90f34bd7a9df2" Feb 19 22:04:35 crc kubenswrapper[4795]: I0219 
22:04:35.625710 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14fb0f3a25fe9c61e1fe3b83081602db8ffdc377387580ff45e90f34bd7a9df2"} err="failed to get container status \"14fb0f3a25fe9c61e1fe3b83081602db8ffdc377387580ff45e90f34bd7a9df2\": rpc error: code = NotFound desc = could not find container \"14fb0f3a25fe9c61e1fe3b83081602db8ffdc377387580ff45e90f34bd7a9df2\": container with ID starting with 14fb0f3a25fe9c61e1fe3b83081602db8ffdc377387580ff45e90f34bd7a9df2 not found: ID does not exist" Feb 19 22:04:35 crc kubenswrapper[4795]: I0219 22:04:35.625736 4795 scope.go:117] "RemoveContainer" containerID="007dfd5b6719160dec98fc1f3e0143ffe486e3ffcc4e053f3ae8f4c4d442d571" Feb 19 22:04:35 crc kubenswrapper[4795]: E0219 22:04:35.626080 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"007dfd5b6719160dec98fc1f3e0143ffe486e3ffcc4e053f3ae8f4c4d442d571\": container with ID starting with 007dfd5b6719160dec98fc1f3e0143ffe486e3ffcc4e053f3ae8f4c4d442d571 not found: ID does not exist" containerID="007dfd5b6719160dec98fc1f3e0143ffe486e3ffcc4e053f3ae8f4c4d442d571" Feb 19 22:04:35 crc kubenswrapper[4795]: I0219 22:04:35.626116 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"007dfd5b6719160dec98fc1f3e0143ffe486e3ffcc4e053f3ae8f4c4d442d571"} err="failed to get container status \"007dfd5b6719160dec98fc1f3e0143ffe486e3ffcc4e053f3ae8f4c4d442d571\": rpc error: code = NotFound desc = could not find container \"007dfd5b6719160dec98fc1f3e0143ffe486e3ffcc4e053f3ae8f4c4d442d571\": container with ID starting with 007dfd5b6719160dec98fc1f3e0143ffe486e3ffcc4e053f3ae8f4c4d442d571 not found: ID does not exist" Feb 19 22:04:35 crc kubenswrapper[4795]: I0219 22:04:35.626138 4795 scope.go:117] "RemoveContainer" containerID="0c995342e1a665caa9965f136f59b3e69da438e060c8e9efffdf6e2b986ba8b7" Feb 19 22:04:35 crc 
kubenswrapper[4795]: E0219 22:04:35.626515 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c995342e1a665caa9965f136f59b3e69da438e060c8e9efffdf6e2b986ba8b7\": container with ID starting with 0c995342e1a665caa9965f136f59b3e69da438e060c8e9efffdf6e2b986ba8b7 not found: ID does not exist" containerID="0c995342e1a665caa9965f136f59b3e69da438e060c8e9efffdf6e2b986ba8b7" Feb 19 22:04:35 crc kubenswrapper[4795]: I0219 22:04:35.626536 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c995342e1a665caa9965f136f59b3e69da438e060c8e9efffdf6e2b986ba8b7"} err="failed to get container status \"0c995342e1a665caa9965f136f59b3e69da438e060c8e9efffdf6e2b986ba8b7\": rpc error: code = NotFound desc = could not find container \"0c995342e1a665caa9965f136f59b3e69da438e060c8e9efffdf6e2b986ba8b7\": container with ID starting with 0c995342e1a665caa9965f136f59b3e69da438e060c8e9efffdf6e2b986ba8b7 not found: ID does not exist" Feb 19 22:04:37 crc kubenswrapper[4795]: I0219 22:04:37.529403 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5cf90d2b-fcf4-4082-8bb2-892e23a85cfd" path="/var/lib/kubelet/pods/5cf90d2b-fcf4-4082-8bb2-892e23a85cfd/volumes" Feb 19 22:04:58 crc kubenswrapper[4795]: I0219 22:04:58.428278 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 22:04:58 crc kubenswrapper[4795]: I0219 22:04:58.429118 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Feb 19 22:04:58 crc kubenswrapper[4795]: I0219 22:04:58.429231 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" Feb 19 22:04:58 crc kubenswrapper[4795]: I0219 22:04:58.430095 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"329fea1ee73173abce61c9c393865b560be0cf33332fe7ef2554dba15a0850d0"} pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 22:04:58 crc kubenswrapper[4795]: I0219 22:04:58.430238 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" containerID="cri-o://329fea1ee73173abce61c9c393865b560be0cf33332fe7ef2554dba15a0850d0" gracePeriod=600 Feb 19 22:04:58 crc kubenswrapper[4795]: E0219 22:04:58.563997 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:04:58 crc kubenswrapper[4795]: I0219 22:04:58.730001 4795 generic.go:334] "Generic (PLEG): container finished" podID="7591bc58-96f5-486a-8653-0ad93938b019" containerID="329fea1ee73173abce61c9c393865b560be0cf33332fe7ef2554dba15a0850d0" exitCode=0 Feb 19 22:04:58 crc kubenswrapper[4795]: I0219 22:04:58.730049 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerDied","Data":"329fea1ee73173abce61c9c393865b560be0cf33332fe7ef2554dba15a0850d0"} Feb 19 22:04:58 crc kubenswrapper[4795]: I0219 22:04:58.730084 4795 scope.go:117] "RemoveContainer" containerID="6c01aaf6e3d8e8204ba05a7b4dc2e4ab1c48baf63fa06a4eaffcb4f1cd336768" Feb 19 22:04:58 crc kubenswrapper[4795]: I0219 22:04:58.730533 4795 scope.go:117] "RemoveContainer" containerID="329fea1ee73173abce61c9c393865b560be0cf33332fe7ef2554dba15a0850d0" Feb 19 22:04:58 crc kubenswrapper[4795]: E0219 22:04:58.730762 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:05:12 crc kubenswrapper[4795]: I0219 22:05:12.511685 4795 scope.go:117] "RemoveContainer" containerID="329fea1ee73173abce61c9c393865b560be0cf33332fe7ef2554dba15a0850d0" Feb 19 22:05:12 crc kubenswrapper[4795]: E0219 22:05:12.512232 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:05:24 crc kubenswrapper[4795]: I0219 22:05:24.512152 4795 scope.go:117] "RemoveContainer" containerID="329fea1ee73173abce61c9c393865b560be0cf33332fe7ef2554dba15a0850d0" Feb 19 22:05:24 crc kubenswrapper[4795]: E0219 22:05:24.513027 4795 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:05:36 crc kubenswrapper[4795]: I0219 22:05:36.511541 4795 scope.go:117] "RemoveContainer" containerID="329fea1ee73173abce61c9c393865b560be0cf33332fe7ef2554dba15a0850d0" Feb 19 22:05:36 crc kubenswrapper[4795]: E0219 22:05:36.512295 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:05:49 crc kubenswrapper[4795]: I0219 22:05:49.511991 4795 scope.go:117] "RemoveContainer" containerID="329fea1ee73173abce61c9c393865b560be0cf33332fe7ef2554dba15a0850d0" Feb 19 22:05:49 crc kubenswrapper[4795]: E0219 22:05:49.512729 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:06:00 crc kubenswrapper[4795]: I0219 22:06:00.511623 4795 scope.go:117] "RemoveContainer" containerID="329fea1ee73173abce61c9c393865b560be0cf33332fe7ef2554dba15a0850d0" Feb 19 22:06:00 crc kubenswrapper[4795]: E0219 
22:06:00.512358 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:06:11 crc kubenswrapper[4795]: I0219 22:06:11.512112 4795 scope.go:117] "RemoveContainer" containerID="329fea1ee73173abce61c9c393865b560be0cf33332fe7ef2554dba15a0850d0" Feb 19 22:06:11 crc kubenswrapper[4795]: E0219 22:06:11.512866 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:06:23 crc kubenswrapper[4795]: I0219 22:06:23.513154 4795 scope.go:117] "RemoveContainer" containerID="329fea1ee73173abce61c9c393865b560be0cf33332fe7ef2554dba15a0850d0" Feb 19 22:06:23 crc kubenswrapper[4795]: E0219 22:06:23.513922 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:06:35 crc kubenswrapper[4795]: I0219 22:06:35.511743 4795 scope.go:117] "RemoveContainer" containerID="329fea1ee73173abce61c9c393865b560be0cf33332fe7ef2554dba15a0850d0" Feb 19 22:06:35 crc 
kubenswrapper[4795]: E0219 22:06:35.512555 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:06:47 crc kubenswrapper[4795]: I0219 22:06:47.511631 4795 scope.go:117] "RemoveContainer" containerID="329fea1ee73173abce61c9c393865b560be0cf33332fe7ef2554dba15a0850d0" Feb 19 22:06:47 crc kubenswrapper[4795]: E0219 22:06:47.512519 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:07:00 crc kubenswrapper[4795]: I0219 22:07:00.248494 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="hostpath-provisioner/csi-hostpathplugin-hxxkd" podUID="5a3a8d91-b500-48db-9ceb-cc105b2eeb3a" containerName="hostpath-provisioner" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 22:07:00 crc kubenswrapper[4795]: I0219 22:07:00.276063 4795 scope.go:117] "RemoveContainer" containerID="329fea1ee73173abce61c9c393865b560be0cf33332fe7ef2554dba15a0850d0" Feb 19 22:07:00 crc kubenswrapper[4795]: E0219 22:07:00.276557 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:07:10 crc kubenswrapper[4795]: I0219 22:07:10.513214 4795 scope.go:117] "RemoveContainer" containerID="329fea1ee73173abce61c9c393865b560be0cf33332fe7ef2554dba15a0850d0" Feb 19 22:07:10 crc kubenswrapper[4795]: E0219 22:07:10.514010 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:07:23 crc kubenswrapper[4795]: I0219 22:07:23.513182 4795 scope.go:117] "RemoveContainer" containerID="329fea1ee73173abce61c9c393865b560be0cf33332fe7ef2554dba15a0850d0" Feb 19 22:07:23 crc kubenswrapper[4795]: E0219 22:07:23.513895 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:07:35 crc kubenswrapper[4795]: I0219 22:07:35.512449 4795 scope.go:117] "RemoveContainer" containerID="329fea1ee73173abce61c9c393865b560be0cf33332fe7ef2554dba15a0850d0" Feb 19 22:07:35 crc kubenswrapper[4795]: E0219 22:07:35.513342 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:07:46 crc kubenswrapper[4795]: I0219 22:07:46.511851 4795 scope.go:117] "RemoveContainer" containerID="329fea1ee73173abce61c9c393865b560be0cf33332fe7ef2554dba15a0850d0" Feb 19 22:07:46 crc kubenswrapper[4795]: E0219 22:07:46.512641 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:07:58 crc kubenswrapper[4795]: I0219 22:07:58.512067 4795 scope.go:117] "RemoveContainer" containerID="329fea1ee73173abce61c9c393865b560be0cf33332fe7ef2554dba15a0850d0" Feb 19 22:07:58 crc kubenswrapper[4795]: E0219 22:07:58.512983 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:08:09 crc kubenswrapper[4795]: I0219 22:08:09.521054 4795 scope.go:117] "RemoveContainer" containerID="329fea1ee73173abce61c9c393865b560be0cf33332fe7ef2554dba15a0850d0" Feb 19 22:08:09 crc kubenswrapper[4795]: E0219 22:08:09.521796 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:08:23 crc kubenswrapper[4795]: I0219 22:08:23.511521 4795 scope.go:117] "RemoveContainer" containerID="329fea1ee73173abce61c9c393865b560be0cf33332fe7ef2554dba15a0850d0" Feb 19 22:08:23 crc kubenswrapper[4795]: E0219 22:08:23.512481 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:08:34 crc kubenswrapper[4795]: I0219 22:08:34.512679 4795 scope.go:117] "RemoveContainer" containerID="329fea1ee73173abce61c9c393865b560be0cf33332fe7ef2554dba15a0850d0" Feb 19 22:08:34 crc kubenswrapper[4795]: E0219 22:08:34.513545 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:08:48 crc kubenswrapper[4795]: I0219 22:08:48.512263 4795 scope.go:117] "RemoveContainer" containerID="329fea1ee73173abce61c9c393865b560be0cf33332fe7ef2554dba15a0850d0" Feb 19 22:08:48 crc kubenswrapper[4795]: E0219 22:08:48.514972 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:09:01 crc kubenswrapper[4795]: I0219 22:09:01.514221 4795 scope.go:117] "RemoveContainer" containerID="329fea1ee73173abce61c9c393865b560be0cf33332fe7ef2554dba15a0850d0" Feb 19 22:09:01 crc kubenswrapper[4795]: E0219 22:09:01.517459 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:09:16 crc kubenswrapper[4795]: I0219 22:09:16.512284 4795 scope.go:117] "RemoveContainer" containerID="329fea1ee73173abce61c9c393865b560be0cf33332fe7ef2554dba15a0850d0" Feb 19 22:09:16 crc kubenswrapper[4795]: E0219 22:09:16.512987 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:09:28 crc kubenswrapper[4795]: I0219 22:09:28.512616 4795 scope.go:117] "RemoveContainer" containerID="329fea1ee73173abce61c9c393865b560be0cf33332fe7ef2554dba15a0850d0" Feb 19 22:09:28 crc kubenswrapper[4795]: E0219 22:09:28.513859 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:09:39 crc kubenswrapper[4795]: I0219 22:09:39.521098 4795 scope.go:117] "RemoveContainer" containerID="329fea1ee73173abce61c9c393865b560be0cf33332fe7ef2554dba15a0850d0" Feb 19 22:09:39 crc kubenswrapper[4795]: E0219 22:09:39.522694 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:09:51 crc kubenswrapper[4795]: I0219 22:09:51.511335 4795 scope.go:117] "RemoveContainer" containerID="329fea1ee73173abce61c9c393865b560be0cf33332fe7ef2554dba15a0850d0" Feb 19 22:09:51 crc kubenswrapper[4795]: E0219 22:09:51.512069 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:10:02 crc kubenswrapper[4795]: I0219 22:10:02.802019 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ngq7m"] Feb 19 22:10:02 crc kubenswrapper[4795]: E0219 22:10:02.806497 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cf90d2b-fcf4-4082-8bb2-892e23a85cfd" 
containerName="extract-utilities" Feb 19 22:10:02 crc kubenswrapper[4795]: I0219 22:10:02.806528 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cf90d2b-fcf4-4082-8bb2-892e23a85cfd" containerName="extract-utilities" Feb 19 22:10:02 crc kubenswrapper[4795]: E0219 22:10:02.806540 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cf90d2b-fcf4-4082-8bb2-892e23a85cfd" containerName="extract-content" Feb 19 22:10:02 crc kubenswrapper[4795]: I0219 22:10:02.806547 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cf90d2b-fcf4-4082-8bb2-892e23a85cfd" containerName="extract-content" Feb 19 22:10:02 crc kubenswrapper[4795]: E0219 22:10:02.806565 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cf90d2b-fcf4-4082-8bb2-892e23a85cfd" containerName="registry-server" Feb 19 22:10:02 crc kubenswrapper[4795]: I0219 22:10:02.806571 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cf90d2b-fcf4-4082-8bb2-892e23a85cfd" containerName="registry-server" Feb 19 22:10:02 crc kubenswrapper[4795]: I0219 22:10:02.806708 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="5cf90d2b-fcf4-4082-8bb2-892e23a85cfd" containerName="registry-server" Feb 19 22:10:02 crc kubenswrapper[4795]: I0219 22:10:02.807647 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ngq7m" Feb 19 22:10:02 crc kubenswrapper[4795]: I0219 22:10:02.813620 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ngq7m"] Feb 19 22:10:02 crc kubenswrapper[4795]: I0219 22:10:02.973759 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ds46x\" (UniqueName: \"kubernetes.io/projected/7d37836f-dda1-46ea-8bd3-46b3d1e40115-kube-api-access-ds46x\") pod \"redhat-operators-ngq7m\" (UID: \"7d37836f-dda1-46ea-8bd3-46b3d1e40115\") " pod="openshift-marketplace/redhat-operators-ngq7m" Feb 19 22:10:02 crc kubenswrapper[4795]: I0219 22:10:02.973872 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d37836f-dda1-46ea-8bd3-46b3d1e40115-catalog-content\") pod \"redhat-operators-ngq7m\" (UID: \"7d37836f-dda1-46ea-8bd3-46b3d1e40115\") " pod="openshift-marketplace/redhat-operators-ngq7m" Feb 19 22:10:02 crc kubenswrapper[4795]: I0219 22:10:02.973897 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d37836f-dda1-46ea-8bd3-46b3d1e40115-utilities\") pod \"redhat-operators-ngq7m\" (UID: \"7d37836f-dda1-46ea-8bd3-46b3d1e40115\") " pod="openshift-marketplace/redhat-operators-ngq7m" Feb 19 22:10:03 crc kubenswrapper[4795]: I0219 22:10:03.076065 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d37836f-dda1-46ea-8bd3-46b3d1e40115-catalog-content\") pod \"redhat-operators-ngq7m\" (UID: \"7d37836f-dda1-46ea-8bd3-46b3d1e40115\") " pod="openshift-marketplace/redhat-operators-ngq7m" Feb 19 22:10:03 crc kubenswrapper[4795]: I0219 22:10:03.076146 4795 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d37836f-dda1-46ea-8bd3-46b3d1e40115-utilities\") pod \"redhat-operators-ngq7m\" (UID: \"7d37836f-dda1-46ea-8bd3-46b3d1e40115\") " pod="openshift-marketplace/redhat-operators-ngq7m" Feb 19 22:10:03 crc kubenswrapper[4795]: I0219 22:10:03.076233 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ds46x\" (UniqueName: \"kubernetes.io/projected/7d37836f-dda1-46ea-8bd3-46b3d1e40115-kube-api-access-ds46x\") pod \"redhat-operators-ngq7m\" (UID: \"7d37836f-dda1-46ea-8bd3-46b3d1e40115\") " pod="openshift-marketplace/redhat-operators-ngq7m" Feb 19 22:10:03 crc kubenswrapper[4795]: I0219 22:10:03.076813 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d37836f-dda1-46ea-8bd3-46b3d1e40115-utilities\") pod \"redhat-operators-ngq7m\" (UID: \"7d37836f-dda1-46ea-8bd3-46b3d1e40115\") " pod="openshift-marketplace/redhat-operators-ngq7m" Feb 19 22:10:03 crc kubenswrapper[4795]: I0219 22:10:03.077024 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d37836f-dda1-46ea-8bd3-46b3d1e40115-catalog-content\") pod \"redhat-operators-ngq7m\" (UID: \"7d37836f-dda1-46ea-8bd3-46b3d1e40115\") " pod="openshift-marketplace/redhat-operators-ngq7m" Feb 19 22:10:03 crc kubenswrapper[4795]: I0219 22:10:03.094957 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ds46x\" (UniqueName: \"kubernetes.io/projected/7d37836f-dda1-46ea-8bd3-46b3d1e40115-kube-api-access-ds46x\") pod \"redhat-operators-ngq7m\" (UID: \"7d37836f-dda1-46ea-8bd3-46b3d1e40115\") " pod="openshift-marketplace/redhat-operators-ngq7m" Feb 19 22:10:03 crc kubenswrapper[4795]: I0219 22:10:03.136924 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ngq7m" Feb 19 22:10:03 crc kubenswrapper[4795]: I0219 22:10:03.968909 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ngq7m"] Feb 19 22:10:04 crc kubenswrapper[4795]: I0219 22:10:04.593018 4795 generic.go:334] "Generic (PLEG): container finished" podID="7d37836f-dda1-46ea-8bd3-46b3d1e40115" containerID="0f19328301f5eb25ceb7841e018e2094ea8c1ef435e6b04075ef5e41263da0fb" exitCode=0 Feb 19 22:10:04 crc kubenswrapper[4795]: I0219 22:10:04.593239 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ngq7m" event={"ID":"7d37836f-dda1-46ea-8bd3-46b3d1e40115","Type":"ContainerDied","Data":"0f19328301f5eb25ceb7841e018e2094ea8c1ef435e6b04075ef5e41263da0fb"} Feb 19 22:10:04 crc kubenswrapper[4795]: I0219 22:10:04.593389 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ngq7m" event={"ID":"7d37836f-dda1-46ea-8bd3-46b3d1e40115","Type":"ContainerStarted","Data":"46c880d5e4e646af453acd759e46c0247571e60c4ac2e1e243dece3a82b42d6d"} Feb 19 22:10:04 crc kubenswrapper[4795]: I0219 22:10:04.594540 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 22:10:06 crc kubenswrapper[4795]: I0219 22:10:06.518055 4795 scope.go:117] "RemoveContainer" containerID="329fea1ee73173abce61c9c393865b560be0cf33332fe7ef2554dba15a0850d0" Feb 19 22:10:07 crc kubenswrapper[4795]: I0219 22:10:07.612937 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerStarted","Data":"dae87592e9f8f7d3ce64c7cc5ffb3429ecd2b708a654e97c0a7e99d0113e7b2c"} Feb 19 22:10:08 crc kubenswrapper[4795]: I0219 22:10:08.621089 4795 generic.go:334] "Generic (PLEG): container finished" podID="7d37836f-dda1-46ea-8bd3-46b3d1e40115" 
containerID="ae775cbe4f577ef72803181a6ba2b4bab0a5cc7b7f8d76afd9f73fc2df207a43" exitCode=0 Feb 19 22:10:08 crc kubenswrapper[4795]: I0219 22:10:08.621206 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ngq7m" event={"ID":"7d37836f-dda1-46ea-8bd3-46b3d1e40115","Type":"ContainerDied","Data":"ae775cbe4f577ef72803181a6ba2b4bab0a5cc7b7f8d76afd9f73fc2df207a43"} Feb 19 22:10:09 crc kubenswrapper[4795]: I0219 22:10:09.632418 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ngq7m" event={"ID":"7d37836f-dda1-46ea-8bd3-46b3d1e40115","Type":"ContainerStarted","Data":"0b8a4c25ac1794a48e077962e5f430b0a67013ac3a2f8e66d1ac424739506f6e"} Feb 19 22:10:09 crc kubenswrapper[4795]: I0219 22:10:09.664209 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ngq7m" podStartSLOduration=3.01183413 podStartE2EDuration="7.66419235s" podCreationTimestamp="2026-02-19 22:10:02 +0000 UTC" firstStartedPulling="2026-02-19 22:10:04.594306978 +0000 UTC m=+2515.786824842" lastFinishedPulling="2026-02-19 22:10:09.246665198 +0000 UTC m=+2520.439183062" observedRunningTime="2026-02-19 22:10:09.657675639 +0000 UTC m=+2520.850193523" watchObservedRunningTime="2026-02-19 22:10:09.66419235 +0000 UTC m=+2520.856710214" Feb 19 22:10:13 crc kubenswrapper[4795]: I0219 22:10:13.137966 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ngq7m" Feb 19 22:10:13 crc kubenswrapper[4795]: I0219 22:10:13.138364 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ngq7m" Feb 19 22:10:14 crc kubenswrapper[4795]: I0219 22:10:14.181952 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ngq7m" podUID="7d37836f-dda1-46ea-8bd3-46b3d1e40115" containerName="registry-server" probeResult="failure" 
output=< Feb 19 22:10:14 crc kubenswrapper[4795]: timeout: failed to connect service ":50051" within 1s Feb 19 22:10:14 crc kubenswrapper[4795]: > Feb 19 22:10:23 crc kubenswrapper[4795]: I0219 22:10:23.186040 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ngq7m" Feb 19 22:10:23 crc kubenswrapper[4795]: I0219 22:10:23.238635 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ngq7m" Feb 19 22:10:23 crc kubenswrapper[4795]: I0219 22:10:23.414946 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ngq7m"] Feb 19 22:10:24 crc kubenswrapper[4795]: I0219 22:10:24.745902 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ngq7m" podUID="7d37836f-dda1-46ea-8bd3-46b3d1e40115" containerName="registry-server" containerID="cri-o://0b8a4c25ac1794a48e077962e5f430b0a67013ac3a2f8e66d1ac424739506f6e" gracePeriod=2 Feb 19 22:10:25 crc kubenswrapper[4795]: I0219 22:10:25.148018 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ngq7m" Feb 19 22:10:25 crc kubenswrapper[4795]: I0219 22:10:25.299189 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d37836f-dda1-46ea-8bd3-46b3d1e40115-catalog-content\") pod \"7d37836f-dda1-46ea-8bd3-46b3d1e40115\" (UID: \"7d37836f-dda1-46ea-8bd3-46b3d1e40115\") " Feb 19 22:10:25 crc kubenswrapper[4795]: I0219 22:10:25.299389 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d37836f-dda1-46ea-8bd3-46b3d1e40115-utilities\") pod \"7d37836f-dda1-46ea-8bd3-46b3d1e40115\" (UID: \"7d37836f-dda1-46ea-8bd3-46b3d1e40115\") " Feb 19 22:10:25 crc kubenswrapper[4795]: I0219 22:10:25.299427 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ds46x\" (UniqueName: \"kubernetes.io/projected/7d37836f-dda1-46ea-8bd3-46b3d1e40115-kube-api-access-ds46x\") pod \"7d37836f-dda1-46ea-8bd3-46b3d1e40115\" (UID: \"7d37836f-dda1-46ea-8bd3-46b3d1e40115\") " Feb 19 22:10:25 crc kubenswrapper[4795]: I0219 22:10:25.300521 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d37836f-dda1-46ea-8bd3-46b3d1e40115-utilities" (OuterVolumeSpecName: "utilities") pod "7d37836f-dda1-46ea-8bd3-46b3d1e40115" (UID: "7d37836f-dda1-46ea-8bd3-46b3d1e40115"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:10:25 crc kubenswrapper[4795]: I0219 22:10:25.305376 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d37836f-dda1-46ea-8bd3-46b3d1e40115-kube-api-access-ds46x" (OuterVolumeSpecName: "kube-api-access-ds46x") pod "7d37836f-dda1-46ea-8bd3-46b3d1e40115" (UID: "7d37836f-dda1-46ea-8bd3-46b3d1e40115"). InnerVolumeSpecName "kube-api-access-ds46x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:10:25 crc kubenswrapper[4795]: I0219 22:10:25.401385 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d37836f-dda1-46ea-8bd3-46b3d1e40115-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 22:10:25 crc kubenswrapper[4795]: I0219 22:10:25.401428 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ds46x\" (UniqueName: \"kubernetes.io/projected/7d37836f-dda1-46ea-8bd3-46b3d1e40115-kube-api-access-ds46x\") on node \"crc\" DevicePath \"\"" Feb 19 22:10:25 crc kubenswrapper[4795]: I0219 22:10:25.429487 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d37836f-dda1-46ea-8bd3-46b3d1e40115-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7d37836f-dda1-46ea-8bd3-46b3d1e40115" (UID: "7d37836f-dda1-46ea-8bd3-46b3d1e40115"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:10:25 crc kubenswrapper[4795]: I0219 22:10:25.503578 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d37836f-dda1-46ea-8bd3-46b3d1e40115-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 22:10:25 crc kubenswrapper[4795]: I0219 22:10:25.758824 4795 generic.go:334] "Generic (PLEG): container finished" podID="7d37836f-dda1-46ea-8bd3-46b3d1e40115" containerID="0b8a4c25ac1794a48e077962e5f430b0a67013ac3a2f8e66d1ac424739506f6e" exitCode=0 Feb 19 22:10:25 crc kubenswrapper[4795]: I0219 22:10:25.758881 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ngq7m" event={"ID":"7d37836f-dda1-46ea-8bd3-46b3d1e40115","Type":"ContainerDied","Data":"0b8a4c25ac1794a48e077962e5f430b0a67013ac3a2f8e66d1ac424739506f6e"} Feb 19 22:10:25 crc kubenswrapper[4795]: I0219 22:10:25.758897 4795 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ngq7m" Feb 19 22:10:25 crc kubenswrapper[4795]: I0219 22:10:25.758916 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ngq7m" event={"ID":"7d37836f-dda1-46ea-8bd3-46b3d1e40115","Type":"ContainerDied","Data":"46c880d5e4e646af453acd759e46c0247571e60c4ac2e1e243dece3a82b42d6d"} Feb 19 22:10:25 crc kubenswrapper[4795]: I0219 22:10:25.758942 4795 scope.go:117] "RemoveContainer" containerID="0b8a4c25ac1794a48e077962e5f430b0a67013ac3a2f8e66d1ac424739506f6e" Feb 19 22:10:25 crc kubenswrapper[4795]: I0219 22:10:25.791439 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ngq7m"] Feb 19 22:10:25 crc kubenswrapper[4795]: I0219 22:10:25.800476 4795 scope.go:117] "RemoveContainer" containerID="ae775cbe4f577ef72803181a6ba2b4bab0a5cc7b7f8d76afd9f73fc2df207a43" Feb 19 22:10:25 crc kubenswrapper[4795]: I0219 22:10:25.802910 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ngq7m"] Feb 19 22:10:25 crc kubenswrapper[4795]: I0219 22:10:25.831485 4795 scope.go:117] "RemoveContainer" containerID="0f19328301f5eb25ceb7841e018e2094ea8c1ef435e6b04075ef5e41263da0fb" Feb 19 22:10:25 crc kubenswrapper[4795]: I0219 22:10:25.844963 4795 scope.go:117] "RemoveContainer" containerID="0b8a4c25ac1794a48e077962e5f430b0a67013ac3a2f8e66d1ac424739506f6e" Feb 19 22:10:25 crc kubenswrapper[4795]: E0219 22:10:25.845430 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b8a4c25ac1794a48e077962e5f430b0a67013ac3a2f8e66d1ac424739506f6e\": container with ID starting with 0b8a4c25ac1794a48e077962e5f430b0a67013ac3a2f8e66d1ac424739506f6e not found: ID does not exist" containerID="0b8a4c25ac1794a48e077962e5f430b0a67013ac3a2f8e66d1ac424739506f6e" Feb 19 22:10:25 crc kubenswrapper[4795]: I0219 22:10:25.845482 4795 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b8a4c25ac1794a48e077962e5f430b0a67013ac3a2f8e66d1ac424739506f6e"} err="failed to get container status \"0b8a4c25ac1794a48e077962e5f430b0a67013ac3a2f8e66d1ac424739506f6e\": rpc error: code = NotFound desc = could not find container \"0b8a4c25ac1794a48e077962e5f430b0a67013ac3a2f8e66d1ac424739506f6e\": container with ID starting with 0b8a4c25ac1794a48e077962e5f430b0a67013ac3a2f8e66d1ac424739506f6e not found: ID does not exist" Feb 19 22:10:25 crc kubenswrapper[4795]: I0219 22:10:25.845507 4795 scope.go:117] "RemoveContainer" containerID="ae775cbe4f577ef72803181a6ba2b4bab0a5cc7b7f8d76afd9f73fc2df207a43" Feb 19 22:10:25 crc kubenswrapper[4795]: E0219 22:10:25.845906 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae775cbe4f577ef72803181a6ba2b4bab0a5cc7b7f8d76afd9f73fc2df207a43\": container with ID starting with ae775cbe4f577ef72803181a6ba2b4bab0a5cc7b7f8d76afd9f73fc2df207a43 not found: ID does not exist" containerID="ae775cbe4f577ef72803181a6ba2b4bab0a5cc7b7f8d76afd9f73fc2df207a43" Feb 19 22:10:25 crc kubenswrapper[4795]: I0219 22:10:25.845965 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae775cbe4f577ef72803181a6ba2b4bab0a5cc7b7f8d76afd9f73fc2df207a43"} err="failed to get container status \"ae775cbe4f577ef72803181a6ba2b4bab0a5cc7b7f8d76afd9f73fc2df207a43\": rpc error: code = NotFound desc = could not find container \"ae775cbe4f577ef72803181a6ba2b4bab0a5cc7b7f8d76afd9f73fc2df207a43\": container with ID starting with ae775cbe4f577ef72803181a6ba2b4bab0a5cc7b7f8d76afd9f73fc2df207a43 not found: ID does not exist" Feb 19 22:10:25 crc kubenswrapper[4795]: I0219 22:10:25.845997 4795 scope.go:117] "RemoveContainer" containerID="0f19328301f5eb25ceb7841e018e2094ea8c1ef435e6b04075ef5e41263da0fb" Feb 19 22:10:25 crc kubenswrapper[4795]: E0219 
22:10:25.846335 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f19328301f5eb25ceb7841e018e2094ea8c1ef435e6b04075ef5e41263da0fb\": container with ID starting with 0f19328301f5eb25ceb7841e018e2094ea8c1ef435e6b04075ef5e41263da0fb not found: ID does not exist" containerID="0f19328301f5eb25ceb7841e018e2094ea8c1ef435e6b04075ef5e41263da0fb" Feb 19 22:10:25 crc kubenswrapper[4795]: I0219 22:10:25.846363 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f19328301f5eb25ceb7841e018e2094ea8c1ef435e6b04075ef5e41263da0fb"} err="failed to get container status \"0f19328301f5eb25ceb7841e018e2094ea8c1ef435e6b04075ef5e41263da0fb\": rpc error: code = NotFound desc = could not find container \"0f19328301f5eb25ceb7841e018e2094ea8c1ef435e6b04075ef5e41263da0fb\": container with ID starting with 0f19328301f5eb25ceb7841e018e2094ea8c1ef435e6b04075ef5e41263da0fb not found: ID does not exist" Feb 19 22:10:27 crc kubenswrapper[4795]: I0219 22:10:27.523145 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d37836f-dda1-46ea-8bd3-46b3d1e40115" path="/var/lib/kubelet/pods/7d37836f-dda1-46ea-8bd3-46b3d1e40115/volumes" Feb 19 22:10:36 crc kubenswrapper[4795]: I0219 22:10:36.621034 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ctpsf"] Feb 19 22:10:36 crc kubenswrapper[4795]: E0219 22:10:36.622055 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d37836f-dda1-46ea-8bd3-46b3d1e40115" containerName="extract-utilities" Feb 19 22:10:36 crc kubenswrapper[4795]: I0219 22:10:36.622075 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d37836f-dda1-46ea-8bd3-46b3d1e40115" containerName="extract-utilities" Feb 19 22:10:36 crc kubenswrapper[4795]: E0219 22:10:36.622097 4795 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7d37836f-dda1-46ea-8bd3-46b3d1e40115" containerName="extract-content" Feb 19 22:10:36 crc kubenswrapper[4795]: I0219 22:10:36.622106 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d37836f-dda1-46ea-8bd3-46b3d1e40115" containerName="extract-content" Feb 19 22:10:36 crc kubenswrapper[4795]: E0219 22:10:36.622126 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d37836f-dda1-46ea-8bd3-46b3d1e40115" containerName="registry-server" Feb 19 22:10:36 crc kubenswrapper[4795]: I0219 22:10:36.622134 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d37836f-dda1-46ea-8bd3-46b3d1e40115" containerName="registry-server" Feb 19 22:10:36 crc kubenswrapper[4795]: I0219 22:10:36.622322 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d37836f-dda1-46ea-8bd3-46b3d1e40115" containerName="registry-server" Feb 19 22:10:36 crc kubenswrapper[4795]: I0219 22:10:36.623572 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ctpsf" Feb 19 22:10:36 crc kubenswrapper[4795]: I0219 22:10:36.637688 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ctpsf"] Feb 19 22:10:36 crc kubenswrapper[4795]: I0219 22:10:36.763614 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7zms\" (UniqueName: \"kubernetes.io/projected/9215f5c9-f305-43b3-8e82-902a494f07d9-kube-api-access-w7zms\") pod \"redhat-marketplace-ctpsf\" (UID: \"9215f5c9-f305-43b3-8e82-902a494f07d9\") " pod="openshift-marketplace/redhat-marketplace-ctpsf" Feb 19 22:10:36 crc kubenswrapper[4795]: I0219 22:10:36.763904 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9215f5c9-f305-43b3-8e82-902a494f07d9-utilities\") pod \"redhat-marketplace-ctpsf\" (UID: 
\"9215f5c9-f305-43b3-8e82-902a494f07d9\") " pod="openshift-marketplace/redhat-marketplace-ctpsf" Feb 19 22:10:36 crc kubenswrapper[4795]: I0219 22:10:36.764005 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9215f5c9-f305-43b3-8e82-902a494f07d9-catalog-content\") pod \"redhat-marketplace-ctpsf\" (UID: \"9215f5c9-f305-43b3-8e82-902a494f07d9\") " pod="openshift-marketplace/redhat-marketplace-ctpsf" Feb 19 22:10:36 crc kubenswrapper[4795]: I0219 22:10:36.865812 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7zms\" (UniqueName: \"kubernetes.io/projected/9215f5c9-f305-43b3-8e82-902a494f07d9-kube-api-access-w7zms\") pod \"redhat-marketplace-ctpsf\" (UID: \"9215f5c9-f305-43b3-8e82-902a494f07d9\") " pod="openshift-marketplace/redhat-marketplace-ctpsf" Feb 19 22:10:36 crc kubenswrapper[4795]: I0219 22:10:36.866114 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9215f5c9-f305-43b3-8e82-902a494f07d9-utilities\") pod \"redhat-marketplace-ctpsf\" (UID: \"9215f5c9-f305-43b3-8e82-902a494f07d9\") " pod="openshift-marketplace/redhat-marketplace-ctpsf" Feb 19 22:10:36 crc kubenswrapper[4795]: I0219 22:10:36.866275 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9215f5c9-f305-43b3-8e82-902a494f07d9-catalog-content\") pod \"redhat-marketplace-ctpsf\" (UID: \"9215f5c9-f305-43b3-8e82-902a494f07d9\") " pod="openshift-marketplace/redhat-marketplace-ctpsf" Feb 19 22:10:36 crc kubenswrapper[4795]: I0219 22:10:36.866820 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9215f5c9-f305-43b3-8e82-902a494f07d9-utilities\") pod \"redhat-marketplace-ctpsf\" (UID: 
\"9215f5c9-f305-43b3-8e82-902a494f07d9\") " pod="openshift-marketplace/redhat-marketplace-ctpsf" Feb 19 22:10:36 crc kubenswrapper[4795]: I0219 22:10:36.866919 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9215f5c9-f305-43b3-8e82-902a494f07d9-catalog-content\") pod \"redhat-marketplace-ctpsf\" (UID: \"9215f5c9-f305-43b3-8e82-902a494f07d9\") " pod="openshift-marketplace/redhat-marketplace-ctpsf" Feb 19 22:10:36 crc kubenswrapper[4795]: I0219 22:10:36.897319 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7zms\" (UniqueName: \"kubernetes.io/projected/9215f5c9-f305-43b3-8e82-902a494f07d9-kube-api-access-w7zms\") pod \"redhat-marketplace-ctpsf\" (UID: \"9215f5c9-f305-43b3-8e82-902a494f07d9\") " pod="openshift-marketplace/redhat-marketplace-ctpsf" Feb 19 22:10:36 crc kubenswrapper[4795]: I0219 22:10:36.940874 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ctpsf" Feb 19 22:10:37 crc kubenswrapper[4795]: I0219 22:10:37.154046 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ctpsf"] Feb 19 22:10:37 crc kubenswrapper[4795]: I0219 22:10:37.860279 4795 generic.go:334] "Generic (PLEG): container finished" podID="9215f5c9-f305-43b3-8e82-902a494f07d9" containerID="9642bb6b6fd3fddface433a98f740926eb45e711ae4821ecbeae7f50f1f18225" exitCode=0 Feb 19 22:10:37 crc kubenswrapper[4795]: I0219 22:10:37.860326 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ctpsf" event={"ID":"9215f5c9-f305-43b3-8e82-902a494f07d9","Type":"ContainerDied","Data":"9642bb6b6fd3fddface433a98f740926eb45e711ae4821ecbeae7f50f1f18225"} Feb 19 22:10:37 crc kubenswrapper[4795]: I0219 22:10:37.860360 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ctpsf" 
event={"ID":"9215f5c9-f305-43b3-8e82-902a494f07d9","Type":"ContainerStarted","Data":"c62b34ad6fb32b473e03faa814eef441291609a20f868aa6c180ac55b5240145"} Feb 19 22:10:39 crc kubenswrapper[4795]: I0219 22:10:39.875958 4795 generic.go:334] "Generic (PLEG): container finished" podID="9215f5c9-f305-43b3-8e82-902a494f07d9" containerID="81ad6355095939f2fc302dfa0385b5cf46b203abd83546d96e07b8003f8ff0d7" exitCode=0 Feb 19 22:10:39 crc kubenswrapper[4795]: I0219 22:10:39.876028 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ctpsf" event={"ID":"9215f5c9-f305-43b3-8e82-902a494f07d9","Type":"ContainerDied","Data":"81ad6355095939f2fc302dfa0385b5cf46b203abd83546d96e07b8003f8ff0d7"} Feb 19 22:10:40 crc kubenswrapper[4795]: I0219 22:10:40.884899 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ctpsf" event={"ID":"9215f5c9-f305-43b3-8e82-902a494f07d9","Type":"ContainerStarted","Data":"bd2ce00fe78a824703dd1a02537a427d4a590b0beebf0fc2f40bee35018e5f7e"} Feb 19 22:10:40 crc kubenswrapper[4795]: I0219 22:10:40.911640 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ctpsf" podStartSLOduration=2.522363009 podStartE2EDuration="4.911620683s" podCreationTimestamp="2026-02-19 22:10:36 +0000 UTC" firstStartedPulling="2026-02-19 22:10:37.861869188 +0000 UTC m=+2549.054387062" lastFinishedPulling="2026-02-19 22:10:40.251126872 +0000 UTC m=+2551.443644736" observedRunningTime="2026-02-19 22:10:40.90269146 +0000 UTC m=+2552.095209324" watchObservedRunningTime="2026-02-19 22:10:40.911620683 +0000 UTC m=+2552.104138547" Feb 19 22:10:46 crc kubenswrapper[4795]: I0219 22:10:46.941538 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ctpsf" Feb 19 22:10:46 crc kubenswrapper[4795]: I0219 22:10:46.942021 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ctpsf" Feb 19 22:10:46 crc kubenswrapper[4795]: I0219 22:10:46.988010 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ctpsf" Feb 19 22:10:47 crc kubenswrapper[4795]: I0219 22:10:47.984857 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ctpsf" Feb 19 22:10:48 crc kubenswrapper[4795]: I0219 22:10:48.027509 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ctpsf"] Feb 19 22:10:49 crc kubenswrapper[4795]: I0219 22:10:49.943037 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ctpsf" podUID="9215f5c9-f305-43b3-8e82-902a494f07d9" containerName="registry-server" containerID="cri-o://bd2ce00fe78a824703dd1a02537a427d4a590b0beebf0fc2f40bee35018e5f7e" gracePeriod=2 Feb 19 22:10:50 crc kubenswrapper[4795]: I0219 22:10:50.408084 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ctpsf" Feb 19 22:10:50 crc kubenswrapper[4795]: I0219 22:10:50.573663 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9215f5c9-f305-43b3-8e82-902a494f07d9-utilities\") pod \"9215f5c9-f305-43b3-8e82-902a494f07d9\" (UID: \"9215f5c9-f305-43b3-8e82-902a494f07d9\") " Feb 19 22:10:50 crc kubenswrapper[4795]: I0219 22:10:50.573734 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7zms\" (UniqueName: \"kubernetes.io/projected/9215f5c9-f305-43b3-8e82-902a494f07d9-kube-api-access-w7zms\") pod \"9215f5c9-f305-43b3-8e82-902a494f07d9\" (UID: \"9215f5c9-f305-43b3-8e82-902a494f07d9\") " Feb 19 22:10:50 crc kubenswrapper[4795]: I0219 22:10:50.573782 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9215f5c9-f305-43b3-8e82-902a494f07d9-catalog-content\") pod \"9215f5c9-f305-43b3-8e82-902a494f07d9\" (UID: \"9215f5c9-f305-43b3-8e82-902a494f07d9\") " Feb 19 22:10:50 crc kubenswrapper[4795]: I0219 22:10:50.574663 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9215f5c9-f305-43b3-8e82-902a494f07d9-utilities" (OuterVolumeSpecName: "utilities") pod "9215f5c9-f305-43b3-8e82-902a494f07d9" (UID: "9215f5c9-f305-43b3-8e82-902a494f07d9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:10:50 crc kubenswrapper[4795]: I0219 22:10:50.582082 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9215f5c9-f305-43b3-8e82-902a494f07d9-kube-api-access-w7zms" (OuterVolumeSpecName: "kube-api-access-w7zms") pod "9215f5c9-f305-43b3-8e82-902a494f07d9" (UID: "9215f5c9-f305-43b3-8e82-902a494f07d9"). InnerVolumeSpecName "kube-api-access-w7zms". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:10:50 crc kubenswrapper[4795]: I0219 22:10:50.616806 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9215f5c9-f305-43b3-8e82-902a494f07d9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9215f5c9-f305-43b3-8e82-902a494f07d9" (UID: "9215f5c9-f305-43b3-8e82-902a494f07d9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:10:50 crc kubenswrapper[4795]: I0219 22:10:50.675149 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9215f5c9-f305-43b3-8e82-902a494f07d9-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 22:10:50 crc kubenswrapper[4795]: I0219 22:10:50.675259 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7zms\" (UniqueName: \"kubernetes.io/projected/9215f5c9-f305-43b3-8e82-902a494f07d9-kube-api-access-w7zms\") on node \"crc\" DevicePath \"\"" Feb 19 22:10:50 crc kubenswrapper[4795]: I0219 22:10:50.675279 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9215f5c9-f305-43b3-8e82-902a494f07d9-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 22:10:50 crc kubenswrapper[4795]: I0219 22:10:50.953733 4795 generic.go:334] "Generic (PLEG): container finished" podID="9215f5c9-f305-43b3-8e82-902a494f07d9" containerID="bd2ce00fe78a824703dd1a02537a427d4a590b0beebf0fc2f40bee35018e5f7e" exitCode=0 Feb 19 22:10:50 crc kubenswrapper[4795]: I0219 22:10:50.953797 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ctpsf" event={"ID":"9215f5c9-f305-43b3-8e82-902a494f07d9","Type":"ContainerDied","Data":"bd2ce00fe78a824703dd1a02537a427d4a590b0beebf0fc2f40bee35018e5f7e"} Feb 19 22:10:50 crc kubenswrapper[4795]: I0219 22:10:50.953811 4795 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ctpsf" Feb 19 22:10:50 crc kubenswrapper[4795]: I0219 22:10:50.953851 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ctpsf" event={"ID":"9215f5c9-f305-43b3-8e82-902a494f07d9","Type":"ContainerDied","Data":"c62b34ad6fb32b473e03faa814eef441291609a20f868aa6c180ac55b5240145"} Feb 19 22:10:50 crc kubenswrapper[4795]: I0219 22:10:50.953938 4795 scope.go:117] "RemoveContainer" containerID="bd2ce00fe78a824703dd1a02537a427d4a590b0beebf0fc2f40bee35018e5f7e" Feb 19 22:10:50 crc kubenswrapper[4795]: I0219 22:10:50.992433 4795 scope.go:117] "RemoveContainer" containerID="81ad6355095939f2fc302dfa0385b5cf46b203abd83546d96e07b8003f8ff0d7" Feb 19 22:10:50 crc kubenswrapper[4795]: I0219 22:10:50.993546 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ctpsf"] Feb 19 22:10:51 crc kubenswrapper[4795]: I0219 22:10:51.018736 4795 scope.go:117] "RemoveContainer" containerID="9642bb6b6fd3fddface433a98f740926eb45e711ae4821ecbeae7f50f1f18225" Feb 19 22:10:51 crc kubenswrapper[4795]: I0219 22:10:51.027816 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ctpsf"] Feb 19 22:10:51 crc kubenswrapper[4795]: I0219 22:10:51.050512 4795 scope.go:117] "RemoveContainer" containerID="bd2ce00fe78a824703dd1a02537a427d4a590b0beebf0fc2f40bee35018e5f7e" Feb 19 22:10:51 crc kubenswrapper[4795]: E0219 22:10:51.051193 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd2ce00fe78a824703dd1a02537a427d4a590b0beebf0fc2f40bee35018e5f7e\": container with ID starting with bd2ce00fe78a824703dd1a02537a427d4a590b0beebf0fc2f40bee35018e5f7e not found: ID does not exist" containerID="bd2ce00fe78a824703dd1a02537a427d4a590b0beebf0fc2f40bee35018e5f7e" Feb 19 22:10:51 crc kubenswrapper[4795]: I0219 22:10:51.051246 4795 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd2ce00fe78a824703dd1a02537a427d4a590b0beebf0fc2f40bee35018e5f7e"} err="failed to get container status \"bd2ce00fe78a824703dd1a02537a427d4a590b0beebf0fc2f40bee35018e5f7e\": rpc error: code = NotFound desc = could not find container \"bd2ce00fe78a824703dd1a02537a427d4a590b0beebf0fc2f40bee35018e5f7e\": container with ID starting with bd2ce00fe78a824703dd1a02537a427d4a590b0beebf0fc2f40bee35018e5f7e not found: ID does not exist" Feb 19 22:10:51 crc kubenswrapper[4795]: I0219 22:10:51.051276 4795 scope.go:117] "RemoveContainer" containerID="81ad6355095939f2fc302dfa0385b5cf46b203abd83546d96e07b8003f8ff0d7" Feb 19 22:10:51 crc kubenswrapper[4795]: E0219 22:10:51.051729 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81ad6355095939f2fc302dfa0385b5cf46b203abd83546d96e07b8003f8ff0d7\": container with ID starting with 81ad6355095939f2fc302dfa0385b5cf46b203abd83546d96e07b8003f8ff0d7 not found: ID does not exist" containerID="81ad6355095939f2fc302dfa0385b5cf46b203abd83546d96e07b8003f8ff0d7" Feb 19 22:10:51 crc kubenswrapper[4795]: I0219 22:10:51.051767 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81ad6355095939f2fc302dfa0385b5cf46b203abd83546d96e07b8003f8ff0d7"} err="failed to get container status \"81ad6355095939f2fc302dfa0385b5cf46b203abd83546d96e07b8003f8ff0d7\": rpc error: code = NotFound desc = could not find container \"81ad6355095939f2fc302dfa0385b5cf46b203abd83546d96e07b8003f8ff0d7\": container with ID starting with 81ad6355095939f2fc302dfa0385b5cf46b203abd83546d96e07b8003f8ff0d7 not found: ID does not exist" Feb 19 22:10:51 crc kubenswrapper[4795]: I0219 22:10:51.051791 4795 scope.go:117] "RemoveContainer" containerID="9642bb6b6fd3fddface433a98f740926eb45e711ae4821ecbeae7f50f1f18225" Feb 19 22:10:51 crc kubenswrapper[4795]: E0219 
22:10:51.052107 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9642bb6b6fd3fddface433a98f740926eb45e711ae4821ecbeae7f50f1f18225\": container with ID starting with 9642bb6b6fd3fddface433a98f740926eb45e711ae4821ecbeae7f50f1f18225 not found: ID does not exist" containerID="9642bb6b6fd3fddface433a98f740926eb45e711ae4821ecbeae7f50f1f18225" Feb 19 22:10:51 crc kubenswrapper[4795]: I0219 22:10:51.052137 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9642bb6b6fd3fddface433a98f740926eb45e711ae4821ecbeae7f50f1f18225"} err="failed to get container status \"9642bb6b6fd3fddface433a98f740926eb45e711ae4821ecbeae7f50f1f18225\": rpc error: code = NotFound desc = could not find container \"9642bb6b6fd3fddface433a98f740926eb45e711ae4821ecbeae7f50f1f18225\": container with ID starting with 9642bb6b6fd3fddface433a98f740926eb45e711ae4821ecbeae7f50f1f18225 not found: ID does not exist" Feb 19 22:10:51 crc kubenswrapper[4795]: I0219 22:10:51.522072 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9215f5c9-f305-43b3-8e82-902a494f07d9" path="/var/lib/kubelet/pods/9215f5c9-f305-43b3-8e82-902a494f07d9/volumes" Feb 19 22:11:23 crc kubenswrapper[4795]: I0219 22:11:23.872283 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-x7hcn"] Feb 19 22:11:23 crc kubenswrapper[4795]: E0219 22:11:23.873355 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9215f5c9-f305-43b3-8e82-902a494f07d9" containerName="extract-utilities" Feb 19 22:11:23 crc kubenswrapper[4795]: I0219 22:11:23.873391 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9215f5c9-f305-43b3-8e82-902a494f07d9" containerName="extract-utilities" Feb 19 22:11:23 crc kubenswrapper[4795]: E0219 22:11:23.873403 4795 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9215f5c9-f305-43b3-8e82-902a494f07d9" containerName="extract-content" Feb 19 22:11:23 crc kubenswrapper[4795]: I0219 22:11:23.873410 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9215f5c9-f305-43b3-8e82-902a494f07d9" containerName="extract-content" Feb 19 22:11:23 crc kubenswrapper[4795]: E0219 22:11:23.873425 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9215f5c9-f305-43b3-8e82-902a494f07d9" containerName="registry-server" Feb 19 22:11:23 crc kubenswrapper[4795]: I0219 22:11:23.873431 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9215f5c9-f305-43b3-8e82-902a494f07d9" containerName="registry-server" Feb 19 22:11:23 crc kubenswrapper[4795]: I0219 22:11:23.873558 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="9215f5c9-f305-43b3-8e82-902a494f07d9" containerName="registry-server" Feb 19 22:11:23 crc kubenswrapper[4795]: I0219 22:11:23.874619 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-x7hcn" Feb 19 22:11:23 crc kubenswrapper[4795]: I0219 22:11:23.894961 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-x7hcn"] Feb 19 22:11:24 crc kubenswrapper[4795]: I0219 22:11:24.021701 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5828c8e-eae0-448e-882d-fc02dc4ec6bb-catalog-content\") pod \"community-operators-x7hcn\" (UID: \"e5828c8e-eae0-448e-882d-fc02dc4ec6bb\") " pod="openshift-marketplace/community-operators-x7hcn" Feb 19 22:11:24 crc kubenswrapper[4795]: I0219 22:11:24.021750 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5828c8e-eae0-448e-882d-fc02dc4ec6bb-utilities\") pod \"community-operators-x7hcn\" (UID: \"e5828c8e-eae0-448e-882d-fc02dc4ec6bb\") 
" pod="openshift-marketplace/community-operators-x7hcn" Feb 19 22:11:24 crc kubenswrapper[4795]: I0219 22:11:24.021881 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6774\" (UniqueName: \"kubernetes.io/projected/e5828c8e-eae0-448e-882d-fc02dc4ec6bb-kube-api-access-p6774\") pod \"community-operators-x7hcn\" (UID: \"e5828c8e-eae0-448e-882d-fc02dc4ec6bb\") " pod="openshift-marketplace/community-operators-x7hcn" Feb 19 22:11:24 crc kubenswrapper[4795]: I0219 22:11:24.123267 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5828c8e-eae0-448e-882d-fc02dc4ec6bb-catalog-content\") pod \"community-operators-x7hcn\" (UID: \"e5828c8e-eae0-448e-882d-fc02dc4ec6bb\") " pod="openshift-marketplace/community-operators-x7hcn" Feb 19 22:11:24 crc kubenswrapper[4795]: I0219 22:11:24.123561 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5828c8e-eae0-448e-882d-fc02dc4ec6bb-utilities\") pod \"community-operators-x7hcn\" (UID: \"e5828c8e-eae0-448e-882d-fc02dc4ec6bb\") " pod="openshift-marketplace/community-operators-x7hcn" Feb 19 22:11:24 crc kubenswrapper[4795]: I0219 22:11:24.123686 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6774\" (UniqueName: \"kubernetes.io/projected/e5828c8e-eae0-448e-882d-fc02dc4ec6bb-kube-api-access-p6774\") pod \"community-operators-x7hcn\" (UID: \"e5828c8e-eae0-448e-882d-fc02dc4ec6bb\") " pod="openshift-marketplace/community-operators-x7hcn" Feb 19 22:11:24 crc kubenswrapper[4795]: I0219 22:11:24.123882 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5828c8e-eae0-448e-882d-fc02dc4ec6bb-catalog-content\") pod \"community-operators-x7hcn\" (UID: 
\"e5828c8e-eae0-448e-882d-fc02dc4ec6bb\") " pod="openshift-marketplace/community-operators-x7hcn" Feb 19 22:11:24 crc kubenswrapper[4795]: I0219 22:11:24.124054 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5828c8e-eae0-448e-882d-fc02dc4ec6bb-utilities\") pod \"community-operators-x7hcn\" (UID: \"e5828c8e-eae0-448e-882d-fc02dc4ec6bb\") " pod="openshift-marketplace/community-operators-x7hcn" Feb 19 22:11:24 crc kubenswrapper[4795]: I0219 22:11:24.146405 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6774\" (UniqueName: \"kubernetes.io/projected/e5828c8e-eae0-448e-882d-fc02dc4ec6bb-kube-api-access-p6774\") pod \"community-operators-x7hcn\" (UID: \"e5828c8e-eae0-448e-882d-fc02dc4ec6bb\") " pod="openshift-marketplace/community-operators-x7hcn" Feb 19 22:11:24 crc kubenswrapper[4795]: I0219 22:11:24.202151 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-x7hcn" Feb 19 22:11:24 crc kubenswrapper[4795]: I0219 22:11:24.683009 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-x7hcn"] Feb 19 22:11:25 crc kubenswrapper[4795]: E0219 22:11:25.463338 4795 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5828c8e_eae0_448e_882d_fc02dc4ec6bb.slice/crio-conmon-0f132ae36e43ad77380b97912b0ea58ef69b6271ceadb36c722ba86b44b0b973.scope\": RecentStats: unable to find data in memory cache]" Feb 19 22:11:25 crc kubenswrapper[4795]: I0219 22:11:25.490192 4795 generic.go:334] "Generic (PLEG): container finished" podID="e5828c8e-eae0-448e-882d-fc02dc4ec6bb" containerID="0f132ae36e43ad77380b97912b0ea58ef69b6271ceadb36c722ba86b44b0b973" exitCode=0 Feb 19 22:11:25 crc kubenswrapper[4795]: I0219 22:11:25.490240 4795 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x7hcn" event={"ID":"e5828c8e-eae0-448e-882d-fc02dc4ec6bb","Type":"ContainerDied","Data":"0f132ae36e43ad77380b97912b0ea58ef69b6271ceadb36c722ba86b44b0b973"} Feb 19 22:11:25 crc kubenswrapper[4795]: I0219 22:11:25.490270 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x7hcn" event={"ID":"e5828c8e-eae0-448e-882d-fc02dc4ec6bb","Type":"ContainerStarted","Data":"151baa46fb777750b92c9310f491b67b5f5cb18c83876a635d6951907b4bc717"} Feb 19 22:11:27 crc kubenswrapper[4795]: I0219 22:11:27.957722 4795 patch_prober.go:28] interesting pod/route-controller-manager-84d5f88f56-vjpj2 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.62:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 19 22:11:27 crc kubenswrapper[4795]: I0219 22:11:27.958132 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-84d5f88f56-vjpj2" podUID="8e80efae-f1ac-40f9-ad38-61dc2821499e" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.62:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 19 22:11:27 crc kubenswrapper[4795]: I0219 22:11:27.972009 4795 patch_prober.go:28] interesting pod/route-controller-manager-84d5f88f56-vjpj2 container/route-controller-manager namespace/openshift-route-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.62:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 19 22:11:27 crc kubenswrapper[4795]: I0219 22:11:27.972066 4795 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-route-controller-manager/route-controller-manager-84d5f88f56-vjpj2" podUID="8e80efae-f1ac-40f9-ad38-61dc2821499e" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.62:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 19 22:11:30 crc kubenswrapper[4795]: I0219 22:11:30.188383 4795 generic.go:334] "Generic (PLEG): container finished" podID="e5828c8e-eae0-448e-882d-fc02dc4ec6bb" containerID="0ce80f2be267d708c1f1a4d0b961a7253a7f1358ae9be8f0d714ff3ea01023d2" exitCode=0 Feb 19 22:11:30 crc kubenswrapper[4795]: I0219 22:11:30.188469 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x7hcn" event={"ID":"e5828c8e-eae0-448e-882d-fc02dc4ec6bb","Type":"ContainerDied","Data":"0ce80f2be267d708c1f1a4d0b961a7253a7f1358ae9be8f0d714ff3ea01023d2"} Feb 19 22:11:31 crc kubenswrapper[4795]: I0219 22:11:31.198424 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x7hcn" event={"ID":"e5828c8e-eae0-448e-882d-fc02dc4ec6bb","Type":"ContainerStarted","Data":"dbfbe6cde84e5fb7cb2cd7fde30ec03735fe87ac29ed54b70f882321443f8013"} Feb 19 22:11:31 crc kubenswrapper[4795]: I0219 22:11:31.228306 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-x7hcn" podStartSLOduration=3.168102005 podStartE2EDuration="8.228280534s" podCreationTimestamp="2026-02-19 22:11:23 +0000 UTC" firstStartedPulling="2026-02-19 22:11:25.492439423 +0000 UTC m=+2596.684957287" lastFinishedPulling="2026-02-19 22:11:30.552617942 +0000 UTC m=+2601.745135816" observedRunningTime="2026-02-19 22:11:31.224948129 +0000 UTC m=+2602.417466023" watchObservedRunningTime="2026-02-19 22:11:31.228280534 +0000 UTC m=+2602.420798428" Feb 19 22:11:34 crc kubenswrapper[4795]: I0219 22:11:34.202271 4795 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-x7hcn" Feb 19 22:11:34 crc kubenswrapper[4795]: I0219 22:11:34.202646 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-x7hcn" Feb 19 22:11:34 crc kubenswrapper[4795]: I0219 22:11:34.263681 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-x7hcn" Feb 19 22:11:44 crc kubenswrapper[4795]: I0219 22:11:44.251285 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-x7hcn" Feb 19 22:11:44 crc kubenswrapper[4795]: I0219 22:11:44.314826 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-x7hcn"] Feb 19 22:11:44 crc kubenswrapper[4795]: I0219 22:11:44.328719 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-x7hcn" podUID="e5828c8e-eae0-448e-882d-fc02dc4ec6bb" containerName="registry-server" containerID="cri-o://dbfbe6cde84e5fb7cb2cd7fde30ec03735fe87ac29ed54b70f882321443f8013" gracePeriod=2 Feb 19 22:11:44 crc kubenswrapper[4795]: I0219 22:11:44.743693 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-x7hcn" Feb 19 22:11:44 crc kubenswrapper[4795]: I0219 22:11:44.891434 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5828c8e-eae0-448e-882d-fc02dc4ec6bb-catalog-content\") pod \"e5828c8e-eae0-448e-882d-fc02dc4ec6bb\" (UID: \"e5828c8e-eae0-448e-882d-fc02dc4ec6bb\") " Feb 19 22:11:44 crc kubenswrapper[4795]: I0219 22:11:44.891503 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6774\" (UniqueName: \"kubernetes.io/projected/e5828c8e-eae0-448e-882d-fc02dc4ec6bb-kube-api-access-p6774\") pod \"e5828c8e-eae0-448e-882d-fc02dc4ec6bb\" (UID: \"e5828c8e-eae0-448e-882d-fc02dc4ec6bb\") " Feb 19 22:11:44 crc kubenswrapper[4795]: I0219 22:11:44.891539 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5828c8e-eae0-448e-882d-fc02dc4ec6bb-utilities\") pod \"e5828c8e-eae0-448e-882d-fc02dc4ec6bb\" (UID: \"e5828c8e-eae0-448e-882d-fc02dc4ec6bb\") " Feb 19 22:11:44 crc kubenswrapper[4795]: I0219 22:11:44.892603 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5828c8e-eae0-448e-882d-fc02dc4ec6bb-utilities" (OuterVolumeSpecName: "utilities") pod "e5828c8e-eae0-448e-882d-fc02dc4ec6bb" (UID: "e5828c8e-eae0-448e-882d-fc02dc4ec6bb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:11:44 crc kubenswrapper[4795]: I0219 22:11:44.897261 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5828c8e-eae0-448e-882d-fc02dc4ec6bb-kube-api-access-p6774" (OuterVolumeSpecName: "kube-api-access-p6774") pod "e5828c8e-eae0-448e-882d-fc02dc4ec6bb" (UID: "e5828c8e-eae0-448e-882d-fc02dc4ec6bb"). InnerVolumeSpecName "kube-api-access-p6774". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:11:44 crc kubenswrapper[4795]: I0219 22:11:44.950901 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5828c8e-eae0-448e-882d-fc02dc4ec6bb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e5828c8e-eae0-448e-882d-fc02dc4ec6bb" (UID: "e5828c8e-eae0-448e-882d-fc02dc4ec6bb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:11:44 crc kubenswrapper[4795]: I0219 22:11:44.993367 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5828c8e-eae0-448e-882d-fc02dc4ec6bb-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 22:11:44 crc kubenswrapper[4795]: I0219 22:11:44.993420 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6774\" (UniqueName: \"kubernetes.io/projected/e5828c8e-eae0-448e-882d-fc02dc4ec6bb-kube-api-access-p6774\") on node \"crc\" DevicePath \"\"" Feb 19 22:11:44 crc kubenswrapper[4795]: I0219 22:11:44.993449 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5828c8e-eae0-448e-882d-fc02dc4ec6bb-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 22:11:45 crc kubenswrapper[4795]: I0219 22:11:45.342289 4795 generic.go:334] "Generic (PLEG): container finished" podID="e5828c8e-eae0-448e-882d-fc02dc4ec6bb" containerID="dbfbe6cde84e5fb7cb2cd7fde30ec03735fe87ac29ed54b70f882321443f8013" exitCode=0 Feb 19 22:11:45 crc kubenswrapper[4795]: I0219 22:11:45.342332 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x7hcn" event={"ID":"e5828c8e-eae0-448e-882d-fc02dc4ec6bb","Type":"ContainerDied","Data":"dbfbe6cde84e5fb7cb2cd7fde30ec03735fe87ac29ed54b70f882321443f8013"} Feb 19 22:11:45 crc kubenswrapper[4795]: I0219 22:11:45.342363 4795 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-x7hcn" event={"ID":"e5828c8e-eae0-448e-882d-fc02dc4ec6bb","Type":"ContainerDied","Data":"151baa46fb777750b92c9310f491b67b5f5cb18c83876a635d6951907b4bc717"} Feb 19 22:11:45 crc kubenswrapper[4795]: I0219 22:11:45.342379 4795 scope.go:117] "RemoveContainer" containerID="dbfbe6cde84e5fb7cb2cd7fde30ec03735fe87ac29ed54b70f882321443f8013" Feb 19 22:11:45 crc kubenswrapper[4795]: I0219 22:11:45.342398 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-x7hcn" Feb 19 22:11:45 crc kubenswrapper[4795]: I0219 22:11:45.374847 4795 scope.go:117] "RemoveContainer" containerID="0ce80f2be267d708c1f1a4d0b961a7253a7f1358ae9be8f0d714ff3ea01023d2" Feb 19 22:11:45 crc kubenswrapper[4795]: I0219 22:11:45.398652 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-x7hcn"] Feb 19 22:11:45 crc kubenswrapper[4795]: I0219 22:11:45.407908 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-x7hcn"] Feb 19 22:11:45 crc kubenswrapper[4795]: I0219 22:11:45.415917 4795 scope.go:117] "RemoveContainer" containerID="0f132ae36e43ad77380b97912b0ea58ef69b6271ceadb36c722ba86b44b0b973" Feb 19 22:11:45 crc kubenswrapper[4795]: I0219 22:11:45.437478 4795 scope.go:117] "RemoveContainer" containerID="dbfbe6cde84e5fb7cb2cd7fde30ec03735fe87ac29ed54b70f882321443f8013" Feb 19 22:11:45 crc kubenswrapper[4795]: E0219 22:11:45.437935 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dbfbe6cde84e5fb7cb2cd7fde30ec03735fe87ac29ed54b70f882321443f8013\": container with ID starting with dbfbe6cde84e5fb7cb2cd7fde30ec03735fe87ac29ed54b70f882321443f8013 not found: ID does not exist" containerID="dbfbe6cde84e5fb7cb2cd7fde30ec03735fe87ac29ed54b70f882321443f8013" Feb 19 22:11:45 crc kubenswrapper[4795]: I0219 
22:11:45.437985 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbfbe6cde84e5fb7cb2cd7fde30ec03735fe87ac29ed54b70f882321443f8013"} err="failed to get container status \"dbfbe6cde84e5fb7cb2cd7fde30ec03735fe87ac29ed54b70f882321443f8013\": rpc error: code = NotFound desc = could not find container \"dbfbe6cde84e5fb7cb2cd7fde30ec03735fe87ac29ed54b70f882321443f8013\": container with ID starting with dbfbe6cde84e5fb7cb2cd7fde30ec03735fe87ac29ed54b70f882321443f8013 not found: ID does not exist" Feb 19 22:11:45 crc kubenswrapper[4795]: I0219 22:11:45.438023 4795 scope.go:117] "RemoveContainer" containerID="0ce80f2be267d708c1f1a4d0b961a7253a7f1358ae9be8f0d714ff3ea01023d2" Feb 19 22:11:45 crc kubenswrapper[4795]: E0219 22:11:45.438389 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ce80f2be267d708c1f1a4d0b961a7253a7f1358ae9be8f0d714ff3ea01023d2\": container with ID starting with 0ce80f2be267d708c1f1a4d0b961a7253a7f1358ae9be8f0d714ff3ea01023d2 not found: ID does not exist" containerID="0ce80f2be267d708c1f1a4d0b961a7253a7f1358ae9be8f0d714ff3ea01023d2" Feb 19 22:11:45 crc kubenswrapper[4795]: I0219 22:11:45.438419 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ce80f2be267d708c1f1a4d0b961a7253a7f1358ae9be8f0d714ff3ea01023d2"} err="failed to get container status \"0ce80f2be267d708c1f1a4d0b961a7253a7f1358ae9be8f0d714ff3ea01023d2\": rpc error: code = NotFound desc = could not find container \"0ce80f2be267d708c1f1a4d0b961a7253a7f1358ae9be8f0d714ff3ea01023d2\": container with ID starting with 0ce80f2be267d708c1f1a4d0b961a7253a7f1358ae9be8f0d714ff3ea01023d2 not found: ID does not exist" Feb 19 22:11:45 crc kubenswrapper[4795]: I0219 22:11:45.438437 4795 scope.go:117] "RemoveContainer" containerID="0f132ae36e43ad77380b97912b0ea58ef69b6271ceadb36c722ba86b44b0b973" Feb 19 22:11:45 crc 
kubenswrapper[4795]: E0219 22:11:45.438755 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f132ae36e43ad77380b97912b0ea58ef69b6271ceadb36c722ba86b44b0b973\": container with ID starting with 0f132ae36e43ad77380b97912b0ea58ef69b6271ceadb36c722ba86b44b0b973 not found: ID does not exist" containerID="0f132ae36e43ad77380b97912b0ea58ef69b6271ceadb36c722ba86b44b0b973" Feb 19 22:11:45 crc kubenswrapper[4795]: I0219 22:11:45.438847 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f132ae36e43ad77380b97912b0ea58ef69b6271ceadb36c722ba86b44b0b973"} err="failed to get container status \"0f132ae36e43ad77380b97912b0ea58ef69b6271ceadb36c722ba86b44b0b973\": rpc error: code = NotFound desc = could not find container \"0f132ae36e43ad77380b97912b0ea58ef69b6271ceadb36c722ba86b44b0b973\": container with ID starting with 0f132ae36e43ad77380b97912b0ea58ef69b6271ceadb36c722ba86b44b0b973 not found: ID does not exist" Feb 19 22:11:45 crc kubenswrapper[4795]: I0219 22:11:45.531810 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5828c8e-eae0-448e-882d-fc02dc4ec6bb" path="/var/lib/kubelet/pods/e5828c8e-eae0-448e-882d-fc02dc4ec6bb/volumes" Feb 19 22:12:28 crc kubenswrapper[4795]: I0219 22:12:28.427828 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 22:12:28 crc kubenswrapper[4795]: I0219 22:12:28.428500 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Feb 19 22:12:58 crc kubenswrapper[4795]: I0219 22:12:58.427635 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 22:12:58 crc kubenswrapper[4795]: I0219 22:12:58.428390 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 22:13:28 crc kubenswrapper[4795]: I0219 22:13:28.427210 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 22:13:28 crc kubenswrapper[4795]: I0219 22:13:28.427827 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 22:13:28 crc kubenswrapper[4795]: I0219 22:13:28.427880 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" Feb 19 22:13:28 crc kubenswrapper[4795]: I0219 22:13:28.428561 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"dae87592e9f8f7d3ce64c7cc5ffb3429ecd2b708a654e97c0a7e99d0113e7b2c"} pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 22:13:28 crc kubenswrapper[4795]: I0219 22:13:28.428630 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" containerID="cri-o://dae87592e9f8f7d3ce64c7cc5ffb3429ecd2b708a654e97c0a7e99d0113e7b2c" gracePeriod=600 Feb 19 22:13:28 crc kubenswrapper[4795]: I0219 22:13:28.928376 4795 generic.go:334] "Generic (PLEG): container finished" podID="7591bc58-96f5-486a-8653-0ad93938b019" containerID="dae87592e9f8f7d3ce64c7cc5ffb3429ecd2b708a654e97c0a7e99d0113e7b2c" exitCode=0 Feb 19 22:13:28 crc kubenswrapper[4795]: I0219 22:13:28.928628 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerDied","Data":"dae87592e9f8f7d3ce64c7cc5ffb3429ecd2b708a654e97c0a7e99d0113e7b2c"} Feb 19 22:13:28 crc kubenswrapper[4795]: I0219 22:13:28.928845 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerStarted","Data":"cb2da55360bfe87a3436787aedede72cb428a9ef06e2974cbb48cbc34918b2bc"} Feb 19 22:13:28 crc kubenswrapper[4795]: I0219 22:13:28.928883 4795 scope.go:117] "RemoveContainer" containerID="329fea1ee73173abce61c9c393865b560be0cf33332fe7ef2554dba15a0850d0" Feb 19 22:14:56 crc kubenswrapper[4795]: I0219 22:14:56.843208 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-98crc"] Feb 19 22:14:56 crc kubenswrapper[4795]: E0219 
22:14:56.844063 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5828c8e-eae0-448e-882d-fc02dc4ec6bb" containerName="extract-utilities" Feb 19 22:14:56 crc kubenswrapper[4795]: I0219 22:14:56.844075 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5828c8e-eae0-448e-882d-fc02dc4ec6bb" containerName="extract-utilities" Feb 19 22:14:56 crc kubenswrapper[4795]: E0219 22:14:56.844090 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5828c8e-eae0-448e-882d-fc02dc4ec6bb" containerName="registry-server" Feb 19 22:14:56 crc kubenswrapper[4795]: I0219 22:14:56.844097 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5828c8e-eae0-448e-882d-fc02dc4ec6bb" containerName="registry-server" Feb 19 22:14:56 crc kubenswrapper[4795]: E0219 22:14:56.844111 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5828c8e-eae0-448e-882d-fc02dc4ec6bb" containerName="extract-content" Feb 19 22:14:56 crc kubenswrapper[4795]: I0219 22:14:56.844117 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5828c8e-eae0-448e-882d-fc02dc4ec6bb" containerName="extract-content" Feb 19 22:14:56 crc kubenswrapper[4795]: I0219 22:14:56.844257 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5828c8e-eae0-448e-882d-fc02dc4ec6bb" containerName="registry-server" Feb 19 22:14:56 crc kubenswrapper[4795]: I0219 22:14:56.845133 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-98crc" Feb 19 22:14:56 crc kubenswrapper[4795]: I0219 22:14:56.859574 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-98crc"] Feb 19 22:14:56 crc kubenswrapper[4795]: I0219 22:14:56.920199 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2acef6a5-277e-40ec-bf10-e7da2131e214-utilities\") pod \"certified-operators-98crc\" (UID: \"2acef6a5-277e-40ec-bf10-e7da2131e214\") " pod="openshift-marketplace/certified-operators-98crc" Feb 19 22:14:56 crc kubenswrapper[4795]: I0219 22:14:56.920579 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2acef6a5-277e-40ec-bf10-e7da2131e214-catalog-content\") pod \"certified-operators-98crc\" (UID: \"2acef6a5-277e-40ec-bf10-e7da2131e214\") " pod="openshift-marketplace/certified-operators-98crc" Feb 19 22:14:56 crc kubenswrapper[4795]: I0219 22:14:56.920604 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwxjw\" (UniqueName: \"kubernetes.io/projected/2acef6a5-277e-40ec-bf10-e7da2131e214-kube-api-access-lwxjw\") pod \"certified-operators-98crc\" (UID: \"2acef6a5-277e-40ec-bf10-e7da2131e214\") " pod="openshift-marketplace/certified-operators-98crc" Feb 19 22:14:57 crc kubenswrapper[4795]: I0219 22:14:57.021907 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwxjw\" (UniqueName: \"kubernetes.io/projected/2acef6a5-277e-40ec-bf10-e7da2131e214-kube-api-access-lwxjw\") pod \"certified-operators-98crc\" (UID: \"2acef6a5-277e-40ec-bf10-e7da2131e214\") " pod="openshift-marketplace/certified-operators-98crc" Feb 19 22:14:57 crc kubenswrapper[4795]: I0219 22:14:57.021993 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2acef6a5-277e-40ec-bf10-e7da2131e214-utilities\") pod \"certified-operators-98crc\" (UID: \"2acef6a5-277e-40ec-bf10-e7da2131e214\") " pod="openshift-marketplace/certified-operators-98crc" Feb 19 22:14:57 crc kubenswrapper[4795]: I0219 22:14:57.022074 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2acef6a5-277e-40ec-bf10-e7da2131e214-catalog-content\") pod \"certified-operators-98crc\" (UID: \"2acef6a5-277e-40ec-bf10-e7da2131e214\") " pod="openshift-marketplace/certified-operators-98crc" Feb 19 22:14:57 crc kubenswrapper[4795]: I0219 22:14:57.022548 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2acef6a5-277e-40ec-bf10-e7da2131e214-utilities\") pod \"certified-operators-98crc\" (UID: \"2acef6a5-277e-40ec-bf10-e7da2131e214\") " pod="openshift-marketplace/certified-operators-98crc" Feb 19 22:14:57 crc kubenswrapper[4795]: I0219 22:14:57.022631 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2acef6a5-277e-40ec-bf10-e7da2131e214-catalog-content\") pod \"certified-operators-98crc\" (UID: \"2acef6a5-277e-40ec-bf10-e7da2131e214\") " pod="openshift-marketplace/certified-operators-98crc" Feb 19 22:14:57 crc kubenswrapper[4795]: I0219 22:14:57.049940 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwxjw\" (UniqueName: \"kubernetes.io/projected/2acef6a5-277e-40ec-bf10-e7da2131e214-kube-api-access-lwxjw\") pod \"certified-operators-98crc\" (UID: \"2acef6a5-277e-40ec-bf10-e7da2131e214\") " pod="openshift-marketplace/certified-operators-98crc" Feb 19 22:14:57 crc kubenswrapper[4795]: I0219 22:14:57.167647 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-98crc" Feb 19 22:14:57 crc kubenswrapper[4795]: I0219 22:14:57.639884 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-98crc"] Feb 19 22:14:58 crc kubenswrapper[4795]: I0219 22:14:58.639142 4795 generic.go:334] "Generic (PLEG): container finished" podID="2acef6a5-277e-40ec-bf10-e7da2131e214" containerID="83f92a64ce726c3bf68e2c8fbda9cdd63c9ac951c73af7b82ca4b45d0f0751a3" exitCode=0 Feb 19 22:14:58 crc kubenswrapper[4795]: I0219 22:14:58.639332 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-98crc" event={"ID":"2acef6a5-277e-40ec-bf10-e7da2131e214","Type":"ContainerDied","Data":"83f92a64ce726c3bf68e2c8fbda9cdd63c9ac951c73af7b82ca4b45d0f0751a3"} Feb 19 22:14:58 crc kubenswrapper[4795]: I0219 22:14:58.639636 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-98crc" event={"ID":"2acef6a5-277e-40ec-bf10-e7da2131e214","Type":"ContainerStarted","Data":"9bbf4153c93ff4e35104fe4418d5dffeea83e979c4b08141e95d740546057f98"} Feb 19 22:14:59 crc kubenswrapper[4795]: I0219 22:14:59.654955 4795 generic.go:334] "Generic (PLEG): container finished" podID="2acef6a5-277e-40ec-bf10-e7da2131e214" containerID="21e651cede2b6c92056c42aef8e3f5b46a38c10ce5b1b177fcedf006410c15d2" exitCode=0 Feb 19 22:14:59 crc kubenswrapper[4795]: I0219 22:14:59.655024 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-98crc" event={"ID":"2acef6a5-277e-40ec-bf10-e7da2131e214","Type":"ContainerDied","Data":"21e651cede2b6c92056c42aef8e3f5b46a38c10ce5b1b177fcedf006410c15d2"} Feb 19 22:15:00 crc kubenswrapper[4795]: I0219 22:15:00.142660 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525655-2s9rv"] Feb 19 22:15:00 crc kubenswrapper[4795]: I0219 22:15:00.143786 4795 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525655-2s9rv" Feb 19 22:15:00 crc kubenswrapper[4795]: I0219 22:15:00.146905 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 19 22:15:00 crc kubenswrapper[4795]: I0219 22:15:00.146999 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 19 22:15:00 crc kubenswrapper[4795]: I0219 22:15:00.153600 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525655-2s9rv"] Feb 19 22:15:00 crc kubenswrapper[4795]: I0219 22:15:00.272099 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-542fj\" (UniqueName: \"kubernetes.io/projected/1d0ffce6-6c23-4d04-a029-6322d065ff24-kube-api-access-542fj\") pod \"collect-profiles-29525655-2s9rv\" (UID: \"1d0ffce6-6c23-4d04-a029-6322d065ff24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525655-2s9rv" Feb 19 22:15:00 crc kubenswrapper[4795]: I0219 22:15:00.272250 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1d0ffce6-6c23-4d04-a029-6322d065ff24-config-volume\") pod \"collect-profiles-29525655-2s9rv\" (UID: \"1d0ffce6-6c23-4d04-a029-6322d065ff24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525655-2s9rv" Feb 19 22:15:00 crc kubenswrapper[4795]: I0219 22:15:00.272282 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1d0ffce6-6c23-4d04-a029-6322d065ff24-secret-volume\") pod \"collect-profiles-29525655-2s9rv\" (UID: \"1d0ffce6-6c23-4d04-a029-6322d065ff24\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29525655-2s9rv" Feb 19 22:15:00 crc kubenswrapper[4795]: I0219 22:15:00.373740 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1d0ffce6-6c23-4d04-a029-6322d065ff24-config-volume\") pod \"collect-profiles-29525655-2s9rv\" (UID: \"1d0ffce6-6c23-4d04-a029-6322d065ff24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525655-2s9rv" Feb 19 22:15:00 crc kubenswrapper[4795]: I0219 22:15:00.373791 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1d0ffce6-6c23-4d04-a029-6322d065ff24-secret-volume\") pod \"collect-profiles-29525655-2s9rv\" (UID: \"1d0ffce6-6c23-4d04-a029-6322d065ff24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525655-2s9rv" Feb 19 22:15:00 crc kubenswrapper[4795]: I0219 22:15:00.373833 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-542fj\" (UniqueName: \"kubernetes.io/projected/1d0ffce6-6c23-4d04-a029-6322d065ff24-kube-api-access-542fj\") pod \"collect-profiles-29525655-2s9rv\" (UID: \"1d0ffce6-6c23-4d04-a029-6322d065ff24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525655-2s9rv" Feb 19 22:15:00 crc kubenswrapper[4795]: I0219 22:15:00.374721 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1d0ffce6-6c23-4d04-a029-6322d065ff24-config-volume\") pod \"collect-profiles-29525655-2s9rv\" (UID: \"1d0ffce6-6c23-4d04-a029-6322d065ff24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525655-2s9rv" Feb 19 22:15:00 crc kubenswrapper[4795]: I0219 22:15:00.379255 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/1d0ffce6-6c23-4d04-a029-6322d065ff24-secret-volume\") pod \"collect-profiles-29525655-2s9rv\" (UID: \"1d0ffce6-6c23-4d04-a029-6322d065ff24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525655-2s9rv" Feb 19 22:15:00 crc kubenswrapper[4795]: I0219 22:15:00.389248 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-542fj\" (UniqueName: \"kubernetes.io/projected/1d0ffce6-6c23-4d04-a029-6322d065ff24-kube-api-access-542fj\") pod \"collect-profiles-29525655-2s9rv\" (UID: \"1d0ffce6-6c23-4d04-a029-6322d065ff24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525655-2s9rv" Feb 19 22:15:00 crc kubenswrapper[4795]: I0219 22:15:00.479949 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525655-2s9rv" Feb 19 22:15:00 crc kubenswrapper[4795]: I0219 22:15:00.686188 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525655-2s9rv"] Feb 19 22:15:00 crc kubenswrapper[4795]: I0219 22:15:00.689788 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-98crc" event={"ID":"2acef6a5-277e-40ec-bf10-e7da2131e214","Type":"ContainerStarted","Data":"a64e04ef76c8a398a8afdf80a0d2ff7ae5c32954b2cc8d93ea5da2de8db4410d"} Feb 19 22:15:00 crc kubenswrapper[4795]: I0219 22:15:00.718304 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-98crc" podStartSLOduration=3.310897461 podStartE2EDuration="4.718285429s" podCreationTimestamp="2026-02-19 22:14:56 +0000 UTC" firstStartedPulling="2026-02-19 22:14:58.641543773 +0000 UTC m=+2809.834061667" lastFinishedPulling="2026-02-19 22:15:00.048931731 +0000 UTC m=+2811.241449635" observedRunningTime="2026-02-19 22:15:00.714469969 +0000 UTC m=+2811.906987843" watchObservedRunningTime="2026-02-19 
22:15:00.718285429 +0000 UTC m=+2811.910803293" Feb 19 22:15:01 crc kubenswrapper[4795]: I0219 22:15:01.699143 4795 generic.go:334] "Generic (PLEG): container finished" podID="1d0ffce6-6c23-4d04-a029-6322d065ff24" containerID="e5dc57de5d860d1b9f4b0c7fa5487f6cf98d60033e436fbbba7b629cc254b689" exitCode=0 Feb 19 22:15:01 crc kubenswrapper[4795]: I0219 22:15:01.699197 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525655-2s9rv" event={"ID":"1d0ffce6-6c23-4d04-a029-6322d065ff24","Type":"ContainerDied","Data":"e5dc57de5d860d1b9f4b0c7fa5487f6cf98d60033e436fbbba7b629cc254b689"} Feb 19 22:15:01 crc kubenswrapper[4795]: I0219 22:15:01.699479 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525655-2s9rv" event={"ID":"1d0ffce6-6c23-4d04-a029-6322d065ff24","Type":"ContainerStarted","Data":"cd961236cfb338693aa8da100c9802e866f4fc6a2a3b71e1259855b1da0ee739"} Feb 19 22:15:02 crc kubenswrapper[4795]: I0219 22:15:02.947101 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525655-2s9rv" Feb 19 22:15:03 crc kubenswrapper[4795]: I0219 22:15:03.009942 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1d0ffce6-6c23-4d04-a029-6322d065ff24-secret-volume\") pod \"1d0ffce6-6c23-4d04-a029-6322d065ff24\" (UID: \"1d0ffce6-6c23-4d04-a029-6322d065ff24\") " Feb 19 22:15:03 crc kubenswrapper[4795]: I0219 22:15:03.010345 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1d0ffce6-6c23-4d04-a029-6322d065ff24-config-volume\") pod \"1d0ffce6-6c23-4d04-a029-6322d065ff24\" (UID: \"1d0ffce6-6c23-4d04-a029-6322d065ff24\") " Feb 19 22:15:03 crc kubenswrapper[4795]: I0219 22:15:03.010392 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-542fj\" (UniqueName: \"kubernetes.io/projected/1d0ffce6-6c23-4d04-a029-6322d065ff24-kube-api-access-542fj\") pod \"1d0ffce6-6c23-4d04-a029-6322d065ff24\" (UID: \"1d0ffce6-6c23-4d04-a029-6322d065ff24\") " Feb 19 22:15:03 crc kubenswrapper[4795]: I0219 22:15:03.011435 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d0ffce6-6c23-4d04-a029-6322d065ff24-config-volume" (OuterVolumeSpecName: "config-volume") pod "1d0ffce6-6c23-4d04-a029-6322d065ff24" (UID: "1d0ffce6-6c23-4d04-a029-6322d065ff24"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:15:03 crc kubenswrapper[4795]: I0219 22:15:03.015045 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d0ffce6-6c23-4d04-a029-6322d065ff24-kube-api-access-542fj" (OuterVolumeSpecName: "kube-api-access-542fj") pod "1d0ffce6-6c23-4d04-a029-6322d065ff24" (UID: "1d0ffce6-6c23-4d04-a029-6322d065ff24"). 
InnerVolumeSpecName "kube-api-access-542fj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:15:03 crc kubenswrapper[4795]: I0219 22:15:03.015102 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d0ffce6-6c23-4d04-a029-6322d065ff24-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1d0ffce6-6c23-4d04-a029-6322d065ff24" (UID: "1d0ffce6-6c23-4d04-a029-6322d065ff24"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:15:03 crc kubenswrapper[4795]: I0219 22:15:03.111865 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-542fj\" (UniqueName: \"kubernetes.io/projected/1d0ffce6-6c23-4d04-a029-6322d065ff24-kube-api-access-542fj\") on node \"crc\" DevicePath \"\"" Feb 19 22:15:03 crc kubenswrapper[4795]: I0219 22:15:03.111899 4795 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1d0ffce6-6c23-4d04-a029-6322d065ff24-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 19 22:15:03 crc kubenswrapper[4795]: I0219 22:15:03.111909 4795 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1d0ffce6-6c23-4d04-a029-6322d065ff24-config-volume\") on node \"crc\" DevicePath \"\"" Feb 19 22:15:03 crc kubenswrapper[4795]: I0219 22:15:03.712566 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525655-2s9rv" event={"ID":"1d0ffce6-6c23-4d04-a029-6322d065ff24","Type":"ContainerDied","Data":"cd961236cfb338693aa8da100c9802e866f4fc6a2a3b71e1259855b1da0ee739"} Feb 19 22:15:03 crc kubenswrapper[4795]: I0219 22:15:03.712606 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd961236cfb338693aa8da100c9802e866f4fc6a2a3b71e1259855b1da0ee739" Feb 19 22:15:03 crc kubenswrapper[4795]: I0219 22:15:03.712656 4795 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525655-2s9rv" Feb 19 22:15:04 crc kubenswrapper[4795]: I0219 22:15:04.030857 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525610-9dlfj"] Feb 19 22:15:04 crc kubenswrapper[4795]: I0219 22:15:04.036939 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525610-9dlfj"] Feb 19 22:15:05 crc kubenswrapper[4795]: I0219 22:15:05.528095 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a" path="/var/lib/kubelet/pods/05e7708b-ded6-4dcf-a0d2-1198cf2b0d4a/volumes" Feb 19 22:15:07 crc kubenswrapper[4795]: I0219 22:15:07.167890 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-98crc" Feb 19 22:15:07 crc kubenswrapper[4795]: I0219 22:15:07.168431 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-98crc" Feb 19 22:15:07 crc kubenswrapper[4795]: I0219 22:15:07.222886 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-98crc" Feb 19 22:15:07 crc kubenswrapper[4795]: I0219 22:15:07.784763 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-98crc" Feb 19 22:15:07 crc kubenswrapper[4795]: I0219 22:15:07.838714 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-98crc"] Feb 19 22:15:09 crc kubenswrapper[4795]: I0219 22:15:09.752216 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-98crc" podUID="2acef6a5-277e-40ec-bf10-e7da2131e214" containerName="registry-server" 
containerID="cri-o://a64e04ef76c8a398a8afdf80a0d2ff7ae5c32954b2cc8d93ea5da2de8db4410d" gracePeriod=2 Feb 19 22:15:10 crc kubenswrapper[4795]: I0219 22:15:10.662369 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-98crc" Feb 19 22:15:10 crc kubenswrapper[4795]: I0219 22:15:10.726936 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2acef6a5-277e-40ec-bf10-e7da2131e214-utilities\") pod \"2acef6a5-277e-40ec-bf10-e7da2131e214\" (UID: \"2acef6a5-277e-40ec-bf10-e7da2131e214\") " Feb 19 22:15:10 crc kubenswrapper[4795]: I0219 22:15:10.727261 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2acef6a5-277e-40ec-bf10-e7da2131e214-catalog-content\") pod \"2acef6a5-277e-40ec-bf10-e7da2131e214\" (UID: \"2acef6a5-277e-40ec-bf10-e7da2131e214\") " Feb 19 22:15:10 crc kubenswrapper[4795]: I0219 22:15:10.727421 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwxjw\" (UniqueName: \"kubernetes.io/projected/2acef6a5-277e-40ec-bf10-e7da2131e214-kube-api-access-lwxjw\") pod \"2acef6a5-277e-40ec-bf10-e7da2131e214\" (UID: \"2acef6a5-277e-40ec-bf10-e7da2131e214\") " Feb 19 22:15:10 crc kubenswrapper[4795]: I0219 22:15:10.728803 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2acef6a5-277e-40ec-bf10-e7da2131e214-utilities" (OuterVolumeSpecName: "utilities") pod "2acef6a5-277e-40ec-bf10-e7da2131e214" (UID: "2acef6a5-277e-40ec-bf10-e7da2131e214"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:15:10 crc kubenswrapper[4795]: I0219 22:15:10.732680 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2acef6a5-277e-40ec-bf10-e7da2131e214-kube-api-access-lwxjw" (OuterVolumeSpecName: "kube-api-access-lwxjw") pod "2acef6a5-277e-40ec-bf10-e7da2131e214" (UID: "2acef6a5-277e-40ec-bf10-e7da2131e214"). InnerVolumeSpecName "kube-api-access-lwxjw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:15:10 crc kubenswrapper[4795]: I0219 22:15:10.761116 4795 generic.go:334] "Generic (PLEG): container finished" podID="2acef6a5-277e-40ec-bf10-e7da2131e214" containerID="a64e04ef76c8a398a8afdf80a0d2ff7ae5c32954b2cc8d93ea5da2de8db4410d" exitCode=0 Feb 19 22:15:10 crc kubenswrapper[4795]: I0219 22:15:10.761155 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-98crc" event={"ID":"2acef6a5-277e-40ec-bf10-e7da2131e214","Type":"ContainerDied","Data":"a64e04ef76c8a398a8afdf80a0d2ff7ae5c32954b2cc8d93ea5da2de8db4410d"} Feb 19 22:15:10 crc kubenswrapper[4795]: I0219 22:15:10.761197 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-98crc" event={"ID":"2acef6a5-277e-40ec-bf10-e7da2131e214","Type":"ContainerDied","Data":"9bbf4153c93ff4e35104fe4418d5dffeea83e979c4b08141e95d740546057f98"} Feb 19 22:15:10 crc kubenswrapper[4795]: I0219 22:15:10.761194 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-98crc" Feb 19 22:15:10 crc kubenswrapper[4795]: I0219 22:15:10.761211 4795 scope.go:117] "RemoveContainer" containerID="a64e04ef76c8a398a8afdf80a0d2ff7ae5c32954b2cc8d93ea5da2de8db4410d" Feb 19 22:15:10 crc kubenswrapper[4795]: I0219 22:15:10.786319 4795 scope.go:117] "RemoveContainer" containerID="21e651cede2b6c92056c42aef8e3f5b46a38c10ce5b1b177fcedf006410c15d2" Feb 19 22:15:10 crc kubenswrapper[4795]: I0219 22:15:10.786515 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2acef6a5-277e-40ec-bf10-e7da2131e214-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2acef6a5-277e-40ec-bf10-e7da2131e214" (UID: "2acef6a5-277e-40ec-bf10-e7da2131e214"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:15:10 crc kubenswrapper[4795]: I0219 22:15:10.806429 4795 scope.go:117] "RemoveContainer" containerID="83f92a64ce726c3bf68e2c8fbda9cdd63c9ac951c73af7b82ca4b45d0f0751a3" Feb 19 22:15:10 crc kubenswrapper[4795]: I0219 22:15:10.829103 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lwxjw\" (UniqueName: \"kubernetes.io/projected/2acef6a5-277e-40ec-bf10-e7da2131e214-kube-api-access-lwxjw\") on node \"crc\" DevicePath \"\"" Feb 19 22:15:10 crc kubenswrapper[4795]: I0219 22:15:10.829135 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2acef6a5-277e-40ec-bf10-e7da2131e214-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 22:15:10 crc kubenswrapper[4795]: I0219 22:15:10.829144 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2acef6a5-277e-40ec-bf10-e7da2131e214-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 22:15:10 crc kubenswrapper[4795]: I0219 22:15:10.837471 4795 scope.go:117] "RemoveContainer" 
containerID="a64e04ef76c8a398a8afdf80a0d2ff7ae5c32954b2cc8d93ea5da2de8db4410d" Feb 19 22:15:10 crc kubenswrapper[4795]: E0219 22:15:10.837887 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a64e04ef76c8a398a8afdf80a0d2ff7ae5c32954b2cc8d93ea5da2de8db4410d\": container with ID starting with a64e04ef76c8a398a8afdf80a0d2ff7ae5c32954b2cc8d93ea5da2de8db4410d not found: ID does not exist" containerID="a64e04ef76c8a398a8afdf80a0d2ff7ae5c32954b2cc8d93ea5da2de8db4410d" Feb 19 22:15:10 crc kubenswrapper[4795]: I0219 22:15:10.837914 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a64e04ef76c8a398a8afdf80a0d2ff7ae5c32954b2cc8d93ea5da2de8db4410d"} err="failed to get container status \"a64e04ef76c8a398a8afdf80a0d2ff7ae5c32954b2cc8d93ea5da2de8db4410d\": rpc error: code = NotFound desc = could not find container \"a64e04ef76c8a398a8afdf80a0d2ff7ae5c32954b2cc8d93ea5da2de8db4410d\": container with ID starting with a64e04ef76c8a398a8afdf80a0d2ff7ae5c32954b2cc8d93ea5da2de8db4410d not found: ID does not exist" Feb 19 22:15:10 crc kubenswrapper[4795]: I0219 22:15:10.837935 4795 scope.go:117] "RemoveContainer" containerID="21e651cede2b6c92056c42aef8e3f5b46a38c10ce5b1b177fcedf006410c15d2" Feb 19 22:15:10 crc kubenswrapper[4795]: E0219 22:15:10.838397 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21e651cede2b6c92056c42aef8e3f5b46a38c10ce5b1b177fcedf006410c15d2\": container with ID starting with 21e651cede2b6c92056c42aef8e3f5b46a38c10ce5b1b177fcedf006410c15d2 not found: ID does not exist" containerID="21e651cede2b6c92056c42aef8e3f5b46a38c10ce5b1b177fcedf006410c15d2" Feb 19 22:15:10 crc kubenswrapper[4795]: I0219 22:15:10.838445 4795 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"21e651cede2b6c92056c42aef8e3f5b46a38c10ce5b1b177fcedf006410c15d2"} err="failed to get container status \"21e651cede2b6c92056c42aef8e3f5b46a38c10ce5b1b177fcedf006410c15d2\": rpc error: code = NotFound desc = could not find container \"21e651cede2b6c92056c42aef8e3f5b46a38c10ce5b1b177fcedf006410c15d2\": container with ID starting with 21e651cede2b6c92056c42aef8e3f5b46a38c10ce5b1b177fcedf006410c15d2 not found: ID does not exist" Feb 19 22:15:10 crc kubenswrapper[4795]: I0219 22:15:10.838474 4795 scope.go:117] "RemoveContainer" containerID="83f92a64ce726c3bf68e2c8fbda9cdd63c9ac951c73af7b82ca4b45d0f0751a3" Feb 19 22:15:10 crc kubenswrapper[4795]: E0219 22:15:10.838800 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83f92a64ce726c3bf68e2c8fbda9cdd63c9ac951c73af7b82ca4b45d0f0751a3\": container with ID starting with 83f92a64ce726c3bf68e2c8fbda9cdd63c9ac951c73af7b82ca4b45d0f0751a3 not found: ID does not exist" containerID="83f92a64ce726c3bf68e2c8fbda9cdd63c9ac951c73af7b82ca4b45d0f0751a3" Feb 19 22:15:10 crc kubenswrapper[4795]: I0219 22:15:10.838835 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83f92a64ce726c3bf68e2c8fbda9cdd63c9ac951c73af7b82ca4b45d0f0751a3"} err="failed to get container status \"83f92a64ce726c3bf68e2c8fbda9cdd63c9ac951c73af7b82ca4b45d0f0751a3\": rpc error: code = NotFound desc = could not find container \"83f92a64ce726c3bf68e2c8fbda9cdd63c9ac951c73af7b82ca4b45d0f0751a3\": container with ID starting with 83f92a64ce726c3bf68e2c8fbda9cdd63c9ac951c73af7b82ca4b45d0f0751a3 not found: ID does not exist" Feb 19 22:15:11 crc kubenswrapper[4795]: I0219 22:15:11.086607 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-98crc"] Feb 19 22:15:11 crc kubenswrapper[4795]: I0219 22:15:11.091334 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/certified-operators-98crc"] Feb 19 22:15:11 crc kubenswrapper[4795]: I0219 22:15:11.529826 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2acef6a5-277e-40ec-bf10-e7da2131e214" path="/var/lib/kubelet/pods/2acef6a5-277e-40ec-bf10-e7da2131e214/volumes" Feb 19 22:15:14 crc kubenswrapper[4795]: I0219 22:15:14.117112 4795 scope.go:117] "RemoveContainer" containerID="8f02da0738e9d178691c0389499a8480ffbd2f961710f2202fa44516bf81c4c6" Feb 19 22:15:28 crc kubenswrapper[4795]: I0219 22:15:28.427111 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 22:15:28 crc kubenswrapper[4795]: I0219 22:15:28.427705 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 22:15:58 crc kubenswrapper[4795]: I0219 22:15:58.428531 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 22:15:58 crc kubenswrapper[4795]: I0219 22:15:58.430349 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 22:16:28 crc 
kubenswrapper[4795]: I0219 22:16:28.427752 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 22:16:28 crc kubenswrapper[4795]: I0219 22:16:28.428665 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 22:16:28 crc kubenswrapper[4795]: I0219 22:16:28.428755 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" Feb 19 22:16:28 crc kubenswrapper[4795]: I0219 22:16:28.429900 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cb2da55360bfe87a3436787aedede72cb428a9ef06e2974cbb48cbc34918b2bc"} pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 22:16:28 crc kubenswrapper[4795]: I0219 22:16:28.430045 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" containerID="cri-o://cb2da55360bfe87a3436787aedede72cb428a9ef06e2974cbb48cbc34918b2bc" gracePeriod=600 Feb 19 22:16:28 crc kubenswrapper[4795]: E0219 22:16:28.560045 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:16:29 crc kubenswrapper[4795]: I0219 22:16:29.388829 4795 generic.go:334] "Generic (PLEG): container finished" podID="7591bc58-96f5-486a-8653-0ad93938b019" containerID="cb2da55360bfe87a3436787aedede72cb428a9ef06e2974cbb48cbc34918b2bc" exitCode=0 Feb 19 22:16:29 crc kubenswrapper[4795]: I0219 22:16:29.388889 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerDied","Data":"cb2da55360bfe87a3436787aedede72cb428a9ef06e2974cbb48cbc34918b2bc"} Feb 19 22:16:29 crc kubenswrapper[4795]: I0219 22:16:29.388942 4795 scope.go:117] "RemoveContainer" containerID="dae87592e9f8f7d3ce64c7cc5ffb3429ecd2b708a654e97c0a7e99d0113e7b2c" Feb 19 22:16:29 crc kubenswrapper[4795]: I0219 22:16:29.389490 4795 scope.go:117] "RemoveContainer" containerID="cb2da55360bfe87a3436787aedede72cb428a9ef06e2974cbb48cbc34918b2bc" Feb 19 22:16:29 crc kubenswrapper[4795]: E0219 22:16:29.389720 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:16:41 crc kubenswrapper[4795]: I0219 22:16:41.512802 4795 scope.go:117] "RemoveContainer" containerID="cb2da55360bfe87a3436787aedede72cb428a9ef06e2974cbb48cbc34918b2bc" Feb 19 22:16:41 crc kubenswrapper[4795]: E0219 22:16:41.513412 4795 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:16:53 crc kubenswrapper[4795]: I0219 22:16:53.513232 4795 scope.go:117] "RemoveContainer" containerID="cb2da55360bfe87a3436787aedede72cb428a9ef06e2974cbb48cbc34918b2bc" Feb 19 22:16:53 crc kubenswrapper[4795]: E0219 22:16:53.514406 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:17:04 crc kubenswrapper[4795]: I0219 22:17:04.512206 4795 scope.go:117] "RemoveContainer" containerID="cb2da55360bfe87a3436787aedede72cb428a9ef06e2974cbb48cbc34918b2bc" Feb 19 22:17:04 crc kubenswrapper[4795]: E0219 22:17:04.513530 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:17:17 crc kubenswrapper[4795]: I0219 22:17:17.511638 4795 scope.go:117] "RemoveContainer" containerID="cb2da55360bfe87a3436787aedede72cb428a9ef06e2974cbb48cbc34918b2bc" Feb 19 22:17:17 crc kubenswrapper[4795]: E0219 22:17:17.512404 4795 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:17:32 crc kubenswrapper[4795]: I0219 22:17:32.512120 4795 scope.go:117] "RemoveContainer" containerID="cb2da55360bfe87a3436787aedede72cb428a9ef06e2974cbb48cbc34918b2bc" Feb 19 22:17:32 crc kubenswrapper[4795]: E0219 22:17:32.513317 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:17:47 crc kubenswrapper[4795]: I0219 22:17:47.511730 4795 scope.go:117] "RemoveContainer" containerID="cb2da55360bfe87a3436787aedede72cb428a9ef06e2974cbb48cbc34918b2bc" Feb 19 22:17:47 crc kubenswrapper[4795]: E0219 22:17:47.512899 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:18:01 crc kubenswrapper[4795]: I0219 22:18:01.512346 4795 scope.go:117] "RemoveContainer" containerID="cb2da55360bfe87a3436787aedede72cb428a9ef06e2974cbb48cbc34918b2bc" Feb 19 22:18:01 crc kubenswrapper[4795]: E0219 22:18:01.513080 4795 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:18:13 crc kubenswrapper[4795]: I0219 22:18:13.514135 4795 scope.go:117] "RemoveContainer" containerID="cb2da55360bfe87a3436787aedede72cb428a9ef06e2974cbb48cbc34918b2bc" Feb 19 22:18:13 crc kubenswrapper[4795]: E0219 22:18:13.514906 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:18:28 crc kubenswrapper[4795]: I0219 22:18:28.512301 4795 scope.go:117] "RemoveContainer" containerID="cb2da55360bfe87a3436787aedede72cb428a9ef06e2974cbb48cbc34918b2bc" Feb 19 22:18:28 crc kubenswrapper[4795]: E0219 22:18:28.513411 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:18:43 crc kubenswrapper[4795]: I0219 22:18:43.511970 4795 scope.go:117] "RemoveContainer" containerID="cb2da55360bfe87a3436787aedede72cb428a9ef06e2974cbb48cbc34918b2bc" Feb 19 22:18:43 crc kubenswrapper[4795]: E0219 
22:18:43.512676 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:18:58 crc kubenswrapper[4795]: I0219 22:18:58.511647 4795 scope.go:117] "RemoveContainer" containerID="cb2da55360bfe87a3436787aedede72cb428a9ef06e2974cbb48cbc34918b2bc" Feb 19 22:18:58 crc kubenswrapper[4795]: E0219 22:18:58.512573 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:19:12 crc kubenswrapper[4795]: I0219 22:19:12.512010 4795 scope.go:117] "RemoveContainer" containerID="cb2da55360bfe87a3436787aedede72cb428a9ef06e2974cbb48cbc34918b2bc" Feb 19 22:19:12 crc kubenswrapper[4795]: E0219 22:19:12.512634 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:19:25 crc kubenswrapper[4795]: I0219 22:19:25.511756 4795 scope.go:117] "RemoveContainer" containerID="cb2da55360bfe87a3436787aedede72cb428a9ef06e2974cbb48cbc34918b2bc" Feb 19 22:19:25 crc 
kubenswrapper[4795]: E0219 22:19:25.512515 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:19:39 crc kubenswrapper[4795]: I0219 22:19:39.544572 4795 scope.go:117] "RemoveContainer" containerID="cb2da55360bfe87a3436787aedede72cb428a9ef06e2974cbb48cbc34918b2bc" Feb 19 22:19:39 crc kubenswrapper[4795]: E0219 22:19:39.545575 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:19:53 crc kubenswrapper[4795]: I0219 22:19:53.512599 4795 scope.go:117] "RemoveContainer" containerID="cb2da55360bfe87a3436787aedede72cb428a9ef06e2974cbb48cbc34918b2bc" Feb 19 22:19:53 crc kubenswrapper[4795]: E0219 22:19:53.513549 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:20:04 crc kubenswrapper[4795]: I0219 22:20:04.511945 4795 scope.go:117] "RemoveContainer" containerID="cb2da55360bfe87a3436787aedede72cb428a9ef06e2974cbb48cbc34918b2bc" Feb 
19 22:20:04 crc kubenswrapper[4795]: E0219 22:20:04.512473 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:20:06 crc kubenswrapper[4795]: I0219 22:20:06.424103 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mftpp"] Feb 19 22:20:06 crc kubenswrapper[4795]: E0219 22:20:06.424668 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d0ffce6-6c23-4d04-a029-6322d065ff24" containerName="collect-profiles" Feb 19 22:20:06 crc kubenswrapper[4795]: I0219 22:20:06.424684 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d0ffce6-6c23-4d04-a029-6322d065ff24" containerName="collect-profiles" Feb 19 22:20:06 crc kubenswrapper[4795]: E0219 22:20:06.424694 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2acef6a5-277e-40ec-bf10-e7da2131e214" containerName="extract-utilities" Feb 19 22:20:06 crc kubenswrapper[4795]: I0219 22:20:06.424699 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2acef6a5-277e-40ec-bf10-e7da2131e214" containerName="extract-utilities" Feb 19 22:20:06 crc kubenswrapper[4795]: E0219 22:20:06.424713 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2acef6a5-277e-40ec-bf10-e7da2131e214" containerName="extract-content" Feb 19 22:20:06 crc kubenswrapper[4795]: I0219 22:20:06.424720 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2acef6a5-277e-40ec-bf10-e7da2131e214" containerName="extract-content" Feb 19 22:20:06 crc kubenswrapper[4795]: E0219 22:20:06.424729 4795 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2acef6a5-277e-40ec-bf10-e7da2131e214" containerName="registry-server" Feb 19 22:20:06 crc kubenswrapper[4795]: I0219 22:20:06.424737 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2acef6a5-277e-40ec-bf10-e7da2131e214" containerName="registry-server" Feb 19 22:20:06 crc kubenswrapper[4795]: I0219 22:20:06.424898 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="2acef6a5-277e-40ec-bf10-e7da2131e214" containerName="registry-server" Feb 19 22:20:06 crc kubenswrapper[4795]: I0219 22:20:06.424913 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d0ffce6-6c23-4d04-a029-6322d065ff24" containerName="collect-profiles" Feb 19 22:20:06 crc kubenswrapper[4795]: I0219 22:20:06.425842 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mftpp" Feb 19 22:20:06 crc kubenswrapper[4795]: I0219 22:20:06.438358 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mftpp"] Feb 19 22:20:06 crc kubenswrapper[4795]: I0219 22:20:06.606035 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cda983e9-8c54-4e35-aa6a-a3ae6501c46e-utilities\") pod \"redhat-operators-mftpp\" (UID: \"cda983e9-8c54-4e35-aa6a-a3ae6501c46e\") " pod="openshift-marketplace/redhat-operators-mftpp" Feb 19 22:20:06 crc kubenswrapper[4795]: I0219 22:20:06.606439 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wk55d\" (UniqueName: \"kubernetes.io/projected/cda983e9-8c54-4e35-aa6a-a3ae6501c46e-kube-api-access-wk55d\") pod \"redhat-operators-mftpp\" (UID: \"cda983e9-8c54-4e35-aa6a-a3ae6501c46e\") " pod="openshift-marketplace/redhat-operators-mftpp" Feb 19 22:20:06 crc kubenswrapper[4795]: I0219 22:20:06.606563 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cda983e9-8c54-4e35-aa6a-a3ae6501c46e-catalog-content\") pod \"redhat-operators-mftpp\" (UID: \"cda983e9-8c54-4e35-aa6a-a3ae6501c46e\") " pod="openshift-marketplace/redhat-operators-mftpp" Feb 19 22:20:06 crc kubenswrapper[4795]: I0219 22:20:06.708044 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wk55d\" (UniqueName: \"kubernetes.io/projected/cda983e9-8c54-4e35-aa6a-a3ae6501c46e-kube-api-access-wk55d\") pod \"redhat-operators-mftpp\" (UID: \"cda983e9-8c54-4e35-aa6a-a3ae6501c46e\") " pod="openshift-marketplace/redhat-operators-mftpp" Feb 19 22:20:06 crc kubenswrapper[4795]: I0219 22:20:06.708392 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cda983e9-8c54-4e35-aa6a-a3ae6501c46e-catalog-content\") pod \"redhat-operators-mftpp\" (UID: \"cda983e9-8c54-4e35-aa6a-a3ae6501c46e\") " pod="openshift-marketplace/redhat-operators-mftpp" Feb 19 22:20:06 crc kubenswrapper[4795]: I0219 22:20:06.708526 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cda983e9-8c54-4e35-aa6a-a3ae6501c46e-utilities\") pod \"redhat-operators-mftpp\" (UID: \"cda983e9-8c54-4e35-aa6a-a3ae6501c46e\") " pod="openshift-marketplace/redhat-operators-mftpp" Feb 19 22:20:06 crc kubenswrapper[4795]: I0219 22:20:06.708824 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cda983e9-8c54-4e35-aa6a-a3ae6501c46e-catalog-content\") pod \"redhat-operators-mftpp\" (UID: \"cda983e9-8c54-4e35-aa6a-a3ae6501c46e\") " pod="openshift-marketplace/redhat-operators-mftpp" Feb 19 22:20:06 crc kubenswrapper[4795]: I0219 22:20:06.708932 4795 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cda983e9-8c54-4e35-aa6a-a3ae6501c46e-utilities\") pod \"redhat-operators-mftpp\" (UID: \"cda983e9-8c54-4e35-aa6a-a3ae6501c46e\") " pod="openshift-marketplace/redhat-operators-mftpp" Feb 19 22:20:06 crc kubenswrapper[4795]: I0219 22:20:06.728650 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wk55d\" (UniqueName: \"kubernetes.io/projected/cda983e9-8c54-4e35-aa6a-a3ae6501c46e-kube-api-access-wk55d\") pod \"redhat-operators-mftpp\" (UID: \"cda983e9-8c54-4e35-aa6a-a3ae6501c46e\") " pod="openshift-marketplace/redhat-operators-mftpp" Feb 19 22:20:06 crc kubenswrapper[4795]: I0219 22:20:06.746011 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mftpp" Feb 19 22:20:07 crc kubenswrapper[4795]: I0219 22:20:07.206505 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mftpp"] Feb 19 22:20:07 crc kubenswrapper[4795]: I0219 22:20:07.742686 4795 generic.go:334] "Generic (PLEG): container finished" podID="cda983e9-8c54-4e35-aa6a-a3ae6501c46e" containerID="30bf44d3b22bfa515599ddb948f8701928949b25404fed453e4217fa382b7d0d" exitCode=0 Feb 19 22:20:07 crc kubenswrapper[4795]: I0219 22:20:07.742991 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mftpp" event={"ID":"cda983e9-8c54-4e35-aa6a-a3ae6501c46e","Type":"ContainerDied","Data":"30bf44d3b22bfa515599ddb948f8701928949b25404fed453e4217fa382b7d0d"} Feb 19 22:20:07 crc kubenswrapper[4795]: I0219 22:20:07.743024 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mftpp" event={"ID":"cda983e9-8c54-4e35-aa6a-a3ae6501c46e","Type":"ContainerStarted","Data":"92f4e566bfbad9d24382f5e82cfab642304375b17d5b3ef1979e50a28f7f804a"} Feb 19 22:20:07 crc kubenswrapper[4795]: I0219 22:20:07.744196 4795 provider.go:102] Refreshing cache 
for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 22:20:08 crc kubenswrapper[4795]: I0219 22:20:08.757682 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mftpp" event={"ID":"cda983e9-8c54-4e35-aa6a-a3ae6501c46e","Type":"ContainerStarted","Data":"e66cef71d209ee780a2f0c8088d62b4d7d67e10f5089b829b0ff0443ea452b10"} Feb 19 22:20:09 crc kubenswrapper[4795]: I0219 22:20:09.765355 4795 generic.go:334] "Generic (PLEG): container finished" podID="cda983e9-8c54-4e35-aa6a-a3ae6501c46e" containerID="e66cef71d209ee780a2f0c8088d62b4d7d67e10f5089b829b0ff0443ea452b10" exitCode=0 Feb 19 22:20:09 crc kubenswrapper[4795]: I0219 22:20:09.765429 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mftpp" event={"ID":"cda983e9-8c54-4e35-aa6a-a3ae6501c46e","Type":"ContainerDied","Data":"e66cef71d209ee780a2f0c8088d62b4d7d67e10f5089b829b0ff0443ea452b10"} Feb 19 22:20:10 crc kubenswrapper[4795]: I0219 22:20:10.774130 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mftpp" event={"ID":"cda983e9-8c54-4e35-aa6a-a3ae6501c46e","Type":"ContainerStarted","Data":"57616f97cc3e6ed6bae93b783711e80303cffe521c28b47b5142d77329d68261"} Feb 19 22:20:10 crc kubenswrapper[4795]: I0219 22:20:10.793514 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mftpp" podStartSLOduration=2.399963529 podStartE2EDuration="4.793495513s" podCreationTimestamp="2026-02-19 22:20:06 +0000 UTC" firstStartedPulling="2026-02-19 22:20:07.743908219 +0000 UTC m=+3118.936426083" lastFinishedPulling="2026-02-19 22:20:10.137440213 +0000 UTC m=+3121.329958067" observedRunningTime="2026-02-19 22:20:10.791228986 +0000 UTC m=+3121.983746860" watchObservedRunningTime="2026-02-19 22:20:10.793495513 +0000 UTC m=+3121.986013377" Feb 19 22:20:17 crc kubenswrapper[4795]: I0219 22:20:17.110106 4795 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mftpp" Feb 19 22:20:17 crc kubenswrapper[4795]: I0219 22:20:17.112014 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mftpp" Feb 19 22:20:18 crc kubenswrapper[4795]: I0219 22:20:18.253914 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-mftpp" podUID="cda983e9-8c54-4e35-aa6a-a3ae6501c46e" containerName="registry-server" probeResult="failure" output=< Feb 19 22:20:18 crc kubenswrapper[4795]: timeout: failed to connect service ":50051" within 1s Feb 19 22:20:18 crc kubenswrapper[4795]: > Feb 19 22:20:18 crc kubenswrapper[4795]: I0219 22:20:18.511658 4795 scope.go:117] "RemoveContainer" containerID="cb2da55360bfe87a3436787aedede72cb428a9ef06e2974cbb48cbc34918b2bc" Feb 19 22:20:18 crc kubenswrapper[4795]: E0219 22:20:18.512155 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:20:26 crc kubenswrapper[4795]: I0219 22:20:26.781953 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mftpp" Feb 19 22:20:26 crc kubenswrapper[4795]: I0219 22:20:26.834522 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mftpp" Feb 19 22:20:27 crc kubenswrapper[4795]: I0219 22:20:27.017613 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mftpp"] Feb 19 22:20:28 crc kubenswrapper[4795]: I0219 22:20:28.234067 4795 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-mftpp" podUID="cda983e9-8c54-4e35-aa6a-a3ae6501c46e" containerName="registry-server" containerID="cri-o://57616f97cc3e6ed6bae93b783711e80303cffe521c28b47b5142d77329d68261" gracePeriod=2 Feb 19 22:20:28 crc kubenswrapper[4795]: I0219 22:20:28.599093 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mftpp" Feb 19 22:20:28 crc kubenswrapper[4795]: I0219 22:20:28.766369 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cda983e9-8c54-4e35-aa6a-a3ae6501c46e-utilities\") pod \"cda983e9-8c54-4e35-aa6a-a3ae6501c46e\" (UID: \"cda983e9-8c54-4e35-aa6a-a3ae6501c46e\") " Feb 19 22:20:28 crc kubenswrapper[4795]: I0219 22:20:28.766456 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cda983e9-8c54-4e35-aa6a-a3ae6501c46e-catalog-content\") pod \"cda983e9-8c54-4e35-aa6a-a3ae6501c46e\" (UID: \"cda983e9-8c54-4e35-aa6a-a3ae6501c46e\") " Feb 19 22:20:28 crc kubenswrapper[4795]: I0219 22:20:28.766491 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wk55d\" (UniqueName: \"kubernetes.io/projected/cda983e9-8c54-4e35-aa6a-a3ae6501c46e-kube-api-access-wk55d\") pod \"cda983e9-8c54-4e35-aa6a-a3ae6501c46e\" (UID: \"cda983e9-8c54-4e35-aa6a-a3ae6501c46e\") " Feb 19 22:20:28 crc kubenswrapper[4795]: I0219 22:20:28.767104 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cda983e9-8c54-4e35-aa6a-a3ae6501c46e-utilities" (OuterVolumeSpecName: "utilities") pod "cda983e9-8c54-4e35-aa6a-a3ae6501c46e" (UID: "cda983e9-8c54-4e35-aa6a-a3ae6501c46e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:20:28 crc kubenswrapper[4795]: I0219 22:20:28.771725 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cda983e9-8c54-4e35-aa6a-a3ae6501c46e-kube-api-access-wk55d" (OuterVolumeSpecName: "kube-api-access-wk55d") pod "cda983e9-8c54-4e35-aa6a-a3ae6501c46e" (UID: "cda983e9-8c54-4e35-aa6a-a3ae6501c46e"). InnerVolumeSpecName "kube-api-access-wk55d". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:20:28 crc kubenswrapper[4795]: I0219 22:20:28.868231 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cda983e9-8c54-4e35-aa6a-a3ae6501c46e-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 22:20:28 crc kubenswrapper[4795]: I0219 22:20:28.868270 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wk55d\" (UniqueName: \"kubernetes.io/projected/cda983e9-8c54-4e35-aa6a-a3ae6501c46e-kube-api-access-wk55d\") on node \"crc\" DevicePath \"\"" Feb 19 22:20:28 crc kubenswrapper[4795]: I0219 22:20:28.885544 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cda983e9-8c54-4e35-aa6a-a3ae6501c46e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cda983e9-8c54-4e35-aa6a-a3ae6501c46e" (UID: "cda983e9-8c54-4e35-aa6a-a3ae6501c46e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:20:28 crc kubenswrapper[4795]: I0219 22:20:28.969900 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cda983e9-8c54-4e35-aa6a-a3ae6501c46e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 22:20:29 crc kubenswrapper[4795]: I0219 22:20:29.245222 4795 generic.go:334] "Generic (PLEG): container finished" podID="cda983e9-8c54-4e35-aa6a-a3ae6501c46e" containerID="57616f97cc3e6ed6bae93b783711e80303cffe521c28b47b5142d77329d68261" exitCode=0 Feb 19 22:20:29 crc kubenswrapper[4795]: I0219 22:20:29.245288 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mftpp" Feb 19 22:20:29 crc kubenswrapper[4795]: I0219 22:20:29.245294 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mftpp" event={"ID":"cda983e9-8c54-4e35-aa6a-a3ae6501c46e","Type":"ContainerDied","Data":"57616f97cc3e6ed6bae93b783711e80303cffe521c28b47b5142d77329d68261"} Feb 19 22:20:29 crc kubenswrapper[4795]: I0219 22:20:29.245362 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mftpp" event={"ID":"cda983e9-8c54-4e35-aa6a-a3ae6501c46e","Type":"ContainerDied","Data":"92f4e566bfbad9d24382f5e82cfab642304375b17d5b3ef1979e50a28f7f804a"} Feb 19 22:20:29 crc kubenswrapper[4795]: I0219 22:20:29.245390 4795 scope.go:117] "RemoveContainer" containerID="57616f97cc3e6ed6bae93b783711e80303cffe521c28b47b5142d77329d68261" Feb 19 22:20:29 crc kubenswrapper[4795]: I0219 22:20:29.265700 4795 scope.go:117] "RemoveContainer" containerID="e66cef71d209ee780a2f0c8088d62b4d7d67e10f5089b829b0ff0443ea452b10" Feb 19 22:20:29 crc kubenswrapper[4795]: I0219 22:20:29.279359 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mftpp"] Feb 19 22:20:29 crc kubenswrapper[4795]: I0219 
22:20:29.292466 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-mftpp"] Feb 19 22:20:29 crc kubenswrapper[4795]: I0219 22:20:29.306433 4795 scope.go:117] "RemoveContainer" containerID="30bf44d3b22bfa515599ddb948f8701928949b25404fed453e4217fa382b7d0d" Feb 19 22:20:29 crc kubenswrapper[4795]: I0219 22:20:29.323698 4795 scope.go:117] "RemoveContainer" containerID="57616f97cc3e6ed6bae93b783711e80303cffe521c28b47b5142d77329d68261" Feb 19 22:20:29 crc kubenswrapper[4795]: E0219 22:20:29.325938 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57616f97cc3e6ed6bae93b783711e80303cffe521c28b47b5142d77329d68261\": container with ID starting with 57616f97cc3e6ed6bae93b783711e80303cffe521c28b47b5142d77329d68261 not found: ID does not exist" containerID="57616f97cc3e6ed6bae93b783711e80303cffe521c28b47b5142d77329d68261" Feb 19 22:20:29 crc kubenswrapper[4795]: I0219 22:20:29.325983 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57616f97cc3e6ed6bae93b783711e80303cffe521c28b47b5142d77329d68261"} err="failed to get container status \"57616f97cc3e6ed6bae93b783711e80303cffe521c28b47b5142d77329d68261\": rpc error: code = NotFound desc = could not find container \"57616f97cc3e6ed6bae93b783711e80303cffe521c28b47b5142d77329d68261\": container with ID starting with 57616f97cc3e6ed6bae93b783711e80303cffe521c28b47b5142d77329d68261 not found: ID does not exist" Feb 19 22:20:29 crc kubenswrapper[4795]: I0219 22:20:29.326020 4795 scope.go:117] "RemoveContainer" containerID="e66cef71d209ee780a2f0c8088d62b4d7d67e10f5089b829b0ff0443ea452b10" Feb 19 22:20:29 crc kubenswrapper[4795]: E0219 22:20:29.326489 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e66cef71d209ee780a2f0c8088d62b4d7d67e10f5089b829b0ff0443ea452b10\": container with ID 
starting with e66cef71d209ee780a2f0c8088d62b4d7d67e10f5089b829b0ff0443ea452b10 not found: ID does not exist" containerID="e66cef71d209ee780a2f0c8088d62b4d7d67e10f5089b829b0ff0443ea452b10" Feb 19 22:20:29 crc kubenswrapper[4795]: I0219 22:20:29.326543 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e66cef71d209ee780a2f0c8088d62b4d7d67e10f5089b829b0ff0443ea452b10"} err="failed to get container status \"e66cef71d209ee780a2f0c8088d62b4d7d67e10f5089b829b0ff0443ea452b10\": rpc error: code = NotFound desc = could not find container \"e66cef71d209ee780a2f0c8088d62b4d7d67e10f5089b829b0ff0443ea452b10\": container with ID starting with e66cef71d209ee780a2f0c8088d62b4d7d67e10f5089b829b0ff0443ea452b10 not found: ID does not exist" Feb 19 22:20:29 crc kubenswrapper[4795]: I0219 22:20:29.326580 4795 scope.go:117] "RemoveContainer" containerID="30bf44d3b22bfa515599ddb948f8701928949b25404fed453e4217fa382b7d0d" Feb 19 22:20:29 crc kubenswrapper[4795]: E0219 22:20:29.327011 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30bf44d3b22bfa515599ddb948f8701928949b25404fed453e4217fa382b7d0d\": container with ID starting with 30bf44d3b22bfa515599ddb948f8701928949b25404fed453e4217fa382b7d0d not found: ID does not exist" containerID="30bf44d3b22bfa515599ddb948f8701928949b25404fed453e4217fa382b7d0d" Feb 19 22:20:29 crc kubenswrapper[4795]: I0219 22:20:29.327052 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30bf44d3b22bfa515599ddb948f8701928949b25404fed453e4217fa382b7d0d"} err="failed to get container status \"30bf44d3b22bfa515599ddb948f8701928949b25404fed453e4217fa382b7d0d\": rpc error: code = NotFound desc = could not find container \"30bf44d3b22bfa515599ddb948f8701928949b25404fed453e4217fa382b7d0d\": container with ID starting with 30bf44d3b22bfa515599ddb948f8701928949b25404fed453e4217fa382b7d0d not found: 
ID does not exist" Feb 19 22:20:29 crc kubenswrapper[4795]: I0219 22:20:29.523785 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cda983e9-8c54-4e35-aa6a-a3ae6501c46e" path="/var/lib/kubelet/pods/cda983e9-8c54-4e35-aa6a-a3ae6501c46e/volumes" Feb 19 22:20:33 crc kubenswrapper[4795]: I0219 22:20:33.511871 4795 scope.go:117] "RemoveContainer" containerID="cb2da55360bfe87a3436787aedede72cb428a9ef06e2974cbb48cbc34918b2bc" Feb 19 22:20:33 crc kubenswrapper[4795]: E0219 22:20:33.512557 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:20:45 crc kubenswrapper[4795]: I0219 22:20:45.512536 4795 scope.go:117] "RemoveContainer" containerID="cb2da55360bfe87a3436787aedede72cb428a9ef06e2974cbb48cbc34918b2bc" Feb 19 22:20:45 crc kubenswrapper[4795]: E0219 22:20:45.514994 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:20:47 crc kubenswrapper[4795]: I0219 22:20:47.866981 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-j6m5s"] Feb 19 22:20:47 crc kubenswrapper[4795]: E0219 22:20:47.867575 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cda983e9-8c54-4e35-aa6a-a3ae6501c46e" containerName="registry-server" Feb 19 
22:20:47 crc kubenswrapper[4795]: I0219 22:20:47.867590 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="cda983e9-8c54-4e35-aa6a-a3ae6501c46e" containerName="registry-server" Feb 19 22:20:47 crc kubenswrapper[4795]: E0219 22:20:47.867610 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cda983e9-8c54-4e35-aa6a-a3ae6501c46e" containerName="extract-content" Feb 19 22:20:47 crc kubenswrapper[4795]: I0219 22:20:47.867619 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="cda983e9-8c54-4e35-aa6a-a3ae6501c46e" containerName="extract-content" Feb 19 22:20:47 crc kubenswrapper[4795]: E0219 22:20:47.867641 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cda983e9-8c54-4e35-aa6a-a3ae6501c46e" containerName="extract-utilities" Feb 19 22:20:47 crc kubenswrapper[4795]: I0219 22:20:47.867650 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="cda983e9-8c54-4e35-aa6a-a3ae6501c46e" containerName="extract-utilities" Feb 19 22:20:47 crc kubenswrapper[4795]: I0219 22:20:47.867824 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="cda983e9-8c54-4e35-aa6a-a3ae6501c46e" containerName="registry-server" Feb 19 22:20:47 crc kubenswrapper[4795]: I0219 22:20:47.869028 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j6m5s" Feb 19 22:20:47 crc kubenswrapper[4795]: I0219 22:20:47.882844 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j6m5s"] Feb 19 22:20:48 crc kubenswrapper[4795]: I0219 22:20:48.004320 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/583aaf4a-8b98-4385-b16a-009ddc9d03c1-catalog-content\") pod \"redhat-marketplace-j6m5s\" (UID: \"583aaf4a-8b98-4385-b16a-009ddc9d03c1\") " pod="openshift-marketplace/redhat-marketplace-j6m5s" Feb 19 22:20:48 crc kubenswrapper[4795]: I0219 22:20:48.004476 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/583aaf4a-8b98-4385-b16a-009ddc9d03c1-utilities\") pod \"redhat-marketplace-j6m5s\" (UID: \"583aaf4a-8b98-4385-b16a-009ddc9d03c1\") " pod="openshift-marketplace/redhat-marketplace-j6m5s" Feb 19 22:20:48 crc kubenswrapper[4795]: I0219 22:20:48.004586 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgmvs\" (UniqueName: \"kubernetes.io/projected/583aaf4a-8b98-4385-b16a-009ddc9d03c1-kube-api-access-dgmvs\") pod \"redhat-marketplace-j6m5s\" (UID: \"583aaf4a-8b98-4385-b16a-009ddc9d03c1\") " pod="openshift-marketplace/redhat-marketplace-j6m5s" Feb 19 22:20:48 crc kubenswrapper[4795]: I0219 22:20:48.105971 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgmvs\" (UniqueName: \"kubernetes.io/projected/583aaf4a-8b98-4385-b16a-009ddc9d03c1-kube-api-access-dgmvs\") pod \"redhat-marketplace-j6m5s\" (UID: \"583aaf4a-8b98-4385-b16a-009ddc9d03c1\") " pod="openshift-marketplace/redhat-marketplace-j6m5s" Feb 19 22:20:48 crc kubenswrapper[4795]: I0219 22:20:48.106073 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/583aaf4a-8b98-4385-b16a-009ddc9d03c1-catalog-content\") pod \"redhat-marketplace-j6m5s\" (UID: \"583aaf4a-8b98-4385-b16a-009ddc9d03c1\") " pod="openshift-marketplace/redhat-marketplace-j6m5s" Feb 19 22:20:48 crc kubenswrapper[4795]: I0219 22:20:48.106110 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/583aaf4a-8b98-4385-b16a-009ddc9d03c1-utilities\") pod \"redhat-marketplace-j6m5s\" (UID: \"583aaf4a-8b98-4385-b16a-009ddc9d03c1\") " pod="openshift-marketplace/redhat-marketplace-j6m5s" Feb 19 22:20:48 crc kubenswrapper[4795]: I0219 22:20:48.106884 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/583aaf4a-8b98-4385-b16a-009ddc9d03c1-utilities\") pod \"redhat-marketplace-j6m5s\" (UID: \"583aaf4a-8b98-4385-b16a-009ddc9d03c1\") " pod="openshift-marketplace/redhat-marketplace-j6m5s" Feb 19 22:20:48 crc kubenswrapper[4795]: I0219 22:20:48.107038 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/583aaf4a-8b98-4385-b16a-009ddc9d03c1-catalog-content\") pod \"redhat-marketplace-j6m5s\" (UID: \"583aaf4a-8b98-4385-b16a-009ddc9d03c1\") " pod="openshift-marketplace/redhat-marketplace-j6m5s" Feb 19 22:20:48 crc kubenswrapper[4795]: I0219 22:20:48.128784 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgmvs\" (UniqueName: \"kubernetes.io/projected/583aaf4a-8b98-4385-b16a-009ddc9d03c1-kube-api-access-dgmvs\") pod \"redhat-marketplace-j6m5s\" (UID: \"583aaf4a-8b98-4385-b16a-009ddc9d03c1\") " pod="openshift-marketplace/redhat-marketplace-j6m5s" Feb 19 22:20:48 crc kubenswrapper[4795]: I0219 22:20:48.190206 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j6m5s" Feb 19 22:20:48 crc kubenswrapper[4795]: I0219 22:20:48.452846 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j6m5s"] Feb 19 22:20:48 crc kubenswrapper[4795]: I0219 22:20:48.925720 4795 generic.go:334] "Generic (PLEG): container finished" podID="583aaf4a-8b98-4385-b16a-009ddc9d03c1" containerID="5c7275e62f7372a36164f21f6ec2134e29ed2f9c38f129655aa63e1205e38c33" exitCode=0 Feb 19 22:20:48 crc kubenswrapper[4795]: I0219 22:20:48.925783 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j6m5s" event={"ID":"583aaf4a-8b98-4385-b16a-009ddc9d03c1","Type":"ContainerDied","Data":"5c7275e62f7372a36164f21f6ec2134e29ed2f9c38f129655aa63e1205e38c33"} Feb 19 22:20:48 crc kubenswrapper[4795]: I0219 22:20:48.926072 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j6m5s" event={"ID":"583aaf4a-8b98-4385-b16a-009ddc9d03c1","Type":"ContainerStarted","Data":"b756a66576e7cccaecd610df95a2f875171020cbdb675495d633bcadca6e3e32"} Feb 19 22:20:49 crc kubenswrapper[4795]: I0219 22:20:49.937040 4795 generic.go:334] "Generic (PLEG): container finished" podID="583aaf4a-8b98-4385-b16a-009ddc9d03c1" containerID="242c1278d3b75e924f451cdde3eb750c98b500bf199d980f136750a047a5320b" exitCode=0 Feb 19 22:20:49 crc kubenswrapper[4795]: I0219 22:20:49.937177 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j6m5s" event={"ID":"583aaf4a-8b98-4385-b16a-009ddc9d03c1","Type":"ContainerDied","Data":"242c1278d3b75e924f451cdde3eb750c98b500bf199d980f136750a047a5320b"} Feb 19 22:20:50 crc kubenswrapper[4795]: I0219 22:20:50.948222 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j6m5s" 
event={"ID":"583aaf4a-8b98-4385-b16a-009ddc9d03c1","Type":"ContainerStarted","Data":"b0e173b2579c3f35c12a29aff93a1ebf8d4ad9804faefc64bf3d9f48f42e7bf9"} Feb 19 22:20:50 crc kubenswrapper[4795]: I0219 22:20:50.969940 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-j6m5s" podStartSLOduration=2.535970871 podStartE2EDuration="3.969885321s" podCreationTimestamp="2026-02-19 22:20:47 +0000 UTC" firstStartedPulling="2026-02-19 22:20:48.927960411 +0000 UTC m=+3160.120478275" lastFinishedPulling="2026-02-19 22:20:50.361874861 +0000 UTC m=+3161.554392725" observedRunningTime="2026-02-19 22:20:50.966241897 +0000 UTC m=+3162.158759801" watchObservedRunningTime="2026-02-19 22:20:50.969885321 +0000 UTC m=+3162.162403205" Feb 19 22:20:57 crc kubenswrapper[4795]: I0219 22:20:57.512556 4795 scope.go:117] "RemoveContainer" containerID="cb2da55360bfe87a3436787aedede72cb428a9ef06e2974cbb48cbc34918b2bc" Feb 19 22:20:57 crc kubenswrapper[4795]: E0219 22:20:57.513484 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:20:58 crc kubenswrapper[4795]: I0219 22:20:58.191358 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-j6m5s" Feb 19 22:20:58 crc kubenswrapper[4795]: I0219 22:20:58.191424 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-j6m5s" Feb 19 22:20:58 crc kubenswrapper[4795]: I0219 22:20:58.256620 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-j6m5s" Feb 19 22:20:59 crc kubenswrapper[4795]: I0219 22:20:59.089814 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-j6m5s" Feb 19 22:20:59 crc kubenswrapper[4795]: I0219 22:20:59.155501 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j6m5s"] Feb 19 22:21:01 crc kubenswrapper[4795]: I0219 22:21:01.030465 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-j6m5s" podUID="583aaf4a-8b98-4385-b16a-009ddc9d03c1" containerName="registry-server" containerID="cri-o://b0e173b2579c3f35c12a29aff93a1ebf8d4ad9804faefc64bf3d9f48f42e7bf9" gracePeriod=2 Feb 19 22:21:01 crc kubenswrapper[4795]: I0219 22:21:01.477995 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j6m5s" Feb 19 22:21:01 crc kubenswrapper[4795]: I0219 22:21:01.596297 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/583aaf4a-8b98-4385-b16a-009ddc9d03c1-utilities\") pod \"583aaf4a-8b98-4385-b16a-009ddc9d03c1\" (UID: \"583aaf4a-8b98-4385-b16a-009ddc9d03c1\") " Feb 19 22:21:01 crc kubenswrapper[4795]: I0219 22:21:01.596434 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dgmvs\" (UniqueName: \"kubernetes.io/projected/583aaf4a-8b98-4385-b16a-009ddc9d03c1-kube-api-access-dgmvs\") pod \"583aaf4a-8b98-4385-b16a-009ddc9d03c1\" (UID: \"583aaf4a-8b98-4385-b16a-009ddc9d03c1\") " Feb 19 22:21:01 crc kubenswrapper[4795]: I0219 22:21:01.596524 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/583aaf4a-8b98-4385-b16a-009ddc9d03c1-catalog-content\") pod 
\"583aaf4a-8b98-4385-b16a-009ddc9d03c1\" (UID: \"583aaf4a-8b98-4385-b16a-009ddc9d03c1\") " Feb 19 22:21:01 crc kubenswrapper[4795]: I0219 22:21:01.597606 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/583aaf4a-8b98-4385-b16a-009ddc9d03c1-utilities" (OuterVolumeSpecName: "utilities") pod "583aaf4a-8b98-4385-b16a-009ddc9d03c1" (UID: "583aaf4a-8b98-4385-b16a-009ddc9d03c1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:21:01 crc kubenswrapper[4795]: I0219 22:21:01.611664 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/583aaf4a-8b98-4385-b16a-009ddc9d03c1-kube-api-access-dgmvs" (OuterVolumeSpecName: "kube-api-access-dgmvs") pod "583aaf4a-8b98-4385-b16a-009ddc9d03c1" (UID: "583aaf4a-8b98-4385-b16a-009ddc9d03c1"). InnerVolumeSpecName "kube-api-access-dgmvs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:21:01 crc kubenswrapper[4795]: I0219 22:21:01.621039 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/583aaf4a-8b98-4385-b16a-009ddc9d03c1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "583aaf4a-8b98-4385-b16a-009ddc9d03c1" (UID: "583aaf4a-8b98-4385-b16a-009ddc9d03c1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:21:01 crc kubenswrapper[4795]: I0219 22:21:01.697956 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dgmvs\" (UniqueName: \"kubernetes.io/projected/583aaf4a-8b98-4385-b16a-009ddc9d03c1-kube-api-access-dgmvs\") on node \"crc\" DevicePath \"\"" Feb 19 22:21:01 crc kubenswrapper[4795]: I0219 22:21:01.698228 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/583aaf4a-8b98-4385-b16a-009ddc9d03c1-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 22:21:01 crc kubenswrapper[4795]: I0219 22:21:01.698295 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/583aaf4a-8b98-4385-b16a-009ddc9d03c1-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 22:21:02 crc kubenswrapper[4795]: I0219 22:21:02.043490 4795 generic.go:334] "Generic (PLEG): container finished" podID="583aaf4a-8b98-4385-b16a-009ddc9d03c1" containerID="b0e173b2579c3f35c12a29aff93a1ebf8d4ad9804faefc64bf3d9f48f42e7bf9" exitCode=0 Feb 19 22:21:02 crc kubenswrapper[4795]: I0219 22:21:02.043533 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j6m5s" event={"ID":"583aaf4a-8b98-4385-b16a-009ddc9d03c1","Type":"ContainerDied","Data":"b0e173b2579c3f35c12a29aff93a1ebf8d4ad9804faefc64bf3d9f48f42e7bf9"} Feb 19 22:21:02 crc kubenswrapper[4795]: I0219 22:21:02.043560 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j6m5s" event={"ID":"583aaf4a-8b98-4385-b16a-009ddc9d03c1","Type":"ContainerDied","Data":"b756a66576e7cccaecd610df95a2f875171020cbdb675495d633bcadca6e3e32"} Feb 19 22:21:02 crc kubenswrapper[4795]: I0219 22:21:02.043584 4795 scope.go:117] "RemoveContainer" containerID="b0e173b2579c3f35c12a29aff93a1ebf8d4ad9804faefc64bf3d9f48f42e7bf9" Feb 19 22:21:02 crc kubenswrapper[4795]: I0219 
22:21:02.043655 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j6m5s" Feb 19 22:21:02 crc kubenswrapper[4795]: I0219 22:21:02.071291 4795 scope.go:117] "RemoveContainer" containerID="242c1278d3b75e924f451cdde3eb750c98b500bf199d980f136750a047a5320b" Feb 19 22:21:02 crc kubenswrapper[4795]: I0219 22:21:02.085339 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j6m5s"] Feb 19 22:21:02 crc kubenswrapper[4795]: I0219 22:21:02.090004 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-j6m5s"] Feb 19 22:21:02 crc kubenswrapper[4795]: I0219 22:21:02.107571 4795 scope.go:117] "RemoveContainer" containerID="5c7275e62f7372a36164f21f6ec2134e29ed2f9c38f129655aa63e1205e38c33" Feb 19 22:21:02 crc kubenswrapper[4795]: I0219 22:21:02.147086 4795 scope.go:117] "RemoveContainer" containerID="b0e173b2579c3f35c12a29aff93a1ebf8d4ad9804faefc64bf3d9f48f42e7bf9" Feb 19 22:21:02 crc kubenswrapper[4795]: E0219 22:21:02.147674 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0e173b2579c3f35c12a29aff93a1ebf8d4ad9804faefc64bf3d9f48f42e7bf9\": container with ID starting with b0e173b2579c3f35c12a29aff93a1ebf8d4ad9804faefc64bf3d9f48f42e7bf9 not found: ID does not exist" containerID="b0e173b2579c3f35c12a29aff93a1ebf8d4ad9804faefc64bf3d9f48f42e7bf9" Feb 19 22:21:02 crc kubenswrapper[4795]: I0219 22:21:02.147732 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0e173b2579c3f35c12a29aff93a1ebf8d4ad9804faefc64bf3d9f48f42e7bf9"} err="failed to get container status \"b0e173b2579c3f35c12a29aff93a1ebf8d4ad9804faefc64bf3d9f48f42e7bf9\": rpc error: code = NotFound desc = could not find container \"b0e173b2579c3f35c12a29aff93a1ebf8d4ad9804faefc64bf3d9f48f42e7bf9\": container with ID starting with 
b0e173b2579c3f35c12a29aff93a1ebf8d4ad9804faefc64bf3d9f48f42e7bf9 not found: ID does not exist" Feb 19 22:21:02 crc kubenswrapper[4795]: I0219 22:21:02.147757 4795 scope.go:117] "RemoveContainer" containerID="242c1278d3b75e924f451cdde3eb750c98b500bf199d980f136750a047a5320b" Feb 19 22:21:02 crc kubenswrapper[4795]: E0219 22:21:02.148149 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"242c1278d3b75e924f451cdde3eb750c98b500bf199d980f136750a047a5320b\": container with ID starting with 242c1278d3b75e924f451cdde3eb750c98b500bf199d980f136750a047a5320b not found: ID does not exist" containerID="242c1278d3b75e924f451cdde3eb750c98b500bf199d980f136750a047a5320b" Feb 19 22:21:02 crc kubenswrapper[4795]: I0219 22:21:02.148262 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"242c1278d3b75e924f451cdde3eb750c98b500bf199d980f136750a047a5320b"} err="failed to get container status \"242c1278d3b75e924f451cdde3eb750c98b500bf199d980f136750a047a5320b\": rpc error: code = NotFound desc = could not find container \"242c1278d3b75e924f451cdde3eb750c98b500bf199d980f136750a047a5320b\": container with ID starting with 242c1278d3b75e924f451cdde3eb750c98b500bf199d980f136750a047a5320b not found: ID does not exist" Feb 19 22:21:02 crc kubenswrapper[4795]: I0219 22:21:02.148340 4795 scope.go:117] "RemoveContainer" containerID="5c7275e62f7372a36164f21f6ec2134e29ed2f9c38f129655aa63e1205e38c33" Feb 19 22:21:02 crc kubenswrapper[4795]: E0219 22:21:02.148754 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c7275e62f7372a36164f21f6ec2134e29ed2f9c38f129655aa63e1205e38c33\": container with ID starting with 5c7275e62f7372a36164f21f6ec2134e29ed2f9c38f129655aa63e1205e38c33 not found: ID does not exist" containerID="5c7275e62f7372a36164f21f6ec2134e29ed2f9c38f129655aa63e1205e38c33" Feb 19 22:21:02 crc 
kubenswrapper[4795]: I0219 22:21:02.148785 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c7275e62f7372a36164f21f6ec2134e29ed2f9c38f129655aa63e1205e38c33"} err="failed to get container status \"5c7275e62f7372a36164f21f6ec2134e29ed2f9c38f129655aa63e1205e38c33\": rpc error: code = NotFound desc = could not find container \"5c7275e62f7372a36164f21f6ec2134e29ed2f9c38f129655aa63e1205e38c33\": container with ID starting with 5c7275e62f7372a36164f21f6ec2134e29ed2f9c38f129655aa63e1205e38c33 not found: ID does not exist" Feb 19 22:21:03 crc kubenswrapper[4795]: I0219 22:21:03.530393 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="583aaf4a-8b98-4385-b16a-009ddc9d03c1" path="/var/lib/kubelet/pods/583aaf4a-8b98-4385-b16a-009ddc9d03c1/volumes" Feb 19 22:21:08 crc kubenswrapper[4795]: I0219 22:21:08.511604 4795 scope.go:117] "RemoveContainer" containerID="cb2da55360bfe87a3436787aedede72cb428a9ef06e2974cbb48cbc34918b2bc" Feb 19 22:21:08 crc kubenswrapper[4795]: E0219 22:21:08.512030 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:21:23 crc kubenswrapper[4795]: I0219 22:21:23.511842 4795 scope.go:117] "RemoveContainer" containerID="cb2da55360bfe87a3436787aedede72cb428a9ef06e2974cbb48cbc34918b2bc" Feb 19 22:21:23 crc kubenswrapper[4795]: E0219 22:21:23.512828 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:21:37 crc kubenswrapper[4795]: I0219 22:21:37.512119 4795 scope.go:117] "RemoveContainer" containerID="cb2da55360bfe87a3436787aedede72cb428a9ef06e2974cbb48cbc34918b2bc" Feb 19 22:21:38 crc kubenswrapper[4795]: I0219 22:21:38.396397 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerStarted","Data":"0edfa681896b7b4c4053886f43d282fa1e3dc3cde5816d95a6de237745fabdcb"} Feb 19 22:23:58 crc kubenswrapper[4795]: I0219 22:23:58.427962 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 22:23:58 crc kubenswrapper[4795]: I0219 22:23:58.428769 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 22:24:28 crc kubenswrapper[4795]: I0219 22:24:28.427748 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 22:24:28 crc kubenswrapper[4795]: I0219 22:24:28.428559 4795 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 22:24:58 crc kubenswrapper[4795]: I0219 22:24:58.428048 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 22:24:58 crc kubenswrapper[4795]: I0219 22:24:58.428608 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 22:24:58 crc kubenswrapper[4795]: I0219 22:24:58.428652 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" Feb 19 22:24:58 crc kubenswrapper[4795]: I0219 22:24:58.429208 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0edfa681896b7b4c4053886f43d282fa1e3dc3cde5816d95a6de237745fabdcb"} pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 22:24:58 crc kubenswrapper[4795]: I0219 22:24:58.429283 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" 
containerID="cri-o://0edfa681896b7b4c4053886f43d282fa1e3dc3cde5816d95a6de237745fabdcb" gracePeriod=600 Feb 19 22:24:58 crc kubenswrapper[4795]: I0219 22:24:58.936391 4795 generic.go:334] "Generic (PLEG): container finished" podID="7591bc58-96f5-486a-8653-0ad93938b019" containerID="0edfa681896b7b4c4053886f43d282fa1e3dc3cde5816d95a6de237745fabdcb" exitCode=0 Feb 19 22:24:58 crc kubenswrapper[4795]: I0219 22:24:58.936447 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerDied","Data":"0edfa681896b7b4c4053886f43d282fa1e3dc3cde5816d95a6de237745fabdcb"} Feb 19 22:24:58 crc kubenswrapper[4795]: I0219 22:24:58.936618 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerStarted","Data":"9c63131d970040b18f3ed947da107f77f33be5db6a5f5930540c14c137bd2e9f"} Feb 19 22:24:58 crc kubenswrapper[4795]: I0219 22:24:58.936640 4795 scope.go:117] "RemoveContainer" containerID="cb2da55360bfe87a3436787aedede72cb428a9ef06e2974cbb48cbc34918b2bc" Feb 19 22:25:15 crc kubenswrapper[4795]: I0219 22:25:15.012332 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7zgnn"] Feb 19 22:25:15 crc kubenswrapper[4795]: E0219 22:25:15.013343 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="583aaf4a-8b98-4385-b16a-009ddc9d03c1" containerName="extract-utilities" Feb 19 22:25:15 crc kubenswrapper[4795]: I0219 22:25:15.013360 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="583aaf4a-8b98-4385-b16a-009ddc9d03c1" containerName="extract-utilities" Feb 19 22:25:15 crc kubenswrapper[4795]: E0219 22:25:15.013379 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="583aaf4a-8b98-4385-b16a-009ddc9d03c1" containerName="registry-server" Feb 19 
22:25:15 crc kubenswrapper[4795]: I0219 22:25:15.013386 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="583aaf4a-8b98-4385-b16a-009ddc9d03c1" containerName="registry-server" Feb 19 22:25:15 crc kubenswrapper[4795]: E0219 22:25:15.013417 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="583aaf4a-8b98-4385-b16a-009ddc9d03c1" containerName="extract-content" Feb 19 22:25:15 crc kubenswrapper[4795]: I0219 22:25:15.013424 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="583aaf4a-8b98-4385-b16a-009ddc9d03c1" containerName="extract-content" Feb 19 22:25:15 crc kubenswrapper[4795]: I0219 22:25:15.013591 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="583aaf4a-8b98-4385-b16a-009ddc9d03c1" containerName="registry-server" Feb 19 22:25:15 crc kubenswrapper[4795]: I0219 22:25:15.014718 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7zgnn" Feb 19 22:25:15 crc kubenswrapper[4795]: I0219 22:25:15.024833 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7zgnn"] Feb 19 22:25:15 crc kubenswrapper[4795]: I0219 22:25:15.072308 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/369db8af-2908-4268-9589-a73559afc23d-utilities\") pod \"certified-operators-7zgnn\" (UID: \"369db8af-2908-4268-9589-a73559afc23d\") " pod="openshift-marketplace/certified-operators-7zgnn" Feb 19 22:25:15 crc kubenswrapper[4795]: I0219 22:25:15.072391 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7jzh\" (UniqueName: \"kubernetes.io/projected/369db8af-2908-4268-9589-a73559afc23d-kube-api-access-z7jzh\") pod \"certified-operators-7zgnn\" (UID: \"369db8af-2908-4268-9589-a73559afc23d\") " pod="openshift-marketplace/certified-operators-7zgnn" Feb 19 22:25:15 
crc kubenswrapper[4795]: I0219 22:25:15.072429 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/369db8af-2908-4268-9589-a73559afc23d-catalog-content\") pod \"certified-operators-7zgnn\" (UID: \"369db8af-2908-4268-9589-a73559afc23d\") " pod="openshift-marketplace/certified-operators-7zgnn" Feb 19 22:25:15 crc kubenswrapper[4795]: I0219 22:25:15.173916 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7jzh\" (UniqueName: \"kubernetes.io/projected/369db8af-2908-4268-9589-a73559afc23d-kube-api-access-z7jzh\") pod \"certified-operators-7zgnn\" (UID: \"369db8af-2908-4268-9589-a73559afc23d\") " pod="openshift-marketplace/certified-operators-7zgnn" Feb 19 22:25:15 crc kubenswrapper[4795]: I0219 22:25:15.173985 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/369db8af-2908-4268-9589-a73559afc23d-catalog-content\") pod \"certified-operators-7zgnn\" (UID: \"369db8af-2908-4268-9589-a73559afc23d\") " pod="openshift-marketplace/certified-operators-7zgnn" Feb 19 22:25:15 crc kubenswrapper[4795]: I0219 22:25:15.174021 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/369db8af-2908-4268-9589-a73559afc23d-utilities\") pod \"certified-operators-7zgnn\" (UID: \"369db8af-2908-4268-9589-a73559afc23d\") " pod="openshift-marketplace/certified-operators-7zgnn" Feb 19 22:25:15 crc kubenswrapper[4795]: I0219 22:25:15.174635 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/369db8af-2908-4268-9589-a73559afc23d-catalog-content\") pod \"certified-operators-7zgnn\" (UID: \"369db8af-2908-4268-9589-a73559afc23d\") " pod="openshift-marketplace/certified-operators-7zgnn" Feb 19 22:25:15 crc 
kubenswrapper[4795]: I0219 22:25:15.174722 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/369db8af-2908-4268-9589-a73559afc23d-utilities\") pod \"certified-operators-7zgnn\" (UID: \"369db8af-2908-4268-9589-a73559afc23d\") " pod="openshift-marketplace/certified-operators-7zgnn" Feb 19 22:25:15 crc kubenswrapper[4795]: I0219 22:25:15.195310 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7jzh\" (UniqueName: \"kubernetes.io/projected/369db8af-2908-4268-9589-a73559afc23d-kube-api-access-z7jzh\") pod \"certified-operators-7zgnn\" (UID: \"369db8af-2908-4268-9589-a73559afc23d\") " pod="openshift-marketplace/certified-operators-7zgnn" Feb 19 22:25:15 crc kubenswrapper[4795]: I0219 22:25:15.332974 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7zgnn" Feb 19 22:25:15 crc kubenswrapper[4795]: I0219 22:25:15.791308 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7zgnn"] Feb 19 22:25:16 crc kubenswrapper[4795]: I0219 22:25:16.086344 4795 generic.go:334] "Generic (PLEG): container finished" podID="369db8af-2908-4268-9589-a73559afc23d" containerID="0475702ad800a7b7283d8dcc3c1114d18b94d85b153da8165185a5176c3e9d04" exitCode=0 Feb 19 22:25:16 crc kubenswrapper[4795]: I0219 22:25:16.086387 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7zgnn" event={"ID":"369db8af-2908-4268-9589-a73559afc23d","Type":"ContainerDied","Data":"0475702ad800a7b7283d8dcc3c1114d18b94d85b153da8165185a5176c3e9d04"} Feb 19 22:25:16 crc kubenswrapper[4795]: I0219 22:25:16.086410 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7zgnn" 
event={"ID":"369db8af-2908-4268-9589-a73559afc23d","Type":"ContainerStarted","Data":"7586dc34590369cf1598e1b2c64ff5204daaaef5a0c2190d004748d61e3ad765"} Feb 19 22:25:16 crc kubenswrapper[4795]: I0219 22:25:16.088650 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 22:25:17 crc kubenswrapper[4795]: I0219 22:25:17.095350 4795 generic.go:334] "Generic (PLEG): container finished" podID="369db8af-2908-4268-9589-a73559afc23d" containerID="dab2f634983d085c5c877b27a71167baa639e42c55d0cae668d3dc56006484ed" exitCode=0 Feb 19 22:25:17 crc kubenswrapper[4795]: I0219 22:25:17.095407 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7zgnn" event={"ID":"369db8af-2908-4268-9589-a73559afc23d","Type":"ContainerDied","Data":"dab2f634983d085c5c877b27a71167baa639e42c55d0cae668d3dc56006484ed"} Feb 19 22:25:18 crc kubenswrapper[4795]: I0219 22:25:18.107374 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7zgnn" event={"ID":"369db8af-2908-4268-9589-a73559afc23d","Type":"ContainerStarted","Data":"26ddb0d333102c1d7fa50e8a29a4524dbfcbd602fb6b4474c4fb485995ac3351"} Feb 19 22:25:18 crc kubenswrapper[4795]: I0219 22:25:18.132572 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7zgnn" podStartSLOduration=2.717176035 podStartE2EDuration="4.132542546s" podCreationTimestamp="2026-02-19 22:25:14 +0000 UTC" firstStartedPulling="2026-02-19 22:25:16.088445772 +0000 UTC m=+3427.280963636" lastFinishedPulling="2026-02-19 22:25:17.503812253 +0000 UTC m=+3428.696330147" observedRunningTime="2026-02-19 22:25:18.126459613 +0000 UTC m=+3429.318977497" watchObservedRunningTime="2026-02-19 22:25:18.132542546 +0000 UTC m=+3429.325060450" Feb 19 22:25:25 crc kubenswrapper[4795]: I0219 22:25:25.333811 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-7zgnn" Feb 19 22:25:25 crc kubenswrapper[4795]: I0219 22:25:25.334470 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7zgnn" Feb 19 22:25:25 crc kubenswrapper[4795]: I0219 22:25:25.387724 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7zgnn" Feb 19 22:25:26 crc kubenswrapper[4795]: I0219 22:25:26.199404 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7zgnn" Feb 19 22:25:26 crc kubenswrapper[4795]: I0219 22:25:26.248712 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7zgnn"] Feb 19 22:25:28 crc kubenswrapper[4795]: I0219 22:25:28.187475 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7zgnn" podUID="369db8af-2908-4268-9589-a73559afc23d" containerName="registry-server" containerID="cri-o://26ddb0d333102c1d7fa50e8a29a4524dbfcbd602fb6b4474c4fb485995ac3351" gracePeriod=2 Feb 19 22:25:28 crc kubenswrapper[4795]: I0219 22:25:28.751835 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7zgnn" Feb 19 22:25:28 crc kubenswrapper[4795]: I0219 22:25:28.802717 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7jzh\" (UniqueName: \"kubernetes.io/projected/369db8af-2908-4268-9589-a73559afc23d-kube-api-access-z7jzh\") pod \"369db8af-2908-4268-9589-a73559afc23d\" (UID: \"369db8af-2908-4268-9589-a73559afc23d\") " Feb 19 22:25:28 crc kubenswrapper[4795]: I0219 22:25:28.802844 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/369db8af-2908-4268-9589-a73559afc23d-catalog-content\") pod \"369db8af-2908-4268-9589-a73559afc23d\" (UID: \"369db8af-2908-4268-9589-a73559afc23d\") " Feb 19 22:25:28 crc kubenswrapper[4795]: I0219 22:25:28.802882 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/369db8af-2908-4268-9589-a73559afc23d-utilities\") pod \"369db8af-2908-4268-9589-a73559afc23d\" (UID: \"369db8af-2908-4268-9589-a73559afc23d\") " Feb 19 22:25:28 crc kubenswrapper[4795]: I0219 22:25:28.803689 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/369db8af-2908-4268-9589-a73559afc23d-utilities" (OuterVolumeSpecName: "utilities") pod "369db8af-2908-4268-9589-a73559afc23d" (UID: "369db8af-2908-4268-9589-a73559afc23d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:25:28 crc kubenswrapper[4795]: I0219 22:25:28.808271 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/369db8af-2908-4268-9589-a73559afc23d-kube-api-access-z7jzh" (OuterVolumeSpecName: "kube-api-access-z7jzh") pod "369db8af-2908-4268-9589-a73559afc23d" (UID: "369db8af-2908-4268-9589-a73559afc23d"). InnerVolumeSpecName "kube-api-access-z7jzh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:25:28 crc kubenswrapper[4795]: I0219 22:25:28.851983 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/369db8af-2908-4268-9589-a73559afc23d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "369db8af-2908-4268-9589-a73559afc23d" (UID: "369db8af-2908-4268-9589-a73559afc23d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:25:28 crc kubenswrapper[4795]: I0219 22:25:28.904076 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/369db8af-2908-4268-9589-a73559afc23d-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 22:25:28 crc kubenswrapper[4795]: I0219 22:25:28.904111 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7jzh\" (UniqueName: \"kubernetes.io/projected/369db8af-2908-4268-9589-a73559afc23d-kube-api-access-z7jzh\") on node \"crc\" DevicePath \"\"" Feb 19 22:25:28 crc kubenswrapper[4795]: I0219 22:25:28.904122 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/369db8af-2908-4268-9589-a73559afc23d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 22:25:29 crc kubenswrapper[4795]: I0219 22:25:29.198974 4795 generic.go:334] "Generic (PLEG): container finished" podID="369db8af-2908-4268-9589-a73559afc23d" containerID="26ddb0d333102c1d7fa50e8a29a4524dbfcbd602fb6b4474c4fb485995ac3351" exitCode=0 Feb 19 22:25:29 crc kubenswrapper[4795]: I0219 22:25:29.199024 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7zgnn" event={"ID":"369db8af-2908-4268-9589-a73559afc23d","Type":"ContainerDied","Data":"26ddb0d333102c1d7fa50e8a29a4524dbfcbd602fb6b4474c4fb485995ac3351"} Feb 19 22:25:29 crc kubenswrapper[4795]: I0219 22:25:29.199048 4795 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7zgnn" Feb 19 22:25:29 crc kubenswrapper[4795]: I0219 22:25:29.199067 4795 scope.go:117] "RemoveContainer" containerID="26ddb0d333102c1d7fa50e8a29a4524dbfcbd602fb6b4474c4fb485995ac3351" Feb 19 22:25:29 crc kubenswrapper[4795]: I0219 22:25:29.199054 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7zgnn" event={"ID":"369db8af-2908-4268-9589-a73559afc23d","Type":"ContainerDied","Data":"7586dc34590369cf1598e1b2c64ff5204daaaef5a0c2190d004748d61e3ad765"} Feb 19 22:25:29 crc kubenswrapper[4795]: I0219 22:25:29.233781 4795 scope.go:117] "RemoveContainer" containerID="dab2f634983d085c5c877b27a71167baa639e42c55d0cae668d3dc56006484ed" Feb 19 22:25:29 crc kubenswrapper[4795]: I0219 22:25:29.239078 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7zgnn"] Feb 19 22:25:29 crc kubenswrapper[4795]: I0219 22:25:29.244625 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7zgnn"] Feb 19 22:25:29 crc kubenswrapper[4795]: I0219 22:25:29.255604 4795 scope.go:117] "RemoveContainer" containerID="0475702ad800a7b7283d8dcc3c1114d18b94d85b153da8165185a5176c3e9d04" Feb 19 22:25:29 crc kubenswrapper[4795]: I0219 22:25:29.279134 4795 scope.go:117] "RemoveContainer" containerID="26ddb0d333102c1d7fa50e8a29a4524dbfcbd602fb6b4474c4fb485995ac3351" Feb 19 22:25:29 crc kubenswrapper[4795]: E0219 22:25:29.279825 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26ddb0d333102c1d7fa50e8a29a4524dbfcbd602fb6b4474c4fb485995ac3351\": container with ID starting with 26ddb0d333102c1d7fa50e8a29a4524dbfcbd602fb6b4474c4fb485995ac3351 not found: ID does not exist" containerID="26ddb0d333102c1d7fa50e8a29a4524dbfcbd602fb6b4474c4fb485995ac3351" Feb 19 22:25:29 crc kubenswrapper[4795]: I0219 22:25:29.279869 
4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26ddb0d333102c1d7fa50e8a29a4524dbfcbd602fb6b4474c4fb485995ac3351"} err="failed to get container status \"26ddb0d333102c1d7fa50e8a29a4524dbfcbd602fb6b4474c4fb485995ac3351\": rpc error: code = NotFound desc = could not find container \"26ddb0d333102c1d7fa50e8a29a4524dbfcbd602fb6b4474c4fb485995ac3351\": container with ID starting with 26ddb0d333102c1d7fa50e8a29a4524dbfcbd602fb6b4474c4fb485995ac3351 not found: ID does not exist" Feb 19 22:25:29 crc kubenswrapper[4795]: I0219 22:25:29.279898 4795 scope.go:117] "RemoveContainer" containerID="dab2f634983d085c5c877b27a71167baa639e42c55d0cae668d3dc56006484ed" Feb 19 22:25:29 crc kubenswrapper[4795]: E0219 22:25:29.280253 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dab2f634983d085c5c877b27a71167baa639e42c55d0cae668d3dc56006484ed\": container with ID starting with dab2f634983d085c5c877b27a71167baa639e42c55d0cae668d3dc56006484ed not found: ID does not exist" containerID="dab2f634983d085c5c877b27a71167baa639e42c55d0cae668d3dc56006484ed" Feb 19 22:25:29 crc kubenswrapper[4795]: I0219 22:25:29.280323 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dab2f634983d085c5c877b27a71167baa639e42c55d0cae668d3dc56006484ed"} err="failed to get container status \"dab2f634983d085c5c877b27a71167baa639e42c55d0cae668d3dc56006484ed\": rpc error: code = NotFound desc = could not find container \"dab2f634983d085c5c877b27a71167baa639e42c55d0cae668d3dc56006484ed\": container with ID starting with dab2f634983d085c5c877b27a71167baa639e42c55d0cae668d3dc56006484ed not found: ID does not exist" Feb 19 22:25:29 crc kubenswrapper[4795]: I0219 22:25:29.280373 4795 scope.go:117] "RemoveContainer" containerID="0475702ad800a7b7283d8dcc3c1114d18b94d85b153da8165185a5176c3e9d04" Feb 19 22:25:29 crc kubenswrapper[4795]: E0219 
22:25:29.280826 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0475702ad800a7b7283d8dcc3c1114d18b94d85b153da8165185a5176c3e9d04\": container with ID starting with 0475702ad800a7b7283d8dcc3c1114d18b94d85b153da8165185a5176c3e9d04 not found: ID does not exist" containerID="0475702ad800a7b7283d8dcc3c1114d18b94d85b153da8165185a5176c3e9d04" Feb 19 22:25:29 crc kubenswrapper[4795]: I0219 22:25:29.280869 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0475702ad800a7b7283d8dcc3c1114d18b94d85b153da8165185a5176c3e9d04"} err="failed to get container status \"0475702ad800a7b7283d8dcc3c1114d18b94d85b153da8165185a5176c3e9d04\": rpc error: code = NotFound desc = could not find container \"0475702ad800a7b7283d8dcc3c1114d18b94d85b153da8165185a5176c3e9d04\": container with ID starting with 0475702ad800a7b7283d8dcc3c1114d18b94d85b153da8165185a5176c3e9d04 not found: ID does not exist" Feb 19 22:25:29 crc kubenswrapper[4795]: I0219 22:25:29.520702 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="369db8af-2908-4268-9589-a73559afc23d" path="/var/lib/kubelet/pods/369db8af-2908-4268-9589-a73559afc23d/volumes" Feb 19 22:26:58 crc kubenswrapper[4795]: I0219 22:26:58.428106 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 22:26:58 crc kubenswrapper[4795]: I0219 22:26:58.428771 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Feb 19 22:27:00 crc kubenswrapper[4795]: I0219 22:27:00.837602 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2rc7n"] Feb 19 22:27:00 crc kubenswrapper[4795]: E0219 22:27:00.838844 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="369db8af-2908-4268-9589-a73559afc23d" containerName="extract-utilities" Feb 19 22:27:00 crc kubenswrapper[4795]: I0219 22:27:00.838878 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="369db8af-2908-4268-9589-a73559afc23d" containerName="extract-utilities" Feb 19 22:27:00 crc kubenswrapper[4795]: E0219 22:27:00.838928 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="369db8af-2908-4268-9589-a73559afc23d" containerName="registry-server" Feb 19 22:27:00 crc kubenswrapper[4795]: I0219 22:27:00.838949 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="369db8af-2908-4268-9589-a73559afc23d" containerName="registry-server" Feb 19 22:27:00 crc kubenswrapper[4795]: E0219 22:27:00.838979 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="369db8af-2908-4268-9589-a73559afc23d" containerName="extract-content" Feb 19 22:27:00 crc kubenswrapper[4795]: I0219 22:27:00.839018 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="369db8af-2908-4268-9589-a73559afc23d" containerName="extract-content" Feb 19 22:27:00 crc kubenswrapper[4795]: I0219 22:27:00.839445 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="369db8af-2908-4268-9589-a73559afc23d" containerName="registry-server" Feb 19 22:27:00 crc kubenswrapper[4795]: I0219 22:27:00.841965 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2rc7n" Feb 19 22:27:00 crc kubenswrapper[4795]: I0219 22:27:00.852290 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2rc7n"] Feb 19 22:27:00 crc kubenswrapper[4795]: I0219 22:27:00.913234 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f690af90-79ff-44b5-88ad-970bfe721e55-catalog-content\") pod \"community-operators-2rc7n\" (UID: \"f690af90-79ff-44b5-88ad-970bfe721e55\") " pod="openshift-marketplace/community-operators-2rc7n" Feb 19 22:27:00 crc kubenswrapper[4795]: I0219 22:27:00.913304 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55bbf\" (UniqueName: \"kubernetes.io/projected/f690af90-79ff-44b5-88ad-970bfe721e55-kube-api-access-55bbf\") pod \"community-operators-2rc7n\" (UID: \"f690af90-79ff-44b5-88ad-970bfe721e55\") " pod="openshift-marketplace/community-operators-2rc7n" Feb 19 22:27:00 crc kubenswrapper[4795]: I0219 22:27:00.913692 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f690af90-79ff-44b5-88ad-970bfe721e55-utilities\") pod \"community-operators-2rc7n\" (UID: \"f690af90-79ff-44b5-88ad-970bfe721e55\") " pod="openshift-marketplace/community-operators-2rc7n" Feb 19 22:27:01 crc kubenswrapper[4795]: I0219 22:27:01.015960 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f690af90-79ff-44b5-88ad-970bfe721e55-catalog-content\") pod \"community-operators-2rc7n\" (UID: \"f690af90-79ff-44b5-88ad-970bfe721e55\") " pod="openshift-marketplace/community-operators-2rc7n" Feb 19 22:27:01 crc kubenswrapper[4795]: I0219 22:27:01.016037 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-55bbf\" (UniqueName: \"kubernetes.io/projected/f690af90-79ff-44b5-88ad-970bfe721e55-kube-api-access-55bbf\") pod \"community-operators-2rc7n\" (UID: \"f690af90-79ff-44b5-88ad-970bfe721e55\") " pod="openshift-marketplace/community-operators-2rc7n" Feb 19 22:27:01 crc kubenswrapper[4795]: I0219 22:27:01.016119 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f690af90-79ff-44b5-88ad-970bfe721e55-utilities\") pod \"community-operators-2rc7n\" (UID: \"f690af90-79ff-44b5-88ad-970bfe721e55\") " pod="openshift-marketplace/community-operators-2rc7n" Feb 19 22:27:01 crc kubenswrapper[4795]: I0219 22:27:01.016647 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f690af90-79ff-44b5-88ad-970bfe721e55-catalog-content\") pod \"community-operators-2rc7n\" (UID: \"f690af90-79ff-44b5-88ad-970bfe721e55\") " pod="openshift-marketplace/community-operators-2rc7n" Feb 19 22:27:01 crc kubenswrapper[4795]: I0219 22:27:01.016675 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f690af90-79ff-44b5-88ad-970bfe721e55-utilities\") pod \"community-operators-2rc7n\" (UID: \"f690af90-79ff-44b5-88ad-970bfe721e55\") " pod="openshift-marketplace/community-operators-2rc7n" Feb 19 22:27:01 crc kubenswrapper[4795]: I0219 22:27:01.045840 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55bbf\" (UniqueName: \"kubernetes.io/projected/f690af90-79ff-44b5-88ad-970bfe721e55-kube-api-access-55bbf\") pod \"community-operators-2rc7n\" (UID: \"f690af90-79ff-44b5-88ad-970bfe721e55\") " pod="openshift-marketplace/community-operators-2rc7n" Feb 19 22:27:01 crc kubenswrapper[4795]: I0219 22:27:01.188515 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2rc7n" Feb 19 22:27:01 crc kubenswrapper[4795]: I0219 22:27:01.491523 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2rc7n"] Feb 19 22:27:01 crc kubenswrapper[4795]: I0219 22:27:01.958274 4795 generic.go:334] "Generic (PLEG): container finished" podID="f690af90-79ff-44b5-88ad-970bfe721e55" containerID="c620ac7e95c13b534e0213c772c625adfacf036efad644b1e2751c5c052a4dee" exitCode=0 Feb 19 22:27:01 crc kubenswrapper[4795]: I0219 22:27:01.958354 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2rc7n" event={"ID":"f690af90-79ff-44b5-88ad-970bfe721e55","Type":"ContainerDied","Data":"c620ac7e95c13b534e0213c772c625adfacf036efad644b1e2751c5c052a4dee"} Feb 19 22:27:01 crc kubenswrapper[4795]: I0219 22:27:01.958607 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2rc7n" event={"ID":"f690af90-79ff-44b5-88ad-970bfe721e55","Type":"ContainerStarted","Data":"0f6c7a9b355c7198402cd2d227bffb002fc0b92346bf0bc97737e356a9a96ce1"} Feb 19 22:27:02 crc kubenswrapper[4795]: I0219 22:27:02.967978 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2rc7n" event={"ID":"f690af90-79ff-44b5-88ad-970bfe721e55","Type":"ContainerStarted","Data":"dc7aea0fef383654073c21717782d8c43a203053c552cbc8b640c0e1602c27dc"} Feb 19 22:27:03 crc kubenswrapper[4795]: I0219 22:27:03.980693 4795 generic.go:334] "Generic (PLEG): container finished" podID="f690af90-79ff-44b5-88ad-970bfe721e55" containerID="dc7aea0fef383654073c21717782d8c43a203053c552cbc8b640c0e1602c27dc" exitCode=0 Feb 19 22:27:03 crc kubenswrapper[4795]: I0219 22:27:03.980768 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2rc7n" 
event={"ID":"f690af90-79ff-44b5-88ad-970bfe721e55","Type":"ContainerDied","Data":"dc7aea0fef383654073c21717782d8c43a203053c552cbc8b640c0e1602c27dc"} Feb 19 22:27:04 crc kubenswrapper[4795]: I0219 22:27:04.991958 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2rc7n" event={"ID":"f690af90-79ff-44b5-88ad-970bfe721e55","Type":"ContainerStarted","Data":"43460a0f9f63dd86db6b84af2f33b4a7517aa6852b9e2194fac2307459a54000"} Feb 19 22:27:05 crc kubenswrapper[4795]: I0219 22:27:05.016892 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2rc7n" podStartSLOduration=2.584289957 podStartE2EDuration="5.016873221s" podCreationTimestamp="2026-02-19 22:27:00 +0000 UTC" firstStartedPulling="2026-02-19 22:27:01.962575703 +0000 UTC m=+3533.155093567" lastFinishedPulling="2026-02-19 22:27:04.395158947 +0000 UTC m=+3535.587676831" observedRunningTime="2026-02-19 22:27:05.012038023 +0000 UTC m=+3536.204555917" watchObservedRunningTime="2026-02-19 22:27:05.016873221 +0000 UTC m=+3536.209391095" Feb 19 22:27:11 crc kubenswrapper[4795]: I0219 22:27:11.189259 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2rc7n" Feb 19 22:27:11 crc kubenswrapper[4795]: I0219 22:27:11.189660 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2rc7n" Feb 19 22:27:11 crc kubenswrapper[4795]: I0219 22:27:11.254422 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2rc7n" Feb 19 22:27:12 crc kubenswrapper[4795]: I0219 22:27:12.116520 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2rc7n" Feb 19 22:27:12 crc kubenswrapper[4795]: I0219 22:27:12.166918 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-2rc7n"] Feb 19 22:27:14 crc kubenswrapper[4795]: I0219 22:27:14.066925 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2rc7n" podUID="f690af90-79ff-44b5-88ad-970bfe721e55" containerName="registry-server" containerID="cri-o://43460a0f9f63dd86db6b84af2f33b4a7517aa6852b9e2194fac2307459a54000" gracePeriod=2 Feb 19 22:27:14 crc kubenswrapper[4795]: I0219 22:27:14.445017 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2rc7n" Feb 19 22:27:14 crc kubenswrapper[4795]: I0219 22:27:14.608767 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f690af90-79ff-44b5-88ad-970bfe721e55-utilities\") pod \"f690af90-79ff-44b5-88ad-970bfe721e55\" (UID: \"f690af90-79ff-44b5-88ad-970bfe721e55\") " Feb 19 22:27:14 crc kubenswrapper[4795]: I0219 22:27:14.609074 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55bbf\" (UniqueName: \"kubernetes.io/projected/f690af90-79ff-44b5-88ad-970bfe721e55-kube-api-access-55bbf\") pod \"f690af90-79ff-44b5-88ad-970bfe721e55\" (UID: \"f690af90-79ff-44b5-88ad-970bfe721e55\") " Feb 19 22:27:14 crc kubenswrapper[4795]: I0219 22:27:14.609117 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f690af90-79ff-44b5-88ad-970bfe721e55-catalog-content\") pod \"f690af90-79ff-44b5-88ad-970bfe721e55\" (UID: \"f690af90-79ff-44b5-88ad-970bfe721e55\") " Feb 19 22:27:14 crc kubenswrapper[4795]: I0219 22:27:14.609636 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f690af90-79ff-44b5-88ad-970bfe721e55-utilities" (OuterVolumeSpecName: "utilities") pod "f690af90-79ff-44b5-88ad-970bfe721e55" (UID: 
"f690af90-79ff-44b5-88ad-970bfe721e55"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:27:14 crc kubenswrapper[4795]: I0219 22:27:14.619337 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f690af90-79ff-44b5-88ad-970bfe721e55-kube-api-access-55bbf" (OuterVolumeSpecName: "kube-api-access-55bbf") pod "f690af90-79ff-44b5-88ad-970bfe721e55" (UID: "f690af90-79ff-44b5-88ad-970bfe721e55"). InnerVolumeSpecName "kube-api-access-55bbf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:27:14 crc kubenswrapper[4795]: I0219 22:27:14.659861 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f690af90-79ff-44b5-88ad-970bfe721e55-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f690af90-79ff-44b5-88ad-970bfe721e55" (UID: "f690af90-79ff-44b5-88ad-970bfe721e55"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:27:14 crc kubenswrapper[4795]: I0219 22:27:14.710815 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f690af90-79ff-44b5-88ad-970bfe721e55-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 22:27:14 crc kubenswrapper[4795]: I0219 22:27:14.710857 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55bbf\" (UniqueName: \"kubernetes.io/projected/f690af90-79ff-44b5-88ad-970bfe721e55-kube-api-access-55bbf\") on node \"crc\" DevicePath \"\"" Feb 19 22:27:14 crc kubenswrapper[4795]: I0219 22:27:14.710872 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f690af90-79ff-44b5-88ad-970bfe721e55-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 22:27:15 crc kubenswrapper[4795]: I0219 22:27:15.093860 4795 generic.go:334] "Generic (PLEG): container finished" 
podID="f690af90-79ff-44b5-88ad-970bfe721e55" containerID="43460a0f9f63dd86db6b84af2f33b4a7517aa6852b9e2194fac2307459a54000" exitCode=0 Feb 19 22:27:15 crc kubenswrapper[4795]: I0219 22:27:15.093922 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2rc7n" event={"ID":"f690af90-79ff-44b5-88ad-970bfe721e55","Type":"ContainerDied","Data":"43460a0f9f63dd86db6b84af2f33b4a7517aa6852b9e2194fac2307459a54000"} Feb 19 22:27:15 crc kubenswrapper[4795]: I0219 22:27:15.093947 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2rc7n" event={"ID":"f690af90-79ff-44b5-88ad-970bfe721e55","Type":"ContainerDied","Data":"0f6c7a9b355c7198402cd2d227bffb002fc0b92346bf0bc97737e356a9a96ce1"} Feb 19 22:27:15 crc kubenswrapper[4795]: I0219 22:27:15.093962 4795 scope.go:117] "RemoveContainer" containerID="43460a0f9f63dd86db6b84af2f33b4a7517aa6852b9e2194fac2307459a54000" Feb 19 22:27:15 crc kubenswrapper[4795]: I0219 22:27:15.094079 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2rc7n" Feb 19 22:27:15 crc kubenswrapper[4795]: I0219 22:27:15.122614 4795 scope.go:117] "RemoveContainer" containerID="dc7aea0fef383654073c21717782d8c43a203053c552cbc8b640c0e1602c27dc" Feb 19 22:27:15 crc kubenswrapper[4795]: I0219 22:27:15.129865 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2rc7n"] Feb 19 22:27:15 crc kubenswrapper[4795]: I0219 22:27:15.136010 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2rc7n"] Feb 19 22:27:15 crc kubenswrapper[4795]: I0219 22:27:15.140176 4795 scope.go:117] "RemoveContainer" containerID="c620ac7e95c13b534e0213c772c625adfacf036efad644b1e2751c5c052a4dee" Feb 19 22:27:15 crc kubenswrapper[4795]: I0219 22:27:15.159656 4795 scope.go:117] "RemoveContainer" containerID="43460a0f9f63dd86db6b84af2f33b4a7517aa6852b9e2194fac2307459a54000" Feb 19 22:27:15 crc kubenswrapper[4795]: E0219 22:27:15.160129 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43460a0f9f63dd86db6b84af2f33b4a7517aa6852b9e2194fac2307459a54000\": container with ID starting with 43460a0f9f63dd86db6b84af2f33b4a7517aa6852b9e2194fac2307459a54000 not found: ID does not exist" containerID="43460a0f9f63dd86db6b84af2f33b4a7517aa6852b9e2194fac2307459a54000" Feb 19 22:27:15 crc kubenswrapper[4795]: I0219 22:27:15.160160 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43460a0f9f63dd86db6b84af2f33b4a7517aa6852b9e2194fac2307459a54000"} err="failed to get container status \"43460a0f9f63dd86db6b84af2f33b4a7517aa6852b9e2194fac2307459a54000\": rpc error: code = NotFound desc = could not find container \"43460a0f9f63dd86db6b84af2f33b4a7517aa6852b9e2194fac2307459a54000\": container with ID starting with 43460a0f9f63dd86db6b84af2f33b4a7517aa6852b9e2194fac2307459a54000 not 
found: ID does not exist" Feb 19 22:27:15 crc kubenswrapper[4795]: I0219 22:27:15.160202 4795 scope.go:117] "RemoveContainer" containerID="dc7aea0fef383654073c21717782d8c43a203053c552cbc8b640c0e1602c27dc" Feb 19 22:27:15 crc kubenswrapper[4795]: E0219 22:27:15.160427 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc7aea0fef383654073c21717782d8c43a203053c552cbc8b640c0e1602c27dc\": container with ID starting with dc7aea0fef383654073c21717782d8c43a203053c552cbc8b640c0e1602c27dc not found: ID does not exist" containerID="dc7aea0fef383654073c21717782d8c43a203053c552cbc8b640c0e1602c27dc" Feb 19 22:27:15 crc kubenswrapper[4795]: I0219 22:27:15.160456 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc7aea0fef383654073c21717782d8c43a203053c552cbc8b640c0e1602c27dc"} err="failed to get container status \"dc7aea0fef383654073c21717782d8c43a203053c552cbc8b640c0e1602c27dc\": rpc error: code = NotFound desc = could not find container \"dc7aea0fef383654073c21717782d8c43a203053c552cbc8b640c0e1602c27dc\": container with ID starting with dc7aea0fef383654073c21717782d8c43a203053c552cbc8b640c0e1602c27dc not found: ID does not exist" Feb 19 22:27:15 crc kubenswrapper[4795]: I0219 22:27:15.160472 4795 scope.go:117] "RemoveContainer" containerID="c620ac7e95c13b534e0213c772c625adfacf036efad644b1e2751c5c052a4dee" Feb 19 22:27:15 crc kubenswrapper[4795]: E0219 22:27:15.160694 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c620ac7e95c13b534e0213c772c625adfacf036efad644b1e2751c5c052a4dee\": container with ID starting with c620ac7e95c13b534e0213c772c625adfacf036efad644b1e2751c5c052a4dee not found: ID does not exist" containerID="c620ac7e95c13b534e0213c772c625adfacf036efad644b1e2751c5c052a4dee" Feb 19 22:27:15 crc kubenswrapper[4795]: I0219 22:27:15.160719 4795 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c620ac7e95c13b534e0213c772c625adfacf036efad644b1e2751c5c052a4dee"} err="failed to get container status \"c620ac7e95c13b534e0213c772c625adfacf036efad644b1e2751c5c052a4dee\": rpc error: code = NotFound desc = could not find container \"c620ac7e95c13b534e0213c772c625adfacf036efad644b1e2751c5c052a4dee\": container with ID starting with c620ac7e95c13b534e0213c772c625adfacf036efad644b1e2751c5c052a4dee not found: ID does not exist" Feb 19 22:27:15 crc kubenswrapper[4795]: I0219 22:27:15.521875 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f690af90-79ff-44b5-88ad-970bfe721e55" path="/var/lib/kubelet/pods/f690af90-79ff-44b5-88ad-970bfe721e55/volumes" Feb 19 22:27:28 crc kubenswrapper[4795]: I0219 22:27:28.428020 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 22:27:28 crc kubenswrapper[4795]: I0219 22:27:28.429093 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 22:27:58 crc kubenswrapper[4795]: I0219 22:27:58.427325 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 22:27:58 crc kubenswrapper[4795]: I0219 22:27:58.428189 4795 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 22:27:58 crc kubenswrapper[4795]: I0219 22:27:58.428267 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" Feb 19 22:27:58 crc kubenswrapper[4795]: I0219 22:27:58.429325 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9c63131d970040b18f3ed947da107f77f33be5db6a5f5930540c14c137bd2e9f"} pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 22:27:58 crc kubenswrapper[4795]: I0219 22:27:58.429445 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" containerID="cri-o://9c63131d970040b18f3ed947da107f77f33be5db6a5f5930540c14c137bd2e9f" gracePeriod=600 Feb 19 22:27:58 crc kubenswrapper[4795]: E0219 22:27:58.548074 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:27:59 crc kubenswrapper[4795]: I0219 22:27:59.428084 4795 generic.go:334] "Generic (PLEG): container finished" podID="7591bc58-96f5-486a-8653-0ad93938b019" 
containerID="9c63131d970040b18f3ed947da107f77f33be5db6a5f5930540c14c137bd2e9f" exitCode=0 Feb 19 22:27:59 crc kubenswrapper[4795]: I0219 22:27:59.428147 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerDied","Data":"9c63131d970040b18f3ed947da107f77f33be5db6a5f5930540c14c137bd2e9f"} Feb 19 22:27:59 crc kubenswrapper[4795]: I0219 22:27:59.428218 4795 scope.go:117] "RemoveContainer" containerID="0edfa681896b7b4c4053886f43d282fa1e3dc3cde5816d95a6de237745fabdcb" Feb 19 22:27:59 crc kubenswrapper[4795]: I0219 22:27:59.428891 4795 scope.go:117] "RemoveContainer" containerID="9c63131d970040b18f3ed947da107f77f33be5db6a5f5930540c14c137bd2e9f" Feb 19 22:27:59 crc kubenswrapper[4795]: E0219 22:27:59.429219 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:28:13 crc kubenswrapper[4795]: I0219 22:28:13.511726 4795 scope.go:117] "RemoveContainer" containerID="9c63131d970040b18f3ed947da107f77f33be5db6a5f5930540c14c137bd2e9f" Feb 19 22:28:13 crc kubenswrapper[4795]: E0219 22:28:13.512660 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:28:25 crc kubenswrapper[4795]: I0219 
22:28:25.512193 4795 scope.go:117] "RemoveContainer" containerID="9c63131d970040b18f3ed947da107f77f33be5db6a5f5930540c14c137bd2e9f" Feb 19 22:28:25 crc kubenswrapper[4795]: E0219 22:28:25.512941 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:28:37 crc kubenswrapper[4795]: I0219 22:28:37.512024 4795 scope.go:117] "RemoveContainer" containerID="9c63131d970040b18f3ed947da107f77f33be5db6a5f5930540c14c137bd2e9f" Feb 19 22:28:37 crc kubenswrapper[4795]: E0219 22:28:37.513029 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:28:49 crc kubenswrapper[4795]: I0219 22:28:49.519500 4795 scope.go:117] "RemoveContainer" containerID="9c63131d970040b18f3ed947da107f77f33be5db6a5f5930540c14c137bd2e9f" Feb 19 22:28:49 crc kubenswrapper[4795]: E0219 22:28:49.520569 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:29:04 crc 
kubenswrapper[4795]: I0219 22:29:04.512020 4795 scope.go:117] "RemoveContainer" containerID="9c63131d970040b18f3ed947da107f77f33be5db6a5f5930540c14c137bd2e9f" Feb 19 22:29:04 crc kubenswrapper[4795]: E0219 22:29:04.513038 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:29:16 crc kubenswrapper[4795]: I0219 22:29:16.513402 4795 scope.go:117] "RemoveContainer" containerID="9c63131d970040b18f3ed947da107f77f33be5db6a5f5930540c14c137bd2e9f" Feb 19 22:29:16 crc kubenswrapper[4795]: E0219 22:29:16.514426 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:29:31 crc kubenswrapper[4795]: I0219 22:29:31.511799 4795 scope.go:117] "RemoveContainer" containerID="9c63131d970040b18f3ed947da107f77f33be5db6a5f5930540c14c137bd2e9f" Feb 19 22:29:31 crc kubenswrapper[4795]: E0219 22:29:31.512874 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 
19 22:29:43 crc kubenswrapper[4795]: I0219 22:29:43.512856 4795 scope.go:117] "RemoveContainer" containerID="9c63131d970040b18f3ed947da107f77f33be5db6a5f5930540c14c137bd2e9f" Feb 19 22:29:43 crc kubenswrapper[4795]: E0219 22:29:43.513473 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:29:58 crc kubenswrapper[4795]: I0219 22:29:58.511262 4795 scope.go:117] "RemoveContainer" containerID="9c63131d970040b18f3ed947da107f77f33be5db6a5f5930540c14c137bd2e9f" Feb 19 22:29:58 crc kubenswrapper[4795]: E0219 22:29:58.512400 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:30:00 crc kubenswrapper[4795]: I0219 22:30:00.181376 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525670-bx5qr"] Feb 19 22:30:00 crc kubenswrapper[4795]: E0219 22:30:00.181812 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f690af90-79ff-44b5-88ad-970bfe721e55" containerName="extract-content" Feb 19 22:30:00 crc kubenswrapper[4795]: I0219 22:30:00.181829 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f690af90-79ff-44b5-88ad-970bfe721e55" containerName="extract-content" Feb 19 22:30:00 crc kubenswrapper[4795]: E0219 22:30:00.181845 4795 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f690af90-79ff-44b5-88ad-970bfe721e55" containerName="registry-server" Feb 19 22:30:00 crc kubenswrapper[4795]: I0219 22:30:00.181852 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f690af90-79ff-44b5-88ad-970bfe721e55" containerName="registry-server" Feb 19 22:30:00 crc kubenswrapper[4795]: E0219 22:30:00.181867 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f690af90-79ff-44b5-88ad-970bfe721e55" containerName="extract-utilities" Feb 19 22:30:00 crc kubenswrapper[4795]: I0219 22:30:00.181879 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f690af90-79ff-44b5-88ad-970bfe721e55" containerName="extract-utilities" Feb 19 22:30:00 crc kubenswrapper[4795]: I0219 22:30:00.182046 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f690af90-79ff-44b5-88ad-970bfe721e55" containerName="registry-server" Feb 19 22:30:00 crc kubenswrapper[4795]: I0219 22:30:00.182837 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525670-bx5qr" Feb 19 22:30:00 crc kubenswrapper[4795]: I0219 22:30:00.185394 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 19 22:30:00 crc kubenswrapper[4795]: I0219 22:30:00.185915 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 19 22:30:00 crc kubenswrapper[4795]: I0219 22:30:00.187634 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525670-bx5qr"] Feb 19 22:30:00 crc kubenswrapper[4795]: I0219 22:30:00.232983 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e8d7fc5a-2c38-45d1-92d4-e30329082e49-config-volume\") pod \"collect-profiles-29525670-bx5qr\" (UID: \"e8d7fc5a-2c38-45d1-92d4-e30329082e49\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525670-bx5qr" Feb 19 22:30:00 crc kubenswrapper[4795]: I0219 22:30:00.233346 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e8d7fc5a-2c38-45d1-92d4-e30329082e49-secret-volume\") pod \"collect-profiles-29525670-bx5qr\" (UID: \"e8d7fc5a-2c38-45d1-92d4-e30329082e49\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525670-bx5qr" Feb 19 22:30:00 crc kubenswrapper[4795]: I0219 22:30:00.233490 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f62pp\" (UniqueName: \"kubernetes.io/projected/e8d7fc5a-2c38-45d1-92d4-e30329082e49-kube-api-access-f62pp\") pod \"collect-profiles-29525670-bx5qr\" (UID: \"e8d7fc5a-2c38-45d1-92d4-e30329082e49\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29525670-bx5qr" Feb 19 22:30:00 crc kubenswrapper[4795]: I0219 22:30:00.334883 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e8d7fc5a-2c38-45d1-92d4-e30329082e49-config-volume\") pod \"collect-profiles-29525670-bx5qr\" (UID: \"e8d7fc5a-2c38-45d1-92d4-e30329082e49\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525670-bx5qr" Feb 19 22:30:00 crc kubenswrapper[4795]: I0219 22:30:00.335210 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e8d7fc5a-2c38-45d1-92d4-e30329082e49-secret-volume\") pod \"collect-profiles-29525670-bx5qr\" (UID: \"e8d7fc5a-2c38-45d1-92d4-e30329082e49\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525670-bx5qr" Feb 19 22:30:00 crc kubenswrapper[4795]: I0219 22:30:00.335339 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f62pp\" (UniqueName: \"kubernetes.io/projected/e8d7fc5a-2c38-45d1-92d4-e30329082e49-kube-api-access-f62pp\") pod \"collect-profiles-29525670-bx5qr\" (UID: \"e8d7fc5a-2c38-45d1-92d4-e30329082e49\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525670-bx5qr" Feb 19 22:30:00 crc kubenswrapper[4795]: I0219 22:30:00.335729 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e8d7fc5a-2c38-45d1-92d4-e30329082e49-config-volume\") pod \"collect-profiles-29525670-bx5qr\" (UID: \"e8d7fc5a-2c38-45d1-92d4-e30329082e49\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525670-bx5qr" Feb 19 22:30:00 crc kubenswrapper[4795]: I0219 22:30:00.345811 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/e8d7fc5a-2c38-45d1-92d4-e30329082e49-secret-volume\") pod \"collect-profiles-29525670-bx5qr\" (UID: \"e8d7fc5a-2c38-45d1-92d4-e30329082e49\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525670-bx5qr" Feb 19 22:30:00 crc kubenswrapper[4795]: I0219 22:30:00.353055 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f62pp\" (UniqueName: \"kubernetes.io/projected/e8d7fc5a-2c38-45d1-92d4-e30329082e49-kube-api-access-f62pp\") pod \"collect-profiles-29525670-bx5qr\" (UID: \"e8d7fc5a-2c38-45d1-92d4-e30329082e49\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525670-bx5qr" Feb 19 22:30:00 crc kubenswrapper[4795]: I0219 22:30:00.509385 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525670-bx5qr" Feb 19 22:30:00 crc kubenswrapper[4795]: I0219 22:30:00.922602 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525670-bx5qr"] Feb 19 22:30:00 crc kubenswrapper[4795]: W0219 22:30:00.934907 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8d7fc5a_2c38_45d1_92d4_e30329082e49.slice/crio-9050471d75442fed58e886865edb6229d98e97f463caeb47fb3ee9fd37f0c6ea WatchSource:0}: Error finding container 9050471d75442fed58e886865edb6229d98e97f463caeb47fb3ee9fd37f0c6ea: Status 404 returned error can't find the container with id 9050471d75442fed58e886865edb6229d98e97f463caeb47fb3ee9fd37f0c6ea Feb 19 22:30:01 crc kubenswrapper[4795]: I0219 22:30:01.356467 4795 generic.go:334] "Generic (PLEG): container finished" podID="e8d7fc5a-2c38-45d1-92d4-e30329082e49" containerID="2891e05af0080148a30c661d705a64987123339160913040e4d09a5170f489c1" exitCode=0 Feb 19 22:30:01 crc kubenswrapper[4795]: I0219 22:30:01.356511 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29525670-bx5qr" event={"ID":"e8d7fc5a-2c38-45d1-92d4-e30329082e49","Type":"ContainerDied","Data":"2891e05af0080148a30c661d705a64987123339160913040e4d09a5170f489c1"} Feb 19 22:30:01 crc kubenswrapper[4795]: I0219 22:30:01.356806 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525670-bx5qr" event={"ID":"e8d7fc5a-2c38-45d1-92d4-e30329082e49","Type":"ContainerStarted","Data":"9050471d75442fed58e886865edb6229d98e97f463caeb47fb3ee9fd37f0c6ea"} Feb 19 22:30:02 crc kubenswrapper[4795]: I0219 22:30:02.628840 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525670-bx5qr" Feb 19 22:30:02 crc kubenswrapper[4795]: I0219 22:30:02.766867 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e8d7fc5a-2c38-45d1-92d4-e30329082e49-secret-volume\") pod \"e8d7fc5a-2c38-45d1-92d4-e30329082e49\" (UID: \"e8d7fc5a-2c38-45d1-92d4-e30329082e49\") " Feb 19 22:30:02 crc kubenswrapper[4795]: I0219 22:30:02.766946 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e8d7fc5a-2c38-45d1-92d4-e30329082e49-config-volume\") pod \"e8d7fc5a-2c38-45d1-92d4-e30329082e49\" (UID: \"e8d7fc5a-2c38-45d1-92d4-e30329082e49\") " Feb 19 22:30:02 crc kubenswrapper[4795]: I0219 22:30:02.767052 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f62pp\" (UniqueName: \"kubernetes.io/projected/e8d7fc5a-2c38-45d1-92d4-e30329082e49-kube-api-access-f62pp\") pod \"e8d7fc5a-2c38-45d1-92d4-e30329082e49\" (UID: \"e8d7fc5a-2c38-45d1-92d4-e30329082e49\") " Feb 19 22:30:02 crc kubenswrapper[4795]: I0219 22:30:02.767681 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/e8d7fc5a-2c38-45d1-92d4-e30329082e49-config-volume" (OuterVolumeSpecName: "config-volume") pod "e8d7fc5a-2c38-45d1-92d4-e30329082e49" (UID: "e8d7fc5a-2c38-45d1-92d4-e30329082e49"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:30:02 crc kubenswrapper[4795]: I0219 22:30:02.771722 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8d7fc5a-2c38-45d1-92d4-e30329082e49-kube-api-access-f62pp" (OuterVolumeSpecName: "kube-api-access-f62pp") pod "e8d7fc5a-2c38-45d1-92d4-e30329082e49" (UID: "e8d7fc5a-2c38-45d1-92d4-e30329082e49"). InnerVolumeSpecName "kube-api-access-f62pp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:30:02 crc kubenswrapper[4795]: I0219 22:30:02.771998 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8d7fc5a-2c38-45d1-92d4-e30329082e49-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e8d7fc5a-2c38-45d1-92d4-e30329082e49" (UID: "e8d7fc5a-2c38-45d1-92d4-e30329082e49"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:30:02 crc kubenswrapper[4795]: I0219 22:30:02.869720 4795 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e8d7fc5a-2c38-45d1-92d4-e30329082e49-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 19 22:30:02 crc kubenswrapper[4795]: I0219 22:30:02.870029 4795 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e8d7fc5a-2c38-45d1-92d4-e30329082e49-config-volume\") on node \"crc\" DevicePath \"\"" Feb 19 22:30:02 crc kubenswrapper[4795]: I0219 22:30:02.870141 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f62pp\" (UniqueName: \"kubernetes.io/projected/e8d7fc5a-2c38-45d1-92d4-e30329082e49-kube-api-access-f62pp\") on node \"crc\" DevicePath \"\"" Feb 19 22:30:03 crc kubenswrapper[4795]: I0219 22:30:03.370316 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525670-bx5qr" event={"ID":"e8d7fc5a-2c38-45d1-92d4-e30329082e49","Type":"ContainerDied","Data":"9050471d75442fed58e886865edb6229d98e97f463caeb47fb3ee9fd37f0c6ea"} Feb 19 22:30:03 crc kubenswrapper[4795]: I0219 22:30:03.370377 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9050471d75442fed58e886865edb6229d98e97f463caeb47fb3ee9fd37f0c6ea" Feb 19 22:30:03 crc kubenswrapper[4795]: I0219 22:30:03.370441 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525670-bx5qr" Feb 19 22:30:03 crc kubenswrapper[4795]: I0219 22:30:03.707667 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525625-kzr2v"] Feb 19 22:30:03 crc kubenswrapper[4795]: I0219 22:30:03.717315 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525625-kzr2v"] Feb 19 22:30:05 crc kubenswrapper[4795]: I0219 22:30:05.523689 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1627c007-5a7c-4fa5-a15f-0da43560c849" path="/var/lib/kubelet/pods/1627c007-5a7c-4fa5-a15f-0da43560c849/volumes" Feb 19 22:30:13 crc kubenswrapper[4795]: I0219 22:30:13.512060 4795 scope.go:117] "RemoveContainer" containerID="9c63131d970040b18f3ed947da107f77f33be5db6a5f5930540c14c137bd2e9f" Feb 19 22:30:13 crc kubenswrapper[4795]: E0219 22:30:13.512906 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:30:14 crc kubenswrapper[4795]: I0219 22:30:14.417996 4795 scope.go:117] "RemoveContainer" containerID="6d51fcadf72cdef4da5baae1dcfd6cb93884c814c13eaf31041e3c1336f34a93" Feb 19 22:30:25 crc kubenswrapper[4795]: I0219 22:30:25.512679 4795 scope.go:117] "RemoveContainer" containerID="9c63131d970040b18f3ed947da107f77f33be5db6a5f5930540c14c137bd2e9f" Feb 19 22:30:25 crc kubenswrapper[4795]: E0219 22:30:25.513471 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:30:36 crc kubenswrapper[4795]: I0219 22:30:36.513096 4795 scope.go:117] "RemoveContainer" containerID="9c63131d970040b18f3ed947da107f77f33be5db6a5f5930540c14c137bd2e9f" Feb 19 22:30:36 crc kubenswrapper[4795]: E0219 22:30:36.514025 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:30:48 crc kubenswrapper[4795]: I0219 22:30:48.512751 4795 scope.go:117] "RemoveContainer" containerID="9c63131d970040b18f3ed947da107f77f33be5db6a5f5930540c14c137bd2e9f" Feb 19 22:30:48 crc kubenswrapper[4795]: E0219 22:30:48.513788 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:31:03 crc kubenswrapper[4795]: I0219 22:31:03.512587 4795 scope.go:117] "RemoveContainer" containerID="9c63131d970040b18f3ed947da107f77f33be5db6a5f5930540c14c137bd2e9f" Feb 19 22:31:03 crc kubenswrapper[4795]: E0219 22:31:03.513309 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:31:17 crc kubenswrapper[4795]: I0219 22:31:17.511379 4795 scope.go:117] "RemoveContainer" containerID="9c63131d970040b18f3ed947da107f77f33be5db6a5f5930540c14c137bd2e9f" Feb 19 22:31:17 crc kubenswrapper[4795]: E0219 22:31:17.511952 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:31:30 crc kubenswrapper[4795]: I0219 22:31:30.511408 4795 scope.go:117] "RemoveContainer" containerID="9c63131d970040b18f3ed947da107f77f33be5db6a5f5930540c14c137bd2e9f" Feb 19 22:31:30 crc kubenswrapper[4795]: E0219 22:31:30.512143 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:31:43 crc kubenswrapper[4795]: I0219 22:31:43.515426 4795 scope.go:117] "RemoveContainer" containerID="9c63131d970040b18f3ed947da107f77f33be5db6a5f5930540c14c137bd2e9f" Feb 19 22:31:43 crc kubenswrapper[4795]: E0219 22:31:43.516247 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:31:56 crc kubenswrapper[4795]: I0219 22:31:56.511930 4795 scope.go:117] "RemoveContainer" containerID="9c63131d970040b18f3ed947da107f77f33be5db6a5f5930540c14c137bd2e9f" Feb 19 22:31:56 crc kubenswrapper[4795]: E0219 22:31:56.512741 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:32:09 crc kubenswrapper[4795]: I0219 22:32:09.275488 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fjjbn"] Feb 19 22:32:09 crc kubenswrapper[4795]: E0219 22:32:09.276670 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8d7fc5a-2c38-45d1-92d4-e30329082e49" containerName="collect-profiles" Feb 19 22:32:09 crc kubenswrapper[4795]: I0219 22:32:09.276695 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8d7fc5a-2c38-45d1-92d4-e30329082e49" containerName="collect-profiles" Feb 19 22:32:09 crc kubenswrapper[4795]: I0219 22:32:09.277007 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8d7fc5a-2c38-45d1-92d4-e30329082e49" containerName="collect-profiles" Feb 19 22:32:09 crc kubenswrapper[4795]: I0219 22:32:09.278899 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fjjbn" Feb 19 22:32:09 crc kubenswrapper[4795]: I0219 22:32:09.299810 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fjjbn"] Feb 19 22:32:09 crc kubenswrapper[4795]: I0219 22:32:09.340107 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9216072e-17e9-4ae1-8bfb-52ce38a26f21-utilities\") pod \"redhat-marketplace-fjjbn\" (UID: \"9216072e-17e9-4ae1-8bfb-52ce38a26f21\") " pod="openshift-marketplace/redhat-marketplace-fjjbn" Feb 19 22:32:09 crc kubenswrapper[4795]: I0219 22:32:09.340278 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9216072e-17e9-4ae1-8bfb-52ce38a26f21-catalog-content\") pod \"redhat-marketplace-fjjbn\" (UID: \"9216072e-17e9-4ae1-8bfb-52ce38a26f21\") " pod="openshift-marketplace/redhat-marketplace-fjjbn" Feb 19 22:32:09 crc kubenswrapper[4795]: I0219 22:32:09.340423 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmzjs\" (UniqueName: \"kubernetes.io/projected/9216072e-17e9-4ae1-8bfb-52ce38a26f21-kube-api-access-cmzjs\") pod \"redhat-marketplace-fjjbn\" (UID: \"9216072e-17e9-4ae1-8bfb-52ce38a26f21\") " pod="openshift-marketplace/redhat-marketplace-fjjbn" Feb 19 22:32:09 crc kubenswrapper[4795]: I0219 22:32:09.483822 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmzjs\" (UniqueName: \"kubernetes.io/projected/9216072e-17e9-4ae1-8bfb-52ce38a26f21-kube-api-access-cmzjs\") pod \"redhat-marketplace-fjjbn\" (UID: \"9216072e-17e9-4ae1-8bfb-52ce38a26f21\") " pod="openshift-marketplace/redhat-marketplace-fjjbn" Feb 19 22:32:09 crc kubenswrapper[4795]: I0219 22:32:09.483898 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9216072e-17e9-4ae1-8bfb-52ce38a26f21-utilities\") pod \"redhat-marketplace-fjjbn\" (UID: \"9216072e-17e9-4ae1-8bfb-52ce38a26f21\") " pod="openshift-marketplace/redhat-marketplace-fjjbn" Feb 19 22:32:09 crc kubenswrapper[4795]: I0219 22:32:09.483929 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9216072e-17e9-4ae1-8bfb-52ce38a26f21-catalog-content\") pod \"redhat-marketplace-fjjbn\" (UID: \"9216072e-17e9-4ae1-8bfb-52ce38a26f21\") " pod="openshift-marketplace/redhat-marketplace-fjjbn" Feb 19 22:32:09 crc kubenswrapper[4795]: I0219 22:32:09.484362 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9216072e-17e9-4ae1-8bfb-52ce38a26f21-catalog-content\") pod \"redhat-marketplace-fjjbn\" (UID: \"9216072e-17e9-4ae1-8bfb-52ce38a26f21\") " pod="openshift-marketplace/redhat-marketplace-fjjbn" Feb 19 22:32:09 crc kubenswrapper[4795]: I0219 22:32:09.484623 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9216072e-17e9-4ae1-8bfb-52ce38a26f21-utilities\") pod \"redhat-marketplace-fjjbn\" (UID: \"9216072e-17e9-4ae1-8bfb-52ce38a26f21\") " pod="openshift-marketplace/redhat-marketplace-fjjbn" Feb 19 22:32:09 crc kubenswrapper[4795]: I0219 22:32:09.507463 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmzjs\" (UniqueName: \"kubernetes.io/projected/9216072e-17e9-4ae1-8bfb-52ce38a26f21-kube-api-access-cmzjs\") pod \"redhat-marketplace-fjjbn\" (UID: \"9216072e-17e9-4ae1-8bfb-52ce38a26f21\") " pod="openshift-marketplace/redhat-marketplace-fjjbn" Feb 19 22:32:09 crc kubenswrapper[4795]: I0219 22:32:09.605374 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fjjbn" Feb 19 22:32:10 crc kubenswrapper[4795]: I0219 22:32:10.086205 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fjjbn"] Feb 19 22:32:10 crc kubenswrapper[4795]: I0219 22:32:10.356008 4795 generic.go:334] "Generic (PLEG): container finished" podID="9216072e-17e9-4ae1-8bfb-52ce38a26f21" containerID="5bc6d00036b8bfee908f4f769e91e7cd738a63d0c687f7e81771cab88749eafa" exitCode=0 Feb 19 22:32:10 crc kubenswrapper[4795]: I0219 22:32:10.356087 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fjjbn" event={"ID":"9216072e-17e9-4ae1-8bfb-52ce38a26f21","Type":"ContainerDied","Data":"5bc6d00036b8bfee908f4f769e91e7cd738a63d0c687f7e81771cab88749eafa"} Feb 19 22:32:10 crc kubenswrapper[4795]: I0219 22:32:10.356338 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fjjbn" event={"ID":"9216072e-17e9-4ae1-8bfb-52ce38a26f21","Type":"ContainerStarted","Data":"fd8dea2cc9816bec097755dd6dba91e5daa3982abf647afa038114a1324c1e97"} Feb 19 22:32:10 crc kubenswrapper[4795]: I0219 22:32:10.358883 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 22:32:10 crc kubenswrapper[4795]: I0219 22:32:10.512854 4795 scope.go:117] "RemoveContainer" containerID="9c63131d970040b18f3ed947da107f77f33be5db6a5f5930540c14c137bd2e9f" Feb 19 22:32:10 crc kubenswrapper[4795]: E0219 22:32:10.513248 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 
22:32:11 crc kubenswrapper[4795]: I0219 22:32:11.367827 4795 generic.go:334] "Generic (PLEG): container finished" podID="9216072e-17e9-4ae1-8bfb-52ce38a26f21" containerID="6c8e7b1979af7fb73cfd3bd6373f9f5a03eed0729d0d9ab939d133908788ce3e" exitCode=0 Feb 19 22:32:11 crc kubenswrapper[4795]: I0219 22:32:11.367884 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fjjbn" event={"ID":"9216072e-17e9-4ae1-8bfb-52ce38a26f21","Type":"ContainerDied","Data":"6c8e7b1979af7fb73cfd3bd6373f9f5a03eed0729d0d9ab939d133908788ce3e"} Feb 19 22:32:12 crc kubenswrapper[4795]: I0219 22:32:12.376649 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fjjbn" event={"ID":"9216072e-17e9-4ae1-8bfb-52ce38a26f21","Type":"ContainerStarted","Data":"1bffb3ce7c00123b1a58cad03352ac7ea4a5fd8034f223383885b43909c36dfe"} Feb 19 22:32:12 crc kubenswrapper[4795]: I0219 22:32:12.395383 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fjjbn" podStartSLOduration=1.9136564470000001 podStartE2EDuration="3.39536524s" podCreationTimestamp="2026-02-19 22:32:09 +0000 UTC" firstStartedPulling="2026-02-19 22:32:10.358499767 +0000 UTC m=+3841.551017661" lastFinishedPulling="2026-02-19 22:32:11.84020859 +0000 UTC m=+3843.032726454" observedRunningTime="2026-02-19 22:32:12.391426888 +0000 UTC m=+3843.583944752" watchObservedRunningTime="2026-02-19 22:32:12.39536524 +0000 UTC m=+3843.587883104" Feb 19 22:32:19 crc kubenswrapper[4795]: I0219 22:32:19.605955 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fjjbn" Feb 19 22:32:19 crc kubenswrapper[4795]: I0219 22:32:19.606727 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fjjbn" Feb 19 22:32:19 crc kubenswrapper[4795]: I0219 22:32:19.665576 4795 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fjjbn" Feb 19 22:32:20 crc kubenswrapper[4795]: I0219 22:32:20.495721 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fjjbn" Feb 19 22:32:20 crc kubenswrapper[4795]: I0219 22:32:20.552058 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fjjbn"] Feb 19 22:32:22 crc kubenswrapper[4795]: I0219 22:32:22.444920 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-fjjbn" podUID="9216072e-17e9-4ae1-8bfb-52ce38a26f21" containerName="registry-server" containerID="cri-o://1bffb3ce7c00123b1a58cad03352ac7ea4a5fd8034f223383885b43909c36dfe" gracePeriod=2 Feb 19 22:32:22 crc kubenswrapper[4795]: I0219 22:32:22.961655 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fjjbn" Feb 19 22:32:23 crc kubenswrapper[4795]: I0219 22:32:23.105322 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmzjs\" (UniqueName: \"kubernetes.io/projected/9216072e-17e9-4ae1-8bfb-52ce38a26f21-kube-api-access-cmzjs\") pod \"9216072e-17e9-4ae1-8bfb-52ce38a26f21\" (UID: \"9216072e-17e9-4ae1-8bfb-52ce38a26f21\") " Feb 19 22:32:23 crc kubenswrapper[4795]: I0219 22:32:23.105376 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9216072e-17e9-4ae1-8bfb-52ce38a26f21-utilities\") pod \"9216072e-17e9-4ae1-8bfb-52ce38a26f21\" (UID: \"9216072e-17e9-4ae1-8bfb-52ce38a26f21\") " Feb 19 22:32:23 crc kubenswrapper[4795]: I0219 22:32:23.105412 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/9216072e-17e9-4ae1-8bfb-52ce38a26f21-catalog-content\") pod \"9216072e-17e9-4ae1-8bfb-52ce38a26f21\" (UID: \"9216072e-17e9-4ae1-8bfb-52ce38a26f21\") " Feb 19 22:32:23 crc kubenswrapper[4795]: I0219 22:32:23.108080 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9216072e-17e9-4ae1-8bfb-52ce38a26f21-utilities" (OuterVolumeSpecName: "utilities") pod "9216072e-17e9-4ae1-8bfb-52ce38a26f21" (UID: "9216072e-17e9-4ae1-8bfb-52ce38a26f21"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:32:23 crc kubenswrapper[4795]: I0219 22:32:23.112419 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9216072e-17e9-4ae1-8bfb-52ce38a26f21-kube-api-access-cmzjs" (OuterVolumeSpecName: "kube-api-access-cmzjs") pod "9216072e-17e9-4ae1-8bfb-52ce38a26f21" (UID: "9216072e-17e9-4ae1-8bfb-52ce38a26f21"). InnerVolumeSpecName "kube-api-access-cmzjs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:32:23 crc kubenswrapper[4795]: I0219 22:32:23.132710 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9216072e-17e9-4ae1-8bfb-52ce38a26f21-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9216072e-17e9-4ae1-8bfb-52ce38a26f21" (UID: "9216072e-17e9-4ae1-8bfb-52ce38a26f21"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:32:23 crc kubenswrapper[4795]: I0219 22:32:23.207311 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cmzjs\" (UniqueName: \"kubernetes.io/projected/9216072e-17e9-4ae1-8bfb-52ce38a26f21-kube-api-access-cmzjs\") on node \"crc\" DevicePath \"\"" Feb 19 22:32:23 crc kubenswrapper[4795]: I0219 22:32:23.207341 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9216072e-17e9-4ae1-8bfb-52ce38a26f21-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 22:32:23 crc kubenswrapper[4795]: I0219 22:32:23.207351 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9216072e-17e9-4ae1-8bfb-52ce38a26f21-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 22:32:23 crc kubenswrapper[4795]: I0219 22:32:23.453276 4795 generic.go:334] "Generic (PLEG): container finished" podID="9216072e-17e9-4ae1-8bfb-52ce38a26f21" containerID="1bffb3ce7c00123b1a58cad03352ac7ea4a5fd8034f223383885b43909c36dfe" exitCode=0 Feb 19 22:32:23 crc kubenswrapper[4795]: I0219 22:32:23.453317 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fjjbn" event={"ID":"9216072e-17e9-4ae1-8bfb-52ce38a26f21","Type":"ContainerDied","Data":"1bffb3ce7c00123b1a58cad03352ac7ea4a5fd8034f223383885b43909c36dfe"} Feb 19 22:32:23 crc kubenswrapper[4795]: I0219 22:32:23.453340 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fjjbn" Feb 19 22:32:23 crc kubenswrapper[4795]: I0219 22:32:23.453365 4795 scope.go:117] "RemoveContainer" containerID="1bffb3ce7c00123b1a58cad03352ac7ea4a5fd8034f223383885b43909c36dfe" Feb 19 22:32:23 crc kubenswrapper[4795]: I0219 22:32:23.453351 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fjjbn" event={"ID":"9216072e-17e9-4ae1-8bfb-52ce38a26f21","Type":"ContainerDied","Data":"fd8dea2cc9816bec097755dd6dba91e5daa3982abf647afa038114a1324c1e97"} Feb 19 22:32:23 crc kubenswrapper[4795]: I0219 22:32:23.484461 4795 scope.go:117] "RemoveContainer" containerID="6c8e7b1979af7fb73cfd3bd6373f9f5a03eed0729d0d9ab939d133908788ce3e" Feb 19 22:32:23 crc kubenswrapper[4795]: I0219 22:32:23.491205 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fjjbn"] Feb 19 22:32:23 crc kubenswrapper[4795]: I0219 22:32:23.498177 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fjjbn"] Feb 19 22:32:23 crc kubenswrapper[4795]: I0219 22:32:23.512393 4795 scope.go:117] "RemoveContainer" containerID="5bc6d00036b8bfee908f4f769e91e7cd738a63d0c687f7e81771cab88749eafa" Feb 19 22:32:23 crc kubenswrapper[4795]: I0219 22:32:23.516622 4795 scope.go:117] "RemoveContainer" containerID="9c63131d970040b18f3ed947da107f77f33be5db6a5f5930540c14c137bd2e9f" Feb 19 22:32:23 crc kubenswrapper[4795]: E0219 22:32:23.517205 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:32:23 crc kubenswrapper[4795]: I0219 
22:32:23.528447 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9216072e-17e9-4ae1-8bfb-52ce38a26f21" path="/var/lib/kubelet/pods/9216072e-17e9-4ae1-8bfb-52ce38a26f21/volumes" Feb 19 22:32:23 crc kubenswrapper[4795]: I0219 22:32:23.548433 4795 scope.go:117] "RemoveContainer" containerID="1bffb3ce7c00123b1a58cad03352ac7ea4a5fd8034f223383885b43909c36dfe" Feb 19 22:32:23 crc kubenswrapper[4795]: E0219 22:32:23.548911 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bffb3ce7c00123b1a58cad03352ac7ea4a5fd8034f223383885b43909c36dfe\": container with ID starting with 1bffb3ce7c00123b1a58cad03352ac7ea4a5fd8034f223383885b43909c36dfe not found: ID does not exist" containerID="1bffb3ce7c00123b1a58cad03352ac7ea4a5fd8034f223383885b43909c36dfe" Feb 19 22:32:23 crc kubenswrapper[4795]: I0219 22:32:23.548938 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bffb3ce7c00123b1a58cad03352ac7ea4a5fd8034f223383885b43909c36dfe"} err="failed to get container status \"1bffb3ce7c00123b1a58cad03352ac7ea4a5fd8034f223383885b43909c36dfe\": rpc error: code = NotFound desc = could not find container \"1bffb3ce7c00123b1a58cad03352ac7ea4a5fd8034f223383885b43909c36dfe\": container with ID starting with 1bffb3ce7c00123b1a58cad03352ac7ea4a5fd8034f223383885b43909c36dfe not found: ID does not exist" Feb 19 22:32:23 crc kubenswrapper[4795]: I0219 22:32:23.548959 4795 scope.go:117] "RemoveContainer" containerID="6c8e7b1979af7fb73cfd3bd6373f9f5a03eed0729d0d9ab939d133908788ce3e" Feb 19 22:32:23 crc kubenswrapper[4795]: E0219 22:32:23.549300 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c8e7b1979af7fb73cfd3bd6373f9f5a03eed0729d0d9ab939d133908788ce3e\": container with ID starting with 6c8e7b1979af7fb73cfd3bd6373f9f5a03eed0729d0d9ab939d133908788ce3e not found: ID does not 
exist" containerID="6c8e7b1979af7fb73cfd3bd6373f9f5a03eed0729d0d9ab939d133908788ce3e" Feb 19 22:32:23 crc kubenswrapper[4795]: I0219 22:32:23.549351 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c8e7b1979af7fb73cfd3bd6373f9f5a03eed0729d0d9ab939d133908788ce3e"} err="failed to get container status \"6c8e7b1979af7fb73cfd3bd6373f9f5a03eed0729d0d9ab939d133908788ce3e\": rpc error: code = NotFound desc = could not find container \"6c8e7b1979af7fb73cfd3bd6373f9f5a03eed0729d0d9ab939d133908788ce3e\": container with ID starting with 6c8e7b1979af7fb73cfd3bd6373f9f5a03eed0729d0d9ab939d133908788ce3e not found: ID does not exist" Feb 19 22:32:23 crc kubenswrapper[4795]: I0219 22:32:23.549377 4795 scope.go:117] "RemoveContainer" containerID="5bc6d00036b8bfee908f4f769e91e7cd738a63d0c687f7e81771cab88749eafa" Feb 19 22:32:23 crc kubenswrapper[4795]: E0219 22:32:23.549653 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5bc6d00036b8bfee908f4f769e91e7cd738a63d0c687f7e81771cab88749eafa\": container with ID starting with 5bc6d00036b8bfee908f4f769e91e7cd738a63d0c687f7e81771cab88749eafa not found: ID does not exist" containerID="5bc6d00036b8bfee908f4f769e91e7cd738a63d0c687f7e81771cab88749eafa" Feb 19 22:32:23 crc kubenswrapper[4795]: I0219 22:32:23.549677 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bc6d00036b8bfee908f4f769e91e7cd738a63d0c687f7e81771cab88749eafa"} err="failed to get container status \"5bc6d00036b8bfee908f4f769e91e7cd738a63d0c687f7e81771cab88749eafa\": rpc error: code = NotFound desc = could not find container \"5bc6d00036b8bfee908f4f769e91e7cd738a63d0c687f7e81771cab88749eafa\": container with ID starting with 5bc6d00036b8bfee908f4f769e91e7cd738a63d0c687f7e81771cab88749eafa not found: ID does not exist" Feb 19 22:32:38 crc kubenswrapper[4795]: I0219 22:32:38.511588 4795 scope.go:117] 
"RemoveContainer" containerID="9c63131d970040b18f3ed947da107f77f33be5db6a5f5930540c14c137bd2e9f" Feb 19 22:32:38 crc kubenswrapper[4795]: E0219 22:32:38.512559 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:32:49 crc kubenswrapper[4795]: I0219 22:32:49.516684 4795 scope.go:117] "RemoveContainer" containerID="9c63131d970040b18f3ed947da107f77f33be5db6a5f5930540c14c137bd2e9f" Feb 19 22:32:49 crc kubenswrapper[4795]: E0219 22:32:49.517473 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:33:03 crc kubenswrapper[4795]: I0219 22:33:03.512192 4795 scope.go:117] "RemoveContainer" containerID="9c63131d970040b18f3ed947da107f77f33be5db6a5f5930540c14c137bd2e9f" Feb 19 22:33:03 crc kubenswrapper[4795]: I0219 22:33:03.731276 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerStarted","Data":"a74e459fc04040e78e04a2f2c4ee65cfd25848c3c5a03bfbff2374c621904cd2"} Feb 19 22:35:02 crc kubenswrapper[4795]: I0219 22:35:02.058022 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-r4gjz"] Feb 19 22:35:02 crc kubenswrapper[4795]: E0219 
22:35:02.059080 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9216072e-17e9-4ae1-8bfb-52ce38a26f21" containerName="extract-content" Feb 19 22:35:02 crc kubenswrapper[4795]: I0219 22:35:02.059103 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9216072e-17e9-4ae1-8bfb-52ce38a26f21" containerName="extract-content" Feb 19 22:35:02 crc kubenswrapper[4795]: E0219 22:35:02.059121 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9216072e-17e9-4ae1-8bfb-52ce38a26f21" containerName="registry-server" Feb 19 22:35:02 crc kubenswrapper[4795]: I0219 22:35:02.059132 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9216072e-17e9-4ae1-8bfb-52ce38a26f21" containerName="registry-server" Feb 19 22:35:02 crc kubenswrapper[4795]: E0219 22:35:02.059168 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9216072e-17e9-4ae1-8bfb-52ce38a26f21" containerName="extract-utilities" Feb 19 22:35:02 crc kubenswrapper[4795]: I0219 22:35:02.059203 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9216072e-17e9-4ae1-8bfb-52ce38a26f21" containerName="extract-utilities" Feb 19 22:35:02 crc kubenswrapper[4795]: I0219 22:35:02.059445 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="9216072e-17e9-4ae1-8bfb-52ce38a26f21" containerName="registry-server" Feb 19 22:35:02 crc kubenswrapper[4795]: I0219 22:35:02.062164 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-r4gjz" Feb 19 22:35:02 crc kubenswrapper[4795]: I0219 22:35:02.083504 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-r4gjz"] Feb 19 22:35:02 crc kubenswrapper[4795]: I0219 22:35:02.151651 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edef5238-d122-4fb5-9078-fff0c0e423af-catalog-content\") pod \"redhat-operators-r4gjz\" (UID: \"edef5238-d122-4fb5-9078-fff0c0e423af\") " pod="openshift-marketplace/redhat-operators-r4gjz" Feb 19 22:35:02 crc kubenswrapper[4795]: I0219 22:35:02.151736 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edef5238-d122-4fb5-9078-fff0c0e423af-utilities\") pod \"redhat-operators-r4gjz\" (UID: \"edef5238-d122-4fb5-9078-fff0c0e423af\") " pod="openshift-marketplace/redhat-operators-r4gjz" Feb 19 22:35:02 crc kubenswrapper[4795]: I0219 22:35:02.151769 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rntp4\" (UniqueName: \"kubernetes.io/projected/edef5238-d122-4fb5-9078-fff0c0e423af-kube-api-access-rntp4\") pod \"redhat-operators-r4gjz\" (UID: \"edef5238-d122-4fb5-9078-fff0c0e423af\") " pod="openshift-marketplace/redhat-operators-r4gjz" Feb 19 22:35:02 crc kubenswrapper[4795]: I0219 22:35:02.252793 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edef5238-d122-4fb5-9078-fff0c0e423af-catalog-content\") pod \"redhat-operators-r4gjz\" (UID: \"edef5238-d122-4fb5-9078-fff0c0e423af\") " pod="openshift-marketplace/redhat-operators-r4gjz" Feb 19 22:35:02 crc kubenswrapper[4795]: I0219 22:35:02.252940 4795 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edef5238-d122-4fb5-9078-fff0c0e423af-utilities\") pod \"redhat-operators-r4gjz\" (UID: \"edef5238-d122-4fb5-9078-fff0c0e423af\") " pod="openshift-marketplace/redhat-operators-r4gjz" Feb 19 22:35:02 crc kubenswrapper[4795]: I0219 22:35:02.253128 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rntp4\" (UniqueName: \"kubernetes.io/projected/edef5238-d122-4fb5-9078-fff0c0e423af-kube-api-access-rntp4\") pod \"redhat-operators-r4gjz\" (UID: \"edef5238-d122-4fb5-9078-fff0c0e423af\") " pod="openshift-marketplace/redhat-operators-r4gjz" Feb 19 22:35:02 crc kubenswrapper[4795]: I0219 22:35:02.253620 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edef5238-d122-4fb5-9078-fff0c0e423af-catalog-content\") pod \"redhat-operators-r4gjz\" (UID: \"edef5238-d122-4fb5-9078-fff0c0e423af\") " pod="openshift-marketplace/redhat-operators-r4gjz" Feb 19 22:35:02 crc kubenswrapper[4795]: I0219 22:35:02.253772 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edef5238-d122-4fb5-9078-fff0c0e423af-utilities\") pod \"redhat-operators-r4gjz\" (UID: \"edef5238-d122-4fb5-9078-fff0c0e423af\") " pod="openshift-marketplace/redhat-operators-r4gjz" Feb 19 22:35:02 crc kubenswrapper[4795]: I0219 22:35:02.302476 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rntp4\" (UniqueName: \"kubernetes.io/projected/edef5238-d122-4fb5-9078-fff0c0e423af-kube-api-access-rntp4\") pod \"redhat-operators-r4gjz\" (UID: \"edef5238-d122-4fb5-9078-fff0c0e423af\") " pod="openshift-marketplace/redhat-operators-r4gjz" Feb 19 22:35:02 crc kubenswrapper[4795]: I0219 22:35:02.413115 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-r4gjz" Feb 19 22:35:02 crc kubenswrapper[4795]: I0219 22:35:02.835918 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-r4gjz"] Feb 19 22:35:03 crc kubenswrapper[4795]: I0219 22:35:03.004435 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r4gjz" event={"ID":"edef5238-d122-4fb5-9078-fff0c0e423af","Type":"ContainerStarted","Data":"3fcf80fbd39f147a966a81e1f88e1f1ddc485406b83cb240cf7cd273c36899cc"} Feb 19 22:35:04 crc kubenswrapper[4795]: I0219 22:35:04.019619 4795 generic.go:334] "Generic (PLEG): container finished" podID="edef5238-d122-4fb5-9078-fff0c0e423af" containerID="0970a85b2508fe227d301d30a6474254a87473f0d7fba1d8e43c1a7590508126" exitCode=0 Feb 19 22:35:04 crc kubenswrapper[4795]: I0219 22:35:04.019715 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r4gjz" event={"ID":"edef5238-d122-4fb5-9078-fff0c0e423af","Type":"ContainerDied","Data":"0970a85b2508fe227d301d30a6474254a87473f0d7fba1d8e43c1a7590508126"} Feb 19 22:35:06 crc kubenswrapper[4795]: I0219 22:35:06.037894 4795 generic.go:334] "Generic (PLEG): container finished" podID="edef5238-d122-4fb5-9078-fff0c0e423af" containerID="1cb4fdcd7a5581f2998a483d346d5f9609da52abee6197af388aa092aece9555" exitCode=0 Feb 19 22:35:06 crc kubenswrapper[4795]: I0219 22:35:06.037984 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r4gjz" event={"ID":"edef5238-d122-4fb5-9078-fff0c0e423af","Type":"ContainerDied","Data":"1cb4fdcd7a5581f2998a483d346d5f9609da52abee6197af388aa092aece9555"} Feb 19 22:35:07 crc kubenswrapper[4795]: I0219 22:35:07.047371 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r4gjz" 
event={"ID":"edef5238-d122-4fb5-9078-fff0c0e423af","Type":"ContainerStarted","Data":"99749496347c139653a0cef67ba667830ae0233aba951d7ee58fd92b6d533d9d"} Feb 19 22:35:07 crc kubenswrapper[4795]: I0219 22:35:07.078307 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-r4gjz" podStartSLOduration=2.577351891 podStartE2EDuration="5.078282424s" podCreationTimestamp="2026-02-19 22:35:02 +0000 UTC" firstStartedPulling="2026-02-19 22:35:04.022064247 +0000 UTC m=+4015.214582121" lastFinishedPulling="2026-02-19 22:35:06.52299475 +0000 UTC m=+4017.715512654" observedRunningTime="2026-02-19 22:35:07.071434299 +0000 UTC m=+4018.263952183" watchObservedRunningTime="2026-02-19 22:35:07.078282424 +0000 UTC m=+4018.270800328" Feb 19 22:35:12 crc kubenswrapper[4795]: I0219 22:35:12.414084 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-r4gjz" Feb 19 22:35:12 crc kubenswrapper[4795]: I0219 22:35:12.414852 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-r4gjz" Feb 19 22:35:13 crc kubenswrapper[4795]: I0219 22:35:13.458371 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-r4gjz" podUID="edef5238-d122-4fb5-9078-fff0c0e423af" containerName="registry-server" probeResult="failure" output=< Feb 19 22:35:13 crc kubenswrapper[4795]: timeout: failed to connect service ":50051" within 1s Feb 19 22:35:13 crc kubenswrapper[4795]: > Feb 19 22:35:22 crc kubenswrapper[4795]: I0219 22:35:22.478972 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-r4gjz" Feb 19 22:35:22 crc kubenswrapper[4795]: I0219 22:35:22.553154 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-r4gjz" Feb 19 22:35:22 crc kubenswrapper[4795]: I0219 
22:35:22.729288 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-r4gjz"] Feb 19 22:35:24 crc kubenswrapper[4795]: I0219 22:35:24.200040 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-r4gjz" podUID="edef5238-d122-4fb5-9078-fff0c0e423af" containerName="registry-server" containerID="cri-o://99749496347c139653a0cef67ba667830ae0233aba951d7ee58fd92b6d533d9d" gracePeriod=2 Feb 19 22:35:24 crc kubenswrapper[4795]: I0219 22:35:24.714351 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-r4gjz" Feb 19 22:35:24 crc kubenswrapper[4795]: I0219 22:35:24.818084 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edef5238-d122-4fb5-9078-fff0c0e423af-catalog-content\") pod \"edef5238-d122-4fb5-9078-fff0c0e423af\" (UID: \"edef5238-d122-4fb5-9078-fff0c0e423af\") " Feb 19 22:35:24 crc kubenswrapper[4795]: I0219 22:35:24.818218 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edef5238-d122-4fb5-9078-fff0c0e423af-utilities\") pod \"edef5238-d122-4fb5-9078-fff0c0e423af\" (UID: \"edef5238-d122-4fb5-9078-fff0c0e423af\") " Feb 19 22:35:24 crc kubenswrapper[4795]: I0219 22:35:24.818321 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rntp4\" (UniqueName: \"kubernetes.io/projected/edef5238-d122-4fb5-9078-fff0c0e423af-kube-api-access-rntp4\") pod \"edef5238-d122-4fb5-9078-fff0c0e423af\" (UID: \"edef5238-d122-4fb5-9078-fff0c0e423af\") " Feb 19 22:35:24 crc kubenswrapper[4795]: I0219 22:35:24.819237 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/edef5238-d122-4fb5-9078-fff0c0e423af-utilities" (OuterVolumeSpecName: 
"utilities") pod "edef5238-d122-4fb5-9078-fff0c0e423af" (UID: "edef5238-d122-4fb5-9078-fff0c0e423af"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:35:24 crc kubenswrapper[4795]: I0219 22:35:24.824408 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edef5238-d122-4fb5-9078-fff0c0e423af-kube-api-access-rntp4" (OuterVolumeSpecName: "kube-api-access-rntp4") pod "edef5238-d122-4fb5-9078-fff0c0e423af" (UID: "edef5238-d122-4fb5-9078-fff0c0e423af"). InnerVolumeSpecName "kube-api-access-rntp4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:35:24 crc kubenswrapper[4795]: I0219 22:35:24.920220 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edef5238-d122-4fb5-9078-fff0c0e423af-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 22:35:24 crc kubenswrapper[4795]: I0219 22:35:24.920249 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rntp4\" (UniqueName: \"kubernetes.io/projected/edef5238-d122-4fb5-9078-fff0c0e423af-kube-api-access-rntp4\") on node \"crc\" DevicePath \"\"" Feb 19 22:35:24 crc kubenswrapper[4795]: I0219 22:35:24.968949 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/edef5238-d122-4fb5-9078-fff0c0e423af-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "edef5238-d122-4fb5-9078-fff0c0e423af" (UID: "edef5238-d122-4fb5-9078-fff0c0e423af"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:35:25 crc kubenswrapper[4795]: I0219 22:35:25.021616 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edef5238-d122-4fb5-9078-fff0c0e423af-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 22:35:25 crc kubenswrapper[4795]: I0219 22:35:25.216634 4795 generic.go:334] "Generic (PLEG): container finished" podID="edef5238-d122-4fb5-9078-fff0c0e423af" containerID="99749496347c139653a0cef67ba667830ae0233aba951d7ee58fd92b6d533d9d" exitCode=0 Feb 19 22:35:25 crc kubenswrapper[4795]: I0219 22:35:25.216679 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r4gjz" event={"ID":"edef5238-d122-4fb5-9078-fff0c0e423af","Type":"ContainerDied","Data":"99749496347c139653a0cef67ba667830ae0233aba951d7ee58fd92b6d533d9d"} Feb 19 22:35:25 crc kubenswrapper[4795]: I0219 22:35:25.216704 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r4gjz" event={"ID":"edef5238-d122-4fb5-9078-fff0c0e423af","Type":"ContainerDied","Data":"3fcf80fbd39f147a966a81e1f88e1f1ddc485406b83cb240cf7cd273c36899cc"} Feb 19 22:35:25 crc kubenswrapper[4795]: I0219 22:35:25.216721 4795 scope.go:117] "RemoveContainer" containerID="99749496347c139653a0cef67ba667830ae0233aba951d7ee58fd92b6d533d9d" Feb 19 22:35:25 crc kubenswrapper[4795]: I0219 22:35:25.218796 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-r4gjz" Feb 19 22:35:25 crc kubenswrapper[4795]: I0219 22:35:25.243052 4795 scope.go:117] "RemoveContainer" containerID="1cb4fdcd7a5581f2998a483d346d5f9609da52abee6197af388aa092aece9555" Feb 19 22:35:25 crc kubenswrapper[4795]: I0219 22:35:25.266024 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-r4gjz"] Feb 19 22:35:25 crc kubenswrapper[4795]: I0219 22:35:25.272088 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-r4gjz"] Feb 19 22:35:25 crc kubenswrapper[4795]: I0219 22:35:25.283054 4795 scope.go:117] "RemoveContainer" containerID="0970a85b2508fe227d301d30a6474254a87473f0d7fba1d8e43c1a7590508126" Feb 19 22:35:25 crc kubenswrapper[4795]: I0219 22:35:25.298716 4795 scope.go:117] "RemoveContainer" containerID="99749496347c139653a0cef67ba667830ae0233aba951d7ee58fd92b6d533d9d" Feb 19 22:35:25 crc kubenswrapper[4795]: E0219 22:35:25.299254 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99749496347c139653a0cef67ba667830ae0233aba951d7ee58fd92b6d533d9d\": container with ID starting with 99749496347c139653a0cef67ba667830ae0233aba951d7ee58fd92b6d533d9d not found: ID does not exist" containerID="99749496347c139653a0cef67ba667830ae0233aba951d7ee58fd92b6d533d9d" Feb 19 22:35:25 crc kubenswrapper[4795]: I0219 22:35:25.299301 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99749496347c139653a0cef67ba667830ae0233aba951d7ee58fd92b6d533d9d"} err="failed to get container status \"99749496347c139653a0cef67ba667830ae0233aba951d7ee58fd92b6d533d9d\": rpc error: code = NotFound desc = could not find container \"99749496347c139653a0cef67ba667830ae0233aba951d7ee58fd92b6d533d9d\": container with ID starting with 99749496347c139653a0cef67ba667830ae0233aba951d7ee58fd92b6d533d9d not found: ID does 
not exist" Feb 19 22:35:25 crc kubenswrapper[4795]: I0219 22:35:25.299343 4795 scope.go:117] "RemoveContainer" containerID="1cb4fdcd7a5581f2998a483d346d5f9609da52abee6197af388aa092aece9555" Feb 19 22:35:25 crc kubenswrapper[4795]: E0219 22:35:25.299637 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cb4fdcd7a5581f2998a483d346d5f9609da52abee6197af388aa092aece9555\": container with ID starting with 1cb4fdcd7a5581f2998a483d346d5f9609da52abee6197af388aa092aece9555 not found: ID does not exist" containerID="1cb4fdcd7a5581f2998a483d346d5f9609da52abee6197af388aa092aece9555" Feb 19 22:35:25 crc kubenswrapper[4795]: I0219 22:35:25.299671 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cb4fdcd7a5581f2998a483d346d5f9609da52abee6197af388aa092aece9555"} err="failed to get container status \"1cb4fdcd7a5581f2998a483d346d5f9609da52abee6197af388aa092aece9555\": rpc error: code = NotFound desc = could not find container \"1cb4fdcd7a5581f2998a483d346d5f9609da52abee6197af388aa092aece9555\": container with ID starting with 1cb4fdcd7a5581f2998a483d346d5f9609da52abee6197af388aa092aece9555 not found: ID does not exist" Feb 19 22:35:25 crc kubenswrapper[4795]: I0219 22:35:25.299690 4795 scope.go:117] "RemoveContainer" containerID="0970a85b2508fe227d301d30a6474254a87473f0d7fba1d8e43c1a7590508126" Feb 19 22:35:25 crc kubenswrapper[4795]: E0219 22:35:25.300235 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0970a85b2508fe227d301d30a6474254a87473f0d7fba1d8e43c1a7590508126\": container with ID starting with 0970a85b2508fe227d301d30a6474254a87473f0d7fba1d8e43c1a7590508126 not found: ID does not exist" containerID="0970a85b2508fe227d301d30a6474254a87473f0d7fba1d8e43c1a7590508126" Feb 19 22:35:25 crc kubenswrapper[4795]: I0219 22:35:25.300282 4795 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0970a85b2508fe227d301d30a6474254a87473f0d7fba1d8e43c1a7590508126"} err="failed to get container status \"0970a85b2508fe227d301d30a6474254a87473f0d7fba1d8e43c1a7590508126\": rpc error: code = NotFound desc = could not find container \"0970a85b2508fe227d301d30a6474254a87473f0d7fba1d8e43c1a7590508126\": container with ID starting with 0970a85b2508fe227d301d30a6474254a87473f0d7fba1d8e43c1a7590508126 not found: ID does not exist" Feb 19 22:35:25 crc kubenswrapper[4795]: I0219 22:35:25.524801 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="edef5238-d122-4fb5-9078-fff0c0e423af" path="/var/lib/kubelet/pods/edef5238-d122-4fb5-9078-fff0c0e423af/volumes" Feb 19 22:35:28 crc kubenswrapper[4795]: I0219 22:35:28.427570 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 22:35:28 crc kubenswrapper[4795]: I0219 22:35:28.427937 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 22:35:28 crc kubenswrapper[4795]: I0219 22:35:28.940405 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6cn6h"] Feb 19 22:35:28 crc kubenswrapper[4795]: E0219 22:35:28.941335 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edef5238-d122-4fb5-9078-fff0c0e423af" containerName="extract-utilities" Feb 19 22:35:28 crc kubenswrapper[4795]: I0219 22:35:28.941362 4795 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="edef5238-d122-4fb5-9078-fff0c0e423af" containerName="extract-utilities" Feb 19 22:35:28 crc kubenswrapper[4795]: E0219 22:35:28.941396 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edef5238-d122-4fb5-9078-fff0c0e423af" containerName="registry-server" Feb 19 22:35:28 crc kubenswrapper[4795]: I0219 22:35:28.941409 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="edef5238-d122-4fb5-9078-fff0c0e423af" containerName="registry-server" Feb 19 22:35:28 crc kubenswrapper[4795]: E0219 22:35:28.941426 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edef5238-d122-4fb5-9078-fff0c0e423af" containerName="extract-content" Feb 19 22:35:28 crc kubenswrapper[4795]: I0219 22:35:28.941439 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="edef5238-d122-4fb5-9078-fff0c0e423af" containerName="extract-content" Feb 19 22:35:28 crc kubenswrapper[4795]: I0219 22:35:28.941684 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="edef5238-d122-4fb5-9078-fff0c0e423af" containerName="registry-server" Feb 19 22:35:28 crc kubenswrapper[4795]: I0219 22:35:28.944928 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6cn6h" Feb 19 22:35:28 crc kubenswrapper[4795]: I0219 22:35:28.962770 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6cn6h"] Feb 19 22:35:29 crc kubenswrapper[4795]: I0219 22:35:29.082720 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/677da4f8-3439-41c1-b491-46ad22cf8f99-catalog-content\") pod \"certified-operators-6cn6h\" (UID: \"677da4f8-3439-41c1-b491-46ad22cf8f99\") " pod="openshift-marketplace/certified-operators-6cn6h" Feb 19 22:35:29 crc kubenswrapper[4795]: I0219 22:35:29.082767 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/677da4f8-3439-41c1-b491-46ad22cf8f99-utilities\") pod \"certified-operators-6cn6h\" (UID: \"677da4f8-3439-41c1-b491-46ad22cf8f99\") " pod="openshift-marketplace/certified-operators-6cn6h" Feb 19 22:35:29 crc kubenswrapper[4795]: I0219 22:35:29.082806 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8b7gn\" (UniqueName: \"kubernetes.io/projected/677da4f8-3439-41c1-b491-46ad22cf8f99-kube-api-access-8b7gn\") pod \"certified-operators-6cn6h\" (UID: \"677da4f8-3439-41c1-b491-46ad22cf8f99\") " pod="openshift-marketplace/certified-operators-6cn6h" Feb 19 22:35:29 crc kubenswrapper[4795]: I0219 22:35:29.184027 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/677da4f8-3439-41c1-b491-46ad22cf8f99-catalog-content\") pod \"certified-operators-6cn6h\" (UID: \"677da4f8-3439-41c1-b491-46ad22cf8f99\") " pod="openshift-marketplace/certified-operators-6cn6h" Feb 19 22:35:29 crc kubenswrapper[4795]: I0219 22:35:29.184077 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/677da4f8-3439-41c1-b491-46ad22cf8f99-utilities\") pod \"certified-operators-6cn6h\" (UID: \"677da4f8-3439-41c1-b491-46ad22cf8f99\") " pod="openshift-marketplace/certified-operators-6cn6h" Feb 19 22:35:29 crc kubenswrapper[4795]: I0219 22:35:29.184121 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8b7gn\" (UniqueName: \"kubernetes.io/projected/677da4f8-3439-41c1-b491-46ad22cf8f99-kube-api-access-8b7gn\") pod \"certified-operators-6cn6h\" (UID: \"677da4f8-3439-41c1-b491-46ad22cf8f99\") " pod="openshift-marketplace/certified-operators-6cn6h" Feb 19 22:35:29 crc kubenswrapper[4795]: I0219 22:35:29.184494 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/677da4f8-3439-41c1-b491-46ad22cf8f99-catalog-content\") pod \"certified-operators-6cn6h\" (UID: \"677da4f8-3439-41c1-b491-46ad22cf8f99\") " pod="openshift-marketplace/certified-operators-6cn6h" Feb 19 22:35:29 crc kubenswrapper[4795]: I0219 22:35:29.184546 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/677da4f8-3439-41c1-b491-46ad22cf8f99-utilities\") pod \"certified-operators-6cn6h\" (UID: \"677da4f8-3439-41c1-b491-46ad22cf8f99\") " pod="openshift-marketplace/certified-operators-6cn6h" Feb 19 22:35:29 crc kubenswrapper[4795]: I0219 22:35:29.210671 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8b7gn\" (UniqueName: \"kubernetes.io/projected/677da4f8-3439-41c1-b491-46ad22cf8f99-kube-api-access-8b7gn\") pod \"certified-operators-6cn6h\" (UID: \"677da4f8-3439-41c1-b491-46ad22cf8f99\") " pod="openshift-marketplace/certified-operators-6cn6h" Feb 19 22:35:29 crc kubenswrapper[4795]: I0219 22:35:29.290420 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6cn6h" Feb 19 22:35:29 crc kubenswrapper[4795]: I0219 22:35:29.614232 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6cn6h"] Feb 19 22:35:30 crc kubenswrapper[4795]: I0219 22:35:30.261121 4795 generic.go:334] "Generic (PLEG): container finished" podID="677da4f8-3439-41c1-b491-46ad22cf8f99" containerID="3f36ea63387d19c31183daecf26693f1c9928e033424c3ff71d7ea7db5f54740" exitCode=0 Feb 19 22:35:30 crc kubenswrapper[4795]: I0219 22:35:30.261209 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6cn6h" event={"ID":"677da4f8-3439-41c1-b491-46ad22cf8f99","Type":"ContainerDied","Data":"3f36ea63387d19c31183daecf26693f1c9928e033424c3ff71d7ea7db5f54740"} Feb 19 22:35:30 crc kubenswrapper[4795]: I0219 22:35:30.261953 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6cn6h" event={"ID":"677da4f8-3439-41c1-b491-46ad22cf8f99","Type":"ContainerStarted","Data":"f25591d70490f0645bf4d9c7ca437387d9cbf3e1178df39d65c74bd44551e746"} Feb 19 22:35:31 crc kubenswrapper[4795]: I0219 22:35:31.274783 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6cn6h" event={"ID":"677da4f8-3439-41c1-b491-46ad22cf8f99","Type":"ContainerStarted","Data":"37923f399ed409a6ab7ef3d7baef4da55b6aa22861de0d066563c695efa48dfd"} Feb 19 22:35:32 crc kubenswrapper[4795]: I0219 22:35:32.283220 4795 generic.go:334] "Generic (PLEG): container finished" podID="677da4f8-3439-41c1-b491-46ad22cf8f99" containerID="37923f399ed409a6ab7ef3d7baef4da55b6aa22861de0d066563c695efa48dfd" exitCode=0 Feb 19 22:35:32 crc kubenswrapper[4795]: I0219 22:35:32.283349 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6cn6h" 
event={"ID":"677da4f8-3439-41c1-b491-46ad22cf8f99","Type":"ContainerDied","Data":"37923f399ed409a6ab7ef3d7baef4da55b6aa22861de0d066563c695efa48dfd"} Feb 19 22:35:32 crc kubenswrapper[4795]: I0219 22:35:32.283690 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6cn6h" event={"ID":"677da4f8-3439-41c1-b491-46ad22cf8f99","Type":"ContainerStarted","Data":"a92b235567e0d9b79f3b4147a93137b692107ed94c5afdda79b2c5695de8aaf8"} Feb 19 22:35:32 crc kubenswrapper[4795]: I0219 22:35:32.304958 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6cn6h" podStartSLOduration=2.916744929 podStartE2EDuration="4.304934872s" podCreationTimestamp="2026-02-19 22:35:28 +0000 UTC" firstStartedPulling="2026-02-19 22:35:30.264838097 +0000 UTC m=+4041.457355971" lastFinishedPulling="2026-02-19 22:35:31.65302805 +0000 UTC m=+4042.845545914" observedRunningTime="2026-02-19 22:35:32.300923297 +0000 UTC m=+4043.493441231" watchObservedRunningTime="2026-02-19 22:35:32.304934872 +0000 UTC m=+4043.497452736" Feb 19 22:35:39 crc kubenswrapper[4795]: I0219 22:35:39.290953 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6cn6h" Feb 19 22:35:39 crc kubenswrapper[4795]: I0219 22:35:39.291672 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6cn6h" Feb 19 22:35:39 crc kubenswrapper[4795]: I0219 22:35:39.357478 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6cn6h" Feb 19 22:35:39 crc kubenswrapper[4795]: I0219 22:35:39.415967 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6cn6h" Feb 19 22:35:39 crc kubenswrapper[4795]: I0219 22:35:39.592679 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-6cn6h"] Feb 19 22:35:41 crc kubenswrapper[4795]: I0219 22:35:41.368940 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6cn6h" podUID="677da4f8-3439-41c1-b491-46ad22cf8f99" containerName="registry-server" containerID="cri-o://a92b235567e0d9b79f3b4147a93137b692107ed94c5afdda79b2c5695de8aaf8" gracePeriod=2 Feb 19 22:35:41 crc kubenswrapper[4795]: I0219 22:35:41.822781 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6cn6h" Feb 19 22:35:41 crc kubenswrapper[4795]: I0219 22:35:41.884284 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/677da4f8-3439-41c1-b491-46ad22cf8f99-catalog-content\") pod \"677da4f8-3439-41c1-b491-46ad22cf8f99\" (UID: \"677da4f8-3439-41c1-b491-46ad22cf8f99\") " Feb 19 22:35:41 crc kubenswrapper[4795]: I0219 22:35:41.884381 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8b7gn\" (UniqueName: \"kubernetes.io/projected/677da4f8-3439-41c1-b491-46ad22cf8f99-kube-api-access-8b7gn\") pod \"677da4f8-3439-41c1-b491-46ad22cf8f99\" (UID: \"677da4f8-3439-41c1-b491-46ad22cf8f99\") " Feb 19 22:35:41 crc kubenswrapper[4795]: I0219 22:35:41.884458 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/677da4f8-3439-41c1-b491-46ad22cf8f99-utilities\") pod \"677da4f8-3439-41c1-b491-46ad22cf8f99\" (UID: \"677da4f8-3439-41c1-b491-46ad22cf8f99\") " Feb 19 22:35:41 crc kubenswrapper[4795]: I0219 22:35:41.885438 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/677da4f8-3439-41c1-b491-46ad22cf8f99-utilities" (OuterVolumeSpecName: "utilities") pod "677da4f8-3439-41c1-b491-46ad22cf8f99" (UID: 
"677da4f8-3439-41c1-b491-46ad22cf8f99"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:35:41 crc kubenswrapper[4795]: I0219 22:35:41.889022 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/677da4f8-3439-41c1-b491-46ad22cf8f99-kube-api-access-8b7gn" (OuterVolumeSpecName: "kube-api-access-8b7gn") pod "677da4f8-3439-41c1-b491-46ad22cf8f99" (UID: "677da4f8-3439-41c1-b491-46ad22cf8f99"). InnerVolumeSpecName "kube-api-access-8b7gn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:35:41 crc kubenswrapper[4795]: I0219 22:35:41.955948 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/677da4f8-3439-41c1-b491-46ad22cf8f99-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "677da4f8-3439-41c1-b491-46ad22cf8f99" (UID: "677da4f8-3439-41c1-b491-46ad22cf8f99"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:35:41 crc kubenswrapper[4795]: I0219 22:35:41.986023 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/677da4f8-3439-41c1-b491-46ad22cf8f99-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 22:35:41 crc kubenswrapper[4795]: I0219 22:35:41.986057 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8b7gn\" (UniqueName: \"kubernetes.io/projected/677da4f8-3439-41c1-b491-46ad22cf8f99-kube-api-access-8b7gn\") on node \"crc\" DevicePath \"\"" Feb 19 22:35:41 crc kubenswrapper[4795]: I0219 22:35:41.986092 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/677da4f8-3439-41c1-b491-46ad22cf8f99-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 22:35:42 crc kubenswrapper[4795]: I0219 22:35:42.385444 4795 generic.go:334] "Generic (PLEG): container finished" 
podID="677da4f8-3439-41c1-b491-46ad22cf8f99" containerID="a92b235567e0d9b79f3b4147a93137b692107ed94c5afdda79b2c5695de8aaf8" exitCode=0 Feb 19 22:35:42 crc kubenswrapper[4795]: I0219 22:35:42.385532 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6cn6h" Feb 19 22:35:42 crc kubenswrapper[4795]: I0219 22:35:42.385568 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6cn6h" event={"ID":"677da4f8-3439-41c1-b491-46ad22cf8f99","Type":"ContainerDied","Data":"a92b235567e0d9b79f3b4147a93137b692107ed94c5afdda79b2c5695de8aaf8"} Feb 19 22:35:42 crc kubenswrapper[4795]: I0219 22:35:42.386071 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6cn6h" event={"ID":"677da4f8-3439-41c1-b491-46ad22cf8f99","Type":"ContainerDied","Data":"f25591d70490f0645bf4d9c7ca437387d9cbf3e1178df39d65c74bd44551e746"} Feb 19 22:35:42 crc kubenswrapper[4795]: I0219 22:35:42.386113 4795 scope.go:117] "RemoveContainer" containerID="a92b235567e0d9b79f3b4147a93137b692107ed94c5afdda79b2c5695de8aaf8" Feb 19 22:35:42 crc kubenswrapper[4795]: I0219 22:35:42.411473 4795 scope.go:117] "RemoveContainer" containerID="37923f399ed409a6ab7ef3d7baef4da55b6aa22861de0d066563c695efa48dfd" Feb 19 22:35:42 crc kubenswrapper[4795]: I0219 22:35:42.436387 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6cn6h"] Feb 19 22:35:42 crc kubenswrapper[4795]: I0219 22:35:42.443855 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6cn6h"] Feb 19 22:35:42 crc kubenswrapper[4795]: I0219 22:35:42.453487 4795 scope.go:117] "RemoveContainer" containerID="3f36ea63387d19c31183daecf26693f1c9928e033424c3ff71d7ea7db5f54740" Feb 19 22:35:42 crc kubenswrapper[4795]: I0219 22:35:42.477834 4795 scope.go:117] "RemoveContainer" 
containerID="a92b235567e0d9b79f3b4147a93137b692107ed94c5afdda79b2c5695de8aaf8" Feb 19 22:35:42 crc kubenswrapper[4795]: E0219 22:35:42.478236 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a92b235567e0d9b79f3b4147a93137b692107ed94c5afdda79b2c5695de8aaf8\": container with ID starting with a92b235567e0d9b79f3b4147a93137b692107ed94c5afdda79b2c5695de8aaf8 not found: ID does not exist" containerID="a92b235567e0d9b79f3b4147a93137b692107ed94c5afdda79b2c5695de8aaf8" Feb 19 22:35:42 crc kubenswrapper[4795]: I0219 22:35:42.478268 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a92b235567e0d9b79f3b4147a93137b692107ed94c5afdda79b2c5695de8aaf8"} err="failed to get container status \"a92b235567e0d9b79f3b4147a93137b692107ed94c5afdda79b2c5695de8aaf8\": rpc error: code = NotFound desc = could not find container \"a92b235567e0d9b79f3b4147a93137b692107ed94c5afdda79b2c5695de8aaf8\": container with ID starting with a92b235567e0d9b79f3b4147a93137b692107ed94c5afdda79b2c5695de8aaf8 not found: ID does not exist" Feb 19 22:35:42 crc kubenswrapper[4795]: I0219 22:35:42.478291 4795 scope.go:117] "RemoveContainer" containerID="37923f399ed409a6ab7ef3d7baef4da55b6aa22861de0d066563c695efa48dfd" Feb 19 22:35:42 crc kubenswrapper[4795]: E0219 22:35:42.478783 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37923f399ed409a6ab7ef3d7baef4da55b6aa22861de0d066563c695efa48dfd\": container with ID starting with 37923f399ed409a6ab7ef3d7baef4da55b6aa22861de0d066563c695efa48dfd not found: ID does not exist" containerID="37923f399ed409a6ab7ef3d7baef4da55b6aa22861de0d066563c695efa48dfd" Feb 19 22:35:42 crc kubenswrapper[4795]: I0219 22:35:42.478837 4795 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"37923f399ed409a6ab7ef3d7baef4da55b6aa22861de0d066563c695efa48dfd"} err="failed to get container status \"37923f399ed409a6ab7ef3d7baef4da55b6aa22861de0d066563c695efa48dfd\": rpc error: code = NotFound desc = could not find container \"37923f399ed409a6ab7ef3d7baef4da55b6aa22861de0d066563c695efa48dfd\": container with ID starting with 37923f399ed409a6ab7ef3d7baef4da55b6aa22861de0d066563c695efa48dfd not found: ID does not exist" Feb 19 22:35:42 crc kubenswrapper[4795]: I0219 22:35:42.478879 4795 scope.go:117] "RemoveContainer" containerID="3f36ea63387d19c31183daecf26693f1c9928e033424c3ff71d7ea7db5f54740" Feb 19 22:35:42 crc kubenswrapper[4795]: E0219 22:35:42.479251 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f36ea63387d19c31183daecf26693f1c9928e033424c3ff71d7ea7db5f54740\": container with ID starting with 3f36ea63387d19c31183daecf26693f1c9928e033424c3ff71d7ea7db5f54740 not found: ID does not exist" containerID="3f36ea63387d19c31183daecf26693f1c9928e033424c3ff71d7ea7db5f54740" Feb 19 22:35:42 crc kubenswrapper[4795]: I0219 22:35:42.479285 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f36ea63387d19c31183daecf26693f1c9928e033424c3ff71d7ea7db5f54740"} err="failed to get container status \"3f36ea63387d19c31183daecf26693f1c9928e033424c3ff71d7ea7db5f54740\": rpc error: code = NotFound desc = could not find container \"3f36ea63387d19c31183daecf26693f1c9928e033424c3ff71d7ea7db5f54740\": container with ID starting with 3f36ea63387d19c31183daecf26693f1c9928e033424c3ff71d7ea7db5f54740 not found: ID does not exist" Feb 19 22:35:43 crc kubenswrapper[4795]: I0219 22:35:43.530868 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="677da4f8-3439-41c1-b491-46ad22cf8f99" path="/var/lib/kubelet/pods/677da4f8-3439-41c1-b491-46ad22cf8f99/volumes" Feb 19 22:35:58 crc kubenswrapper[4795]: I0219 
22:35:58.427217 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 22:35:58 crc kubenswrapper[4795]: I0219 22:35:58.427915 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 22:36:28 crc kubenswrapper[4795]: I0219 22:36:28.427858 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 22:36:28 crc kubenswrapper[4795]: I0219 22:36:28.428489 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 22:36:28 crc kubenswrapper[4795]: I0219 22:36:28.428549 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" Feb 19 22:36:28 crc kubenswrapper[4795]: I0219 22:36:28.429227 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a74e459fc04040e78e04a2f2c4ee65cfd25848c3c5a03bfbff2374c621904cd2"} pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 22:36:28 crc kubenswrapper[4795]: I0219 22:36:28.429317 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" containerID="cri-o://a74e459fc04040e78e04a2f2c4ee65cfd25848c3c5a03bfbff2374c621904cd2" gracePeriod=600 Feb 19 22:36:28 crc kubenswrapper[4795]: I0219 22:36:28.772721 4795 generic.go:334] "Generic (PLEG): container finished" podID="7591bc58-96f5-486a-8653-0ad93938b019" containerID="a74e459fc04040e78e04a2f2c4ee65cfd25848c3c5a03bfbff2374c621904cd2" exitCode=0 Feb 19 22:36:28 crc kubenswrapper[4795]: I0219 22:36:28.772786 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerDied","Data":"a74e459fc04040e78e04a2f2c4ee65cfd25848c3c5a03bfbff2374c621904cd2"} Feb 19 22:36:28 crc kubenswrapper[4795]: I0219 22:36:28.773101 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerStarted","Data":"2e1a23cb785a07f6fbead34d3a07564cd041d28b6cd22a143a851674ad80e799"} Feb 19 22:36:28 crc kubenswrapper[4795]: I0219 22:36:28.773125 4795 scope.go:117] "RemoveContainer" containerID="9c63131d970040b18f3ed947da107f77f33be5db6a5f5930540c14c137bd2e9f" Feb 19 22:37:24 crc kubenswrapper[4795]: I0219 22:37:24.927668 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fhss8"] Feb 19 22:37:24 crc kubenswrapper[4795]: E0219 22:37:24.928631 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="677da4f8-3439-41c1-b491-46ad22cf8f99" containerName="extract-utilities" Feb 19 22:37:24 crc 
kubenswrapper[4795]: I0219 22:37:24.928651 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="677da4f8-3439-41c1-b491-46ad22cf8f99" containerName="extract-utilities" Feb 19 22:37:24 crc kubenswrapper[4795]: E0219 22:37:24.928667 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="677da4f8-3439-41c1-b491-46ad22cf8f99" containerName="extract-content" Feb 19 22:37:24 crc kubenswrapper[4795]: I0219 22:37:24.928675 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="677da4f8-3439-41c1-b491-46ad22cf8f99" containerName="extract-content" Feb 19 22:37:24 crc kubenswrapper[4795]: E0219 22:37:24.928689 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="677da4f8-3439-41c1-b491-46ad22cf8f99" containerName="registry-server" Feb 19 22:37:24 crc kubenswrapper[4795]: I0219 22:37:24.928699 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="677da4f8-3439-41c1-b491-46ad22cf8f99" containerName="registry-server" Feb 19 22:37:24 crc kubenswrapper[4795]: I0219 22:37:24.928884 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="677da4f8-3439-41c1-b491-46ad22cf8f99" containerName="registry-server" Feb 19 22:37:24 crc kubenswrapper[4795]: I0219 22:37:24.930192 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fhss8" Feb 19 22:37:24 crc kubenswrapper[4795]: I0219 22:37:24.979279 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fhss8"] Feb 19 22:37:25 crc kubenswrapper[4795]: I0219 22:37:25.063010 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff-catalog-content\") pod \"community-operators-fhss8\" (UID: \"1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff\") " pod="openshift-marketplace/community-operators-fhss8" Feb 19 22:37:25 crc kubenswrapper[4795]: I0219 22:37:25.063079 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9249\" (UniqueName: \"kubernetes.io/projected/1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff-kube-api-access-w9249\") pod \"community-operators-fhss8\" (UID: \"1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff\") " pod="openshift-marketplace/community-operators-fhss8" Feb 19 22:37:25 crc kubenswrapper[4795]: I0219 22:37:25.063555 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff-utilities\") pod \"community-operators-fhss8\" (UID: \"1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff\") " pod="openshift-marketplace/community-operators-fhss8" Feb 19 22:37:25 crc kubenswrapper[4795]: I0219 22:37:25.165451 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff-utilities\") pod \"community-operators-fhss8\" (UID: \"1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff\") " pod="openshift-marketplace/community-operators-fhss8" Feb 19 22:37:25 crc kubenswrapper[4795]: I0219 22:37:25.165552 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff-catalog-content\") pod \"community-operators-fhss8\" (UID: \"1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff\") " pod="openshift-marketplace/community-operators-fhss8" Feb 19 22:37:25 crc kubenswrapper[4795]: I0219 22:37:25.165590 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9249\" (UniqueName: \"kubernetes.io/projected/1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff-kube-api-access-w9249\") pod \"community-operators-fhss8\" (UID: \"1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff\") " pod="openshift-marketplace/community-operators-fhss8" Feb 19 22:37:25 crc kubenswrapper[4795]: I0219 22:37:25.165964 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff-utilities\") pod \"community-operators-fhss8\" (UID: \"1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff\") " pod="openshift-marketplace/community-operators-fhss8" Feb 19 22:37:25 crc kubenswrapper[4795]: I0219 22:37:25.166209 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff-catalog-content\") pod \"community-operators-fhss8\" (UID: \"1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff\") " pod="openshift-marketplace/community-operators-fhss8" Feb 19 22:37:25 crc kubenswrapper[4795]: I0219 22:37:25.184228 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9249\" (UniqueName: \"kubernetes.io/projected/1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff-kube-api-access-w9249\") pod \"community-operators-fhss8\" (UID: \"1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff\") " pod="openshift-marketplace/community-operators-fhss8" Feb 19 22:37:25 crc kubenswrapper[4795]: I0219 22:37:25.310218 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fhss8" Feb 19 22:37:25 crc kubenswrapper[4795]: I0219 22:37:25.757613 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fhss8"] Feb 19 22:37:25 crc kubenswrapper[4795]: W0219 22:37:25.762358 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1fb7d96a_dce2_4dba_a43e_1fba3ebcb3ff.slice/crio-c0811f6d4eee4f68dfb1152548dce3ed43e0601930423a4305095732e7baa753 WatchSource:0}: Error finding container c0811f6d4eee4f68dfb1152548dce3ed43e0601930423a4305095732e7baa753: Status 404 returned error can't find the container with id c0811f6d4eee4f68dfb1152548dce3ed43e0601930423a4305095732e7baa753 Feb 19 22:37:26 crc kubenswrapper[4795]: I0219 22:37:26.246008 4795 generic.go:334] "Generic (PLEG): container finished" podID="1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff" containerID="8ab7d51f60e94e755f5efe3ea3c083c30640eb7d75d5ad1c4fcb8e6ac37739fa" exitCode=0 Feb 19 22:37:26 crc kubenswrapper[4795]: I0219 22:37:26.246091 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fhss8" event={"ID":"1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff","Type":"ContainerDied","Data":"8ab7d51f60e94e755f5efe3ea3c083c30640eb7d75d5ad1c4fcb8e6ac37739fa"} Feb 19 22:37:26 crc kubenswrapper[4795]: I0219 22:37:26.246330 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fhss8" event={"ID":"1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff","Type":"ContainerStarted","Data":"c0811f6d4eee4f68dfb1152548dce3ed43e0601930423a4305095732e7baa753"} Feb 19 22:37:26 crc kubenswrapper[4795]: I0219 22:37:26.248564 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 22:37:28 crc kubenswrapper[4795]: I0219 22:37:28.266753 4795 generic.go:334] "Generic (PLEG): container finished" 
podID="1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff" containerID="8e32f661daf3f3a36e8a7c04217754b0e696dd7557faca9ab17e09c9eabc3e7c" exitCode=0 Feb 19 22:37:28 crc kubenswrapper[4795]: I0219 22:37:28.266785 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fhss8" event={"ID":"1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff","Type":"ContainerDied","Data":"8e32f661daf3f3a36e8a7c04217754b0e696dd7557faca9ab17e09c9eabc3e7c"} Feb 19 22:37:29 crc kubenswrapper[4795]: I0219 22:37:29.273740 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fhss8" event={"ID":"1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff","Type":"ContainerStarted","Data":"5cd114c1fb2fb95c62c168b055daf3c378e71e5edd128a3284f05a448914a977"} Feb 19 22:37:29 crc kubenswrapper[4795]: I0219 22:37:29.288585 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fhss8" podStartSLOduration=2.862673698 podStartE2EDuration="5.288569279s" podCreationTimestamp="2026-02-19 22:37:24 +0000 UTC" firstStartedPulling="2026-02-19 22:37:26.248160173 +0000 UTC m=+4157.440678067" lastFinishedPulling="2026-02-19 22:37:28.674055784 +0000 UTC m=+4159.866573648" observedRunningTime="2026-02-19 22:37:29.287139828 +0000 UTC m=+4160.479657692" watchObservedRunningTime="2026-02-19 22:37:29.288569279 +0000 UTC m=+4160.481087143" Feb 19 22:37:35 crc kubenswrapper[4795]: I0219 22:37:35.311223 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fhss8" Feb 19 22:37:35 crc kubenswrapper[4795]: I0219 22:37:35.311795 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fhss8" Feb 19 22:37:35 crc kubenswrapper[4795]: I0219 22:37:35.364824 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fhss8" Feb 19 
22:37:36 crc kubenswrapper[4795]: I0219 22:37:36.367416 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fhss8" Feb 19 22:37:36 crc kubenswrapper[4795]: I0219 22:37:36.408559 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fhss8"] Feb 19 22:37:38 crc kubenswrapper[4795]: I0219 22:37:38.330690 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fhss8" podUID="1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff" containerName="registry-server" containerID="cri-o://5cd114c1fb2fb95c62c168b055daf3c378e71e5edd128a3284f05a448914a977" gracePeriod=2 Feb 19 22:37:38 crc kubenswrapper[4795]: I0219 22:37:38.779000 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fhss8" Feb 19 22:37:38 crc kubenswrapper[4795]: I0219 22:37:38.870840 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff-utilities\") pod \"1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff\" (UID: \"1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff\") " Feb 19 22:37:38 crc kubenswrapper[4795]: I0219 22:37:38.870914 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff-catalog-content\") pod \"1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff\" (UID: \"1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff\") " Feb 19 22:37:38 crc kubenswrapper[4795]: I0219 22:37:38.870942 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9249\" (UniqueName: \"kubernetes.io/projected/1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff-kube-api-access-w9249\") pod \"1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff\" (UID: \"1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff\") " Feb 
19 22:37:38 crc kubenswrapper[4795]: I0219 22:37:38.871893 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff-utilities" (OuterVolumeSpecName: "utilities") pod "1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff" (UID: "1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:37:38 crc kubenswrapper[4795]: I0219 22:37:38.879422 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff-kube-api-access-w9249" (OuterVolumeSpecName: "kube-api-access-w9249") pod "1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff" (UID: "1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff"). InnerVolumeSpecName "kube-api-access-w9249". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:37:38 crc kubenswrapper[4795]: I0219 22:37:38.945529 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff" (UID: "1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:37:38 crc kubenswrapper[4795]: I0219 22:37:38.971687 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 22:37:38 crc kubenswrapper[4795]: I0219 22:37:38.971720 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9249\" (UniqueName: \"kubernetes.io/projected/1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff-kube-api-access-w9249\") on node \"crc\" DevicePath \"\"" Feb 19 22:37:38 crc kubenswrapper[4795]: I0219 22:37:38.971729 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 22:37:39 crc kubenswrapper[4795]: I0219 22:37:39.341053 4795 generic.go:334] "Generic (PLEG): container finished" podID="1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff" containerID="5cd114c1fb2fb95c62c168b055daf3c378e71e5edd128a3284f05a448914a977" exitCode=0 Feb 19 22:37:39 crc kubenswrapper[4795]: I0219 22:37:39.341124 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fhss8" event={"ID":"1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff","Type":"ContainerDied","Data":"5cd114c1fb2fb95c62c168b055daf3c378e71e5edd128a3284f05a448914a977"} Feb 19 22:37:39 crc kubenswrapper[4795]: I0219 22:37:39.341248 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fhss8" Feb 19 22:37:39 crc kubenswrapper[4795]: I0219 22:37:39.342367 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fhss8" event={"ID":"1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff","Type":"ContainerDied","Data":"c0811f6d4eee4f68dfb1152548dce3ed43e0601930423a4305095732e7baa753"} Feb 19 22:37:39 crc kubenswrapper[4795]: I0219 22:37:39.342395 4795 scope.go:117] "RemoveContainer" containerID="5cd114c1fb2fb95c62c168b055daf3c378e71e5edd128a3284f05a448914a977" Feb 19 22:37:39 crc kubenswrapper[4795]: I0219 22:37:39.362914 4795 scope.go:117] "RemoveContainer" containerID="8e32f661daf3f3a36e8a7c04217754b0e696dd7557faca9ab17e09c9eabc3e7c" Feb 19 22:37:39 crc kubenswrapper[4795]: I0219 22:37:39.400625 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fhss8"] Feb 19 22:37:39 crc kubenswrapper[4795]: I0219 22:37:39.407730 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fhss8"] Feb 19 22:37:39 crc kubenswrapper[4795]: I0219 22:37:39.407910 4795 scope.go:117] "RemoveContainer" containerID="8ab7d51f60e94e755f5efe3ea3c083c30640eb7d75d5ad1c4fcb8e6ac37739fa" Feb 19 22:37:39 crc kubenswrapper[4795]: I0219 22:37:39.441752 4795 scope.go:117] "RemoveContainer" containerID="5cd114c1fb2fb95c62c168b055daf3c378e71e5edd128a3284f05a448914a977" Feb 19 22:37:39 crc kubenswrapper[4795]: E0219 22:37:39.442346 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cd114c1fb2fb95c62c168b055daf3c378e71e5edd128a3284f05a448914a977\": container with ID starting with 5cd114c1fb2fb95c62c168b055daf3c378e71e5edd128a3284f05a448914a977 not found: ID does not exist" containerID="5cd114c1fb2fb95c62c168b055daf3c378e71e5edd128a3284f05a448914a977" Feb 19 22:37:39 crc kubenswrapper[4795]: I0219 22:37:39.442396 4795 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cd114c1fb2fb95c62c168b055daf3c378e71e5edd128a3284f05a448914a977"} err="failed to get container status \"5cd114c1fb2fb95c62c168b055daf3c378e71e5edd128a3284f05a448914a977\": rpc error: code = NotFound desc = could not find container \"5cd114c1fb2fb95c62c168b055daf3c378e71e5edd128a3284f05a448914a977\": container with ID starting with 5cd114c1fb2fb95c62c168b055daf3c378e71e5edd128a3284f05a448914a977 not found: ID does not exist" Feb 19 22:37:39 crc kubenswrapper[4795]: I0219 22:37:39.442423 4795 scope.go:117] "RemoveContainer" containerID="8e32f661daf3f3a36e8a7c04217754b0e696dd7557faca9ab17e09c9eabc3e7c" Feb 19 22:37:39 crc kubenswrapper[4795]: E0219 22:37:39.443311 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e32f661daf3f3a36e8a7c04217754b0e696dd7557faca9ab17e09c9eabc3e7c\": container with ID starting with 8e32f661daf3f3a36e8a7c04217754b0e696dd7557faca9ab17e09c9eabc3e7c not found: ID does not exist" containerID="8e32f661daf3f3a36e8a7c04217754b0e696dd7557faca9ab17e09c9eabc3e7c" Feb 19 22:37:39 crc kubenswrapper[4795]: I0219 22:37:39.443377 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e32f661daf3f3a36e8a7c04217754b0e696dd7557faca9ab17e09c9eabc3e7c"} err="failed to get container status \"8e32f661daf3f3a36e8a7c04217754b0e696dd7557faca9ab17e09c9eabc3e7c\": rpc error: code = NotFound desc = could not find container \"8e32f661daf3f3a36e8a7c04217754b0e696dd7557faca9ab17e09c9eabc3e7c\": container with ID starting with 8e32f661daf3f3a36e8a7c04217754b0e696dd7557faca9ab17e09c9eabc3e7c not found: ID does not exist" Feb 19 22:37:39 crc kubenswrapper[4795]: I0219 22:37:39.443627 4795 scope.go:117] "RemoveContainer" containerID="8ab7d51f60e94e755f5efe3ea3c083c30640eb7d75d5ad1c4fcb8e6ac37739fa" Feb 19 22:37:39 crc kubenswrapper[4795]: E0219 
22:37:39.443985 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ab7d51f60e94e755f5efe3ea3c083c30640eb7d75d5ad1c4fcb8e6ac37739fa\": container with ID starting with 8ab7d51f60e94e755f5efe3ea3c083c30640eb7d75d5ad1c4fcb8e6ac37739fa not found: ID does not exist" containerID="8ab7d51f60e94e755f5efe3ea3c083c30640eb7d75d5ad1c4fcb8e6ac37739fa" Feb 19 22:37:39 crc kubenswrapper[4795]: I0219 22:37:39.444022 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ab7d51f60e94e755f5efe3ea3c083c30640eb7d75d5ad1c4fcb8e6ac37739fa"} err="failed to get container status \"8ab7d51f60e94e755f5efe3ea3c083c30640eb7d75d5ad1c4fcb8e6ac37739fa\": rpc error: code = NotFound desc = could not find container \"8ab7d51f60e94e755f5efe3ea3c083c30640eb7d75d5ad1c4fcb8e6ac37739fa\": container with ID starting with 8ab7d51f60e94e755f5efe3ea3c083c30640eb7d75d5ad1c4fcb8e6ac37739fa not found: ID does not exist" Feb 19 22:37:39 crc kubenswrapper[4795]: I0219 22:37:39.523324 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff" path="/var/lib/kubelet/pods/1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff/volumes" Feb 19 22:38:28 crc kubenswrapper[4795]: I0219 22:38:28.428247 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 22:38:28 crc kubenswrapper[4795]: I0219 22:38:28.428809 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Feb 19 22:38:58 crc kubenswrapper[4795]: I0219 22:38:58.427417 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 22:38:58 crc kubenswrapper[4795]: I0219 22:38:58.428064 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 22:39:28 crc kubenswrapper[4795]: I0219 22:39:28.427133 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 22:39:28 crc kubenswrapper[4795]: I0219 22:39:28.427609 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 22:39:28 crc kubenswrapper[4795]: I0219 22:39:28.427660 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" Feb 19 22:39:28 crc kubenswrapper[4795]: I0219 22:39:28.428266 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2e1a23cb785a07f6fbead34d3a07564cd041d28b6cd22a143a851674ad80e799"} 
pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 22:39:28 crc kubenswrapper[4795]: I0219 22:39:28.428319 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" containerID="cri-o://2e1a23cb785a07f6fbead34d3a07564cd041d28b6cd22a143a851674ad80e799" gracePeriod=600 Feb 19 22:39:28 crc kubenswrapper[4795]: E0219 22:39:28.558120 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:39:29 crc kubenswrapper[4795]: I0219 22:39:29.301133 4795 generic.go:334] "Generic (PLEG): container finished" podID="7591bc58-96f5-486a-8653-0ad93938b019" containerID="2e1a23cb785a07f6fbead34d3a07564cd041d28b6cd22a143a851674ad80e799" exitCode=0 Feb 19 22:39:29 crc kubenswrapper[4795]: I0219 22:39:29.301184 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerDied","Data":"2e1a23cb785a07f6fbead34d3a07564cd041d28b6cd22a143a851674ad80e799"} Feb 19 22:39:29 crc kubenswrapper[4795]: I0219 22:39:29.301596 4795 scope.go:117] "RemoveContainer" containerID="a74e459fc04040e78e04a2f2c4ee65cfd25848c3c5a03bfbff2374c621904cd2" Feb 19 22:39:29 crc kubenswrapper[4795]: I0219 22:39:29.301995 4795 scope.go:117] "RemoveContainer" containerID="2e1a23cb785a07f6fbead34d3a07564cd041d28b6cd22a143a851674ad80e799" Feb 
19 22:39:29 crc kubenswrapper[4795]: E0219 22:39:29.302207 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:39:44 crc kubenswrapper[4795]: I0219 22:39:44.513122 4795 scope.go:117] "RemoveContainer" containerID="2e1a23cb785a07f6fbead34d3a07564cd041d28b6cd22a143a851674ad80e799" Feb 19 22:39:44 crc kubenswrapper[4795]: E0219 22:39:44.514603 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:39:59 crc kubenswrapper[4795]: I0219 22:39:59.520223 4795 scope.go:117] "RemoveContainer" containerID="2e1a23cb785a07f6fbead34d3a07564cd041d28b6cd22a143a851674ad80e799" Feb 19 22:39:59 crc kubenswrapper[4795]: E0219 22:39:59.521365 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:40:11 crc kubenswrapper[4795]: I0219 22:40:11.512073 4795 scope.go:117] "RemoveContainer" 
containerID="2e1a23cb785a07f6fbead34d3a07564cd041d28b6cd22a143a851674ad80e799" Feb 19 22:40:11 crc kubenswrapper[4795]: E0219 22:40:11.513410 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:40:23 crc kubenswrapper[4795]: I0219 22:40:23.512365 4795 scope.go:117] "RemoveContainer" containerID="2e1a23cb785a07f6fbead34d3a07564cd041d28b6cd22a143a851674ad80e799" Feb 19 22:40:23 crc kubenswrapper[4795]: E0219 22:40:23.513202 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:40:37 crc kubenswrapper[4795]: I0219 22:40:37.511969 4795 scope.go:117] "RemoveContainer" containerID="2e1a23cb785a07f6fbead34d3a07564cd041d28b6cd22a143a851674ad80e799" Feb 19 22:40:37 crc kubenswrapper[4795]: E0219 22:40:37.512994 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:40:49 crc kubenswrapper[4795]: I0219 22:40:49.512694 4795 scope.go:117] 
"RemoveContainer" containerID="2e1a23cb785a07f6fbead34d3a07564cd041d28b6cd22a143a851674ad80e799" Feb 19 22:40:49 crc kubenswrapper[4795]: E0219 22:40:49.515788 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:41:00 crc kubenswrapper[4795]: I0219 22:41:00.511903 4795 scope.go:117] "RemoveContainer" containerID="2e1a23cb785a07f6fbead34d3a07564cd041d28b6cd22a143a851674ad80e799" Feb 19 22:41:00 crc kubenswrapper[4795]: E0219 22:41:00.512827 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:41:04 crc kubenswrapper[4795]: I0219 22:41:04.829441 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-6tsln"] Feb 19 22:41:04 crc kubenswrapper[4795]: I0219 22:41:04.835068 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-6tsln"] Feb 19 22:41:04 crc kubenswrapper[4795]: I0219 22:41:04.992699 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-dpkxw"] Feb 19 22:41:04 crc kubenswrapper[4795]: E0219 22:41:04.993040 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff" containerName="registry-server" Feb 19 22:41:04 crc kubenswrapper[4795]: I0219 
22:41:04.993061 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff" containerName="registry-server" Feb 19 22:41:04 crc kubenswrapper[4795]: E0219 22:41:04.993085 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff" containerName="extract-content" Feb 19 22:41:04 crc kubenswrapper[4795]: I0219 22:41:04.993093 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff" containerName="extract-content" Feb 19 22:41:04 crc kubenswrapper[4795]: E0219 22:41:04.993122 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff" containerName="extract-utilities" Feb 19 22:41:04 crc kubenswrapper[4795]: I0219 22:41:04.993131 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff" containerName="extract-utilities" Feb 19 22:41:04 crc kubenswrapper[4795]: I0219 22:41:04.993310 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fb7d96a-dce2-4dba-a43e-1fba3ebcb3ff" containerName="registry-server" Feb 19 22:41:04 crc kubenswrapper[4795]: I0219 22:41:04.993881 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-dpkxw" Feb 19 22:41:04 crc kubenswrapper[4795]: I0219 22:41:04.996950 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Feb 19 22:41:04 crc kubenswrapper[4795]: I0219 22:41:04.996979 4795 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-4rmf9" Feb 19 22:41:04 crc kubenswrapper[4795]: I0219 22:41:04.997522 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Feb 19 22:41:05 crc kubenswrapper[4795]: I0219 22:41:05.001597 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-dpkxw"] Feb 19 22:41:05 crc kubenswrapper[4795]: I0219 22:41:05.003759 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Feb 19 22:41:05 crc kubenswrapper[4795]: I0219 22:41:05.135479 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/5e61919b-4848-43ec-8a16-6d752a04c5ac-node-mnt\") pod \"crc-storage-crc-dpkxw\" (UID: \"5e61919b-4848-43ec-8a16-6d752a04c5ac\") " pod="crc-storage/crc-storage-crc-dpkxw" Feb 19 22:41:05 crc kubenswrapper[4795]: I0219 22:41:05.135620 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flhx7\" (UniqueName: \"kubernetes.io/projected/5e61919b-4848-43ec-8a16-6d752a04c5ac-kube-api-access-flhx7\") pod \"crc-storage-crc-dpkxw\" (UID: \"5e61919b-4848-43ec-8a16-6d752a04c5ac\") " pod="crc-storage/crc-storage-crc-dpkxw" Feb 19 22:41:05 crc kubenswrapper[4795]: I0219 22:41:05.135673 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/5e61919b-4848-43ec-8a16-6d752a04c5ac-crc-storage\") pod \"crc-storage-crc-dpkxw\" (UID: 
\"5e61919b-4848-43ec-8a16-6d752a04c5ac\") " pod="crc-storage/crc-storage-crc-dpkxw"
Feb 19 22:41:05 crc kubenswrapper[4795]: I0219 22:41:05.236460 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/5e61919b-4848-43ec-8a16-6d752a04c5ac-crc-storage\") pod \"crc-storage-crc-dpkxw\" (UID: \"5e61919b-4848-43ec-8a16-6d752a04c5ac\") " pod="crc-storage/crc-storage-crc-dpkxw"
Feb 19 22:41:05 crc kubenswrapper[4795]: I0219 22:41:05.236520 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/5e61919b-4848-43ec-8a16-6d752a04c5ac-node-mnt\") pod \"crc-storage-crc-dpkxw\" (UID: \"5e61919b-4848-43ec-8a16-6d752a04c5ac\") " pod="crc-storage/crc-storage-crc-dpkxw"
Feb 19 22:41:05 crc kubenswrapper[4795]: I0219 22:41:05.236576 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flhx7\" (UniqueName: \"kubernetes.io/projected/5e61919b-4848-43ec-8a16-6d752a04c5ac-kube-api-access-flhx7\") pod \"crc-storage-crc-dpkxw\" (UID: \"5e61919b-4848-43ec-8a16-6d752a04c5ac\") " pod="crc-storage/crc-storage-crc-dpkxw"
Feb 19 22:41:05 crc kubenswrapper[4795]: I0219 22:41:05.237444 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/5e61919b-4848-43ec-8a16-6d752a04c5ac-crc-storage\") pod \"crc-storage-crc-dpkxw\" (UID: \"5e61919b-4848-43ec-8a16-6d752a04c5ac\") " pod="crc-storage/crc-storage-crc-dpkxw"
Feb 19 22:41:05 crc kubenswrapper[4795]: I0219 22:41:05.237696 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/5e61919b-4848-43ec-8a16-6d752a04c5ac-node-mnt\") pod \"crc-storage-crc-dpkxw\" (UID: \"5e61919b-4848-43ec-8a16-6d752a04c5ac\") " pod="crc-storage/crc-storage-crc-dpkxw"
Feb 19 22:41:05 crc kubenswrapper[4795]: I0219 22:41:05.263642 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flhx7\" (UniqueName: \"kubernetes.io/projected/5e61919b-4848-43ec-8a16-6d752a04c5ac-kube-api-access-flhx7\") pod \"crc-storage-crc-dpkxw\" (UID: \"5e61919b-4848-43ec-8a16-6d752a04c5ac\") " pod="crc-storage/crc-storage-crc-dpkxw"
Feb 19 22:41:05 crc kubenswrapper[4795]: I0219 22:41:05.330898 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-dpkxw"
Feb 19 22:41:05 crc kubenswrapper[4795]: I0219 22:41:05.525036 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc847694-39ea-4c3c-bb58-0f920e59ac62" path="/var/lib/kubelet/pods/dc847694-39ea-4c3c-bb58-0f920e59ac62/volumes"
Feb 19 22:41:05 crc kubenswrapper[4795]: I0219 22:41:05.782749 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-dpkxw"]
Feb 19 22:41:06 crc kubenswrapper[4795]: I0219 22:41:06.126412 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-dpkxw" event={"ID":"5e61919b-4848-43ec-8a16-6d752a04c5ac","Type":"ContainerStarted","Data":"2ec619e5b6188cbce687e8d5092eede0d3323f72a885aad70e24f71d39f7a75b"}
Feb 19 22:41:07 crc kubenswrapper[4795]: I0219 22:41:07.133597 4795 generic.go:334] "Generic (PLEG): container finished" podID="5e61919b-4848-43ec-8a16-6d752a04c5ac" containerID="fa2e3c7da6ac02ed85489de382e40a9b438bc98edd6481546fefa8394b7e5fce" exitCode=0
Feb 19 22:41:07 crc kubenswrapper[4795]: I0219 22:41:07.133686 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-dpkxw" event={"ID":"5e61919b-4848-43ec-8a16-6d752a04c5ac","Type":"ContainerDied","Data":"fa2e3c7da6ac02ed85489de382e40a9b438bc98edd6481546fefa8394b7e5fce"}
Feb 19 22:41:08 crc kubenswrapper[4795]: I0219 22:41:08.412515 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-dpkxw"
Feb 19 22:41:08 crc kubenswrapper[4795]: I0219 22:41:08.480537 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flhx7\" (UniqueName: \"kubernetes.io/projected/5e61919b-4848-43ec-8a16-6d752a04c5ac-kube-api-access-flhx7\") pod \"5e61919b-4848-43ec-8a16-6d752a04c5ac\" (UID: \"5e61919b-4848-43ec-8a16-6d752a04c5ac\") "
Feb 19 22:41:08 crc kubenswrapper[4795]: I0219 22:41:08.480706 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/5e61919b-4848-43ec-8a16-6d752a04c5ac-node-mnt\") pod \"5e61919b-4848-43ec-8a16-6d752a04c5ac\" (UID: \"5e61919b-4848-43ec-8a16-6d752a04c5ac\") "
Feb 19 22:41:08 crc kubenswrapper[4795]: I0219 22:41:08.480795 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/5e61919b-4848-43ec-8a16-6d752a04c5ac-crc-storage\") pod \"5e61919b-4848-43ec-8a16-6d752a04c5ac\" (UID: \"5e61919b-4848-43ec-8a16-6d752a04c5ac\") "
Feb 19 22:41:08 crc kubenswrapper[4795]: I0219 22:41:08.480820 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5e61919b-4848-43ec-8a16-6d752a04c5ac-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "5e61919b-4848-43ec-8a16-6d752a04c5ac" (UID: "5e61919b-4848-43ec-8a16-6d752a04c5ac"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 19 22:41:08 crc kubenswrapper[4795]: I0219 22:41:08.481238 4795 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/5e61919b-4848-43ec-8a16-6d752a04c5ac-node-mnt\") on node \"crc\" DevicePath \"\""
Feb 19 22:41:08 crc kubenswrapper[4795]: I0219 22:41:08.485621 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e61919b-4848-43ec-8a16-6d752a04c5ac-kube-api-access-flhx7" (OuterVolumeSpecName: "kube-api-access-flhx7") pod "5e61919b-4848-43ec-8a16-6d752a04c5ac" (UID: "5e61919b-4848-43ec-8a16-6d752a04c5ac"). InnerVolumeSpecName "kube-api-access-flhx7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 22:41:08 crc kubenswrapper[4795]: I0219 22:41:08.501882 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e61919b-4848-43ec-8a16-6d752a04c5ac-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "5e61919b-4848-43ec-8a16-6d752a04c5ac" (UID: "5e61919b-4848-43ec-8a16-6d752a04c5ac"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 22:41:08 crc kubenswrapper[4795]: I0219 22:41:08.582760 4795 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/5e61919b-4848-43ec-8a16-6d752a04c5ac-crc-storage\") on node \"crc\" DevicePath \"\""
Feb 19 22:41:08 crc kubenswrapper[4795]: I0219 22:41:08.582800 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-flhx7\" (UniqueName: \"kubernetes.io/projected/5e61919b-4848-43ec-8a16-6d752a04c5ac-kube-api-access-flhx7\") on node \"crc\" DevicePath \"\""
Feb 19 22:41:09 crc kubenswrapper[4795]: I0219 22:41:09.151395 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-dpkxw" event={"ID":"5e61919b-4848-43ec-8a16-6d752a04c5ac","Type":"ContainerDied","Data":"2ec619e5b6188cbce687e8d5092eede0d3323f72a885aad70e24f71d39f7a75b"}
Feb 19 22:41:09 crc kubenswrapper[4795]: I0219 22:41:09.151437 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ec619e5b6188cbce687e8d5092eede0d3323f72a885aad70e24f71d39f7a75b"
Feb 19 22:41:09 crc kubenswrapper[4795]: I0219 22:41:09.151466 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-dpkxw"
Feb 19 22:41:10 crc kubenswrapper[4795]: I0219 22:41:10.852319 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-dpkxw"]
Feb 19 22:41:10 crc kubenswrapper[4795]: I0219 22:41:10.863009 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-dpkxw"]
Feb 19 22:41:10 crc kubenswrapper[4795]: I0219 22:41:10.995602 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-h7bnb"]
Feb 19 22:41:10 crc kubenswrapper[4795]: E0219 22:41:10.995871 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e61919b-4848-43ec-8a16-6d752a04c5ac" containerName="storage"
Feb 19 22:41:10 crc kubenswrapper[4795]: I0219 22:41:10.995883 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e61919b-4848-43ec-8a16-6d752a04c5ac" containerName="storage"
Feb 19 22:41:10 crc kubenswrapper[4795]: I0219 22:41:10.996033 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e61919b-4848-43ec-8a16-6d752a04c5ac" containerName="storage"
Feb 19 22:41:10 crc kubenswrapper[4795]: I0219 22:41:10.996471 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-h7bnb"
Feb 19 22:41:10 crc kubenswrapper[4795]: I0219 22:41:10.998620 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage"
Feb 19 22:41:10 crc kubenswrapper[4795]: I0219 22:41:10.998663 4795 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-4rmf9"
Feb 19 22:41:10 crc kubenswrapper[4795]: I0219 22:41:10.998627 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt"
Feb 19 22:41:10 crc kubenswrapper[4795]: I0219 22:41:10.998894 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt"
Feb 19 22:41:11 crc kubenswrapper[4795]: I0219 22:41:11.010090 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-h7bnb"]
Feb 19 22:41:11 crc kubenswrapper[4795]: I0219 22:41:11.134868 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/e4298263-238f-468e-b008-ebf615095a56-node-mnt\") pod \"crc-storage-crc-h7bnb\" (UID: \"e4298263-238f-468e-b008-ebf615095a56\") " pod="crc-storage/crc-storage-crc-h7bnb"
Feb 19 22:41:11 crc kubenswrapper[4795]: I0219 22:41:11.134916 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/e4298263-238f-468e-b008-ebf615095a56-crc-storage\") pod \"crc-storage-crc-h7bnb\" (UID: \"e4298263-238f-468e-b008-ebf615095a56\") " pod="crc-storage/crc-storage-crc-h7bnb"
Feb 19 22:41:11 crc kubenswrapper[4795]: I0219 22:41:11.135086 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9v484\" (UniqueName: \"kubernetes.io/projected/e4298263-238f-468e-b008-ebf615095a56-kube-api-access-9v484\") pod \"crc-storage-crc-h7bnb\" (UID: \"e4298263-238f-468e-b008-ebf615095a56\") " pod="crc-storage/crc-storage-crc-h7bnb"
Feb 19 22:41:11 crc kubenswrapper[4795]: I0219 22:41:11.236633 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/e4298263-238f-468e-b008-ebf615095a56-node-mnt\") pod \"crc-storage-crc-h7bnb\" (UID: \"e4298263-238f-468e-b008-ebf615095a56\") " pod="crc-storage/crc-storage-crc-h7bnb"
Feb 19 22:41:11 crc kubenswrapper[4795]: I0219 22:41:11.236686 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/e4298263-238f-468e-b008-ebf615095a56-crc-storage\") pod \"crc-storage-crc-h7bnb\" (UID: \"e4298263-238f-468e-b008-ebf615095a56\") " pod="crc-storage/crc-storage-crc-h7bnb"
Feb 19 22:41:11 crc kubenswrapper[4795]: I0219 22:41:11.236722 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9v484\" (UniqueName: \"kubernetes.io/projected/e4298263-238f-468e-b008-ebf615095a56-kube-api-access-9v484\") pod \"crc-storage-crc-h7bnb\" (UID: \"e4298263-238f-468e-b008-ebf615095a56\") " pod="crc-storage/crc-storage-crc-h7bnb"
Feb 19 22:41:11 crc kubenswrapper[4795]: I0219 22:41:11.237512 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/e4298263-238f-468e-b008-ebf615095a56-crc-storage\") pod \"crc-storage-crc-h7bnb\" (UID: \"e4298263-238f-468e-b008-ebf615095a56\") " pod="crc-storage/crc-storage-crc-h7bnb"
Feb 19 22:41:11 crc kubenswrapper[4795]: I0219 22:41:11.238698 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/e4298263-238f-468e-b008-ebf615095a56-node-mnt\") pod \"crc-storage-crc-h7bnb\" (UID: \"e4298263-238f-468e-b008-ebf615095a56\") " pod="crc-storage/crc-storage-crc-h7bnb"
Feb 19 22:41:11 crc kubenswrapper[4795]: I0219 22:41:11.257718 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9v484\" (UniqueName: \"kubernetes.io/projected/e4298263-238f-468e-b008-ebf615095a56-kube-api-access-9v484\") pod \"crc-storage-crc-h7bnb\" (UID: \"e4298263-238f-468e-b008-ebf615095a56\") " pod="crc-storage/crc-storage-crc-h7bnb"
Feb 19 22:41:11 crc kubenswrapper[4795]: I0219 22:41:11.342696 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-h7bnb"
Feb 19 22:41:11 crc kubenswrapper[4795]: I0219 22:41:11.521459 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e61919b-4848-43ec-8a16-6d752a04c5ac" path="/var/lib/kubelet/pods/5e61919b-4848-43ec-8a16-6d752a04c5ac/volumes"
Feb 19 22:41:11 crc kubenswrapper[4795]: I0219 22:41:11.770073 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-h7bnb"]
Feb 19 22:41:12 crc kubenswrapper[4795]: I0219 22:41:12.181493 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-h7bnb" event={"ID":"e4298263-238f-468e-b008-ebf615095a56","Type":"ContainerStarted","Data":"6d3623332377e39d8f770fbc105d739fb237218dab1b9b12419d65a8241cd9ff"}
Feb 19 22:41:13 crc kubenswrapper[4795]: I0219 22:41:13.190221 4795 generic.go:334] "Generic (PLEG): container finished" podID="e4298263-238f-468e-b008-ebf615095a56" containerID="a3123c77876bceedf67b9501c6d94d5632c3bf437df7dc0f4cd344e9184635e6" exitCode=0
Feb 19 22:41:13 crc kubenswrapper[4795]: I0219 22:41:13.190452 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-h7bnb" event={"ID":"e4298263-238f-468e-b008-ebf615095a56","Type":"ContainerDied","Data":"a3123c77876bceedf67b9501c6d94d5632c3bf437df7dc0f4cd344e9184635e6"}
Feb 19 22:41:14 crc kubenswrapper[4795]: I0219 22:41:14.571847 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-h7bnb"
Feb 19 22:41:14 crc kubenswrapper[4795]: I0219 22:41:14.676067 4795 scope.go:117] "RemoveContainer" containerID="1e723054d3a11aa58d4ff2996b03eae32a1cee8cb004d6442cd324f9c14218e2"
Feb 19 22:41:14 crc kubenswrapper[4795]: I0219 22:41:14.695862 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9v484\" (UniqueName: \"kubernetes.io/projected/e4298263-238f-468e-b008-ebf615095a56-kube-api-access-9v484\") pod \"e4298263-238f-468e-b008-ebf615095a56\" (UID: \"e4298263-238f-468e-b008-ebf615095a56\") "
Feb 19 22:41:14 crc kubenswrapper[4795]: I0219 22:41:14.695921 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/e4298263-238f-468e-b008-ebf615095a56-crc-storage\") pod \"e4298263-238f-468e-b008-ebf615095a56\" (UID: \"e4298263-238f-468e-b008-ebf615095a56\") "
Feb 19 22:41:14 crc kubenswrapper[4795]: I0219 22:41:14.695973 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/e4298263-238f-468e-b008-ebf615095a56-node-mnt\") pod \"e4298263-238f-468e-b008-ebf615095a56\" (UID: \"e4298263-238f-468e-b008-ebf615095a56\") "
Feb 19 22:41:14 crc kubenswrapper[4795]: I0219 22:41:14.696204 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e4298263-238f-468e-b008-ebf615095a56-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "e4298263-238f-468e-b008-ebf615095a56" (UID: "e4298263-238f-468e-b008-ebf615095a56"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 19 22:41:14 crc kubenswrapper[4795]: I0219 22:41:14.703034 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4298263-238f-468e-b008-ebf615095a56-kube-api-access-9v484" (OuterVolumeSpecName: "kube-api-access-9v484") pod "e4298263-238f-468e-b008-ebf615095a56" (UID: "e4298263-238f-468e-b008-ebf615095a56"). InnerVolumeSpecName "kube-api-access-9v484". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 22:41:14 crc kubenswrapper[4795]: I0219 22:41:14.717564 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4298263-238f-468e-b008-ebf615095a56-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "e4298263-238f-468e-b008-ebf615095a56" (UID: "e4298263-238f-468e-b008-ebf615095a56"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 22:41:14 crc kubenswrapper[4795]: I0219 22:41:14.797232 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9v484\" (UniqueName: \"kubernetes.io/projected/e4298263-238f-468e-b008-ebf615095a56-kube-api-access-9v484\") on node \"crc\" DevicePath \"\""
Feb 19 22:41:14 crc kubenswrapper[4795]: I0219 22:41:14.797276 4795 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/e4298263-238f-468e-b008-ebf615095a56-crc-storage\") on node \"crc\" DevicePath \"\""
Feb 19 22:41:14 crc kubenswrapper[4795]: I0219 22:41:14.797288 4795 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/e4298263-238f-468e-b008-ebf615095a56-node-mnt\") on node \"crc\" DevicePath \"\""
Feb 19 22:41:15 crc kubenswrapper[4795]: I0219 22:41:15.222914 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-h7bnb" event={"ID":"e4298263-238f-468e-b008-ebf615095a56","Type":"ContainerDied","Data":"6d3623332377e39d8f770fbc105d739fb237218dab1b9b12419d65a8241cd9ff"}
Feb 19 22:41:15 crc kubenswrapper[4795]: I0219 22:41:15.223379 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d3623332377e39d8f770fbc105d739fb237218dab1b9b12419d65a8241cd9ff"
Feb 19 22:41:15 crc kubenswrapper[4795]: I0219 22:41:15.223341 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-h7bnb"
Feb 19 22:41:15 crc kubenswrapper[4795]: I0219 22:41:15.511787 4795 scope.go:117] "RemoveContainer" containerID="2e1a23cb785a07f6fbead34d3a07564cd041d28b6cd22a143a851674ad80e799"
Feb 19 22:41:15 crc kubenswrapper[4795]: E0219 22:41:15.512306 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 22:41:29 crc kubenswrapper[4795]: I0219 22:41:29.519070 4795 scope.go:117] "RemoveContainer" containerID="2e1a23cb785a07f6fbead34d3a07564cd041d28b6cd22a143a851674ad80e799"
Feb 19 22:41:29 crc kubenswrapper[4795]: E0219 22:41:29.519922 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 22:41:41 crc kubenswrapper[4795]: I0219 22:41:41.512660 4795 scope.go:117] "RemoveContainer" containerID="2e1a23cb785a07f6fbead34d3a07564cd041d28b6cd22a143a851674ad80e799"
Feb 19 22:41:41 crc kubenswrapper[4795]: E0219 22:41:41.514889 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 22:41:56 crc kubenswrapper[4795]: I0219 22:41:56.511613 4795 scope.go:117] "RemoveContainer" containerID="2e1a23cb785a07f6fbead34d3a07564cd041d28b6cd22a143a851674ad80e799"
Feb 19 22:41:56 crc kubenswrapper[4795]: E0219 22:41:56.512334 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 22:42:07 crc kubenswrapper[4795]: I0219 22:42:07.512199 4795 scope.go:117] "RemoveContainer" containerID="2e1a23cb785a07f6fbead34d3a07564cd041d28b6cd22a143a851674ad80e799"
Feb 19 22:42:07 crc kubenswrapper[4795]: E0219 22:42:07.513260 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 22:42:20 crc kubenswrapper[4795]: I0219 22:42:20.511687 4795 scope.go:117] "RemoveContainer" containerID="2e1a23cb785a07f6fbead34d3a07564cd041d28b6cd22a143a851674ad80e799"
Feb 19 22:42:20 crc kubenswrapper[4795]: E0219 22:42:20.512484 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 22:42:34 crc kubenswrapper[4795]: I0219 22:42:34.511605 4795 scope.go:117] "RemoveContainer" containerID="2e1a23cb785a07f6fbead34d3a07564cd041d28b6cd22a143a851674ad80e799"
Feb 19 22:42:34 crc kubenswrapper[4795]: E0219 22:42:34.512534 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 22:42:46 crc kubenswrapper[4795]: I0219 22:42:46.511994 4795 scope.go:117] "RemoveContainer" containerID="2e1a23cb785a07f6fbead34d3a07564cd041d28b6cd22a143a851674ad80e799"
Feb 19 22:42:46 crc kubenswrapper[4795]: E0219 22:42:46.512936 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 22:42:57 crc kubenswrapper[4795]: I0219 22:42:57.201453 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8pwv8"]
Feb 19 22:42:57 crc kubenswrapper[4795]: E0219 22:42:57.202260 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4298263-238f-468e-b008-ebf615095a56" containerName="storage"
Feb 19 22:42:57 crc kubenswrapper[4795]: I0219 22:42:57.202275 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4298263-238f-468e-b008-ebf615095a56" containerName="storage"
Feb 19 22:42:57 crc kubenswrapper[4795]: I0219 22:42:57.202432 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4298263-238f-468e-b008-ebf615095a56" containerName="storage"
Feb 19 22:42:57 crc kubenswrapper[4795]: I0219 22:42:57.203436 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8pwv8"
Feb 19 22:42:57 crc kubenswrapper[4795]: I0219 22:42:57.219726 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8pwv8"]
Feb 19 22:42:57 crc kubenswrapper[4795]: I0219 22:42:57.293997 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d5613ba-3d64-49d7-bba7-7b828e1c2948-utilities\") pod \"redhat-marketplace-8pwv8\" (UID: \"2d5613ba-3d64-49d7-bba7-7b828e1c2948\") " pod="openshift-marketplace/redhat-marketplace-8pwv8"
Feb 19 22:42:57 crc kubenswrapper[4795]: I0219 22:42:57.294113 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgtjx\" (UniqueName: \"kubernetes.io/projected/2d5613ba-3d64-49d7-bba7-7b828e1c2948-kube-api-access-jgtjx\") pod \"redhat-marketplace-8pwv8\" (UID: \"2d5613ba-3d64-49d7-bba7-7b828e1c2948\") " pod="openshift-marketplace/redhat-marketplace-8pwv8"
Feb 19 22:42:57 crc kubenswrapper[4795]: I0219 22:42:57.294277 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d5613ba-3d64-49d7-bba7-7b828e1c2948-catalog-content\") pod \"redhat-marketplace-8pwv8\" (UID: \"2d5613ba-3d64-49d7-bba7-7b828e1c2948\") " pod="openshift-marketplace/redhat-marketplace-8pwv8"
Feb 19 22:42:57 crc kubenswrapper[4795]: I0219 22:42:57.395049 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d5613ba-3d64-49d7-bba7-7b828e1c2948-catalog-content\") pod \"redhat-marketplace-8pwv8\" (UID: \"2d5613ba-3d64-49d7-bba7-7b828e1c2948\") " pod="openshift-marketplace/redhat-marketplace-8pwv8"
Feb 19 22:42:57 crc kubenswrapper[4795]: I0219 22:42:57.395126 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d5613ba-3d64-49d7-bba7-7b828e1c2948-utilities\") pod \"redhat-marketplace-8pwv8\" (UID: \"2d5613ba-3d64-49d7-bba7-7b828e1c2948\") " pod="openshift-marketplace/redhat-marketplace-8pwv8"
Feb 19 22:42:57 crc kubenswrapper[4795]: I0219 22:42:57.395178 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgtjx\" (UniqueName: \"kubernetes.io/projected/2d5613ba-3d64-49d7-bba7-7b828e1c2948-kube-api-access-jgtjx\") pod \"redhat-marketplace-8pwv8\" (UID: \"2d5613ba-3d64-49d7-bba7-7b828e1c2948\") " pod="openshift-marketplace/redhat-marketplace-8pwv8"
Feb 19 22:42:57 crc kubenswrapper[4795]: I0219 22:42:57.395863 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d5613ba-3d64-49d7-bba7-7b828e1c2948-catalog-content\") pod \"redhat-marketplace-8pwv8\" (UID: \"2d5613ba-3d64-49d7-bba7-7b828e1c2948\") " pod="openshift-marketplace/redhat-marketplace-8pwv8"
Feb 19 22:42:57 crc kubenswrapper[4795]: I0219 22:42:57.395971 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d5613ba-3d64-49d7-bba7-7b828e1c2948-utilities\") pod \"redhat-marketplace-8pwv8\" (UID: \"2d5613ba-3d64-49d7-bba7-7b828e1c2948\") " pod="openshift-marketplace/redhat-marketplace-8pwv8"
Feb 19 22:42:57 crc kubenswrapper[4795]: I0219 22:42:57.416378 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgtjx\" (UniqueName: \"kubernetes.io/projected/2d5613ba-3d64-49d7-bba7-7b828e1c2948-kube-api-access-jgtjx\") pod \"redhat-marketplace-8pwv8\" (UID: \"2d5613ba-3d64-49d7-bba7-7b828e1c2948\") " pod="openshift-marketplace/redhat-marketplace-8pwv8"
Feb 19 22:42:57 crc kubenswrapper[4795]: I0219 22:42:57.522631 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8pwv8"
Feb 19 22:42:57 crc kubenswrapper[4795]: I0219 22:42:57.976328 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8pwv8"]
Feb 19 22:42:58 crc kubenswrapper[4795]: I0219 22:42:58.511514 4795 scope.go:117] "RemoveContainer" containerID="2e1a23cb785a07f6fbead34d3a07564cd041d28b6cd22a143a851674ad80e799"
Feb 19 22:42:58 crc kubenswrapper[4795]: E0219 22:42:58.513608 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 22:42:58 crc kubenswrapper[4795]: I0219 22:42:58.972208 4795 generic.go:334] "Generic (PLEG): container finished" podID="2d5613ba-3d64-49d7-bba7-7b828e1c2948" containerID="ee34c575257d01581af5973462b336155233c634cf235cab5c2bdd649b3174f1" exitCode=0
Feb 19 22:42:58 crc kubenswrapper[4795]: I0219 22:42:58.972254 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8pwv8" event={"ID":"2d5613ba-3d64-49d7-bba7-7b828e1c2948","Type":"ContainerDied","Data":"ee34c575257d01581af5973462b336155233c634cf235cab5c2bdd649b3174f1"}
Feb 19 22:42:58 crc kubenswrapper[4795]: I0219 22:42:58.972283 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8pwv8" event={"ID":"2d5613ba-3d64-49d7-bba7-7b828e1c2948","Type":"ContainerStarted","Data":"2f63c0c313b1177cbb92f2f2f5734fd3bb3497394ee74fa185fbfa12b567e96a"}
Feb 19 22:42:58 crc kubenswrapper[4795]: I0219 22:42:58.975308 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 19 22:42:59 crc kubenswrapper[4795]: I0219 22:42:59.986114 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8pwv8" event={"ID":"2d5613ba-3d64-49d7-bba7-7b828e1c2948","Type":"ContainerStarted","Data":"28c5efc54603a1cc911245b17436015b2671604c79ee3a551ddf2bc0c76f632e"}
Feb 19 22:43:00 crc kubenswrapper[4795]: I0219 22:43:00.994940 4795 generic.go:334] "Generic (PLEG): container finished" podID="2d5613ba-3d64-49d7-bba7-7b828e1c2948" containerID="28c5efc54603a1cc911245b17436015b2671604c79ee3a551ddf2bc0c76f632e" exitCode=0
Feb 19 22:43:00 crc kubenswrapper[4795]: I0219 22:43:00.994983 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8pwv8" event={"ID":"2d5613ba-3d64-49d7-bba7-7b828e1c2948","Type":"ContainerDied","Data":"28c5efc54603a1cc911245b17436015b2671604c79ee3a551ddf2bc0c76f632e"}
Feb 19 22:43:02 crc kubenswrapper[4795]: I0219 22:43:02.004863 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8pwv8" event={"ID":"2d5613ba-3d64-49d7-bba7-7b828e1c2948","Type":"ContainerStarted","Data":"d4bef00c8fcb6636b96945d0469617b366fc37593fd54b1ea94e9ca5c207ab22"}
Feb 19 22:43:02 crc kubenswrapper[4795]: I0219 22:43:02.026409 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8pwv8" podStartSLOduration=2.374421014 podStartE2EDuration="5.026389137s" podCreationTimestamp="2026-02-19 22:42:57 +0000 UTC" firstStartedPulling="2026-02-19 22:42:58.974860907 +0000 UTC m=+4490.167378811" lastFinishedPulling="2026-02-19 22:43:01.62682907 +0000 UTC m=+4492.819346934" observedRunningTime="2026-02-19 22:43:02.025156011 +0000 UTC m=+4493.217673945" watchObservedRunningTime="2026-02-19 22:43:02.026389137 +0000 UTC m=+4493.218907001"
Feb 19 22:43:07 crc kubenswrapper[4795]: I0219 22:43:07.523301 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8pwv8"
Feb 19 22:43:07 crc kubenswrapper[4795]: I0219 22:43:07.524335 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8pwv8"
Feb 19 22:43:07 crc kubenswrapper[4795]: I0219 22:43:07.569155 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8pwv8"
Feb 19 22:43:08 crc kubenswrapper[4795]: I0219 22:43:08.109673 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8pwv8"
Feb 19 22:43:08 crc kubenswrapper[4795]: I0219 22:43:08.179091 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8pwv8"]
Feb 19 22:43:09 crc kubenswrapper[4795]: I0219 22:43:09.521026 4795 scope.go:117] "RemoveContainer" containerID="2e1a23cb785a07f6fbead34d3a07564cd041d28b6cd22a143a851674ad80e799"
Feb 19 22:43:09 crc kubenswrapper[4795]: E0219 22:43:09.522216 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 22:43:10 crc kubenswrapper[4795]: I0219 22:43:10.064910 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8pwv8" podUID="2d5613ba-3d64-49d7-bba7-7b828e1c2948" containerName="registry-server" containerID="cri-o://d4bef00c8fcb6636b96945d0469617b366fc37593fd54b1ea94e9ca5c207ab22" gracePeriod=2
Feb 19 22:43:10 crc kubenswrapper[4795]: I0219 22:43:10.564000 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8pwv8"
Feb 19 22:43:10 crc kubenswrapper[4795]: I0219 22:43:10.691267 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d5613ba-3d64-49d7-bba7-7b828e1c2948-utilities\") pod \"2d5613ba-3d64-49d7-bba7-7b828e1c2948\" (UID: \"2d5613ba-3d64-49d7-bba7-7b828e1c2948\") "
Feb 19 22:43:10 crc kubenswrapper[4795]: I0219 22:43:10.691645 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d5613ba-3d64-49d7-bba7-7b828e1c2948-catalog-content\") pod \"2d5613ba-3d64-49d7-bba7-7b828e1c2948\" (UID: \"2d5613ba-3d64-49d7-bba7-7b828e1c2948\") "
Feb 19 22:43:10 crc kubenswrapper[4795]: I0219 22:43:10.691718 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jgtjx\" (UniqueName: \"kubernetes.io/projected/2d5613ba-3d64-49d7-bba7-7b828e1c2948-kube-api-access-jgtjx\") pod \"2d5613ba-3d64-49d7-bba7-7b828e1c2948\" (UID: \"2d5613ba-3d64-49d7-bba7-7b828e1c2948\") "
Feb 19 22:43:10 crc kubenswrapper[4795]: I0219 22:43:10.694251 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d5613ba-3d64-49d7-bba7-7b828e1c2948-utilities" (OuterVolumeSpecName: "utilities") pod "2d5613ba-3d64-49d7-bba7-7b828e1c2948" (UID: "2d5613ba-3d64-49d7-bba7-7b828e1c2948"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 22:43:10 crc kubenswrapper[4795]: I0219 22:43:10.696877 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d5613ba-3d64-49d7-bba7-7b828e1c2948-kube-api-access-jgtjx" (OuterVolumeSpecName: "kube-api-access-jgtjx") pod "2d5613ba-3d64-49d7-bba7-7b828e1c2948" (UID: "2d5613ba-3d64-49d7-bba7-7b828e1c2948"). InnerVolumeSpecName "kube-api-access-jgtjx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 22:43:10 crc kubenswrapper[4795]: I0219 22:43:10.715533 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d5613ba-3d64-49d7-bba7-7b828e1c2948-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2d5613ba-3d64-49d7-bba7-7b828e1c2948" (UID: "2d5613ba-3d64-49d7-bba7-7b828e1c2948"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 22:43:10 crc kubenswrapper[4795]: I0219 22:43:10.793408 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d5613ba-3d64-49d7-bba7-7b828e1c2948-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 22:43:10 crc kubenswrapper[4795]: I0219 22:43:10.793443 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d5613ba-3d64-49d7-bba7-7b828e1c2948-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 22:43:10 crc kubenswrapper[4795]: I0219 22:43:10.793458 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jgtjx\" (UniqueName: \"kubernetes.io/projected/2d5613ba-3d64-49d7-bba7-7b828e1c2948-kube-api-access-jgtjx\") on node \"crc\" DevicePath \"\""
Feb 19 22:43:11 crc kubenswrapper[4795]: I0219 22:43:11.075034 4795 generic.go:334] "Generic (PLEG): container finished" podID="2d5613ba-3d64-49d7-bba7-7b828e1c2948" containerID="d4bef00c8fcb6636b96945d0469617b366fc37593fd54b1ea94e9ca5c207ab22" exitCode=0
Feb 19 22:43:11 crc kubenswrapper[4795]: I0219 22:43:11.075103 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8pwv8" event={"ID":"2d5613ba-3d64-49d7-bba7-7b828e1c2948","Type":"ContainerDied","Data":"d4bef00c8fcb6636b96945d0469617b366fc37593fd54b1ea94e9ca5c207ab22"}
Feb 19 22:43:11 crc kubenswrapper[4795]: I0219 22:43:11.075223 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8pwv8" Feb 19 22:43:11 crc kubenswrapper[4795]: I0219 22:43:11.075252 4795 scope.go:117] "RemoveContainer" containerID="d4bef00c8fcb6636b96945d0469617b366fc37593fd54b1ea94e9ca5c207ab22" Feb 19 22:43:11 crc kubenswrapper[4795]: I0219 22:43:11.075238 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8pwv8" event={"ID":"2d5613ba-3d64-49d7-bba7-7b828e1c2948","Type":"ContainerDied","Data":"2f63c0c313b1177cbb92f2f2f5734fd3bb3497394ee74fa185fbfa12b567e96a"} Feb 19 22:43:11 crc kubenswrapper[4795]: I0219 22:43:11.103093 4795 scope.go:117] "RemoveContainer" containerID="28c5efc54603a1cc911245b17436015b2671604c79ee3a551ddf2bc0c76f632e" Feb 19 22:43:11 crc kubenswrapper[4795]: I0219 22:43:11.127672 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8pwv8"] Feb 19 22:43:11 crc kubenswrapper[4795]: I0219 22:43:11.138021 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8pwv8"] Feb 19 22:43:11 crc kubenswrapper[4795]: I0219 22:43:11.141184 4795 scope.go:117] "RemoveContainer" containerID="ee34c575257d01581af5973462b336155233c634cf235cab5c2bdd649b3174f1" Feb 19 22:43:11 crc kubenswrapper[4795]: I0219 22:43:11.159719 4795 scope.go:117] "RemoveContainer" containerID="d4bef00c8fcb6636b96945d0469617b366fc37593fd54b1ea94e9ca5c207ab22" Feb 19 22:43:11 crc kubenswrapper[4795]: E0219 22:43:11.160329 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4bef00c8fcb6636b96945d0469617b366fc37593fd54b1ea94e9ca5c207ab22\": container with ID starting with d4bef00c8fcb6636b96945d0469617b366fc37593fd54b1ea94e9ca5c207ab22 not found: ID does not exist" containerID="d4bef00c8fcb6636b96945d0469617b366fc37593fd54b1ea94e9ca5c207ab22" Feb 19 22:43:11 crc kubenswrapper[4795]: I0219 22:43:11.160369 4795 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4bef00c8fcb6636b96945d0469617b366fc37593fd54b1ea94e9ca5c207ab22"} err="failed to get container status \"d4bef00c8fcb6636b96945d0469617b366fc37593fd54b1ea94e9ca5c207ab22\": rpc error: code = NotFound desc = could not find container \"d4bef00c8fcb6636b96945d0469617b366fc37593fd54b1ea94e9ca5c207ab22\": container with ID starting with d4bef00c8fcb6636b96945d0469617b366fc37593fd54b1ea94e9ca5c207ab22 not found: ID does not exist" Feb 19 22:43:11 crc kubenswrapper[4795]: I0219 22:43:11.160395 4795 scope.go:117] "RemoveContainer" containerID="28c5efc54603a1cc911245b17436015b2671604c79ee3a551ddf2bc0c76f632e" Feb 19 22:43:11 crc kubenswrapper[4795]: E0219 22:43:11.160744 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28c5efc54603a1cc911245b17436015b2671604c79ee3a551ddf2bc0c76f632e\": container with ID starting with 28c5efc54603a1cc911245b17436015b2671604c79ee3a551ddf2bc0c76f632e not found: ID does not exist" containerID="28c5efc54603a1cc911245b17436015b2671604c79ee3a551ddf2bc0c76f632e" Feb 19 22:43:11 crc kubenswrapper[4795]: I0219 22:43:11.160804 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28c5efc54603a1cc911245b17436015b2671604c79ee3a551ddf2bc0c76f632e"} err="failed to get container status \"28c5efc54603a1cc911245b17436015b2671604c79ee3a551ddf2bc0c76f632e\": rpc error: code = NotFound desc = could not find container \"28c5efc54603a1cc911245b17436015b2671604c79ee3a551ddf2bc0c76f632e\": container with ID starting with 28c5efc54603a1cc911245b17436015b2671604c79ee3a551ddf2bc0c76f632e not found: ID does not exist" Feb 19 22:43:11 crc kubenswrapper[4795]: I0219 22:43:11.160847 4795 scope.go:117] "RemoveContainer" containerID="ee34c575257d01581af5973462b336155233c634cf235cab5c2bdd649b3174f1" Feb 19 22:43:11 crc kubenswrapper[4795]: E0219 
22:43:11.161362 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee34c575257d01581af5973462b336155233c634cf235cab5c2bdd649b3174f1\": container with ID starting with ee34c575257d01581af5973462b336155233c634cf235cab5c2bdd649b3174f1 not found: ID does not exist" containerID="ee34c575257d01581af5973462b336155233c634cf235cab5c2bdd649b3174f1" Feb 19 22:43:11 crc kubenswrapper[4795]: I0219 22:43:11.161402 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee34c575257d01581af5973462b336155233c634cf235cab5c2bdd649b3174f1"} err="failed to get container status \"ee34c575257d01581af5973462b336155233c634cf235cab5c2bdd649b3174f1\": rpc error: code = NotFound desc = could not find container \"ee34c575257d01581af5973462b336155233c634cf235cab5c2bdd649b3174f1\": container with ID starting with ee34c575257d01581af5973462b336155233c634cf235cab5c2bdd649b3174f1 not found: ID does not exist" Feb 19 22:43:11 crc kubenswrapper[4795]: I0219 22:43:11.526669 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d5613ba-3d64-49d7-bba7-7b828e1c2948" path="/var/lib/kubelet/pods/2d5613ba-3d64-49d7-bba7-7b828e1c2948/volumes" Feb 19 22:43:19 crc kubenswrapper[4795]: I0219 22:43:19.843076 4795 scope.go:117] "RemoveContainer" containerID="2e1a23cb785a07f6fbead34d3a07564cd041d28b6cd22a143a851674ad80e799" Feb 19 22:43:19 crc kubenswrapper[4795]: E0219 22:43:19.845090 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:43:33 crc kubenswrapper[4795]: I0219 22:43:33.514492 
4795 scope.go:117] "RemoveContainer" containerID="2e1a23cb785a07f6fbead34d3a07564cd041d28b6cd22a143a851674ad80e799" Feb 19 22:43:33 crc kubenswrapper[4795]: E0219 22:43:33.515505 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:43:45 crc kubenswrapper[4795]: I0219 22:43:45.512587 4795 scope.go:117] "RemoveContainer" containerID="2e1a23cb785a07f6fbead34d3a07564cd041d28b6cd22a143a851674ad80e799" Feb 19 22:43:45 crc kubenswrapper[4795]: E0219 22:43:45.513768 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:43:59 crc kubenswrapper[4795]: I0219 22:43:59.516845 4795 scope.go:117] "RemoveContainer" containerID="2e1a23cb785a07f6fbead34d3a07564cd041d28b6cd22a143a851674ad80e799" Feb 19 22:43:59 crc kubenswrapper[4795]: E0219 22:43:59.517627 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:44:13 crc kubenswrapper[4795]: I0219 
22:44:13.515499 4795 scope.go:117] "RemoveContainer" containerID="2e1a23cb785a07f6fbead34d3a07564cd041d28b6cd22a143a851674ad80e799" Feb 19 22:44:13 crc kubenswrapper[4795]: E0219 22:44:13.516813 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:44:24 crc kubenswrapper[4795]: I0219 22:44:24.512071 4795 scope.go:117] "RemoveContainer" containerID="2e1a23cb785a07f6fbead34d3a07564cd041d28b6cd22a143a851674ad80e799" Feb 19 22:44:24 crc kubenswrapper[4795]: E0219 22:44:24.512716 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:44:28 crc kubenswrapper[4795]: I0219 22:44:28.444503 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7c4c8f55b5-rn4dm"] Feb 19 22:44:28 crc kubenswrapper[4795]: E0219 22:44:28.445095 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d5613ba-3d64-49d7-bba7-7b828e1c2948" containerName="registry-server" Feb 19 22:44:28 crc kubenswrapper[4795]: I0219 22:44:28.445108 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d5613ba-3d64-49d7-bba7-7b828e1c2948" containerName="registry-server" Feb 19 22:44:28 crc kubenswrapper[4795]: E0219 22:44:28.445122 4795 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2d5613ba-3d64-49d7-bba7-7b828e1c2948" containerName="extract-utilities" Feb 19 22:44:28 crc kubenswrapper[4795]: I0219 22:44:28.445128 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d5613ba-3d64-49d7-bba7-7b828e1c2948" containerName="extract-utilities" Feb 19 22:44:28 crc kubenswrapper[4795]: E0219 22:44:28.445144 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d5613ba-3d64-49d7-bba7-7b828e1c2948" containerName="extract-content" Feb 19 22:44:28 crc kubenswrapper[4795]: I0219 22:44:28.445150 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d5613ba-3d64-49d7-bba7-7b828e1c2948" containerName="extract-content" Feb 19 22:44:28 crc kubenswrapper[4795]: I0219 22:44:28.445299 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d5613ba-3d64-49d7-bba7-7b828e1c2948" containerName="registry-server" Feb 19 22:44:28 crc kubenswrapper[4795]: I0219 22:44:28.445944 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c4c8f55b5-rn4dm" Feb 19 22:44:28 crc kubenswrapper[4795]: I0219 22:44:28.448134 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Feb 19 22:44:28 crc kubenswrapper[4795]: I0219 22:44:28.448196 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Feb 19 22:44:28 crc kubenswrapper[4795]: I0219 22:44:28.448659 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Feb 19 22:44:28 crc kubenswrapper[4795]: I0219 22:44:28.449880 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Feb 19 22:44:28 crc kubenswrapper[4795]: I0219 22:44:28.451511 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-bbh84" Feb 19 22:44:28 crc kubenswrapper[4795]: I0219 22:44:28.469713 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/dnsmasq-dns-7c4c8f55b5-rn4dm"] Feb 19 22:44:28 crc kubenswrapper[4795]: I0219 22:44:28.604976 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhmhn\" (UniqueName: \"kubernetes.io/projected/b743a36e-23aa-4a29-b400-a91ed0788bd7-kube-api-access-lhmhn\") pod \"dnsmasq-dns-7c4c8f55b5-rn4dm\" (UID: \"b743a36e-23aa-4a29-b400-a91ed0788bd7\") " pod="openstack/dnsmasq-dns-7c4c8f55b5-rn4dm" Feb 19 22:44:28 crc kubenswrapper[4795]: I0219 22:44:28.605418 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b743a36e-23aa-4a29-b400-a91ed0788bd7-config\") pod \"dnsmasq-dns-7c4c8f55b5-rn4dm\" (UID: \"b743a36e-23aa-4a29-b400-a91ed0788bd7\") " pod="openstack/dnsmasq-dns-7c4c8f55b5-rn4dm" Feb 19 22:44:28 crc kubenswrapper[4795]: I0219 22:44:28.605453 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b743a36e-23aa-4a29-b400-a91ed0788bd7-dns-svc\") pod \"dnsmasq-dns-7c4c8f55b5-rn4dm\" (UID: \"b743a36e-23aa-4a29-b400-a91ed0788bd7\") " pod="openstack/dnsmasq-dns-7c4c8f55b5-rn4dm" Feb 19 22:44:28 crc kubenswrapper[4795]: I0219 22:44:28.706600 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b743a36e-23aa-4a29-b400-a91ed0788bd7-config\") pod \"dnsmasq-dns-7c4c8f55b5-rn4dm\" (UID: \"b743a36e-23aa-4a29-b400-a91ed0788bd7\") " pod="openstack/dnsmasq-dns-7c4c8f55b5-rn4dm" Feb 19 22:44:28 crc kubenswrapper[4795]: I0219 22:44:28.706675 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b743a36e-23aa-4a29-b400-a91ed0788bd7-dns-svc\") pod \"dnsmasq-dns-7c4c8f55b5-rn4dm\" (UID: \"b743a36e-23aa-4a29-b400-a91ed0788bd7\") " 
pod="openstack/dnsmasq-dns-7c4c8f55b5-rn4dm" Feb 19 22:44:28 crc kubenswrapper[4795]: I0219 22:44:28.706745 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhmhn\" (UniqueName: \"kubernetes.io/projected/b743a36e-23aa-4a29-b400-a91ed0788bd7-kube-api-access-lhmhn\") pod \"dnsmasq-dns-7c4c8f55b5-rn4dm\" (UID: \"b743a36e-23aa-4a29-b400-a91ed0788bd7\") " pod="openstack/dnsmasq-dns-7c4c8f55b5-rn4dm" Feb 19 22:44:28 crc kubenswrapper[4795]: I0219 22:44:28.707670 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b743a36e-23aa-4a29-b400-a91ed0788bd7-config\") pod \"dnsmasq-dns-7c4c8f55b5-rn4dm\" (UID: \"b743a36e-23aa-4a29-b400-a91ed0788bd7\") " pod="openstack/dnsmasq-dns-7c4c8f55b5-rn4dm" Feb 19 22:44:28 crc kubenswrapper[4795]: I0219 22:44:28.707734 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b743a36e-23aa-4a29-b400-a91ed0788bd7-dns-svc\") pod \"dnsmasq-dns-7c4c8f55b5-rn4dm\" (UID: \"b743a36e-23aa-4a29-b400-a91ed0788bd7\") " pod="openstack/dnsmasq-dns-7c4c8f55b5-rn4dm" Feb 19 22:44:28 crc kubenswrapper[4795]: I0219 22:44:28.713422 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-589cf688cc-vf5s4"] Feb 19 22:44:28 crc kubenswrapper[4795]: I0219 22:44:28.714990 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-589cf688cc-vf5s4" Feb 19 22:44:28 crc kubenswrapper[4795]: I0219 22:44:28.733810 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-589cf688cc-vf5s4"] Feb 19 22:44:28 crc kubenswrapper[4795]: I0219 22:44:28.740031 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhmhn\" (UniqueName: \"kubernetes.io/projected/b743a36e-23aa-4a29-b400-a91ed0788bd7-kube-api-access-lhmhn\") pod \"dnsmasq-dns-7c4c8f55b5-rn4dm\" (UID: \"b743a36e-23aa-4a29-b400-a91ed0788bd7\") " pod="openstack/dnsmasq-dns-7c4c8f55b5-rn4dm" Feb 19 22:44:28 crc kubenswrapper[4795]: I0219 22:44:28.808579 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-784t4\" (UniqueName: \"kubernetes.io/projected/cc119db7-03db-4838-b663-f244b7f93433-kube-api-access-784t4\") pod \"dnsmasq-dns-589cf688cc-vf5s4\" (UID: \"cc119db7-03db-4838-b663-f244b7f93433\") " pod="openstack/dnsmasq-dns-589cf688cc-vf5s4" Feb 19 22:44:28 crc kubenswrapper[4795]: I0219 22:44:28.808684 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc119db7-03db-4838-b663-f244b7f93433-config\") pod \"dnsmasq-dns-589cf688cc-vf5s4\" (UID: \"cc119db7-03db-4838-b663-f244b7f93433\") " pod="openstack/dnsmasq-dns-589cf688cc-vf5s4" Feb 19 22:44:28 crc kubenswrapper[4795]: I0219 22:44:28.808777 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc119db7-03db-4838-b663-f244b7f93433-dns-svc\") pod \"dnsmasq-dns-589cf688cc-vf5s4\" (UID: \"cc119db7-03db-4838-b663-f244b7f93433\") " pod="openstack/dnsmasq-dns-589cf688cc-vf5s4" Feb 19 22:44:28 crc kubenswrapper[4795]: I0219 22:44:28.814383 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c4c8f55b5-rn4dm" Feb 19 22:44:28 crc kubenswrapper[4795]: I0219 22:44:28.910586 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-784t4\" (UniqueName: \"kubernetes.io/projected/cc119db7-03db-4838-b663-f244b7f93433-kube-api-access-784t4\") pod \"dnsmasq-dns-589cf688cc-vf5s4\" (UID: \"cc119db7-03db-4838-b663-f244b7f93433\") " pod="openstack/dnsmasq-dns-589cf688cc-vf5s4" Feb 19 22:44:28 crc kubenswrapper[4795]: I0219 22:44:28.910656 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc119db7-03db-4838-b663-f244b7f93433-config\") pod \"dnsmasq-dns-589cf688cc-vf5s4\" (UID: \"cc119db7-03db-4838-b663-f244b7f93433\") " pod="openstack/dnsmasq-dns-589cf688cc-vf5s4" Feb 19 22:44:28 crc kubenswrapper[4795]: I0219 22:44:28.910683 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc119db7-03db-4838-b663-f244b7f93433-dns-svc\") pod \"dnsmasq-dns-589cf688cc-vf5s4\" (UID: \"cc119db7-03db-4838-b663-f244b7f93433\") " pod="openstack/dnsmasq-dns-589cf688cc-vf5s4" Feb 19 22:44:28 crc kubenswrapper[4795]: I0219 22:44:28.912054 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc119db7-03db-4838-b663-f244b7f93433-dns-svc\") pod \"dnsmasq-dns-589cf688cc-vf5s4\" (UID: \"cc119db7-03db-4838-b663-f244b7f93433\") " pod="openstack/dnsmasq-dns-589cf688cc-vf5s4" Feb 19 22:44:28 crc kubenswrapper[4795]: I0219 22:44:28.912109 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc119db7-03db-4838-b663-f244b7f93433-config\") pod \"dnsmasq-dns-589cf688cc-vf5s4\" (UID: \"cc119db7-03db-4838-b663-f244b7f93433\") " pod="openstack/dnsmasq-dns-589cf688cc-vf5s4" Feb 19 22:44:28 crc kubenswrapper[4795]: I0219 
22:44:28.933127 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-784t4\" (UniqueName: \"kubernetes.io/projected/cc119db7-03db-4838-b663-f244b7f93433-kube-api-access-784t4\") pod \"dnsmasq-dns-589cf688cc-vf5s4\" (UID: \"cc119db7-03db-4838-b663-f244b7f93433\") " pod="openstack/dnsmasq-dns-589cf688cc-vf5s4" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.029633 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-589cf688cc-vf5s4" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.270193 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c4c8f55b5-rn4dm"] Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.483000 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-589cf688cc-vf5s4"] Feb 19 22:44:29 crc kubenswrapper[4795]: W0219 22:44:29.524790 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcc119db7_03db_4838_b663_f244b7f93433.slice/crio-8e504bc8e8afcb8d881cb268684b0b1a7bedc4275f260a1ee4b205b460ca8f3e WatchSource:0}: Error finding container 8e504bc8e8afcb8d881cb268684b0b1a7bedc4275f260a1ee4b205b460ca8f3e: Status 404 returned error can't find the container with id 8e504bc8e8afcb8d881cb268684b0b1a7bedc4275f260a1ee4b205b460ca8f3e Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.590912 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.592255 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.593765 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.603669 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.603671 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.603733 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.604029 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-rbx8c" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.604486 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.723258 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9cd04173-2975-46bd-8602-f6561387d717-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"9cd04173-2975-46bd-8602-f6561387d717\") " pod="openstack/rabbitmq-server-0" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.723304 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9cd04173-2975-46bd-8602-f6561387d717-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"9cd04173-2975-46bd-8602-f6561387d717\") " pod="openstack/rabbitmq-server-0" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.723325 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9cd04173-2975-46bd-8602-f6561387d717-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"9cd04173-2975-46bd-8602-f6561387d717\") " pod="openstack/rabbitmq-server-0" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.723454 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9cd04173-2975-46bd-8602-f6561387d717-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"9cd04173-2975-46bd-8602-f6561387d717\") " pod="openstack/rabbitmq-server-0" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.723509 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9cd04173-2975-46bd-8602-f6561387d717-server-conf\") pod \"rabbitmq-server-0\" (UID: \"9cd04173-2975-46bd-8602-f6561387d717\") " pod="openstack/rabbitmq-server-0" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.723527 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnbmk\" (UniqueName: \"kubernetes.io/projected/9cd04173-2975-46bd-8602-f6561387d717-kube-api-access-wnbmk\") pod \"rabbitmq-server-0\" (UID: \"9cd04173-2975-46bd-8602-f6561387d717\") " pod="openstack/rabbitmq-server-0" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.723558 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-02d3a85d-dead-46fd-a32b-ccf61c027c3f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-02d3a85d-dead-46fd-a32b-ccf61c027c3f\") pod \"rabbitmq-server-0\" (UID: \"9cd04173-2975-46bd-8602-f6561387d717\") " pod="openstack/rabbitmq-server-0" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.723635 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9cd04173-2975-46bd-8602-f6561387d717-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"9cd04173-2975-46bd-8602-f6561387d717\") " pod="openstack/rabbitmq-server-0" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.723701 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9cd04173-2975-46bd-8602-f6561387d717-pod-info\") pod \"rabbitmq-server-0\" (UID: \"9cd04173-2975-46bd-8602-f6561387d717\") " pod="openstack/rabbitmq-server-0" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.749790 4795 generic.go:334] "Generic (PLEG): container finished" podID="b743a36e-23aa-4a29-b400-a91ed0788bd7" containerID="e20ade87f2be3af18d652fc7a8d168566bbd9d8f1fa0af3e00e5b04add98e2f7" exitCode=0 Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.749880 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c4c8f55b5-rn4dm" event={"ID":"b743a36e-23aa-4a29-b400-a91ed0788bd7","Type":"ContainerDied","Data":"e20ade87f2be3af18d652fc7a8d168566bbd9d8f1fa0af3e00e5b04add98e2f7"} Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.749913 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c4c8f55b5-rn4dm" event={"ID":"b743a36e-23aa-4a29-b400-a91ed0788bd7","Type":"ContainerStarted","Data":"01c621c9ebf25431f41781d9a945a324e3f1e0ba1f3afbd4aaf02e91fa196557"} Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.751385 4795 generic.go:334] "Generic (PLEG): container finished" podID="cc119db7-03db-4838-b663-f244b7f93433" containerID="6c19182bc802a37ab3a253eaa33c548a0f3f15e1ee3c00a8c016db959a5b8557" exitCode=0 Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.751444 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589cf688cc-vf5s4" 
event={"ID":"cc119db7-03db-4838-b663-f244b7f93433","Type":"ContainerDied","Data":"6c19182bc802a37ab3a253eaa33c548a0f3f15e1ee3c00a8c016db959a5b8557"} Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.751460 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589cf688cc-vf5s4" event={"ID":"cc119db7-03db-4838-b663-f244b7f93433","Type":"ContainerStarted","Data":"8e504bc8e8afcb8d881cb268684b0b1a7bedc4275f260a1ee4b205b460ca8f3e"} Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.826326 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-02d3a85d-dead-46fd-a32b-ccf61c027c3f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-02d3a85d-dead-46fd-a32b-ccf61c027c3f\") pod \"rabbitmq-server-0\" (UID: \"9cd04173-2975-46bd-8602-f6561387d717\") " pod="openstack/rabbitmq-server-0" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.826506 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9cd04173-2975-46bd-8602-f6561387d717-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"9cd04173-2975-46bd-8602-f6561387d717\") " pod="openstack/rabbitmq-server-0" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.826635 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9cd04173-2975-46bd-8602-f6561387d717-pod-info\") pod \"rabbitmq-server-0\" (UID: \"9cd04173-2975-46bd-8602-f6561387d717\") " pod="openstack/rabbitmq-server-0" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.826787 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9cd04173-2975-46bd-8602-f6561387d717-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"9cd04173-2975-46bd-8602-f6561387d717\") " pod="openstack/rabbitmq-server-0" Feb 19 
22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.826843 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9cd04173-2975-46bd-8602-f6561387d717-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"9cd04173-2975-46bd-8602-f6561387d717\") " pod="openstack/rabbitmq-server-0" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.826901 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9cd04173-2975-46bd-8602-f6561387d717-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"9cd04173-2975-46bd-8602-f6561387d717\") " pod="openstack/rabbitmq-server-0" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.827032 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9cd04173-2975-46bd-8602-f6561387d717-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"9cd04173-2975-46bd-8602-f6561387d717\") " pod="openstack/rabbitmq-server-0" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.827133 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9cd04173-2975-46bd-8602-f6561387d717-server-conf\") pod \"rabbitmq-server-0\" (UID: \"9cd04173-2975-46bd-8602-f6561387d717\") " pod="openstack/rabbitmq-server-0" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.827212 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnbmk\" (UniqueName: \"kubernetes.io/projected/9cd04173-2975-46bd-8602-f6561387d717-kube-api-access-wnbmk\") pod \"rabbitmq-server-0\" (UID: \"9cd04173-2975-46bd-8602-f6561387d717\") " pod="openstack/rabbitmq-server-0" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.828605 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9cd04173-2975-46bd-8602-f6561387d717-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"9cd04173-2975-46bd-8602-f6561387d717\") " pod="openstack/rabbitmq-server-0" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.829273 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9cd04173-2975-46bd-8602-f6561387d717-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"9cd04173-2975-46bd-8602-f6561387d717\") " pod="openstack/rabbitmq-server-0" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.830786 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9cd04173-2975-46bd-8602-f6561387d717-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"9cd04173-2975-46bd-8602-f6561387d717\") " pod="openstack/rabbitmq-server-0" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.833296 4795 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.833328 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-02d3a85d-dead-46fd-a32b-ccf61c027c3f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-02d3a85d-dead-46fd-a32b-ccf61c027c3f\") pod \"rabbitmq-server-0\" (UID: \"9cd04173-2975-46bd-8602-f6561387d717\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/68b77c70021788ffaf78dfe86ddece9c7d3d5c9cffb40f46dc15f8b79fc094aa/globalmount\"" pod="openstack/rabbitmq-server-0" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.833772 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9cd04173-2975-46bd-8602-f6561387d717-server-conf\") pod \"rabbitmq-server-0\" (UID: \"9cd04173-2975-46bd-8602-f6561387d717\") " pod="openstack/rabbitmq-server-0" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.835879 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9cd04173-2975-46bd-8602-f6561387d717-pod-info\") pod \"rabbitmq-server-0\" (UID: \"9cd04173-2975-46bd-8602-f6561387d717\") " pod="openstack/rabbitmq-server-0" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.835993 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9cd04173-2975-46bd-8602-f6561387d717-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"9cd04173-2975-46bd-8602-f6561387d717\") " pod="openstack/rabbitmq-server-0" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.836279 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9cd04173-2975-46bd-8602-f6561387d717-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"9cd04173-2975-46bd-8602-f6561387d717\") " 
pod="openstack/rabbitmq-server-0" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.851687 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnbmk\" (UniqueName: \"kubernetes.io/projected/9cd04173-2975-46bd-8602-f6561387d717-kube-api-access-wnbmk\") pod \"rabbitmq-server-0\" (UID: \"9cd04173-2975-46bd-8602-f6561387d717\") " pod="openstack/rabbitmq-server-0" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.908637 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-02d3a85d-dead-46fd-a32b-ccf61c027c3f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-02d3a85d-dead-46fd-a32b-ccf61c027c3f\") pod \"rabbitmq-server-0\" (UID: \"9cd04173-2975-46bd-8602-f6561387d717\") " pod="openstack/rabbitmq-server-0" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.908664 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.914849 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.919985 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.920051 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.920213 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.920587 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.922607 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-rh88g" Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.928818 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 22:44:29 crc kubenswrapper[4795]: I0219 22:44:29.955846 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 19 22:44:30 crc kubenswrapper[4795]: I0219 22:44:30.030471 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:44:30 crc kubenswrapper[4795]: I0219 22:44:30.030873 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:44:30 crc kubenswrapper[4795]: I0219 22:44:30.031115 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:44:30 crc kubenswrapper[4795]: I0219 22:44:30.031317 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:44:30 crc kubenswrapper[4795]: I0219 22:44:30.031477 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24vlt\" (UniqueName: \"kubernetes.io/projected/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-kube-api-access-24vlt\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:44:30 crc kubenswrapper[4795]: I0219 22:44:30.031625 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d48a63ad-4da2-47a4-b3c8-cad5fc9646bb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d48a63ad-4da2-47a4-b3c8-cad5fc9646bb\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:44:30 crc kubenswrapper[4795]: I0219 22:44:30.031830 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:44:30 crc kubenswrapper[4795]: I0219 22:44:30.031989 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:44:30 crc kubenswrapper[4795]: I0219 22:44:30.032122 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:44:30 crc kubenswrapper[4795]: I0219 22:44:30.133839 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-rabbitmq-plugins\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:44:30 crc kubenswrapper[4795]: I0219 22:44:30.133882 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:44:30 crc kubenswrapper[4795]: I0219 22:44:30.135224 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:44:30 crc kubenswrapper[4795]: I0219 22:44:30.135275 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:44:30 crc kubenswrapper[4795]: I0219 22:44:30.135309 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:44:30 crc kubenswrapper[4795]: I0219 22:44:30.135365 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\") " 
pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:44:30 crc kubenswrapper[4795]: I0219 22:44:30.135409 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:44:30 crc kubenswrapper[4795]: I0219 22:44:30.135453 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24vlt\" (UniqueName: \"kubernetes.io/projected/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-kube-api-access-24vlt\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:44:30 crc kubenswrapper[4795]: I0219 22:44:30.135476 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d48a63ad-4da2-47a4-b3c8-cad5fc9646bb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d48a63ad-4da2-47a4-b3c8-cad5fc9646bb\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:44:30 crc kubenswrapper[4795]: I0219 22:44:30.136886 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:44:30 crc kubenswrapper[4795]: I0219 22:44:30.137257 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:44:30 
crc kubenswrapper[4795]: I0219 22:44:30.138083 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:44:30 crc kubenswrapper[4795]: I0219 22:44:30.138437 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:44:30 crc kubenswrapper[4795]: I0219 22:44:30.142804 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:44:30 crc kubenswrapper[4795]: I0219 22:44:30.149303 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:44:30 crc kubenswrapper[4795]: I0219 22:44:30.149514 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:44:30 crc kubenswrapper[4795]: I0219 22:44:30.150311 4795 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice 
STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 19 22:44:30 crc kubenswrapper[4795]: I0219 22:44:30.150354 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d48a63ad-4da2-47a4-b3c8-cad5fc9646bb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d48a63ad-4da2-47a4-b3c8-cad5fc9646bb\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/ce7e4e38eafbc5e4f450d6daa37e1c166de00b3db9bc92a3ff92b374f626b390/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:44:30 crc kubenswrapper[4795]: I0219 22:44:30.168308 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24vlt\" (UniqueName: \"kubernetes.io/projected/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-kube-api-access-24vlt\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:44:30 crc kubenswrapper[4795]: I0219 22:44:30.186353 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d48a63ad-4da2-47a4-b3c8-cad5fc9646bb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d48a63ad-4da2-47a4-b3c8-cad5fc9646bb\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:44:30 crc kubenswrapper[4795]: I0219 22:44:30.242972 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:44:30 crc kubenswrapper[4795]: I0219 22:44:30.418051 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 19 22:44:30 crc kubenswrapper[4795]: W0219 22:44:30.421956 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9cd04173_2975_46bd_8602_f6561387d717.slice/crio-8ac632b756081c55272b5e493bca0401eb5a5d61edfe7cea78f2545acf8b31bd WatchSource:0}: Error finding container 8ac632b756081c55272b5e493bca0401eb5a5d61edfe7cea78f2545acf8b31bd: Status 404 returned error can't find the container with id 8ac632b756081c55272b5e493bca0401eb5a5d61edfe7cea78f2545acf8b31bd Feb 19 22:44:30 crc kubenswrapper[4795]: I0219 22:44:30.658735 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 22:44:30 crc kubenswrapper[4795]: W0219 22:44:30.663491 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd43ca2d_c7e3_4fc7_84a5_74b50cadd268.slice/crio-e602b9b95a59c91b0341f4b50dac51b129fce0661e01cc211b5bddd7e49efb66 WatchSource:0}: Error finding container e602b9b95a59c91b0341f4b50dac51b129fce0661e01cc211b5bddd7e49efb66: Status 404 returned error can't find the container with id e602b9b95a59c91b0341f4b50dac51b129fce0661e01cc211b5bddd7e49efb66 Feb 19 22:44:30 crc kubenswrapper[4795]: I0219 22:44:30.769733 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c4c8f55b5-rn4dm" event={"ID":"b743a36e-23aa-4a29-b400-a91ed0788bd7","Type":"ContainerStarted","Data":"72948b55c5297e74f47fec1c5b0abad9248f67622c9adf8cda348359b3335712"} Feb 19 22:44:30 crc kubenswrapper[4795]: I0219 22:44:30.770189 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7c4c8f55b5-rn4dm" Feb 19 22:44:30 crc kubenswrapper[4795]: I0219 
22:44:30.772904 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9cd04173-2975-46bd-8602-f6561387d717","Type":"ContainerStarted","Data":"8ac632b756081c55272b5e493bca0401eb5a5d61edfe7cea78f2545acf8b31bd"} Feb 19 22:44:30 crc kubenswrapper[4795]: I0219 22:44:30.778918 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589cf688cc-vf5s4" event={"ID":"cc119db7-03db-4838-b663-f244b7f93433","Type":"ContainerStarted","Data":"64136e69fe434b4b0bfaa8f633f2393606df65c80063c9e28ce2ef61c86da556"} Feb 19 22:44:30 crc kubenswrapper[4795]: I0219 22:44:30.779133 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-589cf688cc-vf5s4" Feb 19 22:44:30 crc kubenswrapper[4795]: I0219 22:44:30.780677 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268","Type":"ContainerStarted","Data":"e602b9b95a59c91b0341f4b50dac51b129fce0661e01cc211b5bddd7e49efb66"} Feb 19 22:44:30 crc kubenswrapper[4795]: I0219 22:44:30.795714 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7c4c8f55b5-rn4dm" podStartSLOduration=2.795685182 podStartE2EDuration="2.795685182s" podCreationTimestamp="2026-02-19 22:44:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:44:30.791212003 +0000 UTC m=+4581.983729857" watchObservedRunningTime="2026-02-19 22:44:30.795685182 +0000 UTC m=+4581.988203076" Feb 19 22:44:30 crc kubenswrapper[4795]: I0219 22:44:30.810092 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-589cf688cc-vf5s4" podStartSLOduration=2.8100759760000003 podStartE2EDuration="2.810075976s" podCreationTimestamp="2026-02-19 22:44:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:44:30.806532534 +0000 UTC m=+4581.999050398" watchObservedRunningTime="2026-02-19 22:44:30.810075976 +0000 UTC m=+4582.002593830" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.244866 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.246876 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.249783 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-phh6q" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.250058 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.250072 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.250878 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.260343 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.271988 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.352444 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24345708-df30-4486-bc7e-44eaa7722ffd-operator-scripts\") pod \"openstack-galera-0\" (UID: \"24345708-df30-4486-bc7e-44eaa7722ffd\") " pod="openstack/openstack-galera-0" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.352558 4795 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/24345708-df30-4486-bc7e-44eaa7722ffd-kolla-config\") pod \"openstack-galera-0\" (UID: \"24345708-df30-4486-bc7e-44eaa7722ffd\") " pod="openstack/openstack-galera-0" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.352610 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxqtp\" (UniqueName: \"kubernetes.io/projected/24345708-df30-4486-bc7e-44eaa7722ffd-kube-api-access-cxqtp\") pod \"openstack-galera-0\" (UID: \"24345708-df30-4486-bc7e-44eaa7722ffd\") " pod="openstack/openstack-galera-0" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.352643 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/24345708-df30-4486-bc7e-44eaa7722ffd-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"24345708-df30-4486-bc7e-44eaa7722ffd\") " pod="openstack/openstack-galera-0" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.352682 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/24345708-df30-4486-bc7e-44eaa7722ffd-config-data-generated\") pod \"openstack-galera-0\" (UID: \"24345708-df30-4486-bc7e-44eaa7722ffd\") " pod="openstack/openstack-galera-0" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.352708 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/24345708-df30-4486-bc7e-44eaa7722ffd-config-data-default\") pod \"openstack-galera-0\" (UID: \"24345708-df30-4486-bc7e-44eaa7722ffd\") " pod="openstack/openstack-galera-0" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.352803 4795 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24345708-df30-4486-bc7e-44eaa7722ffd-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"24345708-df30-4486-bc7e-44eaa7722ffd\") " pod="openstack/openstack-galera-0" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.352936 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-f83c3f4a-c115-4d4b-8143-ecab40616954\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f83c3f4a-c115-4d4b-8143-ecab40616954\") pod \"openstack-galera-0\" (UID: \"24345708-df30-4486-bc7e-44eaa7722ffd\") " pod="openstack/openstack-galera-0" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.454713 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24345708-df30-4486-bc7e-44eaa7722ffd-operator-scripts\") pod \"openstack-galera-0\" (UID: \"24345708-df30-4486-bc7e-44eaa7722ffd\") " pod="openstack/openstack-galera-0" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.454774 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/24345708-df30-4486-bc7e-44eaa7722ffd-kolla-config\") pod \"openstack-galera-0\" (UID: \"24345708-df30-4486-bc7e-44eaa7722ffd\") " pod="openstack/openstack-galera-0" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.454803 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxqtp\" (UniqueName: \"kubernetes.io/projected/24345708-df30-4486-bc7e-44eaa7722ffd-kube-api-access-cxqtp\") pod \"openstack-galera-0\" (UID: \"24345708-df30-4486-bc7e-44eaa7722ffd\") " pod="openstack/openstack-galera-0" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.454820 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/24345708-df30-4486-bc7e-44eaa7722ffd-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"24345708-df30-4486-bc7e-44eaa7722ffd\") " pod="openstack/openstack-galera-0" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.454850 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/24345708-df30-4486-bc7e-44eaa7722ffd-config-data-generated\") pod \"openstack-galera-0\" (UID: \"24345708-df30-4486-bc7e-44eaa7722ffd\") " pod="openstack/openstack-galera-0" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.454873 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/24345708-df30-4486-bc7e-44eaa7722ffd-config-data-default\") pod \"openstack-galera-0\" (UID: \"24345708-df30-4486-bc7e-44eaa7722ffd\") " pod="openstack/openstack-galera-0" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.454894 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24345708-df30-4486-bc7e-44eaa7722ffd-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"24345708-df30-4486-bc7e-44eaa7722ffd\") " pod="openstack/openstack-galera-0" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.454936 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-f83c3f4a-c115-4d4b-8143-ecab40616954\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f83c3f4a-c115-4d4b-8143-ecab40616954\") pod \"openstack-galera-0\" (UID: \"24345708-df30-4486-bc7e-44eaa7722ffd\") " pod="openstack/openstack-galera-0" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.455776 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/24345708-df30-4486-bc7e-44eaa7722ffd-config-data-generated\") pod \"openstack-galera-0\" (UID: \"24345708-df30-4486-bc7e-44eaa7722ffd\") " pod="openstack/openstack-galera-0" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.456436 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/24345708-df30-4486-bc7e-44eaa7722ffd-kolla-config\") pod \"openstack-galera-0\" (UID: \"24345708-df30-4486-bc7e-44eaa7722ffd\") " pod="openstack/openstack-galera-0" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.456553 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/24345708-df30-4486-bc7e-44eaa7722ffd-config-data-default\") pod \"openstack-galera-0\" (UID: \"24345708-df30-4486-bc7e-44eaa7722ffd\") " pod="openstack/openstack-galera-0" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.456864 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24345708-df30-4486-bc7e-44eaa7722ffd-operator-scripts\") pod \"openstack-galera-0\" (UID: \"24345708-df30-4486-bc7e-44eaa7722ffd\") " pod="openstack/openstack-galera-0" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.460706 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/24345708-df30-4486-bc7e-44eaa7722ffd-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"24345708-df30-4486-bc7e-44eaa7722ffd\") " pod="openstack/openstack-galera-0" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.463910 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24345708-df30-4486-bc7e-44eaa7722ffd-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"24345708-df30-4486-bc7e-44eaa7722ffd\") " 
pod="openstack/openstack-galera-0" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.464060 4795 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.464144 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-f83c3f4a-c115-4d4b-8143-ecab40616954\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f83c3f4a-c115-4d4b-8143-ecab40616954\") pod \"openstack-galera-0\" (UID: \"24345708-df30-4486-bc7e-44eaa7722ffd\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1165ef74fbcf4a9deee3143388ab045d5c4f0facc83de427042e5c5065a01ca4/globalmount\"" pod="openstack/openstack-galera-0" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.476104 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxqtp\" (UniqueName: \"kubernetes.io/projected/24345708-df30-4486-bc7e-44eaa7722ffd-kube-api-access-cxqtp\") pod \"openstack-galera-0\" (UID: \"24345708-df30-4486-bc7e-44eaa7722ffd\") " pod="openstack/openstack-galera-0" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.494173 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-f83c3f4a-c115-4d4b-8143-ecab40616954\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f83c3f4a-c115-4d4b-8143-ecab40616954\") pod \"openstack-galera-0\" (UID: \"24345708-df30-4486-bc7e-44eaa7722ffd\") " pod="openstack/openstack-galera-0" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.572791 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.631092 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.631959 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.635663 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-r89zr" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.636914 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.641804 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.765944 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3d516d65-1efc-42ee-ab17-971e2d94e4a7-kolla-config\") pod \"memcached-0\" (UID: \"3d516d65-1efc-42ee-ab17-971e2d94e4a7\") " pod="openstack/memcached-0" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.766339 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3d516d65-1efc-42ee-ab17-971e2d94e4a7-config-data\") pod \"memcached-0\" (UID: \"3d516d65-1efc-42ee-ab17-971e2d94e4a7\") " pod="openstack/memcached-0" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.766361 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxdjr\" (UniqueName: \"kubernetes.io/projected/3d516d65-1efc-42ee-ab17-971e2d94e4a7-kube-api-access-bxdjr\") pod \"memcached-0\" (UID: \"3d516d65-1efc-42ee-ab17-971e2d94e4a7\") " pod="openstack/memcached-0" Feb 19 
22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.788395 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9cd04173-2975-46bd-8602-f6561387d717","Type":"ContainerStarted","Data":"ef3bf36d2dad06468cc4add7eb9c2429b178c38643511387b593eff67aac784a"} Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.792793 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268","Type":"ContainerStarted","Data":"48d9ac5474c94a11a6da74ec61c602260d05f7961e1a831acf6b59d650ddb95f"} Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.867563 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3d516d65-1efc-42ee-ab17-971e2d94e4a7-kolla-config\") pod \"memcached-0\" (UID: \"3d516d65-1efc-42ee-ab17-971e2d94e4a7\") " pod="openstack/memcached-0" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.867698 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3d516d65-1efc-42ee-ab17-971e2d94e4a7-config-data\") pod \"memcached-0\" (UID: \"3d516d65-1efc-42ee-ab17-971e2d94e4a7\") " pod="openstack/memcached-0" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.867727 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxdjr\" (UniqueName: \"kubernetes.io/projected/3d516d65-1efc-42ee-ab17-971e2d94e4a7-kube-api-access-bxdjr\") pod \"memcached-0\" (UID: \"3d516d65-1efc-42ee-ab17-971e2d94e4a7\") " pod="openstack/memcached-0" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.868962 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3d516d65-1efc-42ee-ab17-971e2d94e4a7-kolla-config\") pod \"memcached-0\" (UID: \"3d516d65-1efc-42ee-ab17-971e2d94e4a7\") " 
pod="openstack/memcached-0" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.869029 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3d516d65-1efc-42ee-ab17-971e2d94e4a7-config-data\") pod \"memcached-0\" (UID: \"3d516d65-1efc-42ee-ab17-971e2d94e4a7\") " pod="openstack/memcached-0" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.884066 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxdjr\" (UniqueName: \"kubernetes.io/projected/3d516d65-1efc-42ee-ab17-971e2d94e4a7-kube-api-access-bxdjr\") pod \"memcached-0\" (UID: \"3d516d65-1efc-42ee-ab17-971e2d94e4a7\") " pod="openstack/memcached-0" Feb 19 22:44:31 crc kubenswrapper[4795]: I0219 22:44:31.991992 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 19 22:44:32 crc kubenswrapper[4795]: I0219 22:44:32.102864 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 19 22:44:32 crc kubenswrapper[4795]: W0219 22:44:32.111764 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24345708_df30_4486_bc7e_44eaa7722ffd.slice/crio-cf85a9ef7b807f9974ab2cbe53b98edbf6af52cd1621fe033355c6cbf1228f54 WatchSource:0}: Error finding container cf85a9ef7b807f9974ab2cbe53b98edbf6af52cd1621fe033355c6cbf1228f54: Status 404 returned error can't find the container with id cf85a9ef7b807f9974ab2cbe53b98edbf6af52cd1621fe033355c6cbf1228f54 Feb 19 22:44:32 crc kubenswrapper[4795]: I0219 22:44:32.459982 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 19 22:44:32 crc kubenswrapper[4795]: I0219 22:44:32.719439 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 19 22:44:32 crc kubenswrapper[4795]: I0219 22:44:32.720600 4795 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 19 22:44:32 crc kubenswrapper[4795]: I0219 22:44:32.724330 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-dpr8j" Feb 19 22:44:32 crc kubenswrapper[4795]: I0219 22:44:32.725423 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Feb 19 22:44:32 crc kubenswrapper[4795]: I0219 22:44:32.725695 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Feb 19 22:44:32 crc kubenswrapper[4795]: I0219 22:44:32.729664 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Feb 19 22:44:32 crc kubenswrapper[4795]: I0219 22:44:32.745439 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 19 22:44:32 crc kubenswrapper[4795]: I0219 22:44:32.784371 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8f55130-d799-45ef-b174-450b6c3b52ff-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"c8f55130-d799-45ef-b174-450b6c3b52ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 22:44:32 crc kubenswrapper[4795]: I0219 22:44:32.784520 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c8f55130-d799-45ef-b174-450b6c3b52ff-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"c8f55130-d799-45ef-b174-450b6c3b52ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 22:44:32 crc kubenswrapper[4795]: I0219 22:44:32.784555 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/c8f55130-d799-45ef-b174-450b6c3b52ff-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"c8f55130-d799-45ef-b174-450b6c3b52ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 22:44:32 crc kubenswrapper[4795]: I0219 22:44:32.784595 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a8efa8ec-ded9-4ef5-a87c-fbff73e9dea7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a8efa8ec-ded9-4ef5-a87c-fbff73e9dea7\") pod \"openstack-cell1-galera-0\" (UID: \"c8f55130-d799-45ef-b174-450b6c3b52ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 22:44:32 crc kubenswrapper[4795]: I0219 22:44:32.784641 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8f55130-d799-45ef-b174-450b6c3b52ff-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"c8f55130-d799-45ef-b174-450b6c3b52ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 22:44:32 crc kubenswrapper[4795]: I0219 22:44:32.784698 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8f55130-d799-45ef-b174-450b6c3b52ff-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"c8f55130-d799-45ef-b174-450b6c3b52ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 22:44:32 crc kubenswrapper[4795]: I0219 22:44:32.784738 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56864\" (UniqueName: \"kubernetes.io/projected/c8f55130-d799-45ef-b174-450b6c3b52ff-kube-api-access-56864\") pod \"openstack-cell1-galera-0\" (UID: \"c8f55130-d799-45ef-b174-450b6c3b52ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 22:44:32 crc kubenswrapper[4795]: I0219 22:44:32.784809 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c8f55130-d799-45ef-b174-450b6c3b52ff-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"c8f55130-d799-45ef-b174-450b6c3b52ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 22:44:32 crc kubenswrapper[4795]: I0219 22:44:32.802648 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"3d516d65-1efc-42ee-ab17-971e2d94e4a7","Type":"ContainerStarted","Data":"542e1c232194402e8e8b9ea4bb6f613c9c7838e74833e76b1c80ddf15722c8b6"} Feb 19 22:44:32 crc kubenswrapper[4795]: I0219 22:44:32.812884 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"24345708-df30-4486-bc7e-44eaa7722ffd","Type":"ContainerStarted","Data":"98338aaa8b7058583c69c89378f49b27fd375f134eb71210d2b4a7d131e400c2"} Feb 19 22:44:32 crc kubenswrapper[4795]: I0219 22:44:32.812935 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"24345708-df30-4486-bc7e-44eaa7722ffd","Type":"ContainerStarted","Data":"cf85a9ef7b807f9974ab2cbe53b98edbf6af52cd1621fe033355c6cbf1228f54"} Feb 19 22:44:32 crc kubenswrapper[4795]: I0219 22:44:32.896929 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a8efa8ec-ded9-4ef5-a87c-fbff73e9dea7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a8efa8ec-ded9-4ef5-a87c-fbff73e9dea7\") pod \"openstack-cell1-galera-0\" (UID: \"c8f55130-d799-45ef-b174-450b6c3b52ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 22:44:32 crc kubenswrapper[4795]: I0219 22:44:32.896987 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8f55130-d799-45ef-b174-450b6c3b52ff-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"c8f55130-d799-45ef-b174-450b6c3b52ff\") " 
pod="openstack/openstack-cell1-galera-0" Feb 19 22:44:32 crc kubenswrapper[4795]: I0219 22:44:32.897074 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8f55130-d799-45ef-b174-450b6c3b52ff-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"c8f55130-d799-45ef-b174-450b6c3b52ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 22:44:32 crc kubenswrapper[4795]: I0219 22:44:32.897119 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56864\" (UniqueName: \"kubernetes.io/projected/c8f55130-d799-45ef-b174-450b6c3b52ff-kube-api-access-56864\") pod \"openstack-cell1-galera-0\" (UID: \"c8f55130-d799-45ef-b174-450b6c3b52ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 22:44:32 crc kubenswrapper[4795]: I0219 22:44:32.897198 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c8f55130-d799-45ef-b174-450b6c3b52ff-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"c8f55130-d799-45ef-b174-450b6c3b52ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 22:44:32 crc kubenswrapper[4795]: I0219 22:44:32.897530 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8f55130-d799-45ef-b174-450b6c3b52ff-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"c8f55130-d799-45ef-b174-450b6c3b52ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 22:44:32 crc kubenswrapper[4795]: I0219 22:44:32.897697 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c8f55130-d799-45ef-b174-450b6c3b52ff-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"c8f55130-d799-45ef-b174-450b6c3b52ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 22:44:32 crc 
kubenswrapper[4795]: I0219 22:44:32.897729 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c8f55130-d799-45ef-b174-450b6c3b52ff-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"c8f55130-d799-45ef-b174-450b6c3b52ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 22:44:32 crc kubenswrapper[4795]: I0219 22:44:32.901598 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c8f55130-d799-45ef-b174-450b6c3b52ff-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"c8f55130-d799-45ef-b174-450b6c3b52ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 22:44:32 crc kubenswrapper[4795]: I0219 22:44:32.903641 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c8f55130-d799-45ef-b174-450b6c3b52ff-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"c8f55130-d799-45ef-b174-450b6c3b52ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 22:44:32 crc kubenswrapper[4795]: I0219 22:44:32.905666 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8f55130-d799-45ef-b174-450b6c3b52ff-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"c8f55130-d799-45ef-b174-450b6c3b52ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 22:44:32 crc kubenswrapper[4795]: I0219 22:44:32.909367 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8f55130-d799-45ef-b174-450b6c3b52ff-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"c8f55130-d799-45ef-b174-450b6c3b52ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 22:44:32 crc kubenswrapper[4795]: I0219 22:44:32.909607 4795 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8f55130-d799-45ef-b174-450b6c3b52ff-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"c8f55130-d799-45ef-b174-450b6c3b52ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 22:44:32 crc kubenswrapper[4795]: I0219 22:44:32.917503 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c8f55130-d799-45ef-b174-450b6c3b52ff-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"c8f55130-d799-45ef-b174-450b6c3b52ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 22:44:33 crc kubenswrapper[4795]: I0219 22:44:33.026019 4795 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 19 22:44:33 crc kubenswrapper[4795]: I0219 22:44:33.026087 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a8efa8ec-ded9-4ef5-a87c-fbff73e9dea7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a8efa8ec-ded9-4ef5-a87c-fbff73e9dea7\") pod \"openstack-cell1-galera-0\" (UID: \"c8f55130-d799-45ef-b174-450b6c3b52ff\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a8f4eba54544a651e5637ee119bec0f221e08393ed9dbf224d2d4dc3517dd96b/globalmount\"" pod="openstack/openstack-cell1-galera-0" Feb 19 22:44:33 crc kubenswrapper[4795]: I0219 22:44:33.027820 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56864\" (UniqueName: \"kubernetes.io/projected/c8f55130-d799-45ef-b174-450b6c3b52ff-kube-api-access-56864\") pod \"openstack-cell1-galera-0\" (UID: \"c8f55130-d799-45ef-b174-450b6c3b52ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 22:44:33 crc kubenswrapper[4795]: I0219 22:44:33.056440 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pvc-a8efa8ec-ded9-4ef5-a87c-fbff73e9dea7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a8efa8ec-ded9-4ef5-a87c-fbff73e9dea7\") pod \"openstack-cell1-galera-0\" (UID: \"c8f55130-d799-45ef-b174-450b6c3b52ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 22:44:33 crc kubenswrapper[4795]: I0219 22:44:33.351651 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 19 22:44:33 crc kubenswrapper[4795]: I0219 22:44:33.827556 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 19 22:44:33 crc kubenswrapper[4795]: I0219 22:44:33.827835 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"3d516d65-1efc-42ee-ab17-971e2d94e4a7","Type":"ContainerStarted","Data":"e8e119ce9753532a404fdc5789740424a476dbf17064d349b96fc359a962c90b"} Feb 19 22:44:33 crc kubenswrapper[4795]: I0219 22:44:33.847399 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=2.8473679450000002 podStartE2EDuration="2.847367945s" podCreationTimestamp="2026-02-19 22:44:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:44:33.839884959 +0000 UTC m=+4585.032402863" watchObservedRunningTime="2026-02-19 22:44:33.847367945 +0000 UTC m=+4585.039885829" Feb 19 22:44:34 crc kubenswrapper[4795]: W0219 22:44:34.132543 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8f55130_d799_45ef_b174_450b6c3b52ff.slice/crio-0db400379163aac0fd1a842dce6434bcf2d05bbf475831f1adc63f9acfbd742f WatchSource:0}: Error finding container 0db400379163aac0fd1a842dce6434bcf2d05bbf475831f1adc63f9acfbd742f: Status 404 returned error can't find the container with id 
0db400379163aac0fd1a842dce6434bcf2d05bbf475831f1adc63f9acfbd742f Feb 19 22:44:34 crc kubenswrapper[4795]: I0219 22:44:34.836477 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"c8f55130-d799-45ef-b174-450b6c3b52ff","Type":"ContainerStarted","Data":"c0a61604098544497ff7d80649240bf23ed9ae4b34dabfe24a1174f77e04b797"} Feb 19 22:44:34 crc kubenswrapper[4795]: I0219 22:44:34.836816 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Feb 19 22:44:34 crc kubenswrapper[4795]: I0219 22:44:34.836832 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"c8f55130-d799-45ef-b174-450b6c3b52ff","Type":"ContainerStarted","Data":"0db400379163aac0fd1a842dce6434bcf2d05bbf475831f1adc63f9acfbd742f"} Feb 19 22:44:35 crc kubenswrapper[4795]: I0219 22:44:35.512237 4795 scope.go:117] "RemoveContainer" containerID="2e1a23cb785a07f6fbead34d3a07564cd041d28b6cd22a143a851674ad80e799" Feb 19 22:44:35 crc kubenswrapper[4795]: I0219 22:44:35.846977 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerStarted","Data":"85a7b1a9bb1dc26b66551125ca13bcd3b5f6f4015b61adf891b31e1f3b13c640"} Feb 19 22:44:36 crc kubenswrapper[4795]: I0219 22:44:36.854786 4795 generic.go:334] "Generic (PLEG): container finished" podID="24345708-df30-4486-bc7e-44eaa7722ffd" containerID="98338aaa8b7058583c69c89378f49b27fd375f134eb71210d2b4a7d131e400c2" exitCode=0 Feb 19 22:44:36 crc kubenswrapper[4795]: I0219 22:44:36.854871 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"24345708-df30-4486-bc7e-44eaa7722ffd","Type":"ContainerDied","Data":"98338aaa8b7058583c69c89378f49b27fd375f134eb71210d2b4a7d131e400c2"} Feb 19 22:44:37 crc kubenswrapper[4795]: I0219 22:44:37.861683 4795 
generic.go:334] "Generic (PLEG): container finished" podID="c8f55130-d799-45ef-b174-450b6c3b52ff" containerID="c0a61604098544497ff7d80649240bf23ed9ae4b34dabfe24a1174f77e04b797" exitCode=0 Feb 19 22:44:37 crc kubenswrapper[4795]: I0219 22:44:37.862248 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"c8f55130-d799-45ef-b174-450b6c3b52ff","Type":"ContainerDied","Data":"c0a61604098544497ff7d80649240bf23ed9ae4b34dabfe24a1174f77e04b797"} Feb 19 22:44:37 crc kubenswrapper[4795]: I0219 22:44:37.864548 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"24345708-df30-4486-bc7e-44eaa7722ffd","Type":"ContainerStarted","Data":"09d226b956b0ebbd3ab43c1ec12ae8658d98faece1941eba3a2d96a00df4303c"} Feb 19 22:44:37 crc kubenswrapper[4795]: I0219 22:44:37.926908 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=7.926886218 podStartE2EDuration="7.926886218s" podCreationTimestamp="2026-02-19 22:44:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:44:37.919735852 +0000 UTC m=+4589.112253806" watchObservedRunningTime="2026-02-19 22:44:37.926886218 +0000 UTC m=+4589.119404092" Feb 19 22:44:38 crc kubenswrapper[4795]: I0219 22:44:38.817910 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7c4c8f55b5-rn4dm" Feb 19 22:44:38 crc kubenswrapper[4795]: I0219 22:44:38.880251 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"c8f55130-d799-45ef-b174-450b6c3b52ff","Type":"ContainerStarted","Data":"b87c22825038b02f1820139804d67606a5e717ce2721120327186d2e9efd4ae2"} Feb 19 22:44:38 crc kubenswrapper[4795]: I0219 22:44:38.918852 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/openstack-cell1-galera-0" podStartSLOduration=7.918827735 podStartE2EDuration="7.918827735s" podCreationTimestamp="2026-02-19 22:44:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:44:38.91554633 +0000 UTC m=+4590.108064194" watchObservedRunningTime="2026-02-19 22:44:38.918827735 +0000 UTC m=+4590.111345629" Feb 19 22:44:39 crc kubenswrapper[4795]: I0219 22:44:39.032455 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-589cf688cc-vf5s4" Feb 19 22:44:39 crc kubenswrapper[4795]: I0219 22:44:39.097484 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c4c8f55b5-rn4dm"] Feb 19 22:44:39 crc kubenswrapper[4795]: I0219 22:44:39.097729 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7c4c8f55b5-rn4dm" podUID="b743a36e-23aa-4a29-b400-a91ed0788bd7" containerName="dnsmasq-dns" containerID="cri-o://72948b55c5297e74f47fec1c5b0abad9248f67622c9adf8cda348359b3335712" gracePeriod=10 Feb 19 22:44:39 crc kubenswrapper[4795]: I0219 22:44:39.625863 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c4c8f55b5-rn4dm" Feb 19 22:44:39 crc kubenswrapper[4795]: I0219 22:44:39.731647 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b743a36e-23aa-4a29-b400-a91ed0788bd7-dns-svc\") pod \"b743a36e-23aa-4a29-b400-a91ed0788bd7\" (UID: \"b743a36e-23aa-4a29-b400-a91ed0788bd7\") " Feb 19 22:44:39 crc kubenswrapper[4795]: I0219 22:44:39.731800 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b743a36e-23aa-4a29-b400-a91ed0788bd7-config\") pod \"b743a36e-23aa-4a29-b400-a91ed0788bd7\" (UID: \"b743a36e-23aa-4a29-b400-a91ed0788bd7\") " Feb 19 22:44:39 crc kubenswrapper[4795]: I0219 22:44:39.731850 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhmhn\" (UniqueName: \"kubernetes.io/projected/b743a36e-23aa-4a29-b400-a91ed0788bd7-kube-api-access-lhmhn\") pod \"b743a36e-23aa-4a29-b400-a91ed0788bd7\" (UID: \"b743a36e-23aa-4a29-b400-a91ed0788bd7\") " Feb 19 22:44:39 crc kubenswrapper[4795]: I0219 22:44:39.739413 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b743a36e-23aa-4a29-b400-a91ed0788bd7-kube-api-access-lhmhn" (OuterVolumeSpecName: "kube-api-access-lhmhn") pod "b743a36e-23aa-4a29-b400-a91ed0788bd7" (UID: "b743a36e-23aa-4a29-b400-a91ed0788bd7"). InnerVolumeSpecName "kube-api-access-lhmhn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:44:39 crc kubenswrapper[4795]: I0219 22:44:39.763422 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b743a36e-23aa-4a29-b400-a91ed0788bd7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b743a36e-23aa-4a29-b400-a91ed0788bd7" (UID: "b743a36e-23aa-4a29-b400-a91ed0788bd7"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:44:39 crc kubenswrapper[4795]: I0219 22:44:39.772520 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b743a36e-23aa-4a29-b400-a91ed0788bd7-config" (OuterVolumeSpecName: "config") pod "b743a36e-23aa-4a29-b400-a91ed0788bd7" (UID: "b743a36e-23aa-4a29-b400-a91ed0788bd7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:44:39 crc kubenswrapper[4795]: I0219 22:44:39.833716 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b743a36e-23aa-4a29-b400-a91ed0788bd7-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 22:44:39 crc kubenswrapper[4795]: I0219 22:44:39.833749 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b743a36e-23aa-4a29-b400-a91ed0788bd7-config\") on node \"crc\" DevicePath \"\"" Feb 19 22:44:39 crc kubenswrapper[4795]: I0219 22:44:39.833759 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhmhn\" (UniqueName: \"kubernetes.io/projected/b743a36e-23aa-4a29-b400-a91ed0788bd7-kube-api-access-lhmhn\") on node \"crc\" DevicePath \"\"" Feb 19 22:44:39 crc kubenswrapper[4795]: I0219 22:44:39.910922 4795 generic.go:334] "Generic (PLEG): container finished" podID="b743a36e-23aa-4a29-b400-a91ed0788bd7" containerID="72948b55c5297e74f47fec1c5b0abad9248f67622c9adf8cda348359b3335712" exitCode=0 Feb 19 22:44:39 crc kubenswrapper[4795]: I0219 22:44:39.910985 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c4c8f55b5-rn4dm" event={"ID":"b743a36e-23aa-4a29-b400-a91ed0788bd7","Type":"ContainerDied","Data":"72948b55c5297e74f47fec1c5b0abad9248f67622c9adf8cda348359b3335712"} Feb 19 22:44:39 crc kubenswrapper[4795]: I0219 22:44:39.911018 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c4c8f55b5-rn4dm" 
event={"ID":"b743a36e-23aa-4a29-b400-a91ed0788bd7","Type":"ContainerDied","Data":"01c621c9ebf25431f41781d9a945a324e3f1e0ba1f3afbd4aaf02e91fa196557"} Feb 19 22:44:39 crc kubenswrapper[4795]: I0219 22:44:39.911036 4795 scope.go:117] "RemoveContainer" containerID="72948b55c5297e74f47fec1c5b0abad9248f67622c9adf8cda348359b3335712" Feb 19 22:44:39 crc kubenswrapper[4795]: I0219 22:44:39.911288 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c4c8f55b5-rn4dm" Feb 19 22:44:39 crc kubenswrapper[4795]: I0219 22:44:39.934424 4795 scope.go:117] "RemoveContainer" containerID="e20ade87f2be3af18d652fc7a8d168566bbd9d8f1fa0af3e00e5b04add98e2f7" Feb 19 22:44:39 crc kubenswrapper[4795]: I0219 22:44:39.962548 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c4c8f55b5-rn4dm"] Feb 19 22:44:39 crc kubenswrapper[4795]: I0219 22:44:39.967454 4795 scope.go:117] "RemoveContainer" containerID="72948b55c5297e74f47fec1c5b0abad9248f67622c9adf8cda348359b3335712" Feb 19 22:44:39 crc kubenswrapper[4795]: I0219 22:44:39.971243 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7c4c8f55b5-rn4dm"] Feb 19 22:44:39 crc kubenswrapper[4795]: E0219 22:44:39.973334 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72948b55c5297e74f47fec1c5b0abad9248f67622c9adf8cda348359b3335712\": container with ID starting with 72948b55c5297e74f47fec1c5b0abad9248f67622c9adf8cda348359b3335712 not found: ID does not exist" containerID="72948b55c5297e74f47fec1c5b0abad9248f67622c9adf8cda348359b3335712" Feb 19 22:44:39 crc kubenswrapper[4795]: I0219 22:44:39.973391 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72948b55c5297e74f47fec1c5b0abad9248f67622c9adf8cda348359b3335712"} err="failed to get container status 
\"72948b55c5297e74f47fec1c5b0abad9248f67622c9adf8cda348359b3335712\": rpc error: code = NotFound desc = could not find container \"72948b55c5297e74f47fec1c5b0abad9248f67622c9adf8cda348359b3335712\": container with ID starting with 72948b55c5297e74f47fec1c5b0abad9248f67622c9adf8cda348359b3335712 not found: ID does not exist" Feb 19 22:44:39 crc kubenswrapper[4795]: I0219 22:44:39.973432 4795 scope.go:117] "RemoveContainer" containerID="e20ade87f2be3af18d652fc7a8d168566bbd9d8f1fa0af3e00e5b04add98e2f7" Feb 19 22:44:39 crc kubenswrapper[4795]: E0219 22:44:39.975766 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e20ade87f2be3af18d652fc7a8d168566bbd9d8f1fa0af3e00e5b04add98e2f7\": container with ID starting with e20ade87f2be3af18d652fc7a8d168566bbd9d8f1fa0af3e00e5b04add98e2f7 not found: ID does not exist" containerID="e20ade87f2be3af18d652fc7a8d168566bbd9d8f1fa0af3e00e5b04add98e2f7" Feb 19 22:44:39 crc kubenswrapper[4795]: I0219 22:44:39.975812 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e20ade87f2be3af18d652fc7a8d168566bbd9d8f1fa0af3e00e5b04add98e2f7"} err="failed to get container status \"e20ade87f2be3af18d652fc7a8d168566bbd9d8f1fa0af3e00e5b04add98e2f7\": rpc error: code = NotFound desc = could not find container \"e20ade87f2be3af18d652fc7a8d168566bbd9d8f1fa0af3e00e5b04add98e2f7\": container with ID starting with e20ade87f2be3af18d652fc7a8d168566bbd9d8f1fa0af3e00e5b04add98e2f7 not found: ID does not exist" Feb 19 22:44:41 crc kubenswrapper[4795]: I0219 22:44:41.522508 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b743a36e-23aa-4a29-b400-a91ed0788bd7" path="/var/lib/kubelet/pods/b743a36e-23aa-4a29-b400-a91ed0788bd7/volumes" Feb 19 22:44:41 crc kubenswrapper[4795]: I0219 22:44:41.573010 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Feb 19 22:44:41 crc 
kubenswrapper[4795]: I0219 22:44:41.573539 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Feb 19 22:44:41 crc kubenswrapper[4795]: I0219 22:44:41.712326 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Feb 19 22:44:41 crc kubenswrapper[4795]: I0219 22:44:41.995020 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Feb 19 22:44:41 crc kubenswrapper[4795]: I0219 22:44:41.999707 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Feb 19 22:44:43 crc kubenswrapper[4795]: I0219 22:44:43.352058 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Feb 19 22:44:43 crc kubenswrapper[4795]: I0219 22:44:43.352516 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Feb 19 22:44:43 crc kubenswrapper[4795]: I0219 22:44:43.741597 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Feb 19 22:44:44 crc kubenswrapper[4795]: I0219 22:44:44.030566 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Feb 19 22:44:50 crc kubenswrapper[4795]: I0219 22:44:50.245108 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-qr9lx"] Feb 19 22:44:50 crc kubenswrapper[4795]: E0219 22:44:50.246160 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b743a36e-23aa-4a29-b400-a91ed0788bd7" containerName="init" Feb 19 22:44:50 crc kubenswrapper[4795]: I0219 22:44:50.246212 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="b743a36e-23aa-4a29-b400-a91ed0788bd7" containerName="init" Feb 19 22:44:50 crc kubenswrapper[4795]: E0219 22:44:50.246280 4795 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="b743a36e-23aa-4a29-b400-a91ed0788bd7" containerName="dnsmasq-dns" Feb 19 22:44:50 crc kubenswrapper[4795]: I0219 22:44:50.246298 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="b743a36e-23aa-4a29-b400-a91ed0788bd7" containerName="dnsmasq-dns" Feb 19 22:44:50 crc kubenswrapper[4795]: I0219 22:44:50.246595 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="b743a36e-23aa-4a29-b400-a91ed0788bd7" containerName="dnsmasq-dns" Feb 19 22:44:50 crc kubenswrapper[4795]: I0219 22:44:50.247445 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-qr9lx" Feb 19 22:44:50 crc kubenswrapper[4795]: I0219 22:44:50.250375 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Feb 19 22:44:50 crc kubenswrapper[4795]: I0219 22:44:50.262784 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-qr9lx"] Feb 19 22:44:50 crc kubenswrapper[4795]: I0219 22:44:50.425641 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/566c3329-8a98-426c-a847-7bdf7df37653-operator-scripts\") pod \"root-account-create-update-qr9lx\" (UID: \"566c3329-8a98-426c-a847-7bdf7df37653\") " pod="openstack/root-account-create-update-qr9lx" Feb 19 22:44:50 crc kubenswrapper[4795]: I0219 22:44:50.425718 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7tc2\" (UniqueName: \"kubernetes.io/projected/566c3329-8a98-426c-a847-7bdf7df37653-kube-api-access-x7tc2\") pod \"root-account-create-update-qr9lx\" (UID: \"566c3329-8a98-426c-a847-7bdf7df37653\") " pod="openstack/root-account-create-update-qr9lx" Feb 19 22:44:50 crc kubenswrapper[4795]: I0219 22:44:50.526608 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/566c3329-8a98-426c-a847-7bdf7df37653-operator-scripts\") pod \"root-account-create-update-qr9lx\" (UID: \"566c3329-8a98-426c-a847-7bdf7df37653\") " pod="openstack/root-account-create-update-qr9lx" Feb 19 22:44:50 crc kubenswrapper[4795]: I0219 22:44:50.527088 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7tc2\" (UniqueName: \"kubernetes.io/projected/566c3329-8a98-426c-a847-7bdf7df37653-kube-api-access-x7tc2\") pod \"root-account-create-update-qr9lx\" (UID: \"566c3329-8a98-426c-a847-7bdf7df37653\") " pod="openstack/root-account-create-update-qr9lx" Feb 19 22:44:50 crc kubenswrapper[4795]: I0219 22:44:50.528357 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/566c3329-8a98-426c-a847-7bdf7df37653-operator-scripts\") pod \"root-account-create-update-qr9lx\" (UID: \"566c3329-8a98-426c-a847-7bdf7df37653\") " pod="openstack/root-account-create-update-qr9lx" Feb 19 22:44:50 crc kubenswrapper[4795]: I0219 22:44:50.547471 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7tc2\" (UniqueName: \"kubernetes.io/projected/566c3329-8a98-426c-a847-7bdf7df37653-kube-api-access-x7tc2\") pod \"root-account-create-update-qr9lx\" (UID: \"566c3329-8a98-426c-a847-7bdf7df37653\") " pod="openstack/root-account-create-update-qr9lx" Feb 19 22:44:50 crc kubenswrapper[4795]: I0219 22:44:50.591015 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-qr9lx" Feb 19 22:44:51 crc kubenswrapper[4795]: W0219 22:44:51.095407 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod566c3329_8a98_426c_a847_7bdf7df37653.slice/crio-4b7825ca392d02ae5a4b56bbd0154b28a035e4c142dd0204cbcf271db6886746 WatchSource:0}: Error finding container 4b7825ca392d02ae5a4b56bbd0154b28a035e4c142dd0204cbcf271db6886746: Status 404 returned error can't find the container with id 4b7825ca392d02ae5a4b56bbd0154b28a035e4c142dd0204cbcf271db6886746 Feb 19 22:44:51 crc kubenswrapper[4795]: I0219 22:44:51.096810 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-qr9lx"] Feb 19 22:44:52 crc kubenswrapper[4795]: I0219 22:44:52.007834 4795 generic.go:334] "Generic (PLEG): container finished" podID="566c3329-8a98-426c-a847-7bdf7df37653" containerID="9acf4103dbab2da287716a026f87427a75c0734ef6734fb45eb23664d9f962a3" exitCode=0 Feb 19 22:44:52 crc kubenswrapper[4795]: I0219 22:44:52.007876 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-qr9lx" event={"ID":"566c3329-8a98-426c-a847-7bdf7df37653","Type":"ContainerDied","Data":"9acf4103dbab2da287716a026f87427a75c0734ef6734fb45eb23664d9f962a3"} Feb 19 22:44:52 crc kubenswrapper[4795]: I0219 22:44:52.007906 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-qr9lx" event={"ID":"566c3329-8a98-426c-a847-7bdf7df37653","Type":"ContainerStarted","Data":"4b7825ca392d02ae5a4b56bbd0154b28a035e4c142dd0204cbcf271db6886746"} Feb 19 22:44:53 crc kubenswrapper[4795]: I0219 22:44:53.415314 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-qr9lx" Feb 19 22:44:53 crc kubenswrapper[4795]: I0219 22:44:53.580874 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/566c3329-8a98-426c-a847-7bdf7df37653-operator-scripts\") pod \"566c3329-8a98-426c-a847-7bdf7df37653\" (UID: \"566c3329-8a98-426c-a847-7bdf7df37653\") " Feb 19 22:44:53 crc kubenswrapper[4795]: I0219 22:44:53.580942 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7tc2\" (UniqueName: \"kubernetes.io/projected/566c3329-8a98-426c-a847-7bdf7df37653-kube-api-access-x7tc2\") pod \"566c3329-8a98-426c-a847-7bdf7df37653\" (UID: \"566c3329-8a98-426c-a847-7bdf7df37653\") " Feb 19 22:44:53 crc kubenswrapper[4795]: I0219 22:44:53.582086 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/566c3329-8a98-426c-a847-7bdf7df37653-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "566c3329-8a98-426c-a847-7bdf7df37653" (UID: "566c3329-8a98-426c-a847-7bdf7df37653"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:44:53 crc kubenswrapper[4795]: I0219 22:44:53.586827 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/566c3329-8a98-426c-a847-7bdf7df37653-kube-api-access-x7tc2" (OuterVolumeSpecName: "kube-api-access-x7tc2") pod "566c3329-8a98-426c-a847-7bdf7df37653" (UID: "566c3329-8a98-426c-a847-7bdf7df37653"). InnerVolumeSpecName "kube-api-access-x7tc2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:44:53 crc kubenswrapper[4795]: I0219 22:44:53.682699 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/566c3329-8a98-426c-a847-7bdf7df37653-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 22:44:53 crc kubenswrapper[4795]: I0219 22:44:53.682738 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7tc2\" (UniqueName: \"kubernetes.io/projected/566c3329-8a98-426c-a847-7bdf7df37653-kube-api-access-x7tc2\") on node \"crc\" DevicePath \"\"" Feb 19 22:44:54 crc kubenswrapper[4795]: I0219 22:44:54.032013 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-qr9lx" event={"ID":"566c3329-8a98-426c-a847-7bdf7df37653","Type":"ContainerDied","Data":"4b7825ca392d02ae5a4b56bbd0154b28a035e4c142dd0204cbcf271db6886746"} Feb 19 22:44:54 crc kubenswrapper[4795]: I0219 22:44:54.032324 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b7825ca392d02ae5a4b56bbd0154b28a035e4c142dd0204cbcf271db6886746" Feb 19 22:44:54 crc kubenswrapper[4795]: I0219 22:44:54.032102 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-qr9lx" Feb 19 22:44:56 crc kubenswrapper[4795]: I0219 22:44:56.678378 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-qr9lx"] Feb 19 22:44:56 crc kubenswrapper[4795]: I0219 22:44:56.690742 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-qr9lx"] Feb 19 22:44:57 crc kubenswrapper[4795]: I0219 22:44:57.523008 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="566c3329-8a98-426c-a847-7bdf7df37653" path="/var/lib/kubelet/pods/566c3329-8a98-426c-a847-7bdf7df37653/volumes" Feb 19 22:45:00 crc kubenswrapper[4795]: I0219 22:45:00.156676 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525685-qp6fd"] Feb 19 22:45:00 crc kubenswrapper[4795]: E0219 22:45:00.157388 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="566c3329-8a98-426c-a847-7bdf7df37653" containerName="mariadb-account-create-update" Feb 19 22:45:00 crc kubenswrapper[4795]: I0219 22:45:00.157407 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="566c3329-8a98-426c-a847-7bdf7df37653" containerName="mariadb-account-create-update" Feb 19 22:45:00 crc kubenswrapper[4795]: I0219 22:45:00.157693 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="566c3329-8a98-426c-a847-7bdf7df37653" containerName="mariadb-account-create-update" Feb 19 22:45:00 crc kubenswrapper[4795]: I0219 22:45:00.158361 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525685-qp6fd" Feb 19 22:45:00 crc kubenswrapper[4795]: I0219 22:45:00.163918 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 19 22:45:00 crc kubenswrapper[4795]: I0219 22:45:00.164375 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 19 22:45:00 crc kubenswrapper[4795]: I0219 22:45:00.173800 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525685-qp6fd"] Feb 19 22:45:00 crc kubenswrapper[4795]: I0219 22:45:00.304682 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/54bacd9c-6bce-433c-972c-3990566baa40-config-volume\") pod \"collect-profiles-29525685-qp6fd\" (UID: \"54bacd9c-6bce-433c-972c-3990566baa40\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525685-qp6fd" Feb 19 22:45:00 crc kubenswrapper[4795]: I0219 22:45:00.304726 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/54bacd9c-6bce-433c-972c-3990566baa40-secret-volume\") pod \"collect-profiles-29525685-qp6fd\" (UID: \"54bacd9c-6bce-433c-972c-3990566baa40\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525685-qp6fd" Feb 19 22:45:00 crc kubenswrapper[4795]: I0219 22:45:00.304752 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tj8mp\" (UniqueName: \"kubernetes.io/projected/54bacd9c-6bce-433c-972c-3990566baa40-kube-api-access-tj8mp\") pod \"collect-profiles-29525685-qp6fd\" (UID: \"54bacd9c-6bce-433c-972c-3990566baa40\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29525685-qp6fd" Feb 19 22:45:00 crc kubenswrapper[4795]: I0219 22:45:00.406644 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/54bacd9c-6bce-433c-972c-3990566baa40-config-volume\") pod \"collect-profiles-29525685-qp6fd\" (UID: \"54bacd9c-6bce-433c-972c-3990566baa40\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525685-qp6fd" Feb 19 22:45:00 crc kubenswrapper[4795]: I0219 22:45:00.406944 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/54bacd9c-6bce-433c-972c-3990566baa40-secret-volume\") pod \"collect-profiles-29525685-qp6fd\" (UID: \"54bacd9c-6bce-433c-972c-3990566baa40\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525685-qp6fd" Feb 19 22:45:00 crc kubenswrapper[4795]: I0219 22:45:00.407045 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tj8mp\" (UniqueName: \"kubernetes.io/projected/54bacd9c-6bce-433c-972c-3990566baa40-kube-api-access-tj8mp\") pod \"collect-profiles-29525685-qp6fd\" (UID: \"54bacd9c-6bce-433c-972c-3990566baa40\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525685-qp6fd" Feb 19 22:45:00 crc kubenswrapper[4795]: I0219 22:45:00.407798 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/54bacd9c-6bce-433c-972c-3990566baa40-config-volume\") pod \"collect-profiles-29525685-qp6fd\" (UID: \"54bacd9c-6bce-433c-972c-3990566baa40\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525685-qp6fd" Feb 19 22:45:00 crc kubenswrapper[4795]: I0219 22:45:00.420416 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/54bacd9c-6bce-433c-972c-3990566baa40-secret-volume\") pod \"collect-profiles-29525685-qp6fd\" (UID: \"54bacd9c-6bce-433c-972c-3990566baa40\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525685-qp6fd" Feb 19 22:45:00 crc kubenswrapper[4795]: I0219 22:45:00.424230 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tj8mp\" (UniqueName: \"kubernetes.io/projected/54bacd9c-6bce-433c-972c-3990566baa40-kube-api-access-tj8mp\") pod \"collect-profiles-29525685-qp6fd\" (UID: \"54bacd9c-6bce-433c-972c-3990566baa40\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525685-qp6fd" Feb 19 22:45:00 crc kubenswrapper[4795]: I0219 22:45:00.521539 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525685-qp6fd" Feb 19 22:45:00 crc kubenswrapper[4795]: I0219 22:45:00.937844 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525685-qp6fd"] Feb 19 22:45:01 crc kubenswrapper[4795]: I0219 22:45:01.098939 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525685-qp6fd" event={"ID":"54bacd9c-6bce-433c-972c-3990566baa40","Type":"ContainerStarted","Data":"eca52f37003a7a5168093ed1a2726d31a52843bd3e99b81128785c0ad70b60e9"} Feb 19 22:45:01 crc kubenswrapper[4795]: I0219 22:45:01.099228 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525685-qp6fd" event={"ID":"54bacd9c-6bce-433c-972c-3990566baa40","Type":"ContainerStarted","Data":"2c2f5859aa5c96d4c7919768d2d7d44779670de8216920b069687351f3e33599"} Feb 19 22:45:01 crc kubenswrapper[4795]: I0219 22:45:01.124526 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29525685-qp6fd" 
podStartSLOduration=1.124506776 podStartE2EDuration="1.124506776s" podCreationTimestamp="2026-02-19 22:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:45:01.115763445 +0000 UTC m=+4612.308281349" watchObservedRunningTime="2026-02-19 22:45:01.124506776 +0000 UTC m=+4612.317024660" Feb 19 22:45:01 crc kubenswrapper[4795]: I0219 22:45:01.668622 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-vrr5x"] Feb 19 22:45:01 crc kubenswrapper[4795]: I0219 22:45:01.669615 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-vrr5x" Feb 19 22:45:01 crc kubenswrapper[4795]: I0219 22:45:01.671565 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Feb 19 22:45:01 crc kubenswrapper[4795]: I0219 22:45:01.683633 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-vrr5x"] Feb 19 22:45:01 crc kubenswrapper[4795]: I0219 22:45:01.729870 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cfa8dda8-f620-4331-8909-b10784ceeab8-operator-scripts\") pod \"root-account-create-update-vrr5x\" (UID: \"cfa8dda8-f620-4331-8909-b10784ceeab8\") " pod="openstack/root-account-create-update-vrr5x" Feb 19 22:45:01 crc kubenswrapper[4795]: I0219 22:45:01.729935 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5fbc\" (UniqueName: \"kubernetes.io/projected/cfa8dda8-f620-4331-8909-b10784ceeab8-kube-api-access-s5fbc\") pod \"root-account-create-update-vrr5x\" (UID: \"cfa8dda8-f620-4331-8909-b10784ceeab8\") " pod="openstack/root-account-create-update-vrr5x" Feb 19 22:45:01 crc kubenswrapper[4795]: I0219 
22:45:01.831062 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cfa8dda8-f620-4331-8909-b10784ceeab8-operator-scripts\") pod \"root-account-create-update-vrr5x\" (UID: \"cfa8dda8-f620-4331-8909-b10784ceeab8\") " pod="openstack/root-account-create-update-vrr5x" Feb 19 22:45:01 crc kubenswrapper[4795]: I0219 22:45:01.831130 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5fbc\" (UniqueName: \"kubernetes.io/projected/cfa8dda8-f620-4331-8909-b10784ceeab8-kube-api-access-s5fbc\") pod \"root-account-create-update-vrr5x\" (UID: \"cfa8dda8-f620-4331-8909-b10784ceeab8\") " pod="openstack/root-account-create-update-vrr5x" Feb 19 22:45:01 crc kubenswrapper[4795]: I0219 22:45:01.832553 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cfa8dda8-f620-4331-8909-b10784ceeab8-operator-scripts\") pod \"root-account-create-update-vrr5x\" (UID: \"cfa8dda8-f620-4331-8909-b10784ceeab8\") " pod="openstack/root-account-create-update-vrr5x" Feb 19 22:45:01 crc kubenswrapper[4795]: I0219 22:45:01.847943 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5fbc\" (UniqueName: \"kubernetes.io/projected/cfa8dda8-f620-4331-8909-b10784ceeab8-kube-api-access-s5fbc\") pod \"root-account-create-update-vrr5x\" (UID: \"cfa8dda8-f620-4331-8909-b10784ceeab8\") " pod="openstack/root-account-create-update-vrr5x" Feb 19 22:45:01 crc kubenswrapper[4795]: I0219 22:45:01.990587 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-vrr5x" Feb 19 22:45:02 crc kubenswrapper[4795]: I0219 22:45:02.109609 4795 generic.go:334] "Generic (PLEG): container finished" podID="54bacd9c-6bce-433c-972c-3990566baa40" containerID="eca52f37003a7a5168093ed1a2726d31a52843bd3e99b81128785c0ad70b60e9" exitCode=0 Feb 19 22:45:02 crc kubenswrapper[4795]: I0219 22:45:02.109649 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525685-qp6fd" event={"ID":"54bacd9c-6bce-433c-972c-3990566baa40","Type":"ContainerDied","Data":"eca52f37003a7a5168093ed1a2726d31a52843bd3e99b81128785c0ad70b60e9"} Feb 19 22:45:02 crc kubenswrapper[4795]: I0219 22:45:02.460995 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-vrr5x"] Feb 19 22:45:02 crc kubenswrapper[4795]: W0219 22:45:02.465477 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcfa8dda8_f620_4331_8909_b10784ceeab8.slice/crio-69b3b99567f4355711a9efa7f61e1c4d342b27993a8cb993e7e7f57b9a2da6c8 WatchSource:0}: Error finding container 69b3b99567f4355711a9efa7f61e1c4d342b27993a8cb993e7e7f57b9a2da6c8: Status 404 returned error can't find the container with id 69b3b99567f4355711a9efa7f61e1c4d342b27993a8cb993e7e7f57b9a2da6c8 Feb 19 22:45:03 crc kubenswrapper[4795]: I0219 22:45:03.121730 4795 generic.go:334] "Generic (PLEG): container finished" podID="cfa8dda8-f620-4331-8909-b10784ceeab8" containerID="595d4f05c4b7ec834570db4adf844a0fca41d1bed50151677fc612dd1b0457bc" exitCode=0 Feb 19 22:45:03 crc kubenswrapper[4795]: I0219 22:45:03.121845 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-vrr5x" event={"ID":"cfa8dda8-f620-4331-8909-b10784ceeab8","Type":"ContainerDied","Data":"595d4f05c4b7ec834570db4adf844a0fca41d1bed50151677fc612dd1b0457bc"} Feb 19 22:45:03 crc kubenswrapper[4795]: I0219 
22:45:03.121883 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-vrr5x" event={"ID":"cfa8dda8-f620-4331-8909-b10784ceeab8","Type":"ContainerStarted","Data":"69b3b99567f4355711a9efa7f61e1c4d342b27993a8cb993e7e7f57b9a2da6c8"} Feb 19 22:45:03 crc kubenswrapper[4795]: I0219 22:45:03.555322 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525685-qp6fd" Feb 19 22:45:03 crc kubenswrapper[4795]: I0219 22:45:03.561891 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tj8mp\" (UniqueName: \"kubernetes.io/projected/54bacd9c-6bce-433c-972c-3990566baa40-kube-api-access-tj8mp\") pod \"54bacd9c-6bce-433c-972c-3990566baa40\" (UID: \"54bacd9c-6bce-433c-972c-3990566baa40\") " Feb 19 22:45:03 crc kubenswrapper[4795]: I0219 22:45:03.561945 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/54bacd9c-6bce-433c-972c-3990566baa40-config-volume\") pod \"54bacd9c-6bce-433c-972c-3990566baa40\" (UID: \"54bacd9c-6bce-433c-972c-3990566baa40\") " Feb 19 22:45:03 crc kubenswrapper[4795]: I0219 22:45:03.561970 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/54bacd9c-6bce-433c-972c-3990566baa40-secret-volume\") pod \"54bacd9c-6bce-433c-972c-3990566baa40\" (UID: \"54bacd9c-6bce-433c-972c-3990566baa40\") " Feb 19 22:45:03 crc kubenswrapper[4795]: I0219 22:45:03.563442 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54bacd9c-6bce-433c-972c-3990566baa40-config-volume" (OuterVolumeSpecName: "config-volume") pod "54bacd9c-6bce-433c-972c-3990566baa40" (UID: "54bacd9c-6bce-433c-972c-3990566baa40"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:45:03 crc kubenswrapper[4795]: I0219 22:45:03.568341 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54bacd9c-6bce-433c-972c-3990566baa40-kube-api-access-tj8mp" (OuterVolumeSpecName: "kube-api-access-tj8mp") pod "54bacd9c-6bce-433c-972c-3990566baa40" (UID: "54bacd9c-6bce-433c-972c-3990566baa40"). InnerVolumeSpecName "kube-api-access-tj8mp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:45:03 crc kubenswrapper[4795]: I0219 22:45:03.568343 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54bacd9c-6bce-433c-972c-3990566baa40-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "54bacd9c-6bce-433c-972c-3990566baa40" (UID: "54bacd9c-6bce-433c-972c-3990566baa40"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:45:03 crc kubenswrapper[4795]: I0219 22:45:03.664493 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tj8mp\" (UniqueName: \"kubernetes.io/projected/54bacd9c-6bce-433c-972c-3990566baa40-kube-api-access-tj8mp\") on node \"crc\" DevicePath \"\"" Feb 19 22:45:03 crc kubenswrapper[4795]: I0219 22:45:03.664544 4795 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/54bacd9c-6bce-433c-972c-3990566baa40-config-volume\") on node \"crc\" DevicePath \"\"" Feb 19 22:45:03 crc kubenswrapper[4795]: I0219 22:45:03.664568 4795 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/54bacd9c-6bce-433c-972c-3990566baa40-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 19 22:45:04 crc kubenswrapper[4795]: I0219 22:45:04.133990 4795 generic.go:334] "Generic (PLEG): container finished" podID="bd43ca2d-c7e3-4fc7-84a5-74b50cadd268" 
containerID="48d9ac5474c94a11a6da74ec61c602260d05f7961e1a831acf6b59d650ddb95f" exitCode=0 Feb 19 22:45:04 crc kubenswrapper[4795]: I0219 22:45:04.134104 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268","Type":"ContainerDied","Data":"48d9ac5474c94a11a6da74ec61c602260d05f7961e1a831acf6b59d650ddb95f"} Feb 19 22:45:04 crc kubenswrapper[4795]: I0219 22:45:04.136607 4795 generic.go:334] "Generic (PLEG): container finished" podID="9cd04173-2975-46bd-8602-f6561387d717" containerID="ef3bf36d2dad06468cc4add7eb9c2429b178c38643511387b593eff67aac784a" exitCode=0 Feb 19 22:45:04 crc kubenswrapper[4795]: I0219 22:45:04.136719 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9cd04173-2975-46bd-8602-f6561387d717","Type":"ContainerDied","Data":"ef3bf36d2dad06468cc4add7eb9c2429b178c38643511387b593eff67aac784a"} Feb 19 22:45:04 crc kubenswrapper[4795]: I0219 22:45:04.139122 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525685-qp6fd" Feb 19 22:45:04 crc kubenswrapper[4795]: I0219 22:45:04.139117 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525685-qp6fd" event={"ID":"54bacd9c-6bce-433c-972c-3990566baa40","Type":"ContainerDied","Data":"2c2f5859aa5c96d4c7919768d2d7d44779670de8216920b069687351f3e33599"} Feb 19 22:45:04 crc kubenswrapper[4795]: I0219 22:45:04.139351 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c2f5859aa5c96d4c7919768d2d7d44779670de8216920b069687351f3e33599" Feb 19 22:45:04 crc kubenswrapper[4795]: I0219 22:45:04.379841 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-vrr5x" Feb 19 22:45:04 crc kubenswrapper[4795]: I0219 22:45:04.473497 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5fbc\" (UniqueName: \"kubernetes.io/projected/cfa8dda8-f620-4331-8909-b10784ceeab8-kube-api-access-s5fbc\") pod \"cfa8dda8-f620-4331-8909-b10784ceeab8\" (UID: \"cfa8dda8-f620-4331-8909-b10784ceeab8\") " Feb 19 22:45:04 crc kubenswrapper[4795]: I0219 22:45:04.473780 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cfa8dda8-f620-4331-8909-b10784ceeab8-operator-scripts\") pod \"cfa8dda8-f620-4331-8909-b10784ceeab8\" (UID: \"cfa8dda8-f620-4331-8909-b10784ceeab8\") " Feb 19 22:45:04 crc kubenswrapper[4795]: I0219 22:45:04.474234 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfa8dda8-f620-4331-8909-b10784ceeab8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cfa8dda8-f620-4331-8909-b10784ceeab8" (UID: "cfa8dda8-f620-4331-8909-b10784ceeab8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:45:04 crc kubenswrapper[4795]: I0219 22:45:04.476528 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfa8dda8-f620-4331-8909-b10784ceeab8-kube-api-access-s5fbc" (OuterVolumeSpecName: "kube-api-access-s5fbc") pod "cfa8dda8-f620-4331-8909-b10784ceeab8" (UID: "cfa8dda8-f620-4331-8909-b10784ceeab8"). InnerVolumeSpecName "kube-api-access-s5fbc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:45:04 crc kubenswrapper[4795]: I0219 22:45:04.574869 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5fbc\" (UniqueName: \"kubernetes.io/projected/cfa8dda8-f620-4331-8909-b10784ceeab8-kube-api-access-s5fbc\") on node \"crc\" DevicePath \"\"" Feb 19 22:45:04 crc kubenswrapper[4795]: I0219 22:45:04.574903 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cfa8dda8-f620-4331-8909-b10784ceeab8-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 22:45:04 crc kubenswrapper[4795]: I0219 22:45:04.628655 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525640-x6xrt"] Feb 19 22:45:04 crc kubenswrapper[4795]: I0219 22:45:04.635053 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525640-x6xrt"] Feb 19 22:45:05 crc kubenswrapper[4795]: I0219 22:45:05.154840 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268","Type":"ContainerStarted","Data":"5acc4978acac0af934a6a762d2d3737171002b4cd362487321c37a56c03b3889"} Feb 19 22:45:05 crc kubenswrapper[4795]: I0219 22:45:05.155035 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:45:05 crc kubenswrapper[4795]: I0219 22:45:05.156331 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-vrr5x" Feb 19 22:45:05 crc kubenswrapper[4795]: I0219 22:45:05.156371 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-vrr5x" event={"ID":"cfa8dda8-f620-4331-8909-b10784ceeab8","Type":"ContainerDied","Data":"69b3b99567f4355711a9efa7f61e1c4d342b27993a8cb993e7e7f57b9a2da6c8"} Feb 19 22:45:05 crc kubenswrapper[4795]: I0219 22:45:05.156423 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69b3b99567f4355711a9efa7f61e1c4d342b27993a8cb993e7e7f57b9a2da6c8" Feb 19 22:45:05 crc kubenswrapper[4795]: I0219 22:45:05.158485 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9cd04173-2975-46bd-8602-f6561387d717","Type":"ContainerStarted","Data":"193aac651e1b8dba4dea79907fcffd2ee3b656e5be68313f2c158376eeedde01"} Feb 19 22:45:05 crc kubenswrapper[4795]: I0219 22:45:05.158690 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Feb 19 22:45:05 crc kubenswrapper[4795]: I0219 22:45:05.183659 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.183634332 podStartE2EDuration="37.183634332s" podCreationTimestamp="2026-02-19 22:44:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:45:05.179266657 +0000 UTC m=+4616.371784541" watchObservedRunningTime="2026-02-19 22:45:05.183634332 +0000 UTC m=+4616.376152196" Feb 19 22:45:05 crc kubenswrapper[4795]: I0219 22:45:05.208457 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.208439937 podStartE2EDuration="37.208439937s" podCreationTimestamp="2026-02-19 22:44:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:45:05.206301765 +0000 UTC m=+4616.398819629" watchObservedRunningTime="2026-02-19 22:45:05.208439937 +0000 UTC m=+4616.400957801" Feb 19 22:45:05 crc kubenswrapper[4795]: I0219 22:45:05.523066 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6bba469-9e7c-4517-bc8d-2d5a5308edef" path="/var/lib/kubelet/pods/b6bba469-9e7c-4517-bc8d-2d5a5308edef/volumes" Feb 19 22:45:14 crc kubenswrapper[4795]: I0219 22:45:14.781248 4795 scope.go:117] "RemoveContainer" containerID="32bd4a96cb785a6511c7d0788e6684b8612ce255e3204d11b76077aa3fe75418" Feb 19 22:45:19 crc kubenswrapper[4795]: I0219 22:45:19.958327 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Feb 19 22:45:20 crc kubenswrapper[4795]: I0219 22:45:20.246156 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:45:22 crc kubenswrapper[4795]: I0219 22:45:22.634906 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-54dc9c94cc-wgn4d"] Feb 19 22:45:22 crc kubenswrapper[4795]: E0219 22:45:22.636300 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfa8dda8-f620-4331-8909-b10784ceeab8" containerName="mariadb-account-create-update" Feb 19 22:45:22 crc kubenswrapper[4795]: I0219 22:45:22.636460 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfa8dda8-f620-4331-8909-b10784ceeab8" containerName="mariadb-account-create-update" Feb 19 22:45:22 crc kubenswrapper[4795]: E0219 22:45:22.636607 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54bacd9c-6bce-433c-972c-3990566baa40" containerName="collect-profiles" Feb 19 22:45:22 crc kubenswrapper[4795]: I0219 22:45:22.636725 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="54bacd9c-6bce-433c-972c-3990566baa40" containerName="collect-profiles" Feb 19 22:45:22 crc 
kubenswrapper[4795]: I0219 22:45:22.637105 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfa8dda8-f620-4331-8909-b10784ceeab8" containerName="mariadb-account-create-update" Feb 19 22:45:22 crc kubenswrapper[4795]: I0219 22:45:22.637304 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="54bacd9c-6bce-433c-972c-3990566baa40" containerName="collect-profiles" Feb 19 22:45:22 crc kubenswrapper[4795]: I0219 22:45:22.638809 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54dc9c94cc-wgn4d" Feb 19 22:45:22 crc kubenswrapper[4795]: I0219 22:45:22.647219 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54dc9c94cc-wgn4d"] Feb 19 22:45:22 crc kubenswrapper[4795]: I0219 22:45:22.781988 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00c8c1c0-da57-4169-a42c-b52386ed3112-dns-svc\") pod \"dnsmasq-dns-54dc9c94cc-wgn4d\" (UID: \"00c8c1c0-da57-4169-a42c-b52386ed3112\") " pod="openstack/dnsmasq-dns-54dc9c94cc-wgn4d" Feb 19 22:45:22 crc kubenswrapper[4795]: I0219 22:45:22.782063 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpmqb\" (UniqueName: \"kubernetes.io/projected/00c8c1c0-da57-4169-a42c-b52386ed3112-kube-api-access-vpmqb\") pod \"dnsmasq-dns-54dc9c94cc-wgn4d\" (UID: \"00c8c1c0-da57-4169-a42c-b52386ed3112\") " pod="openstack/dnsmasq-dns-54dc9c94cc-wgn4d" Feb 19 22:45:22 crc kubenswrapper[4795]: I0219 22:45:22.782148 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00c8c1c0-da57-4169-a42c-b52386ed3112-config\") pod \"dnsmasq-dns-54dc9c94cc-wgn4d\" (UID: \"00c8c1c0-da57-4169-a42c-b52386ed3112\") " pod="openstack/dnsmasq-dns-54dc9c94cc-wgn4d" Feb 19 22:45:22 crc kubenswrapper[4795]: 
I0219 22:45:22.883360 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00c8c1c0-da57-4169-a42c-b52386ed3112-config\") pod \"dnsmasq-dns-54dc9c94cc-wgn4d\" (UID: \"00c8c1c0-da57-4169-a42c-b52386ed3112\") " pod="openstack/dnsmasq-dns-54dc9c94cc-wgn4d" Feb 19 22:45:22 crc kubenswrapper[4795]: I0219 22:45:22.883538 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00c8c1c0-da57-4169-a42c-b52386ed3112-dns-svc\") pod \"dnsmasq-dns-54dc9c94cc-wgn4d\" (UID: \"00c8c1c0-da57-4169-a42c-b52386ed3112\") " pod="openstack/dnsmasq-dns-54dc9c94cc-wgn4d" Feb 19 22:45:22 crc kubenswrapper[4795]: I0219 22:45:22.883568 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpmqb\" (UniqueName: \"kubernetes.io/projected/00c8c1c0-da57-4169-a42c-b52386ed3112-kube-api-access-vpmqb\") pod \"dnsmasq-dns-54dc9c94cc-wgn4d\" (UID: \"00c8c1c0-da57-4169-a42c-b52386ed3112\") " pod="openstack/dnsmasq-dns-54dc9c94cc-wgn4d" Feb 19 22:45:22 crc kubenswrapper[4795]: I0219 22:45:22.884212 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00c8c1c0-da57-4169-a42c-b52386ed3112-config\") pod \"dnsmasq-dns-54dc9c94cc-wgn4d\" (UID: \"00c8c1c0-da57-4169-a42c-b52386ed3112\") " pod="openstack/dnsmasq-dns-54dc9c94cc-wgn4d" Feb 19 22:45:22 crc kubenswrapper[4795]: I0219 22:45:22.884460 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00c8c1c0-da57-4169-a42c-b52386ed3112-dns-svc\") pod \"dnsmasq-dns-54dc9c94cc-wgn4d\" (UID: \"00c8c1c0-da57-4169-a42c-b52386ed3112\") " pod="openstack/dnsmasq-dns-54dc9c94cc-wgn4d" Feb 19 22:45:22 crc kubenswrapper[4795]: I0219 22:45:22.918225 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-vpmqb\" (UniqueName: \"kubernetes.io/projected/00c8c1c0-da57-4169-a42c-b52386ed3112-kube-api-access-vpmqb\") pod \"dnsmasq-dns-54dc9c94cc-wgn4d\" (UID: \"00c8c1c0-da57-4169-a42c-b52386ed3112\") " pod="openstack/dnsmasq-dns-54dc9c94cc-wgn4d" Feb 19 22:45:22 crc kubenswrapper[4795]: I0219 22:45:22.958481 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54dc9c94cc-wgn4d" Feb 19 22:45:23 crc kubenswrapper[4795]: I0219 22:45:23.314773 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 19 22:45:23 crc kubenswrapper[4795]: I0219 22:45:23.380888 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54dc9c94cc-wgn4d"] Feb 19 22:45:23 crc kubenswrapper[4795]: I0219 22:45:23.971357 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 22:45:24 crc kubenswrapper[4795]: I0219 22:45:24.309200 4795 generic.go:334] "Generic (PLEG): container finished" podID="00c8c1c0-da57-4169-a42c-b52386ed3112" containerID="99d134900f88fca6c416350229ec6ebd6aa32403658dd7fa964885fc2b39579a" exitCode=0 Feb 19 22:45:24 crc kubenswrapper[4795]: I0219 22:45:24.309248 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54dc9c94cc-wgn4d" event={"ID":"00c8c1c0-da57-4169-a42c-b52386ed3112","Type":"ContainerDied","Data":"99d134900f88fca6c416350229ec6ebd6aa32403658dd7fa964885fc2b39579a"} Feb 19 22:45:24 crc kubenswrapper[4795]: I0219 22:45:24.309279 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54dc9c94cc-wgn4d" event={"ID":"00c8c1c0-da57-4169-a42c-b52386ed3112","Type":"ContainerStarted","Data":"188ca574ffaf1ffa388763be3f13a7eb4afedf6a896f0e7de263093402b89351"} Feb 19 22:45:25 crc kubenswrapper[4795]: I0219 22:45:25.258055 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" 
podUID="9cd04173-2975-46bd-8602-f6561387d717" containerName="rabbitmq" containerID="cri-o://193aac651e1b8dba4dea79907fcffd2ee3b656e5be68313f2c158376eeedde01" gracePeriod=604799 Feb 19 22:45:25 crc kubenswrapper[4795]: I0219 22:45:25.317861 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54dc9c94cc-wgn4d" event={"ID":"00c8c1c0-da57-4169-a42c-b52386ed3112","Type":"ContainerStarted","Data":"00f889ab0dea3cf5542a0ac31c64828fab8a4969385f74afc960d2da8d71af69"} Feb 19 22:45:25 crc kubenswrapper[4795]: I0219 22:45:25.319098 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-54dc9c94cc-wgn4d" Feb 19 22:45:25 crc kubenswrapper[4795]: I0219 22:45:25.340216 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-54dc9c94cc-wgn4d" podStartSLOduration=3.340197004 podStartE2EDuration="3.340197004s" podCreationTimestamp="2026-02-19 22:45:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:45:25.333985115 +0000 UTC m=+4636.526502979" watchObservedRunningTime="2026-02-19 22:45:25.340197004 +0000 UTC m=+4636.532714868" Feb 19 22:45:25 crc kubenswrapper[4795]: I0219 22:45:25.960812 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="bd43ca2d-c7e3-4fc7-84a5-74b50cadd268" containerName="rabbitmq" containerID="cri-o://5acc4978acac0af934a6a762d2d3737171002b4cd362487321c37a56c03b3889" gracePeriod=604799 Feb 19 22:45:29 crc kubenswrapper[4795]: I0219 22:45:29.956842 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="9cd04173-2975-46bd-8602-f6561387d717" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.243:5672: connect: connection refused" Feb 19 22:45:30 crc kubenswrapper[4795]: I0219 22:45:30.244358 4795 prober.go:107] "Probe 
failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="bd43ca2d-c7e3-4fc7-84a5-74b50cadd268" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.244:5672: connect: connection refused" Feb 19 22:45:31 crc kubenswrapper[4795]: I0219 22:45:31.913842 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.029248 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9cd04173-2975-46bd-8602-f6561387d717-erlang-cookie-secret\") pod \"9cd04173-2975-46bd-8602-f6561387d717\" (UID: \"9cd04173-2975-46bd-8602-f6561387d717\") " Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.029450 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-02d3a85d-dead-46fd-a32b-ccf61c027c3f\") pod \"9cd04173-2975-46bd-8602-f6561387d717\" (UID: \"9cd04173-2975-46bd-8602-f6561387d717\") " Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.029495 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9cd04173-2975-46bd-8602-f6561387d717-rabbitmq-plugins\") pod \"9cd04173-2975-46bd-8602-f6561387d717\" (UID: \"9cd04173-2975-46bd-8602-f6561387d717\") " Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.029534 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9cd04173-2975-46bd-8602-f6561387d717-rabbitmq-confd\") pod \"9cd04173-2975-46bd-8602-f6561387d717\" (UID: \"9cd04173-2975-46bd-8602-f6561387d717\") " Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.029584 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9cd04173-2975-46bd-8602-f6561387d717-plugins-conf\") pod \"9cd04173-2975-46bd-8602-f6561387d717\" (UID: \"9cd04173-2975-46bd-8602-f6561387d717\") " Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.029600 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9cd04173-2975-46bd-8602-f6561387d717-server-conf\") pod \"9cd04173-2975-46bd-8602-f6561387d717\" (UID: \"9cd04173-2975-46bd-8602-f6561387d717\") " Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.029621 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9cd04173-2975-46bd-8602-f6561387d717-pod-info\") pod \"9cd04173-2975-46bd-8602-f6561387d717\" (UID: \"9cd04173-2975-46bd-8602-f6561387d717\") " Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.029693 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9cd04173-2975-46bd-8602-f6561387d717-rabbitmq-erlang-cookie\") pod \"9cd04173-2975-46bd-8602-f6561387d717\" (UID: \"9cd04173-2975-46bd-8602-f6561387d717\") " Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.029742 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wnbmk\" (UniqueName: \"kubernetes.io/projected/9cd04173-2975-46bd-8602-f6561387d717-kube-api-access-wnbmk\") pod \"9cd04173-2975-46bd-8602-f6561387d717\" (UID: \"9cd04173-2975-46bd-8602-f6561387d717\") " Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.030617 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9cd04173-2975-46bd-8602-f6561387d717-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "9cd04173-2975-46bd-8602-f6561387d717" (UID: "9cd04173-2975-46bd-8602-f6561387d717"). 
InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.031022 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9cd04173-2975-46bd-8602-f6561387d717-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "9cd04173-2975-46bd-8602-f6561387d717" (UID: "9cd04173-2975-46bd-8602-f6561387d717"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.031060 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9cd04173-2975-46bd-8602-f6561387d717-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "9cd04173-2975-46bd-8602-f6561387d717" (UID: "9cd04173-2975-46bd-8602-f6561387d717"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.034988 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cd04173-2975-46bd-8602-f6561387d717-kube-api-access-wnbmk" (OuterVolumeSpecName: "kube-api-access-wnbmk") pod "9cd04173-2975-46bd-8602-f6561387d717" (UID: "9cd04173-2975-46bd-8602-f6561387d717"). InnerVolumeSpecName "kube-api-access-wnbmk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.035415 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cd04173-2975-46bd-8602-f6561387d717-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "9cd04173-2975-46bd-8602-f6561387d717" (UID: "9cd04173-2975-46bd-8602-f6561387d717"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.036231 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/9cd04173-2975-46bd-8602-f6561387d717-pod-info" (OuterVolumeSpecName: "pod-info") pod "9cd04173-2975-46bd-8602-f6561387d717" (UID: "9cd04173-2975-46bd-8602-f6561387d717"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.053860 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9cd04173-2975-46bd-8602-f6561387d717-server-conf" (OuterVolumeSpecName: "server-conf") pod "9cd04173-2975-46bd-8602-f6561387d717" (UID: "9cd04173-2975-46bd-8602-f6561387d717"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.059366 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-02d3a85d-dead-46fd-a32b-ccf61c027c3f" (OuterVolumeSpecName: "persistence") pod "9cd04173-2975-46bd-8602-f6561387d717" (UID: "9cd04173-2975-46bd-8602-f6561387d717"). InnerVolumeSpecName "pvc-02d3a85d-dead-46fd-a32b-ccf61c027c3f". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.111645 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cd04173-2975-46bd-8602-f6561387d717-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "9cd04173-2975-46bd-8602-f6561387d717" (UID: "9cd04173-2975-46bd-8602-f6561387d717"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.131694 4795 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9cd04173-2975-46bd-8602-f6561387d717-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.131806 4795 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-02d3a85d-dead-46fd-a32b-ccf61c027c3f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-02d3a85d-dead-46fd-a32b-ccf61c027c3f\") on node \"crc\" " Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.131826 4795 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9cd04173-2975-46bd-8602-f6561387d717-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.131840 4795 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9cd04173-2975-46bd-8602-f6561387d717-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.131854 4795 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9cd04173-2975-46bd-8602-f6561387d717-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.131864 4795 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9cd04173-2975-46bd-8602-f6561387d717-server-conf\") on node \"crc\" DevicePath \"\"" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.131874 4795 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9cd04173-2975-46bd-8602-f6561387d717-pod-info\") on node \"crc\" DevicePath \"\"" Feb 19 22:45:32 crc 
kubenswrapper[4795]: I0219 22:45:32.131887 4795 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9cd04173-2975-46bd-8602-f6561387d717-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.131899 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wnbmk\" (UniqueName: \"kubernetes.io/projected/9cd04173-2975-46bd-8602-f6561387d717-kube-api-access-wnbmk\") on node \"crc\" DevicePath \"\"" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.146905 4795 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.147442 4795 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-02d3a85d-dead-46fd-a32b-ccf61c027c3f" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-02d3a85d-dead-46fd-a32b-ccf61c027c3f") on node "crc" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.233253 4795 reconciler_common.go:293] "Volume detached for volume \"pvc-02d3a85d-dead-46fd-a32b-ccf61c027c3f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-02d3a85d-dead-46fd-a32b-ccf61c027c3f\") on node \"crc\" DevicePath \"\"" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.377179 4795 generic.go:334] "Generic (PLEG): container finished" podID="bd43ca2d-c7e3-4fc7-84a5-74b50cadd268" containerID="5acc4978acac0af934a6a762d2d3737171002b4cd362487321c37a56c03b3889" exitCode=0 Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.377242 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268","Type":"ContainerDied","Data":"5acc4978acac0af934a6a762d2d3737171002b4cd362487321c37a56c03b3889"} Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.379119 4795 generic.go:334] 
"Generic (PLEG): container finished" podID="9cd04173-2975-46bd-8602-f6561387d717" containerID="193aac651e1b8dba4dea79907fcffd2ee3b656e5be68313f2c158376eeedde01" exitCode=0 Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.379280 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9cd04173-2975-46bd-8602-f6561387d717","Type":"ContainerDied","Data":"193aac651e1b8dba4dea79907fcffd2ee3b656e5be68313f2c158376eeedde01"} Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.379350 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9cd04173-2975-46bd-8602-f6561387d717","Type":"ContainerDied","Data":"8ac632b756081c55272b5e493bca0401eb5a5d61edfe7cea78f2545acf8b31bd"} Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.379371 4795 scope.go:117] "RemoveContainer" containerID="193aac651e1b8dba4dea79907fcffd2ee3b656e5be68313f2c158376eeedde01" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.379570 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.404838 4795 scope.go:117] "RemoveContainer" containerID="ef3bf36d2dad06468cc4add7eb9c2429b178c38643511387b593eff67aac784a" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.427659 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.449455 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.468845 4795 scope.go:117] "RemoveContainer" containerID="193aac651e1b8dba4dea79907fcffd2ee3b656e5be68313f2c158376eeedde01" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.468988 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 19 22:45:32 crc kubenswrapper[4795]: E0219 22:45:32.469432 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cd04173-2975-46bd-8602-f6561387d717" containerName="rabbitmq" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.469457 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cd04173-2975-46bd-8602-f6561387d717" containerName="rabbitmq" Feb 19 22:45:32 crc kubenswrapper[4795]: E0219 22:45:32.469472 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cd04173-2975-46bd-8602-f6561387d717" containerName="setup-container" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.469491 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cd04173-2975-46bd-8602-f6561387d717" containerName="setup-container" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.469695 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cd04173-2975-46bd-8602-f6561387d717" containerName="rabbitmq" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.470812 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.471541 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.474440 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.476259 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.476400 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.476620 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-rbx8c" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.476733 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 19 22:45:32 crc kubenswrapper[4795]: E0219 22:45:32.477443 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"193aac651e1b8dba4dea79907fcffd2ee3b656e5be68313f2c158376eeedde01\": container with ID starting with 193aac651e1b8dba4dea79907fcffd2ee3b656e5be68313f2c158376eeedde01 not found: ID does not exist" containerID="193aac651e1b8dba4dea79907fcffd2ee3b656e5be68313f2c158376eeedde01" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.477504 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"193aac651e1b8dba4dea79907fcffd2ee3b656e5be68313f2c158376eeedde01"} err="failed to get container status \"193aac651e1b8dba4dea79907fcffd2ee3b656e5be68313f2c158376eeedde01\": rpc error: code = NotFound desc = could not find container \"193aac651e1b8dba4dea79907fcffd2ee3b656e5be68313f2c158376eeedde01\": container 
with ID starting with 193aac651e1b8dba4dea79907fcffd2ee3b656e5be68313f2c158376eeedde01 not found: ID does not exist" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.477540 4795 scope.go:117] "RemoveContainer" containerID="ef3bf36d2dad06468cc4add7eb9c2429b178c38643511387b593eff67aac784a" Feb 19 22:45:32 crc kubenswrapper[4795]: E0219 22:45:32.477922 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef3bf36d2dad06468cc4add7eb9c2429b178c38643511387b593eff67aac784a\": container with ID starting with ef3bf36d2dad06468cc4add7eb9c2429b178c38643511387b593eff67aac784a not found: ID does not exist" containerID="ef3bf36d2dad06468cc4add7eb9c2429b178c38643511387b593eff67aac784a" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.477953 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef3bf36d2dad06468cc4add7eb9c2429b178c38643511387b593eff67aac784a"} err="failed to get container status \"ef3bf36d2dad06468cc4add7eb9c2429b178c38643511387b593eff67aac784a\": rpc error: code = NotFound desc = could not find container \"ef3bf36d2dad06468cc4add7eb9c2429b178c38643511387b593eff67aac784a\": container with ID starting with ef3bf36d2dad06468cc4add7eb9c2429b178c38643511387b593eff67aac784a not found: ID does not exist" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.546996 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-02d3a85d-dead-46fd-a32b-ccf61c027c3f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-02d3a85d-dead-46fd-a32b-ccf61c027c3f\") pod \"rabbitmq-server-0\" (UID: \"8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b\") " pod="openstack/rabbitmq-server-0" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.547048 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b\") " pod="openstack/rabbitmq-server-0" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.547106 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b\") " pod="openstack/rabbitmq-server-0" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.547132 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cfzv\" (UniqueName: \"kubernetes.io/projected/8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b-kube-api-access-8cfzv\") pod \"rabbitmq-server-0\" (UID: \"8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b\") " pod="openstack/rabbitmq-server-0" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.547182 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b\") " pod="openstack/rabbitmq-server-0" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.547208 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b\") " pod="openstack/rabbitmq-server-0" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.547254 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b\") " pod="openstack/rabbitmq-server-0" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.547276 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b\") " pod="openstack/rabbitmq-server-0" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.547291 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b\") " pod="openstack/rabbitmq-server-0" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.606476 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.648458 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-rabbitmq-confd\") pod \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\" (UID: \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\") " Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.648572 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-plugins-conf\") pod \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\" (UID: \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\") " Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.648609 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-pod-info\") pod \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\" (UID: \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\") " Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.648681 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24vlt\" (UniqueName: \"kubernetes.io/projected/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-kube-api-access-24vlt\") pod \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\" (UID: \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\") " Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.649414 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-server-conf\") pod \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\" (UID: \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\") " Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.649468 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-rabbitmq-erlang-cookie\") pod \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\" (UID: \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\") " Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.649503 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-erlang-cookie-secret\") pod \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\" (UID: \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\") " Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.649533 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-rabbitmq-plugins\") pod \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\" (UID: \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\") " Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.649674 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d48a63ad-4da2-47a4-b3c8-cad5fc9646bb\") pod \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\" (UID: \"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268\") " Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.649878 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b\") " pod="openstack/rabbitmq-server-0" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.649907 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: 
\"8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b\") " pod="openstack/rabbitmq-server-0" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.650002 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b\") " pod="openstack/rabbitmq-server-0" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.650097 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-02d3a85d-dead-46fd-a32b-ccf61c027c3f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-02d3a85d-dead-46fd-a32b-ccf61c027c3f\") pod \"rabbitmq-server-0\" (UID: \"8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b\") " pod="openstack/rabbitmq-server-0" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.650128 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b\") " pod="openstack/rabbitmq-server-0" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.650218 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b\") " pod="openstack/rabbitmq-server-0" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.650242 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cfzv\" (UniqueName: \"kubernetes.io/projected/8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b-kube-api-access-8cfzv\") pod \"rabbitmq-server-0\" (UID: \"8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b\") " pod="openstack/rabbitmq-server-0" Feb 19 22:45:32 crc kubenswrapper[4795]: 
I0219 22:45:32.650330 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b\") " pod="openstack/rabbitmq-server-0" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.649416 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "bd43ca2d-c7e3-4fc7-84a5-74b50cadd268" (UID: "bd43ca2d-c7e3-4fc7-84a5-74b50cadd268"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.650337 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "bd43ca2d-c7e3-4fc7-84a5-74b50cadd268" (UID: "bd43ca2d-c7e3-4fc7-84a5-74b50cadd268"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.650364 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b\") " pod="openstack/rabbitmq-server-0" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.650560 4795 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.650586 4795 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.651232 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b\") " pod="openstack/rabbitmq-server-0" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.651308 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "bd43ca2d-c7e3-4fc7-84a5-74b50cadd268" (UID: "bd43ca2d-c7e3-4fc7-84a5-74b50cadd268"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.651719 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b\") " pod="openstack/rabbitmq-server-0" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.652341 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b\") " pod="openstack/rabbitmq-server-0" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.652570 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b\") " pod="openstack/rabbitmq-server-0" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.653483 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-pod-info" (OuterVolumeSpecName: "pod-info") pod "bd43ca2d-c7e3-4fc7-84a5-74b50cadd268" (UID: "bd43ca2d-c7e3-4fc7-84a5-74b50cadd268"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.655349 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b\") " pod="openstack/rabbitmq-server-0" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.655628 4795 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.655718 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-02d3a85d-dead-46fd-a32b-ccf61c027c3f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-02d3a85d-dead-46fd-a32b-ccf61c027c3f\") pod \"rabbitmq-server-0\" (UID: \"8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/68b77c70021788ffaf78dfe86ddece9c7d3d5c9cffb40f46dc15f8b79fc094aa/globalmount\"" pod="openstack/rabbitmq-server-0" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.661382 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "bd43ca2d-c7e3-4fc7-84a5-74b50cadd268" (UID: "bd43ca2d-c7e3-4fc7-84a5-74b50cadd268"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.662997 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-kube-api-access-24vlt" (OuterVolumeSpecName: "kube-api-access-24vlt") pod "bd43ca2d-c7e3-4fc7-84a5-74b50cadd268" (UID: "bd43ca2d-c7e3-4fc7-84a5-74b50cadd268"). 
InnerVolumeSpecName "kube-api-access-24vlt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.670078 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b\") " pod="openstack/rabbitmq-server-0" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.671863 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d48a63ad-4da2-47a4-b3c8-cad5fc9646bb" (OuterVolumeSpecName: "persistence") pod "bd43ca2d-c7e3-4fc7-84a5-74b50cadd268" (UID: "bd43ca2d-c7e3-4fc7-84a5-74b50cadd268"). InnerVolumeSpecName "pvc-d48a63ad-4da2-47a4-b3c8-cad5fc9646bb". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.672046 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b\") " pod="openstack/rabbitmq-server-0" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.674275 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cfzv\" (UniqueName: \"kubernetes.io/projected/8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b-kube-api-access-8cfzv\") pod \"rabbitmq-server-0\" (UID: \"8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b\") " pod="openstack/rabbitmq-server-0" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.699619 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-02d3a85d-dead-46fd-a32b-ccf61c027c3f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-02d3a85d-dead-46fd-a32b-ccf61c027c3f\") pod \"rabbitmq-server-0\" (UID: 
\"8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b\") " pod="openstack/rabbitmq-server-0" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.709066 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-server-conf" (OuterVolumeSpecName: "server-conf") pod "bd43ca2d-c7e3-4fc7-84a5-74b50cadd268" (UID: "bd43ca2d-c7e3-4fc7-84a5-74b50cadd268"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.752268 4795 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-pod-info\") on node \"crc\" DevicePath \"\"" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.752307 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24vlt\" (UniqueName: \"kubernetes.io/projected/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-kube-api-access-24vlt\") on node \"crc\" DevicePath \"\"" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.752322 4795 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-server-conf\") on node \"crc\" DevicePath \"\"" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.752333 4795 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.752345 4795 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.752381 4795 reconciler_common.go:286] "operationExecutor.UnmountDevice started 
for volume \"pvc-d48a63ad-4da2-47a4-b3c8-cad5fc9646bb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d48a63ad-4da2-47a4-b3c8-cad5fc9646bb\") on node \"crc\" " Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.765368 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "bd43ca2d-c7e3-4fc7-84a5-74b50cadd268" (UID: "bd43ca2d-c7e3-4fc7-84a5-74b50cadd268"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.770136 4795 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.770311 4795 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-d48a63ad-4da2-47a4-b3c8-cad5fc9646bb" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d48a63ad-4da2-47a4-b3c8-cad5fc9646bb") on node "crc" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.791231 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.853558 4795 reconciler_common.go:293] "Volume detached for volume \"pvc-d48a63ad-4da2-47a4-b3c8-cad5fc9646bb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d48a63ad-4da2-47a4-b3c8-cad5fc9646bb\") on node \"crc\" DevicePath \"\"" Feb 19 22:45:32 crc kubenswrapper[4795]: I0219 22:45:32.853814 4795 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:32.962240 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-54dc9c94cc-wgn4d" Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.015731 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-589cf688cc-vf5s4"] Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.016039 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-589cf688cc-vf5s4" podUID="cc119db7-03db-4838-b663-f244b7f93433" containerName="dnsmasq-dns" containerID="cri-o://64136e69fe434b4b0bfaa8f633f2393606df65c80063c9e28ce2ef61c86da556" gracePeriod=10 Feb 19 22:45:33 crc kubenswrapper[4795]: E0219 22:45:33.189503 4795 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcc119db7_03db_4838_b663_f244b7f93433.slice/crio-conmon-64136e69fe434b4b0bfaa8f633f2393606df65c80063c9e28ce2ef61c86da556.scope\": RecentStats: unable to find data in memory cache]" Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.193428 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.394037 4795 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b","Type":"ContainerStarted","Data":"25a4dd5a66aafe90ee24944e63798abc6d2b3f2338aa1a74cc4b99fcedce95a9"} Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.397379 4795 generic.go:334] "Generic (PLEG): container finished" podID="cc119db7-03db-4838-b663-f244b7f93433" containerID="64136e69fe434b4b0bfaa8f633f2393606df65c80063c9e28ce2ef61c86da556" exitCode=0 Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.397466 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589cf688cc-vf5s4" event={"ID":"cc119db7-03db-4838-b663-f244b7f93433","Type":"ContainerDied","Data":"64136e69fe434b4b0bfaa8f633f2393606df65c80063c9e28ce2ef61c86da556"} Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.400273 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.400311 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bd43ca2d-c7e3-4fc7-84a5-74b50cadd268","Type":"ContainerDied","Data":"e602b9b95a59c91b0341f4b50dac51b129fce0661e01cc211b5bddd7e49efb66"} Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.400392 4795 scope.go:117] "RemoveContainer" containerID="5acc4978acac0af934a6a762d2d3737171002b4cd362487321c37a56c03b3889" Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.438858 4795 scope.go:117] "RemoveContainer" containerID="48d9ac5474c94a11a6da74ec61c602260d05f7961e1a831acf6b59d650ddb95f" Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.440950 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.446738 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 22:45:33 crc 
kubenswrapper[4795]: I0219 22:45:33.467576 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 22:45:33 crc kubenswrapper[4795]: E0219 22:45:33.467856 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd43ca2d-c7e3-4fc7-84a5-74b50cadd268" containerName="rabbitmq" Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.467870 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd43ca2d-c7e3-4fc7-84a5-74b50cadd268" containerName="rabbitmq" Feb 19 22:45:33 crc kubenswrapper[4795]: E0219 22:45:33.467887 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd43ca2d-c7e3-4fc7-84a5-74b50cadd268" containerName="setup-container" Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.467895 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd43ca2d-c7e3-4fc7-84a5-74b50cadd268" containerName="setup-container" Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.468025 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd43ca2d-c7e3-4fc7-84a5-74b50cadd268" containerName="rabbitmq" Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.468786 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.471783 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.471876 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.471931 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-rh88g" Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.472034 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.472108 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.537130 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9cd04173-2975-46bd-8602-f6561387d717" path="/var/lib/kubelet/pods/9cd04173-2975-46bd-8602-f6561387d717/volumes" Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.538535 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd43ca2d-c7e3-4fc7-84a5-74b50cadd268" path="/var/lib/kubelet/pods/bd43ca2d-c7e3-4fc7-84a5-74b50cadd268/volumes" Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.539042 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.565533 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ce76f8f5-4383-4be1-ab7b-cf862ae77025-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ce76f8f5-4383-4be1-ab7b-cf862ae77025\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 
22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.565577 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ce76f8f5-4383-4be1-ab7b-cf862ae77025-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ce76f8f5-4383-4be1-ab7b-cf862ae77025\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.565602 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d48a63ad-4da2-47a4-b3c8-cad5fc9646bb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d48a63ad-4da2-47a4-b3c8-cad5fc9646bb\") pod \"rabbitmq-cell1-server-0\" (UID: \"ce76f8f5-4383-4be1-ab7b-cf862ae77025\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.565620 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ce76f8f5-4383-4be1-ab7b-cf862ae77025-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ce76f8f5-4383-4be1-ab7b-cf862ae77025\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.565856 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ce76f8f5-4383-4be1-ab7b-cf862ae77025-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ce76f8f5-4383-4be1-ab7b-cf862ae77025\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.565878 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ce76f8f5-4383-4be1-ab7b-cf862ae77025-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ce76f8f5-4383-4be1-ab7b-cf862ae77025\") " 
pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.565977 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ce76f8f5-4383-4be1-ab7b-cf862ae77025-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ce76f8f5-4383-4be1-ab7b-cf862ae77025\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.566008 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-png8h\" (UniqueName: \"kubernetes.io/projected/ce76f8f5-4383-4be1-ab7b-cf862ae77025-kube-api-access-png8h\") pod \"rabbitmq-cell1-server-0\" (UID: \"ce76f8f5-4383-4be1-ab7b-cf862ae77025\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.566030 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ce76f8f5-4383-4be1-ab7b-cf862ae77025-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ce76f8f5-4383-4be1-ab7b-cf862ae77025\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.668222 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ce76f8f5-4383-4be1-ab7b-cf862ae77025-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ce76f8f5-4383-4be1-ab7b-cf862ae77025\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.668292 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-png8h\" (UniqueName: \"kubernetes.io/projected/ce76f8f5-4383-4be1-ab7b-cf862ae77025-kube-api-access-png8h\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"ce76f8f5-4383-4be1-ab7b-cf862ae77025\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.668348 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ce76f8f5-4383-4be1-ab7b-cf862ae77025-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ce76f8f5-4383-4be1-ab7b-cf862ae77025\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.668372 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ce76f8f5-4383-4be1-ab7b-cf862ae77025-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ce76f8f5-4383-4be1-ab7b-cf862ae77025\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.668447 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ce76f8f5-4383-4be1-ab7b-cf862ae77025-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ce76f8f5-4383-4be1-ab7b-cf862ae77025\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.668472 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d48a63ad-4da2-47a4-b3c8-cad5fc9646bb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d48a63ad-4da2-47a4-b3c8-cad5fc9646bb\") pod \"rabbitmq-cell1-server-0\" (UID: \"ce76f8f5-4383-4be1-ab7b-cf862ae77025\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.668490 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ce76f8f5-4383-4be1-ab7b-cf862ae77025-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ce76f8f5-4383-4be1-ab7b-cf862ae77025\") " 
pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.668553 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ce76f8f5-4383-4be1-ab7b-cf862ae77025-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ce76f8f5-4383-4be1-ab7b-cf862ae77025\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.668575 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ce76f8f5-4383-4be1-ab7b-cf862ae77025-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ce76f8f5-4383-4be1-ab7b-cf862ae77025\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.669790 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ce76f8f5-4383-4be1-ab7b-cf862ae77025-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ce76f8f5-4383-4be1-ab7b-cf862ae77025\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.670297 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ce76f8f5-4383-4be1-ab7b-cf862ae77025-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ce76f8f5-4383-4be1-ab7b-cf862ae77025\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.670482 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ce76f8f5-4383-4be1-ab7b-cf862ae77025-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ce76f8f5-4383-4be1-ab7b-cf862ae77025\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.671402 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ce76f8f5-4383-4be1-ab7b-cf862ae77025-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ce76f8f5-4383-4be1-ab7b-cf862ae77025\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.673132 4795 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.673152 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d48a63ad-4da2-47a4-b3c8-cad5fc9646bb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d48a63ad-4da2-47a4-b3c8-cad5fc9646bb\") pod \"rabbitmq-cell1-server-0\" (UID: \"ce76f8f5-4383-4be1-ab7b-cf862ae77025\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/ce7e4e38eafbc5e4f450d6daa37e1c166de00b3db9bc92a3ff92b374f626b390/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.676885 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ce76f8f5-4383-4be1-ab7b-cf862ae77025-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ce76f8f5-4383-4be1-ab7b-cf862ae77025\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.679937 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ce76f8f5-4383-4be1-ab7b-cf862ae77025-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ce76f8f5-4383-4be1-ab7b-cf862ae77025\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.683680 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/ce76f8f5-4383-4be1-ab7b-cf862ae77025-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ce76f8f5-4383-4be1-ab7b-cf862ae77025\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.694586 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-png8h\" (UniqueName: \"kubernetes.io/projected/ce76f8f5-4383-4be1-ab7b-cf862ae77025-kube-api-access-png8h\") pod \"rabbitmq-cell1-server-0\" (UID: \"ce76f8f5-4383-4be1-ab7b-cf862ae77025\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.762222 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d48a63ad-4da2-47a4-b3c8-cad5fc9646bb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d48a63ad-4da2-47a4-b3c8-cad5fc9646bb\") pod \"rabbitmq-cell1-server-0\" (UID: \"ce76f8f5-4383-4be1-ab7b-cf862ae77025\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.840796 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.950209 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-589cf688cc-vf5s4" Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.972975 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc119db7-03db-4838-b663-f244b7f93433-dns-svc\") pod \"cc119db7-03db-4838-b663-f244b7f93433\" (UID: \"cc119db7-03db-4838-b663-f244b7f93433\") " Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.973615 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc119db7-03db-4838-b663-f244b7f93433-config\") pod \"cc119db7-03db-4838-b663-f244b7f93433\" (UID: \"cc119db7-03db-4838-b663-f244b7f93433\") " Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.973763 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-784t4\" (UniqueName: \"kubernetes.io/projected/cc119db7-03db-4838-b663-f244b7f93433-kube-api-access-784t4\") pod \"cc119db7-03db-4838-b663-f244b7f93433\" (UID: \"cc119db7-03db-4838-b663-f244b7f93433\") " Feb 19 22:45:33 crc kubenswrapper[4795]: I0219 22:45:33.978586 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc119db7-03db-4838-b663-f244b7f93433-kube-api-access-784t4" (OuterVolumeSpecName: "kube-api-access-784t4") pod "cc119db7-03db-4838-b663-f244b7f93433" (UID: "cc119db7-03db-4838-b663-f244b7f93433"). InnerVolumeSpecName "kube-api-access-784t4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:45:34 crc kubenswrapper[4795]: I0219 22:45:34.007013 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc119db7-03db-4838-b663-f244b7f93433-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cc119db7-03db-4838-b663-f244b7f93433" (UID: "cc119db7-03db-4838-b663-f244b7f93433"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:45:34 crc kubenswrapper[4795]: I0219 22:45:34.009983 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc119db7-03db-4838-b663-f244b7f93433-config" (OuterVolumeSpecName: "config") pod "cc119db7-03db-4838-b663-f244b7f93433" (UID: "cc119db7-03db-4838-b663-f244b7f93433"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:45:34 crc kubenswrapper[4795]: I0219 22:45:34.076989 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc119db7-03db-4838-b663-f244b7f93433-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 22:45:34 crc kubenswrapper[4795]: I0219 22:45:34.077032 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc119db7-03db-4838-b663-f244b7f93433-config\") on node \"crc\" DevicePath \"\"" Feb 19 22:45:34 crc kubenswrapper[4795]: I0219 22:45:34.077046 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-784t4\" (UniqueName: \"kubernetes.io/projected/cc119db7-03db-4838-b663-f244b7f93433-kube-api-access-784t4\") on node \"crc\" DevicePath \"\"" Feb 19 22:45:34 crc kubenswrapper[4795]: I0219 22:45:34.282581 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 22:45:34 crc kubenswrapper[4795]: W0219 22:45:34.286205 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce76f8f5_4383_4be1_ab7b_cf862ae77025.slice/crio-da4bb88a1ed67001533a76eba8ca9889a21fe46aab0f553e17159712c917c609 WatchSource:0}: Error finding container da4bb88a1ed67001533a76eba8ca9889a21fe46aab0f553e17159712c917c609: Status 404 returned error can't find the container with id da4bb88a1ed67001533a76eba8ca9889a21fe46aab0f553e17159712c917c609 Feb 19 22:45:34 crc 
kubenswrapper[4795]: I0219 22:45:34.421524 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b","Type":"ContainerStarted","Data":"95a2af03cbe9d22b98dd6b3c065e4321753cc8882241a21052ddddc4c34902ad"} Feb 19 22:45:34 crc kubenswrapper[4795]: I0219 22:45:34.425205 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-589cf688cc-vf5s4" Feb 19 22:45:34 crc kubenswrapper[4795]: I0219 22:45:34.425197 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589cf688cc-vf5s4" event={"ID":"cc119db7-03db-4838-b663-f244b7f93433","Type":"ContainerDied","Data":"8e504bc8e8afcb8d881cb268684b0b1a7bedc4275f260a1ee4b205b460ca8f3e"} Feb 19 22:45:34 crc kubenswrapper[4795]: I0219 22:45:34.425467 4795 scope.go:117] "RemoveContainer" containerID="64136e69fe434b4b0bfaa8f633f2393606df65c80063c9e28ce2ef61c86da556" Feb 19 22:45:34 crc kubenswrapper[4795]: I0219 22:45:34.428779 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ce76f8f5-4383-4be1-ab7b-cf862ae77025","Type":"ContainerStarted","Data":"da4bb88a1ed67001533a76eba8ca9889a21fe46aab0f553e17159712c917c609"} Feb 19 22:45:34 crc kubenswrapper[4795]: I0219 22:45:34.450824 4795 scope.go:117] "RemoveContainer" containerID="6c19182bc802a37ab3a253eaa33c548a0f3f15e1ee3c00a8c016db959a5b8557" Feb 19 22:45:34 crc kubenswrapper[4795]: I0219 22:45:34.486291 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-589cf688cc-vf5s4"] Feb 19 22:45:34 crc kubenswrapper[4795]: I0219 22:45:34.494038 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-589cf688cc-vf5s4"] Feb 19 22:45:35 crc kubenswrapper[4795]: I0219 22:45:35.436532 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"ce76f8f5-4383-4be1-ab7b-cf862ae77025","Type":"ContainerStarted","Data":"28a8d402d4f21cba268822aeb50ee538ba8e4a32a3d16536eaf6ded74f5bbc84"} Feb 19 22:45:35 crc kubenswrapper[4795]: I0219 22:45:35.521066 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc119db7-03db-4838-b663-f244b7f93433" path="/var/lib/kubelet/pods/cc119db7-03db-4838-b663-f244b7f93433/volumes" Feb 19 22:46:07 crc kubenswrapper[4795]: I0219 22:46:07.703919 4795 generic.go:334] "Generic (PLEG): container finished" podID="8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b" containerID="95a2af03cbe9d22b98dd6b3c065e4321753cc8882241a21052ddddc4c34902ad" exitCode=0 Feb 19 22:46:07 crc kubenswrapper[4795]: I0219 22:46:07.704018 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b","Type":"ContainerDied","Data":"95a2af03cbe9d22b98dd6b3c065e4321753cc8882241a21052ddddc4c34902ad"} Feb 19 22:46:08 crc kubenswrapper[4795]: I0219 22:46:08.712320 4795 generic.go:334] "Generic (PLEG): container finished" podID="ce76f8f5-4383-4be1-ab7b-cf862ae77025" containerID="28a8d402d4f21cba268822aeb50ee538ba8e4a32a3d16536eaf6ded74f5bbc84" exitCode=0 Feb 19 22:46:08 crc kubenswrapper[4795]: I0219 22:46:08.712397 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ce76f8f5-4383-4be1-ab7b-cf862ae77025","Type":"ContainerDied","Data":"28a8d402d4f21cba268822aeb50ee538ba8e4a32a3d16536eaf6ded74f5bbc84"} Feb 19 22:46:08 crc kubenswrapper[4795]: I0219 22:46:08.714606 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b","Type":"ContainerStarted","Data":"97689d1d9792b12996c2fc3af997e9b6bd43b1de8113bf314329bc0b6ade5aff"} Feb 19 22:46:08 crc kubenswrapper[4795]: I0219 22:46:08.714826 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Feb 19 
22:46:08 crc kubenswrapper[4795]: I0219 22:46:08.766380 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.766361029 podStartE2EDuration="36.766361029s" podCreationTimestamp="2026-02-19 22:45:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:46:08.762138817 +0000 UTC m=+4679.954656721" watchObservedRunningTime="2026-02-19 22:46:08.766361029 +0000 UTC m=+4679.958878893" Feb 19 22:46:09 crc kubenswrapper[4795]: I0219 22:46:09.726046 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ce76f8f5-4383-4be1-ab7b-cf862ae77025","Type":"ContainerStarted","Data":"2249af915dfcd97ac2ae5ba18ac2a704f0a02d994866f1cbb860ad37ee70a32d"} Feb 19 22:46:09 crc kubenswrapper[4795]: I0219 22:46:09.727507 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:46:09 crc kubenswrapper[4795]: I0219 22:46:09.753827 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.753805986 podStartE2EDuration="36.753805986s" podCreationTimestamp="2026-02-19 22:45:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:46:09.753730614 +0000 UTC m=+4680.946248528" watchObservedRunningTime="2026-02-19 22:46:09.753805986 +0000 UTC m=+4680.946323870" Feb 19 22:46:22 crc kubenswrapper[4795]: I0219 22:46:22.794378 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Feb 19 22:46:23 crc kubenswrapper[4795]: I0219 22:46:23.843642 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Feb 19 22:46:32 crc kubenswrapper[4795]: I0219 
22:46:32.286074 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Feb 19 22:46:32 crc kubenswrapper[4795]: E0219 22:46:32.286816 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc119db7-03db-4838-b663-f244b7f93433" containerName="init" Feb 19 22:46:32 crc kubenswrapper[4795]: I0219 22:46:32.286830 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc119db7-03db-4838-b663-f244b7f93433" containerName="init" Feb 19 22:46:32 crc kubenswrapper[4795]: E0219 22:46:32.286842 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc119db7-03db-4838-b663-f244b7f93433" containerName="dnsmasq-dns" Feb 19 22:46:32 crc kubenswrapper[4795]: I0219 22:46:32.286848 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc119db7-03db-4838-b663-f244b7f93433" containerName="dnsmasq-dns" Feb 19 22:46:32 crc kubenswrapper[4795]: I0219 22:46:32.287001 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc119db7-03db-4838-b663-f244b7f93433" containerName="dnsmasq-dns" Feb 19 22:46:32 crc kubenswrapper[4795]: I0219 22:46:32.287530 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Feb 19 22:46:32 crc kubenswrapper[4795]: I0219 22:46:32.290786 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-p5fxh" Feb 19 22:46:32 crc kubenswrapper[4795]: I0219 22:46:32.297141 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Feb 19 22:46:32 crc kubenswrapper[4795]: I0219 22:46:32.428676 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2pf5\" (UniqueName: \"kubernetes.io/projected/56d10cf2-06d6-4709-a9b9-1b88eb3d6304-kube-api-access-k2pf5\") pod \"mariadb-client\" (UID: \"56d10cf2-06d6-4709-a9b9-1b88eb3d6304\") " pod="openstack/mariadb-client" Feb 19 22:46:32 crc kubenswrapper[4795]: I0219 22:46:32.529520 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2pf5\" (UniqueName: \"kubernetes.io/projected/56d10cf2-06d6-4709-a9b9-1b88eb3d6304-kube-api-access-k2pf5\") pod \"mariadb-client\" (UID: \"56d10cf2-06d6-4709-a9b9-1b88eb3d6304\") " pod="openstack/mariadb-client" Feb 19 22:46:32 crc kubenswrapper[4795]: I0219 22:46:32.627848 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2pf5\" (UniqueName: \"kubernetes.io/projected/56d10cf2-06d6-4709-a9b9-1b88eb3d6304-kube-api-access-k2pf5\") pod \"mariadb-client\" (UID: \"56d10cf2-06d6-4709-a9b9-1b88eb3d6304\") " pod="openstack/mariadb-client" Feb 19 22:46:32 crc kubenswrapper[4795]: I0219 22:46:32.906262 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Feb 19 22:46:33 crc kubenswrapper[4795]: I0219 22:46:33.479018 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Feb 19 22:46:33 crc kubenswrapper[4795]: W0219 22:46:33.485383 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod56d10cf2_06d6_4709_a9b9_1b88eb3d6304.slice/crio-372e10d7a38b279457b4e38e92f499f0063bfb07322a103b1d04a880c05a4be9 WatchSource:0}: Error finding container 372e10d7a38b279457b4e38e92f499f0063bfb07322a103b1d04a880c05a4be9: Status 404 returned error can't find the container with id 372e10d7a38b279457b4e38e92f499f0063bfb07322a103b1d04a880c05a4be9 Feb 19 22:46:33 crc kubenswrapper[4795]: I0219 22:46:33.910771 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"56d10cf2-06d6-4709-a9b9-1b88eb3d6304","Type":"ContainerStarted","Data":"372e10d7a38b279457b4e38e92f499f0063bfb07322a103b1d04a880c05a4be9"} Feb 19 22:46:34 crc kubenswrapper[4795]: I0219 22:46:34.921446 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"56d10cf2-06d6-4709-a9b9-1b88eb3d6304","Type":"ContainerStarted","Data":"6e6b9b06030c29641fb41afc58b73a20301d29aabf850b0e31fe1b9909cd9686"} Feb 19 22:46:34 crc kubenswrapper[4795]: I0219 22:46:34.941148 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-client" podStartSLOduration=2.495445663 podStartE2EDuration="2.941119957s" podCreationTimestamp="2026-02-19 22:46:32 +0000 UTC" firstStartedPulling="2026-02-19 22:46:33.486789846 +0000 UTC m=+4704.679307710" lastFinishedPulling="2026-02-19 22:46:33.93246415 +0000 UTC m=+4705.124982004" observedRunningTime="2026-02-19 22:46:34.938158142 +0000 UTC m=+4706.130676066" watchObservedRunningTime="2026-02-19 22:46:34.941119957 +0000 UTC m=+4706.133637861" Feb 19 22:46:50 crc 
kubenswrapper[4795]: I0219 22:46:50.580224 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Feb 19 22:46:50 crc kubenswrapper[4795]: I0219 22:46:50.581017 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mariadb-client" podUID="56d10cf2-06d6-4709-a9b9-1b88eb3d6304" containerName="mariadb-client" containerID="cri-o://6e6b9b06030c29641fb41afc58b73a20301d29aabf850b0e31fe1b9909cd9686" gracePeriod=30 Feb 19 22:46:51 crc kubenswrapper[4795]: I0219 22:46:51.026392 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Feb 19 22:46:51 crc kubenswrapper[4795]: I0219 22:46:51.031686 4795 generic.go:334] "Generic (PLEG): container finished" podID="56d10cf2-06d6-4709-a9b9-1b88eb3d6304" containerID="6e6b9b06030c29641fb41afc58b73a20301d29aabf850b0e31fe1b9909cd9686" exitCode=143 Feb 19 22:46:51 crc kubenswrapper[4795]: I0219 22:46:51.031727 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"56d10cf2-06d6-4709-a9b9-1b88eb3d6304","Type":"ContainerDied","Data":"6e6b9b06030c29641fb41afc58b73a20301d29aabf850b0e31fe1b9909cd9686"} Feb 19 22:46:51 crc kubenswrapper[4795]: I0219 22:46:51.031757 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"56d10cf2-06d6-4709-a9b9-1b88eb3d6304","Type":"ContainerDied","Data":"372e10d7a38b279457b4e38e92f499f0063bfb07322a103b1d04a880c05a4be9"} Feb 19 22:46:51 crc kubenswrapper[4795]: I0219 22:46:51.031758 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Feb 19 22:46:51 crc kubenswrapper[4795]: I0219 22:46:51.031778 4795 scope.go:117] "RemoveContainer" containerID="6e6b9b06030c29641fb41afc58b73a20301d29aabf850b0e31fe1b9909cd9686" Feb 19 22:46:51 crc kubenswrapper[4795]: I0219 22:46:51.059394 4795 scope.go:117] "RemoveContainer" containerID="6e6b9b06030c29641fb41afc58b73a20301d29aabf850b0e31fe1b9909cd9686" Feb 19 22:46:51 crc kubenswrapper[4795]: E0219 22:46:51.059964 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e6b9b06030c29641fb41afc58b73a20301d29aabf850b0e31fe1b9909cd9686\": container with ID starting with 6e6b9b06030c29641fb41afc58b73a20301d29aabf850b0e31fe1b9909cd9686 not found: ID does not exist" containerID="6e6b9b06030c29641fb41afc58b73a20301d29aabf850b0e31fe1b9909cd9686" Feb 19 22:46:51 crc kubenswrapper[4795]: I0219 22:46:51.059993 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e6b9b06030c29641fb41afc58b73a20301d29aabf850b0e31fe1b9909cd9686"} err="failed to get container status \"6e6b9b06030c29641fb41afc58b73a20301d29aabf850b0e31fe1b9909cd9686\": rpc error: code = NotFound desc = could not find container \"6e6b9b06030c29641fb41afc58b73a20301d29aabf850b0e31fe1b9909cd9686\": container with ID starting with 6e6b9b06030c29641fb41afc58b73a20301d29aabf850b0e31fe1b9909cd9686 not found: ID does not exist" Feb 19 22:46:51 crc kubenswrapper[4795]: I0219 22:46:51.130990 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k2pf5\" (UniqueName: \"kubernetes.io/projected/56d10cf2-06d6-4709-a9b9-1b88eb3d6304-kube-api-access-k2pf5\") pod \"56d10cf2-06d6-4709-a9b9-1b88eb3d6304\" (UID: \"56d10cf2-06d6-4709-a9b9-1b88eb3d6304\") " Feb 19 22:46:51 crc kubenswrapper[4795]: I0219 22:46:51.136946 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/56d10cf2-06d6-4709-a9b9-1b88eb3d6304-kube-api-access-k2pf5" (OuterVolumeSpecName: "kube-api-access-k2pf5") pod "56d10cf2-06d6-4709-a9b9-1b88eb3d6304" (UID: "56d10cf2-06d6-4709-a9b9-1b88eb3d6304"). InnerVolumeSpecName "kube-api-access-k2pf5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:46:51 crc kubenswrapper[4795]: I0219 22:46:51.232533 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k2pf5\" (UniqueName: \"kubernetes.io/projected/56d10cf2-06d6-4709-a9b9-1b88eb3d6304-kube-api-access-k2pf5\") on node \"crc\" DevicePath \"\"" Feb 19 22:46:51 crc kubenswrapper[4795]: I0219 22:46:51.366950 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Feb 19 22:46:51 crc kubenswrapper[4795]: I0219 22:46:51.373390 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Feb 19 22:46:51 crc kubenswrapper[4795]: I0219 22:46:51.521542 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56d10cf2-06d6-4709-a9b9-1b88eb3d6304" path="/var/lib/kubelet/pods/56d10cf2-06d6-4709-a9b9-1b88eb3d6304/volumes" Feb 19 22:46:58 crc kubenswrapper[4795]: I0219 22:46:58.427610 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 22:46:58 crc kubenswrapper[4795]: I0219 22:46:58.428573 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 22:47:14 crc kubenswrapper[4795]: I0219 22:47:14.941301 4795 scope.go:117] 
"RemoveContainer" containerID="fa2e3c7da6ac02ed85489de382e40a9b438bc98edd6481546fefa8394b7e5fce" Feb 19 22:47:28 crc kubenswrapper[4795]: I0219 22:47:28.427848 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 22:47:28 crc kubenswrapper[4795]: I0219 22:47:28.428666 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 22:47:58 crc kubenswrapper[4795]: I0219 22:47:58.428113 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 22:47:58 crc kubenswrapper[4795]: I0219 22:47:58.428965 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 22:47:58 crc kubenswrapper[4795]: I0219 22:47:58.429038 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" Feb 19 22:47:58 crc kubenswrapper[4795]: I0219 22:47:58.430061 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"85a7b1a9bb1dc26b66551125ca13bcd3b5f6f4015b61adf891b31e1f3b13c640"} pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 22:47:58 crc kubenswrapper[4795]: I0219 22:47:58.430208 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" containerID="cri-o://85a7b1a9bb1dc26b66551125ca13bcd3b5f6f4015b61adf891b31e1f3b13c640" gracePeriod=600 Feb 19 22:47:58 crc kubenswrapper[4795]: I0219 22:47:58.611616 4795 generic.go:334] "Generic (PLEG): container finished" podID="7591bc58-96f5-486a-8653-0ad93938b019" containerID="85a7b1a9bb1dc26b66551125ca13bcd3b5f6f4015b61adf891b31e1f3b13c640" exitCode=0 Feb 19 22:47:58 crc kubenswrapper[4795]: I0219 22:47:58.611699 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerDied","Data":"85a7b1a9bb1dc26b66551125ca13bcd3b5f6f4015b61adf891b31e1f3b13c640"} Feb 19 22:47:58 crc kubenswrapper[4795]: I0219 22:47:58.612127 4795 scope.go:117] "RemoveContainer" containerID="2e1a23cb785a07f6fbead34d3a07564cd041d28b6cd22a143a851674ad80e799" Feb 19 22:47:58 crc kubenswrapper[4795]: I0219 22:47:58.955855 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rc44b"] Feb 19 22:47:58 crc kubenswrapper[4795]: E0219 22:47:58.956541 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56d10cf2-06d6-4709-a9b9-1b88eb3d6304" containerName="mariadb-client" Feb 19 22:47:58 crc kubenswrapper[4795]: I0219 22:47:58.956571 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="56d10cf2-06d6-4709-a9b9-1b88eb3d6304" containerName="mariadb-client" Feb 
19 22:47:58 crc kubenswrapper[4795]: I0219 22:47:58.956848 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="56d10cf2-06d6-4709-a9b9-1b88eb3d6304" containerName="mariadb-client" Feb 19 22:47:58 crc kubenswrapper[4795]: I0219 22:47:58.959549 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rc44b" Feb 19 22:47:58 crc kubenswrapper[4795]: I0219 22:47:58.971203 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rc44b"] Feb 19 22:47:59 crc kubenswrapper[4795]: I0219 22:47:59.075455 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53-utilities\") pod \"community-operators-rc44b\" (UID: \"b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53\") " pod="openshift-marketplace/community-operators-rc44b" Feb 19 22:47:59 crc kubenswrapper[4795]: I0219 22:47:59.075513 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnb5b\" (UniqueName: \"kubernetes.io/projected/b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53-kube-api-access-mnb5b\") pod \"community-operators-rc44b\" (UID: \"b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53\") " pod="openshift-marketplace/community-operators-rc44b" Feb 19 22:47:59 crc kubenswrapper[4795]: I0219 22:47:59.075576 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53-catalog-content\") pod \"community-operators-rc44b\" (UID: \"b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53\") " pod="openshift-marketplace/community-operators-rc44b" Feb 19 22:47:59 crc kubenswrapper[4795]: I0219 22:47:59.176483 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53-utilities\") pod \"community-operators-rc44b\" (UID: \"b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53\") " pod="openshift-marketplace/community-operators-rc44b" Feb 19 22:47:59 crc kubenswrapper[4795]: I0219 22:47:59.176773 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnb5b\" (UniqueName: \"kubernetes.io/projected/b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53-kube-api-access-mnb5b\") pod \"community-operators-rc44b\" (UID: \"b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53\") " pod="openshift-marketplace/community-operators-rc44b" Feb 19 22:47:59 crc kubenswrapper[4795]: I0219 22:47:59.176842 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53-catalog-content\") pod \"community-operators-rc44b\" (UID: \"b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53\") " pod="openshift-marketplace/community-operators-rc44b" Feb 19 22:47:59 crc kubenswrapper[4795]: I0219 22:47:59.177116 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53-utilities\") pod \"community-operators-rc44b\" (UID: \"b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53\") " pod="openshift-marketplace/community-operators-rc44b" Feb 19 22:47:59 crc kubenswrapper[4795]: I0219 22:47:59.177261 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53-catalog-content\") pod \"community-operators-rc44b\" (UID: \"b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53\") " pod="openshift-marketplace/community-operators-rc44b" Feb 19 22:47:59 crc kubenswrapper[4795]: I0219 22:47:59.194421 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnb5b\" (UniqueName: 
\"kubernetes.io/projected/b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53-kube-api-access-mnb5b\") pod \"community-operators-rc44b\" (UID: \"b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53\") " pod="openshift-marketplace/community-operators-rc44b" Feb 19 22:47:59 crc kubenswrapper[4795]: I0219 22:47:59.277533 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rc44b" Feb 19 22:47:59 crc kubenswrapper[4795]: I0219 22:47:59.621056 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerStarted","Data":"4bde20d829d39ae7600d60632441672a1a7026beecea3debeab953cdf5de428a"} Feb 19 22:47:59 crc kubenswrapper[4795]: I0219 22:47:59.831949 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rc44b"] Feb 19 22:48:00 crc kubenswrapper[4795]: I0219 22:48:00.636611 4795 generic.go:334] "Generic (PLEG): container finished" podID="b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53" containerID="d930f6ef1c64c1b075c04f57895f270fa2bc9361bd9e8ee4b21259c337c594ac" exitCode=0 Feb 19 22:48:00 crc kubenswrapper[4795]: I0219 22:48:00.636710 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rc44b" event={"ID":"b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53","Type":"ContainerDied","Data":"d930f6ef1c64c1b075c04f57895f270fa2bc9361bd9e8ee4b21259c337c594ac"} Feb 19 22:48:00 crc kubenswrapper[4795]: I0219 22:48:00.637156 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rc44b" event={"ID":"b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53","Type":"ContainerStarted","Data":"786407bda10fe867dc026d29a3861c792317f5cb6f61ef1d564701d4907a048b"} Feb 19 22:48:00 crc kubenswrapper[4795]: I0219 22:48:00.639840 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 
19 22:48:01 crc kubenswrapper[4795]: I0219 22:48:01.652351 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rc44b" event={"ID":"b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53","Type":"ContainerStarted","Data":"4161ed10a91dc4e5047a51ea5b384cc4c1c8d814f52f47b80d5bb8e0e8702b18"} Feb 19 22:48:02 crc kubenswrapper[4795]: I0219 22:48:02.662401 4795 generic.go:334] "Generic (PLEG): container finished" podID="b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53" containerID="4161ed10a91dc4e5047a51ea5b384cc4c1c8d814f52f47b80d5bb8e0e8702b18" exitCode=0 Feb 19 22:48:02 crc kubenswrapper[4795]: I0219 22:48:02.662512 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rc44b" event={"ID":"b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53","Type":"ContainerDied","Data":"4161ed10a91dc4e5047a51ea5b384cc4c1c8d814f52f47b80d5bb8e0e8702b18"} Feb 19 22:48:03 crc kubenswrapper[4795]: I0219 22:48:03.672291 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rc44b" event={"ID":"b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53","Type":"ContainerStarted","Data":"d44419965cc6895b59b17ab6350af37d179c131b8265e3abea7073aa58aa25cb"} Feb 19 22:48:03 crc kubenswrapper[4795]: I0219 22:48:03.689729 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rc44b" podStartSLOduration=3.247860128 podStartE2EDuration="5.689704831s" podCreationTimestamp="2026-02-19 22:47:58 +0000 UTC" firstStartedPulling="2026-02-19 22:48:00.639461789 +0000 UTC m=+4791.831979693" lastFinishedPulling="2026-02-19 22:48:03.081306522 +0000 UTC m=+4794.273824396" observedRunningTime="2026-02-19 22:48:03.689426083 +0000 UTC m=+4794.881943957" watchObservedRunningTime="2026-02-19 22:48:03.689704831 +0000 UTC m=+4794.882222705" Feb 19 22:48:09 crc kubenswrapper[4795]: I0219 22:48:09.278324 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-rc44b" Feb 19 22:48:09 crc kubenswrapper[4795]: I0219 22:48:09.278759 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rc44b" Feb 19 22:48:09 crc kubenswrapper[4795]: I0219 22:48:09.320867 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rc44b" Feb 19 22:48:09 crc kubenswrapper[4795]: I0219 22:48:09.785420 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rc44b" Feb 19 22:48:09 crc kubenswrapper[4795]: I0219 22:48:09.829579 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rc44b"] Feb 19 22:48:11 crc kubenswrapper[4795]: I0219 22:48:11.760344 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rc44b" podUID="b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53" containerName="registry-server" containerID="cri-o://d44419965cc6895b59b17ab6350af37d179c131b8265e3abea7073aa58aa25cb" gracePeriod=2 Feb 19 22:48:12 crc kubenswrapper[4795]: I0219 22:48:12.141782 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rc44b" Feb 19 22:48:12 crc kubenswrapper[4795]: I0219 22:48:12.285593 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53-catalog-content\") pod \"b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53\" (UID: \"b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53\") " Feb 19 22:48:12 crc kubenswrapper[4795]: I0219 22:48:12.285735 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53-utilities\") pod \"b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53\" (UID: \"b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53\") " Feb 19 22:48:12 crc kubenswrapper[4795]: I0219 22:48:12.285827 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnb5b\" (UniqueName: \"kubernetes.io/projected/b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53-kube-api-access-mnb5b\") pod \"b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53\" (UID: \"b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53\") " Feb 19 22:48:12 crc kubenswrapper[4795]: I0219 22:48:12.286507 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53-utilities" (OuterVolumeSpecName: "utilities") pod "b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53" (UID: "b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:48:12 crc kubenswrapper[4795]: I0219 22:48:12.290522 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53-kube-api-access-mnb5b" (OuterVolumeSpecName: "kube-api-access-mnb5b") pod "b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53" (UID: "b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53"). InnerVolumeSpecName "kube-api-access-mnb5b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:48:12 crc kubenswrapper[4795]: I0219 22:48:12.357820 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53" (UID: "b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:48:12 crc kubenswrapper[4795]: I0219 22:48:12.387699 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 22:48:12 crc kubenswrapper[4795]: I0219 22:48:12.387749 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnb5b\" (UniqueName: \"kubernetes.io/projected/b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53-kube-api-access-mnb5b\") on node \"crc\" DevicePath \"\"" Feb 19 22:48:12 crc kubenswrapper[4795]: I0219 22:48:12.387761 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 22:48:12 crc kubenswrapper[4795]: I0219 22:48:12.774161 4795 generic.go:334] "Generic (PLEG): container finished" podID="b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53" containerID="d44419965cc6895b59b17ab6350af37d179c131b8265e3abea7073aa58aa25cb" exitCode=0 Feb 19 22:48:12 crc kubenswrapper[4795]: I0219 22:48:12.774269 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rc44b" event={"ID":"b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53","Type":"ContainerDied","Data":"d44419965cc6895b59b17ab6350af37d179c131b8265e3abea7073aa58aa25cb"} Feb 19 22:48:12 crc kubenswrapper[4795]: I0219 22:48:12.774293 4795 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-rc44b" Feb 19 22:48:12 crc kubenswrapper[4795]: I0219 22:48:12.774636 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rc44b" event={"ID":"b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53","Type":"ContainerDied","Data":"786407bda10fe867dc026d29a3861c792317f5cb6f61ef1d564701d4907a048b"} Feb 19 22:48:12 crc kubenswrapper[4795]: I0219 22:48:12.774681 4795 scope.go:117] "RemoveContainer" containerID="d44419965cc6895b59b17ab6350af37d179c131b8265e3abea7073aa58aa25cb" Feb 19 22:48:12 crc kubenswrapper[4795]: I0219 22:48:12.819913 4795 scope.go:117] "RemoveContainer" containerID="4161ed10a91dc4e5047a51ea5b384cc4c1c8d814f52f47b80d5bb8e0e8702b18" Feb 19 22:48:12 crc kubenswrapper[4795]: I0219 22:48:12.822149 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rc44b"] Feb 19 22:48:12 crc kubenswrapper[4795]: I0219 22:48:12.835103 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rc44b"] Feb 19 22:48:12 crc kubenswrapper[4795]: I0219 22:48:12.839966 4795 scope.go:117] "RemoveContainer" containerID="d930f6ef1c64c1b075c04f57895f270fa2bc9361bd9e8ee4b21259c337c594ac" Feb 19 22:48:12 crc kubenswrapper[4795]: I0219 22:48:12.882081 4795 scope.go:117] "RemoveContainer" containerID="d44419965cc6895b59b17ab6350af37d179c131b8265e3abea7073aa58aa25cb" Feb 19 22:48:12 crc kubenswrapper[4795]: E0219 22:48:12.882964 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d44419965cc6895b59b17ab6350af37d179c131b8265e3abea7073aa58aa25cb\": container with ID starting with d44419965cc6895b59b17ab6350af37d179c131b8265e3abea7073aa58aa25cb not found: ID does not exist" containerID="d44419965cc6895b59b17ab6350af37d179c131b8265e3abea7073aa58aa25cb" Feb 19 22:48:12 crc kubenswrapper[4795]: I0219 22:48:12.883020 
4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d44419965cc6895b59b17ab6350af37d179c131b8265e3abea7073aa58aa25cb"} err="failed to get container status \"d44419965cc6895b59b17ab6350af37d179c131b8265e3abea7073aa58aa25cb\": rpc error: code = NotFound desc = could not find container \"d44419965cc6895b59b17ab6350af37d179c131b8265e3abea7073aa58aa25cb\": container with ID starting with d44419965cc6895b59b17ab6350af37d179c131b8265e3abea7073aa58aa25cb not found: ID does not exist" Feb 19 22:48:12 crc kubenswrapper[4795]: I0219 22:48:12.883051 4795 scope.go:117] "RemoveContainer" containerID="4161ed10a91dc4e5047a51ea5b384cc4c1c8d814f52f47b80d5bb8e0e8702b18" Feb 19 22:48:12 crc kubenswrapper[4795]: E0219 22:48:12.883584 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4161ed10a91dc4e5047a51ea5b384cc4c1c8d814f52f47b80d5bb8e0e8702b18\": container with ID starting with 4161ed10a91dc4e5047a51ea5b384cc4c1c8d814f52f47b80d5bb8e0e8702b18 not found: ID does not exist" containerID="4161ed10a91dc4e5047a51ea5b384cc4c1c8d814f52f47b80d5bb8e0e8702b18" Feb 19 22:48:12 crc kubenswrapper[4795]: I0219 22:48:12.883675 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4161ed10a91dc4e5047a51ea5b384cc4c1c8d814f52f47b80d5bb8e0e8702b18"} err="failed to get container status \"4161ed10a91dc4e5047a51ea5b384cc4c1c8d814f52f47b80d5bb8e0e8702b18\": rpc error: code = NotFound desc = could not find container \"4161ed10a91dc4e5047a51ea5b384cc4c1c8d814f52f47b80d5bb8e0e8702b18\": container with ID starting with 4161ed10a91dc4e5047a51ea5b384cc4c1c8d814f52f47b80d5bb8e0e8702b18 not found: ID does not exist" Feb 19 22:48:12 crc kubenswrapper[4795]: I0219 22:48:12.883699 4795 scope.go:117] "RemoveContainer" containerID="d930f6ef1c64c1b075c04f57895f270fa2bc9361bd9e8ee4b21259c337c594ac" Feb 19 22:48:12 crc kubenswrapper[4795]: E0219 
22:48:12.884056 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d930f6ef1c64c1b075c04f57895f270fa2bc9361bd9e8ee4b21259c337c594ac\": container with ID starting with d930f6ef1c64c1b075c04f57895f270fa2bc9361bd9e8ee4b21259c337c594ac not found: ID does not exist" containerID="d930f6ef1c64c1b075c04f57895f270fa2bc9361bd9e8ee4b21259c337c594ac" Feb 19 22:48:12 crc kubenswrapper[4795]: I0219 22:48:12.884087 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d930f6ef1c64c1b075c04f57895f270fa2bc9361bd9e8ee4b21259c337c594ac"} err="failed to get container status \"d930f6ef1c64c1b075c04f57895f270fa2bc9361bd9e8ee4b21259c337c594ac\": rpc error: code = NotFound desc = could not find container \"d930f6ef1c64c1b075c04f57895f270fa2bc9361bd9e8ee4b21259c337c594ac\": container with ID starting with d930f6ef1c64c1b075c04f57895f270fa2bc9361bd9e8ee4b21259c337c594ac not found: ID does not exist" Feb 19 22:48:13 crc kubenswrapper[4795]: I0219 22:48:13.531564 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53" path="/var/lib/kubelet/pods/b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53/volumes" Feb 19 22:49:58 crc kubenswrapper[4795]: I0219 22:49:58.427817 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 22:49:58 crc kubenswrapper[4795]: I0219 22:49:58.428267 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Feb 19 22:50:14 crc kubenswrapper[4795]: I0219 22:50:14.285321 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2l99m"] Feb 19 22:50:14 crc kubenswrapper[4795]: E0219 22:50:14.286365 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53" containerName="extract-utilities" Feb 19 22:50:14 crc kubenswrapper[4795]: I0219 22:50:14.286485 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53" containerName="extract-utilities" Feb 19 22:50:14 crc kubenswrapper[4795]: E0219 22:50:14.286525 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53" containerName="registry-server" Feb 19 22:50:14 crc kubenswrapper[4795]: I0219 22:50:14.286533 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53" containerName="registry-server" Feb 19 22:50:14 crc kubenswrapper[4795]: E0219 22:50:14.286553 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53" containerName="extract-content" Feb 19 22:50:14 crc kubenswrapper[4795]: I0219 22:50:14.286561 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53" containerName="extract-content" Feb 19 22:50:14 crc kubenswrapper[4795]: I0219 22:50:14.286750 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0fc9c55-50c4-41f4-939b-3ed2ef8b3b53" containerName="registry-server" Feb 19 22:50:14 crc kubenswrapper[4795]: I0219 22:50:14.287848 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2l99m" Feb 19 22:50:14 crc kubenswrapper[4795]: I0219 22:50:14.312704 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2l99m"] Feb 19 22:50:14 crc kubenswrapper[4795]: I0219 22:50:14.397428 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5964e2d1-6384-4043-9857-a20ea29bd451-catalog-content\") pod \"certified-operators-2l99m\" (UID: \"5964e2d1-6384-4043-9857-a20ea29bd451\") " pod="openshift-marketplace/certified-operators-2l99m" Feb 19 22:50:14 crc kubenswrapper[4795]: I0219 22:50:14.397508 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5964e2d1-6384-4043-9857-a20ea29bd451-utilities\") pod \"certified-operators-2l99m\" (UID: \"5964e2d1-6384-4043-9857-a20ea29bd451\") " pod="openshift-marketplace/certified-operators-2l99m" Feb 19 22:50:14 crc kubenswrapper[4795]: I0219 22:50:14.397719 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8nnm\" (UniqueName: \"kubernetes.io/projected/5964e2d1-6384-4043-9857-a20ea29bd451-kube-api-access-v8nnm\") pod \"certified-operators-2l99m\" (UID: \"5964e2d1-6384-4043-9857-a20ea29bd451\") " pod="openshift-marketplace/certified-operators-2l99m" Feb 19 22:50:14 crc kubenswrapper[4795]: I0219 22:50:14.499417 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5964e2d1-6384-4043-9857-a20ea29bd451-utilities\") pod \"certified-operators-2l99m\" (UID: \"5964e2d1-6384-4043-9857-a20ea29bd451\") " pod="openshift-marketplace/certified-operators-2l99m" Feb 19 22:50:14 crc kubenswrapper[4795]: I0219 22:50:14.499474 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-v8nnm\" (UniqueName: \"kubernetes.io/projected/5964e2d1-6384-4043-9857-a20ea29bd451-kube-api-access-v8nnm\") pod \"certified-operators-2l99m\" (UID: \"5964e2d1-6384-4043-9857-a20ea29bd451\") " pod="openshift-marketplace/certified-operators-2l99m" Feb 19 22:50:14 crc kubenswrapper[4795]: I0219 22:50:14.499569 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5964e2d1-6384-4043-9857-a20ea29bd451-catalog-content\") pod \"certified-operators-2l99m\" (UID: \"5964e2d1-6384-4043-9857-a20ea29bd451\") " pod="openshift-marketplace/certified-operators-2l99m" Feb 19 22:50:14 crc kubenswrapper[4795]: I0219 22:50:14.500082 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5964e2d1-6384-4043-9857-a20ea29bd451-catalog-content\") pod \"certified-operators-2l99m\" (UID: \"5964e2d1-6384-4043-9857-a20ea29bd451\") " pod="openshift-marketplace/certified-operators-2l99m" Feb 19 22:50:14 crc kubenswrapper[4795]: I0219 22:50:14.500086 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5964e2d1-6384-4043-9857-a20ea29bd451-utilities\") pod \"certified-operators-2l99m\" (UID: \"5964e2d1-6384-4043-9857-a20ea29bd451\") " pod="openshift-marketplace/certified-operators-2l99m" Feb 19 22:50:14 crc kubenswrapper[4795]: I0219 22:50:14.520018 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8nnm\" (UniqueName: \"kubernetes.io/projected/5964e2d1-6384-4043-9857-a20ea29bd451-kube-api-access-v8nnm\") pod \"certified-operators-2l99m\" (UID: \"5964e2d1-6384-4043-9857-a20ea29bd451\") " pod="openshift-marketplace/certified-operators-2l99m" Feb 19 22:50:14 crc kubenswrapper[4795]: I0219 22:50:14.608493 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2l99m" Feb 19 22:50:15 crc kubenswrapper[4795]: I0219 22:50:15.044318 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2l99m"] Feb 19 22:50:15 crc kubenswrapper[4795]: I0219 22:50:15.801884 4795 generic.go:334] "Generic (PLEG): container finished" podID="5964e2d1-6384-4043-9857-a20ea29bd451" containerID="683a053d5697b6996d3f17d1c67e5c15538cfa4baae0a2d5d0c9dc28a965d9ea" exitCode=0 Feb 19 22:50:15 crc kubenswrapper[4795]: I0219 22:50:15.801954 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2l99m" event={"ID":"5964e2d1-6384-4043-9857-a20ea29bd451","Type":"ContainerDied","Data":"683a053d5697b6996d3f17d1c67e5c15538cfa4baae0a2d5d0c9dc28a965d9ea"} Feb 19 22:50:15 crc kubenswrapper[4795]: I0219 22:50:15.802541 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2l99m" event={"ID":"5964e2d1-6384-4043-9857-a20ea29bd451","Type":"ContainerStarted","Data":"0442e2a13005e0db40a7fe7c99138b63b899c77f70a2bdc495ae396fdb4dc13b"} Feb 19 22:50:16 crc kubenswrapper[4795]: I0219 22:50:16.682126 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fl94b"] Feb 19 22:50:16 crc kubenswrapper[4795]: I0219 22:50:16.687923 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fl94b" Feb 19 22:50:16 crc kubenswrapper[4795]: I0219 22:50:16.692221 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fl94b"] Feb 19 22:50:16 crc kubenswrapper[4795]: I0219 22:50:16.751458 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrzlm\" (UniqueName: \"kubernetes.io/projected/4e78bcb8-816d-4f80-9ec1-ef03e589b2b5-kube-api-access-jrzlm\") pod \"redhat-operators-fl94b\" (UID: \"4e78bcb8-816d-4f80-9ec1-ef03e589b2b5\") " pod="openshift-marketplace/redhat-operators-fl94b" Feb 19 22:50:16 crc kubenswrapper[4795]: I0219 22:50:16.751531 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e78bcb8-816d-4f80-9ec1-ef03e589b2b5-catalog-content\") pod \"redhat-operators-fl94b\" (UID: \"4e78bcb8-816d-4f80-9ec1-ef03e589b2b5\") " pod="openshift-marketplace/redhat-operators-fl94b" Feb 19 22:50:16 crc kubenswrapper[4795]: I0219 22:50:16.751754 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e78bcb8-816d-4f80-9ec1-ef03e589b2b5-utilities\") pod \"redhat-operators-fl94b\" (UID: \"4e78bcb8-816d-4f80-9ec1-ef03e589b2b5\") " pod="openshift-marketplace/redhat-operators-fl94b" Feb 19 22:50:16 crc kubenswrapper[4795]: I0219 22:50:16.853642 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e78bcb8-816d-4f80-9ec1-ef03e589b2b5-utilities\") pod \"redhat-operators-fl94b\" (UID: \"4e78bcb8-816d-4f80-9ec1-ef03e589b2b5\") " pod="openshift-marketplace/redhat-operators-fl94b" Feb 19 22:50:16 crc kubenswrapper[4795]: I0219 22:50:16.853742 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-jrzlm\" (UniqueName: \"kubernetes.io/projected/4e78bcb8-816d-4f80-9ec1-ef03e589b2b5-kube-api-access-jrzlm\") pod \"redhat-operators-fl94b\" (UID: \"4e78bcb8-816d-4f80-9ec1-ef03e589b2b5\") " pod="openshift-marketplace/redhat-operators-fl94b" Feb 19 22:50:16 crc kubenswrapper[4795]: I0219 22:50:16.853774 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e78bcb8-816d-4f80-9ec1-ef03e589b2b5-catalog-content\") pod \"redhat-operators-fl94b\" (UID: \"4e78bcb8-816d-4f80-9ec1-ef03e589b2b5\") " pod="openshift-marketplace/redhat-operators-fl94b" Feb 19 22:50:16 crc kubenswrapper[4795]: I0219 22:50:16.854144 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e78bcb8-816d-4f80-9ec1-ef03e589b2b5-utilities\") pod \"redhat-operators-fl94b\" (UID: \"4e78bcb8-816d-4f80-9ec1-ef03e589b2b5\") " pod="openshift-marketplace/redhat-operators-fl94b" Feb 19 22:50:16 crc kubenswrapper[4795]: I0219 22:50:16.854290 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e78bcb8-816d-4f80-9ec1-ef03e589b2b5-catalog-content\") pod \"redhat-operators-fl94b\" (UID: \"4e78bcb8-816d-4f80-9ec1-ef03e589b2b5\") " pod="openshift-marketplace/redhat-operators-fl94b" Feb 19 22:50:16 crc kubenswrapper[4795]: I0219 22:50:16.878508 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrzlm\" (UniqueName: \"kubernetes.io/projected/4e78bcb8-816d-4f80-9ec1-ef03e589b2b5-kube-api-access-jrzlm\") pod \"redhat-operators-fl94b\" (UID: \"4e78bcb8-816d-4f80-9ec1-ef03e589b2b5\") " pod="openshift-marketplace/redhat-operators-fl94b" Feb 19 22:50:17 crc kubenswrapper[4795]: I0219 22:50:17.008863 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fl94b" Feb 19 22:50:17 crc kubenswrapper[4795]: I0219 22:50:17.253539 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fl94b"] Feb 19 22:50:17 crc kubenswrapper[4795]: W0219 22:50:17.298346 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e78bcb8_816d_4f80_9ec1_ef03e589b2b5.slice/crio-94ea73ffdc3628207bbbc3c197d8cb971f78df0836d4ee833099186fe17f50ea WatchSource:0}: Error finding container 94ea73ffdc3628207bbbc3c197d8cb971f78df0836d4ee833099186fe17f50ea: Status 404 returned error can't find the container with id 94ea73ffdc3628207bbbc3c197d8cb971f78df0836d4ee833099186fe17f50ea Feb 19 22:50:17 crc kubenswrapper[4795]: I0219 22:50:17.815238 4795 generic.go:334] "Generic (PLEG): container finished" podID="5964e2d1-6384-4043-9857-a20ea29bd451" containerID="9740b74f1e123788c75cebc3f7fc043bd4280c71ed6e7b445c5adb12933ad135" exitCode=0 Feb 19 22:50:17 crc kubenswrapper[4795]: I0219 22:50:17.815318 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2l99m" event={"ID":"5964e2d1-6384-4043-9857-a20ea29bd451","Type":"ContainerDied","Data":"9740b74f1e123788c75cebc3f7fc043bd4280c71ed6e7b445c5adb12933ad135"} Feb 19 22:50:17 crc kubenswrapper[4795]: I0219 22:50:17.817462 4795 generic.go:334] "Generic (PLEG): container finished" podID="4e78bcb8-816d-4f80-9ec1-ef03e589b2b5" containerID="1cfbf37fcf4c652bf5d28da74cd92b559cc0d200ba2637ed2ce3586d55033a9c" exitCode=0 Feb 19 22:50:17 crc kubenswrapper[4795]: I0219 22:50:17.817511 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fl94b" event={"ID":"4e78bcb8-816d-4f80-9ec1-ef03e589b2b5","Type":"ContainerDied","Data":"1cfbf37fcf4c652bf5d28da74cd92b559cc0d200ba2637ed2ce3586d55033a9c"} Feb 19 22:50:17 crc kubenswrapper[4795]: I0219 22:50:17.817538 
4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fl94b" event={"ID":"4e78bcb8-816d-4f80-9ec1-ef03e589b2b5","Type":"ContainerStarted","Data":"94ea73ffdc3628207bbbc3c197d8cb971f78df0836d4ee833099186fe17f50ea"} Feb 19 22:50:18 crc kubenswrapper[4795]: I0219 22:50:18.825838 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fl94b" event={"ID":"4e78bcb8-816d-4f80-9ec1-ef03e589b2b5","Type":"ContainerStarted","Data":"ddfafa3137520be6dd504ceadc8d930e24d9bf3c72eb5bf12cdbe765c8bc7bda"} Feb 19 22:50:18 crc kubenswrapper[4795]: I0219 22:50:18.828282 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2l99m" event={"ID":"5964e2d1-6384-4043-9857-a20ea29bd451","Type":"ContainerStarted","Data":"1d88201bb74178f76bc909982c078a586396de651fcf6f16cf09294097148bff"} Feb 19 22:50:18 crc kubenswrapper[4795]: I0219 22:50:18.860541 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2l99m" podStartSLOduration=2.353502198 podStartE2EDuration="4.860521979s" podCreationTimestamp="2026-02-19 22:50:14 +0000 UTC" firstStartedPulling="2026-02-19 22:50:15.803983976 +0000 UTC m=+4926.996501840" lastFinishedPulling="2026-02-19 22:50:18.311003757 +0000 UTC m=+4929.503521621" observedRunningTime="2026-02-19 22:50:18.8584798 +0000 UTC m=+4930.050997684" watchObservedRunningTime="2026-02-19 22:50:18.860521979 +0000 UTC m=+4930.053039833" Feb 19 22:50:19 crc kubenswrapper[4795]: I0219 22:50:19.876261 4795 generic.go:334] "Generic (PLEG): container finished" podID="4e78bcb8-816d-4f80-9ec1-ef03e589b2b5" containerID="ddfafa3137520be6dd504ceadc8d930e24d9bf3c72eb5bf12cdbe765c8bc7bda" exitCode=0 Feb 19 22:50:19 crc kubenswrapper[4795]: I0219 22:50:19.876335 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fl94b" 
event={"ID":"4e78bcb8-816d-4f80-9ec1-ef03e589b2b5","Type":"ContainerDied","Data":"ddfafa3137520be6dd504ceadc8d930e24d9bf3c72eb5bf12cdbe765c8bc7bda"} Feb 19 22:50:20 crc kubenswrapper[4795]: I0219 22:50:20.886783 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fl94b" event={"ID":"4e78bcb8-816d-4f80-9ec1-ef03e589b2b5","Type":"ContainerStarted","Data":"a9cf1eb01a601e9cc23b219d1249ced53774674ce564fc4223b6b0816ceac793"} Feb 19 22:50:20 crc kubenswrapper[4795]: I0219 22:50:20.903758 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fl94b" podStartSLOduration=2.474535875 podStartE2EDuration="4.903737815s" podCreationTimestamp="2026-02-19 22:50:16 +0000 UTC" firstStartedPulling="2026-02-19 22:50:17.819082267 +0000 UTC m=+4929.011600141" lastFinishedPulling="2026-02-19 22:50:20.248284217 +0000 UTC m=+4931.440802081" observedRunningTime="2026-02-19 22:50:20.90112873 +0000 UTC m=+4932.093646624" watchObservedRunningTime="2026-02-19 22:50:20.903737815 +0000 UTC m=+4932.096255689" Feb 19 22:50:24 crc kubenswrapper[4795]: I0219 22:50:24.609548 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2l99m" Feb 19 22:50:24 crc kubenswrapper[4795]: I0219 22:50:24.610274 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2l99m" Feb 19 22:50:24 crc kubenswrapper[4795]: I0219 22:50:24.648303 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2l99m" Feb 19 22:50:24 crc kubenswrapper[4795]: I0219 22:50:24.959869 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2l99m" Feb 19 22:50:26 crc kubenswrapper[4795]: I0219 22:50:26.075138 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-2l99m"] Feb 19 22:50:26 crc kubenswrapper[4795]: I0219 22:50:26.924589 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2l99m" podUID="5964e2d1-6384-4043-9857-a20ea29bd451" containerName="registry-server" containerID="cri-o://1d88201bb74178f76bc909982c078a586396de651fcf6f16cf09294097148bff" gracePeriod=2 Feb 19 22:50:27 crc kubenswrapper[4795]: I0219 22:50:27.009327 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fl94b" Feb 19 22:50:27 crc kubenswrapper[4795]: I0219 22:50:27.009384 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fl94b" Feb 19 22:50:27 crc kubenswrapper[4795]: I0219 22:50:27.052066 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fl94b" Feb 19 22:50:27 crc kubenswrapper[4795]: I0219 22:50:27.363080 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2l99m" Feb 19 22:50:27 crc kubenswrapper[4795]: I0219 22:50:27.412027 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5964e2d1-6384-4043-9857-a20ea29bd451-catalog-content\") pod \"5964e2d1-6384-4043-9857-a20ea29bd451\" (UID: \"5964e2d1-6384-4043-9857-a20ea29bd451\") " Feb 19 22:50:27 crc kubenswrapper[4795]: I0219 22:50:27.412198 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8nnm\" (UniqueName: \"kubernetes.io/projected/5964e2d1-6384-4043-9857-a20ea29bd451-kube-api-access-v8nnm\") pod \"5964e2d1-6384-4043-9857-a20ea29bd451\" (UID: \"5964e2d1-6384-4043-9857-a20ea29bd451\") " Feb 19 22:50:27 crc kubenswrapper[4795]: I0219 22:50:27.412306 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5964e2d1-6384-4043-9857-a20ea29bd451-utilities\") pod \"5964e2d1-6384-4043-9857-a20ea29bd451\" (UID: \"5964e2d1-6384-4043-9857-a20ea29bd451\") " Feb 19 22:50:27 crc kubenswrapper[4795]: I0219 22:50:27.413408 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5964e2d1-6384-4043-9857-a20ea29bd451-utilities" (OuterVolumeSpecName: "utilities") pod "5964e2d1-6384-4043-9857-a20ea29bd451" (UID: "5964e2d1-6384-4043-9857-a20ea29bd451"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:50:27 crc kubenswrapper[4795]: I0219 22:50:27.419689 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5964e2d1-6384-4043-9857-a20ea29bd451-kube-api-access-v8nnm" (OuterVolumeSpecName: "kube-api-access-v8nnm") pod "5964e2d1-6384-4043-9857-a20ea29bd451" (UID: "5964e2d1-6384-4043-9857-a20ea29bd451"). InnerVolumeSpecName "kube-api-access-v8nnm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:50:27 crc kubenswrapper[4795]: I0219 22:50:27.459098 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5964e2d1-6384-4043-9857-a20ea29bd451-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5964e2d1-6384-4043-9857-a20ea29bd451" (UID: "5964e2d1-6384-4043-9857-a20ea29bd451"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:50:27 crc kubenswrapper[4795]: I0219 22:50:27.513562 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5964e2d1-6384-4043-9857-a20ea29bd451-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 22:50:27 crc kubenswrapper[4795]: I0219 22:50:27.513587 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v8nnm\" (UniqueName: \"kubernetes.io/projected/5964e2d1-6384-4043-9857-a20ea29bd451-kube-api-access-v8nnm\") on node \"crc\" DevicePath \"\"" Feb 19 22:50:27 crc kubenswrapper[4795]: I0219 22:50:27.513600 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5964e2d1-6384-4043-9857-a20ea29bd451-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 22:50:27 crc kubenswrapper[4795]: I0219 22:50:27.936703 4795 generic.go:334] "Generic (PLEG): container finished" podID="5964e2d1-6384-4043-9857-a20ea29bd451" containerID="1d88201bb74178f76bc909982c078a586396de651fcf6f16cf09294097148bff" exitCode=0 Feb 19 22:50:27 crc kubenswrapper[4795]: I0219 22:50:27.936813 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2l99m" event={"ID":"5964e2d1-6384-4043-9857-a20ea29bd451","Type":"ContainerDied","Data":"1d88201bb74178f76bc909982c078a586396de651fcf6f16cf09294097148bff"} Feb 19 22:50:27 crc kubenswrapper[4795]: I0219 22:50:27.936885 4795 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-2l99m" event={"ID":"5964e2d1-6384-4043-9857-a20ea29bd451","Type":"ContainerDied","Data":"0442e2a13005e0db40a7fe7c99138b63b899c77f70a2bdc495ae396fdb4dc13b"} Feb 19 22:50:27 crc kubenswrapper[4795]: I0219 22:50:27.936921 4795 scope.go:117] "RemoveContainer" containerID="1d88201bb74178f76bc909982c078a586396de651fcf6f16cf09294097148bff" Feb 19 22:50:27 crc kubenswrapper[4795]: I0219 22:50:27.937685 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2l99m" Feb 19 22:50:27 crc kubenswrapper[4795]: I0219 22:50:27.966939 4795 scope.go:117] "RemoveContainer" containerID="9740b74f1e123788c75cebc3f7fc043bd4280c71ed6e7b445c5adb12933ad135" Feb 19 22:50:27 crc kubenswrapper[4795]: I0219 22:50:27.971052 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2l99m"] Feb 19 22:50:27 crc kubenswrapper[4795]: I0219 22:50:27.980331 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2l99m"] Feb 19 22:50:27 crc kubenswrapper[4795]: I0219 22:50:27.987433 4795 scope.go:117] "RemoveContainer" containerID="683a053d5697b6996d3f17d1c67e5c15538cfa4baae0a2d5d0c9dc28a965d9ea" Feb 19 22:50:28 crc kubenswrapper[4795]: I0219 22:50:28.015045 4795 scope.go:117] "RemoveContainer" containerID="1d88201bb74178f76bc909982c078a586396de651fcf6f16cf09294097148bff" Feb 19 22:50:28 crc kubenswrapper[4795]: E0219 22:50:28.015539 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d88201bb74178f76bc909982c078a586396de651fcf6f16cf09294097148bff\": container with ID starting with 1d88201bb74178f76bc909982c078a586396de651fcf6f16cf09294097148bff not found: ID does not exist" containerID="1d88201bb74178f76bc909982c078a586396de651fcf6f16cf09294097148bff" Feb 19 22:50:28 crc kubenswrapper[4795]: I0219 
22:50:28.015581 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d88201bb74178f76bc909982c078a586396de651fcf6f16cf09294097148bff"} err="failed to get container status \"1d88201bb74178f76bc909982c078a586396de651fcf6f16cf09294097148bff\": rpc error: code = NotFound desc = could not find container \"1d88201bb74178f76bc909982c078a586396de651fcf6f16cf09294097148bff\": container with ID starting with 1d88201bb74178f76bc909982c078a586396de651fcf6f16cf09294097148bff not found: ID does not exist" Feb 19 22:50:28 crc kubenswrapper[4795]: I0219 22:50:28.015607 4795 scope.go:117] "RemoveContainer" containerID="9740b74f1e123788c75cebc3f7fc043bd4280c71ed6e7b445c5adb12933ad135" Feb 19 22:50:28 crc kubenswrapper[4795]: E0219 22:50:28.016056 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9740b74f1e123788c75cebc3f7fc043bd4280c71ed6e7b445c5adb12933ad135\": container with ID starting with 9740b74f1e123788c75cebc3f7fc043bd4280c71ed6e7b445c5adb12933ad135 not found: ID does not exist" containerID="9740b74f1e123788c75cebc3f7fc043bd4280c71ed6e7b445c5adb12933ad135" Feb 19 22:50:28 crc kubenswrapper[4795]: I0219 22:50:28.016084 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9740b74f1e123788c75cebc3f7fc043bd4280c71ed6e7b445c5adb12933ad135"} err="failed to get container status \"9740b74f1e123788c75cebc3f7fc043bd4280c71ed6e7b445c5adb12933ad135\": rpc error: code = NotFound desc = could not find container \"9740b74f1e123788c75cebc3f7fc043bd4280c71ed6e7b445c5adb12933ad135\": container with ID starting with 9740b74f1e123788c75cebc3f7fc043bd4280c71ed6e7b445c5adb12933ad135 not found: ID does not exist" Feb 19 22:50:28 crc kubenswrapper[4795]: I0219 22:50:28.016102 4795 scope.go:117] "RemoveContainer" containerID="683a053d5697b6996d3f17d1c67e5c15538cfa4baae0a2d5d0c9dc28a965d9ea" Feb 19 22:50:28 crc 
kubenswrapper[4795]: E0219 22:50:28.016582 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"683a053d5697b6996d3f17d1c67e5c15538cfa4baae0a2d5d0c9dc28a965d9ea\": container with ID starting with 683a053d5697b6996d3f17d1c67e5c15538cfa4baae0a2d5d0c9dc28a965d9ea not found: ID does not exist" containerID="683a053d5697b6996d3f17d1c67e5c15538cfa4baae0a2d5d0c9dc28a965d9ea" Feb 19 22:50:28 crc kubenswrapper[4795]: I0219 22:50:28.016616 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"683a053d5697b6996d3f17d1c67e5c15538cfa4baae0a2d5d0c9dc28a965d9ea"} err="failed to get container status \"683a053d5697b6996d3f17d1c67e5c15538cfa4baae0a2d5d0c9dc28a965d9ea\": rpc error: code = NotFound desc = could not find container \"683a053d5697b6996d3f17d1c67e5c15538cfa4baae0a2d5d0c9dc28a965d9ea\": container with ID starting with 683a053d5697b6996d3f17d1c67e5c15538cfa4baae0a2d5d0c9dc28a965d9ea not found: ID does not exist" Feb 19 22:50:28 crc kubenswrapper[4795]: I0219 22:50:28.025927 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fl94b" Feb 19 22:50:28 crc kubenswrapper[4795]: I0219 22:50:28.427646 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 22:50:28 crc kubenswrapper[4795]: I0219 22:50:28.428513 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 22:50:29 crc 
kubenswrapper[4795]: I0219 22:50:29.473459 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fl94b"] Feb 19 22:50:29 crc kubenswrapper[4795]: I0219 22:50:29.522768 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5964e2d1-6384-4043-9857-a20ea29bd451" path="/var/lib/kubelet/pods/5964e2d1-6384-4043-9857-a20ea29bd451/volumes" Feb 19 22:50:29 crc kubenswrapper[4795]: I0219 22:50:29.951718 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fl94b" podUID="4e78bcb8-816d-4f80-9ec1-ef03e589b2b5" containerName="registry-server" containerID="cri-o://a9cf1eb01a601e9cc23b219d1249ced53774674ce564fc4223b6b0816ceac793" gracePeriod=2 Feb 19 22:50:30 crc kubenswrapper[4795]: I0219 22:50:30.341731 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fl94b" Feb 19 22:50:30 crc kubenswrapper[4795]: I0219 22:50:30.355607 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrzlm\" (UniqueName: \"kubernetes.io/projected/4e78bcb8-816d-4f80-9ec1-ef03e589b2b5-kube-api-access-jrzlm\") pod \"4e78bcb8-816d-4f80-9ec1-ef03e589b2b5\" (UID: \"4e78bcb8-816d-4f80-9ec1-ef03e589b2b5\") " Feb 19 22:50:30 crc kubenswrapper[4795]: I0219 22:50:30.355685 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e78bcb8-816d-4f80-9ec1-ef03e589b2b5-utilities\") pod \"4e78bcb8-816d-4f80-9ec1-ef03e589b2b5\" (UID: \"4e78bcb8-816d-4f80-9ec1-ef03e589b2b5\") " Feb 19 22:50:30 crc kubenswrapper[4795]: I0219 22:50:30.355732 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e78bcb8-816d-4f80-9ec1-ef03e589b2b5-catalog-content\") pod \"4e78bcb8-816d-4f80-9ec1-ef03e589b2b5\" (UID: 
\"4e78bcb8-816d-4f80-9ec1-ef03e589b2b5\") " Feb 19 22:50:30 crc kubenswrapper[4795]: I0219 22:50:30.357959 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e78bcb8-816d-4f80-9ec1-ef03e589b2b5-utilities" (OuterVolumeSpecName: "utilities") pod "4e78bcb8-816d-4f80-9ec1-ef03e589b2b5" (UID: "4e78bcb8-816d-4f80-9ec1-ef03e589b2b5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:50:30 crc kubenswrapper[4795]: I0219 22:50:30.365496 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e78bcb8-816d-4f80-9ec1-ef03e589b2b5-kube-api-access-jrzlm" (OuterVolumeSpecName: "kube-api-access-jrzlm") pod "4e78bcb8-816d-4f80-9ec1-ef03e589b2b5" (UID: "4e78bcb8-816d-4f80-9ec1-ef03e589b2b5"). InnerVolumeSpecName "kube-api-access-jrzlm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:50:30 crc kubenswrapper[4795]: I0219 22:50:30.466916 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jrzlm\" (UniqueName: \"kubernetes.io/projected/4e78bcb8-816d-4f80-9ec1-ef03e589b2b5-kube-api-access-jrzlm\") on node \"crc\" DevicePath \"\"" Feb 19 22:50:30 crc kubenswrapper[4795]: I0219 22:50:30.466959 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e78bcb8-816d-4f80-9ec1-ef03e589b2b5-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 22:50:30 crc kubenswrapper[4795]: I0219 22:50:30.492208 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e78bcb8-816d-4f80-9ec1-ef03e589b2b5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4e78bcb8-816d-4f80-9ec1-ef03e589b2b5" (UID: "4e78bcb8-816d-4f80-9ec1-ef03e589b2b5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:50:30 crc kubenswrapper[4795]: I0219 22:50:30.569712 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e78bcb8-816d-4f80-9ec1-ef03e589b2b5-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 22:50:30 crc kubenswrapper[4795]: I0219 22:50:30.970985 4795 generic.go:334] "Generic (PLEG): container finished" podID="4e78bcb8-816d-4f80-9ec1-ef03e589b2b5" containerID="a9cf1eb01a601e9cc23b219d1249ced53774674ce564fc4223b6b0816ceac793" exitCode=0 Feb 19 22:50:30 crc kubenswrapper[4795]: I0219 22:50:30.971027 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fl94b" event={"ID":"4e78bcb8-816d-4f80-9ec1-ef03e589b2b5","Type":"ContainerDied","Data":"a9cf1eb01a601e9cc23b219d1249ced53774674ce564fc4223b6b0816ceac793"} Feb 19 22:50:30 crc kubenswrapper[4795]: I0219 22:50:30.971050 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fl94b" event={"ID":"4e78bcb8-816d-4f80-9ec1-ef03e589b2b5","Type":"ContainerDied","Data":"94ea73ffdc3628207bbbc3c197d8cb971f78df0836d4ee833099186fe17f50ea"} Feb 19 22:50:30 crc kubenswrapper[4795]: I0219 22:50:30.971056 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fl94b" Feb 19 22:50:30 crc kubenswrapper[4795]: I0219 22:50:30.971065 4795 scope.go:117] "RemoveContainer" containerID="a9cf1eb01a601e9cc23b219d1249ced53774674ce564fc4223b6b0816ceac793" Feb 19 22:50:30 crc kubenswrapper[4795]: I0219 22:50:30.992600 4795 scope.go:117] "RemoveContainer" containerID="ddfafa3137520be6dd504ceadc8d930e24d9bf3c72eb5bf12cdbe765c8bc7bda" Feb 19 22:50:31 crc kubenswrapper[4795]: I0219 22:50:31.023256 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fl94b"] Feb 19 22:50:31 crc kubenswrapper[4795]: I0219 22:50:31.030593 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fl94b"] Feb 19 22:50:31 crc kubenswrapper[4795]: I0219 22:50:31.031906 4795 scope.go:117] "RemoveContainer" containerID="1cfbf37fcf4c652bf5d28da74cd92b559cc0d200ba2637ed2ce3586d55033a9c" Feb 19 22:50:31 crc kubenswrapper[4795]: I0219 22:50:31.051835 4795 scope.go:117] "RemoveContainer" containerID="a9cf1eb01a601e9cc23b219d1249ced53774674ce564fc4223b6b0816ceac793" Feb 19 22:50:31 crc kubenswrapper[4795]: E0219 22:50:31.052405 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9cf1eb01a601e9cc23b219d1249ced53774674ce564fc4223b6b0816ceac793\": container with ID starting with a9cf1eb01a601e9cc23b219d1249ced53774674ce564fc4223b6b0816ceac793 not found: ID does not exist" containerID="a9cf1eb01a601e9cc23b219d1249ced53774674ce564fc4223b6b0816ceac793" Feb 19 22:50:31 crc kubenswrapper[4795]: I0219 22:50:31.052447 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9cf1eb01a601e9cc23b219d1249ced53774674ce564fc4223b6b0816ceac793"} err="failed to get container status \"a9cf1eb01a601e9cc23b219d1249ced53774674ce564fc4223b6b0816ceac793\": rpc error: code = NotFound desc = could not find container 
\"a9cf1eb01a601e9cc23b219d1249ced53774674ce564fc4223b6b0816ceac793\": container with ID starting with a9cf1eb01a601e9cc23b219d1249ced53774674ce564fc4223b6b0816ceac793 not found: ID does not exist" Feb 19 22:50:31 crc kubenswrapper[4795]: I0219 22:50:31.052472 4795 scope.go:117] "RemoveContainer" containerID="ddfafa3137520be6dd504ceadc8d930e24d9bf3c72eb5bf12cdbe765c8bc7bda" Feb 19 22:50:31 crc kubenswrapper[4795]: E0219 22:50:31.053138 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddfafa3137520be6dd504ceadc8d930e24d9bf3c72eb5bf12cdbe765c8bc7bda\": container with ID starting with ddfafa3137520be6dd504ceadc8d930e24d9bf3c72eb5bf12cdbe765c8bc7bda not found: ID does not exist" containerID="ddfafa3137520be6dd504ceadc8d930e24d9bf3c72eb5bf12cdbe765c8bc7bda" Feb 19 22:50:31 crc kubenswrapper[4795]: I0219 22:50:31.053277 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddfafa3137520be6dd504ceadc8d930e24d9bf3c72eb5bf12cdbe765c8bc7bda"} err="failed to get container status \"ddfafa3137520be6dd504ceadc8d930e24d9bf3c72eb5bf12cdbe765c8bc7bda\": rpc error: code = NotFound desc = could not find container \"ddfafa3137520be6dd504ceadc8d930e24d9bf3c72eb5bf12cdbe765c8bc7bda\": container with ID starting with ddfafa3137520be6dd504ceadc8d930e24d9bf3c72eb5bf12cdbe765c8bc7bda not found: ID does not exist" Feb 19 22:50:31 crc kubenswrapper[4795]: I0219 22:50:31.053336 4795 scope.go:117] "RemoveContainer" containerID="1cfbf37fcf4c652bf5d28da74cd92b559cc0d200ba2637ed2ce3586d55033a9c" Feb 19 22:50:31 crc kubenswrapper[4795]: E0219 22:50:31.053952 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cfbf37fcf4c652bf5d28da74cd92b559cc0d200ba2637ed2ce3586d55033a9c\": container with ID starting with 1cfbf37fcf4c652bf5d28da74cd92b559cc0d200ba2637ed2ce3586d55033a9c not found: ID does not exist" 
containerID="1cfbf37fcf4c652bf5d28da74cd92b559cc0d200ba2637ed2ce3586d55033a9c" Feb 19 22:50:31 crc kubenswrapper[4795]: I0219 22:50:31.053985 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cfbf37fcf4c652bf5d28da74cd92b559cc0d200ba2637ed2ce3586d55033a9c"} err="failed to get container status \"1cfbf37fcf4c652bf5d28da74cd92b559cc0d200ba2637ed2ce3586d55033a9c\": rpc error: code = NotFound desc = could not find container \"1cfbf37fcf4c652bf5d28da74cd92b559cc0d200ba2637ed2ce3586d55033a9c\": container with ID starting with 1cfbf37fcf4c652bf5d28da74cd92b559cc0d200ba2637ed2ce3586d55033a9c not found: ID does not exist" Feb 19 22:50:31 crc kubenswrapper[4795]: I0219 22:50:31.523082 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e78bcb8-816d-4f80-9ec1-ef03e589b2b5" path="/var/lib/kubelet/pods/4e78bcb8-816d-4f80-9ec1-ef03e589b2b5/volumes" Feb 19 22:50:58 crc kubenswrapper[4795]: I0219 22:50:58.427476 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 22:50:58 crc kubenswrapper[4795]: I0219 22:50:58.427999 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 22:50:58 crc kubenswrapper[4795]: I0219 22:50:58.428067 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" Feb 19 22:50:58 crc kubenswrapper[4795]: I0219 22:50:58.428775 4795 kuberuntime_manager.go:1027] "Message for Container of 
pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4bde20d829d39ae7600d60632441672a1a7026beecea3debeab953cdf5de428a"} pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 22:50:58 crc kubenswrapper[4795]: I0219 22:50:58.428841 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" containerID="cri-o://4bde20d829d39ae7600d60632441672a1a7026beecea3debeab953cdf5de428a" gracePeriod=600 Feb 19 22:50:58 crc kubenswrapper[4795]: E0219 22:50:58.555654 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:50:59 crc kubenswrapper[4795]: I0219 22:50:59.180859 4795 generic.go:334] "Generic (PLEG): container finished" podID="7591bc58-96f5-486a-8653-0ad93938b019" containerID="4bde20d829d39ae7600d60632441672a1a7026beecea3debeab953cdf5de428a" exitCode=0 Feb 19 22:50:59 crc kubenswrapper[4795]: I0219 22:50:59.180901 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerDied","Data":"4bde20d829d39ae7600d60632441672a1a7026beecea3debeab953cdf5de428a"} Feb 19 22:50:59 crc kubenswrapper[4795]: I0219 22:50:59.181001 4795 scope.go:117] "RemoveContainer" containerID="85a7b1a9bb1dc26b66551125ca13bcd3b5f6f4015b61adf891b31e1f3b13c640" Feb 19 22:50:59 crc 
kubenswrapper[4795]: I0219 22:50:59.181488 4795 scope.go:117] "RemoveContainer" containerID="4bde20d829d39ae7600d60632441672a1a7026beecea3debeab953cdf5de428a" Feb 19 22:50:59 crc kubenswrapper[4795]: E0219 22:50:59.181719 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:51:10 crc kubenswrapper[4795]: I0219 22:51:10.511802 4795 scope.go:117] "RemoveContainer" containerID="4bde20d829d39ae7600d60632441672a1a7026beecea3debeab953cdf5de428a" Feb 19 22:51:10 crc kubenswrapper[4795]: E0219 22:51:10.512623 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:51:14 crc kubenswrapper[4795]: I0219 22:51:14.784507 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-copy-data"] Feb 19 22:51:14 crc kubenswrapper[4795]: E0219 22:51:14.785229 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5964e2d1-6384-4043-9857-a20ea29bd451" containerName="registry-server" Feb 19 22:51:14 crc kubenswrapper[4795]: I0219 22:51:14.785246 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5964e2d1-6384-4043-9857-a20ea29bd451" containerName="registry-server" Feb 19 22:51:14 crc kubenswrapper[4795]: E0219 22:51:14.785260 4795 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="4e78bcb8-816d-4f80-9ec1-ef03e589b2b5" containerName="registry-server" Feb 19 22:51:14 crc kubenswrapper[4795]: I0219 22:51:14.785268 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e78bcb8-816d-4f80-9ec1-ef03e589b2b5" containerName="registry-server" Feb 19 22:51:14 crc kubenswrapper[4795]: E0219 22:51:14.785296 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5964e2d1-6384-4043-9857-a20ea29bd451" containerName="extract-content" Feb 19 22:51:14 crc kubenswrapper[4795]: I0219 22:51:14.785305 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5964e2d1-6384-4043-9857-a20ea29bd451" containerName="extract-content" Feb 19 22:51:14 crc kubenswrapper[4795]: E0219 22:51:14.785318 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e78bcb8-816d-4f80-9ec1-ef03e589b2b5" containerName="extract-utilities" Feb 19 22:51:14 crc kubenswrapper[4795]: I0219 22:51:14.785326 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e78bcb8-816d-4f80-9ec1-ef03e589b2b5" containerName="extract-utilities" Feb 19 22:51:14 crc kubenswrapper[4795]: E0219 22:51:14.785342 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5964e2d1-6384-4043-9857-a20ea29bd451" containerName="extract-utilities" Feb 19 22:51:14 crc kubenswrapper[4795]: I0219 22:51:14.785353 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5964e2d1-6384-4043-9857-a20ea29bd451" containerName="extract-utilities" Feb 19 22:51:14 crc kubenswrapper[4795]: E0219 22:51:14.785374 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e78bcb8-816d-4f80-9ec1-ef03e589b2b5" containerName="extract-content" Feb 19 22:51:14 crc kubenswrapper[4795]: I0219 22:51:14.785384 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e78bcb8-816d-4f80-9ec1-ef03e589b2b5" containerName="extract-content" Feb 19 22:51:14 crc kubenswrapper[4795]: I0219 22:51:14.785566 4795 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="5964e2d1-6384-4043-9857-a20ea29bd451" containerName="registry-server" Feb 19 22:51:14 crc kubenswrapper[4795]: I0219 22:51:14.785592 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e78bcb8-816d-4f80-9ec1-ef03e589b2b5" containerName="registry-server" Feb 19 22:51:14 crc kubenswrapper[4795]: I0219 22:51:14.786303 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Feb 19 22:51:14 crc kubenswrapper[4795]: I0219 22:51:14.788485 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-p5fxh" Feb 19 22:51:14 crc kubenswrapper[4795]: I0219 22:51:14.803483 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Feb 19 22:51:14 crc kubenswrapper[4795]: I0219 22:51:14.886612 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzsrb\" (UniqueName: \"kubernetes.io/projected/4f232979-ab9c-4b59-8ad8-7756367fe0bf-kube-api-access-pzsrb\") pod \"mariadb-copy-data\" (UID: \"4f232979-ab9c-4b59-8ad8-7756367fe0bf\") " pod="openstack/mariadb-copy-data" Feb 19 22:51:14 crc kubenswrapper[4795]: I0219 22:51:14.886724 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-b6dccae3-37b8-4abc-8831-9e7ebad3b6a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b6dccae3-37b8-4abc-8831-9e7ebad3b6a8\") pod \"mariadb-copy-data\" (UID: \"4f232979-ab9c-4b59-8ad8-7756367fe0bf\") " pod="openstack/mariadb-copy-data" Feb 19 22:51:14 crc kubenswrapper[4795]: I0219 22:51:14.987904 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-b6dccae3-37b8-4abc-8831-9e7ebad3b6a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b6dccae3-37b8-4abc-8831-9e7ebad3b6a8\") pod \"mariadb-copy-data\" (UID: \"4f232979-ab9c-4b59-8ad8-7756367fe0bf\") " 
pod="openstack/mariadb-copy-data" Feb 19 22:51:14 crc kubenswrapper[4795]: I0219 22:51:14.988078 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzsrb\" (UniqueName: \"kubernetes.io/projected/4f232979-ab9c-4b59-8ad8-7756367fe0bf-kube-api-access-pzsrb\") pod \"mariadb-copy-data\" (UID: \"4f232979-ab9c-4b59-8ad8-7756367fe0bf\") " pod="openstack/mariadb-copy-data" Feb 19 22:51:14 crc kubenswrapper[4795]: I0219 22:51:14.990635 4795 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 19 22:51:14 crc kubenswrapper[4795]: I0219 22:51:14.990681 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-b6dccae3-37b8-4abc-8831-9e7ebad3b6a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b6dccae3-37b8-4abc-8831-9e7ebad3b6a8\") pod \"mariadb-copy-data\" (UID: \"4f232979-ab9c-4b59-8ad8-7756367fe0bf\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/d0db1225f95d454647216c5717445acc04f6111435881f684855bea7543e0b64/globalmount\"" pod="openstack/mariadb-copy-data" Feb 19 22:51:15 crc kubenswrapper[4795]: I0219 22:51:15.011592 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzsrb\" (UniqueName: \"kubernetes.io/projected/4f232979-ab9c-4b59-8ad8-7756367fe0bf-kube-api-access-pzsrb\") pod \"mariadb-copy-data\" (UID: \"4f232979-ab9c-4b59-8ad8-7756367fe0bf\") " pod="openstack/mariadb-copy-data" Feb 19 22:51:15 crc kubenswrapper[4795]: I0219 22:51:15.021257 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-b6dccae3-37b8-4abc-8831-9e7ebad3b6a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b6dccae3-37b8-4abc-8831-9e7ebad3b6a8\") pod \"mariadb-copy-data\" (UID: \"4f232979-ab9c-4b59-8ad8-7756367fe0bf\") " pod="openstack/mariadb-copy-data" Feb 19 22:51:15 crc 
kubenswrapper[4795]: I0219 22:51:15.071043 4795 scope.go:117] "RemoveContainer" containerID="9acf4103dbab2da287716a026f87427a75c0734ef6734fb45eb23664d9f962a3" Feb 19 22:51:15 crc kubenswrapper[4795]: I0219 22:51:15.114419 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Feb 19 22:51:15 crc kubenswrapper[4795]: I0219 22:51:15.601137 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Feb 19 22:51:16 crc kubenswrapper[4795]: I0219 22:51:16.307297 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"4f232979-ab9c-4b59-8ad8-7756367fe0bf","Type":"ContainerStarted","Data":"d99c65ed2cf2832977c62a47557eeea8eec734877d891b4ac4fe2f4a681f7224"} Feb 19 22:51:16 crc kubenswrapper[4795]: I0219 22:51:16.307695 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"4f232979-ab9c-4b59-8ad8-7756367fe0bf","Type":"ContainerStarted","Data":"6654d86759c1aab8319afbb64f442570ab81fe83e940c47c9e22d533fa8c1665"} Feb 19 22:51:16 crc kubenswrapper[4795]: I0219 22:51:16.331337 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-copy-data" podStartSLOduration=3.331315563 podStartE2EDuration="3.331315563s" podCreationTimestamp="2026-02-19 22:51:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:51:16.328095761 +0000 UTC m=+4987.520613625" watchObservedRunningTime="2026-02-19 22:51:16.331315563 +0000 UTC m=+4987.523833427" Feb 19 22:51:19 crc kubenswrapper[4795]: I0219 22:51:19.441522 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Feb 19 22:51:19 crc kubenswrapper[4795]: I0219 22:51:19.443146 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Feb 19 22:51:19 crc kubenswrapper[4795]: I0219 22:51:19.451779 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Feb 19 22:51:19 crc kubenswrapper[4795]: I0219 22:51:19.574992 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rz7jf\" (UniqueName: \"kubernetes.io/projected/cc605ab7-0f74-4d42-881d-c486eee6bd72-kube-api-access-rz7jf\") pod \"mariadb-client\" (UID: \"cc605ab7-0f74-4d42-881d-c486eee6bd72\") " pod="openstack/mariadb-client" Feb 19 22:51:19 crc kubenswrapper[4795]: I0219 22:51:19.676267 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rz7jf\" (UniqueName: \"kubernetes.io/projected/cc605ab7-0f74-4d42-881d-c486eee6bd72-kube-api-access-rz7jf\") pod \"mariadb-client\" (UID: \"cc605ab7-0f74-4d42-881d-c486eee6bd72\") " pod="openstack/mariadb-client" Feb 19 22:51:19 crc kubenswrapper[4795]: I0219 22:51:19.697853 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rz7jf\" (UniqueName: \"kubernetes.io/projected/cc605ab7-0f74-4d42-881d-c486eee6bd72-kube-api-access-rz7jf\") pod \"mariadb-client\" (UID: \"cc605ab7-0f74-4d42-881d-c486eee6bd72\") " pod="openstack/mariadb-client" Feb 19 22:51:19 crc kubenswrapper[4795]: I0219 22:51:19.767429 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Feb 19 22:51:20 crc kubenswrapper[4795]: I0219 22:51:19.999922 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Feb 19 22:51:20 crc kubenswrapper[4795]: I0219 22:51:20.410574 4795 generic.go:334] "Generic (PLEG): container finished" podID="cc605ab7-0f74-4d42-881d-c486eee6bd72" containerID="f513386e45a86e8c71f0247f1dc15d3e271452b3797cb4850f3ce460e73e544e" exitCode=0 Feb 19 22:51:20 crc kubenswrapper[4795]: I0219 22:51:20.410653 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"cc605ab7-0f74-4d42-881d-c486eee6bd72","Type":"ContainerDied","Data":"f513386e45a86e8c71f0247f1dc15d3e271452b3797cb4850f3ce460e73e544e"} Feb 19 22:51:20 crc kubenswrapper[4795]: I0219 22:51:20.410677 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"cc605ab7-0f74-4d42-881d-c486eee6bd72","Type":"ContainerStarted","Data":"a90182db1e02dd2460da3caae1b4690b2f8e06fbecf0eed6750807d1237bf802"} Feb 19 22:51:21 crc kubenswrapper[4795]: I0219 22:51:21.708111 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Feb 19 22:51:21 crc kubenswrapper[4795]: I0219 22:51:21.745826 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_cc605ab7-0f74-4d42-881d-c486eee6bd72/mariadb-client/0.log" Feb 19 22:51:21 crc kubenswrapper[4795]: I0219 22:51:21.771634 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Feb 19 22:51:21 crc kubenswrapper[4795]: I0219 22:51:21.776581 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Feb 19 22:51:21 crc kubenswrapper[4795]: I0219 22:51:21.809800 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rz7jf\" (UniqueName: \"kubernetes.io/projected/cc605ab7-0f74-4d42-881d-c486eee6bd72-kube-api-access-rz7jf\") pod \"cc605ab7-0f74-4d42-881d-c486eee6bd72\" (UID: \"cc605ab7-0f74-4d42-881d-c486eee6bd72\") " Feb 19 22:51:21 crc kubenswrapper[4795]: I0219 22:51:21.815504 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc605ab7-0f74-4d42-881d-c486eee6bd72-kube-api-access-rz7jf" (OuterVolumeSpecName: "kube-api-access-rz7jf") pod "cc605ab7-0f74-4d42-881d-c486eee6bd72" (UID: "cc605ab7-0f74-4d42-881d-c486eee6bd72"). InnerVolumeSpecName "kube-api-access-rz7jf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:51:21 crc kubenswrapper[4795]: I0219 22:51:21.911907 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rz7jf\" (UniqueName: \"kubernetes.io/projected/cc605ab7-0f74-4d42-881d-c486eee6bd72-kube-api-access-rz7jf\") on node \"crc\" DevicePath \"\"" Feb 19 22:51:21 crc kubenswrapper[4795]: I0219 22:51:21.930656 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Feb 19 22:51:21 crc kubenswrapper[4795]: E0219 22:51:21.930948 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc605ab7-0f74-4d42-881d-c486eee6bd72" containerName="mariadb-client" Feb 19 22:51:21 crc kubenswrapper[4795]: I0219 22:51:21.930965 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc605ab7-0f74-4d42-881d-c486eee6bd72" containerName="mariadb-client" Feb 19 22:51:21 crc kubenswrapper[4795]: I0219 22:51:21.931126 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc605ab7-0f74-4d42-881d-c486eee6bd72" containerName="mariadb-client" Feb 19 22:51:21 crc kubenswrapper[4795]: I0219 22:51:21.932798 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Feb 19 22:51:21 crc kubenswrapper[4795]: I0219 22:51:21.942679 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Feb 19 22:51:22 crc kubenswrapper[4795]: I0219 22:51:22.013565 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8zf8\" (UniqueName: \"kubernetes.io/projected/8d6f031b-713a-4c22-8017-5a615e34004f-kube-api-access-l8zf8\") pod \"mariadb-client\" (UID: \"8d6f031b-713a-4c22-8017-5a615e34004f\") " pod="openstack/mariadb-client" Feb 19 22:51:22 crc kubenswrapper[4795]: I0219 22:51:22.116120 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8zf8\" (UniqueName: \"kubernetes.io/projected/8d6f031b-713a-4c22-8017-5a615e34004f-kube-api-access-l8zf8\") pod \"mariadb-client\" (UID: \"8d6f031b-713a-4c22-8017-5a615e34004f\") " pod="openstack/mariadb-client" Feb 19 22:51:22 crc kubenswrapper[4795]: I0219 22:51:22.134307 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8zf8\" (UniqueName: \"kubernetes.io/projected/8d6f031b-713a-4c22-8017-5a615e34004f-kube-api-access-l8zf8\") pod \"mariadb-client\" (UID: \"8d6f031b-713a-4c22-8017-5a615e34004f\") " pod="openstack/mariadb-client" Feb 19 22:51:22 crc kubenswrapper[4795]: I0219 22:51:22.253905 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Feb 19 22:51:22 crc kubenswrapper[4795]: I0219 22:51:22.432707 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a90182db1e02dd2460da3caae1b4690b2f8e06fbecf0eed6750807d1237bf802" Feb 19 22:51:22 crc kubenswrapper[4795]: I0219 22:51:22.432909 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Feb 19 22:51:22 crc kubenswrapper[4795]: I0219 22:51:22.452095 4795 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/mariadb-client" oldPodUID="cc605ab7-0f74-4d42-881d-c486eee6bd72" podUID="8d6f031b-713a-4c22-8017-5a615e34004f" Feb 19 22:51:22 crc kubenswrapper[4795]: I0219 22:51:22.497944 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Feb 19 22:51:22 crc kubenswrapper[4795]: W0219 22:51:22.498604 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d6f031b_713a_4c22_8017_5a615e34004f.slice/crio-7e704049a2bbd5adf282108cd27b5f71465e6a42c51ed4657129b08be0639fa4 WatchSource:0}: Error finding container 7e704049a2bbd5adf282108cd27b5f71465e6a42c51ed4657129b08be0639fa4: Status 404 returned error can't find the container with id 7e704049a2bbd5adf282108cd27b5f71465e6a42c51ed4657129b08be0639fa4 Feb 19 22:51:22 crc kubenswrapper[4795]: I0219 22:51:22.511425 4795 scope.go:117] "RemoveContainer" containerID="4bde20d829d39ae7600d60632441672a1a7026beecea3debeab953cdf5de428a" Feb 19 22:51:22 crc kubenswrapper[4795]: E0219 22:51:22.511683 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:51:23 crc kubenswrapper[4795]: I0219 22:51:23.440092 4795 generic.go:334] "Generic (PLEG): container finished" podID="8d6f031b-713a-4c22-8017-5a615e34004f" containerID="ddf33d9929aa340d91607be45827bb09c61b346f1c01e148af46a4a7d07c2f45" exitCode=0 Feb 19 22:51:23 crc kubenswrapper[4795]: I0219 
22:51:23.440186 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"8d6f031b-713a-4c22-8017-5a615e34004f","Type":"ContainerDied","Data":"ddf33d9929aa340d91607be45827bb09c61b346f1c01e148af46a4a7d07c2f45"} Feb 19 22:51:23 crc kubenswrapper[4795]: I0219 22:51:23.440393 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"8d6f031b-713a-4c22-8017-5a615e34004f","Type":"ContainerStarted","Data":"7e704049a2bbd5adf282108cd27b5f71465e6a42c51ed4657129b08be0639fa4"} Feb 19 22:51:23 crc kubenswrapper[4795]: I0219 22:51:23.525687 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc605ab7-0f74-4d42-881d-c486eee6bd72" path="/var/lib/kubelet/pods/cc605ab7-0f74-4d42-881d-c486eee6bd72/volumes" Feb 19 22:51:24 crc kubenswrapper[4795]: I0219 22:51:24.737029 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Feb 19 22:51:24 crc kubenswrapper[4795]: I0219 22:51:24.758556 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_8d6f031b-713a-4c22-8017-5a615e34004f/mariadb-client/0.log" Feb 19 22:51:24 crc kubenswrapper[4795]: I0219 22:51:24.786821 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Feb 19 22:51:24 crc kubenswrapper[4795]: I0219 22:51:24.791106 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Feb 19 22:51:24 crc kubenswrapper[4795]: I0219 22:51:24.855872 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8zf8\" (UniqueName: \"kubernetes.io/projected/8d6f031b-713a-4c22-8017-5a615e34004f-kube-api-access-l8zf8\") pod \"8d6f031b-713a-4c22-8017-5a615e34004f\" (UID: \"8d6f031b-713a-4c22-8017-5a615e34004f\") " Feb 19 22:51:24 crc kubenswrapper[4795]: I0219 22:51:24.862347 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/8d6f031b-713a-4c22-8017-5a615e34004f-kube-api-access-l8zf8" (OuterVolumeSpecName: "kube-api-access-l8zf8") pod "8d6f031b-713a-4c22-8017-5a615e34004f" (UID: "8d6f031b-713a-4c22-8017-5a615e34004f"). InnerVolumeSpecName "kube-api-access-l8zf8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:51:24 crc kubenswrapper[4795]: I0219 22:51:24.957668 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8zf8\" (UniqueName: \"kubernetes.io/projected/8d6f031b-713a-4c22-8017-5a615e34004f-kube-api-access-l8zf8\") on node \"crc\" DevicePath \"\"" Feb 19 22:51:25 crc kubenswrapper[4795]: I0219 22:51:25.455259 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e704049a2bbd5adf282108cd27b5f71465e6a42c51ed4657129b08be0639fa4" Feb 19 22:51:25 crc kubenswrapper[4795]: I0219 22:51:25.455708 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Feb 19 22:51:25 crc kubenswrapper[4795]: I0219 22:51:25.520186 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d6f031b-713a-4c22-8017-5a615e34004f" path="/var/lib/kubelet/pods/8d6f031b-713a-4c22-8017-5a615e34004f/volumes" Feb 19 22:51:34 crc kubenswrapper[4795]: I0219 22:51:34.511430 4795 scope.go:117] "RemoveContainer" containerID="4bde20d829d39ae7600d60632441672a1a7026beecea3debeab953cdf5de428a" Feb 19 22:51:34 crc kubenswrapper[4795]: E0219 22:51:34.512226 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:51:47 crc kubenswrapper[4795]: I0219 22:51:47.512054 4795 
scope.go:117] "RemoveContainer" containerID="4bde20d829d39ae7600d60632441672a1a7026beecea3debeab953cdf5de428a" Feb 19 22:51:47 crc kubenswrapper[4795]: E0219 22:51:47.512872 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.553588 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 19 22:51:54 crc kubenswrapper[4795]: E0219 22:51:54.554468 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d6f031b-713a-4c22-8017-5a615e34004f" containerName="mariadb-client" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.554485 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d6f031b-713a-4c22-8017-5a615e34004f" containerName="mariadb-client" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.554674 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d6f031b-713a-4c22-8017-5a615e34004f" containerName="mariadb-client" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.555543 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.558497 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.559232 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.559475 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-2mgdw" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.586975 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.607841 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-2"] Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.609595 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.612649 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-1"] Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.614990 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-1" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.620731 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/924d2a8a-2ae7-417a-9770-054662474286-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"924d2a8a-2ae7-417a-9770-054662474286\") " pod="openstack/ovsdbserver-sb-0" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.621094 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/924d2a8a-2ae7-417a-9770-054662474286-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"924d2a8a-2ae7-417a-9770-054662474286\") " pod="openstack/ovsdbserver-sb-0" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.621136 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/924d2a8a-2ae7-417a-9770-054662474286-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"924d2a8a-2ae7-417a-9770-054662474286\") " pod="openstack/ovsdbserver-sb-0" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.621200 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/924d2a8a-2ae7-417a-9770-054662474286-config\") pod \"ovsdbserver-sb-0\" (UID: \"924d2a8a-2ae7-417a-9770-054662474286\") " pod="openstack/ovsdbserver-sb-0" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.621413 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlm54\" (UniqueName: \"kubernetes.io/projected/924d2a8a-2ae7-417a-9770-054662474286-kube-api-access-qlm54\") pod \"ovsdbserver-sb-0\" (UID: \"924d2a8a-2ae7-417a-9770-054662474286\") " pod="openstack/ovsdbserver-sb-0" Feb 19 22:51:54 crc 
kubenswrapper[4795]: I0219 22:51:54.621486 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-552c33e5-c2b0-4416-b2bd-0144a712b3cc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-552c33e5-c2b0-4416-b2bd-0144a712b3cc\") pod \"ovsdbserver-sb-0\" (UID: \"924d2a8a-2ae7-417a-9770-054662474286\") " pod="openstack/ovsdbserver-sb-0" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.630055 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.637966 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.723369 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c5bddd7-705d-41b3-ad43-1889c6c34ab0-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"9c5bddd7-705d-41b3-ad43-1889c6c34ab0\") " pod="openstack/ovsdbserver-sb-1" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.723412 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9c8cf9f5-7499-4c52-9710-91b96d49b0fc-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"9c8cf9f5-7499-4c52-9710-91b96d49b0fc\") " pod="openstack/ovsdbserver-sb-2" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.723430 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snclf\" (UniqueName: \"kubernetes.io/projected/9c5bddd7-705d-41b3-ad43-1889c6c34ab0-kube-api-access-snclf\") pod \"ovsdbserver-sb-1\" (UID: \"9c5bddd7-705d-41b3-ad43-1889c6c34ab0\") " pod="openstack/ovsdbserver-sb-1" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.723450 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9c5bddd7-705d-41b3-ad43-1889c6c34ab0-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"9c5bddd7-705d-41b3-ad43-1889c6c34ab0\") " pod="openstack/ovsdbserver-sb-1" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.723653 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsfq9\" (UniqueName: \"kubernetes.io/projected/9c8cf9f5-7499-4c52-9710-91b96d49b0fc-kube-api-access-tsfq9\") pod \"ovsdbserver-sb-2\" (UID: \"9c8cf9f5-7499-4c52-9710-91b96d49b0fc\") " pod="openstack/ovsdbserver-sb-2" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.723766 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9c5bddd7-705d-41b3-ad43-1889c6c34ab0-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"9c5bddd7-705d-41b3-ad43-1889c6c34ab0\") " pod="openstack/ovsdbserver-sb-1" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.723876 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/924d2a8a-2ae7-417a-9770-054662474286-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"924d2a8a-2ae7-417a-9770-054662474286\") " pod="openstack/ovsdbserver-sb-0" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.723917 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/924d2a8a-2ae7-417a-9770-054662474286-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"924d2a8a-2ae7-417a-9770-054662474286\") " pod="openstack/ovsdbserver-sb-0" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.723948 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/924d2a8a-2ae7-417a-9770-054662474286-config\") pod \"ovsdbserver-sb-0\" (UID: \"924d2a8a-2ae7-417a-9770-054662474286\") " pod="openstack/ovsdbserver-sb-0" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.723975 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c8cf9f5-7499-4c52-9710-91b96d49b0fc-config\") pod \"ovsdbserver-sb-2\" (UID: \"9c8cf9f5-7499-4c52-9710-91b96d49b0fc\") " pod="openstack/ovsdbserver-sb-2" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.723996 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c8cf9f5-7499-4c52-9710-91b96d49b0fc-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"9c8cf9f5-7499-4c52-9710-91b96d49b0fc\") " pod="openstack/ovsdbserver-sb-2" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.724086 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlm54\" (UniqueName: \"kubernetes.io/projected/924d2a8a-2ae7-417a-9770-054662474286-kube-api-access-qlm54\") pod \"ovsdbserver-sb-0\" (UID: \"924d2a8a-2ae7-417a-9770-054662474286\") " pod="openstack/ovsdbserver-sb-0" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.724123 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-552c33e5-c2b0-4416-b2bd-0144a712b3cc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-552c33e5-c2b0-4416-b2bd-0144a712b3cc\") pod \"ovsdbserver-sb-0\" (UID: \"924d2a8a-2ae7-417a-9770-054662474286\") " pod="openstack/ovsdbserver-sb-0" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.724158 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-841e812f-b743-45d7-a4f7-61e300ae1598\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-841e812f-b743-45d7-a4f7-61e300ae1598\") pod \"ovsdbserver-sb-1\" (UID: \"9c5bddd7-705d-41b3-ad43-1889c6c34ab0\") " pod="openstack/ovsdbserver-sb-1" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.724254 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9c8cf9f5-7499-4c52-9710-91b96d49b0fc-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"9c8cf9f5-7499-4c52-9710-91b96d49b0fc\") " pod="openstack/ovsdbserver-sb-2" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.724346 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-46120ff0-7821-4a9b-85e7-062ad17a54a1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-46120ff0-7821-4a9b-85e7-062ad17a54a1\") pod \"ovsdbserver-sb-2\" (UID: \"9c8cf9f5-7499-4c52-9710-91b96d49b0fc\") " pod="openstack/ovsdbserver-sb-2" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.724387 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/924d2a8a-2ae7-417a-9770-054662474286-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"924d2a8a-2ae7-417a-9770-054662474286\") " pod="openstack/ovsdbserver-sb-0" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.724418 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c5bddd7-705d-41b3-ad43-1889c6c34ab0-config\") pod \"ovsdbserver-sb-1\" (UID: \"9c5bddd7-705d-41b3-ad43-1889c6c34ab0\") " pod="openstack/ovsdbserver-sb-1" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.724457 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/924d2a8a-2ae7-417a-9770-054662474286-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: 
\"924d2a8a-2ae7-417a-9770-054662474286\") " pod="openstack/ovsdbserver-sb-0" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.725128 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/924d2a8a-2ae7-417a-9770-054662474286-config\") pod \"ovsdbserver-sb-0\" (UID: \"924d2a8a-2ae7-417a-9770-054662474286\") " pod="openstack/ovsdbserver-sb-0" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.725588 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/924d2a8a-2ae7-417a-9770-054662474286-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"924d2a8a-2ae7-417a-9770-054662474286\") " pod="openstack/ovsdbserver-sb-0" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.731006 4795 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.731270 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-552c33e5-c2b0-4416-b2bd-0144a712b3cc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-552c33e5-c2b0-4416-b2bd-0144a712b3cc\") pod \"ovsdbserver-sb-0\" (UID: \"924d2a8a-2ae7-417a-9770-054662474286\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/3b528f5553046ea28336dd5dedcb616848a6b59d36a3821c411aaad7c89c5a53/globalmount\"" pod="openstack/ovsdbserver-sb-0" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.731565 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/924d2a8a-2ae7-417a-9770-054662474286-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"924d2a8a-2ae7-417a-9770-054662474286\") " pod="openstack/ovsdbserver-sb-0" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.741592 4795 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.742979 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.749992 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-znw7z" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.750229 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.750349 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.757931 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.765126 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlm54\" (UniqueName: \"kubernetes.io/projected/924d2a8a-2ae7-417a-9770-054662474286-kube-api-access-qlm54\") pod \"ovsdbserver-sb-0\" (UID: \"924d2a8a-2ae7-417a-9770-054662474286\") " pod="openstack/ovsdbserver-sb-0" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.765974 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-1"] Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.767260 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.772295 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-2"] Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.774616 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-2" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.781422 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.793116 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-552c33e5-c2b0-4416-b2bd-0144a712b3cc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-552c33e5-c2b0-4416-b2bd-0144a712b3cc\") pod \"ovsdbserver-sb-0\" (UID: \"924d2a8a-2ae7-417a-9770-054662474286\") " pod="openstack/ovsdbserver-sb-0" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.810556 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.826204 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b32c19b-2b8b-4587-9327-1ddf5b074ad6-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"9b32c19b-2b8b-4587-9327-1ddf5b074ad6\") " pod="openstack/ovsdbserver-nb-0" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.826299 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/188a11e4-50de-4672-baaf-89a3a512cd0c-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"188a11e4-50de-4672-baaf-89a3a512cd0c\") " pod="openstack/ovsdbserver-nb-2" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.826350 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7lh6\" (UniqueName: \"kubernetes.io/projected/9b32c19b-2b8b-4587-9327-1ddf5b074ad6-kube-api-access-z7lh6\") pod \"ovsdbserver-nb-0\" (UID: \"9b32c19b-2b8b-4587-9327-1ddf5b074ad6\") " pod="openstack/ovsdbserver-nb-0" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.826372 4795 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tpf5\" (UniqueName: \"kubernetes.io/projected/f814768e-2961-4d2a-ba3b-615dea717cf8-kube-api-access-9tpf5\") pod \"ovsdbserver-nb-1\" (UID: \"f814768e-2961-4d2a-ba3b-615dea717cf8\") " pod="openstack/ovsdbserver-nb-1" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.826398 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-841e812f-b743-45d7-a4f7-61e300ae1598\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-841e812f-b743-45d7-a4f7-61e300ae1598\") pod \"ovsdbserver-sb-1\" (UID: \"9c5bddd7-705d-41b3-ad43-1889c6c34ab0\") " pod="openstack/ovsdbserver-sb-1" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.826414 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-1bd16f90-4266-4719-9291-209b6cf6174c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1bd16f90-4266-4719-9291-209b6cf6174c\") pod \"ovsdbserver-nb-0\" (UID: \"9b32c19b-2b8b-4587-9327-1ddf5b074ad6\") " pod="openstack/ovsdbserver-nb-0" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.826434 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9c8cf9f5-7499-4c52-9710-91b96d49b0fc-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"9c8cf9f5-7499-4c52-9710-91b96d49b0fc\") " pod="openstack/ovsdbserver-sb-2" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.826465 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9b32c19b-2b8b-4587-9327-1ddf5b074ad6-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"9b32c19b-2b8b-4587-9327-1ddf5b074ad6\") " pod="openstack/ovsdbserver-nb-0" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.826494 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-46120ff0-7821-4a9b-85e7-062ad17a54a1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-46120ff0-7821-4a9b-85e7-062ad17a54a1\") pod \"ovsdbserver-sb-2\" (UID: \"9c8cf9f5-7499-4c52-9710-91b96d49b0fc\") " pod="openstack/ovsdbserver-sb-2" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.826515 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f814768e-2961-4d2a-ba3b-615dea717cf8-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"f814768e-2961-4d2a-ba3b-615dea717cf8\") " pod="openstack/ovsdbserver-nb-1" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.826539 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c5bddd7-705d-41b3-ad43-1889c6c34ab0-config\") pod \"ovsdbserver-sb-1\" (UID: \"9c5bddd7-705d-41b3-ad43-1889c6c34ab0\") " pod="openstack/ovsdbserver-sb-1" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.826561 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/188a11e4-50de-4672-baaf-89a3a512cd0c-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"188a11e4-50de-4672-baaf-89a3a512cd0c\") " pod="openstack/ovsdbserver-nb-2" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.826580 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-08539c3c-3934-44e9-9aad-66b43491c570\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-08539c3c-3934-44e9-9aad-66b43491c570\") pod \"ovsdbserver-nb-2\" (UID: \"188a11e4-50de-4672-baaf-89a3a512cd0c\") " pod="openstack/ovsdbserver-nb-2" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.826716 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c5bddd7-705d-41b3-ad43-1889c6c34ab0-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"9c5bddd7-705d-41b3-ad43-1889c6c34ab0\") " pod="openstack/ovsdbserver-sb-1" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.826813 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9c8cf9f5-7499-4c52-9710-91b96d49b0fc-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"9c8cf9f5-7499-4c52-9710-91b96d49b0fc\") " pod="openstack/ovsdbserver-sb-2" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.826833 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snclf\" (UniqueName: \"kubernetes.io/projected/9c5bddd7-705d-41b3-ad43-1889c6c34ab0-kube-api-access-snclf\") pod \"ovsdbserver-sb-1\" (UID: \"9c5bddd7-705d-41b3-ad43-1889c6c34ab0\") " pod="openstack/ovsdbserver-sb-1" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.826852 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9c5bddd7-705d-41b3-ad43-1889c6c34ab0-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"9c5bddd7-705d-41b3-ad43-1889c6c34ab0\") " pod="openstack/ovsdbserver-sb-1" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.826877 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-b6d12d84-b782-4a08-b2af-133e42b5db7b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b6d12d84-b782-4a08-b2af-133e42b5db7b\") pod \"ovsdbserver-nb-1\" (UID: \"f814768e-2961-4d2a-ba3b-615dea717cf8\") " pod="openstack/ovsdbserver-nb-1" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.826892 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/9b32c19b-2b8b-4587-9327-1ddf5b074ad6-config\") pod \"ovsdbserver-nb-0\" (UID: \"9b32c19b-2b8b-4587-9327-1ddf5b074ad6\") " pod="openstack/ovsdbserver-nb-0" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.826910 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/188a11e4-50de-4672-baaf-89a3a512cd0c-config\") pod \"ovsdbserver-nb-2\" (UID: \"188a11e4-50de-4672-baaf-89a3a512cd0c\") " pod="openstack/ovsdbserver-nb-2" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.826930 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsfq9\" (UniqueName: \"kubernetes.io/projected/9c8cf9f5-7499-4c52-9710-91b96d49b0fc-kube-api-access-tsfq9\") pod \"ovsdbserver-sb-2\" (UID: \"9c8cf9f5-7499-4c52-9710-91b96d49b0fc\") " pod="openstack/ovsdbserver-sb-2" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.826947 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/188a11e4-50de-4672-baaf-89a3a512cd0c-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"188a11e4-50de-4672-baaf-89a3a512cd0c\") " pod="openstack/ovsdbserver-nb-2" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.826970 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vpjf\" (UniqueName: \"kubernetes.io/projected/188a11e4-50de-4672-baaf-89a3a512cd0c-kube-api-access-7vpjf\") pod \"ovsdbserver-nb-2\" (UID: \"188a11e4-50de-4672-baaf-89a3a512cd0c\") " pod="openstack/ovsdbserver-nb-2" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.827029 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9c5bddd7-705d-41b3-ad43-1889c6c34ab0-scripts\") pod \"ovsdbserver-sb-1\" (UID: 
\"9c5bddd7-705d-41b3-ad43-1889c6c34ab0\") " pod="openstack/ovsdbserver-sb-1" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.827069 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9b32c19b-2b8b-4587-9327-1ddf5b074ad6-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"9b32c19b-2b8b-4587-9327-1ddf5b074ad6\") " pod="openstack/ovsdbserver-nb-0" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.827088 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f814768e-2961-4d2a-ba3b-615dea717cf8-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"f814768e-2961-4d2a-ba3b-615dea717cf8\") " pod="openstack/ovsdbserver-nb-1" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.827107 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f814768e-2961-4d2a-ba3b-615dea717cf8-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"f814768e-2961-4d2a-ba3b-615dea717cf8\") " pod="openstack/ovsdbserver-nb-1" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.827129 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f814768e-2961-4d2a-ba3b-615dea717cf8-config\") pod \"ovsdbserver-nb-1\" (UID: \"f814768e-2961-4d2a-ba3b-615dea717cf8\") " pod="openstack/ovsdbserver-nb-1" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.827151 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c8cf9f5-7499-4c52-9710-91b96d49b0fc-config\") pod \"ovsdbserver-sb-2\" (UID: \"9c8cf9f5-7499-4c52-9710-91b96d49b0fc\") " pod="openstack/ovsdbserver-sb-2" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 
22:51:54.827182 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c8cf9f5-7499-4c52-9710-91b96d49b0fc-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"9c8cf9f5-7499-4c52-9710-91b96d49b0fc\") " pod="openstack/ovsdbserver-sb-2" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.827396 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c5bddd7-705d-41b3-ad43-1889c6c34ab0-config\") pod \"ovsdbserver-sb-1\" (UID: \"9c5bddd7-705d-41b3-ad43-1889c6c34ab0\") " pod="openstack/ovsdbserver-sb-1" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.827601 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9c8cf9f5-7499-4c52-9710-91b96d49b0fc-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"9c8cf9f5-7499-4c52-9710-91b96d49b0fc\") " pod="openstack/ovsdbserver-sb-2" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.827655 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9c5bddd7-705d-41b3-ad43-1889c6c34ab0-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"9c5bddd7-705d-41b3-ad43-1889c6c34ab0\") " pod="openstack/ovsdbserver-sb-1" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.828024 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9c8cf9f5-7499-4c52-9710-91b96d49b0fc-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"9c8cf9f5-7499-4c52-9710-91b96d49b0fc\") " pod="openstack/ovsdbserver-sb-2" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.828188 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9c5bddd7-705d-41b3-ad43-1889c6c34ab0-scripts\") pod \"ovsdbserver-sb-1\" (UID: 
\"9c5bddd7-705d-41b3-ad43-1889c6c34ab0\") " pod="openstack/ovsdbserver-sb-1" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.828544 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c8cf9f5-7499-4c52-9710-91b96d49b0fc-config\") pod \"ovsdbserver-sb-2\" (UID: \"9c8cf9f5-7499-4c52-9710-91b96d49b0fc\") " pod="openstack/ovsdbserver-sb-2" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.829883 4795 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.829910 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-841e812f-b743-45d7-a4f7-61e300ae1598\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-841e812f-b743-45d7-a4f7-61e300ae1598\") pod \"ovsdbserver-sb-1\" (UID: \"9c5bddd7-705d-41b3-ad43-1889c6c34ab0\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/aca27e3cff8edd54249a833e181f411a80bf436270cd04eb0a8f9162f59ecfd5/globalmount\"" pod="openstack/ovsdbserver-sb-1" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.830343 4795 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.830368 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-46120ff0-7821-4a9b-85e7-062ad17a54a1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-46120ff0-7821-4a9b-85e7-062ad17a54a1\") pod \"ovsdbserver-sb-2\" (UID: \"9c8cf9f5-7499-4c52-9710-91b96d49b0fc\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/0a2d8661200c1d07fe2ccaa0ae4e4d6b97a3f8f36bbb172d4a279baae4787109/globalmount\"" pod="openstack/ovsdbserver-sb-2" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.831219 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c5bddd7-705d-41b3-ad43-1889c6c34ab0-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"9c5bddd7-705d-41b3-ad43-1889c6c34ab0\") " pod="openstack/ovsdbserver-sb-1" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.833954 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c8cf9f5-7499-4c52-9710-91b96d49b0fc-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"9c8cf9f5-7499-4c52-9710-91b96d49b0fc\") " pod="openstack/ovsdbserver-sb-2" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.848397 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsfq9\" (UniqueName: \"kubernetes.io/projected/9c8cf9f5-7499-4c52-9710-91b96d49b0fc-kube-api-access-tsfq9\") pod \"ovsdbserver-sb-2\" (UID: \"9c8cf9f5-7499-4c52-9710-91b96d49b0fc\") " pod="openstack/ovsdbserver-sb-2" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.848613 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snclf\" (UniqueName: \"kubernetes.io/projected/9c5bddd7-705d-41b3-ad43-1889c6c34ab0-kube-api-access-snclf\") pod \"ovsdbserver-sb-1\" (UID: 
\"9c5bddd7-705d-41b3-ad43-1889c6c34ab0\") " pod="openstack/ovsdbserver-sb-1" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.876551 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-841e812f-b743-45d7-a4f7-61e300ae1598\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-841e812f-b743-45d7-a4f7-61e300ae1598\") pod \"ovsdbserver-sb-1\" (UID: \"9c5bddd7-705d-41b3-ad43-1889c6c34ab0\") " pod="openstack/ovsdbserver-sb-1" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.878110 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.880398 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-46120ff0-7821-4a9b-85e7-062ad17a54a1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-46120ff0-7821-4a9b-85e7-062ad17a54a1\") pod \"ovsdbserver-sb-2\" (UID: \"9c8cf9f5-7499-4c52-9710-91b96d49b0fc\") " pod="openstack/ovsdbserver-sb-2" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.929242 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9b32c19b-2b8b-4587-9327-1ddf5b074ad6-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"9b32c19b-2b8b-4587-9327-1ddf5b074ad6\") " pod="openstack/ovsdbserver-nb-0" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.929350 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f814768e-2961-4d2a-ba3b-615dea717cf8-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"f814768e-2961-4d2a-ba3b-615dea717cf8\") " pod="openstack/ovsdbserver-nb-1" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.929436 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/188a11e4-50de-4672-baaf-89a3a512cd0c-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"188a11e4-50de-4672-baaf-89a3a512cd0c\") " pod="openstack/ovsdbserver-nb-2" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.929488 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-08539c3c-3934-44e9-9aad-66b43491c570\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-08539c3c-3934-44e9-9aad-66b43491c570\") pod \"ovsdbserver-nb-2\" (UID: \"188a11e4-50de-4672-baaf-89a3a512cd0c\") " pod="openstack/ovsdbserver-nb-2" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.929537 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-b6d12d84-b782-4a08-b2af-133e42b5db7b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b6d12d84-b782-4a08-b2af-133e42b5db7b\") pod \"ovsdbserver-nb-1\" (UID: \"f814768e-2961-4d2a-ba3b-615dea717cf8\") " pod="openstack/ovsdbserver-nb-1" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.929558 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b32c19b-2b8b-4587-9327-1ddf5b074ad6-config\") pod \"ovsdbserver-nb-0\" (UID: \"9b32c19b-2b8b-4587-9327-1ddf5b074ad6\") " pod="openstack/ovsdbserver-nb-0" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.929580 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/188a11e4-50de-4672-baaf-89a3a512cd0c-config\") pod \"ovsdbserver-nb-2\" (UID: \"188a11e4-50de-4672-baaf-89a3a512cd0c\") " pod="openstack/ovsdbserver-nb-2" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.929610 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/188a11e4-50de-4672-baaf-89a3a512cd0c-scripts\") pod \"ovsdbserver-nb-2\" (UID: 
\"188a11e4-50de-4672-baaf-89a3a512cd0c\") " pod="openstack/ovsdbserver-nb-2" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.929638 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vpjf\" (UniqueName: \"kubernetes.io/projected/188a11e4-50de-4672-baaf-89a3a512cd0c-kube-api-access-7vpjf\") pod \"ovsdbserver-nb-2\" (UID: \"188a11e4-50de-4672-baaf-89a3a512cd0c\") " pod="openstack/ovsdbserver-nb-2" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.929672 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9b32c19b-2b8b-4587-9327-1ddf5b074ad6-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"9b32c19b-2b8b-4587-9327-1ddf5b074ad6\") " pod="openstack/ovsdbserver-nb-0" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.929691 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f814768e-2961-4d2a-ba3b-615dea717cf8-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"f814768e-2961-4d2a-ba3b-615dea717cf8\") " pod="openstack/ovsdbserver-nb-1" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.929716 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f814768e-2961-4d2a-ba3b-615dea717cf8-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"f814768e-2961-4d2a-ba3b-615dea717cf8\") " pod="openstack/ovsdbserver-nb-1" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.929738 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f814768e-2961-4d2a-ba3b-615dea717cf8-config\") pod \"ovsdbserver-nb-1\" (UID: \"f814768e-2961-4d2a-ba3b-615dea717cf8\") " pod="openstack/ovsdbserver-nb-1" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.929783 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b32c19b-2b8b-4587-9327-1ddf5b074ad6-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"9b32c19b-2b8b-4587-9327-1ddf5b074ad6\") " pod="openstack/ovsdbserver-nb-0" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.929806 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/188a11e4-50de-4672-baaf-89a3a512cd0c-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"188a11e4-50de-4672-baaf-89a3a512cd0c\") " pod="openstack/ovsdbserver-nb-2" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.929837 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7lh6\" (UniqueName: \"kubernetes.io/projected/9b32c19b-2b8b-4587-9327-1ddf5b074ad6-kube-api-access-z7lh6\") pod \"ovsdbserver-nb-0\" (UID: \"9b32c19b-2b8b-4587-9327-1ddf5b074ad6\") " pod="openstack/ovsdbserver-nb-0" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.929860 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tpf5\" (UniqueName: \"kubernetes.io/projected/f814768e-2961-4d2a-ba3b-615dea717cf8-kube-api-access-9tpf5\") pod \"ovsdbserver-nb-1\" (UID: \"f814768e-2961-4d2a-ba3b-615dea717cf8\") " pod="openstack/ovsdbserver-nb-1" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.929883 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-1bd16f90-4266-4719-9291-209b6cf6174c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1bd16f90-4266-4719-9291-209b6cf6174c\") pod \"ovsdbserver-nb-0\" (UID: \"9b32c19b-2b8b-4587-9327-1ddf5b074ad6\") " pod="openstack/ovsdbserver-nb-0" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.931430 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/188a11e4-50de-4672-baaf-89a3a512cd0c-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"188a11e4-50de-4672-baaf-89a3a512cd0c\") " pod="openstack/ovsdbserver-nb-2" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.931830 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9b32c19b-2b8b-4587-9327-1ddf5b074ad6-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"9b32c19b-2b8b-4587-9327-1ddf5b074ad6\") " pod="openstack/ovsdbserver-nb-0" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.932140 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f814768e-2961-4d2a-ba3b-615dea717cf8-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"f814768e-2961-4d2a-ba3b-615dea717cf8\") " pod="openstack/ovsdbserver-nb-1" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.934272 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f814768e-2961-4d2a-ba3b-615dea717cf8-config\") pod \"ovsdbserver-nb-1\" (UID: \"f814768e-2961-4d2a-ba3b-615dea717cf8\") " pod="openstack/ovsdbserver-nb-1" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.934351 4795 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.934427 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-08539c3c-3934-44e9-9aad-66b43491c570\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-08539c3c-3934-44e9-9aad-66b43491c570\") pod \"ovsdbserver-nb-2\" (UID: \"188a11e4-50de-4672-baaf-89a3a512cd0c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/c5557edd1aaf2a0c264069831ad9260b3ced248ed61a9854588c21aa3523c18c/globalmount\"" pod="openstack/ovsdbserver-nb-2" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.935461 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/188a11e4-50de-4672-baaf-89a3a512cd0c-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"188a11e4-50de-4672-baaf-89a3a512cd0c\") " pod="openstack/ovsdbserver-nb-2" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.934748 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/188a11e4-50de-4672-baaf-89a3a512cd0c-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"188a11e4-50de-4672-baaf-89a3a512cd0c\") " pod="openstack/ovsdbserver-nb-2" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.934530 4795 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.935247 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/188a11e4-50de-4672-baaf-89a3a512cd0c-config\") pod \"ovsdbserver-nb-2\" (UID: \"188a11e4-50de-4672-baaf-89a3a512cd0c\") " pod="openstack/ovsdbserver-nb-2" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.935544 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-1bd16f90-4266-4719-9291-209b6cf6174c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1bd16f90-4266-4719-9291-209b6cf6174c\") pod \"ovsdbserver-nb-0\" (UID: \"9b32c19b-2b8b-4587-9327-1ddf5b074ad6\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/27a567fbfad556db7b211a2e7847fe547895c6e57e901962a8919ac461884ee0/globalmount\"" pod="openstack/ovsdbserver-nb-0" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.935348 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b32c19b-2b8b-4587-9327-1ddf5b074ad6-config\") pod \"ovsdbserver-nb-0\" (UID: \"9b32c19b-2b8b-4587-9327-1ddf5b074ad6\") " pod="openstack/ovsdbserver-nb-0" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.935152 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9b32c19b-2b8b-4587-9327-1ddf5b074ad6-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"9b32c19b-2b8b-4587-9327-1ddf5b074ad6\") " pod="openstack/ovsdbserver-nb-0" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.935448 4795 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.935418 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f814768e-2961-4d2a-ba3b-615dea717cf8-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"f814768e-2961-4d2a-ba3b-615dea717cf8\") " pod="openstack/ovsdbserver-nb-1" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.935620 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-b6d12d84-b782-4a08-b2af-133e42b5db7b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b6d12d84-b782-4a08-b2af-133e42b5db7b\") pod \"ovsdbserver-nb-1\" (UID: \"f814768e-2961-4d2a-ba3b-615dea717cf8\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/db1dd82f0e1b9f62d7344caa6e7654af05c11eaf0ccaf04f246ecd80c7b35186/globalmount\"" pod="openstack/ovsdbserver-nb-1" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.939750 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.940641 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f814768e-2961-4d2a-ba3b-615dea717cf8-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"f814768e-2961-4d2a-ba3b-615dea717cf8\") " pod="openstack/ovsdbserver-nb-1" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.948434 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tpf5\" (UniqueName: \"kubernetes.io/projected/f814768e-2961-4d2a-ba3b-615dea717cf8-kube-api-access-9tpf5\") pod \"ovsdbserver-nb-1\" (UID: \"f814768e-2961-4d2a-ba3b-615dea717cf8\") " pod="openstack/ovsdbserver-nb-1" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.950759 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-1" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.954264 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b32c19b-2b8b-4587-9327-1ddf5b074ad6-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"9b32c19b-2b8b-4587-9327-1ddf5b074ad6\") " pod="openstack/ovsdbserver-nb-0" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.955531 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7lh6\" (UniqueName: \"kubernetes.io/projected/9b32c19b-2b8b-4587-9327-1ddf5b074ad6-kube-api-access-z7lh6\") pod \"ovsdbserver-nb-0\" (UID: \"9b32c19b-2b8b-4587-9327-1ddf5b074ad6\") " pod="openstack/ovsdbserver-nb-0" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.964517 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-b6d12d84-b782-4a08-b2af-133e42b5db7b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b6d12d84-b782-4a08-b2af-133e42b5db7b\") pod \"ovsdbserver-nb-1\" (UID: \"f814768e-2961-4d2a-ba3b-615dea717cf8\") " pod="openstack/ovsdbserver-nb-1" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.967580 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vpjf\" (UniqueName: \"kubernetes.io/projected/188a11e4-50de-4672-baaf-89a3a512cd0c-kube-api-access-7vpjf\") pod \"ovsdbserver-nb-2\" (UID: \"188a11e4-50de-4672-baaf-89a3a512cd0c\") " pod="openstack/ovsdbserver-nb-2" Feb 19 22:51:54 crc kubenswrapper[4795]: I0219 22:51:54.976254 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-08539c3c-3934-44e9-9aad-66b43491c570\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-08539c3c-3934-44e9-9aad-66b43491c570\") pod \"ovsdbserver-nb-2\" (UID: \"188a11e4-50de-4672-baaf-89a3a512cd0c\") " pod="openstack/ovsdbserver-nb-2" Feb 19 22:51:54 crc 
kubenswrapper[4795]: I0219 22:51:54.985770 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-1bd16f90-4266-4719-9291-209b6cf6174c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1bd16f90-4266-4719-9291-209b6cf6174c\") pod \"ovsdbserver-nb-0\" (UID: \"9b32c19b-2b8b-4587-9327-1ddf5b074ad6\") " pod="openstack/ovsdbserver-nb-0"
Feb 19 22:51:55 crc kubenswrapper[4795]: I0219 22:51:55.200675 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Feb 19 22:51:55 crc kubenswrapper[4795]: I0219 22:51:55.240085 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1"
Feb 19 22:51:55 crc kubenswrapper[4795]: I0219 22:51:55.253907 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2"
Feb 19 22:51:55 crc kubenswrapper[4795]: I0219 22:51:55.415725 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Feb 19 22:51:55 crc kubenswrapper[4795]: W0219 22:51:55.519633 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c8cf9f5_7499_4c52_9710_91b96d49b0fc.slice/crio-1ac92e99539ac69cecf201912fafb10d05998d6510947a4be6459d01671fc203 WatchSource:0}: Error finding container 1ac92e99539ac69cecf201912fafb10d05998d6510947a4be6459d01671fc203: Status 404 returned error can't find the container with id 1ac92e99539ac69cecf201912fafb10d05998d6510947a4be6459d01671fc203
Feb 19 22:51:55 crc kubenswrapper[4795]: I0219 22:51:55.525510 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"]
Feb 19 22:51:55 crc kubenswrapper[4795]: I0219 22:51:55.674652 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"924d2a8a-2ae7-417a-9770-054662474286","Type":"ContainerStarted","Data":"49c0a55fa410598d38997cfac435ed117fc4d698fba1d797eddfd1fe9b55a871"}
Feb 19 22:51:55 crc kubenswrapper[4795]: I0219 22:51:55.674690 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"924d2a8a-2ae7-417a-9770-054662474286","Type":"ContainerStarted","Data":"32337d3621741d173ad77da972c37a7e044d3759fd535ec343e10bfb072597aa"}
Feb 19 22:51:55 crc kubenswrapper[4795]: I0219 22:51:55.676893 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"9c8cf9f5-7499-4c52-9710-91b96d49b0fc","Type":"ContainerStarted","Data":"652d5844931377f852635eef72cbbec67148a46211ccc470935614b10dad6194"}
Feb 19 22:51:55 crc kubenswrapper[4795]: I0219 22:51:55.676916 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"9c8cf9f5-7499-4c52-9710-91b96d49b0fc","Type":"ContainerStarted","Data":"1ac92e99539ac69cecf201912fafb10d05998d6510947a4be6459d01671fc203"}
Feb 19 22:51:55 crc kubenswrapper[4795]: I0219 22:51:55.703626 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Feb 19 22:51:55 crc kubenswrapper[4795]: W0219 22:51:55.718069 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b32c19b_2b8b_4587_9327_1ddf5b074ad6.slice/crio-e7606ae6f09fbf2b028db75b20c79529c331868bb30de53bddd69aca62425a7a WatchSource:0}: Error finding container e7606ae6f09fbf2b028db75b20c79529c331868bb30de53bddd69aca62425a7a: Status 404 returned error can't find the container with id e7606ae6f09fbf2b028db75b20c79529c331868bb30de53bddd69aca62425a7a
Feb 19 22:51:55 crc kubenswrapper[4795]: I0219 22:51:55.817152 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"]
Feb 19 22:51:55 crc kubenswrapper[4795]: W0219 22:51:55.821655 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf814768e_2961_4d2a_ba3b_615dea717cf8.slice/crio-6c53eb8c45bad827e2360bebeef78494ad130dc2748fdc0944c22d4d4a99ae9a WatchSource:0}: Error finding container 6c53eb8c45bad827e2360bebeef78494ad130dc2748fdc0944c22d4d4a99ae9a: Status 404 returned error can't find the container with id 6c53eb8c45bad827e2360bebeef78494ad130dc2748fdc0944c22d4d4a99ae9a
Feb 19 22:51:56 crc kubenswrapper[4795]: I0219 22:51:56.114624 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"]
Feb 19 22:51:56 crc kubenswrapper[4795]: I0219 22:51:56.683989 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"f814768e-2961-4d2a-ba3b-615dea717cf8","Type":"ContainerStarted","Data":"7155fc037f2c62975ea85f65ed5cdd4da97d391e6238a7588c35c446cd943e68"}
Feb 19 22:51:56 crc kubenswrapper[4795]: I0219 22:51:56.685239 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"f814768e-2961-4d2a-ba3b-615dea717cf8","Type":"ContainerStarted","Data":"c3294df24bbdd5efeecb8ed96f4b3ec477f55238605ce6a7ba26f396bcf9316b"}
Feb 19 22:51:56 crc kubenswrapper[4795]: I0219 22:51:56.685266 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"f814768e-2961-4d2a-ba3b-615dea717cf8","Type":"ContainerStarted","Data":"6c53eb8c45bad827e2360bebeef78494ad130dc2748fdc0944c22d4d4a99ae9a"}
Feb 19 22:51:56 crc kubenswrapper[4795]: I0219 22:51:56.688671 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"9b32c19b-2b8b-4587-9327-1ddf5b074ad6","Type":"ContainerStarted","Data":"faae9ef344ebf349c5656c03b60dc6ec8b3197801d99b54016f0ab282a2c1d2b"}
Feb 19 22:51:56 crc kubenswrapper[4795]: I0219 22:51:56.688697 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"9b32c19b-2b8b-4587-9327-1ddf5b074ad6","Type":"ContainerStarted","Data":"0bcc6a4699eaa7b1e559d8fb7153887a76810c14f58965864516267036824580"}
Feb 19 22:51:56 crc kubenswrapper[4795]: I0219 22:51:56.688706 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"9b32c19b-2b8b-4587-9327-1ddf5b074ad6","Type":"ContainerStarted","Data":"e7606ae6f09fbf2b028db75b20c79529c331868bb30de53bddd69aca62425a7a"}
Feb 19 22:51:56 crc kubenswrapper[4795]: I0219 22:51:56.690083 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"9c8cf9f5-7499-4c52-9710-91b96d49b0fc","Type":"ContainerStarted","Data":"4130d4c8f1430c2f2c5659f4afff52f3b06db37d844f72cd7d8ec089d6bd8c10"}
Feb 19 22:51:56 crc kubenswrapper[4795]: I0219 22:51:56.692009 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"9c5bddd7-705d-41b3-ad43-1889c6c34ab0","Type":"ContainerStarted","Data":"b3658c2e1080f8ddd0daa19ba508bd8f8781a70157518d68b05c21f0e2ecafc5"}
Feb 19 22:51:56 crc kubenswrapper[4795]: I0219 22:51:56.692032 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"9c5bddd7-705d-41b3-ad43-1889c6c34ab0","Type":"ContainerStarted","Data":"74bba6c69905c7ad27b9f920647247cdab0473ebb5cb41441b882379756a3caf"}
Feb 19 22:51:56 crc kubenswrapper[4795]: I0219 22:51:56.692042 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"9c5bddd7-705d-41b3-ad43-1889c6c34ab0","Type":"ContainerStarted","Data":"776f674a5b7e650c62f1c4ec9ad34086b4a66fb8c7326722c072767e6b74bcf1"}
Feb 19 22:51:56 crc kubenswrapper[4795]: I0219 22:51:56.693250 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"924d2a8a-2ae7-417a-9770-054662474286","Type":"ContainerStarted","Data":"26bd76b1fc96d5f7750289149cb37d6cec20052065f5959d04c801e28f6bb382"}
Feb 19 22:51:56 crc kubenswrapper[4795]: I0219 22:51:56.732669 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=3.732654952 podStartE2EDuration="3.732654952s" podCreationTimestamp="2026-02-19 22:51:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:51:56.731804818 +0000 UTC m=+5027.924322682" watchObservedRunningTime="2026-02-19 22:51:56.732654952 +0000 UTC m=+5027.925172816"
Feb 19 22:51:56 crc kubenswrapper[4795]: I0219 22:51:56.735867 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-1" podStartSLOduration=3.735859104 podStartE2EDuration="3.735859104s" podCreationTimestamp="2026-02-19 22:51:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:51:56.708932102 +0000 UTC m=+5027.901449976" watchObservedRunningTime="2026-02-19 22:51:56.735859104 +0000 UTC m=+5027.928376968"
Feb 19 22:51:56 crc kubenswrapper[4795]: I0219 22:51:56.761030 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-2" podStartSLOduration=3.7610094849999998 podStartE2EDuration="3.761009485s" podCreationTimestamp="2026-02-19 22:51:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:51:56.756807194 +0000 UTC m=+5027.949325068" watchObservedRunningTime="2026-02-19 22:51:56.761009485 +0000 UTC m=+5027.953527349"
Feb 19 22:51:56 crc kubenswrapper[4795]: I0219 22:51:56.789510 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=3.789486051 podStartE2EDuration="3.789486051s" podCreationTimestamp="2026-02-19 22:51:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:51:56.787723191 +0000 UTC m=+5027.980241055" watchObservedRunningTime="2026-02-19 22:51:56.789486051 +0000 UTC m=+5027.982003945"
Feb 19 22:51:56 crc kubenswrapper[4795]: I0219 22:51:56.815220 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"]
Feb 19 22:51:56 crc kubenswrapper[4795]: W0219 22:51:56.821836 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod188a11e4_50de_4672_baaf_89a3a512cd0c.slice/crio-6174c539b4290276b5a7764ab336641ae36237e6572ffb7135237fb585aa9422 WatchSource:0}: Error finding container 6174c539b4290276b5a7764ab336641ae36237e6572ffb7135237fb585aa9422: Status 404 returned error can't find the container with id 6174c539b4290276b5a7764ab336641ae36237e6572ffb7135237fb585aa9422
Feb 19 22:51:57 crc kubenswrapper[4795]: I0219 22:51:57.706427 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"188a11e4-50de-4672-baaf-89a3a512cd0c","Type":"ContainerStarted","Data":"16143a7141f0493d70240cd1893314445bd55ba212e2b5433b67e9bfe0a968f8"}
Feb 19 22:51:57 crc kubenswrapper[4795]: I0219 22:51:57.706747 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"188a11e4-50de-4672-baaf-89a3a512cd0c","Type":"ContainerStarted","Data":"5ed80e8f09df17ce3b9626981e388bc78869cdc68ac78e8ab98739f8160677c0"}
Feb 19 22:51:57 crc kubenswrapper[4795]: I0219 22:51:57.706758 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"188a11e4-50de-4672-baaf-89a3a512cd0c","Type":"ContainerStarted","Data":"6174c539b4290276b5a7764ab336641ae36237e6572ffb7135237fb585aa9422"}
Feb 19 22:51:57 crc kubenswrapper[4795]: I0219 22:51:57.729327 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-1" podStartSLOduration=4.72930162 podStartE2EDuration="4.72930162s" podCreationTimestamp="2026-02-19 22:51:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:51:56.820985484 +0000 UTC m=+5028.013503358" watchObservedRunningTime="2026-02-19 22:51:57.72930162 +0000 UTC m=+5028.921819524"
Feb 19 22:51:57 crc kubenswrapper[4795]: I0219 22:51:57.736067 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-2" podStartSLOduration=4.736050314 podStartE2EDuration="4.736050314s" podCreationTimestamp="2026-02-19 22:51:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:51:57.7272138 +0000 UTC m=+5028.919731704" watchObservedRunningTime="2026-02-19 22:51:57.736050314 +0000 UTC m=+5028.928568218"
Feb 19 22:51:57 crc kubenswrapper[4795]: I0219 22:51:57.879147 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0"
Feb 19 22:51:57 crc kubenswrapper[4795]: I0219 22:51:57.941057 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-2"
Feb 19 22:51:57 crc kubenswrapper[4795]: I0219 22:51:57.951371 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-1"
Feb 19 22:51:58 crc kubenswrapper[4795]: I0219 22:51:58.201972 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0"
Feb 19 22:51:58 crc kubenswrapper[4795]: I0219 22:51:58.240427 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-1"
Feb 19 22:51:58 crc kubenswrapper[4795]: I0219 22:51:58.254417 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-2"
Feb 19 22:51:58 crc kubenswrapper[4795]: I0219 22:51:58.270885 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0"
Feb 19 22:51:58 crc kubenswrapper[4795]: I0219 22:51:58.315603 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-1"
Feb 19 22:51:58 crc kubenswrapper[4795]: I0219 22:51:58.716383 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-1"
Feb 19 22:51:58 crc kubenswrapper[4795]: I0219 22:51:58.716922 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0"
Feb 19 22:51:59 crc kubenswrapper[4795]: I0219 22:51:59.879262 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0"
Feb 19 22:51:59 crc kubenswrapper[4795]: I0219 22:51:59.940778 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-2"
Feb 19 22:51:59 crc kubenswrapper[4795]: I0219 22:51:59.951472 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-1"
Feb 19 22:52:00 crc kubenswrapper[4795]: I0219 22:52:00.254020 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-2"
Feb 19 22:52:00 crc kubenswrapper[4795]: I0219 22:52:00.259985 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0"
Feb 19 22:52:00 crc kubenswrapper[4795]: I0219 22:52:00.280735 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-1"
Feb 19 22:52:00 crc kubenswrapper[4795]: I0219 22:52:00.457682 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56f9dfcfb9-bslgg"]
Feb 19 22:52:00 crc kubenswrapper[4795]: I0219 22:52:00.459027 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56f9dfcfb9-bslgg"
Feb 19 22:52:00 crc kubenswrapper[4795]: I0219 22:52:00.462377 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb"
Feb 19 22:52:00 crc kubenswrapper[4795]: I0219 22:52:00.483858 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56f9dfcfb9-bslgg"]
Feb 19 22:52:00 crc kubenswrapper[4795]: I0219 22:52:00.511937 4795 scope.go:117] "RemoveContainer" containerID="4bde20d829d39ae7600d60632441672a1a7026beecea3debeab953cdf5de428a"
Feb 19 22:52:00 crc kubenswrapper[4795]: E0219 22:52:00.512183 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 22:52:00 crc kubenswrapper[4795]: I0219 22:52:00.532347 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71730cb0-0d62-496c-b20a-590bc258489b-config\") pod \"dnsmasq-dns-56f9dfcfb9-bslgg\" (UID: \"71730cb0-0d62-496c-b20a-590bc258489b\") " pod="openstack/dnsmasq-dns-56f9dfcfb9-bslgg"
Feb 19 22:52:00 crc kubenswrapper[4795]: I0219 22:52:00.532398 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/71730cb0-0d62-496c-b20a-590bc258489b-ovsdbserver-nb\") pod \"dnsmasq-dns-56f9dfcfb9-bslgg\" (UID: \"71730cb0-0d62-496c-b20a-590bc258489b\") " pod="openstack/dnsmasq-dns-56f9dfcfb9-bslgg"
Feb 19 22:52:00 crc kubenswrapper[4795]: I0219 22:52:00.532525 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqqtg\" (UniqueName: \"kubernetes.io/projected/71730cb0-0d62-496c-b20a-590bc258489b-kube-api-access-pqqtg\") pod \"dnsmasq-dns-56f9dfcfb9-bslgg\" (UID: \"71730cb0-0d62-496c-b20a-590bc258489b\") " pod="openstack/dnsmasq-dns-56f9dfcfb9-bslgg"
Feb 19 22:52:00 crc kubenswrapper[4795]: I0219 22:52:00.532677 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/71730cb0-0d62-496c-b20a-590bc258489b-dns-svc\") pod \"dnsmasq-dns-56f9dfcfb9-bslgg\" (UID: \"71730cb0-0d62-496c-b20a-590bc258489b\") " pod="openstack/dnsmasq-dns-56f9dfcfb9-bslgg"
Feb 19 22:52:00 crc kubenswrapper[4795]: I0219 22:52:00.633778 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71730cb0-0d62-496c-b20a-590bc258489b-config\") pod \"dnsmasq-dns-56f9dfcfb9-bslgg\" (UID: \"71730cb0-0d62-496c-b20a-590bc258489b\") " pod="openstack/dnsmasq-dns-56f9dfcfb9-bslgg"
Feb 19 22:52:00 crc kubenswrapper[4795]: I0219 22:52:00.633827 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/71730cb0-0d62-496c-b20a-590bc258489b-ovsdbserver-nb\") pod \"dnsmasq-dns-56f9dfcfb9-bslgg\" (UID: \"71730cb0-0d62-496c-b20a-590bc258489b\") " pod="openstack/dnsmasq-dns-56f9dfcfb9-bslgg"
Feb 19 22:52:00 crc kubenswrapper[4795]: I0219 22:52:00.633860 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqqtg\" (UniqueName: \"kubernetes.io/projected/71730cb0-0d62-496c-b20a-590bc258489b-kube-api-access-pqqtg\") pod \"dnsmasq-dns-56f9dfcfb9-bslgg\" (UID: \"71730cb0-0d62-496c-b20a-590bc258489b\") " pod="openstack/dnsmasq-dns-56f9dfcfb9-bslgg"
Feb 19 22:52:00 crc kubenswrapper[4795]: I0219 22:52:00.633946 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/71730cb0-0d62-496c-b20a-590bc258489b-dns-svc\") pod \"dnsmasq-dns-56f9dfcfb9-bslgg\" (UID: \"71730cb0-0d62-496c-b20a-590bc258489b\") " pod="openstack/dnsmasq-dns-56f9dfcfb9-bslgg"
Feb 19 22:52:00 crc kubenswrapper[4795]: I0219 22:52:00.634773 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/71730cb0-0d62-496c-b20a-590bc258489b-ovsdbserver-nb\") pod \"dnsmasq-dns-56f9dfcfb9-bslgg\" (UID: \"71730cb0-0d62-496c-b20a-590bc258489b\") " pod="openstack/dnsmasq-dns-56f9dfcfb9-bslgg"
Feb 19 22:52:00 crc kubenswrapper[4795]: I0219 22:52:00.634794 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71730cb0-0d62-496c-b20a-590bc258489b-config\") pod \"dnsmasq-dns-56f9dfcfb9-bslgg\" (UID: \"71730cb0-0d62-496c-b20a-590bc258489b\") " pod="openstack/dnsmasq-dns-56f9dfcfb9-bslgg"
Feb 19 22:52:00 crc kubenswrapper[4795]: I0219 22:52:00.635092 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/71730cb0-0d62-496c-b20a-590bc258489b-dns-svc\") pod \"dnsmasq-dns-56f9dfcfb9-bslgg\" (UID: \"71730cb0-0d62-496c-b20a-590bc258489b\") " pod="openstack/dnsmasq-dns-56f9dfcfb9-bslgg"
Feb 19 22:52:00 crc kubenswrapper[4795]: I0219 22:52:00.653596 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqqtg\" (UniqueName: \"kubernetes.io/projected/71730cb0-0d62-496c-b20a-590bc258489b-kube-api-access-pqqtg\") pod \"dnsmasq-dns-56f9dfcfb9-bslgg\" (UID: \"71730cb0-0d62-496c-b20a-590bc258489b\") " pod="openstack/dnsmasq-dns-56f9dfcfb9-bslgg"
Feb 19 22:52:00 crc kubenswrapper[4795]: I0219 22:52:00.790881 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56f9dfcfb9-bslgg"
Feb 19 22:52:00 crc kubenswrapper[4795]: I0219 22:52:00.924226 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0"
Feb 19 22:52:00 crc kubenswrapper[4795]: I0219 22:52:00.969357 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0"
Feb 19 22:52:01 crc kubenswrapper[4795]: I0219 22:52:01.009412 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-2"
Feb 19 22:52:01 crc kubenswrapper[4795]: I0219 22:52:01.009759 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-1"
Feb 19 22:52:01 crc kubenswrapper[4795]: I0219 22:52:01.063690 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-2"
Feb 19 22:52:01 crc kubenswrapper[4795]: I0219 22:52:01.082468 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-1"
Feb 19 22:52:01 crc kubenswrapper[4795]: I0219 22:52:01.136551 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56f9dfcfb9-bslgg"]
Feb 19 22:52:01 crc kubenswrapper[4795]: I0219 22:52:01.168653 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6dc9b786df-rmprl"]
Feb 19 22:52:01 crc kubenswrapper[4795]: I0219 22:52:01.170006 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6dc9b786df-rmprl"
Feb 19 22:52:01 crc kubenswrapper[4795]: I0219 22:52:01.175998 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb"
Feb 19 22:52:01 crc kubenswrapper[4795]: I0219 22:52:01.191089 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6dc9b786df-rmprl"]
Feb 19 22:52:01 crc kubenswrapper[4795]: I0219 22:52:01.230885 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56f9dfcfb9-bslgg"]
Feb 19 22:52:01 crc kubenswrapper[4795]: W0219 22:52:01.236505 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod71730cb0_0d62_496c_b20a_590bc258489b.slice/crio-514fd9cf80faa56b01c5fba3e6443cc446339d2131fc32a8758f5440873bb6c7 WatchSource:0}: Error finding container 514fd9cf80faa56b01c5fba3e6443cc446339d2131fc32a8758f5440873bb6c7: Status 404 returned error can't find the container with id 514fd9cf80faa56b01c5fba3e6443cc446339d2131fc32a8758f5440873bb6c7
Feb 19 22:52:01 crc kubenswrapper[4795]: I0219 22:52:01.250219 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee8a52c2-f6ad-4b2e-a092-9393dac0f15a-ovsdbserver-nb\") pod \"dnsmasq-dns-6dc9b786df-rmprl\" (UID: \"ee8a52c2-f6ad-4b2e-a092-9393dac0f15a\") " pod="openstack/dnsmasq-dns-6dc9b786df-rmprl"
Feb 19 22:52:01 crc kubenswrapper[4795]: I0219 22:52:01.250409 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee8a52c2-f6ad-4b2e-a092-9393dac0f15a-dns-svc\") pod \"dnsmasq-dns-6dc9b786df-rmprl\" (UID: \"ee8a52c2-f6ad-4b2e-a092-9393dac0f15a\") " pod="openstack/dnsmasq-dns-6dc9b786df-rmprl"
Feb 19 22:52:01 crc kubenswrapper[4795]: I0219 22:52:01.250482 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee8a52c2-f6ad-4b2e-a092-9393dac0f15a-config\") pod \"dnsmasq-dns-6dc9b786df-rmprl\" (UID: \"ee8a52c2-f6ad-4b2e-a092-9393dac0f15a\") " pod="openstack/dnsmasq-dns-6dc9b786df-rmprl"
Feb 19 22:52:01 crc kubenswrapper[4795]: I0219 22:52:01.250561 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee8a52c2-f6ad-4b2e-a092-9393dac0f15a-ovsdbserver-sb\") pod \"dnsmasq-dns-6dc9b786df-rmprl\" (UID: \"ee8a52c2-f6ad-4b2e-a092-9393dac0f15a\") " pod="openstack/dnsmasq-dns-6dc9b786df-rmprl"
Feb 19 22:52:01 crc kubenswrapper[4795]: I0219 22:52:01.250678 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrnhl\" (UniqueName: \"kubernetes.io/projected/ee8a52c2-f6ad-4b2e-a092-9393dac0f15a-kube-api-access-rrnhl\") pod \"dnsmasq-dns-6dc9b786df-rmprl\" (UID: \"ee8a52c2-f6ad-4b2e-a092-9393dac0f15a\") " pod="openstack/dnsmasq-dns-6dc9b786df-rmprl"
Feb 19 22:52:01 crc kubenswrapper[4795]: I0219 22:52:01.296628 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-2"
Feb 19 22:52:01 crc kubenswrapper[4795]: I0219 22:52:01.352306 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrnhl\" (UniqueName: \"kubernetes.io/projected/ee8a52c2-f6ad-4b2e-a092-9393dac0f15a-kube-api-access-rrnhl\") pod \"dnsmasq-dns-6dc9b786df-rmprl\" (UID: \"ee8a52c2-f6ad-4b2e-a092-9393dac0f15a\") " pod="openstack/dnsmasq-dns-6dc9b786df-rmprl"
Feb 19 22:52:01 crc kubenswrapper[4795]: I0219 22:52:01.352408 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee8a52c2-f6ad-4b2e-a092-9393dac0f15a-ovsdbserver-nb\") pod \"dnsmasq-dns-6dc9b786df-rmprl\" (UID: \"ee8a52c2-f6ad-4b2e-a092-9393dac0f15a\") " pod="openstack/dnsmasq-dns-6dc9b786df-rmprl"
Feb 19 22:52:01 crc kubenswrapper[4795]: I0219 22:52:01.352439 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee8a52c2-f6ad-4b2e-a092-9393dac0f15a-dns-svc\") pod \"dnsmasq-dns-6dc9b786df-rmprl\" (UID: \"ee8a52c2-f6ad-4b2e-a092-9393dac0f15a\") " pod="openstack/dnsmasq-dns-6dc9b786df-rmprl"
Feb 19 22:52:01 crc kubenswrapper[4795]: I0219 22:52:01.352460 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee8a52c2-f6ad-4b2e-a092-9393dac0f15a-config\") pod \"dnsmasq-dns-6dc9b786df-rmprl\" (UID: \"ee8a52c2-f6ad-4b2e-a092-9393dac0f15a\") " pod="openstack/dnsmasq-dns-6dc9b786df-rmprl"
Feb 19 22:52:01 crc kubenswrapper[4795]: I0219 22:52:01.352481 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee8a52c2-f6ad-4b2e-a092-9393dac0f15a-ovsdbserver-sb\") pod \"dnsmasq-dns-6dc9b786df-rmprl\" (UID: \"ee8a52c2-f6ad-4b2e-a092-9393dac0f15a\") " pod="openstack/dnsmasq-dns-6dc9b786df-rmprl"
Feb 19 22:52:01 crc kubenswrapper[4795]: I0219 22:52:01.353211 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee8a52c2-f6ad-4b2e-a092-9393dac0f15a-ovsdbserver-sb\") pod \"dnsmasq-dns-6dc9b786df-rmprl\" (UID: \"ee8a52c2-f6ad-4b2e-a092-9393dac0f15a\") " pod="openstack/dnsmasq-dns-6dc9b786df-rmprl"
Feb 19 22:52:01 crc kubenswrapper[4795]: I0219 22:52:01.353821 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee8a52c2-f6ad-4b2e-a092-9393dac0f15a-ovsdbserver-nb\") pod \"dnsmasq-dns-6dc9b786df-rmprl\" (UID: \"ee8a52c2-f6ad-4b2e-a092-9393dac0f15a\") " pod="openstack/dnsmasq-dns-6dc9b786df-rmprl"
Feb 19 22:52:01 crc kubenswrapper[4795]: I0219 22:52:01.353833 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee8a52c2-f6ad-4b2e-a092-9393dac0f15a-dns-svc\") pod \"dnsmasq-dns-6dc9b786df-rmprl\" (UID: \"ee8a52c2-f6ad-4b2e-a092-9393dac0f15a\") " pod="openstack/dnsmasq-dns-6dc9b786df-rmprl"
Feb 19 22:52:01 crc kubenswrapper[4795]: I0219 22:52:01.353995 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee8a52c2-f6ad-4b2e-a092-9393dac0f15a-config\") pod \"dnsmasq-dns-6dc9b786df-rmprl\" (UID: \"ee8a52c2-f6ad-4b2e-a092-9393dac0f15a\") " pod="openstack/dnsmasq-dns-6dc9b786df-rmprl"
Feb 19 22:52:01 crc kubenswrapper[4795]: I0219 22:52:01.370950 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrnhl\" (UniqueName: \"kubernetes.io/projected/ee8a52c2-f6ad-4b2e-a092-9393dac0f15a-kube-api-access-rrnhl\") pod \"dnsmasq-dns-6dc9b786df-rmprl\" (UID: \"ee8a52c2-f6ad-4b2e-a092-9393dac0f15a\") " pod="openstack/dnsmasq-dns-6dc9b786df-rmprl"
Feb 19 22:52:01 crc kubenswrapper[4795]: I0219 22:52:01.495079 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6dc9b786df-rmprl"
Feb 19 22:52:01 crc kubenswrapper[4795]: I0219 22:52:01.733662 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6dc9b786df-rmprl"]
Feb 19 22:52:01 crc kubenswrapper[4795]: W0219 22:52:01.741077 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podee8a52c2_f6ad_4b2e_a092_9393dac0f15a.slice/crio-759043739bccdb4e9d8810a99089303a4dcf977d51cfcc31ca61096dfaef3dbf WatchSource:0}: Error finding container 759043739bccdb4e9d8810a99089303a4dcf977d51cfcc31ca61096dfaef3dbf: Status 404 returned error can't find the container with id 759043739bccdb4e9d8810a99089303a4dcf977d51cfcc31ca61096dfaef3dbf
Feb 19 22:52:01 crc kubenswrapper[4795]: I0219 22:52:01.741530 4795 generic.go:334] "Generic (PLEG): container finished" podID="71730cb0-0d62-496c-b20a-590bc258489b" containerID="95a83942347a601497cb76f3b33f1eec961796f6d98935a98ad389d2a4ed0165" exitCode=0
Feb 19 22:52:01 crc kubenswrapper[4795]: I0219 22:52:01.741629 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56f9dfcfb9-bslgg" event={"ID":"71730cb0-0d62-496c-b20a-590bc258489b","Type":"ContainerDied","Data":"95a83942347a601497cb76f3b33f1eec961796f6d98935a98ad389d2a4ed0165"}
Feb 19 22:52:01 crc kubenswrapper[4795]: I0219 22:52:01.741675 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56f9dfcfb9-bslgg" event={"ID":"71730cb0-0d62-496c-b20a-590bc258489b","Type":"ContainerStarted","Data":"514fd9cf80faa56b01c5fba3e6443cc446339d2131fc32a8758f5440873bb6c7"}
Feb 19 22:52:01 crc kubenswrapper[4795]: I0219 22:52:01.817470 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-2"
Feb 19 22:52:02 crc kubenswrapper[4795]: I0219 22:52:02.103597 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56f9dfcfb9-bslgg"
Feb 19 22:52:02 crc kubenswrapper[4795]: I0219 22:52:02.177743 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pqqtg\" (UniqueName: \"kubernetes.io/projected/71730cb0-0d62-496c-b20a-590bc258489b-kube-api-access-pqqtg\") pod \"71730cb0-0d62-496c-b20a-590bc258489b\" (UID: \"71730cb0-0d62-496c-b20a-590bc258489b\") "
Feb 19 22:52:02 crc kubenswrapper[4795]: I0219 22:52:02.177810 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/71730cb0-0d62-496c-b20a-590bc258489b-dns-svc\") pod \"71730cb0-0d62-496c-b20a-590bc258489b\" (UID: \"71730cb0-0d62-496c-b20a-590bc258489b\") "
Feb 19 22:52:02 crc kubenswrapper[4795]: I0219 22:52:02.177934 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71730cb0-0d62-496c-b20a-590bc258489b-config\") pod \"71730cb0-0d62-496c-b20a-590bc258489b\" (UID: \"71730cb0-0d62-496c-b20a-590bc258489b\") "
Feb 19 22:52:02 crc kubenswrapper[4795]: I0219 22:52:02.177974 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/71730cb0-0d62-496c-b20a-590bc258489b-ovsdbserver-nb\") pod \"71730cb0-0d62-496c-b20a-590bc258489b\" (UID: \"71730cb0-0d62-496c-b20a-590bc258489b\") "
Feb 19 22:52:02 crc kubenswrapper[4795]: I0219 22:52:02.182268 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71730cb0-0d62-496c-b20a-590bc258489b-kube-api-access-pqqtg" (OuterVolumeSpecName: "kube-api-access-pqqtg") pod "71730cb0-0d62-496c-b20a-590bc258489b" (UID: "71730cb0-0d62-496c-b20a-590bc258489b"). InnerVolumeSpecName "kube-api-access-pqqtg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 22:52:02 crc kubenswrapper[4795]: I0219 22:52:02.195623 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71730cb0-0d62-496c-b20a-590bc258489b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "71730cb0-0d62-496c-b20a-590bc258489b" (UID: "71730cb0-0d62-496c-b20a-590bc258489b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 22:52:02 crc kubenswrapper[4795]: I0219 22:52:02.197351 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71730cb0-0d62-496c-b20a-590bc258489b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "71730cb0-0d62-496c-b20a-590bc258489b" (UID: "71730cb0-0d62-496c-b20a-590bc258489b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 22:52:02 crc kubenswrapper[4795]: I0219 22:52:02.200893 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71730cb0-0d62-496c-b20a-590bc258489b-config" (OuterVolumeSpecName: "config") pod "71730cb0-0d62-496c-b20a-590bc258489b" (UID: "71730cb0-0d62-496c-b20a-590bc258489b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 22:52:02 crc kubenswrapper[4795]: I0219 22:52:02.280078 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pqqtg\" (UniqueName: \"kubernetes.io/projected/71730cb0-0d62-496c-b20a-590bc258489b-kube-api-access-pqqtg\") on node \"crc\" DevicePath \"\""
Feb 19 22:52:02 crc kubenswrapper[4795]: I0219 22:52:02.283915 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/71730cb0-0d62-496c-b20a-590bc258489b-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 19 22:52:02 crc kubenswrapper[4795]: I0219 22:52:02.283975 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71730cb0-0d62-496c-b20a-590bc258489b-config\") on node \"crc\" DevicePath \"\""
Feb 19 22:52:02 crc kubenswrapper[4795]: I0219 22:52:02.283990 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/71730cb0-0d62-496c-b20a-590bc258489b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 19 22:52:02 crc kubenswrapper[4795]: I0219 22:52:02.751582 4795 generic.go:334] "Generic (PLEG): container finished" podID="ee8a52c2-f6ad-4b2e-a092-9393dac0f15a" containerID="02b91837f0ddc7d9ab267bf7d71d729c37ae5422cf7fed3035a9ff30fcca558e" exitCode=0
Feb 19 22:52:02 crc kubenswrapper[4795]: I0219 22:52:02.751668 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dc9b786df-rmprl" event={"ID":"ee8a52c2-f6ad-4b2e-a092-9393dac0f15a","Type":"ContainerDied","Data":"02b91837f0ddc7d9ab267bf7d71d729c37ae5422cf7fed3035a9ff30fcca558e"}
Feb 19 22:52:02 crc kubenswrapper[4795]: I0219 22:52:02.752016 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dc9b786df-rmprl" event={"ID":"ee8a52c2-f6ad-4b2e-a092-9393dac0f15a","Type":"ContainerStarted","Data":"759043739bccdb4e9d8810a99089303a4dcf977d51cfcc31ca61096dfaef3dbf"}
Feb 19 22:52:02 crc kubenswrapper[4795]: I0219 22:52:02.753818 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56f9dfcfb9-bslgg"
Feb 19 22:52:02 crc kubenswrapper[4795]: I0219 22:52:02.753950 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56f9dfcfb9-bslgg" event={"ID":"71730cb0-0d62-496c-b20a-590bc258489b","Type":"ContainerDied","Data":"514fd9cf80faa56b01c5fba3e6443cc446339d2131fc32a8758f5440873bb6c7"}
Feb 19 22:52:02 crc kubenswrapper[4795]: I0219 22:52:02.754731 4795 scope.go:117] "RemoveContainer" containerID="95a83942347a601497cb76f3b33f1eec961796f6d98935a98ad389d2a4ed0165"
Feb 19 22:52:02 crc kubenswrapper[4795]: I0219 22:52:02.992422 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56f9dfcfb9-bslgg"]
Feb 19 22:52:03 crc kubenswrapper[4795]: I0219 22:52:03.003577 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-56f9dfcfb9-bslgg"]
Feb 19 22:52:03 crc kubenswrapper[4795]: I0219 22:52:03.525453 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71730cb0-0d62-496c-b20a-590bc258489b" path="/var/lib/kubelet/pods/71730cb0-0d62-496c-b20a-590bc258489b/volumes"
Feb 19 22:52:03 crc kubenswrapper[4795]: I0219 22:52:03.769948 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dc9b786df-rmprl" event={"ID":"ee8a52c2-f6ad-4b2e-a092-9393dac0f15a","Type":"ContainerStarted","Data":"b0e5926c1df9cf0e08b6183cea78e6fd21fb1a7f2265743460b8010658ff82c0"}
Feb 19 22:52:03 crc kubenswrapper[4795]: I0219 22:52:03.770638 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6dc9b786df-rmprl"
Feb 19 22:52:03 crc kubenswrapper[4795]: I0219 22:52:03.801772 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6dc9b786df-rmprl" podStartSLOduration=2.801745679 podStartE2EDuration="2.801745679s" podCreationTimestamp="2026-02-19 22:52:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:52:03.796790037 +0000 UTC m=+5034.989307921" watchObservedRunningTime="2026-02-19 22:52:03.801745679 +0000 UTC m=+5034.994263553"
Feb 19 22:52:04 crc kubenswrapper[4795]: I0219 22:52:04.849906 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-copy-data"]
Feb 19 22:52:04 crc kubenswrapper[4795]: E0219 22:52:04.850284 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71730cb0-0d62-496c-b20a-590bc258489b" containerName="init"
Feb 19 22:52:04 crc kubenswrapper[4795]: I0219 22:52:04.850302 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="71730cb0-0d62-496c-b20a-590bc258489b" containerName="init"
Feb 19 22:52:04 crc kubenswrapper[4795]: I0219 22:52:04.850597 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="71730cb0-0d62-496c-b20a-590bc258489b" containerName="init"
Feb 19 22:52:04 crc kubenswrapper[4795]: I0219 22:52:04.851763 4795 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/ovn-copy-data" Feb 19 22:52:04 crc kubenswrapper[4795]: I0219 22:52:04.854010 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovn-data-cert" Feb 19 22:52:04 crc kubenswrapper[4795]: I0219 22:52:04.863927 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Feb 19 22:52:05 crc kubenswrapper[4795]: I0219 22:52:05.031849 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/59d77cc9-140e-4468-9023-0a973155d290-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"59d77cc9-140e-4468-9023-0a973155d290\") " pod="openstack/ovn-copy-data" Feb 19 22:52:05 crc kubenswrapper[4795]: I0219 22:52:05.032131 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcjlg\" (UniqueName: \"kubernetes.io/projected/59d77cc9-140e-4468-9023-0a973155d290-kube-api-access-jcjlg\") pod \"ovn-copy-data\" (UID: \"59d77cc9-140e-4468-9023-0a973155d290\") " pod="openstack/ovn-copy-data" Feb 19 22:52:05 crc kubenswrapper[4795]: I0219 22:52:05.032245 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-fe46f653-3b46-49f1-9da1-d67576b17f42\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fe46f653-3b46-49f1-9da1-d67576b17f42\") pod \"ovn-copy-data\" (UID: \"59d77cc9-140e-4468-9023-0a973155d290\") " pod="openstack/ovn-copy-data" Feb 19 22:52:05 crc kubenswrapper[4795]: I0219 22:52:05.133192 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcjlg\" (UniqueName: \"kubernetes.io/projected/59d77cc9-140e-4468-9023-0a973155d290-kube-api-access-jcjlg\") pod \"ovn-copy-data\" (UID: \"59d77cc9-140e-4468-9023-0a973155d290\") " pod="openstack/ovn-copy-data" Feb 19 22:52:05 crc kubenswrapper[4795]: I0219 22:52:05.133446 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-fe46f653-3b46-49f1-9da1-d67576b17f42\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fe46f653-3b46-49f1-9da1-d67576b17f42\") pod \"ovn-copy-data\" (UID: \"59d77cc9-140e-4468-9023-0a973155d290\") " pod="openstack/ovn-copy-data" Feb 19 22:52:05 crc kubenswrapper[4795]: I0219 22:52:05.133733 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/59d77cc9-140e-4468-9023-0a973155d290-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"59d77cc9-140e-4468-9023-0a973155d290\") " pod="openstack/ovn-copy-data" Feb 19 22:52:05 crc kubenswrapper[4795]: I0219 22:52:05.137480 4795 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 19 22:52:05 crc kubenswrapper[4795]: I0219 22:52:05.137546 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-fe46f653-3b46-49f1-9da1-d67576b17f42\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fe46f653-3b46-49f1-9da1-d67576b17f42\") pod \"ovn-copy-data\" (UID: \"59d77cc9-140e-4468-9023-0a973155d290\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/272a78958a7306caa675bdcba31e4abf2af9d4231e8b5f084d7a94734d563c34/globalmount\"" pod="openstack/ovn-copy-data" Feb 19 22:52:05 crc kubenswrapper[4795]: I0219 22:52:05.148628 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/59d77cc9-140e-4468-9023-0a973155d290-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"59d77cc9-140e-4468-9023-0a973155d290\") " pod="openstack/ovn-copy-data" Feb 19 22:52:05 crc kubenswrapper[4795]: I0219 22:52:05.153527 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcjlg\" (UniqueName: 
\"kubernetes.io/projected/59d77cc9-140e-4468-9023-0a973155d290-kube-api-access-jcjlg\") pod \"ovn-copy-data\" (UID: \"59d77cc9-140e-4468-9023-0a973155d290\") " pod="openstack/ovn-copy-data" Feb 19 22:52:05 crc kubenswrapper[4795]: I0219 22:52:05.178334 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-fe46f653-3b46-49f1-9da1-d67576b17f42\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fe46f653-3b46-49f1-9da1-d67576b17f42\") pod \"ovn-copy-data\" (UID: \"59d77cc9-140e-4468-9023-0a973155d290\") " pod="openstack/ovn-copy-data" Feb 19 22:52:05 crc kubenswrapper[4795]: I0219 22:52:05.476943 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-copy-data" Feb 19 22:52:05 crc kubenswrapper[4795]: I0219 22:52:05.982056 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Feb 19 22:52:05 crc kubenswrapper[4795]: W0219 22:52:05.983653 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod59d77cc9_140e_4468_9023_0a973155d290.slice/crio-d1d8d1a7f24d4b32f5d50dbc56c8c75a39927408e5337eb8542c8ba059a2e9d8 WatchSource:0}: Error finding container d1d8d1a7f24d4b32f5d50dbc56c8c75a39927408e5337eb8542c8ba059a2e9d8: Status 404 returned error can't find the container with id d1d8d1a7f24d4b32f5d50dbc56c8c75a39927408e5337eb8542c8ba059a2e9d8 Feb 19 22:52:06 crc kubenswrapper[4795]: I0219 22:52:06.796788 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"59d77cc9-140e-4468-9023-0a973155d290","Type":"ContainerStarted","Data":"a94a997e126652369a7f539bdbf820b6c97b6808304fc5ab088e5b94e32df40f"} Feb 19 22:52:06 crc kubenswrapper[4795]: I0219 22:52:06.797097 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" 
event={"ID":"59d77cc9-140e-4468-9023-0a973155d290","Type":"ContainerStarted","Data":"d1d8d1a7f24d4b32f5d50dbc56c8c75a39927408e5337eb8542c8ba059a2e9d8"} Feb 19 22:52:06 crc kubenswrapper[4795]: I0219 22:52:06.832306 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-copy-data" podStartSLOduration=3.379894209 podStartE2EDuration="3.832279446s" podCreationTimestamp="2026-02-19 22:52:03 +0000 UTC" firstStartedPulling="2026-02-19 22:52:05.986335149 +0000 UTC m=+5037.178853013" lastFinishedPulling="2026-02-19 22:52:06.438720386 +0000 UTC m=+5037.631238250" observedRunningTime="2026-02-19 22:52:06.820516579 +0000 UTC m=+5038.013034483" watchObservedRunningTime="2026-02-19 22:52:06.832279446 +0000 UTC m=+5038.024797350" Feb 19 22:52:09 crc kubenswrapper[4795]: E0219 22:52:09.076297 4795 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.69:60290->38.102.83.69:37561: read tcp 38.102.83.69:60290->38.102.83.69:37561: read: connection reset by peer Feb 19 22:52:11 crc kubenswrapper[4795]: I0219 22:52:11.496547 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6dc9b786df-rmprl" Feb 19 22:52:11 crc kubenswrapper[4795]: I0219 22:52:11.609331 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54dc9c94cc-wgn4d"] Feb 19 22:52:11 crc kubenswrapper[4795]: I0219 22:52:11.609597 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-54dc9c94cc-wgn4d" podUID="00c8c1c0-da57-4169-a42c-b52386ed3112" containerName="dnsmasq-dns" containerID="cri-o://00f889ab0dea3cf5542a0ac31c64828fab8a4969385f74afc960d2da8d71af69" gracePeriod=10 Feb 19 22:52:11 crc kubenswrapper[4795]: I0219 22:52:11.842589 4795 generic.go:334] "Generic (PLEG): container finished" podID="00c8c1c0-da57-4169-a42c-b52386ed3112" containerID="00f889ab0dea3cf5542a0ac31c64828fab8a4969385f74afc960d2da8d71af69" exitCode=0 Feb 19 
22:52:11 crc kubenswrapper[4795]: I0219 22:52:11.842632 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54dc9c94cc-wgn4d" event={"ID":"00c8c1c0-da57-4169-a42c-b52386ed3112","Type":"ContainerDied","Data":"00f889ab0dea3cf5542a0ac31c64828fab8a4969385f74afc960d2da8d71af69"} Feb 19 22:52:12 crc kubenswrapper[4795]: I0219 22:52:12.046814 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54dc9c94cc-wgn4d" Feb 19 22:52:12 crc kubenswrapper[4795]: I0219 22:52:12.145036 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00c8c1c0-da57-4169-a42c-b52386ed3112-config\") pod \"00c8c1c0-da57-4169-a42c-b52386ed3112\" (UID: \"00c8c1c0-da57-4169-a42c-b52386ed3112\") " Feb 19 22:52:12 crc kubenswrapper[4795]: I0219 22:52:12.145126 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vpmqb\" (UniqueName: \"kubernetes.io/projected/00c8c1c0-da57-4169-a42c-b52386ed3112-kube-api-access-vpmqb\") pod \"00c8c1c0-da57-4169-a42c-b52386ed3112\" (UID: \"00c8c1c0-da57-4169-a42c-b52386ed3112\") " Feb 19 22:52:12 crc kubenswrapper[4795]: I0219 22:52:12.145152 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00c8c1c0-da57-4169-a42c-b52386ed3112-dns-svc\") pod \"00c8c1c0-da57-4169-a42c-b52386ed3112\" (UID: \"00c8c1c0-da57-4169-a42c-b52386ed3112\") " Feb 19 22:52:12 crc kubenswrapper[4795]: I0219 22:52:12.149892 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00c8c1c0-da57-4169-a42c-b52386ed3112-kube-api-access-vpmqb" (OuterVolumeSpecName: "kube-api-access-vpmqb") pod "00c8c1c0-da57-4169-a42c-b52386ed3112" (UID: "00c8c1c0-da57-4169-a42c-b52386ed3112"). InnerVolumeSpecName "kube-api-access-vpmqb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:52:12 crc kubenswrapper[4795]: I0219 22:52:12.183022 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00c8c1c0-da57-4169-a42c-b52386ed3112-config" (OuterVolumeSpecName: "config") pod "00c8c1c0-da57-4169-a42c-b52386ed3112" (UID: "00c8c1c0-da57-4169-a42c-b52386ed3112"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:52:12 crc kubenswrapper[4795]: I0219 22:52:12.183086 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00c8c1c0-da57-4169-a42c-b52386ed3112-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "00c8c1c0-da57-4169-a42c-b52386ed3112" (UID: "00c8c1c0-da57-4169-a42c-b52386ed3112"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:52:12 crc kubenswrapper[4795]: I0219 22:52:12.246954 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00c8c1c0-da57-4169-a42c-b52386ed3112-config\") on node \"crc\" DevicePath \"\"" Feb 19 22:52:12 crc kubenswrapper[4795]: I0219 22:52:12.247009 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vpmqb\" (UniqueName: \"kubernetes.io/projected/00c8c1c0-da57-4169-a42c-b52386ed3112-kube-api-access-vpmqb\") on node \"crc\" DevicePath \"\"" Feb 19 22:52:12 crc kubenswrapper[4795]: I0219 22:52:12.247019 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00c8c1c0-da57-4169-a42c-b52386ed3112-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 22:52:12 crc kubenswrapper[4795]: I0219 22:52:12.373213 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Feb 19 22:52:12 crc kubenswrapper[4795]: E0219 22:52:12.373504 4795 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="00c8c1c0-da57-4169-a42c-b52386ed3112" containerName="init" Feb 19 22:52:12 crc kubenswrapper[4795]: I0219 22:52:12.373516 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="00c8c1c0-da57-4169-a42c-b52386ed3112" containerName="init" Feb 19 22:52:12 crc kubenswrapper[4795]: E0219 22:52:12.373533 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00c8c1c0-da57-4169-a42c-b52386ed3112" containerName="dnsmasq-dns" Feb 19 22:52:12 crc kubenswrapper[4795]: I0219 22:52:12.373540 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="00c8c1c0-da57-4169-a42c-b52386ed3112" containerName="dnsmasq-dns" Feb 19 22:52:12 crc kubenswrapper[4795]: I0219 22:52:12.373682 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="00c8c1c0-da57-4169-a42c-b52386ed3112" containerName="dnsmasq-dns" Feb 19 22:52:12 crc kubenswrapper[4795]: I0219 22:52:12.374429 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Feb 19 22:52:12 crc kubenswrapper[4795]: I0219 22:52:12.376294 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-54jvp" Feb 19 22:52:12 crc kubenswrapper[4795]: I0219 22:52:12.376612 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Feb 19 22:52:12 crc kubenswrapper[4795]: I0219 22:52:12.381805 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Feb 19 22:52:12 crc kubenswrapper[4795]: I0219 22:52:12.395759 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 19 22:52:12 crc kubenswrapper[4795]: I0219 22:52:12.551598 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b989c1be-7a74-42ee-a27b-dc34ce8d727a-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"b989c1be-7a74-42ee-a27b-dc34ce8d727a\") " 
pod="openstack/ovn-northd-0" Feb 19 22:52:12 crc kubenswrapper[4795]: I0219 22:52:12.551642 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b989c1be-7a74-42ee-a27b-dc34ce8d727a-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"b989c1be-7a74-42ee-a27b-dc34ce8d727a\") " pod="openstack/ovn-northd-0" Feb 19 22:52:12 crc kubenswrapper[4795]: I0219 22:52:12.551682 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b989c1be-7a74-42ee-a27b-dc34ce8d727a-scripts\") pod \"ovn-northd-0\" (UID: \"b989c1be-7a74-42ee-a27b-dc34ce8d727a\") " pod="openstack/ovn-northd-0" Feb 19 22:52:12 crc kubenswrapper[4795]: I0219 22:52:12.551818 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b989c1be-7a74-42ee-a27b-dc34ce8d727a-config\") pod \"ovn-northd-0\" (UID: \"b989c1be-7a74-42ee-a27b-dc34ce8d727a\") " pod="openstack/ovn-northd-0" Feb 19 22:52:12 crc kubenswrapper[4795]: I0219 22:52:12.551863 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwv7w\" (UniqueName: \"kubernetes.io/projected/b989c1be-7a74-42ee-a27b-dc34ce8d727a-kube-api-access-kwv7w\") pod \"ovn-northd-0\" (UID: \"b989c1be-7a74-42ee-a27b-dc34ce8d727a\") " pod="openstack/ovn-northd-0" Feb 19 22:52:12 crc kubenswrapper[4795]: I0219 22:52:12.653704 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwv7w\" (UniqueName: \"kubernetes.io/projected/b989c1be-7a74-42ee-a27b-dc34ce8d727a-kube-api-access-kwv7w\") pod \"ovn-northd-0\" (UID: \"b989c1be-7a74-42ee-a27b-dc34ce8d727a\") " pod="openstack/ovn-northd-0" Feb 19 22:52:12 crc kubenswrapper[4795]: I0219 22:52:12.653866 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b989c1be-7a74-42ee-a27b-dc34ce8d727a-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"b989c1be-7a74-42ee-a27b-dc34ce8d727a\") " pod="openstack/ovn-northd-0" Feb 19 22:52:12 crc kubenswrapper[4795]: I0219 22:52:12.653900 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b989c1be-7a74-42ee-a27b-dc34ce8d727a-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"b989c1be-7a74-42ee-a27b-dc34ce8d727a\") " pod="openstack/ovn-northd-0" Feb 19 22:52:12 crc kubenswrapper[4795]: I0219 22:52:12.653950 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b989c1be-7a74-42ee-a27b-dc34ce8d727a-scripts\") pod \"ovn-northd-0\" (UID: \"b989c1be-7a74-42ee-a27b-dc34ce8d727a\") " pod="openstack/ovn-northd-0" Feb 19 22:52:12 crc kubenswrapper[4795]: I0219 22:52:12.654420 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b989c1be-7a74-42ee-a27b-dc34ce8d727a-config\") pod \"ovn-northd-0\" (UID: \"b989c1be-7a74-42ee-a27b-dc34ce8d727a\") " pod="openstack/ovn-northd-0" Feb 19 22:52:12 crc kubenswrapper[4795]: I0219 22:52:12.654816 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b989c1be-7a74-42ee-a27b-dc34ce8d727a-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"b989c1be-7a74-42ee-a27b-dc34ce8d727a\") " pod="openstack/ovn-northd-0" Feb 19 22:52:12 crc kubenswrapper[4795]: I0219 22:52:12.654889 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b989c1be-7a74-42ee-a27b-dc34ce8d727a-scripts\") pod \"ovn-northd-0\" (UID: \"b989c1be-7a74-42ee-a27b-dc34ce8d727a\") " pod="openstack/ovn-northd-0" Feb 19 22:52:12 crc 
kubenswrapper[4795]: I0219 22:52:12.655800 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b989c1be-7a74-42ee-a27b-dc34ce8d727a-config\") pod \"ovn-northd-0\" (UID: \"b989c1be-7a74-42ee-a27b-dc34ce8d727a\") " pod="openstack/ovn-northd-0" Feb 19 22:52:12 crc kubenswrapper[4795]: I0219 22:52:12.657243 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b989c1be-7a74-42ee-a27b-dc34ce8d727a-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"b989c1be-7a74-42ee-a27b-dc34ce8d727a\") " pod="openstack/ovn-northd-0" Feb 19 22:52:12 crc kubenswrapper[4795]: I0219 22:52:12.681583 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwv7w\" (UniqueName: \"kubernetes.io/projected/b989c1be-7a74-42ee-a27b-dc34ce8d727a-kube-api-access-kwv7w\") pod \"ovn-northd-0\" (UID: \"b989c1be-7a74-42ee-a27b-dc34ce8d727a\") " pod="openstack/ovn-northd-0" Feb 19 22:52:12 crc kubenswrapper[4795]: I0219 22:52:12.692565 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Feb 19 22:52:12 crc kubenswrapper[4795]: I0219 22:52:12.851684 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54dc9c94cc-wgn4d" event={"ID":"00c8c1c0-da57-4169-a42c-b52386ed3112","Type":"ContainerDied","Data":"188ca574ffaf1ffa388763be3f13a7eb4afedf6a896f0e7de263093402b89351"} Feb 19 22:52:12 crc kubenswrapper[4795]: I0219 22:52:12.851944 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54dc9c94cc-wgn4d" Feb 19 22:52:12 crc kubenswrapper[4795]: I0219 22:52:12.851974 4795 scope.go:117] "RemoveContainer" containerID="00f889ab0dea3cf5542a0ac31c64828fab8a4969385f74afc960d2da8d71af69" Feb 19 22:52:12 crc kubenswrapper[4795]: I0219 22:52:12.876210 4795 scope.go:117] "RemoveContainer" containerID="99d134900f88fca6c416350229ec6ebd6aa32403658dd7fa964885fc2b39579a" Feb 19 22:52:12 crc kubenswrapper[4795]: I0219 22:52:12.897182 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54dc9c94cc-wgn4d"] Feb 19 22:52:12 crc kubenswrapper[4795]: I0219 22:52:12.902888 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-54dc9c94cc-wgn4d"] Feb 19 22:52:13 crc kubenswrapper[4795]: I0219 22:52:13.118088 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 19 22:52:13 crc kubenswrapper[4795]: I0219 22:52:13.521449 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00c8c1c0-da57-4169-a42c-b52386ed3112" path="/var/lib/kubelet/pods/00c8c1c0-da57-4169-a42c-b52386ed3112/volumes" Feb 19 22:52:13 crc kubenswrapper[4795]: I0219 22:52:13.863053 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"b989c1be-7a74-42ee-a27b-dc34ce8d727a","Type":"ContainerStarted","Data":"0c600f96eff9c987bbf1c42f97977ddf8c53ab241d1f8c672da01b40b928914e"} Feb 19 22:52:13 crc kubenswrapper[4795]: I0219 22:52:13.863473 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Feb 19 22:52:13 crc kubenswrapper[4795]: I0219 22:52:13.863496 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"b989c1be-7a74-42ee-a27b-dc34ce8d727a","Type":"ContainerStarted","Data":"c57eac7d730fe0b608bf5efe642341194773ab7bf8b980686440d17b6be9377d"} Feb 19 22:52:13 crc kubenswrapper[4795]: I0219 22:52:13.863506 4795 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"b989c1be-7a74-42ee-a27b-dc34ce8d727a","Type":"ContainerStarted","Data":"1975f0327f0564a3a40276b367239b249c816055a1dcd73225c90dc3f0fa2d3b"} Feb 19 22:52:13 crc kubenswrapper[4795]: I0219 22:52:13.891814 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=1.8917904079999999 podStartE2EDuration="1.891790408s" podCreationTimestamp="2026-02-19 22:52:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:52:13.881795021 +0000 UTC m=+5045.074312905" watchObservedRunningTime="2026-02-19 22:52:13.891790408 +0000 UTC m=+5045.084308282" Feb 19 22:52:14 crc kubenswrapper[4795]: I0219 22:52:14.512179 4795 scope.go:117] "RemoveContainer" containerID="4bde20d829d39ae7600d60632441672a1a7026beecea3debeab953cdf5de428a" Feb 19 22:52:14 crc kubenswrapper[4795]: E0219 22:52:14.512619 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:52:18 crc kubenswrapper[4795]: I0219 22:52:18.271331 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-pgc5v"] Feb 19 22:52:18 crc kubenswrapper[4795]: I0219 22:52:18.272615 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-pgc5v" Feb 19 22:52:18 crc kubenswrapper[4795]: I0219 22:52:18.293604 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-pgc5v"] Feb 19 22:52:18 crc kubenswrapper[4795]: I0219 22:52:18.308524 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-af99-account-create-update-5rdfd"] Feb 19 22:52:18 crc kubenswrapper[4795]: I0219 22:52:18.309940 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-af99-account-create-update-5rdfd" Feb 19 22:52:18 crc kubenswrapper[4795]: I0219 22:52:18.320072 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Feb 19 22:52:18 crc kubenswrapper[4795]: I0219 22:52:18.340361 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-af99-account-create-update-5rdfd"] Feb 19 22:52:18 crc kubenswrapper[4795]: I0219 22:52:18.471106 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/036fd6f7-0c88-4c92-9a98-0a774124c8fd-operator-scripts\") pod \"keystone-db-create-pgc5v\" (UID: \"036fd6f7-0c88-4c92-9a98-0a774124c8fd\") " pod="openstack/keystone-db-create-pgc5v" Feb 19 22:52:18 crc kubenswrapper[4795]: I0219 22:52:18.471314 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zczw7\" (UniqueName: \"kubernetes.io/projected/036fd6f7-0c88-4c92-9a98-0a774124c8fd-kube-api-access-zczw7\") pod \"keystone-db-create-pgc5v\" (UID: \"036fd6f7-0c88-4c92-9a98-0a774124c8fd\") " pod="openstack/keystone-db-create-pgc5v" Feb 19 22:52:18 crc kubenswrapper[4795]: I0219 22:52:18.471423 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5rc4\" (UniqueName: 
\"kubernetes.io/projected/d1e0382a-40d3-42e1-93d3-e5098af1e54f-kube-api-access-v5rc4\") pod \"keystone-af99-account-create-update-5rdfd\" (UID: \"d1e0382a-40d3-42e1-93d3-e5098af1e54f\") " pod="openstack/keystone-af99-account-create-update-5rdfd" Feb 19 22:52:18 crc kubenswrapper[4795]: I0219 22:52:18.471470 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d1e0382a-40d3-42e1-93d3-e5098af1e54f-operator-scripts\") pod \"keystone-af99-account-create-update-5rdfd\" (UID: \"d1e0382a-40d3-42e1-93d3-e5098af1e54f\") " pod="openstack/keystone-af99-account-create-update-5rdfd" Feb 19 22:52:18 crc kubenswrapper[4795]: I0219 22:52:18.573512 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/036fd6f7-0c88-4c92-9a98-0a774124c8fd-operator-scripts\") pod \"keystone-db-create-pgc5v\" (UID: \"036fd6f7-0c88-4c92-9a98-0a774124c8fd\") " pod="openstack/keystone-db-create-pgc5v" Feb 19 22:52:18 crc kubenswrapper[4795]: I0219 22:52:18.573736 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zczw7\" (UniqueName: \"kubernetes.io/projected/036fd6f7-0c88-4c92-9a98-0a774124c8fd-kube-api-access-zczw7\") pod \"keystone-db-create-pgc5v\" (UID: \"036fd6f7-0c88-4c92-9a98-0a774124c8fd\") " pod="openstack/keystone-db-create-pgc5v" Feb 19 22:52:18 crc kubenswrapper[4795]: I0219 22:52:18.573890 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5rc4\" (UniqueName: \"kubernetes.io/projected/d1e0382a-40d3-42e1-93d3-e5098af1e54f-kube-api-access-v5rc4\") pod \"keystone-af99-account-create-update-5rdfd\" (UID: \"d1e0382a-40d3-42e1-93d3-e5098af1e54f\") " pod="openstack/keystone-af99-account-create-update-5rdfd" Feb 19 22:52:18 crc kubenswrapper[4795]: I0219 22:52:18.573963 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d1e0382a-40d3-42e1-93d3-e5098af1e54f-operator-scripts\") pod \"keystone-af99-account-create-update-5rdfd\" (UID: \"d1e0382a-40d3-42e1-93d3-e5098af1e54f\") " pod="openstack/keystone-af99-account-create-update-5rdfd" Feb 19 22:52:18 crc kubenswrapper[4795]: I0219 22:52:18.574290 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/036fd6f7-0c88-4c92-9a98-0a774124c8fd-operator-scripts\") pod \"keystone-db-create-pgc5v\" (UID: \"036fd6f7-0c88-4c92-9a98-0a774124c8fd\") " pod="openstack/keystone-db-create-pgc5v" Feb 19 22:52:18 crc kubenswrapper[4795]: I0219 22:52:18.576030 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d1e0382a-40d3-42e1-93d3-e5098af1e54f-operator-scripts\") pod \"keystone-af99-account-create-update-5rdfd\" (UID: \"d1e0382a-40d3-42e1-93d3-e5098af1e54f\") " pod="openstack/keystone-af99-account-create-update-5rdfd" Feb 19 22:52:18 crc kubenswrapper[4795]: I0219 22:52:18.600246 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5rc4\" (UniqueName: \"kubernetes.io/projected/d1e0382a-40d3-42e1-93d3-e5098af1e54f-kube-api-access-v5rc4\") pod \"keystone-af99-account-create-update-5rdfd\" (UID: \"d1e0382a-40d3-42e1-93d3-e5098af1e54f\") " pod="openstack/keystone-af99-account-create-update-5rdfd" Feb 19 22:52:18 crc kubenswrapper[4795]: I0219 22:52:18.600878 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zczw7\" (UniqueName: \"kubernetes.io/projected/036fd6f7-0c88-4c92-9a98-0a774124c8fd-kube-api-access-zczw7\") pod \"keystone-db-create-pgc5v\" (UID: \"036fd6f7-0c88-4c92-9a98-0a774124c8fd\") " pod="openstack/keystone-db-create-pgc5v" Feb 19 22:52:18 crc kubenswrapper[4795]: I0219 22:52:18.643991 4795 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-af99-account-create-update-5rdfd" Feb 19 22:52:18 crc kubenswrapper[4795]: I0219 22:52:18.891820 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-pgc5v" Feb 19 22:52:19 crc kubenswrapper[4795]: I0219 22:52:19.120635 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-af99-account-create-update-5rdfd"] Feb 19 22:52:19 crc kubenswrapper[4795]: W0219 22:52:19.128777 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1e0382a_40d3_42e1_93d3_e5098af1e54f.slice/crio-78ef92caa6d040d97226723b78da3eb0edd275fd2ef294ec591c2bcce83c400c WatchSource:0}: Error finding container 78ef92caa6d040d97226723b78da3eb0edd275fd2ef294ec591c2bcce83c400c: Status 404 returned error can't find the container with id 78ef92caa6d040d97226723b78da3eb0edd275fd2ef294ec591c2bcce83c400c Feb 19 22:52:19 crc kubenswrapper[4795]: I0219 22:52:19.335743 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-pgc5v"] Feb 19 22:52:19 crc kubenswrapper[4795]: I0219 22:52:19.913542 4795 generic.go:334] "Generic (PLEG): container finished" podID="036fd6f7-0c88-4c92-9a98-0a774124c8fd" containerID="e2855b43837d8ca73ce1e28756facf9b648ea7b22ec3a7d6bc132133d332b880" exitCode=0 Feb 19 22:52:19 crc kubenswrapper[4795]: I0219 22:52:19.914984 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-pgc5v" event={"ID":"036fd6f7-0c88-4c92-9a98-0a774124c8fd","Type":"ContainerDied","Data":"e2855b43837d8ca73ce1e28756facf9b648ea7b22ec3a7d6bc132133d332b880"} Feb 19 22:52:19 crc kubenswrapper[4795]: I0219 22:52:19.915050 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-pgc5v" 
event={"ID":"036fd6f7-0c88-4c92-9a98-0a774124c8fd","Type":"ContainerStarted","Data":"195a684b2f3b5afb8957fbe27e13ee24b478b3afdc0c86ac9506472105a02b91"} Feb 19 22:52:19 crc kubenswrapper[4795]: I0219 22:52:19.915636 4795 generic.go:334] "Generic (PLEG): container finished" podID="d1e0382a-40d3-42e1-93d3-e5098af1e54f" containerID="f711a9de7781799454b8f0272a2636f2533360c8ba6ce3a134936f4cf9908d61" exitCode=0 Feb 19 22:52:19 crc kubenswrapper[4795]: I0219 22:52:19.915675 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-af99-account-create-update-5rdfd" event={"ID":"d1e0382a-40d3-42e1-93d3-e5098af1e54f","Type":"ContainerDied","Data":"f711a9de7781799454b8f0272a2636f2533360c8ba6ce3a134936f4cf9908d61"} Feb 19 22:52:19 crc kubenswrapper[4795]: I0219 22:52:19.915694 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-af99-account-create-update-5rdfd" event={"ID":"d1e0382a-40d3-42e1-93d3-e5098af1e54f","Type":"ContainerStarted","Data":"78ef92caa6d040d97226723b78da3eb0edd275fd2ef294ec591c2bcce83c400c"} Feb 19 22:52:21 crc kubenswrapper[4795]: I0219 22:52:21.225238 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-pgc5v" Feb 19 22:52:21 crc kubenswrapper[4795]: I0219 22:52:21.322274 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zczw7\" (UniqueName: \"kubernetes.io/projected/036fd6f7-0c88-4c92-9a98-0a774124c8fd-kube-api-access-zczw7\") pod \"036fd6f7-0c88-4c92-9a98-0a774124c8fd\" (UID: \"036fd6f7-0c88-4c92-9a98-0a774124c8fd\") " Feb 19 22:52:21 crc kubenswrapper[4795]: I0219 22:52:21.322440 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/036fd6f7-0c88-4c92-9a98-0a774124c8fd-operator-scripts\") pod \"036fd6f7-0c88-4c92-9a98-0a774124c8fd\" (UID: \"036fd6f7-0c88-4c92-9a98-0a774124c8fd\") " Feb 19 22:52:21 crc kubenswrapper[4795]: I0219 22:52:21.323099 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/036fd6f7-0c88-4c92-9a98-0a774124c8fd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "036fd6f7-0c88-4c92-9a98-0a774124c8fd" (UID: "036fd6f7-0c88-4c92-9a98-0a774124c8fd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:52:21 crc kubenswrapper[4795]: I0219 22:52:21.327955 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/036fd6f7-0c88-4c92-9a98-0a774124c8fd-kube-api-access-zczw7" (OuterVolumeSpecName: "kube-api-access-zczw7") pod "036fd6f7-0c88-4c92-9a98-0a774124c8fd" (UID: "036fd6f7-0c88-4c92-9a98-0a774124c8fd"). InnerVolumeSpecName "kube-api-access-zczw7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:52:21 crc kubenswrapper[4795]: I0219 22:52:21.385074 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-af99-account-create-update-5rdfd" Feb 19 22:52:21 crc kubenswrapper[4795]: I0219 22:52:21.423705 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d1e0382a-40d3-42e1-93d3-e5098af1e54f-operator-scripts\") pod \"d1e0382a-40d3-42e1-93d3-e5098af1e54f\" (UID: \"d1e0382a-40d3-42e1-93d3-e5098af1e54f\") " Feb 19 22:52:21 crc kubenswrapper[4795]: I0219 22:52:21.424033 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5rc4\" (UniqueName: \"kubernetes.io/projected/d1e0382a-40d3-42e1-93d3-e5098af1e54f-kube-api-access-v5rc4\") pod \"d1e0382a-40d3-42e1-93d3-e5098af1e54f\" (UID: \"d1e0382a-40d3-42e1-93d3-e5098af1e54f\") " Feb 19 22:52:21 crc kubenswrapper[4795]: I0219 22:52:21.424406 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zczw7\" (UniqueName: \"kubernetes.io/projected/036fd6f7-0c88-4c92-9a98-0a774124c8fd-kube-api-access-zczw7\") on node \"crc\" DevicePath \"\"" Feb 19 22:52:21 crc kubenswrapper[4795]: I0219 22:52:21.424444 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/036fd6f7-0c88-4c92-9a98-0a774124c8fd-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 22:52:21 crc kubenswrapper[4795]: I0219 22:52:21.424682 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1e0382a-40d3-42e1-93d3-e5098af1e54f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d1e0382a-40d3-42e1-93d3-e5098af1e54f" (UID: "d1e0382a-40d3-42e1-93d3-e5098af1e54f"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:52:21 crc kubenswrapper[4795]: I0219 22:52:21.427000 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1e0382a-40d3-42e1-93d3-e5098af1e54f-kube-api-access-v5rc4" (OuterVolumeSpecName: "kube-api-access-v5rc4") pod "d1e0382a-40d3-42e1-93d3-e5098af1e54f" (UID: "d1e0382a-40d3-42e1-93d3-e5098af1e54f"). InnerVolumeSpecName "kube-api-access-v5rc4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:52:21 crc kubenswrapper[4795]: I0219 22:52:21.525549 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d1e0382a-40d3-42e1-93d3-e5098af1e54f-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 22:52:21 crc kubenswrapper[4795]: I0219 22:52:21.525587 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5rc4\" (UniqueName: \"kubernetes.io/projected/d1e0382a-40d3-42e1-93d3-e5098af1e54f-kube-api-access-v5rc4\") on node \"crc\" DevicePath \"\"" Feb 19 22:52:21 crc kubenswrapper[4795]: I0219 22:52:21.932454 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-pgc5v" event={"ID":"036fd6f7-0c88-4c92-9a98-0a774124c8fd","Type":"ContainerDied","Data":"195a684b2f3b5afb8957fbe27e13ee24b478b3afdc0c86ac9506472105a02b91"} Feb 19 22:52:21 crc kubenswrapper[4795]: I0219 22:52:21.932501 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="195a684b2f3b5afb8957fbe27e13ee24b478b3afdc0c86ac9506472105a02b91" Feb 19 22:52:21 crc kubenswrapper[4795]: I0219 22:52:21.932783 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-pgc5v" Feb 19 22:52:21 crc kubenswrapper[4795]: I0219 22:52:21.934323 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-af99-account-create-update-5rdfd" event={"ID":"d1e0382a-40d3-42e1-93d3-e5098af1e54f","Type":"ContainerDied","Data":"78ef92caa6d040d97226723b78da3eb0edd275fd2ef294ec591c2bcce83c400c"} Feb 19 22:52:21 crc kubenswrapper[4795]: I0219 22:52:21.934357 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="78ef92caa6d040d97226723b78da3eb0edd275fd2ef294ec591c2bcce83c400c" Feb 19 22:52:21 crc kubenswrapper[4795]: I0219 22:52:21.934361 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-af99-account-create-update-5rdfd" Feb 19 22:52:23 crc kubenswrapper[4795]: I0219 22:52:23.769987 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-bm6ln"] Feb 19 22:52:23 crc kubenswrapper[4795]: E0219 22:52:23.770711 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="036fd6f7-0c88-4c92-9a98-0a774124c8fd" containerName="mariadb-database-create" Feb 19 22:52:23 crc kubenswrapper[4795]: I0219 22:52:23.770727 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="036fd6f7-0c88-4c92-9a98-0a774124c8fd" containerName="mariadb-database-create" Feb 19 22:52:23 crc kubenswrapper[4795]: E0219 22:52:23.770761 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1e0382a-40d3-42e1-93d3-e5098af1e54f" containerName="mariadb-account-create-update" Feb 19 22:52:23 crc kubenswrapper[4795]: I0219 22:52:23.770769 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1e0382a-40d3-42e1-93d3-e5098af1e54f" containerName="mariadb-account-create-update" Feb 19 22:52:23 crc kubenswrapper[4795]: I0219 22:52:23.770960 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1e0382a-40d3-42e1-93d3-e5098af1e54f" 
containerName="mariadb-account-create-update" Feb 19 22:52:23 crc kubenswrapper[4795]: I0219 22:52:23.770990 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="036fd6f7-0c88-4c92-9a98-0a774124c8fd" containerName="mariadb-database-create" Feb 19 22:52:23 crc kubenswrapper[4795]: I0219 22:52:23.771637 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-bm6ln" Feb 19 22:52:23 crc kubenswrapper[4795]: I0219 22:52:23.774413 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 19 22:52:23 crc kubenswrapper[4795]: I0219 22:52:23.774557 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-bpdln" Feb 19 22:52:23 crc kubenswrapper[4795]: I0219 22:52:23.775050 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 19 22:52:23 crc kubenswrapper[4795]: I0219 22:52:23.776465 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 19 22:52:23 crc kubenswrapper[4795]: I0219 22:52:23.786097 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-bm6ln"] Feb 19 22:52:23 crc kubenswrapper[4795]: I0219 22:52:23.962491 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrw9v\" (UniqueName: \"kubernetes.io/projected/96625ae6-8eb0-43d0-a180-20c79dfd6717-kube-api-access-wrw9v\") pod \"keystone-db-sync-bm6ln\" (UID: \"96625ae6-8eb0-43d0-a180-20c79dfd6717\") " pod="openstack/keystone-db-sync-bm6ln" Feb 19 22:52:23 crc kubenswrapper[4795]: I0219 22:52:23.962736 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96625ae6-8eb0-43d0-a180-20c79dfd6717-config-data\") pod \"keystone-db-sync-bm6ln\" (UID: 
\"96625ae6-8eb0-43d0-a180-20c79dfd6717\") " pod="openstack/keystone-db-sync-bm6ln" Feb 19 22:52:23 crc kubenswrapper[4795]: I0219 22:52:23.962779 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96625ae6-8eb0-43d0-a180-20c79dfd6717-combined-ca-bundle\") pod \"keystone-db-sync-bm6ln\" (UID: \"96625ae6-8eb0-43d0-a180-20c79dfd6717\") " pod="openstack/keystone-db-sync-bm6ln" Feb 19 22:52:24 crc kubenswrapper[4795]: I0219 22:52:24.064532 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrw9v\" (UniqueName: \"kubernetes.io/projected/96625ae6-8eb0-43d0-a180-20c79dfd6717-kube-api-access-wrw9v\") pod \"keystone-db-sync-bm6ln\" (UID: \"96625ae6-8eb0-43d0-a180-20c79dfd6717\") " pod="openstack/keystone-db-sync-bm6ln" Feb 19 22:52:24 crc kubenswrapper[4795]: I0219 22:52:24.064697 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96625ae6-8eb0-43d0-a180-20c79dfd6717-config-data\") pod \"keystone-db-sync-bm6ln\" (UID: \"96625ae6-8eb0-43d0-a180-20c79dfd6717\") " pod="openstack/keystone-db-sync-bm6ln" Feb 19 22:52:24 crc kubenswrapper[4795]: I0219 22:52:24.064721 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96625ae6-8eb0-43d0-a180-20c79dfd6717-combined-ca-bundle\") pod \"keystone-db-sync-bm6ln\" (UID: \"96625ae6-8eb0-43d0-a180-20c79dfd6717\") " pod="openstack/keystone-db-sync-bm6ln" Feb 19 22:52:24 crc kubenswrapper[4795]: I0219 22:52:24.072452 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96625ae6-8eb0-43d0-a180-20c79dfd6717-combined-ca-bundle\") pod \"keystone-db-sync-bm6ln\" (UID: \"96625ae6-8eb0-43d0-a180-20c79dfd6717\") " pod="openstack/keystone-db-sync-bm6ln" 
Feb 19 22:52:24 crc kubenswrapper[4795]: I0219 22:52:24.074355 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96625ae6-8eb0-43d0-a180-20c79dfd6717-config-data\") pod \"keystone-db-sync-bm6ln\" (UID: \"96625ae6-8eb0-43d0-a180-20c79dfd6717\") " pod="openstack/keystone-db-sync-bm6ln" Feb 19 22:52:24 crc kubenswrapper[4795]: I0219 22:52:24.086730 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrw9v\" (UniqueName: \"kubernetes.io/projected/96625ae6-8eb0-43d0-a180-20c79dfd6717-kube-api-access-wrw9v\") pod \"keystone-db-sync-bm6ln\" (UID: \"96625ae6-8eb0-43d0-a180-20c79dfd6717\") " pod="openstack/keystone-db-sync-bm6ln" Feb 19 22:52:24 crc kubenswrapper[4795]: I0219 22:52:24.092876 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-bm6ln" Feb 19 22:52:24 crc kubenswrapper[4795]: I0219 22:52:24.543364 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-bm6ln"] Feb 19 22:52:24 crc kubenswrapper[4795]: W0219 22:52:24.553295 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod96625ae6_8eb0_43d0_a180_20c79dfd6717.slice/crio-b94d855fad5a15c8ec84580ccd5857e68c5f03ee7e5a4a3d61a2c5f61bdf3807 WatchSource:0}: Error finding container b94d855fad5a15c8ec84580ccd5857e68c5f03ee7e5a4a3d61a2c5f61bdf3807: Status 404 returned error can't find the container with id b94d855fad5a15c8ec84580ccd5857e68c5f03ee7e5a4a3d61a2c5f61bdf3807 Feb 19 22:52:24 crc kubenswrapper[4795]: I0219 22:52:24.967672 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-bm6ln" event={"ID":"96625ae6-8eb0-43d0-a180-20c79dfd6717","Type":"ContainerStarted","Data":"000fa8f747858d495d2c6d2c850ff5f2717d6c22150f68c950eeae0bca0bd7f7"} Feb 19 22:52:24 crc kubenswrapper[4795]: I0219 22:52:24.968012 4795 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-bm6ln" event={"ID":"96625ae6-8eb0-43d0-a180-20c79dfd6717","Type":"ContainerStarted","Data":"b94d855fad5a15c8ec84580ccd5857e68c5f03ee7e5a4a3d61a2c5f61bdf3807"} Feb 19 22:52:24 crc kubenswrapper[4795]: I0219 22:52:24.991810 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-bm6ln" podStartSLOduration=1.9917846670000001 podStartE2EDuration="1.991784667s" podCreationTimestamp="2026-02-19 22:52:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:52:24.984412195 +0000 UTC m=+5056.176930099" watchObservedRunningTime="2026-02-19 22:52:24.991784667 +0000 UTC m=+5056.184302551" Feb 19 22:52:26 crc kubenswrapper[4795]: I0219 22:52:26.987475 4795 generic.go:334] "Generic (PLEG): container finished" podID="96625ae6-8eb0-43d0-a180-20c79dfd6717" containerID="000fa8f747858d495d2c6d2c850ff5f2717d6c22150f68c950eeae0bca0bd7f7" exitCode=0 Feb 19 22:52:26 crc kubenswrapper[4795]: I0219 22:52:26.987554 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-bm6ln" event={"ID":"96625ae6-8eb0-43d0-a180-20c79dfd6717","Type":"ContainerDied","Data":"000fa8f747858d495d2c6d2c850ff5f2717d6c22150f68c950eeae0bca0bd7f7"} Feb 19 22:52:27 crc kubenswrapper[4795]: I0219 22:52:27.514137 4795 scope.go:117] "RemoveContainer" containerID="4bde20d829d39ae7600d60632441672a1a7026beecea3debeab953cdf5de428a" Feb 19 22:52:27 crc kubenswrapper[4795]: E0219 22:52:27.515647 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" 
podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:52:28 crc kubenswrapper[4795]: I0219 22:52:28.402701 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-bm6ln" Feb 19 22:52:28 crc kubenswrapper[4795]: I0219 22:52:28.571103 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrw9v\" (UniqueName: \"kubernetes.io/projected/96625ae6-8eb0-43d0-a180-20c79dfd6717-kube-api-access-wrw9v\") pod \"96625ae6-8eb0-43d0-a180-20c79dfd6717\" (UID: \"96625ae6-8eb0-43d0-a180-20c79dfd6717\") " Feb 19 22:52:28 crc kubenswrapper[4795]: I0219 22:52:28.571386 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96625ae6-8eb0-43d0-a180-20c79dfd6717-config-data\") pod \"96625ae6-8eb0-43d0-a180-20c79dfd6717\" (UID: \"96625ae6-8eb0-43d0-a180-20c79dfd6717\") " Feb 19 22:52:28 crc kubenswrapper[4795]: I0219 22:52:28.571551 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96625ae6-8eb0-43d0-a180-20c79dfd6717-combined-ca-bundle\") pod \"96625ae6-8eb0-43d0-a180-20c79dfd6717\" (UID: \"96625ae6-8eb0-43d0-a180-20c79dfd6717\") " Feb 19 22:52:28 crc kubenswrapper[4795]: I0219 22:52:28.579450 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96625ae6-8eb0-43d0-a180-20c79dfd6717-kube-api-access-wrw9v" (OuterVolumeSpecName: "kube-api-access-wrw9v") pod "96625ae6-8eb0-43d0-a180-20c79dfd6717" (UID: "96625ae6-8eb0-43d0-a180-20c79dfd6717"). InnerVolumeSpecName "kube-api-access-wrw9v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:52:28 crc kubenswrapper[4795]: I0219 22:52:28.599711 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96625ae6-8eb0-43d0-a180-20c79dfd6717-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "96625ae6-8eb0-43d0-a180-20c79dfd6717" (UID: "96625ae6-8eb0-43d0-a180-20c79dfd6717"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:52:28 crc kubenswrapper[4795]: I0219 22:52:28.640393 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96625ae6-8eb0-43d0-a180-20c79dfd6717-config-data" (OuterVolumeSpecName: "config-data") pod "96625ae6-8eb0-43d0-a180-20c79dfd6717" (UID: "96625ae6-8eb0-43d0-a180-20c79dfd6717"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:52:28 crc kubenswrapper[4795]: I0219 22:52:28.673580 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96625ae6-8eb0-43d0-a180-20c79dfd6717-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 22:52:28 crc kubenswrapper[4795]: I0219 22:52:28.673630 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrw9v\" (UniqueName: \"kubernetes.io/projected/96625ae6-8eb0-43d0-a180-20c79dfd6717-kube-api-access-wrw9v\") on node \"crc\" DevicePath \"\"" Feb 19 22:52:28 crc kubenswrapper[4795]: I0219 22:52:28.673652 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96625ae6-8eb0-43d0-a180-20c79dfd6717-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.013768 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-bm6ln" 
event={"ID":"96625ae6-8eb0-43d0-a180-20c79dfd6717","Type":"ContainerDied","Data":"b94d855fad5a15c8ec84580ccd5857e68c5f03ee7e5a4a3d61a2c5f61bdf3807"} Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.013804 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b94d855fad5a15c8ec84580ccd5857e68c5f03ee7e5a4a3d61a2c5f61bdf3807" Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.013905 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-bm6ln" Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.252620 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5cbd7f9ccc-v47nm"] Feb 19 22:52:29 crc kubenswrapper[4795]: E0219 22:52:29.252986 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96625ae6-8eb0-43d0-a180-20c79dfd6717" containerName="keystone-db-sync" Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.253012 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="96625ae6-8eb0-43d0-a180-20c79dfd6717" containerName="keystone-db-sync" Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.253350 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="96625ae6-8eb0-43d0-a180-20c79dfd6717" containerName="keystone-db-sync" Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.254636 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cbd7f9ccc-v47nm" Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.274038 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5cbd7f9ccc-v47nm"] Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.307058 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-26wvp"] Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.308221 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-26wvp" Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.312374 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.312697 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.312529 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-bpdln" Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.312529 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.312543 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.326105 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-26wvp"] Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.386867 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/74bde2c2-542d-4473-8a2d-4276ef12f1a1-ovsdbserver-sb\") pod \"dnsmasq-dns-5cbd7f9ccc-v47nm\" (UID: \"74bde2c2-542d-4473-8a2d-4276ef12f1a1\") " pod="openstack/dnsmasq-dns-5cbd7f9ccc-v47nm" Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.386927 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74bde2c2-542d-4473-8a2d-4276ef12f1a1-config\") pod \"dnsmasq-dns-5cbd7f9ccc-v47nm\" (UID: \"74bde2c2-542d-4473-8a2d-4276ef12f1a1\") " pod="openstack/dnsmasq-dns-5cbd7f9ccc-v47nm" Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.386959 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-g49lp\" (UniqueName: \"kubernetes.io/projected/74bde2c2-542d-4473-8a2d-4276ef12f1a1-kube-api-access-g49lp\") pod \"dnsmasq-dns-5cbd7f9ccc-v47nm\" (UID: \"74bde2c2-542d-4473-8a2d-4276ef12f1a1\") " pod="openstack/dnsmasq-dns-5cbd7f9ccc-v47nm" Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.387031 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hjfr\" (UniqueName: \"kubernetes.io/projected/415c9781-58d2-447a-8e0c-2fed3a02ef09-kube-api-access-6hjfr\") pod \"keystone-bootstrap-26wvp\" (UID: \"415c9781-58d2-447a-8e0c-2fed3a02ef09\") " pod="openstack/keystone-bootstrap-26wvp" Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.387054 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/415c9781-58d2-447a-8e0c-2fed3a02ef09-config-data\") pod \"keystone-bootstrap-26wvp\" (UID: \"415c9781-58d2-447a-8e0c-2fed3a02ef09\") " pod="openstack/keystone-bootstrap-26wvp" Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.387086 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/415c9781-58d2-447a-8e0c-2fed3a02ef09-credential-keys\") pod \"keystone-bootstrap-26wvp\" (UID: \"415c9781-58d2-447a-8e0c-2fed3a02ef09\") " pod="openstack/keystone-bootstrap-26wvp" Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.387139 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/74bde2c2-542d-4473-8a2d-4276ef12f1a1-ovsdbserver-nb\") pod \"dnsmasq-dns-5cbd7f9ccc-v47nm\" (UID: \"74bde2c2-542d-4473-8a2d-4276ef12f1a1\") " pod="openstack/dnsmasq-dns-5cbd7f9ccc-v47nm" Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.387191 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/415c9781-58d2-447a-8e0c-2fed3a02ef09-scripts\") pod \"keystone-bootstrap-26wvp\" (UID: \"415c9781-58d2-447a-8e0c-2fed3a02ef09\") " pod="openstack/keystone-bootstrap-26wvp" Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.387219 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/415c9781-58d2-447a-8e0c-2fed3a02ef09-fernet-keys\") pod \"keystone-bootstrap-26wvp\" (UID: \"415c9781-58d2-447a-8e0c-2fed3a02ef09\") " pod="openstack/keystone-bootstrap-26wvp" Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.387260 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/415c9781-58d2-447a-8e0c-2fed3a02ef09-combined-ca-bundle\") pod \"keystone-bootstrap-26wvp\" (UID: \"415c9781-58d2-447a-8e0c-2fed3a02ef09\") " pod="openstack/keystone-bootstrap-26wvp" Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.387292 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/74bde2c2-542d-4473-8a2d-4276ef12f1a1-dns-svc\") pod \"dnsmasq-dns-5cbd7f9ccc-v47nm\" (UID: \"74bde2c2-542d-4473-8a2d-4276ef12f1a1\") " pod="openstack/dnsmasq-dns-5cbd7f9ccc-v47nm" Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.488068 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/74bde2c2-542d-4473-8a2d-4276ef12f1a1-ovsdbserver-nb\") pod \"dnsmasq-dns-5cbd7f9ccc-v47nm\" (UID: \"74bde2c2-542d-4473-8a2d-4276ef12f1a1\") " pod="openstack/dnsmasq-dns-5cbd7f9ccc-v47nm" Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.489010 4795 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/415c9781-58d2-447a-8e0c-2fed3a02ef09-scripts\") pod \"keystone-bootstrap-26wvp\" (UID: \"415c9781-58d2-447a-8e0c-2fed3a02ef09\") " pod="openstack/keystone-bootstrap-26wvp" Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.489104 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/415c9781-58d2-447a-8e0c-2fed3a02ef09-fernet-keys\") pod \"keystone-bootstrap-26wvp\" (UID: \"415c9781-58d2-447a-8e0c-2fed3a02ef09\") " pod="openstack/keystone-bootstrap-26wvp" Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.489201 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/74bde2c2-542d-4473-8a2d-4276ef12f1a1-ovsdbserver-nb\") pod \"dnsmasq-dns-5cbd7f9ccc-v47nm\" (UID: \"74bde2c2-542d-4473-8a2d-4276ef12f1a1\") " pod="openstack/dnsmasq-dns-5cbd7f9ccc-v47nm" Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.489309 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/415c9781-58d2-447a-8e0c-2fed3a02ef09-combined-ca-bundle\") pod \"keystone-bootstrap-26wvp\" (UID: \"415c9781-58d2-447a-8e0c-2fed3a02ef09\") " pod="openstack/keystone-bootstrap-26wvp" Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.489407 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/74bde2c2-542d-4473-8a2d-4276ef12f1a1-dns-svc\") pod \"dnsmasq-dns-5cbd7f9ccc-v47nm\" (UID: \"74bde2c2-542d-4473-8a2d-4276ef12f1a1\") " pod="openstack/dnsmasq-dns-5cbd7f9ccc-v47nm" Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.489512 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/74bde2c2-542d-4473-8a2d-4276ef12f1a1-ovsdbserver-sb\") pod \"dnsmasq-dns-5cbd7f9ccc-v47nm\" (UID: \"74bde2c2-542d-4473-8a2d-4276ef12f1a1\") " pod="openstack/dnsmasq-dns-5cbd7f9ccc-v47nm" Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.489581 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74bde2c2-542d-4473-8a2d-4276ef12f1a1-config\") pod \"dnsmasq-dns-5cbd7f9ccc-v47nm\" (UID: \"74bde2c2-542d-4473-8a2d-4276ef12f1a1\") " pod="openstack/dnsmasq-dns-5cbd7f9ccc-v47nm" Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.489661 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g49lp\" (UniqueName: \"kubernetes.io/projected/74bde2c2-542d-4473-8a2d-4276ef12f1a1-kube-api-access-g49lp\") pod \"dnsmasq-dns-5cbd7f9ccc-v47nm\" (UID: \"74bde2c2-542d-4473-8a2d-4276ef12f1a1\") " pod="openstack/dnsmasq-dns-5cbd7f9ccc-v47nm" Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.489731 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hjfr\" (UniqueName: \"kubernetes.io/projected/415c9781-58d2-447a-8e0c-2fed3a02ef09-kube-api-access-6hjfr\") pod \"keystone-bootstrap-26wvp\" (UID: \"415c9781-58d2-447a-8e0c-2fed3a02ef09\") " pod="openstack/keystone-bootstrap-26wvp" Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.489808 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/415c9781-58d2-447a-8e0c-2fed3a02ef09-config-data\") pod \"keystone-bootstrap-26wvp\" (UID: \"415c9781-58d2-447a-8e0c-2fed3a02ef09\") " pod="openstack/keystone-bootstrap-26wvp" Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.489886 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/415c9781-58d2-447a-8e0c-2fed3a02ef09-credential-keys\") pod \"keystone-bootstrap-26wvp\" (UID: \"415c9781-58d2-447a-8e0c-2fed3a02ef09\") " pod="openstack/keystone-bootstrap-26wvp" Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.490260 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/74bde2c2-542d-4473-8a2d-4276ef12f1a1-dns-svc\") pod \"dnsmasq-dns-5cbd7f9ccc-v47nm\" (UID: \"74bde2c2-542d-4473-8a2d-4276ef12f1a1\") " pod="openstack/dnsmasq-dns-5cbd7f9ccc-v47nm" Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.490366 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74bde2c2-542d-4473-8a2d-4276ef12f1a1-config\") pod \"dnsmasq-dns-5cbd7f9ccc-v47nm\" (UID: \"74bde2c2-542d-4473-8a2d-4276ef12f1a1\") " pod="openstack/dnsmasq-dns-5cbd7f9ccc-v47nm" Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.490574 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/74bde2c2-542d-4473-8a2d-4276ef12f1a1-ovsdbserver-sb\") pod \"dnsmasq-dns-5cbd7f9ccc-v47nm\" (UID: \"74bde2c2-542d-4473-8a2d-4276ef12f1a1\") " pod="openstack/dnsmasq-dns-5cbd7f9ccc-v47nm" Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.492885 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/415c9781-58d2-447a-8e0c-2fed3a02ef09-scripts\") pod \"keystone-bootstrap-26wvp\" (UID: \"415c9781-58d2-447a-8e0c-2fed3a02ef09\") " pod="openstack/keystone-bootstrap-26wvp" Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.493181 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/415c9781-58d2-447a-8e0c-2fed3a02ef09-fernet-keys\") pod \"keystone-bootstrap-26wvp\" (UID: \"415c9781-58d2-447a-8e0c-2fed3a02ef09\") " 
pod="openstack/keystone-bootstrap-26wvp" Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.493313 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/415c9781-58d2-447a-8e0c-2fed3a02ef09-config-data\") pod \"keystone-bootstrap-26wvp\" (UID: \"415c9781-58d2-447a-8e0c-2fed3a02ef09\") " pod="openstack/keystone-bootstrap-26wvp" Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.493419 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/415c9781-58d2-447a-8e0c-2fed3a02ef09-credential-keys\") pod \"keystone-bootstrap-26wvp\" (UID: \"415c9781-58d2-447a-8e0c-2fed3a02ef09\") " pod="openstack/keystone-bootstrap-26wvp" Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.494125 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/415c9781-58d2-447a-8e0c-2fed3a02ef09-combined-ca-bundle\") pod \"keystone-bootstrap-26wvp\" (UID: \"415c9781-58d2-447a-8e0c-2fed3a02ef09\") " pod="openstack/keystone-bootstrap-26wvp" Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.505321 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g49lp\" (UniqueName: \"kubernetes.io/projected/74bde2c2-542d-4473-8a2d-4276ef12f1a1-kube-api-access-g49lp\") pod \"dnsmasq-dns-5cbd7f9ccc-v47nm\" (UID: \"74bde2c2-542d-4473-8a2d-4276ef12f1a1\") " pod="openstack/dnsmasq-dns-5cbd7f9ccc-v47nm" Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.505745 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hjfr\" (UniqueName: \"kubernetes.io/projected/415c9781-58d2-447a-8e0c-2fed3a02ef09-kube-api-access-6hjfr\") pod \"keystone-bootstrap-26wvp\" (UID: \"415c9781-58d2-447a-8e0c-2fed3a02ef09\") " pod="openstack/keystone-bootstrap-26wvp" Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.580005 
4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cbd7f9ccc-v47nm" Feb 19 22:52:29 crc kubenswrapper[4795]: I0219 22:52:29.657611 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-26wvp" Feb 19 22:52:30 crc kubenswrapper[4795]: W0219 22:52:30.034946 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod74bde2c2_542d_4473_8a2d_4276ef12f1a1.slice/crio-bf13f0e9bcad39a75b168c83b41452ae37c85cfd3c851a7d568e51da749b6e4f WatchSource:0}: Error finding container bf13f0e9bcad39a75b168c83b41452ae37c85cfd3c851a7d568e51da749b6e4f: Status 404 returned error can't find the container with id bf13f0e9bcad39a75b168c83b41452ae37c85cfd3c851a7d568e51da749b6e4f Feb 19 22:52:30 crc kubenswrapper[4795]: I0219 22:52:30.035868 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5cbd7f9ccc-v47nm"] Feb 19 22:52:30 crc kubenswrapper[4795]: I0219 22:52:30.108692 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-26wvp"] Feb 19 22:52:31 crc kubenswrapper[4795]: I0219 22:52:31.033513 4795 generic.go:334] "Generic (PLEG): container finished" podID="74bde2c2-542d-4473-8a2d-4276ef12f1a1" containerID="16b74941371be4fa54bab7807a1264eec35a2ff59fcbcca048e96bbfbf300be4" exitCode=0 Feb 19 22:52:31 crc kubenswrapper[4795]: I0219 22:52:31.033593 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cbd7f9ccc-v47nm" event={"ID":"74bde2c2-542d-4473-8a2d-4276ef12f1a1","Type":"ContainerDied","Data":"16b74941371be4fa54bab7807a1264eec35a2ff59fcbcca048e96bbfbf300be4"} Feb 19 22:52:31 crc kubenswrapper[4795]: I0219 22:52:31.033957 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cbd7f9ccc-v47nm" 
event={"ID":"74bde2c2-542d-4473-8a2d-4276ef12f1a1","Type":"ContainerStarted","Data":"bf13f0e9bcad39a75b168c83b41452ae37c85cfd3c851a7d568e51da749b6e4f"} Feb 19 22:52:31 crc kubenswrapper[4795]: I0219 22:52:31.035798 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-26wvp" event={"ID":"415c9781-58d2-447a-8e0c-2fed3a02ef09","Type":"ContainerStarted","Data":"ccef51520029c7e6d5f912ec1258a3f7218e74073cc8ab1ae05322f167a9a678"} Feb 19 22:52:31 crc kubenswrapper[4795]: I0219 22:52:31.035857 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-26wvp" event={"ID":"415c9781-58d2-447a-8e0c-2fed3a02ef09","Type":"ContainerStarted","Data":"3c586b18913896c81e735ec4b2b818346aead905ba07882b9ec1c256d7a50791"} Feb 19 22:52:31 crc kubenswrapper[4795]: I0219 22:52:31.097610 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-26wvp" podStartSLOduration=2.097583323 podStartE2EDuration="2.097583323s" podCreationTimestamp="2026-02-19 22:52:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:52:31.086757012 +0000 UTC m=+5062.279274886" watchObservedRunningTime="2026-02-19 22:52:31.097583323 +0000 UTC m=+5062.290101227" Feb 19 22:52:32 crc kubenswrapper[4795]: I0219 22:52:32.045489 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cbd7f9ccc-v47nm" event={"ID":"74bde2c2-542d-4473-8a2d-4276ef12f1a1","Type":"ContainerStarted","Data":"1e453cedb2505c25dd86ba43beea6d20a98c7545840d561df3aaa00ec8232701"} Feb 19 22:52:32 crc kubenswrapper[4795]: I0219 22:52:32.071202 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5cbd7f9ccc-v47nm" podStartSLOduration=3.071183499 podStartE2EDuration="3.071183499s" podCreationTimestamp="2026-02-19 22:52:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:52:32.062556622 +0000 UTC m=+5063.255074486" watchObservedRunningTime="2026-02-19 22:52:32.071183499 +0000 UTC m=+5063.263701363" Feb 19 22:52:32 crc kubenswrapper[4795]: I0219 22:52:32.769644 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Feb 19 22:52:33 crc kubenswrapper[4795]: I0219 22:52:33.056620 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5cbd7f9ccc-v47nm" Feb 19 22:52:34 crc kubenswrapper[4795]: I0219 22:52:34.067985 4795 generic.go:334] "Generic (PLEG): container finished" podID="415c9781-58d2-447a-8e0c-2fed3a02ef09" containerID="ccef51520029c7e6d5f912ec1258a3f7218e74073cc8ab1ae05322f167a9a678" exitCode=0 Feb 19 22:52:34 crc kubenswrapper[4795]: I0219 22:52:34.068099 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-26wvp" event={"ID":"415c9781-58d2-447a-8e0c-2fed3a02ef09","Type":"ContainerDied","Data":"ccef51520029c7e6d5f912ec1258a3f7218e74073cc8ab1ae05322f167a9a678"} Feb 19 22:52:35 crc kubenswrapper[4795]: I0219 22:52:35.444541 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-26wvp" Feb 19 22:52:35 crc kubenswrapper[4795]: I0219 22:52:35.517468 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/415c9781-58d2-447a-8e0c-2fed3a02ef09-credential-keys\") pod \"415c9781-58d2-447a-8e0c-2fed3a02ef09\" (UID: \"415c9781-58d2-447a-8e0c-2fed3a02ef09\") " Feb 19 22:52:35 crc kubenswrapper[4795]: I0219 22:52:35.517560 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/415c9781-58d2-447a-8e0c-2fed3a02ef09-combined-ca-bundle\") pod \"415c9781-58d2-447a-8e0c-2fed3a02ef09\" (UID: \"415c9781-58d2-447a-8e0c-2fed3a02ef09\") " Feb 19 22:52:35 crc kubenswrapper[4795]: I0219 22:52:35.517662 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/415c9781-58d2-447a-8e0c-2fed3a02ef09-scripts\") pod \"415c9781-58d2-447a-8e0c-2fed3a02ef09\" (UID: \"415c9781-58d2-447a-8e0c-2fed3a02ef09\") " Feb 19 22:52:35 crc kubenswrapper[4795]: I0219 22:52:35.517806 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hjfr\" (UniqueName: \"kubernetes.io/projected/415c9781-58d2-447a-8e0c-2fed3a02ef09-kube-api-access-6hjfr\") pod \"415c9781-58d2-447a-8e0c-2fed3a02ef09\" (UID: \"415c9781-58d2-447a-8e0c-2fed3a02ef09\") " Feb 19 22:52:35 crc kubenswrapper[4795]: I0219 22:52:35.517868 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/415c9781-58d2-447a-8e0c-2fed3a02ef09-config-data\") pod \"415c9781-58d2-447a-8e0c-2fed3a02ef09\" (UID: \"415c9781-58d2-447a-8e0c-2fed3a02ef09\") " Feb 19 22:52:35 crc kubenswrapper[4795]: I0219 22:52:35.517925 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" 
(UniqueName: \"kubernetes.io/secret/415c9781-58d2-447a-8e0c-2fed3a02ef09-fernet-keys\") pod \"415c9781-58d2-447a-8e0c-2fed3a02ef09\" (UID: \"415c9781-58d2-447a-8e0c-2fed3a02ef09\") " Feb 19 22:52:35 crc kubenswrapper[4795]: I0219 22:52:35.523859 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/415c9781-58d2-447a-8e0c-2fed3a02ef09-kube-api-access-6hjfr" (OuterVolumeSpecName: "kube-api-access-6hjfr") pod "415c9781-58d2-447a-8e0c-2fed3a02ef09" (UID: "415c9781-58d2-447a-8e0c-2fed3a02ef09"). InnerVolumeSpecName "kube-api-access-6hjfr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:52:35 crc kubenswrapper[4795]: I0219 22:52:35.532309 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/415c9781-58d2-447a-8e0c-2fed3a02ef09-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "415c9781-58d2-447a-8e0c-2fed3a02ef09" (UID: "415c9781-58d2-447a-8e0c-2fed3a02ef09"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:52:35 crc kubenswrapper[4795]: I0219 22:52:35.534434 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/415c9781-58d2-447a-8e0c-2fed3a02ef09-scripts" (OuterVolumeSpecName: "scripts") pod "415c9781-58d2-447a-8e0c-2fed3a02ef09" (UID: "415c9781-58d2-447a-8e0c-2fed3a02ef09"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:52:35 crc kubenswrapper[4795]: I0219 22:52:35.537216 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/415c9781-58d2-447a-8e0c-2fed3a02ef09-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "415c9781-58d2-447a-8e0c-2fed3a02ef09" (UID: "415c9781-58d2-447a-8e0c-2fed3a02ef09"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:52:35 crc kubenswrapper[4795]: I0219 22:52:35.550341 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/415c9781-58d2-447a-8e0c-2fed3a02ef09-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "415c9781-58d2-447a-8e0c-2fed3a02ef09" (UID: "415c9781-58d2-447a-8e0c-2fed3a02ef09"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:52:35 crc kubenswrapper[4795]: I0219 22:52:35.559004 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/415c9781-58d2-447a-8e0c-2fed3a02ef09-config-data" (OuterVolumeSpecName: "config-data") pod "415c9781-58d2-447a-8e0c-2fed3a02ef09" (UID: "415c9781-58d2-447a-8e0c-2fed3a02ef09"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:52:35 crc kubenswrapper[4795]: I0219 22:52:35.620288 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6hjfr\" (UniqueName: \"kubernetes.io/projected/415c9781-58d2-447a-8e0c-2fed3a02ef09-kube-api-access-6hjfr\") on node \"crc\" DevicePath \"\"" Feb 19 22:52:35 crc kubenswrapper[4795]: I0219 22:52:35.620329 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/415c9781-58d2-447a-8e0c-2fed3a02ef09-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 22:52:35 crc kubenswrapper[4795]: I0219 22:52:35.620341 4795 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/415c9781-58d2-447a-8e0c-2fed3a02ef09-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 19 22:52:35 crc kubenswrapper[4795]: I0219 22:52:35.620351 4795 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/415c9781-58d2-447a-8e0c-2fed3a02ef09-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 
19 22:52:35 crc kubenswrapper[4795]: I0219 22:52:35.620362 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/415c9781-58d2-447a-8e0c-2fed3a02ef09-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 22:52:35 crc kubenswrapper[4795]: I0219 22:52:35.620373 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/415c9781-58d2-447a-8e0c-2fed3a02ef09-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 22:52:36 crc kubenswrapper[4795]: I0219 22:52:36.086549 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-26wvp" event={"ID":"415c9781-58d2-447a-8e0c-2fed3a02ef09","Type":"ContainerDied","Data":"3c586b18913896c81e735ec4b2b818346aead905ba07882b9ec1c256d7a50791"} Feb 19 22:52:36 crc kubenswrapper[4795]: I0219 22:52:36.086598 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c586b18913896c81e735ec4b2b818346aead905ba07882b9ec1c256d7a50791" Feb 19 22:52:36 crc kubenswrapper[4795]: I0219 22:52:36.086677 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-26wvp" Feb 19 22:52:36 crc kubenswrapper[4795]: I0219 22:52:36.190149 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-26wvp"] Feb 19 22:52:36 crc kubenswrapper[4795]: I0219 22:52:36.202939 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-26wvp"] Feb 19 22:52:36 crc kubenswrapper[4795]: I0219 22:52:36.270847 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-tghht"] Feb 19 22:52:36 crc kubenswrapper[4795]: E0219 22:52:36.271393 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="415c9781-58d2-447a-8e0c-2fed3a02ef09" containerName="keystone-bootstrap" Feb 19 22:52:36 crc kubenswrapper[4795]: I0219 22:52:36.271427 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="415c9781-58d2-447a-8e0c-2fed3a02ef09" containerName="keystone-bootstrap" Feb 19 22:52:36 crc kubenswrapper[4795]: I0219 22:52:36.271692 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="415c9781-58d2-447a-8e0c-2fed3a02ef09" containerName="keystone-bootstrap" Feb 19 22:52:36 crc kubenswrapper[4795]: I0219 22:52:36.272505 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-tghht" Feb 19 22:52:36 crc kubenswrapper[4795]: I0219 22:52:36.274725 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 19 22:52:36 crc kubenswrapper[4795]: I0219 22:52:36.275783 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-bpdln" Feb 19 22:52:36 crc kubenswrapper[4795]: I0219 22:52:36.276173 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 19 22:52:36 crc kubenswrapper[4795]: I0219 22:52:36.276430 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 19 22:52:36 crc kubenswrapper[4795]: I0219 22:52:36.277786 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 19 22:52:36 crc kubenswrapper[4795]: I0219 22:52:36.284077 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-tghht"] Feb 19 22:52:36 crc kubenswrapper[4795]: I0219 22:52:36.330321 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe89b6c7-308b-42a8-92a9-da093d6bbae4-combined-ca-bundle\") pod \"keystone-bootstrap-tghht\" (UID: \"fe89b6c7-308b-42a8-92a9-da093d6bbae4\") " pod="openstack/keystone-bootstrap-tghht" Feb 19 22:52:36 crc kubenswrapper[4795]: I0219 22:52:36.330376 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe89b6c7-308b-42a8-92a9-da093d6bbae4-config-data\") pod \"keystone-bootstrap-tghht\" (UID: \"fe89b6c7-308b-42a8-92a9-da093d6bbae4\") " pod="openstack/keystone-bootstrap-tghht" Feb 19 22:52:36 crc kubenswrapper[4795]: I0219 22:52:36.330480 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"credential-keys\" (UniqueName: \"kubernetes.io/secret/fe89b6c7-308b-42a8-92a9-da093d6bbae4-credential-keys\") pod \"keystone-bootstrap-tghht\" (UID: \"fe89b6c7-308b-42a8-92a9-da093d6bbae4\") " pod="openstack/keystone-bootstrap-tghht" Feb 19 22:52:36 crc kubenswrapper[4795]: I0219 22:52:36.330526 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lqz7\" (UniqueName: \"kubernetes.io/projected/fe89b6c7-308b-42a8-92a9-da093d6bbae4-kube-api-access-6lqz7\") pod \"keystone-bootstrap-tghht\" (UID: \"fe89b6c7-308b-42a8-92a9-da093d6bbae4\") " pod="openstack/keystone-bootstrap-tghht" Feb 19 22:52:36 crc kubenswrapper[4795]: I0219 22:52:36.330552 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe89b6c7-308b-42a8-92a9-da093d6bbae4-scripts\") pod \"keystone-bootstrap-tghht\" (UID: \"fe89b6c7-308b-42a8-92a9-da093d6bbae4\") " pod="openstack/keystone-bootstrap-tghht" Feb 19 22:52:36 crc kubenswrapper[4795]: I0219 22:52:36.330599 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fe89b6c7-308b-42a8-92a9-da093d6bbae4-fernet-keys\") pod \"keystone-bootstrap-tghht\" (UID: \"fe89b6c7-308b-42a8-92a9-da093d6bbae4\") " pod="openstack/keystone-bootstrap-tghht" Feb 19 22:52:36 crc kubenswrapper[4795]: I0219 22:52:36.432504 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fe89b6c7-308b-42a8-92a9-da093d6bbae4-credential-keys\") pod \"keystone-bootstrap-tghht\" (UID: \"fe89b6c7-308b-42a8-92a9-da093d6bbae4\") " pod="openstack/keystone-bootstrap-tghht" Feb 19 22:52:36 crc kubenswrapper[4795]: I0219 22:52:36.432564 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lqz7\" (UniqueName: 
\"kubernetes.io/projected/fe89b6c7-308b-42a8-92a9-da093d6bbae4-kube-api-access-6lqz7\") pod \"keystone-bootstrap-tghht\" (UID: \"fe89b6c7-308b-42a8-92a9-da093d6bbae4\") " pod="openstack/keystone-bootstrap-tghht" Feb 19 22:52:36 crc kubenswrapper[4795]: I0219 22:52:36.432586 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe89b6c7-308b-42a8-92a9-da093d6bbae4-scripts\") pod \"keystone-bootstrap-tghht\" (UID: \"fe89b6c7-308b-42a8-92a9-da093d6bbae4\") " pod="openstack/keystone-bootstrap-tghht" Feb 19 22:52:36 crc kubenswrapper[4795]: I0219 22:52:36.432622 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fe89b6c7-308b-42a8-92a9-da093d6bbae4-fernet-keys\") pod \"keystone-bootstrap-tghht\" (UID: \"fe89b6c7-308b-42a8-92a9-da093d6bbae4\") " pod="openstack/keystone-bootstrap-tghht" Feb 19 22:52:36 crc kubenswrapper[4795]: I0219 22:52:36.432699 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe89b6c7-308b-42a8-92a9-da093d6bbae4-combined-ca-bundle\") pod \"keystone-bootstrap-tghht\" (UID: \"fe89b6c7-308b-42a8-92a9-da093d6bbae4\") " pod="openstack/keystone-bootstrap-tghht" Feb 19 22:52:36 crc kubenswrapper[4795]: I0219 22:52:36.432884 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe89b6c7-308b-42a8-92a9-da093d6bbae4-config-data\") pod \"keystone-bootstrap-tghht\" (UID: \"fe89b6c7-308b-42a8-92a9-da093d6bbae4\") " pod="openstack/keystone-bootstrap-tghht" Feb 19 22:52:36 crc kubenswrapper[4795]: I0219 22:52:36.437676 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe89b6c7-308b-42a8-92a9-da093d6bbae4-scripts\") pod \"keystone-bootstrap-tghht\" (UID: 
\"fe89b6c7-308b-42a8-92a9-da093d6bbae4\") " pod="openstack/keystone-bootstrap-tghht" Feb 19 22:52:36 crc kubenswrapper[4795]: I0219 22:52:36.437718 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fe89b6c7-308b-42a8-92a9-da093d6bbae4-credential-keys\") pod \"keystone-bootstrap-tghht\" (UID: \"fe89b6c7-308b-42a8-92a9-da093d6bbae4\") " pod="openstack/keystone-bootstrap-tghht" Feb 19 22:52:36 crc kubenswrapper[4795]: I0219 22:52:36.437952 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fe89b6c7-308b-42a8-92a9-da093d6bbae4-fernet-keys\") pod \"keystone-bootstrap-tghht\" (UID: \"fe89b6c7-308b-42a8-92a9-da093d6bbae4\") " pod="openstack/keystone-bootstrap-tghht" Feb 19 22:52:36 crc kubenswrapper[4795]: I0219 22:52:36.438061 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe89b6c7-308b-42a8-92a9-da093d6bbae4-combined-ca-bundle\") pod \"keystone-bootstrap-tghht\" (UID: \"fe89b6c7-308b-42a8-92a9-da093d6bbae4\") " pod="openstack/keystone-bootstrap-tghht" Feb 19 22:52:36 crc kubenswrapper[4795]: I0219 22:52:36.441901 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe89b6c7-308b-42a8-92a9-da093d6bbae4-config-data\") pod \"keystone-bootstrap-tghht\" (UID: \"fe89b6c7-308b-42a8-92a9-da093d6bbae4\") " pod="openstack/keystone-bootstrap-tghht" Feb 19 22:52:36 crc kubenswrapper[4795]: I0219 22:52:36.455582 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lqz7\" (UniqueName: \"kubernetes.io/projected/fe89b6c7-308b-42a8-92a9-da093d6bbae4-kube-api-access-6lqz7\") pod \"keystone-bootstrap-tghht\" (UID: \"fe89b6c7-308b-42a8-92a9-da093d6bbae4\") " pod="openstack/keystone-bootstrap-tghht" Feb 19 22:52:36 crc kubenswrapper[4795]: I0219 
22:52:36.586791 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-tghht" Feb 19 22:52:37 crc kubenswrapper[4795]: I0219 22:52:37.023588 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-tghht"] Feb 19 22:52:37 crc kubenswrapper[4795]: I0219 22:52:37.094646 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-tghht" event={"ID":"fe89b6c7-308b-42a8-92a9-da093d6bbae4","Type":"ContainerStarted","Data":"a7cebc2666dd4130e92c6c4d3c68c5b05d24b04e3bc0724f42b292ec6cb3c474"} Feb 19 22:52:37 crc kubenswrapper[4795]: I0219 22:52:37.525645 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="415c9781-58d2-447a-8e0c-2fed3a02ef09" path="/var/lib/kubelet/pods/415c9781-58d2-447a-8e0c-2fed3a02ef09/volumes" Feb 19 22:52:38 crc kubenswrapper[4795]: I0219 22:52:38.131562 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-tghht" event={"ID":"fe89b6c7-308b-42a8-92a9-da093d6bbae4","Type":"ContainerStarted","Data":"2119979e9a0784ca3835a55937d4bf0a118f5016d04e68e3c74e02efc1b7df06"} Feb 19 22:52:38 crc kubenswrapper[4795]: I0219 22:52:38.156912 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-tghht" podStartSLOduration=2.15689112 podStartE2EDuration="2.15689112s" podCreationTimestamp="2026-02-19 22:52:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:52:38.149791606 +0000 UTC m=+5069.342309500" watchObservedRunningTime="2026-02-19 22:52:38.15689112 +0000 UTC m=+5069.349409004" Feb 19 22:52:38 crc kubenswrapper[4795]: I0219 22:52:38.512001 4795 scope.go:117] "RemoveContainer" containerID="4bde20d829d39ae7600d60632441672a1a7026beecea3debeab953cdf5de428a" Feb 19 22:52:38 crc kubenswrapper[4795]: E0219 22:52:38.512603 4795 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:52:39 crc kubenswrapper[4795]: I0219 22:52:39.582235 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5cbd7f9ccc-v47nm" Feb 19 22:52:39 crc kubenswrapper[4795]: I0219 22:52:39.654340 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6dc9b786df-rmprl"] Feb 19 22:52:39 crc kubenswrapper[4795]: I0219 22:52:39.654655 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6dc9b786df-rmprl" podUID="ee8a52c2-f6ad-4b2e-a092-9393dac0f15a" containerName="dnsmasq-dns" containerID="cri-o://b0e5926c1df9cf0e08b6183cea78e6fd21fb1a7f2265743460b8010658ff82c0" gracePeriod=10 Feb 19 22:52:40 crc kubenswrapper[4795]: I0219 22:52:40.160745 4795 generic.go:334] "Generic (PLEG): container finished" podID="fe89b6c7-308b-42a8-92a9-da093d6bbae4" containerID="2119979e9a0784ca3835a55937d4bf0a118f5016d04e68e3c74e02efc1b7df06" exitCode=0 Feb 19 22:52:40 crc kubenswrapper[4795]: I0219 22:52:40.160974 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-tghht" event={"ID":"fe89b6c7-308b-42a8-92a9-da093d6bbae4","Type":"ContainerDied","Data":"2119979e9a0784ca3835a55937d4bf0a118f5016d04e68e3c74e02efc1b7df06"} Feb 19 22:52:40 crc kubenswrapper[4795]: I0219 22:52:40.163339 4795 generic.go:334] "Generic (PLEG): container finished" podID="ee8a52c2-f6ad-4b2e-a092-9393dac0f15a" containerID="b0e5926c1df9cf0e08b6183cea78e6fd21fb1a7f2265743460b8010658ff82c0" exitCode=0 Feb 19 22:52:40 crc kubenswrapper[4795]: I0219 22:52:40.163361 4795 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dc9b786df-rmprl" event={"ID":"ee8a52c2-f6ad-4b2e-a092-9393dac0f15a","Type":"ContainerDied","Data":"b0e5926c1df9cf0e08b6183cea78e6fd21fb1a7f2265743460b8010658ff82c0"} Feb 19 22:52:40 crc kubenswrapper[4795]: I0219 22:52:40.246002 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6dc9b786df-rmprl" Feb 19 22:52:40 crc kubenswrapper[4795]: I0219 22:52:40.397135 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee8a52c2-f6ad-4b2e-a092-9393dac0f15a-ovsdbserver-nb\") pod \"ee8a52c2-f6ad-4b2e-a092-9393dac0f15a\" (UID: \"ee8a52c2-f6ad-4b2e-a092-9393dac0f15a\") " Feb 19 22:52:40 crc kubenswrapper[4795]: I0219 22:52:40.397408 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee8a52c2-f6ad-4b2e-a092-9393dac0f15a-ovsdbserver-sb\") pod \"ee8a52c2-f6ad-4b2e-a092-9393dac0f15a\" (UID: \"ee8a52c2-f6ad-4b2e-a092-9393dac0f15a\") " Feb 19 22:52:40 crc kubenswrapper[4795]: I0219 22:52:40.397445 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrnhl\" (UniqueName: \"kubernetes.io/projected/ee8a52c2-f6ad-4b2e-a092-9393dac0f15a-kube-api-access-rrnhl\") pod \"ee8a52c2-f6ad-4b2e-a092-9393dac0f15a\" (UID: \"ee8a52c2-f6ad-4b2e-a092-9393dac0f15a\") " Feb 19 22:52:40 crc kubenswrapper[4795]: I0219 22:52:40.397491 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee8a52c2-f6ad-4b2e-a092-9393dac0f15a-dns-svc\") pod \"ee8a52c2-f6ad-4b2e-a092-9393dac0f15a\" (UID: \"ee8a52c2-f6ad-4b2e-a092-9393dac0f15a\") " Feb 19 22:52:40 crc kubenswrapper[4795]: I0219 22:52:40.397664 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/ee8a52c2-f6ad-4b2e-a092-9393dac0f15a-config\") pod \"ee8a52c2-f6ad-4b2e-a092-9393dac0f15a\" (UID: \"ee8a52c2-f6ad-4b2e-a092-9393dac0f15a\") " Feb 19 22:52:40 crc kubenswrapper[4795]: I0219 22:52:40.404290 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee8a52c2-f6ad-4b2e-a092-9393dac0f15a-kube-api-access-rrnhl" (OuterVolumeSpecName: "kube-api-access-rrnhl") pod "ee8a52c2-f6ad-4b2e-a092-9393dac0f15a" (UID: "ee8a52c2-f6ad-4b2e-a092-9393dac0f15a"). InnerVolumeSpecName "kube-api-access-rrnhl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:52:40 crc kubenswrapper[4795]: I0219 22:52:40.439307 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee8a52c2-f6ad-4b2e-a092-9393dac0f15a-config" (OuterVolumeSpecName: "config") pod "ee8a52c2-f6ad-4b2e-a092-9393dac0f15a" (UID: "ee8a52c2-f6ad-4b2e-a092-9393dac0f15a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:52:40 crc kubenswrapper[4795]: I0219 22:52:40.446844 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee8a52c2-f6ad-4b2e-a092-9393dac0f15a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ee8a52c2-f6ad-4b2e-a092-9393dac0f15a" (UID: "ee8a52c2-f6ad-4b2e-a092-9393dac0f15a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:52:40 crc kubenswrapper[4795]: I0219 22:52:40.452071 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee8a52c2-f6ad-4b2e-a092-9393dac0f15a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ee8a52c2-f6ad-4b2e-a092-9393dac0f15a" (UID: "ee8a52c2-f6ad-4b2e-a092-9393dac0f15a"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:52:40 crc kubenswrapper[4795]: I0219 22:52:40.460544 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee8a52c2-f6ad-4b2e-a092-9393dac0f15a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ee8a52c2-f6ad-4b2e-a092-9393dac0f15a" (UID: "ee8a52c2-f6ad-4b2e-a092-9393dac0f15a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:52:40 crc kubenswrapper[4795]: I0219 22:52:40.499753 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrnhl\" (UniqueName: \"kubernetes.io/projected/ee8a52c2-f6ad-4b2e-a092-9393dac0f15a-kube-api-access-rrnhl\") on node \"crc\" DevicePath \"\"" Feb 19 22:52:40 crc kubenswrapper[4795]: I0219 22:52:40.499785 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee8a52c2-f6ad-4b2e-a092-9393dac0f15a-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 22:52:40 crc kubenswrapper[4795]: I0219 22:52:40.499794 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee8a52c2-f6ad-4b2e-a092-9393dac0f15a-config\") on node \"crc\" DevicePath \"\"" Feb 19 22:52:40 crc kubenswrapper[4795]: I0219 22:52:40.499803 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee8a52c2-f6ad-4b2e-a092-9393dac0f15a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 22:52:40 crc kubenswrapper[4795]: I0219 22:52:40.499813 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee8a52c2-f6ad-4b2e-a092-9393dac0f15a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 22:52:41 crc kubenswrapper[4795]: I0219 22:52:41.181080 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dc9b786df-rmprl" 
event={"ID":"ee8a52c2-f6ad-4b2e-a092-9393dac0f15a","Type":"ContainerDied","Data":"759043739bccdb4e9d8810a99089303a4dcf977d51cfcc31ca61096dfaef3dbf"} Feb 19 22:52:41 crc kubenswrapper[4795]: I0219 22:52:41.184494 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6dc9b786df-rmprl" Feb 19 22:52:41 crc kubenswrapper[4795]: I0219 22:52:41.184567 4795 scope.go:117] "RemoveContainer" containerID="b0e5926c1df9cf0e08b6183cea78e6fd21fb1a7f2265743460b8010658ff82c0" Feb 19 22:52:41 crc kubenswrapper[4795]: I0219 22:52:41.232559 4795 scope.go:117] "RemoveContainer" containerID="02b91837f0ddc7d9ab267bf7d71d729c37ae5422cf7fed3035a9ff30fcca558e" Feb 19 22:52:41 crc kubenswrapper[4795]: I0219 22:52:41.246206 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6dc9b786df-rmprl"] Feb 19 22:52:41 crc kubenswrapper[4795]: I0219 22:52:41.266642 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6dc9b786df-rmprl"] Feb 19 22:52:41 crc kubenswrapper[4795]: I0219 22:52:41.523827 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee8a52c2-f6ad-4b2e-a092-9393dac0f15a" path="/var/lib/kubelet/pods/ee8a52c2-f6ad-4b2e-a092-9393dac0f15a/volumes" Feb 19 22:52:41 crc kubenswrapper[4795]: I0219 22:52:41.573511 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-tghht" Feb 19 22:52:41 crc kubenswrapper[4795]: I0219 22:52:41.719231 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fe89b6c7-308b-42a8-92a9-da093d6bbae4-fernet-keys\") pod \"fe89b6c7-308b-42a8-92a9-da093d6bbae4\" (UID: \"fe89b6c7-308b-42a8-92a9-da093d6bbae4\") " Feb 19 22:52:41 crc kubenswrapper[4795]: I0219 22:52:41.719653 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fe89b6c7-308b-42a8-92a9-da093d6bbae4-credential-keys\") pod \"fe89b6c7-308b-42a8-92a9-da093d6bbae4\" (UID: \"fe89b6c7-308b-42a8-92a9-da093d6bbae4\") " Feb 19 22:52:41 crc kubenswrapper[4795]: I0219 22:52:41.719689 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe89b6c7-308b-42a8-92a9-da093d6bbae4-config-data\") pod \"fe89b6c7-308b-42a8-92a9-da093d6bbae4\" (UID: \"fe89b6c7-308b-42a8-92a9-da093d6bbae4\") " Feb 19 22:52:41 crc kubenswrapper[4795]: I0219 22:52:41.719732 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6lqz7\" (UniqueName: \"kubernetes.io/projected/fe89b6c7-308b-42a8-92a9-da093d6bbae4-kube-api-access-6lqz7\") pod \"fe89b6c7-308b-42a8-92a9-da093d6bbae4\" (UID: \"fe89b6c7-308b-42a8-92a9-da093d6bbae4\") " Feb 19 22:52:41 crc kubenswrapper[4795]: I0219 22:52:41.720327 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe89b6c7-308b-42a8-92a9-da093d6bbae4-scripts\") pod \"fe89b6c7-308b-42a8-92a9-da093d6bbae4\" (UID: \"fe89b6c7-308b-42a8-92a9-da093d6bbae4\") " Feb 19 22:52:41 crc kubenswrapper[4795]: I0219 22:52:41.720401 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/fe89b6c7-308b-42a8-92a9-da093d6bbae4-combined-ca-bundle\") pod \"fe89b6c7-308b-42a8-92a9-da093d6bbae4\" (UID: \"fe89b6c7-308b-42a8-92a9-da093d6bbae4\") " Feb 19 22:52:41 crc kubenswrapper[4795]: I0219 22:52:41.731246 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe89b6c7-308b-42a8-92a9-da093d6bbae4-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "fe89b6c7-308b-42a8-92a9-da093d6bbae4" (UID: "fe89b6c7-308b-42a8-92a9-da093d6bbae4"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:52:41 crc kubenswrapper[4795]: I0219 22:52:41.731935 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe89b6c7-308b-42a8-92a9-da093d6bbae4-scripts" (OuterVolumeSpecName: "scripts") pod "fe89b6c7-308b-42a8-92a9-da093d6bbae4" (UID: "fe89b6c7-308b-42a8-92a9-da093d6bbae4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:52:41 crc kubenswrapper[4795]: I0219 22:52:41.731972 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe89b6c7-308b-42a8-92a9-da093d6bbae4-kube-api-access-6lqz7" (OuterVolumeSpecName: "kube-api-access-6lqz7") pod "fe89b6c7-308b-42a8-92a9-da093d6bbae4" (UID: "fe89b6c7-308b-42a8-92a9-da093d6bbae4"). InnerVolumeSpecName "kube-api-access-6lqz7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:52:41 crc kubenswrapper[4795]: I0219 22:52:41.732389 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe89b6c7-308b-42a8-92a9-da093d6bbae4-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "fe89b6c7-308b-42a8-92a9-da093d6bbae4" (UID: "fe89b6c7-308b-42a8-92a9-da093d6bbae4"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:52:41 crc kubenswrapper[4795]: I0219 22:52:41.739045 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe89b6c7-308b-42a8-92a9-da093d6bbae4-config-data" (OuterVolumeSpecName: "config-data") pod "fe89b6c7-308b-42a8-92a9-da093d6bbae4" (UID: "fe89b6c7-308b-42a8-92a9-da093d6bbae4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:52:41 crc kubenswrapper[4795]: I0219 22:52:41.754860 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe89b6c7-308b-42a8-92a9-da093d6bbae4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fe89b6c7-308b-42a8-92a9-da093d6bbae4" (UID: "fe89b6c7-308b-42a8-92a9-da093d6bbae4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:52:41 crc kubenswrapper[4795]: I0219 22:52:41.822401 4795 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fe89b6c7-308b-42a8-92a9-da093d6bbae4-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 19 22:52:41 crc kubenswrapper[4795]: I0219 22:52:41.822441 4795 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fe89b6c7-308b-42a8-92a9-da093d6bbae4-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 19 22:52:41 crc kubenswrapper[4795]: I0219 22:52:41.822450 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe89b6c7-308b-42a8-92a9-da093d6bbae4-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 22:52:41 crc kubenswrapper[4795]: I0219 22:52:41.822461 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6lqz7\" (UniqueName: \"kubernetes.io/projected/fe89b6c7-308b-42a8-92a9-da093d6bbae4-kube-api-access-6lqz7\") on node \"crc\" DevicePath \"\"" Feb 
19 22:52:41 crc kubenswrapper[4795]: I0219 22:52:41.822470 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe89b6c7-308b-42a8-92a9-da093d6bbae4-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 22:52:41 crc kubenswrapper[4795]: I0219 22:52:41.822478 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe89b6c7-308b-42a8-92a9-da093d6bbae4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 22:52:42 crc kubenswrapper[4795]: I0219 22:52:42.191459 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-tghht" event={"ID":"fe89b6c7-308b-42a8-92a9-da093d6bbae4","Type":"ContainerDied","Data":"a7cebc2666dd4130e92c6c4d3c68c5b05d24b04e3bc0724f42b292ec6cb3c474"} Feb 19 22:52:42 crc kubenswrapper[4795]: I0219 22:52:42.192092 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7cebc2666dd4130e92c6c4d3c68c5b05d24b04e3bc0724f42b292ec6cb3c474" Feb 19 22:52:42 crc kubenswrapper[4795]: I0219 22:52:42.191525 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-tghht" Feb 19 22:52:42 crc kubenswrapper[4795]: I0219 22:52:42.287021 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-689ff8fbd7-j2v4l"] Feb 19 22:52:42 crc kubenswrapper[4795]: E0219 22:52:42.287502 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee8a52c2-f6ad-4b2e-a092-9393dac0f15a" containerName="dnsmasq-dns" Feb 19 22:52:42 crc kubenswrapper[4795]: I0219 22:52:42.287525 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee8a52c2-f6ad-4b2e-a092-9393dac0f15a" containerName="dnsmasq-dns" Feb 19 22:52:42 crc kubenswrapper[4795]: E0219 22:52:42.287574 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee8a52c2-f6ad-4b2e-a092-9393dac0f15a" containerName="init" Feb 19 22:52:42 crc kubenswrapper[4795]: I0219 22:52:42.287583 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee8a52c2-f6ad-4b2e-a092-9393dac0f15a" containerName="init" Feb 19 22:52:42 crc kubenswrapper[4795]: E0219 22:52:42.287593 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe89b6c7-308b-42a8-92a9-da093d6bbae4" containerName="keystone-bootstrap" Feb 19 22:52:42 crc kubenswrapper[4795]: I0219 22:52:42.287603 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe89b6c7-308b-42a8-92a9-da093d6bbae4" containerName="keystone-bootstrap" Feb 19 22:52:42 crc kubenswrapper[4795]: I0219 22:52:42.287830 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee8a52c2-f6ad-4b2e-a092-9393dac0f15a" containerName="dnsmasq-dns" Feb 19 22:52:42 crc kubenswrapper[4795]: I0219 22:52:42.287875 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe89b6c7-308b-42a8-92a9-da093d6bbae4" containerName="keystone-bootstrap" Feb 19 22:52:42 crc kubenswrapper[4795]: I0219 22:52:42.288699 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-689ff8fbd7-j2v4l" Feb 19 22:52:42 crc kubenswrapper[4795]: I0219 22:52:42.290981 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 19 22:52:42 crc kubenswrapper[4795]: I0219 22:52:42.291224 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 19 22:52:42 crc kubenswrapper[4795]: I0219 22:52:42.291270 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 19 22:52:42 crc kubenswrapper[4795]: I0219 22:52:42.291451 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-bpdln" Feb 19 22:52:42 crc kubenswrapper[4795]: I0219 22:52:42.305715 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-689ff8fbd7-j2v4l"] Feb 19 22:52:42 crc kubenswrapper[4795]: I0219 22:52:42.432933 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xh7tz\" (UniqueName: \"kubernetes.io/projected/57c39d61-cab0-49e7-8938-06952896387e-kube-api-access-xh7tz\") pod \"keystone-689ff8fbd7-j2v4l\" (UID: \"57c39d61-cab0-49e7-8938-06952896387e\") " pod="openstack/keystone-689ff8fbd7-j2v4l" Feb 19 22:52:42 crc kubenswrapper[4795]: I0219 22:52:42.433005 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57c39d61-cab0-49e7-8938-06952896387e-config-data\") pod \"keystone-689ff8fbd7-j2v4l\" (UID: \"57c39d61-cab0-49e7-8938-06952896387e\") " pod="openstack/keystone-689ff8fbd7-j2v4l" Feb 19 22:52:42 crc kubenswrapper[4795]: I0219 22:52:42.433028 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57c39d61-cab0-49e7-8938-06952896387e-combined-ca-bundle\") pod 
\"keystone-689ff8fbd7-j2v4l\" (UID: \"57c39d61-cab0-49e7-8938-06952896387e\") " pod="openstack/keystone-689ff8fbd7-j2v4l" Feb 19 22:52:42 crc kubenswrapper[4795]: I0219 22:52:42.433047 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/57c39d61-cab0-49e7-8938-06952896387e-fernet-keys\") pod \"keystone-689ff8fbd7-j2v4l\" (UID: \"57c39d61-cab0-49e7-8938-06952896387e\") " pod="openstack/keystone-689ff8fbd7-j2v4l" Feb 19 22:52:42 crc kubenswrapper[4795]: I0219 22:52:42.433077 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57c39d61-cab0-49e7-8938-06952896387e-scripts\") pod \"keystone-689ff8fbd7-j2v4l\" (UID: \"57c39d61-cab0-49e7-8938-06952896387e\") " pod="openstack/keystone-689ff8fbd7-j2v4l" Feb 19 22:52:42 crc kubenswrapper[4795]: I0219 22:52:42.433134 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/57c39d61-cab0-49e7-8938-06952896387e-credential-keys\") pod \"keystone-689ff8fbd7-j2v4l\" (UID: \"57c39d61-cab0-49e7-8938-06952896387e\") " pod="openstack/keystone-689ff8fbd7-j2v4l" Feb 19 22:52:42 crc kubenswrapper[4795]: I0219 22:52:42.534458 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xh7tz\" (UniqueName: \"kubernetes.io/projected/57c39d61-cab0-49e7-8938-06952896387e-kube-api-access-xh7tz\") pod \"keystone-689ff8fbd7-j2v4l\" (UID: \"57c39d61-cab0-49e7-8938-06952896387e\") " pod="openstack/keystone-689ff8fbd7-j2v4l" Feb 19 22:52:42 crc kubenswrapper[4795]: I0219 22:52:42.534515 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57c39d61-cab0-49e7-8938-06952896387e-config-data\") pod \"keystone-689ff8fbd7-j2v4l\" (UID: 
\"57c39d61-cab0-49e7-8938-06952896387e\") " pod="openstack/keystone-689ff8fbd7-j2v4l" Feb 19 22:52:42 crc kubenswrapper[4795]: I0219 22:52:42.534539 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57c39d61-cab0-49e7-8938-06952896387e-combined-ca-bundle\") pod \"keystone-689ff8fbd7-j2v4l\" (UID: \"57c39d61-cab0-49e7-8938-06952896387e\") " pod="openstack/keystone-689ff8fbd7-j2v4l" Feb 19 22:52:42 crc kubenswrapper[4795]: I0219 22:52:42.534556 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/57c39d61-cab0-49e7-8938-06952896387e-fernet-keys\") pod \"keystone-689ff8fbd7-j2v4l\" (UID: \"57c39d61-cab0-49e7-8938-06952896387e\") " pod="openstack/keystone-689ff8fbd7-j2v4l" Feb 19 22:52:42 crc kubenswrapper[4795]: I0219 22:52:42.534587 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57c39d61-cab0-49e7-8938-06952896387e-scripts\") pod \"keystone-689ff8fbd7-j2v4l\" (UID: \"57c39d61-cab0-49e7-8938-06952896387e\") " pod="openstack/keystone-689ff8fbd7-j2v4l" Feb 19 22:52:42 crc kubenswrapper[4795]: I0219 22:52:42.534611 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/57c39d61-cab0-49e7-8938-06952896387e-credential-keys\") pod \"keystone-689ff8fbd7-j2v4l\" (UID: \"57c39d61-cab0-49e7-8938-06952896387e\") " pod="openstack/keystone-689ff8fbd7-j2v4l" Feb 19 22:52:42 crc kubenswrapper[4795]: I0219 22:52:42.538881 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/57c39d61-cab0-49e7-8938-06952896387e-credential-keys\") pod \"keystone-689ff8fbd7-j2v4l\" (UID: \"57c39d61-cab0-49e7-8938-06952896387e\") " pod="openstack/keystone-689ff8fbd7-j2v4l" Feb 19 22:52:42 crc 
kubenswrapper[4795]: I0219 22:52:42.539045 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/57c39d61-cab0-49e7-8938-06952896387e-fernet-keys\") pod \"keystone-689ff8fbd7-j2v4l\" (UID: \"57c39d61-cab0-49e7-8938-06952896387e\") " pod="openstack/keystone-689ff8fbd7-j2v4l" Feb 19 22:52:42 crc kubenswrapper[4795]: I0219 22:52:42.539605 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57c39d61-cab0-49e7-8938-06952896387e-config-data\") pod \"keystone-689ff8fbd7-j2v4l\" (UID: \"57c39d61-cab0-49e7-8938-06952896387e\") " pod="openstack/keystone-689ff8fbd7-j2v4l" Feb 19 22:52:42 crc kubenswrapper[4795]: I0219 22:52:42.539799 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57c39d61-cab0-49e7-8938-06952896387e-combined-ca-bundle\") pod \"keystone-689ff8fbd7-j2v4l\" (UID: \"57c39d61-cab0-49e7-8938-06952896387e\") " pod="openstack/keystone-689ff8fbd7-j2v4l" Feb 19 22:52:42 crc kubenswrapper[4795]: I0219 22:52:42.540154 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57c39d61-cab0-49e7-8938-06952896387e-scripts\") pod \"keystone-689ff8fbd7-j2v4l\" (UID: \"57c39d61-cab0-49e7-8938-06952896387e\") " pod="openstack/keystone-689ff8fbd7-j2v4l" Feb 19 22:52:42 crc kubenswrapper[4795]: I0219 22:52:42.551338 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xh7tz\" (UniqueName: \"kubernetes.io/projected/57c39d61-cab0-49e7-8938-06952896387e-kube-api-access-xh7tz\") pod \"keystone-689ff8fbd7-j2v4l\" (UID: \"57c39d61-cab0-49e7-8938-06952896387e\") " pod="openstack/keystone-689ff8fbd7-j2v4l" Feb 19 22:52:42 crc kubenswrapper[4795]: I0219 22:52:42.609067 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-689ff8fbd7-j2v4l" Feb 19 22:52:43 crc kubenswrapper[4795]: I0219 22:52:43.022075 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-689ff8fbd7-j2v4l"] Feb 19 22:52:43 crc kubenswrapper[4795]: I0219 22:52:43.202942 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-689ff8fbd7-j2v4l" event={"ID":"57c39d61-cab0-49e7-8938-06952896387e","Type":"ContainerStarted","Data":"a5b34a7617681ccb01188f3f8df30993d2c24e886985165554122e6148e7e686"} Feb 19 22:52:44 crc kubenswrapper[4795]: I0219 22:52:44.217689 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-689ff8fbd7-j2v4l" event={"ID":"57c39d61-cab0-49e7-8938-06952896387e","Type":"ContainerStarted","Data":"498e012de73e1306e1a1f950ab4d480f4813c283f654129da15f41fffb0fd674"} Feb 19 22:52:44 crc kubenswrapper[4795]: I0219 22:52:44.218150 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-689ff8fbd7-j2v4l" Feb 19 22:52:44 crc kubenswrapper[4795]: I0219 22:52:44.246123 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-689ff8fbd7-j2v4l" podStartSLOduration=2.246101869 podStartE2EDuration="2.246101869s" podCreationTimestamp="2026-02-19 22:52:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:52:44.238245934 +0000 UTC m=+5075.430763798" watchObservedRunningTime="2026-02-19 22:52:44.246101869 +0000 UTC m=+5075.438619733" Feb 19 22:52:53 crc kubenswrapper[4795]: I0219 22:52:53.512297 4795 scope.go:117] "RemoveContainer" containerID="4bde20d829d39ae7600d60632441672a1a7026beecea3debeab953cdf5de428a" Feb 19 22:52:53 crc kubenswrapper[4795]: E0219 22:52:53.512933 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:53:06 crc kubenswrapper[4795]: I0219 22:53:06.511458 4795 scope.go:117] "RemoveContainer" containerID="4bde20d829d39ae7600d60632441672a1a7026beecea3debeab953cdf5de428a" Feb 19 22:53:06 crc kubenswrapper[4795]: E0219 22:53:06.512127 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:53:14 crc kubenswrapper[4795]: I0219 22:53:14.006826 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-689ff8fbd7-j2v4l" Feb 19 22:53:18 crc kubenswrapper[4795]: I0219 22:53:18.150696 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 19 22:53:18 crc kubenswrapper[4795]: I0219 22:53:18.153578 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 19 22:53:18 crc kubenswrapper[4795]: I0219 22:53:18.155713 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Feb 19 22:53:18 crc kubenswrapper[4795]: I0219 22:53:18.156271 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-8rdbp" Feb 19 22:53:18 crc kubenswrapper[4795]: I0219 22:53:18.158100 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Feb 19 22:53:18 crc kubenswrapper[4795]: I0219 22:53:18.176556 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 19 22:53:18 crc kubenswrapper[4795]: I0219 22:53:18.206584 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sfjd\" (UniqueName: \"kubernetes.io/projected/54e90f84-703c-41b3-85c2-dd4ce9e3a968-kube-api-access-8sfjd\") pod \"openstackclient\" (UID: \"54e90f84-703c-41b3-85c2-dd4ce9e3a968\") " pod="openstack/openstackclient" Feb 19 22:53:18 crc kubenswrapper[4795]: I0219 22:53:18.206800 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/54e90f84-703c-41b3-85c2-dd4ce9e3a968-openstack-config-secret\") pod \"openstackclient\" (UID: \"54e90f84-703c-41b3-85c2-dd4ce9e3a968\") " pod="openstack/openstackclient" Feb 19 22:53:18 crc kubenswrapper[4795]: I0219 22:53:18.207034 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/54e90f84-703c-41b3-85c2-dd4ce9e3a968-openstack-config\") pod \"openstackclient\" (UID: \"54e90f84-703c-41b3-85c2-dd4ce9e3a968\") " pod="openstack/openstackclient" Feb 19 22:53:18 crc kubenswrapper[4795]: I0219 22:53:18.308019 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/54e90f84-703c-41b3-85c2-dd4ce9e3a968-openstack-config\") pod \"openstackclient\" (UID: \"54e90f84-703c-41b3-85c2-dd4ce9e3a968\") " pod="openstack/openstackclient" Feb 19 22:53:18 crc kubenswrapper[4795]: I0219 22:53:18.308075 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8sfjd\" (UniqueName: \"kubernetes.io/projected/54e90f84-703c-41b3-85c2-dd4ce9e3a968-kube-api-access-8sfjd\") pod \"openstackclient\" (UID: \"54e90f84-703c-41b3-85c2-dd4ce9e3a968\") " pod="openstack/openstackclient" Feb 19 22:53:18 crc kubenswrapper[4795]: I0219 22:53:18.308127 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/54e90f84-703c-41b3-85c2-dd4ce9e3a968-openstack-config-secret\") pod \"openstackclient\" (UID: \"54e90f84-703c-41b3-85c2-dd4ce9e3a968\") " pod="openstack/openstackclient" Feb 19 22:53:18 crc kubenswrapper[4795]: I0219 22:53:18.309070 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/54e90f84-703c-41b3-85c2-dd4ce9e3a968-openstack-config\") pod \"openstackclient\" (UID: \"54e90f84-703c-41b3-85c2-dd4ce9e3a968\") " pod="openstack/openstackclient" Feb 19 22:53:18 crc kubenswrapper[4795]: I0219 22:53:18.315772 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/54e90f84-703c-41b3-85c2-dd4ce9e3a968-openstack-config-secret\") pod \"openstackclient\" (UID: \"54e90f84-703c-41b3-85c2-dd4ce9e3a968\") " pod="openstack/openstackclient" Feb 19 22:53:18 crc kubenswrapper[4795]: I0219 22:53:18.324457 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sfjd\" (UniqueName: 
\"kubernetes.io/projected/54e90f84-703c-41b3-85c2-dd4ce9e3a968-kube-api-access-8sfjd\") pod \"openstackclient\" (UID: \"54e90f84-703c-41b3-85c2-dd4ce9e3a968\") " pod="openstack/openstackclient" Feb 19 22:53:18 crc kubenswrapper[4795]: I0219 22:53:18.510882 4795 scope.go:117] "RemoveContainer" containerID="4bde20d829d39ae7600d60632441672a1a7026beecea3debeab953cdf5de428a" Feb 19 22:53:18 crc kubenswrapper[4795]: E0219 22:53:18.511667 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:53:18 crc kubenswrapper[4795]: I0219 22:53:18.543038 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 19 22:53:18 crc kubenswrapper[4795]: I0219 22:53:18.973360 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 19 22:53:19 crc kubenswrapper[4795]: I0219 22:53:19.507783 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"54e90f84-703c-41b3-85c2-dd4ce9e3a968","Type":"ContainerStarted","Data":"3e0c8bd8a5e5ff3ecf9344765c7baeaf282ae456818a7075cdf1b64ce1d085bd"} Feb 19 22:53:19 crc kubenswrapper[4795]: I0219 22:53:19.508415 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"54e90f84-703c-41b3-85c2-dd4ce9e3a968","Type":"ContainerStarted","Data":"c80b0ea28a23506da3a2169be37d63f31f0ce0b86f7443217beacd84fe00e7de"} Feb 19 22:53:19 crc kubenswrapper[4795]: I0219 22:53:19.539904 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.5398829059999999 
podStartE2EDuration="1.539882906s" podCreationTimestamp="2026-02-19 22:53:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:53:19.533880764 +0000 UTC m=+5110.726398668" watchObservedRunningTime="2026-02-19 22:53:19.539882906 +0000 UTC m=+5110.732400780" Feb 19 22:53:31 crc kubenswrapper[4795]: I0219 22:53:31.309447 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-f8r7g"] Feb 19 22:53:31 crc kubenswrapper[4795]: I0219 22:53:31.315062 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f8r7g" Feb 19 22:53:31 crc kubenswrapper[4795]: I0219 22:53:31.340614 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f8r7g"] Feb 19 22:53:31 crc kubenswrapper[4795]: I0219 22:53:31.448957 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/557ec556-8442-4d6a-a634-4fa240dc96dd-utilities\") pod \"redhat-marketplace-f8r7g\" (UID: \"557ec556-8442-4d6a-a634-4fa240dc96dd\") " pod="openshift-marketplace/redhat-marketplace-f8r7g" Feb 19 22:53:31 crc kubenswrapper[4795]: I0219 22:53:31.449139 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/557ec556-8442-4d6a-a634-4fa240dc96dd-catalog-content\") pod \"redhat-marketplace-f8r7g\" (UID: \"557ec556-8442-4d6a-a634-4fa240dc96dd\") " pod="openshift-marketplace/redhat-marketplace-f8r7g" Feb 19 22:53:31 crc kubenswrapper[4795]: I0219 22:53:31.449214 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frc84\" (UniqueName: \"kubernetes.io/projected/557ec556-8442-4d6a-a634-4fa240dc96dd-kube-api-access-frc84\") pod 
\"redhat-marketplace-f8r7g\" (UID: \"557ec556-8442-4d6a-a634-4fa240dc96dd\") " pod="openshift-marketplace/redhat-marketplace-f8r7g" Feb 19 22:53:31 crc kubenswrapper[4795]: I0219 22:53:31.512313 4795 scope.go:117] "RemoveContainer" containerID="4bde20d829d39ae7600d60632441672a1a7026beecea3debeab953cdf5de428a" Feb 19 22:53:31 crc kubenswrapper[4795]: E0219 22:53:31.512905 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:53:31 crc kubenswrapper[4795]: I0219 22:53:31.550942 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/557ec556-8442-4d6a-a634-4fa240dc96dd-utilities\") pod \"redhat-marketplace-f8r7g\" (UID: \"557ec556-8442-4d6a-a634-4fa240dc96dd\") " pod="openshift-marketplace/redhat-marketplace-f8r7g" Feb 19 22:53:31 crc kubenswrapper[4795]: I0219 22:53:31.551055 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/557ec556-8442-4d6a-a634-4fa240dc96dd-catalog-content\") pod \"redhat-marketplace-f8r7g\" (UID: \"557ec556-8442-4d6a-a634-4fa240dc96dd\") " pod="openshift-marketplace/redhat-marketplace-f8r7g" Feb 19 22:53:31 crc kubenswrapper[4795]: I0219 22:53:31.551081 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frc84\" (UniqueName: \"kubernetes.io/projected/557ec556-8442-4d6a-a634-4fa240dc96dd-kube-api-access-frc84\") pod \"redhat-marketplace-f8r7g\" (UID: \"557ec556-8442-4d6a-a634-4fa240dc96dd\") " pod="openshift-marketplace/redhat-marketplace-f8r7g" Feb 19 
22:53:31 crc kubenswrapper[4795]: I0219 22:53:31.551451 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/557ec556-8442-4d6a-a634-4fa240dc96dd-catalog-content\") pod \"redhat-marketplace-f8r7g\" (UID: \"557ec556-8442-4d6a-a634-4fa240dc96dd\") " pod="openshift-marketplace/redhat-marketplace-f8r7g" Feb 19 22:53:31 crc kubenswrapper[4795]: I0219 22:53:31.551504 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/557ec556-8442-4d6a-a634-4fa240dc96dd-utilities\") pod \"redhat-marketplace-f8r7g\" (UID: \"557ec556-8442-4d6a-a634-4fa240dc96dd\") " pod="openshift-marketplace/redhat-marketplace-f8r7g" Feb 19 22:53:31 crc kubenswrapper[4795]: I0219 22:53:31.570690 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frc84\" (UniqueName: \"kubernetes.io/projected/557ec556-8442-4d6a-a634-4fa240dc96dd-kube-api-access-frc84\") pod \"redhat-marketplace-f8r7g\" (UID: \"557ec556-8442-4d6a-a634-4fa240dc96dd\") " pod="openshift-marketplace/redhat-marketplace-f8r7g" Feb 19 22:53:31 crc kubenswrapper[4795]: I0219 22:53:31.650583 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f8r7g" Feb 19 22:53:32 crc kubenswrapper[4795]: I0219 22:53:32.102877 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f8r7g"] Feb 19 22:53:32 crc kubenswrapper[4795]: W0219 22:53:32.107286 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod557ec556_8442_4d6a_a634_4fa240dc96dd.slice/crio-a6d2f64224f31951b6522ea786610229890b81739479a3ce81f1108e35bc147e WatchSource:0}: Error finding container a6d2f64224f31951b6522ea786610229890b81739479a3ce81f1108e35bc147e: Status 404 returned error can't find the container with id a6d2f64224f31951b6522ea786610229890b81739479a3ce81f1108e35bc147e Feb 19 22:53:32 crc kubenswrapper[4795]: I0219 22:53:32.633599 4795 generic.go:334] "Generic (PLEG): container finished" podID="557ec556-8442-4d6a-a634-4fa240dc96dd" containerID="c231894f928ff6006d3795f32c07693544e63e24b0bda123c750e9a9591a85cb" exitCode=0 Feb 19 22:53:32 crc kubenswrapper[4795]: I0219 22:53:32.633649 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f8r7g" event={"ID":"557ec556-8442-4d6a-a634-4fa240dc96dd","Type":"ContainerDied","Data":"c231894f928ff6006d3795f32c07693544e63e24b0bda123c750e9a9591a85cb"} Feb 19 22:53:32 crc kubenswrapper[4795]: I0219 22:53:32.633680 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f8r7g" event={"ID":"557ec556-8442-4d6a-a634-4fa240dc96dd","Type":"ContainerStarted","Data":"a6d2f64224f31951b6522ea786610229890b81739479a3ce81f1108e35bc147e"} Feb 19 22:53:32 crc kubenswrapper[4795]: I0219 22:53:32.635700 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 22:53:33 crc kubenswrapper[4795]: I0219 22:53:33.643287 4795 generic.go:334] "Generic (PLEG): container finished" 
podID="557ec556-8442-4d6a-a634-4fa240dc96dd" containerID="a8ba9c147da3d514e3af879474033da9f8b52540193f19e56451a10a7bd8dda7" exitCode=0 Feb 19 22:53:33 crc kubenswrapper[4795]: I0219 22:53:33.643334 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f8r7g" event={"ID":"557ec556-8442-4d6a-a634-4fa240dc96dd","Type":"ContainerDied","Data":"a8ba9c147da3d514e3af879474033da9f8b52540193f19e56451a10a7bd8dda7"} Feb 19 22:53:34 crc kubenswrapper[4795]: I0219 22:53:34.653315 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f8r7g" event={"ID":"557ec556-8442-4d6a-a634-4fa240dc96dd","Type":"ContainerStarted","Data":"4623fadc476945889aff24b06d18aa005efb4422a840cf8233dbc8382806d4e8"} Feb 19 22:53:34 crc kubenswrapper[4795]: I0219 22:53:34.672825 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-f8r7g" podStartSLOduration=2.28759593 podStartE2EDuration="3.672806886s" podCreationTimestamp="2026-02-19 22:53:31 +0000 UTC" firstStartedPulling="2026-02-19 22:53:32.635338904 +0000 UTC m=+5123.827856778" lastFinishedPulling="2026-02-19 22:53:34.02054987 +0000 UTC m=+5125.213067734" observedRunningTime="2026-02-19 22:53:34.671290612 +0000 UTC m=+5125.863808476" watchObservedRunningTime="2026-02-19 22:53:34.672806886 +0000 UTC m=+5125.865324760" Feb 19 22:53:41 crc kubenswrapper[4795]: I0219 22:53:41.651654 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-f8r7g" Feb 19 22:53:41 crc kubenswrapper[4795]: I0219 22:53:41.652756 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-f8r7g" Feb 19 22:53:41 crc kubenswrapper[4795]: I0219 22:53:41.698063 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-f8r7g" Feb 19 22:53:41 crc 
kubenswrapper[4795]: I0219 22:53:41.754478 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-f8r7g" Feb 19 22:53:41 crc kubenswrapper[4795]: I0219 22:53:41.942391 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-f8r7g"] Feb 19 22:53:43 crc kubenswrapper[4795]: I0219 22:53:43.723236 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-f8r7g" podUID="557ec556-8442-4d6a-a634-4fa240dc96dd" containerName="registry-server" containerID="cri-o://4623fadc476945889aff24b06d18aa005efb4422a840cf8233dbc8382806d4e8" gracePeriod=2 Feb 19 22:53:44 crc kubenswrapper[4795]: I0219 22:53:44.293461 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f8r7g" Feb 19 22:53:44 crc kubenswrapper[4795]: I0219 22:53:44.420523 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/557ec556-8442-4d6a-a634-4fa240dc96dd-catalog-content\") pod \"557ec556-8442-4d6a-a634-4fa240dc96dd\" (UID: \"557ec556-8442-4d6a-a634-4fa240dc96dd\") " Feb 19 22:53:44 crc kubenswrapper[4795]: I0219 22:53:44.420613 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frc84\" (UniqueName: \"kubernetes.io/projected/557ec556-8442-4d6a-a634-4fa240dc96dd-kube-api-access-frc84\") pod \"557ec556-8442-4d6a-a634-4fa240dc96dd\" (UID: \"557ec556-8442-4d6a-a634-4fa240dc96dd\") " Feb 19 22:53:44 crc kubenswrapper[4795]: I0219 22:53:44.420713 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/557ec556-8442-4d6a-a634-4fa240dc96dd-utilities\") pod \"557ec556-8442-4d6a-a634-4fa240dc96dd\" (UID: \"557ec556-8442-4d6a-a634-4fa240dc96dd\") " Feb 19 22:53:44 crc 
kubenswrapper[4795]: I0219 22:53:44.421888 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/557ec556-8442-4d6a-a634-4fa240dc96dd-utilities" (OuterVolumeSpecName: "utilities") pod "557ec556-8442-4d6a-a634-4fa240dc96dd" (UID: "557ec556-8442-4d6a-a634-4fa240dc96dd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:53:44 crc kubenswrapper[4795]: I0219 22:53:44.426446 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/557ec556-8442-4d6a-a634-4fa240dc96dd-kube-api-access-frc84" (OuterVolumeSpecName: "kube-api-access-frc84") pod "557ec556-8442-4d6a-a634-4fa240dc96dd" (UID: "557ec556-8442-4d6a-a634-4fa240dc96dd"). InnerVolumeSpecName "kube-api-access-frc84". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:53:44 crc kubenswrapper[4795]: I0219 22:53:44.444148 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/557ec556-8442-4d6a-a634-4fa240dc96dd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "557ec556-8442-4d6a-a634-4fa240dc96dd" (UID: "557ec556-8442-4d6a-a634-4fa240dc96dd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:53:44 crc kubenswrapper[4795]: I0219 22:53:44.521994 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/557ec556-8442-4d6a-a634-4fa240dc96dd-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 22:53:44 crc kubenswrapper[4795]: I0219 22:53:44.522032 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frc84\" (UniqueName: \"kubernetes.io/projected/557ec556-8442-4d6a-a634-4fa240dc96dd-kube-api-access-frc84\") on node \"crc\" DevicePath \"\"" Feb 19 22:53:44 crc kubenswrapper[4795]: I0219 22:53:44.522046 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/557ec556-8442-4d6a-a634-4fa240dc96dd-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 22:53:44 crc kubenswrapper[4795]: I0219 22:53:44.735553 4795 generic.go:334] "Generic (PLEG): container finished" podID="557ec556-8442-4d6a-a634-4fa240dc96dd" containerID="4623fadc476945889aff24b06d18aa005efb4422a840cf8233dbc8382806d4e8" exitCode=0 Feb 19 22:53:44 crc kubenswrapper[4795]: I0219 22:53:44.735597 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f8r7g" event={"ID":"557ec556-8442-4d6a-a634-4fa240dc96dd","Type":"ContainerDied","Data":"4623fadc476945889aff24b06d18aa005efb4422a840cf8233dbc8382806d4e8"} Feb 19 22:53:44 crc kubenswrapper[4795]: I0219 22:53:44.735648 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f8r7g" Feb 19 22:53:44 crc kubenswrapper[4795]: I0219 22:53:44.735660 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f8r7g" event={"ID":"557ec556-8442-4d6a-a634-4fa240dc96dd","Type":"ContainerDied","Data":"a6d2f64224f31951b6522ea786610229890b81739479a3ce81f1108e35bc147e"} Feb 19 22:53:44 crc kubenswrapper[4795]: I0219 22:53:44.735690 4795 scope.go:117] "RemoveContainer" containerID="4623fadc476945889aff24b06d18aa005efb4422a840cf8233dbc8382806d4e8" Feb 19 22:53:44 crc kubenswrapper[4795]: I0219 22:53:44.764056 4795 scope.go:117] "RemoveContainer" containerID="a8ba9c147da3d514e3af879474033da9f8b52540193f19e56451a10a7bd8dda7" Feb 19 22:53:44 crc kubenswrapper[4795]: I0219 22:53:44.781844 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-f8r7g"] Feb 19 22:53:44 crc kubenswrapper[4795]: I0219 22:53:44.793067 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-f8r7g"] Feb 19 22:53:44 crc kubenswrapper[4795]: I0219 22:53:44.805835 4795 scope.go:117] "RemoveContainer" containerID="c231894f928ff6006d3795f32c07693544e63e24b0bda123c750e9a9591a85cb" Feb 19 22:53:44 crc kubenswrapper[4795]: I0219 22:53:44.832718 4795 scope.go:117] "RemoveContainer" containerID="4623fadc476945889aff24b06d18aa005efb4422a840cf8233dbc8382806d4e8" Feb 19 22:53:44 crc kubenswrapper[4795]: E0219 22:53:44.833198 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4623fadc476945889aff24b06d18aa005efb4422a840cf8233dbc8382806d4e8\": container with ID starting with 4623fadc476945889aff24b06d18aa005efb4422a840cf8233dbc8382806d4e8 not found: ID does not exist" containerID="4623fadc476945889aff24b06d18aa005efb4422a840cf8233dbc8382806d4e8" Feb 19 22:53:44 crc kubenswrapper[4795]: I0219 22:53:44.833245 4795 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4623fadc476945889aff24b06d18aa005efb4422a840cf8233dbc8382806d4e8"} err="failed to get container status \"4623fadc476945889aff24b06d18aa005efb4422a840cf8233dbc8382806d4e8\": rpc error: code = NotFound desc = could not find container \"4623fadc476945889aff24b06d18aa005efb4422a840cf8233dbc8382806d4e8\": container with ID starting with 4623fadc476945889aff24b06d18aa005efb4422a840cf8233dbc8382806d4e8 not found: ID does not exist" Feb 19 22:53:44 crc kubenswrapper[4795]: I0219 22:53:44.833276 4795 scope.go:117] "RemoveContainer" containerID="a8ba9c147da3d514e3af879474033da9f8b52540193f19e56451a10a7bd8dda7" Feb 19 22:53:44 crc kubenswrapper[4795]: E0219 22:53:44.833720 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8ba9c147da3d514e3af879474033da9f8b52540193f19e56451a10a7bd8dda7\": container with ID starting with a8ba9c147da3d514e3af879474033da9f8b52540193f19e56451a10a7bd8dda7 not found: ID does not exist" containerID="a8ba9c147da3d514e3af879474033da9f8b52540193f19e56451a10a7bd8dda7" Feb 19 22:53:44 crc kubenswrapper[4795]: I0219 22:53:44.833763 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8ba9c147da3d514e3af879474033da9f8b52540193f19e56451a10a7bd8dda7"} err="failed to get container status \"a8ba9c147da3d514e3af879474033da9f8b52540193f19e56451a10a7bd8dda7\": rpc error: code = NotFound desc = could not find container \"a8ba9c147da3d514e3af879474033da9f8b52540193f19e56451a10a7bd8dda7\": container with ID starting with a8ba9c147da3d514e3af879474033da9f8b52540193f19e56451a10a7bd8dda7 not found: ID does not exist" Feb 19 22:53:44 crc kubenswrapper[4795]: I0219 22:53:44.833792 4795 scope.go:117] "RemoveContainer" containerID="c231894f928ff6006d3795f32c07693544e63e24b0bda123c750e9a9591a85cb" Feb 19 22:53:44 crc kubenswrapper[4795]: E0219 
22:53:44.834095 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c231894f928ff6006d3795f32c07693544e63e24b0bda123c750e9a9591a85cb\": container with ID starting with c231894f928ff6006d3795f32c07693544e63e24b0bda123c750e9a9591a85cb not found: ID does not exist" containerID="c231894f928ff6006d3795f32c07693544e63e24b0bda123c750e9a9591a85cb" Feb 19 22:53:44 crc kubenswrapper[4795]: I0219 22:53:44.834147 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c231894f928ff6006d3795f32c07693544e63e24b0bda123c750e9a9591a85cb"} err="failed to get container status \"c231894f928ff6006d3795f32c07693544e63e24b0bda123c750e9a9591a85cb\": rpc error: code = NotFound desc = could not find container \"c231894f928ff6006d3795f32c07693544e63e24b0bda123c750e9a9591a85cb\": container with ID starting with c231894f928ff6006d3795f32c07693544e63e24b0bda123c750e9a9591a85cb not found: ID does not exist" Feb 19 22:53:45 crc kubenswrapper[4795]: I0219 22:53:45.512398 4795 scope.go:117] "RemoveContainer" containerID="4bde20d829d39ae7600d60632441672a1a7026beecea3debeab953cdf5de428a" Feb 19 22:53:45 crc kubenswrapper[4795]: E0219 22:53:45.513382 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:53:45 crc kubenswrapper[4795]: I0219 22:53:45.528567 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="557ec556-8442-4d6a-a634-4fa240dc96dd" path="/var/lib/kubelet/pods/557ec556-8442-4d6a-a634-4fa240dc96dd/volumes" Feb 19 22:53:58 crc kubenswrapper[4795]: I0219 22:53:58.513471 
4795 scope.go:117] "RemoveContainer" containerID="4bde20d829d39ae7600d60632441672a1a7026beecea3debeab953cdf5de428a" Feb 19 22:53:58 crc kubenswrapper[4795]: E0219 22:53:58.514361 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:54:13 crc kubenswrapper[4795]: I0219 22:54:13.514343 4795 scope.go:117] "RemoveContainer" containerID="4bde20d829d39ae7600d60632441672a1a7026beecea3debeab953cdf5de428a" Feb 19 22:54:13 crc kubenswrapper[4795]: E0219 22:54:13.515043 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:54:24 crc kubenswrapper[4795]: I0219 22:54:24.513388 4795 scope.go:117] "RemoveContainer" containerID="4bde20d829d39ae7600d60632441672a1a7026beecea3debeab953cdf5de428a" Feb 19 22:54:24 crc kubenswrapper[4795]: E0219 22:54:24.514373 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:54:36 crc kubenswrapper[4795]: I0219 
22:54:36.511365 4795 scope.go:117] "RemoveContainer" containerID="4bde20d829d39ae7600d60632441672a1a7026beecea3debeab953cdf5de428a" Feb 19 22:54:36 crc kubenswrapper[4795]: E0219 22:54:36.512250 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:54:49 crc kubenswrapper[4795]: I0219 22:54:49.517338 4795 scope.go:117] "RemoveContainer" containerID="4bde20d829d39ae7600d60632441672a1a7026beecea3debeab953cdf5de428a" Feb 19 22:54:49 crc kubenswrapper[4795]: E0219 22:54:49.518030 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:55:00 crc kubenswrapper[4795]: I0219 22:55:00.615447 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-v9rbk"] Feb 19 22:55:00 crc kubenswrapper[4795]: E0219 22:55:00.616341 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="557ec556-8442-4d6a-a634-4fa240dc96dd" containerName="registry-server" Feb 19 22:55:00 crc kubenswrapper[4795]: I0219 22:55:00.616355 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="557ec556-8442-4d6a-a634-4fa240dc96dd" containerName="registry-server" Feb 19 22:55:00 crc kubenswrapper[4795]: E0219 22:55:00.616377 4795 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="557ec556-8442-4d6a-a634-4fa240dc96dd" containerName="extract-utilities" Feb 19 22:55:00 crc kubenswrapper[4795]: I0219 22:55:00.616383 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="557ec556-8442-4d6a-a634-4fa240dc96dd" containerName="extract-utilities" Feb 19 22:55:00 crc kubenswrapper[4795]: E0219 22:55:00.616396 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="557ec556-8442-4d6a-a634-4fa240dc96dd" containerName="extract-content" Feb 19 22:55:00 crc kubenswrapper[4795]: I0219 22:55:00.616402 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="557ec556-8442-4d6a-a634-4fa240dc96dd" containerName="extract-content" Feb 19 22:55:00 crc kubenswrapper[4795]: I0219 22:55:00.616554 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="557ec556-8442-4d6a-a634-4fa240dc96dd" containerName="registry-server" Feb 19 22:55:00 crc kubenswrapper[4795]: I0219 22:55:00.617072 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-v9rbk" Feb 19 22:55:00 crc kubenswrapper[4795]: I0219 22:55:00.625779 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-v9rbk"] Feb 19 22:55:00 crc kubenswrapper[4795]: I0219 22:55:00.701058 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbr6n\" (UniqueName: \"kubernetes.io/projected/c13f05e4-27de-4750-bb9d-008e3a0be0c7-kube-api-access-kbr6n\") pod \"barbican-db-create-v9rbk\" (UID: \"c13f05e4-27de-4750-bb9d-008e3a0be0c7\") " pod="openstack/barbican-db-create-v9rbk" Feb 19 22:55:00 crc kubenswrapper[4795]: I0219 22:55:00.701430 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c13f05e4-27de-4750-bb9d-008e3a0be0c7-operator-scripts\") pod \"barbican-db-create-v9rbk\" (UID: \"c13f05e4-27de-4750-bb9d-008e3a0be0c7\") " 
pod="openstack/barbican-db-create-v9rbk" Feb 19 22:55:00 crc kubenswrapper[4795]: I0219 22:55:00.710977 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-b1d9-account-create-update-bfmjz"] Feb 19 22:55:00 crc kubenswrapper[4795]: I0219 22:55:00.712079 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-b1d9-account-create-update-bfmjz" Feb 19 22:55:00 crc kubenswrapper[4795]: I0219 22:55:00.715125 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Feb 19 22:55:00 crc kubenswrapper[4795]: I0219 22:55:00.721272 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-b1d9-account-create-update-bfmjz"] Feb 19 22:55:00 crc kubenswrapper[4795]: I0219 22:55:00.803191 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4ca6125-46fa-4dd9-8d20-3816b6c09066-operator-scripts\") pod \"barbican-b1d9-account-create-update-bfmjz\" (UID: \"a4ca6125-46fa-4dd9-8d20-3816b6c09066\") " pod="openstack/barbican-b1d9-account-create-update-bfmjz" Feb 19 22:55:00 crc kubenswrapper[4795]: I0219 22:55:00.803267 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbr6n\" (UniqueName: \"kubernetes.io/projected/c13f05e4-27de-4750-bb9d-008e3a0be0c7-kube-api-access-kbr6n\") pod \"barbican-db-create-v9rbk\" (UID: \"c13f05e4-27de-4750-bb9d-008e3a0be0c7\") " pod="openstack/barbican-db-create-v9rbk" Feb 19 22:55:00 crc kubenswrapper[4795]: I0219 22:55:00.803303 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkpcn\" (UniqueName: \"kubernetes.io/projected/a4ca6125-46fa-4dd9-8d20-3816b6c09066-kube-api-access-nkpcn\") pod \"barbican-b1d9-account-create-update-bfmjz\" (UID: \"a4ca6125-46fa-4dd9-8d20-3816b6c09066\") " 
pod="openstack/barbican-b1d9-account-create-update-bfmjz" Feb 19 22:55:00 crc kubenswrapper[4795]: I0219 22:55:00.803381 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c13f05e4-27de-4750-bb9d-008e3a0be0c7-operator-scripts\") pod \"barbican-db-create-v9rbk\" (UID: \"c13f05e4-27de-4750-bb9d-008e3a0be0c7\") " pod="openstack/barbican-db-create-v9rbk" Feb 19 22:55:00 crc kubenswrapper[4795]: I0219 22:55:00.804201 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c13f05e4-27de-4750-bb9d-008e3a0be0c7-operator-scripts\") pod \"barbican-db-create-v9rbk\" (UID: \"c13f05e4-27de-4750-bb9d-008e3a0be0c7\") " pod="openstack/barbican-db-create-v9rbk" Feb 19 22:55:00 crc kubenswrapper[4795]: I0219 22:55:00.821191 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbr6n\" (UniqueName: \"kubernetes.io/projected/c13f05e4-27de-4750-bb9d-008e3a0be0c7-kube-api-access-kbr6n\") pod \"barbican-db-create-v9rbk\" (UID: \"c13f05e4-27de-4750-bb9d-008e3a0be0c7\") " pod="openstack/barbican-db-create-v9rbk" Feb 19 22:55:00 crc kubenswrapper[4795]: I0219 22:55:00.904628 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4ca6125-46fa-4dd9-8d20-3816b6c09066-operator-scripts\") pod \"barbican-b1d9-account-create-update-bfmjz\" (UID: \"a4ca6125-46fa-4dd9-8d20-3816b6c09066\") " pod="openstack/barbican-b1d9-account-create-update-bfmjz" Feb 19 22:55:00 crc kubenswrapper[4795]: I0219 22:55:00.904694 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkpcn\" (UniqueName: \"kubernetes.io/projected/a4ca6125-46fa-4dd9-8d20-3816b6c09066-kube-api-access-nkpcn\") pod \"barbican-b1d9-account-create-update-bfmjz\" (UID: \"a4ca6125-46fa-4dd9-8d20-3816b6c09066\") " 
pod="openstack/barbican-b1d9-account-create-update-bfmjz" Feb 19 22:55:00 crc kubenswrapper[4795]: I0219 22:55:00.905424 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4ca6125-46fa-4dd9-8d20-3816b6c09066-operator-scripts\") pod \"barbican-b1d9-account-create-update-bfmjz\" (UID: \"a4ca6125-46fa-4dd9-8d20-3816b6c09066\") " pod="openstack/barbican-b1d9-account-create-update-bfmjz" Feb 19 22:55:00 crc kubenswrapper[4795]: I0219 22:55:00.926463 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkpcn\" (UniqueName: \"kubernetes.io/projected/a4ca6125-46fa-4dd9-8d20-3816b6c09066-kube-api-access-nkpcn\") pod \"barbican-b1d9-account-create-update-bfmjz\" (UID: \"a4ca6125-46fa-4dd9-8d20-3816b6c09066\") " pod="openstack/barbican-b1d9-account-create-update-bfmjz" Feb 19 22:55:00 crc kubenswrapper[4795]: I0219 22:55:00.932602 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-v9rbk" Feb 19 22:55:01 crc kubenswrapper[4795]: I0219 22:55:01.027618 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-b1d9-account-create-update-bfmjz" Feb 19 22:55:01 crc kubenswrapper[4795]: I0219 22:55:01.372503 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-v9rbk"] Feb 19 22:55:01 crc kubenswrapper[4795]: I0219 22:55:01.409475 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-v9rbk" event={"ID":"c13f05e4-27de-4750-bb9d-008e3a0be0c7","Type":"ContainerStarted","Data":"7c97af9ccf6f31d23cbe3a3815205b5749e3e039902947753042820a7aade8f7"} Feb 19 22:55:01 crc kubenswrapper[4795]: I0219 22:55:01.463604 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-b1d9-account-create-update-bfmjz"] Feb 19 22:55:01 crc kubenswrapper[4795]: W0219 22:55:01.465392 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4ca6125_46fa_4dd9_8d20_3816b6c09066.slice/crio-41ebf5dff4b7d3713e2e96a9fc11933327b20a4c76864580a6e60d66ebb7a6b3 WatchSource:0}: Error finding container 41ebf5dff4b7d3713e2e96a9fc11933327b20a4c76864580a6e60d66ebb7a6b3: Status 404 returned error can't find the container with id 41ebf5dff4b7d3713e2e96a9fc11933327b20a4c76864580a6e60d66ebb7a6b3 Feb 19 22:55:02 crc kubenswrapper[4795]: I0219 22:55:02.419906 4795 generic.go:334] "Generic (PLEG): container finished" podID="a4ca6125-46fa-4dd9-8d20-3816b6c09066" containerID="88b5a89b19e9675d8f8f4f6be28cd30648da8baae5b54e90d50ee586416168ce" exitCode=0 Feb 19 22:55:02 crc kubenswrapper[4795]: I0219 22:55:02.419969 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-b1d9-account-create-update-bfmjz" event={"ID":"a4ca6125-46fa-4dd9-8d20-3816b6c09066","Type":"ContainerDied","Data":"88b5a89b19e9675d8f8f4f6be28cd30648da8baae5b54e90d50ee586416168ce"} Feb 19 22:55:02 crc kubenswrapper[4795]: I0219 22:55:02.419996 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-b1d9-account-create-update-bfmjz" event={"ID":"a4ca6125-46fa-4dd9-8d20-3816b6c09066","Type":"ContainerStarted","Data":"41ebf5dff4b7d3713e2e96a9fc11933327b20a4c76864580a6e60d66ebb7a6b3"} Feb 19 22:55:02 crc kubenswrapper[4795]: I0219 22:55:02.421061 4795 generic.go:334] "Generic (PLEG): container finished" podID="c13f05e4-27de-4750-bb9d-008e3a0be0c7" containerID="c70a8662daa881e0007310348920cbd44bed7e794700942de323e6f34a2f57fc" exitCode=0 Feb 19 22:55:02 crc kubenswrapper[4795]: I0219 22:55:02.421079 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-v9rbk" event={"ID":"c13f05e4-27de-4750-bb9d-008e3a0be0c7","Type":"ContainerDied","Data":"c70a8662daa881e0007310348920cbd44bed7e794700942de323e6f34a2f57fc"} Feb 19 22:55:02 crc kubenswrapper[4795]: I0219 22:55:02.511822 4795 scope.go:117] "RemoveContainer" containerID="4bde20d829d39ae7600d60632441672a1a7026beecea3debeab953cdf5de428a" Feb 19 22:55:02 crc kubenswrapper[4795]: E0219 22:55:02.512152 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:55:03 crc kubenswrapper[4795]: I0219 22:55:03.824558 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-v9rbk" Feb 19 22:55:03 crc kubenswrapper[4795]: I0219 22:55:03.910618 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-b1d9-account-create-update-bfmjz" Feb 19 22:55:03 crc kubenswrapper[4795]: I0219 22:55:03.955801 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbr6n\" (UniqueName: \"kubernetes.io/projected/c13f05e4-27de-4750-bb9d-008e3a0be0c7-kube-api-access-kbr6n\") pod \"c13f05e4-27de-4750-bb9d-008e3a0be0c7\" (UID: \"c13f05e4-27de-4750-bb9d-008e3a0be0c7\") " Feb 19 22:55:03 crc kubenswrapper[4795]: I0219 22:55:03.955845 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c13f05e4-27de-4750-bb9d-008e3a0be0c7-operator-scripts\") pod \"c13f05e4-27de-4750-bb9d-008e3a0be0c7\" (UID: \"c13f05e4-27de-4750-bb9d-008e3a0be0c7\") " Feb 19 22:55:03 crc kubenswrapper[4795]: I0219 22:55:03.956771 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c13f05e4-27de-4750-bb9d-008e3a0be0c7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c13f05e4-27de-4750-bb9d-008e3a0be0c7" (UID: "c13f05e4-27de-4750-bb9d-008e3a0be0c7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:55:03 crc kubenswrapper[4795]: I0219 22:55:03.966569 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c13f05e4-27de-4750-bb9d-008e3a0be0c7-kube-api-access-kbr6n" (OuterVolumeSpecName: "kube-api-access-kbr6n") pod "c13f05e4-27de-4750-bb9d-008e3a0be0c7" (UID: "c13f05e4-27de-4750-bb9d-008e3a0be0c7"). InnerVolumeSpecName "kube-api-access-kbr6n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:55:04 crc kubenswrapper[4795]: I0219 22:55:04.057251 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nkpcn\" (UniqueName: \"kubernetes.io/projected/a4ca6125-46fa-4dd9-8d20-3816b6c09066-kube-api-access-nkpcn\") pod \"a4ca6125-46fa-4dd9-8d20-3816b6c09066\" (UID: \"a4ca6125-46fa-4dd9-8d20-3816b6c09066\") " Feb 19 22:55:04 crc kubenswrapper[4795]: I0219 22:55:04.057415 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4ca6125-46fa-4dd9-8d20-3816b6c09066-operator-scripts\") pod \"a4ca6125-46fa-4dd9-8d20-3816b6c09066\" (UID: \"a4ca6125-46fa-4dd9-8d20-3816b6c09066\") " Feb 19 22:55:04 crc kubenswrapper[4795]: I0219 22:55:04.057741 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kbr6n\" (UniqueName: \"kubernetes.io/projected/c13f05e4-27de-4750-bb9d-008e3a0be0c7-kube-api-access-kbr6n\") on node \"crc\" DevicePath \"\"" Feb 19 22:55:04 crc kubenswrapper[4795]: I0219 22:55:04.057761 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c13f05e4-27de-4750-bb9d-008e3a0be0c7-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 22:55:04 crc kubenswrapper[4795]: I0219 22:55:04.058218 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4ca6125-46fa-4dd9-8d20-3816b6c09066-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a4ca6125-46fa-4dd9-8d20-3816b6c09066" (UID: "a4ca6125-46fa-4dd9-8d20-3816b6c09066"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:55:04 crc kubenswrapper[4795]: I0219 22:55:04.060449 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4ca6125-46fa-4dd9-8d20-3816b6c09066-kube-api-access-nkpcn" (OuterVolumeSpecName: "kube-api-access-nkpcn") pod "a4ca6125-46fa-4dd9-8d20-3816b6c09066" (UID: "a4ca6125-46fa-4dd9-8d20-3816b6c09066"). InnerVolumeSpecName "kube-api-access-nkpcn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:55:04 crc kubenswrapper[4795]: I0219 22:55:04.159005 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4ca6125-46fa-4dd9-8d20-3816b6c09066-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 22:55:04 crc kubenswrapper[4795]: I0219 22:55:04.159067 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nkpcn\" (UniqueName: \"kubernetes.io/projected/a4ca6125-46fa-4dd9-8d20-3816b6c09066-kube-api-access-nkpcn\") on node \"crc\" DevicePath \"\"" Feb 19 22:55:04 crc kubenswrapper[4795]: I0219 22:55:04.439712 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-v9rbk" event={"ID":"c13f05e4-27de-4750-bb9d-008e3a0be0c7","Type":"ContainerDied","Data":"7c97af9ccf6f31d23cbe3a3815205b5749e3e039902947753042820a7aade8f7"} Feb 19 22:55:04 crc kubenswrapper[4795]: I0219 22:55:04.439749 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-v9rbk" Feb 19 22:55:04 crc kubenswrapper[4795]: I0219 22:55:04.439757 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c97af9ccf6f31d23cbe3a3815205b5749e3e039902947753042820a7aade8f7" Feb 19 22:55:04 crc kubenswrapper[4795]: I0219 22:55:04.441217 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-b1d9-account-create-update-bfmjz" event={"ID":"a4ca6125-46fa-4dd9-8d20-3816b6c09066","Type":"ContainerDied","Data":"41ebf5dff4b7d3713e2e96a9fc11933327b20a4c76864580a6e60d66ebb7a6b3"} Feb 19 22:55:04 crc kubenswrapper[4795]: I0219 22:55:04.441299 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-b1d9-account-create-update-bfmjz" Feb 19 22:55:04 crc kubenswrapper[4795]: I0219 22:55:04.441314 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="41ebf5dff4b7d3713e2e96a9fc11933327b20a4c76864580a6e60d66ebb7a6b3" Feb 19 22:55:04 crc kubenswrapper[4795]: E0219 22:55:04.494542 4795 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc13f05e4_27de_4750_bb9d_008e3a0be0c7.slice/crio-7c97af9ccf6f31d23cbe3a3815205b5749e3e039902947753042820a7aade8f7\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc13f05e4_27de_4750_bb9d_008e3a0be0c7.slice\": RecentStats: unable to find data in memory cache]" Feb 19 22:55:05 crc kubenswrapper[4795]: I0219 22:55:05.060682 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-vrr5x"] Feb 19 22:55:05 crc kubenswrapper[4795]: I0219 22:55:05.067360 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-vrr5x"] Feb 19 22:55:05 crc kubenswrapper[4795]: I0219 
22:55:05.524382 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfa8dda8-f620-4331-8909-b10784ceeab8" path="/var/lib/kubelet/pods/cfa8dda8-f620-4331-8909-b10784ceeab8/volumes" Feb 19 22:55:05 crc kubenswrapper[4795]: I0219 22:55:05.971606 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-f5h94"] Feb 19 22:55:05 crc kubenswrapper[4795]: E0219 22:55:05.972013 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c13f05e4-27de-4750-bb9d-008e3a0be0c7" containerName="mariadb-database-create" Feb 19 22:55:05 crc kubenswrapper[4795]: I0219 22:55:05.972030 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="c13f05e4-27de-4750-bb9d-008e3a0be0c7" containerName="mariadb-database-create" Feb 19 22:55:05 crc kubenswrapper[4795]: E0219 22:55:05.972049 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4ca6125-46fa-4dd9-8d20-3816b6c09066" containerName="mariadb-account-create-update" Feb 19 22:55:05 crc kubenswrapper[4795]: I0219 22:55:05.972058 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4ca6125-46fa-4dd9-8d20-3816b6c09066" containerName="mariadb-account-create-update" Feb 19 22:55:05 crc kubenswrapper[4795]: I0219 22:55:05.972329 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4ca6125-46fa-4dd9-8d20-3816b6c09066" containerName="mariadb-account-create-update" Feb 19 22:55:05 crc kubenswrapper[4795]: I0219 22:55:05.972361 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="c13f05e4-27de-4750-bb9d-008e3a0be0c7" containerName="mariadb-database-create" Feb 19 22:55:05 crc kubenswrapper[4795]: I0219 22:55:05.972964 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-f5h94" Feb 19 22:55:05 crc kubenswrapper[4795]: I0219 22:55:05.976530 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-2lvws" Feb 19 22:55:05 crc kubenswrapper[4795]: I0219 22:55:05.977877 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 19 22:55:05 crc kubenswrapper[4795]: I0219 22:55:05.984339 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-f5h94"] Feb 19 22:55:06 crc kubenswrapper[4795]: I0219 22:55:06.087895 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ace73a97-1b52-4187-a035-df7a08266bab-combined-ca-bundle\") pod \"barbican-db-sync-f5h94\" (UID: \"ace73a97-1b52-4187-a035-df7a08266bab\") " pod="openstack/barbican-db-sync-f5h94" Feb 19 22:55:06 crc kubenswrapper[4795]: I0219 22:55:06.088749 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ace73a97-1b52-4187-a035-df7a08266bab-db-sync-config-data\") pod \"barbican-db-sync-f5h94\" (UID: \"ace73a97-1b52-4187-a035-df7a08266bab\") " pod="openstack/barbican-db-sync-f5h94" Feb 19 22:55:06 crc kubenswrapper[4795]: I0219 22:55:06.088919 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-579zw\" (UniqueName: \"kubernetes.io/projected/ace73a97-1b52-4187-a035-df7a08266bab-kube-api-access-579zw\") pod \"barbican-db-sync-f5h94\" (UID: \"ace73a97-1b52-4187-a035-df7a08266bab\") " pod="openstack/barbican-db-sync-f5h94" Feb 19 22:55:06 crc kubenswrapper[4795]: I0219 22:55:06.190667 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/ace73a97-1b52-4187-a035-df7a08266bab-db-sync-config-data\") pod \"barbican-db-sync-f5h94\" (UID: \"ace73a97-1b52-4187-a035-df7a08266bab\") " pod="openstack/barbican-db-sync-f5h94" Feb 19 22:55:06 crc kubenswrapper[4795]: I0219 22:55:06.191872 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-579zw\" (UniqueName: \"kubernetes.io/projected/ace73a97-1b52-4187-a035-df7a08266bab-kube-api-access-579zw\") pod \"barbican-db-sync-f5h94\" (UID: \"ace73a97-1b52-4187-a035-df7a08266bab\") " pod="openstack/barbican-db-sync-f5h94" Feb 19 22:55:06 crc kubenswrapper[4795]: I0219 22:55:06.192072 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ace73a97-1b52-4187-a035-df7a08266bab-combined-ca-bundle\") pod \"barbican-db-sync-f5h94\" (UID: \"ace73a97-1b52-4187-a035-df7a08266bab\") " pod="openstack/barbican-db-sync-f5h94" Feb 19 22:55:06 crc kubenswrapper[4795]: I0219 22:55:06.197265 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ace73a97-1b52-4187-a035-df7a08266bab-db-sync-config-data\") pod \"barbican-db-sync-f5h94\" (UID: \"ace73a97-1b52-4187-a035-df7a08266bab\") " pod="openstack/barbican-db-sync-f5h94" Feb 19 22:55:06 crc kubenswrapper[4795]: I0219 22:55:06.197351 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ace73a97-1b52-4187-a035-df7a08266bab-combined-ca-bundle\") pod \"barbican-db-sync-f5h94\" (UID: \"ace73a97-1b52-4187-a035-df7a08266bab\") " pod="openstack/barbican-db-sync-f5h94" Feb 19 22:55:06 crc kubenswrapper[4795]: I0219 22:55:06.208668 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-579zw\" (UniqueName: \"kubernetes.io/projected/ace73a97-1b52-4187-a035-df7a08266bab-kube-api-access-579zw\") pod 
\"barbican-db-sync-f5h94\" (UID: \"ace73a97-1b52-4187-a035-df7a08266bab\") " pod="openstack/barbican-db-sync-f5h94" Feb 19 22:55:06 crc kubenswrapper[4795]: I0219 22:55:06.337316 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-f5h94" Feb 19 22:55:06 crc kubenswrapper[4795]: I0219 22:55:06.813659 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-f5h94"] Feb 19 22:55:07 crc kubenswrapper[4795]: I0219 22:55:07.461958 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-f5h94" event={"ID":"ace73a97-1b52-4187-a035-df7a08266bab","Type":"ContainerStarted","Data":"1ad516f68056dfaa3dffe45049adde7607d436761c153d38593aeeda7b4036af"} Feb 19 22:55:07 crc kubenswrapper[4795]: I0219 22:55:07.462317 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-f5h94" event={"ID":"ace73a97-1b52-4187-a035-df7a08266bab","Type":"ContainerStarted","Data":"5ac57d6447f66088480258c408e45417d8d396373de56641f2bcaa0b12a4827f"} Feb 19 22:55:07 crc kubenswrapper[4795]: I0219 22:55:07.492382 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-f5h94" podStartSLOduration=2.492358384 podStartE2EDuration="2.492358384s" podCreationTimestamp="2026-02-19 22:55:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:55:07.484603252 +0000 UTC m=+5218.677121126" watchObservedRunningTime="2026-02-19 22:55:07.492358384 +0000 UTC m=+5218.684876288" Feb 19 22:55:08 crc kubenswrapper[4795]: I0219 22:55:08.474481 4795 generic.go:334] "Generic (PLEG): container finished" podID="ace73a97-1b52-4187-a035-df7a08266bab" containerID="1ad516f68056dfaa3dffe45049adde7607d436761c153d38593aeeda7b4036af" exitCode=0 Feb 19 22:55:08 crc kubenswrapper[4795]: I0219 22:55:08.474571 4795 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/barbican-db-sync-f5h94" event={"ID":"ace73a97-1b52-4187-a035-df7a08266bab","Type":"ContainerDied","Data":"1ad516f68056dfaa3dffe45049adde7607d436761c153d38593aeeda7b4036af"} Feb 19 22:55:09 crc kubenswrapper[4795]: I0219 22:55:09.809610 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-f5h94" Feb 19 22:55:09 crc kubenswrapper[4795]: I0219 22:55:09.860677 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ace73a97-1b52-4187-a035-df7a08266bab-db-sync-config-data\") pod \"ace73a97-1b52-4187-a035-df7a08266bab\" (UID: \"ace73a97-1b52-4187-a035-df7a08266bab\") " Feb 19 22:55:09 crc kubenswrapper[4795]: I0219 22:55:09.860750 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ace73a97-1b52-4187-a035-df7a08266bab-combined-ca-bundle\") pod \"ace73a97-1b52-4187-a035-df7a08266bab\" (UID: \"ace73a97-1b52-4187-a035-df7a08266bab\") " Feb 19 22:55:09 crc kubenswrapper[4795]: I0219 22:55:09.860964 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-579zw\" (UniqueName: \"kubernetes.io/projected/ace73a97-1b52-4187-a035-df7a08266bab-kube-api-access-579zw\") pod \"ace73a97-1b52-4187-a035-df7a08266bab\" (UID: \"ace73a97-1b52-4187-a035-df7a08266bab\") " Feb 19 22:55:09 crc kubenswrapper[4795]: I0219 22:55:09.866915 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ace73a97-1b52-4187-a035-df7a08266bab-kube-api-access-579zw" (OuterVolumeSpecName: "kube-api-access-579zw") pod "ace73a97-1b52-4187-a035-df7a08266bab" (UID: "ace73a97-1b52-4187-a035-df7a08266bab"). InnerVolumeSpecName "kube-api-access-579zw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:55:09 crc kubenswrapper[4795]: I0219 22:55:09.867810 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ace73a97-1b52-4187-a035-df7a08266bab-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "ace73a97-1b52-4187-a035-df7a08266bab" (UID: "ace73a97-1b52-4187-a035-df7a08266bab"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:55:09 crc kubenswrapper[4795]: I0219 22:55:09.884915 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ace73a97-1b52-4187-a035-df7a08266bab-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ace73a97-1b52-4187-a035-df7a08266bab" (UID: "ace73a97-1b52-4187-a035-df7a08266bab"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:55:09 crc kubenswrapper[4795]: I0219 22:55:09.962712 4795 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ace73a97-1b52-4187-a035-df7a08266bab-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 22:55:09 crc kubenswrapper[4795]: I0219 22:55:09.962752 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ace73a97-1b52-4187-a035-df7a08266bab-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 22:55:09 crc kubenswrapper[4795]: I0219 22:55:09.962767 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-579zw\" (UniqueName: \"kubernetes.io/projected/ace73a97-1b52-4187-a035-df7a08266bab-kube-api-access-579zw\") on node \"crc\" DevicePath \"\"" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.491002 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-f5h94" 
event={"ID":"ace73a97-1b52-4187-a035-df7a08266bab","Type":"ContainerDied","Data":"5ac57d6447f66088480258c408e45417d8d396373de56641f2bcaa0b12a4827f"} Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.491047 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ac57d6447f66088480258c408e45417d8d396373de56641f2bcaa0b12a4827f" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.491052 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-f5h94" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.716652 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-7c6d9dfdbf-zg9wc"] Feb 19 22:55:10 crc kubenswrapper[4795]: E0219 22:55:10.717044 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ace73a97-1b52-4187-a035-df7a08266bab" containerName="barbican-db-sync" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.717067 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ace73a97-1b52-4187-a035-df7a08266bab" containerName="barbican-db-sync" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.717319 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="ace73a97-1b52-4187-a035-df7a08266bab" containerName="barbican-db-sync" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.718323 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-7c6d9dfdbf-zg9wc" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.720853 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.721151 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.722296 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-2lvws" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.764002 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-6b9b47c4f6-2kzbr"] Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.768472 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6b9b47c4f6-2kzbr" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.772904 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.775966 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e6a3af4-fd31-411b-833c-5a39501f5d63-logs\") pod \"barbican-worker-7c6d9dfdbf-zg9wc\" (UID: \"3e6a3af4-fd31-411b-833c-5a39501f5d63\") " pod="openstack/barbican-worker-7c6d9dfdbf-zg9wc" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.776039 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e6a3af4-fd31-411b-833c-5a39501f5d63-config-data\") pod \"barbican-worker-7c6d9dfdbf-zg9wc\" (UID: \"3e6a3af4-fd31-411b-833c-5a39501f5d63\") " pod="openstack/barbican-worker-7c6d9dfdbf-zg9wc" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 
22:55:10.776099 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlcxk\" (UniqueName: \"kubernetes.io/projected/3e6a3af4-fd31-411b-833c-5a39501f5d63-kube-api-access-zlcxk\") pod \"barbican-worker-7c6d9dfdbf-zg9wc\" (UID: \"3e6a3af4-fd31-411b-833c-5a39501f5d63\") " pod="openstack/barbican-worker-7c6d9dfdbf-zg9wc" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.776133 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3e6a3af4-fd31-411b-833c-5a39501f5d63-config-data-custom\") pod \"barbican-worker-7c6d9dfdbf-zg9wc\" (UID: \"3e6a3af4-fd31-411b-833c-5a39501f5d63\") " pod="openstack/barbican-worker-7c6d9dfdbf-zg9wc" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.776187 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e6a3af4-fd31-411b-833c-5a39501f5d63-combined-ca-bundle\") pod \"barbican-worker-7c6d9dfdbf-zg9wc\" (UID: \"3e6a3af4-fd31-411b-833c-5a39501f5d63\") " pod="openstack/barbican-worker-7c6d9dfdbf-zg9wc" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.814881 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6b9b47c4f6-2kzbr"] Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.822140 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7c6d9dfdbf-zg9wc"] Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.853384 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5dc65d4d5f-tt9nm"] Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.858348 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5dc65d4d5f-tt9nm" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.878753 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlcxk\" (UniqueName: \"kubernetes.io/projected/3e6a3af4-fd31-411b-833c-5a39501f5d63-kube-api-access-zlcxk\") pod \"barbican-worker-7c6d9dfdbf-zg9wc\" (UID: \"3e6a3af4-fd31-411b-833c-5a39501f5d63\") " pod="openstack/barbican-worker-7c6d9dfdbf-zg9wc" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.878891 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3e6a3af4-fd31-411b-833c-5a39501f5d63-config-data-custom\") pod \"barbican-worker-7c6d9dfdbf-zg9wc\" (UID: \"3e6a3af4-fd31-411b-833c-5a39501f5d63\") " pod="openstack/barbican-worker-7c6d9dfdbf-zg9wc" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.878984 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5bb2f008-145f-4fc9-9d51-065874ab1b1e-config-data-custom\") pod \"barbican-keystone-listener-6b9b47c4f6-2kzbr\" (UID: \"5bb2f008-145f-4fc9-9d51-065874ab1b1e\") " pod="openstack/barbican-keystone-listener-6b9b47c4f6-2kzbr" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.879049 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e6a3af4-fd31-411b-833c-5a39501f5d63-combined-ca-bundle\") pod \"barbican-worker-7c6d9dfdbf-zg9wc\" (UID: \"3e6a3af4-fd31-411b-833c-5a39501f5d63\") " pod="openstack/barbican-worker-7c6d9dfdbf-zg9wc" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.879124 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dh8xg\" (UniqueName: 
\"kubernetes.io/projected/5bb2f008-145f-4fc9-9d51-065874ab1b1e-kube-api-access-dh8xg\") pod \"barbican-keystone-listener-6b9b47c4f6-2kzbr\" (UID: \"5bb2f008-145f-4fc9-9d51-065874ab1b1e\") " pod="openstack/barbican-keystone-listener-6b9b47c4f6-2kzbr" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.879154 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bb2f008-145f-4fc9-9d51-065874ab1b1e-combined-ca-bundle\") pod \"barbican-keystone-listener-6b9b47c4f6-2kzbr\" (UID: \"5bb2f008-145f-4fc9-9d51-065874ab1b1e\") " pod="openstack/barbican-keystone-listener-6b9b47c4f6-2kzbr" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.879229 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bb2f008-145f-4fc9-9d51-065874ab1b1e-config-data\") pod \"barbican-keystone-listener-6b9b47c4f6-2kzbr\" (UID: \"5bb2f008-145f-4fc9-9d51-065874ab1b1e\") " pod="openstack/barbican-keystone-listener-6b9b47c4f6-2kzbr" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.879298 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e6a3af4-fd31-411b-833c-5a39501f5d63-logs\") pod \"barbican-worker-7c6d9dfdbf-zg9wc\" (UID: \"3e6a3af4-fd31-411b-833c-5a39501f5d63\") " pod="openstack/barbican-worker-7c6d9dfdbf-zg9wc" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.879386 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e6a3af4-fd31-411b-833c-5a39501f5d63-config-data\") pod \"barbican-worker-7c6d9dfdbf-zg9wc\" (UID: \"3e6a3af4-fd31-411b-833c-5a39501f5d63\") " pod="openstack/barbican-worker-7c6d9dfdbf-zg9wc" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.879441 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5bb2f008-145f-4fc9-9d51-065874ab1b1e-logs\") pod \"barbican-keystone-listener-6b9b47c4f6-2kzbr\" (UID: \"5bb2f008-145f-4fc9-9d51-065874ab1b1e\") " pod="openstack/barbican-keystone-listener-6b9b47c4f6-2kzbr" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.879918 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5dc65d4d5f-tt9nm"] Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.880478 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e6a3af4-fd31-411b-833c-5a39501f5d63-logs\") pod \"barbican-worker-7c6d9dfdbf-zg9wc\" (UID: \"3e6a3af4-fd31-411b-833c-5a39501f5d63\") " pod="openstack/barbican-worker-7c6d9dfdbf-zg9wc" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.884075 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3e6a3af4-fd31-411b-833c-5a39501f5d63-config-data-custom\") pod \"barbican-worker-7c6d9dfdbf-zg9wc\" (UID: \"3e6a3af4-fd31-411b-833c-5a39501f5d63\") " pod="openstack/barbican-worker-7c6d9dfdbf-zg9wc" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.887676 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e6a3af4-fd31-411b-833c-5a39501f5d63-config-data\") pod \"barbican-worker-7c6d9dfdbf-zg9wc\" (UID: \"3e6a3af4-fd31-411b-833c-5a39501f5d63\") " pod="openstack/barbican-worker-7c6d9dfdbf-zg9wc" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.914116 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e6a3af4-fd31-411b-833c-5a39501f5d63-combined-ca-bundle\") pod \"barbican-worker-7c6d9dfdbf-zg9wc\" (UID: \"3e6a3af4-fd31-411b-833c-5a39501f5d63\") " 
pod="openstack/barbican-worker-7c6d9dfdbf-zg9wc" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.918445 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlcxk\" (UniqueName: \"kubernetes.io/projected/3e6a3af4-fd31-411b-833c-5a39501f5d63-kube-api-access-zlcxk\") pod \"barbican-worker-7c6d9dfdbf-zg9wc\" (UID: \"3e6a3af4-fd31-411b-833c-5a39501f5d63\") " pod="openstack/barbican-worker-7c6d9dfdbf-zg9wc" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.934274 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-57b58f479d-8dz8t"] Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.935946 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-57b58f479d-8dz8t" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.939077 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.956562 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-57b58f479d-8dz8t"] Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.980913 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dh8xg\" (UniqueName: \"kubernetes.io/projected/5bb2f008-145f-4fc9-9d51-065874ab1b1e-kube-api-access-dh8xg\") pod \"barbican-keystone-listener-6b9b47c4f6-2kzbr\" (UID: \"5bb2f008-145f-4fc9-9d51-065874ab1b1e\") " pod="openstack/barbican-keystone-listener-6b9b47c4f6-2kzbr" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.981204 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bb2f008-145f-4fc9-9d51-065874ab1b1e-combined-ca-bundle\") pod \"barbican-keystone-listener-6b9b47c4f6-2kzbr\" (UID: \"5bb2f008-145f-4fc9-9d51-065874ab1b1e\") " pod="openstack/barbican-keystone-listener-6b9b47c4f6-2kzbr" Feb 19 
22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.981345 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bb2f008-145f-4fc9-9d51-065874ab1b1e-config-data\") pod \"barbican-keystone-listener-6b9b47c4f6-2kzbr\" (UID: \"5bb2f008-145f-4fc9-9d51-065874ab1b1e\") " pod="openstack/barbican-keystone-listener-6b9b47c4f6-2kzbr" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.981461 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwwxp\" (UniqueName: \"kubernetes.io/projected/0f5b30f1-1278-4376-b1bc-6e72def4d494-kube-api-access-bwwxp\") pod \"dnsmasq-dns-5dc65d4d5f-tt9nm\" (UID: \"0f5b30f1-1278-4376-b1bc-6e72def4d494\") " pod="openstack/dnsmasq-dns-5dc65d4d5f-tt9nm" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.981606 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0f5b30f1-1278-4376-b1bc-6e72def4d494-ovsdbserver-sb\") pod \"dnsmasq-dns-5dc65d4d5f-tt9nm\" (UID: \"0f5b30f1-1278-4376-b1bc-6e72def4d494\") " pod="openstack/dnsmasq-dns-5dc65d4d5f-tt9nm" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.981727 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30f7c03f-5289-48c5-987e-b808897adc6d-config-data\") pod \"barbican-api-57b58f479d-8dz8t\" (UID: \"30f7c03f-5289-48c5-987e-b808897adc6d\") " pod="openstack/barbican-api-57b58f479d-8dz8t" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.981833 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f5b30f1-1278-4376-b1bc-6e72def4d494-dns-svc\") pod \"dnsmasq-dns-5dc65d4d5f-tt9nm\" (UID: \"0f5b30f1-1278-4376-b1bc-6e72def4d494\") " 
pod="openstack/dnsmasq-dns-5dc65d4d5f-tt9nm" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.981942 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5bb2f008-145f-4fc9-9d51-065874ab1b1e-logs\") pod \"barbican-keystone-listener-6b9b47c4f6-2kzbr\" (UID: \"5bb2f008-145f-4fc9-9d51-065874ab1b1e\") " pod="openstack/barbican-keystone-listener-6b9b47c4f6-2kzbr" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.982079 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30f7c03f-5289-48c5-987e-b808897adc6d-combined-ca-bundle\") pod \"barbican-api-57b58f479d-8dz8t\" (UID: \"30f7c03f-5289-48c5-987e-b808897adc6d\") " pod="openstack/barbican-api-57b58f479d-8dz8t" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.982238 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f5b30f1-1278-4376-b1bc-6e72def4d494-config\") pod \"dnsmasq-dns-5dc65d4d5f-tt9nm\" (UID: \"0f5b30f1-1278-4376-b1bc-6e72def4d494\") " pod="openstack/dnsmasq-dns-5dc65d4d5f-tt9nm" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.982375 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5bb2f008-145f-4fc9-9d51-065874ab1b1e-config-data-custom\") pod \"barbican-keystone-listener-6b9b47c4f6-2kzbr\" (UID: \"5bb2f008-145f-4fc9-9d51-065874ab1b1e\") " pod="openstack/barbican-keystone-listener-6b9b47c4f6-2kzbr" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.982504 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30f7c03f-5289-48c5-987e-b808897adc6d-logs\") pod \"barbican-api-57b58f479d-8dz8t\" (UID: 
\"30f7c03f-5289-48c5-987e-b808897adc6d\") " pod="openstack/barbican-api-57b58f479d-8dz8t" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.982610 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgxb6\" (UniqueName: \"kubernetes.io/projected/30f7c03f-5289-48c5-987e-b808897adc6d-kube-api-access-lgxb6\") pod \"barbican-api-57b58f479d-8dz8t\" (UID: \"30f7c03f-5289-48c5-987e-b808897adc6d\") " pod="openstack/barbican-api-57b58f479d-8dz8t" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.982752 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/30f7c03f-5289-48c5-987e-b808897adc6d-config-data-custom\") pod \"barbican-api-57b58f479d-8dz8t\" (UID: \"30f7c03f-5289-48c5-987e-b808897adc6d\") " pod="openstack/barbican-api-57b58f479d-8dz8t" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.983315 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0f5b30f1-1278-4376-b1bc-6e72def4d494-ovsdbserver-nb\") pod \"dnsmasq-dns-5dc65d4d5f-tt9nm\" (UID: \"0f5b30f1-1278-4376-b1bc-6e72def4d494\") " pod="openstack/dnsmasq-dns-5dc65d4d5f-tt9nm" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.985807 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5bb2f008-145f-4fc9-9d51-065874ab1b1e-logs\") pod \"barbican-keystone-listener-6b9b47c4f6-2kzbr\" (UID: \"5bb2f008-145f-4fc9-9d51-065874ab1b1e\") " pod="openstack/barbican-keystone-listener-6b9b47c4f6-2kzbr" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.986526 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bb2f008-145f-4fc9-9d51-065874ab1b1e-combined-ca-bundle\") pod 
\"barbican-keystone-listener-6b9b47c4f6-2kzbr\" (UID: \"5bb2f008-145f-4fc9-9d51-065874ab1b1e\") " pod="openstack/barbican-keystone-listener-6b9b47c4f6-2kzbr" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.987666 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bb2f008-145f-4fc9-9d51-065874ab1b1e-config-data\") pod \"barbican-keystone-listener-6b9b47c4f6-2kzbr\" (UID: \"5bb2f008-145f-4fc9-9d51-065874ab1b1e\") " pod="openstack/barbican-keystone-listener-6b9b47c4f6-2kzbr" Feb 19 22:55:10 crc kubenswrapper[4795]: I0219 22:55:10.995428 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5bb2f008-145f-4fc9-9d51-065874ab1b1e-config-data-custom\") pod \"barbican-keystone-listener-6b9b47c4f6-2kzbr\" (UID: \"5bb2f008-145f-4fc9-9d51-065874ab1b1e\") " pod="openstack/barbican-keystone-listener-6b9b47c4f6-2kzbr" Feb 19 22:55:11 crc kubenswrapper[4795]: I0219 22:55:11.000896 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dh8xg\" (UniqueName: \"kubernetes.io/projected/5bb2f008-145f-4fc9-9d51-065874ab1b1e-kube-api-access-dh8xg\") pod \"barbican-keystone-listener-6b9b47c4f6-2kzbr\" (UID: \"5bb2f008-145f-4fc9-9d51-065874ab1b1e\") " pod="openstack/barbican-keystone-listener-6b9b47c4f6-2kzbr" Feb 19 22:55:11 crc kubenswrapper[4795]: I0219 22:55:11.035296 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-7c6d9dfdbf-zg9wc" Feb 19 22:55:11 crc kubenswrapper[4795]: I0219 22:55:11.085509 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30f7c03f-5289-48c5-987e-b808897adc6d-config-data\") pod \"barbican-api-57b58f479d-8dz8t\" (UID: \"30f7c03f-5289-48c5-987e-b808897adc6d\") " pod="openstack/barbican-api-57b58f479d-8dz8t" Feb 19 22:55:11 crc kubenswrapper[4795]: I0219 22:55:11.085557 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f5b30f1-1278-4376-b1bc-6e72def4d494-dns-svc\") pod \"dnsmasq-dns-5dc65d4d5f-tt9nm\" (UID: \"0f5b30f1-1278-4376-b1bc-6e72def4d494\") " pod="openstack/dnsmasq-dns-5dc65d4d5f-tt9nm" Feb 19 22:55:11 crc kubenswrapper[4795]: I0219 22:55:11.085638 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30f7c03f-5289-48c5-987e-b808897adc6d-combined-ca-bundle\") pod \"barbican-api-57b58f479d-8dz8t\" (UID: \"30f7c03f-5289-48c5-987e-b808897adc6d\") " pod="openstack/barbican-api-57b58f479d-8dz8t" Feb 19 22:55:11 crc kubenswrapper[4795]: I0219 22:55:11.085717 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f5b30f1-1278-4376-b1bc-6e72def4d494-config\") pod \"dnsmasq-dns-5dc65d4d5f-tt9nm\" (UID: \"0f5b30f1-1278-4376-b1bc-6e72def4d494\") " pod="openstack/dnsmasq-dns-5dc65d4d5f-tt9nm" Feb 19 22:55:11 crc kubenswrapper[4795]: I0219 22:55:11.085788 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30f7c03f-5289-48c5-987e-b808897adc6d-logs\") pod \"barbican-api-57b58f479d-8dz8t\" (UID: \"30f7c03f-5289-48c5-987e-b808897adc6d\") " pod="openstack/barbican-api-57b58f479d-8dz8t" Feb 19 22:55:11 crc 
kubenswrapper[4795]: I0219 22:55:11.085837 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgxb6\" (UniqueName: \"kubernetes.io/projected/30f7c03f-5289-48c5-987e-b808897adc6d-kube-api-access-lgxb6\") pod \"barbican-api-57b58f479d-8dz8t\" (UID: \"30f7c03f-5289-48c5-987e-b808897adc6d\") " pod="openstack/barbican-api-57b58f479d-8dz8t" Feb 19 22:55:11 crc kubenswrapper[4795]: I0219 22:55:11.085858 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/30f7c03f-5289-48c5-987e-b808897adc6d-config-data-custom\") pod \"barbican-api-57b58f479d-8dz8t\" (UID: \"30f7c03f-5289-48c5-987e-b808897adc6d\") " pod="openstack/barbican-api-57b58f479d-8dz8t" Feb 19 22:55:11 crc kubenswrapper[4795]: I0219 22:55:11.085912 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0f5b30f1-1278-4376-b1bc-6e72def4d494-ovsdbserver-nb\") pod \"dnsmasq-dns-5dc65d4d5f-tt9nm\" (UID: \"0f5b30f1-1278-4376-b1bc-6e72def4d494\") " pod="openstack/dnsmasq-dns-5dc65d4d5f-tt9nm" Feb 19 22:55:11 crc kubenswrapper[4795]: I0219 22:55:11.085960 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwwxp\" (UniqueName: \"kubernetes.io/projected/0f5b30f1-1278-4376-b1bc-6e72def4d494-kube-api-access-bwwxp\") pod \"dnsmasq-dns-5dc65d4d5f-tt9nm\" (UID: \"0f5b30f1-1278-4376-b1bc-6e72def4d494\") " pod="openstack/dnsmasq-dns-5dc65d4d5f-tt9nm" Feb 19 22:55:11 crc kubenswrapper[4795]: I0219 22:55:11.086038 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0f5b30f1-1278-4376-b1bc-6e72def4d494-ovsdbserver-sb\") pod \"dnsmasq-dns-5dc65d4d5f-tt9nm\" (UID: \"0f5b30f1-1278-4376-b1bc-6e72def4d494\") " pod="openstack/dnsmasq-dns-5dc65d4d5f-tt9nm" Feb 19 22:55:11 crc 
kubenswrapper[4795]: I0219 22:55:11.089151 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f5b30f1-1278-4376-b1bc-6e72def4d494-dns-svc\") pod \"dnsmasq-dns-5dc65d4d5f-tt9nm\" (UID: \"0f5b30f1-1278-4376-b1bc-6e72def4d494\") " pod="openstack/dnsmasq-dns-5dc65d4d5f-tt9nm" Feb 19 22:55:11 crc kubenswrapper[4795]: I0219 22:55:11.089884 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0f5b30f1-1278-4376-b1bc-6e72def4d494-ovsdbserver-sb\") pod \"dnsmasq-dns-5dc65d4d5f-tt9nm\" (UID: \"0f5b30f1-1278-4376-b1bc-6e72def4d494\") " pod="openstack/dnsmasq-dns-5dc65d4d5f-tt9nm" Feb 19 22:55:11 crc kubenswrapper[4795]: I0219 22:55:11.090582 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30f7c03f-5289-48c5-987e-b808897adc6d-logs\") pod \"barbican-api-57b58f479d-8dz8t\" (UID: \"30f7c03f-5289-48c5-987e-b808897adc6d\") " pod="openstack/barbican-api-57b58f479d-8dz8t" Feb 19 22:55:11 crc kubenswrapper[4795]: I0219 22:55:11.090863 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f5b30f1-1278-4376-b1bc-6e72def4d494-config\") pod \"dnsmasq-dns-5dc65d4d5f-tt9nm\" (UID: \"0f5b30f1-1278-4376-b1bc-6e72def4d494\") " pod="openstack/dnsmasq-dns-5dc65d4d5f-tt9nm" Feb 19 22:55:11 crc kubenswrapper[4795]: I0219 22:55:11.093988 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/30f7c03f-5289-48c5-987e-b808897adc6d-config-data-custom\") pod \"barbican-api-57b58f479d-8dz8t\" (UID: \"30f7c03f-5289-48c5-987e-b808897adc6d\") " pod="openstack/barbican-api-57b58f479d-8dz8t" Feb 19 22:55:11 crc kubenswrapper[4795]: I0219 22:55:11.094115 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/0f5b30f1-1278-4376-b1bc-6e72def4d494-ovsdbserver-nb\") pod \"dnsmasq-dns-5dc65d4d5f-tt9nm\" (UID: \"0f5b30f1-1278-4376-b1bc-6e72def4d494\") " pod="openstack/dnsmasq-dns-5dc65d4d5f-tt9nm" Feb 19 22:55:11 crc kubenswrapper[4795]: I0219 22:55:11.100896 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30f7c03f-5289-48c5-987e-b808897adc6d-config-data\") pod \"barbican-api-57b58f479d-8dz8t\" (UID: \"30f7c03f-5289-48c5-987e-b808897adc6d\") " pod="openstack/barbican-api-57b58f479d-8dz8t" Feb 19 22:55:11 crc kubenswrapper[4795]: I0219 22:55:11.102742 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30f7c03f-5289-48c5-987e-b808897adc6d-combined-ca-bundle\") pod \"barbican-api-57b58f479d-8dz8t\" (UID: \"30f7c03f-5289-48c5-987e-b808897adc6d\") " pod="openstack/barbican-api-57b58f479d-8dz8t" Feb 19 22:55:11 crc kubenswrapper[4795]: I0219 22:55:11.114849 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwwxp\" (UniqueName: \"kubernetes.io/projected/0f5b30f1-1278-4376-b1bc-6e72def4d494-kube-api-access-bwwxp\") pod \"dnsmasq-dns-5dc65d4d5f-tt9nm\" (UID: \"0f5b30f1-1278-4376-b1bc-6e72def4d494\") " pod="openstack/dnsmasq-dns-5dc65d4d5f-tt9nm" Feb 19 22:55:11 crc kubenswrapper[4795]: I0219 22:55:11.115423 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-6b9b47c4f6-2kzbr" Feb 19 22:55:11 crc kubenswrapper[4795]: I0219 22:55:11.116405 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgxb6\" (UniqueName: \"kubernetes.io/projected/30f7c03f-5289-48c5-987e-b808897adc6d-kube-api-access-lgxb6\") pod \"barbican-api-57b58f479d-8dz8t\" (UID: \"30f7c03f-5289-48c5-987e-b808897adc6d\") " pod="openstack/barbican-api-57b58f479d-8dz8t" Feb 19 22:55:11 crc kubenswrapper[4795]: I0219 22:55:11.262784 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5dc65d4d5f-tt9nm" Feb 19 22:55:11 crc kubenswrapper[4795]: I0219 22:55:11.350093 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-57b58f479d-8dz8t" Feb 19 22:55:11 crc kubenswrapper[4795]: I0219 22:55:11.573038 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7c6d9dfdbf-zg9wc"] Feb 19 22:55:11 crc kubenswrapper[4795]: I0219 22:55:11.635369 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5dc65d4d5f-tt9nm"] Feb 19 22:55:11 crc kubenswrapper[4795]: W0219 22:55:11.661300 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f5b30f1_1278_4376_b1bc_6e72def4d494.slice/crio-0ababbb8a0c165fe874ce96e1fb589c25f4808501b4ec3a8ba5a072fa951fb3d WatchSource:0}: Error finding container 0ababbb8a0c165fe874ce96e1fb589c25f4808501b4ec3a8ba5a072fa951fb3d: Status 404 returned error can't find the container with id 0ababbb8a0c165fe874ce96e1fb589c25f4808501b4ec3a8ba5a072fa951fb3d Feb 19 22:55:11 crc kubenswrapper[4795]: I0219 22:55:11.684714 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6b9b47c4f6-2kzbr"] Feb 19 22:55:11 crc kubenswrapper[4795]: I0219 22:55:11.800101 4795 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/barbican-api-57b58f479d-8dz8t"] Feb 19 22:55:11 crc kubenswrapper[4795]: W0219 22:55:11.812789 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod30f7c03f_5289_48c5_987e_b808897adc6d.slice/crio-b91e63e8606c45be88dc366dae5101d108ea95439a06796210749fc2fd80172f WatchSource:0}: Error finding container b91e63e8606c45be88dc366dae5101d108ea95439a06796210749fc2fd80172f: Status 404 returned error can't find the container with id b91e63e8606c45be88dc366dae5101d108ea95439a06796210749fc2fd80172f Feb 19 22:55:12 crc kubenswrapper[4795]: I0219 22:55:12.533428 4795 generic.go:334] "Generic (PLEG): container finished" podID="0f5b30f1-1278-4376-b1bc-6e72def4d494" containerID="da78b2ce5ad3acba003da6948c0acb827331e556f914cb3178cd5862028563a8" exitCode=0 Feb 19 22:55:12 crc kubenswrapper[4795]: I0219 22:55:12.533739 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dc65d4d5f-tt9nm" event={"ID":"0f5b30f1-1278-4376-b1bc-6e72def4d494","Type":"ContainerDied","Data":"da78b2ce5ad3acba003da6948c0acb827331e556f914cb3178cd5862028563a8"} Feb 19 22:55:12 crc kubenswrapper[4795]: I0219 22:55:12.533766 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dc65d4d5f-tt9nm" event={"ID":"0f5b30f1-1278-4376-b1bc-6e72def4d494","Type":"ContainerStarted","Data":"0ababbb8a0c165fe874ce96e1fb589c25f4808501b4ec3a8ba5a072fa951fb3d"} Feb 19 22:55:12 crc kubenswrapper[4795]: I0219 22:55:12.576361 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7c6d9dfdbf-zg9wc" event={"ID":"3e6a3af4-fd31-411b-833c-5a39501f5d63","Type":"ContainerStarted","Data":"3f7b90f71568f9f1e2f76a73b0760f38eef669f889318b4bceecc96cfe93c6f4"} Feb 19 22:55:12 crc kubenswrapper[4795]: I0219 22:55:12.576401 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7c6d9dfdbf-zg9wc" 
event={"ID":"3e6a3af4-fd31-411b-833c-5a39501f5d63","Type":"ContainerStarted","Data":"0e80fadd20836711ca4dabb497dae8bdccfbae769dcdfbca41342b3a99687b54"} Feb 19 22:55:12 crc kubenswrapper[4795]: I0219 22:55:12.576410 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7c6d9dfdbf-zg9wc" event={"ID":"3e6a3af4-fd31-411b-833c-5a39501f5d63","Type":"ContainerStarted","Data":"0d7dbcf09bbce07f0bc9745d4203787115390bb3d7e0d026295e9a5beffcce09"} Feb 19 22:55:12 crc kubenswrapper[4795]: I0219 22:55:12.599371 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-57b58f479d-8dz8t" event={"ID":"30f7c03f-5289-48c5-987e-b808897adc6d","Type":"ContainerStarted","Data":"ce4262fe2a9a2ead03e6081080511b9d9fac63d4f8a914df063115de337c815e"} Feb 19 22:55:12 crc kubenswrapper[4795]: I0219 22:55:12.599413 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-57b58f479d-8dz8t" event={"ID":"30f7c03f-5289-48c5-987e-b808897adc6d","Type":"ContainerStarted","Data":"d23dea4efee1caf9050acb1dd78f0134a6b93ce46cf468a88179d34fc0242e76"} Feb 19 22:55:12 crc kubenswrapper[4795]: I0219 22:55:12.599426 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-57b58f479d-8dz8t" event={"ID":"30f7c03f-5289-48c5-987e-b808897adc6d","Type":"ContainerStarted","Data":"b91e63e8606c45be88dc366dae5101d108ea95439a06796210749fc2fd80172f"} Feb 19 22:55:12 crc kubenswrapper[4795]: I0219 22:55:12.600118 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-57b58f479d-8dz8t" Feb 19 22:55:12 crc kubenswrapper[4795]: I0219 22:55:12.600140 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-57b58f479d-8dz8t" Feb 19 22:55:12 crc kubenswrapper[4795]: I0219 22:55:12.605187 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6b9b47c4f6-2kzbr" 
event={"ID":"5bb2f008-145f-4fc9-9d51-065874ab1b1e","Type":"ContainerStarted","Data":"395dacc5c3c5826c0ae5c0f3234a36b11f81cc6adc9ec428952c743cc6f7deff"} Feb 19 22:55:12 crc kubenswrapper[4795]: I0219 22:55:12.605229 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6b9b47c4f6-2kzbr" event={"ID":"5bb2f008-145f-4fc9-9d51-065874ab1b1e","Type":"ContainerStarted","Data":"36e0fc8410e757037d278e96ae40ef00c2855d84be871f41f02f19cb03dc8817"} Feb 19 22:55:12 crc kubenswrapper[4795]: I0219 22:55:12.605241 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6b9b47c4f6-2kzbr" event={"ID":"5bb2f008-145f-4fc9-9d51-065874ab1b1e","Type":"ContainerStarted","Data":"14c6bafe81a7616551ef14573a02ba392e55faebb3797b3afbaf98c0be1d5e12"} Feb 19 22:55:12 crc kubenswrapper[4795]: I0219 22:55:12.611517 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-7c6d9dfdbf-zg9wc" podStartSLOduration=2.611499639 podStartE2EDuration="2.611499639s" podCreationTimestamp="2026-02-19 22:55:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:55:12.602437989 +0000 UTC m=+5223.794955853" watchObservedRunningTime="2026-02-19 22:55:12.611499639 +0000 UTC m=+5223.804017503" Feb 19 22:55:12 crc kubenswrapper[4795]: I0219 22:55:12.635183 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-57b58f479d-8dz8t" podStartSLOduration=2.635139977 podStartE2EDuration="2.635139977s" podCreationTimestamp="2026-02-19 22:55:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:55:12.617753358 +0000 UTC m=+5223.810271222" watchObservedRunningTime="2026-02-19 22:55:12.635139977 +0000 UTC m=+5223.827657861" Feb 19 22:55:12 crc kubenswrapper[4795]: 
I0219 22:55:12.647961 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-6b9b47c4f6-2kzbr" podStartSLOduration=2.647940284 podStartE2EDuration="2.647940284s" podCreationTimestamp="2026-02-19 22:55:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:55:12.639965165 +0000 UTC m=+5223.832483029" watchObservedRunningTime="2026-02-19 22:55:12.647940284 +0000 UTC m=+5223.840458148" Feb 19 22:55:13 crc kubenswrapper[4795]: I0219 22:55:13.614036 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dc65d4d5f-tt9nm" event={"ID":"0f5b30f1-1278-4376-b1bc-6e72def4d494","Type":"ContainerStarted","Data":"be2f6f4f8f1c84b00fee1744df7613e5b2cb7c534a7ea1884cb10ab8da51e6b9"} Feb 19 22:55:13 crc kubenswrapper[4795]: I0219 22:55:13.630971 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5dc65d4d5f-tt9nm" podStartSLOduration=3.63095722 podStartE2EDuration="3.63095722s" podCreationTimestamp="2026-02-19 22:55:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:55:13.628285743 +0000 UTC m=+5224.820803607" watchObservedRunningTime="2026-02-19 22:55:13.63095722 +0000 UTC m=+5224.823475084" Feb 19 22:55:14 crc kubenswrapper[4795]: I0219 22:55:14.623356 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5dc65d4d5f-tt9nm" Feb 19 22:55:15 crc kubenswrapper[4795]: I0219 22:55:15.265879 4795 scope.go:117] "RemoveContainer" containerID="595d4f05c4b7ec834570db4adf844a0fca41d1bed50151677fc612dd1b0457bc" Feb 19 22:55:15 crc kubenswrapper[4795]: I0219 22:55:15.511198 4795 scope.go:117] "RemoveContainer" containerID="4bde20d829d39ae7600d60632441672a1a7026beecea3debeab953cdf5de428a" Feb 19 22:55:15 crc 
kubenswrapper[4795]: E0219 22:55:15.511576 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:55:21 crc kubenswrapper[4795]: I0219 22:55:21.265340 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5dc65d4d5f-tt9nm" Feb 19 22:55:21 crc kubenswrapper[4795]: I0219 22:55:21.327644 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5cbd7f9ccc-v47nm"] Feb 19 22:55:21 crc kubenswrapper[4795]: I0219 22:55:21.329255 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5cbd7f9ccc-v47nm" podUID="74bde2c2-542d-4473-8a2d-4276ef12f1a1" containerName="dnsmasq-dns" containerID="cri-o://1e453cedb2505c25dd86ba43beea6d20a98c7545840d561df3aaa00ec8232701" gracePeriod=10 Feb 19 22:55:21 crc kubenswrapper[4795]: I0219 22:55:21.702095 4795 generic.go:334] "Generic (PLEG): container finished" podID="74bde2c2-542d-4473-8a2d-4276ef12f1a1" containerID="1e453cedb2505c25dd86ba43beea6d20a98c7545840d561df3aaa00ec8232701" exitCode=0 Feb 19 22:55:21 crc kubenswrapper[4795]: I0219 22:55:21.702138 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cbd7f9ccc-v47nm" event={"ID":"74bde2c2-542d-4473-8a2d-4276ef12f1a1","Type":"ContainerDied","Data":"1e453cedb2505c25dd86ba43beea6d20a98c7545840d561df3aaa00ec8232701"} Feb 19 22:55:21 crc kubenswrapper[4795]: I0219 22:55:21.830750 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5cbd7f9ccc-v47nm" Feb 19 22:55:21 crc kubenswrapper[4795]: I0219 22:55:21.989762 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74bde2c2-542d-4473-8a2d-4276ef12f1a1-config\") pod \"74bde2c2-542d-4473-8a2d-4276ef12f1a1\" (UID: \"74bde2c2-542d-4473-8a2d-4276ef12f1a1\") " Feb 19 22:55:21 crc kubenswrapper[4795]: I0219 22:55:21.990059 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g49lp\" (UniqueName: \"kubernetes.io/projected/74bde2c2-542d-4473-8a2d-4276ef12f1a1-kube-api-access-g49lp\") pod \"74bde2c2-542d-4473-8a2d-4276ef12f1a1\" (UID: \"74bde2c2-542d-4473-8a2d-4276ef12f1a1\") " Feb 19 22:55:21 crc kubenswrapper[4795]: I0219 22:55:21.990116 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/74bde2c2-542d-4473-8a2d-4276ef12f1a1-ovsdbserver-nb\") pod \"74bde2c2-542d-4473-8a2d-4276ef12f1a1\" (UID: \"74bde2c2-542d-4473-8a2d-4276ef12f1a1\") " Feb 19 22:55:21 crc kubenswrapper[4795]: I0219 22:55:21.990235 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/74bde2c2-542d-4473-8a2d-4276ef12f1a1-ovsdbserver-sb\") pod \"74bde2c2-542d-4473-8a2d-4276ef12f1a1\" (UID: \"74bde2c2-542d-4473-8a2d-4276ef12f1a1\") " Feb 19 22:55:21 crc kubenswrapper[4795]: I0219 22:55:21.990265 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/74bde2c2-542d-4473-8a2d-4276ef12f1a1-dns-svc\") pod \"74bde2c2-542d-4473-8a2d-4276ef12f1a1\" (UID: \"74bde2c2-542d-4473-8a2d-4276ef12f1a1\") " Feb 19 22:55:21 crc kubenswrapper[4795]: I0219 22:55:21.999294 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/74bde2c2-542d-4473-8a2d-4276ef12f1a1-kube-api-access-g49lp" (OuterVolumeSpecName: "kube-api-access-g49lp") pod "74bde2c2-542d-4473-8a2d-4276ef12f1a1" (UID: "74bde2c2-542d-4473-8a2d-4276ef12f1a1"). InnerVolumeSpecName "kube-api-access-g49lp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:55:22 crc kubenswrapper[4795]: I0219 22:55:22.034402 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74bde2c2-542d-4473-8a2d-4276ef12f1a1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "74bde2c2-542d-4473-8a2d-4276ef12f1a1" (UID: "74bde2c2-542d-4473-8a2d-4276ef12f1a1"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:55:22 crc kubenswrapper[4795]: I0219 22:55:22.037846 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74bde2c2-542d-4473-8a2d-4276ef12f1a1-config" (OuterVolumeSpecName: "config") pod "74bde2c2-542d-4473-8a2d-4276ef12f1a1" (UID: "74bde2c2-542d-4473-8a2d-4276ef12f1a1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:55:22 crc kubenswrapper[4795]: I0219 22:55:22.052168 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74bde2c2-542d-4473-8a2d-4276ef12f1a1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "74bde2c2-542d-4473-8a2d-4276ef12f1a1" (UID: "74bde2c2-542d-4473-8a2d-4276ef12f1a1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:55:22 crc kubenswrapper[4795]: I0219 22:55:22.064751 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74bde2c2-542d-4473-8a2d-4276ef12f1a1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "74bde2c2-542d-4473-8a2d-4276ef12f1a1" (UID: "74bde2c2-542d-4473-8a2d-4276ef12f1a1"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:55:22 crc kubenswrapper[4795]: I0219 22:55:22.093560 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74bde2c2-542d-4473-8a2d-4276ef12f1a1-config\") on node \"crc\" DevicePath \"\"" Feb 19 22:55:22 crc kubenswrapper[4795]: I0219 22:55:22.093604 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g49lp\" (UniqueName: \"kubernetes.io/projected/74bde2c2-542d-4473-8a2d-4276ef12f1a1-kube-api-access-g49lp\") on node \"crc\" DevicePath \"\"" Feb 19 22:55:22 crc kubenswrapper[4795]: I0219 22:55:22.093619 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/74bde2c2-542d-4473-8a2d-4276ef12f1a1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 22:55:22 crc kubenswrapper[4795]: I0219 22:55:22.093632 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/74bde2c2-542d-4473-8a2d-4276ef12f1a1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 22:55:22 crc kubenswrapper[4795]: I0219 22:55:22.093644 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/74bde2c2-542d-4473-8a2d-4276ef12f1a1-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 22:55:22 crc kubenswrapper[4795]: I0219 22:55:22.713071 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cbd7f9ccc-v47nm" event={"ID":"74bde2c2-542d-4473-8a2d-4276ef12f1a1","Type":"ContainerDied","Data":"bf13f0e9bcad39a75b168c83b41452ae37c85cfd3c851a7d568e51da749b6e4f"} Feb 19 22:55:22 crc kubenswrapper[4795]: I0219 22:55:22.713117 4795 scope.go:117] "RemoveContainer" containerID="1e453cedb2505c25dd86ba43beea6d20a98c7545840d561df3aaa00ec8232701" Feb 19 22:55:22 crc kubenswrapper[4795]: I0219 22:55:22.713272 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5cbd7f9ccc-v47nm"
Feb 19 22:55:22 crc kubenswrapper[4795]: I0219 22:55:22.754849 4795 scope.go:117] "RemoveContainer" containerID="16b74941371be4fa54bab7807a1264eec35a2ff59fcbcca048e96bbfbf300be4"
Feb 19 22:55:22 crc kubenswrapper[4795]: I0219 22:55:22.755853 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5cbd7f9ccc-v47nm"]
Feb 19 22:55:22 crc kubenswrapper[4795]: I0219 22:55:22.779685 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5cbd7f9ccc-v47nm"]
Feb 19 22:55:22 crc kubenswrapper[4795]: I0219 22:55:22.831984 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-57b58f479d-8dz8t"
Feb 19 22:55:22 crc kubenswrapper[4795]: I0219 22:55:22.856677 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-57b58f479d-8dz8t"
Feb 19 22:55:23 crc kubenswrapper[4795]: I0219 22:55:23.528296 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74bde2c2-542d-4473-8a2d-4276ef12f1a1" path="/var/lib/kubelet/pods/74bde2c2-542d-4473-8a2d-4276ef12f1a1/volumes"
Feb 19 22:55:28 crc kubenswrapper[4795]: I0219 22:55:28.512018 4795 scope.go:117] "RemoveContainer" containerID="4bde20d829d39ae7600d60632441672a1a7026beecea3debeab953cdf5de428a"
Feb 19 22:55:28 crc kubenswrapper[4795]: E0219 22:55:28.512860 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 22:55:36 crc kubenswrapper[4795]: I0219 22:55:36.559383 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-xbwb6"]
Feb 19 22:55:36 crc kubenswrapper[4795]: E0219 22:55:36.560247 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74bde2c2-542d-4473-8a2d-4276ef12f1a1" containerName="dnsmasq-dns"
Feb 19 22:55:36 crc kubenswrapper[4795]: I0219 22:55:36.560260 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="74bde2c2-542d-4473-8a2d-4276ef12f1a1" containerName="dnsmasq-dns"
Feb 19 22:55:36 crc kubenswrapper[4795]: E0219 22:55:36.560270 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74bde2c2-542d-4473-8a2d-4276ef12f1a1" containerName="init"
Feb 19 22:55:36 crc kubenswrapper[4795]: I0219 22:55:36.560276 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="74bde2c2-542d-4473-8a2d-4276ef12f1a1" containerName="init"
Feb 19 22:55:36 crc kubenswrapper[4795]: I0219 22:55:36.560430 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="74bde2c2-542d-4473-8a2d-4276ef12f1a1" containerName="dnsmasq-dns"
Feb 19 22:55:36 crc kubenswrapper[4795]: I0219 22:55:36.561007 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-xbwb6"
Feb 19 22:55:36 crc kubenswrapper[4795]: I0219 22:55:36.609008 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-xbwb6"]
Feb 19 22:55:36 crc kubenswrapper[4795]: I0219 22:55:36.664512 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-f69d-account-create-update-gbq6r"]
Feb 19 22:55:36 crc kubenswrapper[4795]: I0219 22:55:36.666468 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f69d-account-create-update-gbq6r"
Feb 19 22:55:36 crc kubenswrapper[4795]: I0219 22:55:36.668773 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret"
Feb 19 22:55:36 crc kubenswrapper[4795]: I0219 22:55:36.675498 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-f69d-account-create-update-gbq6r"]
Feb 19 22:55:36 crc kubenswrapper[4795]: I0219 22:55:36.750188 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kl2rd\" (UniqueName: \"kubernetes.io/projected/c0369c6f-517b-44b8-968a-a3408c6044d6-kube-api-access-kl2rd\") pod \"neutron-db-create-xbwb6\" (UID: \"c0369c6f-517b-44b8-968a-a3408c6044d6\") " pod="openstack/neutron-db-create-xbwb6"
Feb 19 22:55:36 crc kubenswrapper[4795]: I0219 22:55:36.750245 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0369c6f-517b-44b8-968a-a3408c6044d6-operator-scripts\") pod \"neutron-db-create-xbwb6\" (UID: \"c0369c6f-517b-44b8-968a-a3408c6044d6\") " pod="openstack/neutron-db-create-xbwb6"
Feb 19 22:55:36 crc kubenswrapper[4795]: I0219 22:55:36.852217 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bks4\" (UniqueName: \"kubernetes.io/projected/635044d2-10e8-457c-b03e-9507a500c7fe-kube-api-access-4bks4\") pod \"neutron-f69d-account-create-update-gbq6r\" (UID: \"635044d2-10e8-457c-b03e-9507a500c7fe\") " pod="openstack/neutron-f69d-account-create-update-gbq6r"
Feb 19 22:55:36 crc kubenswrapper[4795]: I0219 22:55:36.852328 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/635044d2-10e8-457c-b03e-9507a500c7fe-operator-scripts\") pod \"neutron-f69d-account-create-update-gbq6r\" (UID: \"635044d2-10e8-457c-b03e-9507a500c7fe\") " pod="openstack/neutron-f69d-account-create-update-gbq6r"
Feb 19 22:55:36 crc kubenswrapper[4795]: I0219 22:55:36.852367 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kl2rd\" (UniqueName: \"kubernetes.io/projected/c0369c6f-517b-44b8-968a-a3408c6044d6-kube-api-access-kl2rd\") pod \"neutron-db-create-xbwb6\" (UID: \"c0369c6f-517b-44b8-968a-a3408c6044d6\") " pod="openstack/neutron-db-create-xbwb6"
Feb 19 22:55:36 crc kubenswrapper[4795]: I0219 22:55:36.852389 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0369c6f-517b-44b8-968a-a3408c6044d6-operator-scripts\") pod \"neutron-db-create-xbwb6\" (UID: \"c0369c6f-517b-44b8-968a-a3408c6044d6\") " pod="openstack/neutron-db-create-xbwb6"
Feb 19 22:55:36 crc kubenswrapper[4795]: I0219 22:55:36.853056 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0369c6f-517b-44b8-968a-a3408c6044d6-operator-scripts\") pod \"neutron-db-create-xbwb6\" (UID: \"c0369c6f-517b-44b8-968a-a3408c6044d6\") " pod="openstack/neutron-db-create-xbwb6"
Feb 19 22:55:36 crc kubenswrapper[4795]: I0219 22:55:36.870786 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kl2rd\" (UniqueName: \"kubernetes.io/projected/c0369c6f-517b-44b8-968a-a3408c6044d6-kube-api-access-kl2rd\") pod \"neutron-db-create-xbwb6\" (UID: \"c0369c6f-517b-44b8-968a-a3408c6044d6\") " pod="openstack/neutron-db-create-xbwb6"
Feb 19 22:55:36 crc kubenswrapper[4795]: I0219 22:55:36.882045 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-xbwb6"
Feb 19 22:55:36 crc kubenswrapper[4795]: I0219 22:55:36.953460 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bks4\" (UniqueName: \"kubernetes.io/projected/635044d2-10e8-457c-b03e-9507a500c7fe-kube-api-access-4bks4\") pod \"neutron-f69d-account-create-update-gbq6r\" (UID: \"635044d2-10e8-457c-b03e-9507a500c7fe\") " pod="openstack/neutron-f69d-account-create-update-gbq6r"
Feb 19 22:55:36 crc kubenswrapper[4795]: I0219 22:55:36.953569 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/635044d2-10e8-457c-b03e-9507a500c7fe-operator-scripts\") pod \"neutron-f69d-account-create-update-gbq6r\" (UID: \"635044d2-10e8-457c-b03e-9507a500c7fe\") " pod="openstack/neutron-f69d-account-create-update-gbq6r"
Feb 19 22:55:36 crc kubenswrapper[4795]: I0219 22:55:36.954462 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/635044d2-10e8-457c-b03e-9507a500c7fe-operator-scripts\") pod \"neutron-f69d-account-create-update-gbq6r\" (UID: \"635044d2-10e8-457c-b03e-9507a500c7fe\") " pod="openstack/neutron-f69d-account-create-update-gbq6r"
Feb 19 22:55:36 crc kubenswrapper[4795]: I0219 22:55:36.977508 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bks4\" (UniqueName: \"kubernetes.io/projected/635044d2-10e8-457c-b03e-9507a500c7fe-kube-api-access-4bks4\") pod \"neutron-f69d-account-create-update-gbq6r\" (UID: \"635044d2-10e8-457c-b03e-9507a500c7fe\") " pod="openstack/neutron-f69d-account-create-update-gbq6r"
Feb 19 22:55:36 crc kubenswrapper[4795]: I0219 22:55:36.984598 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f69d-account-create-update-gbq6r"
Feb 19 22:55:37 crc kubenswrapper[4795]: I0219 22:55:37.412937 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-xbwb6"]
Feb 19 22:55:37 crc kubenswrapper[4795]: I0219 22:55:37.566343 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-f69d-account-create-update-gbq6r"]
Feb 19 22:55:37 crc kubenswrapper[4795]: W0219 22:55:37.567812 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod635044d2_10e8_457c_b03e_9507a500c7fe.slice/crio-2fa48bc04516a1246fcfd55e60c6c0e0408ad4f81c112f1c19ad75fdbaeda045 WatchSource:0}: Error finding container 2fa48bc04516a1246fcfd55e60c6c0e0408ad4f81c112f1c19ad75fdbaeda045: Status 404 returned error can't find the container with id 2fa48bc04516a1246fcfd55e60c6c0e0408ad4f81c112f1c19ad75fdbaeda045
Feb 19 22:55:37 crc kubenswrapper[4795]: I0219 22:55:37.821981 4795 generic.go:334] "Generic (PLEG): container finished" podID="c0369c6f-517b-44b8-968a-a3408c6044d6" containerID="82b2542be6e058170f19505c42c4114714e88b52257f1c0e99664dda5eb05f78" exitCode=0
Feb 19 22:55:37 crc kubenswrapper[4795]: I0219 22:55:37.822046 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-xbwb6" event={"ID":"c0369c6f-517b-44b8-968a-a3408c6044d6","Type":"ContainerDied","Data":"82b2542be6e058170f19505c42c4114714e88b52257f1c0e99664dda5eb05f78"}
Feb 19 22:55:37 crc kubenswrapper[4795]: I0219 22:55:37.822072 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-xbwb6" event={"ID":"c0369c6f-517b-44b8-968a-a3408c6044d6","Type":"ContainerStarted","Data":"798c223eb760c97067b30610ba4e1b170b3df8ec61f6c7acc66d63b4565f14de"}
Feb 19 22:55:37 crc kubenswrapper[4795]: I0219 22:55:37.823915 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f69d-account-create-update-gbq6r" event={"ID":"635044d2-10e8-457c-b03e-9507a500c7fe","Type":"ContainerStarted","Data":"247acf34bea1d6df95accdf51f9a624b45c32153a0b9fa97bc69d2358a9601e8"}
Feb 19 22:55:37 crc kubenswrapper[4795]: I0219 22:55:37.823958 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f69d-account-create-update-gbq6r" event={"ID":"635044d2-10e8-457c-b03e-9507a500c7fe","Type":"ContainerStarted","Data":"2fa48bc04516a1246fcfd55e60c6c0e0408ad4f81c112f1c19ad75fdbaeda045"}
Feb 19 22:55:37 crc kubenswrapper[4795]: I0219 22:55:37.856836 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-f69d-account-create-update-gbq6r" podStartSLOduration=1.856816596 podStartE2EDuration="1.856816596s" podCreationTimestamp="2026-02-19 22:55:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:55:37.848721924 +0000 UTC m=+5249.041239788" watchObservedRunningTime="2026-02-19 22:55:37.856816596 +0000 UTC m=+5249.049334460"
Feb 19 22:55:38 crc kubenswrapper[4795]: I0219 22:55:38.836086 4795 generic.go:334] "Generic (PLEG): container finished" podID="635044d2-10e8-457c-b03e-9507a500c7fe" containerID="247acf34bea1d6df95accdf51f9a624b45c32153a0b9fa97bc69d2358a9601e8" exitCode=0
Feb 19 22:55:38 crc kubenswrapper[4795]: I0219 22:55:38.836133 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f69d-account-create-update-gbq6r" event={"ID":"635044d2-10e8-457c-b03e-9507a500c7fe","Type":"ContainerDied","Data":"247acf34bea1d6df95accdf51f9a624b45c32153a0b9fa97bc69d2358a9601e8"}
Feb 19 22:55:39 crc kubenswrapper[4795]: I0219 22:55:39.195690 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-xbwb6"
Feb 19 22:55:39 crc kubenswrapper[4795]: I0219 22:55:39.294432 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0369c6f-517b-44b8-968a-a3408c6044d6-operator-scripts\") pod \"c0369c6f-517b-44b8-968a-a3408c6044d6\" (UID: \"c0369c6f-517b-44b8-968a-a3408c6044d6\") "
Feb 19 22:55:39 crc kubenswrapper[4795]: I0219 22:55:39.294523 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kl2rd\" (UniqueName: \"kubernetes.io/projected/c0369c6f-517b-44b8-968a-a3408c6044d6-kube-api-access-kl2rd\") pod \"c0369c6f-517b-44b8-968a-a3408c6044d6\" (UID: \"c0369c6f-517b-44b8-968a-a3408c6044d6\") "
Feb 19 22:55:39 crc kubenswrapper[4795]: I0219 22:55:39.294949 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0369c6f-517b-44b8-968a-a3408c6044d6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c0369c6f-517b-44b8-968a-a3408c6044d6" (UID: "c0369c6f-517b-44b8-968a-a3408c6044d6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 22:55:39 crc kubenswrapper[4795]: I0219 22:55:39.299847 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0369c6f-517b-44b8-968a-a3408c6044d6-kube-api-access-kl2rd" (OuterVolumeSpecName: "kube-api-access-kl2rd") pod "c0369c6f-517b-44b8-968a-a3408c6044d6" (UID: "c0369c6f-517b-44b8-968a-a3408c6044d6"). InnerVolumeSpecName "kube-api-access-kl2rd". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 22:55:39 crc kubenswrapper[4795]: I0219 22:55:39.396056 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0369c6f-517b-44b8-968a-a3408c6044d6-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 22:55:39 crc kubenswrapper[4795]: I0219 22:55:39.396087 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kl2rd\" (UniqueName: \"kubernetes.io/projected/c0369c6f-517b-44b8-968a-a3408c6044d6-kube-api-access-kl2rd\") on node \"crc\" DevicePath \"\""
Feb 19 22:55:39 crc kubenswrapper[4795]: I0219 22:55:39.519733 4795 scope.go:117] "RemoveContainer" containerID="4bde20d829d39ae7600d60632441672a1a7026beecea3debeab953cdf5de428a"
Feb 19 22:55:39 crc kubenswrapper[4795]: E0219 22:55:39.520104 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 22:55:39 crc kubenswrapper[4795]: I0219 22:55:39.846300 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-xbwb6" event={"ID":"c0369c6f-517b-44b8-968a-a3408c6044d6","Type":"ContainerDied","Data":"798c223eb760c97067b30610ba4e1b170b3df8ec61f6c7acc66d63b4565f14de"}
Feb 19 22:55:39 crc kubenswrapper[4795]: I0219 22:55:39.846348 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="798c223eb760c97067b30610ba4e1b170b3df8ec61f6c7acc66d63b4565f14de"
Feb 19 22:55:39 crc kubenswrapper[4795]: I0219 22:55:39.846627 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-xbwb6"
Feb 19 22:55:40 crc kubenswrapper[4795]: I0219 22:55:40.171413 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f69d-account-create-update-gbq6r"
Feb 19 22:55:40 crc kubenswrapper[4795]: I0219 22:55:40.309658 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4bks4\" (UniqueName: \"kubernetes.io/projected/635044d2-10e8-457c-b03e-9507a500c7fe-kube-api-access-4bks4\") pod \"635044d2-10e8-457c-b03e-9507a500c7fe\" (UID: \"635044d2-10e8-457c-b03e-9507a500c7fe\") "
Feb 19 22:55:40 crc kubenswrapper[4795]: I0219 22:55:40.309747 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/635044d2-10e8-457c-b03e-9507a500c7fe-operator-scripts\") pod \"635044d2-10e8-457c-b03e-9507a500c7fe\" (UID: \"635044d2-10e8-457c-b03e-9507a500c7fe\") "
Feb 19 22:55:40 crc kubenswrapper[4795]: I0219 22:55:40.310501 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/635044d2-10e8-457c-b03e-9507a500c7fe-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "635044d2-10e8-457c-b03e-9507a500c7fe" (UID: "635044d2-10e8-457c-b03e-9507a500c7fe"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 22:55:40 crc kubenswrapper[4795]: I0219 22:55:40.314137 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/635044d2-10e8-457c-b03e-9507a500c7fe-kube-api-access-4bks4" (OuterVolumeSpecName: "kube-api-access-4bks4") pod "635044d2-10e8-457c-b03e-9507a500c7fe" (UID: "635044d2-10e8-457c-b03e-9507a500c7fe"). InnerVolumeSpecName "kube-api-access-4bks4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 22:55:40 crc kubenswrapper[4795]: I0219 22:55:40.411665 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4bks4\" (UniqueName: \"kubernetes.io/projected/635044d2-10e8-457c-b03e-9507a500c7fe-kube-api-access-4bks4\") on node \"crc\" DevicePath \"\""
Feb 19 22:55:40 crc kubenswrapper[4795]: I0219 22:55:40.411719 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/635044d2-10e8-457c-b03e-9507a500c7fe-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 22:55:40 crc kubenswrapper[4795]: I0219 22:55:40.858343 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f69d-account-create-update-gbq6r" event={"ID":"635044d2-10e8-457c-b03e-9507a500c7fe","Type":"ContainerDied","Data":"2fa48bc04516a1246fcfd55e60c6c0e0408ad4f81c112f1c19ad75fdbaeda045"}
Feb 19 22:55:40 crc kubenswrapper[4795]: I0219 22:55:40.858385 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2fa48bc04516a1246fcfd55e60c6c0e0408ad4f81c112f1c19ad75fdbaeda045"
Feb 19 22:55:40 crc kubenswrapper[4795]: I0219 22:55:40.858412 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f69d-account-create-update-gbq6r"
Feb 19 22:55:41 crc kubenswrapper[4795]: I0219 22:55:41.926963 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-p5rjh"]
Feb 19 22:55:41 crc kubenswrapper[4795]: E0219 22:55:41.927334 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="635044d2-10e8-457c-b03e-9507a500c7fe" containerName="mariadb-account-create-update"
Feb 19 22:55:41 crc kubenswrapper[4795]: I0219 22:55:41.927346 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="635044d2-10e8-457c-b03e-9507a500c7fe" containerName="mariadb-account-create-update"
Feb 19 22:55:41 crc kubenswrapper[4795]: E0219 22:55:41.927375 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0369c6f-517b-44b8-968a-a3408c6044d6" containerName="mariadb-database-create"
Feb 19 22:55:41 crc kubenswrapper[4795]: I0219 22:55:41.927381 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0369c6f-517b-44b8-968a-a3408c6044d6" containerName="mariadb-database-create"
Feb 19 22:55:41 crc kubenswrapper[4795]: I0219 22:55:41.927514 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="635044d2-10e8-457c-b03e-9507a500c7fe" containerName="mariadb-account-create-update"
Feb 19 22:55:41 crc kubenswrapper[4795]: I0219 22:55:41.927532 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0369c6f-517b-44b8-968a-a3408c6044d6" containerName="mariadb-database-create"
Feb 19 22:55:41 crc kubenswrapper[4795]: I0219 22:55:41.928137 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-p5rjh"
Feb 19 22:55:41 crc kubenswrapper[4795]: I0219 22:55:41.929939 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-lsmm9"
Feb 19 22:55:41 crc kubenswrapper[4795]: I0219 22:55:41.930174 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Feb 19 22:55:41 crc kubenswrapper[4795]: I0219 22:55:41.930484 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Feb 19 22:55:41 crc kubenswrapper[4795]: I0219 22:55:41.935102 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-p5rjh"]
Feb 19 22:55:42 crc kubenswrapper[4795]: I0219 22:55:42.036664 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9ad0e107-d857-4118-9582-5039b45f1ec8-config\") pod \"neutron-db-sync-p5rjh\" (UID: \"9ad0e107-d857-4118-9582-5039b45f1ec8\") " pod="openstack/neutron-db-sync-p5rjh"
Feb 19 22:55:42 crc kubenswrapper[4795]: I0219 22:55:42.036756 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zhn4\" (UniqueName: \"kubernetes.io/projected/9ad0e107-d857-4118-9582-5039b45f1ec8-kube-api-access-9zhn4\") pod \"neutron-db-sync-p5rjh\" (UID: \"9ad0e107-d857-4118-9582-5039b45f1ec8\") " pod="openstack/neutron-db-sync-p5rjh"
Feb 19 22:55:42 crc kubenswrapper[4795]: I0219 22:55:42.036803 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ad0e107-d857-4118-9582-5039b45f1ec8-combined-ca-bundle\") pod \"neutron-db-sync-p5rjh\" (UID: \"9ad0e107-d857-4118-9582-5039b45f1ec8\") " pod="openstack/neutron-db-sync-p5rjh"
Feb 19 22:55:42 crc kubenswrapper[4795]: I0219 22:55:42.138312 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zhn4\" (UniqueName: \"kubernetes.io/projected/9ad0e107-d857-4118-9582-5039b45f1ec8-kube-api-access-9zhn4\") pod \"neutron-db-sync-p5rjh\" (UID: \"9ad0e107-d857-4118-9582-5039b45f1ec8\") " pod="openstack/neutron-db-sync-p5rjh"
Feb 19 22:55:42 crc kubenswrapper[4795]: I0219 22:55:42.138401 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ad0e107-d857-4118-9582-5039b45f1ec8-combined-ca-bundle\") pod \"neutron-db-sync-p5rjh\" (UID: \"9ad0e107-d857-4118-9582-5039b45f1ec8\") " pod="openstack/neutron-db-sync-p5rjh"
Feb 19 22:55:42 crc kubenswrapper[4795]: I0219 22:55:42.138463 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9ad0e107-d857-4118-9582-5039b45f1ec8-config\") pod \"neutron-db-sync-p5rjh\" (UID: \"9ad0e107-d857-4118-9582-5039b45f1ec8\") " pod="openstack/neutron-db-sync-p5rjh"
Feb 19 22:55:42 crc kubenswrapper[4795]: I0219 22:55:42.142442 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ad0e107-d857-4118-9582-5039b45f1ec8-combined-ca-bundle\") pod \"neutron-db-sync-p5rjh\" (UID: \"9ad0e107-d857-4118-9582-5039b45f1ec8\") " pod="openstack/neutron-db-sync-p5rjh"
Feb 19 22:55:42 crc kubenswrapper[4795]: I0219 22:55:42.143391 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/9ad0e107-d857-4118-9582-5039b45f1ec8-config\") pod \"neutron-db-sync-p5rjh\" (UID: \"9ad0e107-d857-4118-9582-5039b45f1ec8\") " pod="openstack/neutron-db-sync-p5rjh"
Feb 19 22:55:42 crc kubenswrapper[4795]: I0219 22:55:42.157059 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zhn4\" (UniqueName: \"kubernetes.io/projected/9ad0e107-d857-4118-9582-5039b45f1ec8-kube-api-access-9zhn4\") pod \"neutron-db-sync-p5rjh\" (UID: \"9ad0e107-d857-4118-9582-5039b45f1ec8\") " pod="openstack/neutron-db-sync-p5rjh"
Feb 19 22:55:42 crc kubenswrapper[4795]: I0219 22:55:42.244673 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-p5rjh"
Feb 19 22:55:42 crc kubenswrapper[4795]: I0219 22:55:42.678291 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-p5rjh"]
Feb 19 22:55:42 crc kubenswrapper[4795]: I0219 22:55:42.873249 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-p5rjh" event={"ID":"9ad0e107-d857-4118-9582-5039b45f1ec8","Type":"ContainerStarted","Data":"a01a032519bdc879e7f051388c3f7e0f8289504340bef813d7d1864b844d8771"}
Feb 19 22:55:42 crc kubenswrapper[4795]: I0219 22:55:42.873539 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-p5rjh" event={"ID":"9ad0e107-d857-4118-9582-5039b45f1ec8","Type":"ContainerStarted","Data":"d3eea427f09096b213db3bad61bbfc80ea6959119d0cf31b074397e7c578d1fa"}
Feb 19 22:55:42 crc kubenswrapper[4795]: I0219 22:55:42.893776 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-p5rjh" podStartSLOduration=1.893755565 podStartE2EDuration="1.893755565s" podCreationTimestamp="2026-02-19 22:55:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:55:42.890124261 +0000 UTC m=+5254.082642125" watchObservedRunningTime="2026-02-19 22:55:42.893755565 +0000 UTC m=+5254.086273429"
Feb 19 22:55:46 crc kubenswrapper[4795]: I0219 22:55:46.911214 4795 generic.go:334] "Generic (PLEG): container finished" podID="9ad0e107-d857-4118-9582-5039b45f1ec8" containerID="a01a032519bdc879e7f051388c3f7e0f8289504340bef813d7d1864b844d8771" exitCode=0
Feb 19 22:55:46 crc kubenswrapper[4795]: I0219 22:55:46.911424 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-p5rjh" event={"ID":"9ad0e107-d857-4118-9582-5039b45f1ec8","Type":"ContainerDied","Data":"a01a032519bdc879e7f051388c3f7e0f8289504340bef813d7d1864b844d8771"}
Feb 19 22:55:48 crc kubenswrapper[4795]: I0219 22:55:48.258877 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-p5rjh"
Feb 19 22:55:48 crc kubenswrapper[4795]: I0219 22:55:48.341683 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9ad0e107-d857-4118-9582-5039b45f1ec8-config\") pod \"9ad0e107-d857-4118-9582-5039b45f1ec8\" (UID: \"9ad0e107-d857-4118-9582-5039b45f1ec8\") "
Feb 19 22:55:48 crc kubenswrapper[4795]: I0219 22:55:48.341792 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zhn4\" (UniqueName: \"kubernetes.io/projected/9ad0e107-d857-4118-9582-5039b45f1ec8-kube-api-access-9zhn4\") pod \"9ad0e107-d857-4118-9582-5039b45f1ec8\" (UID: \"9ad0e107-d857-4118-9582-5039b45f1ec8\") "
Feb 19 22:55:48 crc kubenswrapper[4795]: I0219 22:55:48.342710 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ad0e107-d857-4118-9582-5039b45f1ec8-combined-ca-bundle\") pod \"9ad0e107-d857-4118-9582-5039b45f1ec8\" (UID: \"9ad0e107-d857-4118-9582-5039b45f1ec8\") "
Feb 19 22:55:48 crc kubenswrapper[4795]: I0219 22:55:48.348764 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ad0e107-d857-4118-9582-5039b45f1ec8-kube-api-access-9zhn4" (OuterVolumeSpecName: "kube-api-access-9zhn4") pod "9ad0e107-d857-4118-9582-5039b45f1ec8" (UID: "9ad0e107-d857-4118-9582-5039b45f1ec8"). InnerVolumeSpecName "kube-api-access-9zhn4". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 22:55:48 crc kubenswrapper[4795]: I0219 22:55:48.366981 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ad0e107-d857-4118-9582-5039b45f1ec8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9ad0e107-d857-4118-9582-5039b45f1ec8" (UID: "9ad0e107-d857-4118-9582-5039b45f1ec8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 22:55:48 crc kubenswrapper[4795]: I0219 22:55:48.384740 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ad0e107-d857-4118-9582-5039b45f1ec8-config" (OuterVolumeSpecName: "config") pod "9ad0e107-d857-4118-9582-5039b45f1ec8" (UID: "9ad0e107-d857-4118-9582-5039b45f1ec8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 22:55:48 crc kubenswrapper[4795]: I0219 22:55:48.445024 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/9ad0e107-d857-4118-9582-5039b45f1ec8-config\") on node \"crc\" DevicePath \"\""
Feb 19 22:55:48 crc kubenswrapper[4795]: I0219 22:55:48.445058 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zhn4\" (UniqueName: \"kubernetes.io/projected/9ad0e107-d857-4118-9582-5039b45f1ec8-kube-api-access-9zhn4\") on node \"crc\" DevicePath \"\""
Feb 19 22:55:48 crc kubenswrapper[4795]: I0219 22:55:48.445072 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ad0e107-d857-4118-9582-5039b45f1ec8-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 22:55:48 crc kubenswrapper[4795]: I0219 22:55:48.929470 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-p5rjh" event={"ID":"9ad0e107-d857-4118-9582-5039b45f1ec8","Type":"ContainerDied","Data":"d3eea427f09096b213db3bad61bbfc80ea6959119d0cf31b074397e7c578d1fa"}
Feb 19 22:55:48 crc kubenswrapper[4795]: I0219 22:55:48.929511 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d3eea427f09096b213db3bad61bbfc80ea6959119d0cf31b074397e7c578d1fa"
Feb 19 22:55:48 crc kubenswrapper[4795]: I0219 22:55:48.929887 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-p5rjh"
Feb 19 22:55:49 crc kubenswrapper[4795]: I0219 22:55:49.180455 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7d5d85df8f-clq65"]
Feb 19 22:55:49 crc kubenswrapper[4795]: E0219 22:55:49.180795 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ad0e107-d857-4118-9582-5039b45f1ec8" containerName="neutron-db-sync"
Feb 19 22:55:49 crc kubenswrapper[4795]: I0219 22:55:49.180816 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ad0e107-d857-4118-9582-5039b45f1ec8" containerName="neutron-db-sync"
Feb 19 22:55:49 crc kubenswrapper[4795]: I0219 22:55:49.180995 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ad0e107-d857-4118-9582-5039b45f1ec8" containerName="neutron-db-sync"
Feb 19 22:55:49 crc kubenswrapper[4795]: I0219 22:55:49.181899 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d5d85df8f-clq65"
Feb 19 22:55:49 crc kubenswrapper[4795]: I0219 22:55:49.204686 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d5d85df8f-clq65"]
Feb 19 22:55:49 crc kubenswrapper[4795]: I0219 22:55:49.272908 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7945766d5c-fjptf"]
Feb 19 22:55:49 crc kubenswrapper[4795]: I0219 22:55:49.274358 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7945766d5c-fjptf"
Feb 19 22:55:49 crc kubenswrapper[4795]: I0219 22:55:49.278586 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Feb 19 22:55:49 crc kubenswrapper[4795]: I0219 22:55:49.278906 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Feb 19 22:55:49 crc kubenswrapper[4795]: I0219 22:55:49.279106 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-lsmm9"
Feb 19 22:55:49 crc kubenswrapper[4795]: I0219 22:55:49.279769 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8e5a1fbd-4617-434e-8719-12b16bc88b98-ovsdbserver-nb\") pod \"dnsmasq-dns-7d5d85df8f-clq65\" (UID: \"8e5a1fbd-4617-434e-8719-12b16bc88b98\") " pod="openstack/dnsmasq-dns-7d5d85df8f-clq65"
Feb 19 22:55:49 crc kubenswrapper[4795]: I0219 22:55:49.279819 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e5a1fbd-4617-434e-8719-12b16bc88b98-config\") pod \"dnsmasq-dns-7d5d85df8f-clq65\" (UID: \"8e5a1fbd-4617-434e-8719-12b16bc88b98\") " pod="openstack/dnsmasq-dns-7d5d85df8f-clq65"
Feb 19 22:55:49 crc kubenswrapper[4795]: I0219 22:55:49.279872 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e5a1fbd-4617-434e-8719-12b16bc88b98-dns-svc\") pod \"dnsmasq-dns-7d5d85df8f-clq65\" (UID: \"8e5a1fbd-4617-434e-8719-12b16bc88b98\") " pod="openstack/dnsmasq-dns-7d5d85df8f-clq65"
Feb 19 22:55:49 crc kubenswrapper[4795]: I0219 22:55:49.279964 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cl457\" (UniqueName: \"kubernetes.io/projected/8e5a1fbd-4617-434e-8719-12b16bc88b98-kube-api-access-cl457\") pod \"dnsmasq-dns-7d5d85df8f-clq65\" (UID: \"8e5a1fbd-4617-434e-8719-12b16bc88b98\") " pod="openstack/dnsmasq-dns-7d5d85df8f-clq65"
Feb 19 22:55:49 crc kubenswrapper[4795]: I0219 22:55:49.279996 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8e5a1fbd-4617-434e-8719-12b16bc88b98-ovsdbserver-sb\") pod \"dnsmasq-dns-7d5d85df8f-clq65\" (UID: \"8e5a1fbd-4617-434e-8719-12b16bc88b98\") " pod="openstack/dnsmasq-dns-7d5d85df8f-clq65"
Feb 19 22:55:49 crc kubenswrapper[4795]: I0219 22:55:49.285755 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7945766d5c-fjptf"]
Feb 19 22:55:49 crc kubenswrapper[4795]: I0219 22:55:49.382015 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmx67\" (UniqueName: \"kubernetes.io/projected/dea0417f-0988-4d82-80cc-03298be367bd-kube-api-access-xmx67\") pod \"neutron-7945766d5c-fjptf\" (UID: \"dea0417f-0988-4d82-80cc-03298be367bd\") " pod="openstack/neutron-7945766d5c-fjptf"
Feb 19 22:55:49 crc kubenswrapper[4795]: I0219 22:55:49.382107 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cl457\" (UniqueName: \"kubernetes.io/projected/8e5a1fbd-4617-434e-8719-12b16bc88b98-kube-api-access-cl457\") pod \"dnsmasq-dns-7d5d85df8f-clq65\" (UID: \"8e5a1fbd-4617-434e-8719-12b16bc88b98\") " pod="openstack/dnsmasq-dns-7d5d85df8f-clq65"
Feb 19 22:55:49 crc kubenswrapper[4795]: I0219 22:55:49.382155 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8e5a1fbd-4617-434e-8719-12b16bc88b98-ovsdbserver-sb\") pod \"dnsmasq-dns-7d5d85df8f-clq65\" (UID: \"8e5a1fbd-4617-434e-8719-12b16bc88b98\") " pod="openstack/dnsmasq-dns-7d5d85df8f-clq65"
Feb 19 22:55:49 crc kubenswrapper[4795]: I0219 22:55:49.382237 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8e5a1fbd-4617-434e-8719-12b16bc88b98-ovsdbserver-nb\") pod \"dnsmasq-dns-7d5d85df8f-clq65\" (UID: \"8e5a1fbd-4617-434e-8719-12b16bc88b98\") " pod="openstack/dnsmasq-dns-7d5d85df8f-clq65"
Feb 19 22:55:49 crc kubenswrapper[4795]: I0219 22:55:49.382268 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e5a1fbd-4617-434e-8719-12b16bc88b98-config\") pod \"dnsmasq-dns-7d5d85df8f-clq65\" (UID: \"8e5a1fbd-4617-434e-8719-12b16bc88b98\") " pod="openstack/dnsmasq-dns-7d5d85df8f-clq65"
Feb 19 22:55:49 crc kubenswrapper[4795]: I0219 22:55:49.382308 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dea0417f-0988-4d82-80cc-03298be367bd-combined-ca-bundle\") pod \"neutron-7945766d5c-fjptf\" (UID: \"dea0417f-0988-4d82-80cc-03298be367bd\") " pod="openstack/neutron-7945766d5c-fjptf"
Feb 19 22:55:49 crc kubenswrapper[4795]: I0219 22:55:49.382349 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e5a1fbd-4617-434e-8719-12b16bc88b98-dns-svc\") pod \"dnsmasq-dns-7d5d85df8f-clq65\" (UID: \"8e5a1fbd-4617-434e-8719-12b16bc88b98\") " pod="openstack/dnsmasq-dns-7d5d85df8f-clq65"
Feb 19 22:55:49 crc kubenswrapper[4795]: I0219 22:55:49.382373 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/dea0417f-0988-4d82-80cc-03298be367bd-httpd-config\") pod \"neutron-7945766d5c-fjptf\" (UID: \"dea0417f-0988-4d82-80cc-03298be367bd\") " pod="openstack/neutron-7945766d5c-fjptf"
Feb 19 22:55:49 crc kubenswrapper[4795]: I0219 22:55:49.382430 4795 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/dea0417f-0988-4d82-80cc-03298be367bd-config\") pod \"neutron-7945766d5c-fjptf\" (UID: \"dea0417f-0988-4d82-80cc-03298be367bd\") " pod="openstack/neutron-7945766d5c-fjptf" Feb 19 22:55:49 crc kubenswrapper[4795]: I0219 22:55:49.383027 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8e5a1fbd-4617-434e-8719-12b16bc88b98-ovsdbserver-sb\") pod \"dnsmasq-dns-7d5d85df8f-clq65\" (UID: \"8e5a1fbd-4617-434e-8719-12b16bc88b98\") " pod="openstack/dnsmasq-dns-7d5d85df8f-clq65" Feb 19 22:55:49 crc kubenswrapper[4795]: I0219 22:55:49.383358 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8e5a1fbd-4617-434e-8719-12b16bc88b98-ovsdbserver-nb\") pod \"dnsmasq-dns-7d5d85df8f-clq65\" (UID: \"8e5a1fbd-4617-434e-8719-12b16bc88b98\") " pod="openstack/dnsmasq-dns-7d5d85df8f-clq65" Feb 19 22:55:49 crc kubenswrapper[4795]: I0219 22:55:49.383397 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e5a1fbd-4617-434e-8719-12b16bc88b98-config\") pod \"dnsmasq-dns-7d5d85df8f-clq65\" (UID: \"8e5a1fbd-4617-434e-8719-12b16bc88b98\") " pod="openstack/dnsmasq-dns-7d5d85df8f-clq65" Feb 19 22:55:49 crc kubenswrapper[4795]: I0219 22:55:49.383543 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e5a1fbd-4617-434e-8719-12b16bc88b98-dns-svc\") pod \"dnsmasq-dns-7d5d85df8f-clq65\" (UID: \"8e5a1fbd-4617-434e-8719-12b16bc88b98\") " pod="openstack/dnsmasq-dns-7d5d85df8f-clq65" Feb 19 22:55:49 crc kubenswrapper[4795]: I0219 22:55:49.400397 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cl457\" (UniqueName: 
\"kubernetes.io/projected/8e5a1fbd-4617-434e-8719-12b16bc88b98-kube-api-access-cl457\") pod \"dnsmasq-dns-7d5d85df8f-clq65\" (UID: \"8e5a1fbd-4617-434e-8719-12b16bc88b98\") " pod="openstack/dnsmasq-dns-7d5d85df8f-clq65" Feb 19 22:55:49 crc kubenswrapper[4795]: I0219 22:55:49.483702 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/dea0417f-0988-4d82-80cc-03298be367bd-config\") pod \"neutron-7945766d5c-fjptf\" (UID: \"dea0417f-0988-4d82-80cc-03298be367bd\") " pod="openstack/neutron-7945766d5c-fjptf" Feb 19 22:55:49 crc kubenswrapper[4795]: I0219 22:55:49.484086 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmx67\" (UniqueName: \"kubernetes.io/projected/dea0417f-0988-4d82-80cc-03298be367bd-kube-api-access-xmx67\") pod \"neutron-7945766d5c-fjptf\" (UID: \"dea0417f-0988-4d82-80cc-03298be367bd\") " pod="openstack/neutron-7945766d5c-fjptf" Feb 19 22:55:49 crc kubenswrapper[4795]: I0219 22:55:49.484186 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dea0417f-0988-4d82-80cc-03298be367bd-combined-ca-bundle\") pod \"neutron-7945766d5c-fjptf\" (UID: \"dea0417f-0988-4d82-80cc-03298be367bd\") " pod="openstack/neutron-7945766d5c-fjptf" Feb 19 22:55:49 crc kubenswrapper[4795]: I0219 22:55:49.484217 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/dea0417f-0988-4d82-80cc-03298be367bd-httpd-config\") pod \"neutron-7945766d5c-fjptf\" (UID: \"dea0417f-0988-4d82-80cc-03298be367bd\") " pod="openstack/neutron-7945766d5c-fjptf" Feb 19 22:55:49 crc kubenswrapper[4795]: I0219 22:55:49.487488 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/dea0417f-0988-4d82-80cc-03298be367bd-httpd-config\") pod 
\"neutron-7945766d5c-fjptf\" (UID: \"dea0417f-0988-4d82-80cc-03298be367bd\") " pod="openstack/neutron-7945766d5c-fjptf" Feb 19 22:55:49 crc kubenswrapper[4795]: I0219 22:55:49.487836 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/dea0417f-0988-4d82-80cc-03298be367bd-config\") pod \"neutron-7945766d5c-fjptf\" (UID: \"dea0417f-0988-4d82-80cc-03298be367bd\") " pod="openstack/neutron-7945766d5c-fjptf" Feb 19 22:55:49 crc kubenswrapper[4795]: I0219 22:55:49.488768 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dea0417f-0988-4d82-80cc-03298be367bd-combined-ca-bundle\") pod \"neutron-7945766d5c-fjptf\" (UID: \"dea0417f-0988-4d82-80cc-03298be367bd\") " pod="openstack/neutron-7945766d5c-fjptf" Feb 19 22:55:49 crc kubenswrapper[4795]: I0219 22:55:49.500212 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d5d85df8f-clq65" Feb 19 22:55:49 crc kubenswrapper[4795]: I0219 22:55:49.506706 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmx67\" (UniqueName: \"kubernetes.io/projected/dea0417f-0988-4d82-80cc-03298be367bd-kube-api-access-xmx67\") pod \"neutron-7945766d5c-fjptf\" (UID: \"dea0417f-0988-4d82-80cc-03298be367bd\") " pod="openstack/neutron-7945766d5c-fjptf" Feb 19 22:55:49 crc kubenswrapper[4795]: I0219 22:55:49.603215 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7945766d5c-fjptf" Feb 19 22:55:50 crc kubenswrapper[4795]: I0219 22:55:50.054192 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d5d85df8f-clq65"] Feb 19 22:55:50 crc kubenswrapper[4795]: I0219 22:55:50.364976 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7945766d5c-fjptf"] Feb 19 22:55:50 crc kubenswrapper[4795]: W0219 22:55:50.379702 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddea0417f_0988_4d82_80cc_03298be367bd.slice/crio-ff1abb28fae1a83463f38eacf392a3526e7e57c91627771bd880aebeac689880 WatchSource:0}: Error finding container ff1abb28fae1a83463f38eacf392a3526e7e57c91627771bd880aebeac689880: Status 404 returned error can't find the container with id ff1abb28fae1a83463f38eacf392a3526e7e57c91627771bd880aebeac689880 Feb 19 22:55:50 crc kubenswrapper[4795]: I0219 22:55:50.943763 4795 generic.go:334] "Generic (PLEG): container finished" podID="8e5a1fbd-4617-434e-8719-12b16bc88b98" containerID="400d407c08a50e5c293b398a82df6a8f79475760e406fca1855e5558b33cb384" exitCode=0 Feb 19 22:55:50 crc kubenswrapper[4795]: I0219 22:55:50.943872 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d5d85df8f-clq65" event={"ID":"8e5a1fbd-4617-434e-8719-12b16bc88b98","Type":"ContainerDied","Data":"400d407c08a50e5c293b398a82df6a8f79475760e406fca1855e5558b33cb384"} Feb 19 22:55:50 crc kubenswrapper[4795]: I0219 22:55:50.944199 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d5d85df8f-clq65" event={"ID":"8e5a1fbd-4617-434e-8719-12b16bc88b98","Type":"ContainerStarted","Data":"7217af4cd4eb00396c1a57059f86739ac95943d868ddbc9e0af3ab209b2339ee"} Feb 19 22:55:50 crc kubenswrapper[4795]: I0219 22:55:50.946713 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7945766d5c-fjptf" 
event={"ID":"dea0417f-0988-4d82-80cc-03298be367bd","Type":"ContainerStarted","Data":"78fe4dbc7e56ab8394f76ce36d07a23856dbfc8149134c7fe2e37c98f537d806"} Feb 19 22:55:50 crc kubenswrapper[4795]: I0219 22:55:50.946750 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7945766d5c-fjptf" event={"ID":"dea0417f-0988-4d82-80cc-03298be367bd","Type":"ContainerStarted","Data":"7aab37ce10ab6572e88c1a662fda02629c82d89034781cfe1e5b1ae472dfa9cf"} Feb 19 22:55:50 crc kubenswrapper[4795]: I0219 22:55:50.946765 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7945766d5c-fjptf" event={"ID":"dea0417f-0988-4d82-80cc-03298be367bd","Type":"ContainerStarted","Data":"ff1abb28fae1a83463f38eacf392a3526e7e57c91627771bd880aebeac689880"} Feb 19 22:55:50 crc kubenswrapper[4795]: I0219 22:55:50.946867 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7945766d5c-fjptf" Feb 19 22:55:50 crc kubenswrapper[4795]: I0219 22:55:50.996640 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7945766d5c-fjptf" podStartSLOduration=1.996619224 podStartE2EDuration="1.996619224s" podCreationTimestamp="2026-02-19 22:55:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:55:50.992531737 +0000 UTC m=+5262.185049601" watchObservedRunningTime="2026-02-19 22:55:50.996619224 +0000 UTC m=+5262.189137088" Feb 19 22:55:51 crc kubenswrapper[4795]: I0219 22:55:51.511278 4795 scope.go:117] "RemoveContainer" containerID="4bde20d829d39ae7600d60632441672a1a7026beecea3debeab953cdf5de428a" Feb 19 22:55:51 crc kubenswrapper[4795]: E0219 22:55:51.511571 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 22:55:51 crc kubenswrapper[4795]: I0219 22:55:51.956704 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d5d85df8f-clq65" event={"ID":"8e5a1fbd-4617-434e-8719-12b16bc88b98","Type":"ContainerStarted","Data":"db3ac61fe2bbba8663fc1cafd3f78c8dde189ed879980b2ca339cc53d3ebb615"} Feb 19 22:55:51 crc kubenswrapper[4795]: I0219 22:55:51.957037 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7d5d85df8f-clq65" Feb 19 22:55:51 crc kubenswrapper[4795]: I0219 22:55:51.983534 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7d5d85df8f-clq65" podStartSLOduration=2.983514323 podStartE2EDuration="2.983514323s" podCreationTimestamp="2026-02-19 22:55:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:55:51.974433252 +0000 UTC m=+5263.166951136" watchObservedRunningTime="2026-02-19 22:55:51.983514323 +0000 UTC m=+5263.176032187" Feb 19 22:55:59 crc kubenswrapper[4795]: I0219 22:55:59.502350 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7d5d85df8f-clq65" Feb 19 22:55:59 crc kubenswrapper[4795]: I0219 22:55:59.558523 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5dc65d4d5f-tt9nm"] Feb 19 22:55:59 crc kubenswrapper[4795]: I0219 22:55:59.558765 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5dc65d4d5f-tt9nm" podUID="0f5b30f1-1278-4376-b1bc-6e72def4d494" containerName="dnsmasq-dns" containerID="cri-o://be2f6f4f8f1c84b00fee1744df7613e5b2cb7c534a7ea1884cb10ab8da51e6b9" gracePeriod=10 Feb 19 
22:56:00 crc kubenswrapper[4795]: I0219 22:56:00.023090 4795 generic.go:334] "Generic (PLEG): container finished" podID="0f5b30f1-1278-4376-b1bc-6e72def4d494" containerID="be2f6f4f8f1c84b00fee1744df7613e5b2cb7c534a7ea1884cb10ab8da51e6b9" exitCode=0 Feb 19 22:56:00 crc kubenswrapper[4795]: I0219 22:56:00.023129 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dc65d4d5f-tt9nm" event={"ID":"0f5b30f1-1278-4376-b1bc-6e72def4d494","Type":"ContainerDied","Data":"be2f6f4f8f1c84b00fee1744df7613e5b2cb7c534a7ea1884cb10ab8da51e6b9"} Feb 19 22:56:00 crc kubenswrapper[4795]: I0219 22:56:00.023457 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dc65d4d5f-tt9nm" event={"ID":"0f5b30f1-1278-4376-b1bc-6e72def4d494","Type":"ContainerDied","Data":"0ababbb8a0c165fe874ce96e1fb589c25f4808501b4ec3a8ba5a072fa951fb3d"} Feb 19 22:56:00 crc kubenswrapper[4795]: I0219 22:56:00.023474 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ababbb8a0c165fe874ce96e1fb589c25f4808501b4ec3a8ba5a072fa951fb3d" Feb 19 22:56:00 crc kubenswrapper[4795]: I0219 22:56:00.080096 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5dc65d4d5f-tt9nm" Feb 19 22:56:00 crc kubenswrapper[4795]: I0219 22:56:00.265474 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0f5b30f1-1278-4376-b1bc-6e72def4d494-ovsdbserver-sb\") pod \"0f5b30f1-1278-4376-b1bc-6e72def4d494\" (UID: \"0f5b30f1-1278-4376-b1bc-6e72def4d494\") " Feb 19 22:56:00 crc kubenswrapper[4795]: I0219 22:56:00.265593 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0f5b30f1-1278-4376-b1bc-6e72def4d494-ovsdbserver-nb\") pod \"0f5b30f1-1278-4376-b1bc-6e72def4d494\" (UID: \"0f5b30f1-1278-4376-b1bc-6e72def4d494\") " Feb 19 22:56:00 crc kubenswrapper[4795]: I0219 22:56:00.265643 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f5b30f1-1278-4376-b1bc-6e72def4d494-config\") pod \"0f5b30f1-1278-4376-b1bc-6e72def4d494\" (UID: \"0f5b30f1-1278-4376-b1bc-6e72def4d494\") " Feb 19 22:56:00 crc kubenswrapper[4795]: I0219 22:56:00.265693 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f5b30f1-1278-4376-b1bc-6e72def4d494-dns-svc\") pod \"0f5b30f1-1278-4376-b1bc-6e72def4d494\" (UID: \"0f5b30f1-1278-4376-b1bc-6e72def4d494\") " Feb 19 22:56:00 crc kubenswrapper[4795]: I0219 22:56:00.265740 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bwwxp\" (UniqueName: \"kubernetes.io/projected/0f5b30f1-1278-4376-b1bc-6e72def4d494-kube-api-access-bwwxp\") pod \"0f5b30f1-1278-4376-b1bc-6e72def4d494\" (UID: \"0f5b30f1-1278-4376-b1bc-6e72def4d494\") " Feb 19 22:56:00 crc kubenswrapper[4795]: I0219 22:56:00.273693 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/0f5b30f1-1278-4376-b1bc-6e72def4d494-kube-api-access-bwwxp" (OuterVolumeSpecName: "kube-api-access-bwwxp") pod "0f5b30f1-1278-4376-b1bc-6e72def4d494" (UID: "0f5b30f1-1278-4376-b1bc-6e72def4d494"). InnerVolumeSpecName "kube-api-access-bwwxp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:56:00 crc kubenswrapper[4795]: I0219 22:56:00.317546 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f5b30f1-1278-4376-b1bc-6e72def4d494-config" (OuterVolumeSpecName: "config") pod "0f5b30f1-1278-4376-b1bc-6e72def4d494" (UID: "0f5b30f1-1278-4376-b1bc-6e72def4d494"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:56:00 crc kubenswrapper[4795]: I0219 22:56:00.332132 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f5b30f1-1278-4376-b1bc-6e72def4d494-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0f5b30f1-1278-4376-b1bc-6e72def4d494" (UID: "0f5b30f1-1278-4376-b1bc-6e72def4d494"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:56:00 crc kubenswrapper[4795]: I0219 22:56:00.341271 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f5b30f1-1278-4376-b1bc-6e72def4d494-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0f5b30f1-1278-4376-b1bc-6e72def4d494" (UID: "0f5b30f1-1278-4376-b1bc-6e72def4d494"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:56:00 crc kubenswrapper[4795]: I0219 22:56:00.355872 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f5b30f1-1278-4376-b1bc-6e72def4d494-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0f5b30f1-1278-4376-b1bc-6e72def4d494" (UID: "0f5b30f1-1278-4376-b1bc-6e72def4d494"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:56:00 crc kubenswrapper[4795]: I0219 22:56:00.367629 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f5b30f1-1278-4376-b1bc-6e72def4d494-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 22:56:00 crc kubenswrapper[4795]: I0219 22:56:00.367665 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bwwxp\" (UniqueName: \"kubernetes.io/projected/0f5b30f1-1278-4376-b1bc-6e72def4d494-kube-api-access-bwwxp\") on node \"crc\" DevicePath \"\"" Feb 19 22:56:00 crc kubenswrapper[4795]: I0219 22:56:00.367676 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0f5b30f1-1278-4376-b1bc-6e72def4d494-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 22:56:00 crc kubenswrapper[4795]: I0219 22:56:00.367684 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0f5b30f1-1278-4376-b1bc-6e72def4d494-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 22:56:00 crc kubenswrapper[4795]: I0219 22:56:00.367692 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f5b30f1-1278-4376-b1bc-6e72def4d494-config\") on node \"crc\" DevicePath \"\"" Feb 19 22:56:01 crc kubenswrapper[4795]: I0219 22:56:01.030582 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5dc65d4d5f-tt9nm" Feb 19 22:56:01 crc kubenswrapper[4795]: I0219 22:56:01.065496 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5dc65d4d5f-tt9nm"] Feb 19 22:56:01 crc kubenswrapper[4795]: I0219 22:56:01.071192 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5dc65d4d5f-tt9nm"] Feb 19 22:56:01 crc kubenswrapper[4795]: I0219 22:56:01.521527 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f5b30f1-1278-4376-b1bc-6e72def4d494" path="/var/lib/kubelet/pods/0f5b30f1-1278-4376-b1bc-6e72def4d494/volumes" Feb 19 22:56:06 crc kubenswrapper[4795]: I0219 22:56:06.512549 4795 scope.go:117] "RemoveContainer" containerID="4bde20d829d39ae7600d60632441672a1a7026beecea3debeab953cdf5de428a" Feb 19 22:56:07 crc kubenswrapper[4795]: I0219 22:56:07.081804 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerStarted","Data":"fd820b5d0adc1705d78ec76939670a71a79ca07206c8d0459e23712d0b015f16"} Feb 19 22:56:19 crc kubenswrapper[4795]: I0219 22:56:19.616016 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7945766d5c-fjptf" Feb 19 22:56:26 crc kubenswrapper[4795]: I0219 22:56:26.912231 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-57skz"] Feb 19 22:56:26 crc kubenswrapper[4795]: E0219 22:56:26.913069 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f5b30f1-1278-4376-b1bc-6e72def4d494" containerName="dnsmasq-dns" Feb 19 22:56:26 crc kubenswrapper[4795]: I0219 22:56:26.913086 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f5b30f1-1278-4376-b1bc-6e72def4d494" containerName="dnsmasq-dns" Feb 19 22:56:26 crc kubenswrapper[4795]: E0219 22:56:26.913110 4795 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="0f5b30f1-1278-4376-b1bc-6e72def4d494" containerName="init" Feb 19 22:56:26 crc kubenswrapper[4795]: I0219 22:56:26.913119 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f5b30f1-1278-4376-b1bc-6e72def4d494" containerName="init" Feb 19 22:56:26 crc kubenswrapper[4795]: I0219 22:56:26.913359 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f5b30f1-1278-4376-b1bc-6e72def4d494" containerName="dnsmasq-dns" Feb 19 22:56:26 crc kubenswrapper[4795]: I0219 22:56:26.914040 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-57skz" Feb 19 22:56:26 crc kubenswrapper[4795]: I0219 22:56:26.922375 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-57skz"] Feb 19 22:56:27 crc kubenswrapper[4795]: I0219 22:56:27.015555 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-c270-account-create-update-m9p4w"] Feb 19 22:56:27 crc kubenswrapper[4795]: I0219 22:56:27.016526 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-c270-account-create-update-m9p4w" Feb 19 22:56:27 crc kubenswrapper[4795]: I0219 22:56:27.019031 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Feb 19 22:56:27 crc kubenswrapper[4795]: I0219 22:56:27.026342 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-c270-account-create-update-m9p4w"] Feb 19 22:56:27 crc kubenswrapper[4795]: I0219 22:56:27.069411 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5efc0b6-7441-4f4b-827e-d920c711d076-operator-scripts\") pod \"glance-db-create-57skz\" (UID: \"b5efc0b6-7441-4f4b-827e-d920c711d076\") " pod="openstack/glance-db-create-57skz" Feb 19 22:56:27 crc kubenswrapper[4795]: I0219 22:56:27.069481 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjr5c\" (UniqueName: \"kubernetes.io/projected/b5efc0b6-7441-4f4b-827e-d920c711d076-kube-api-access-qjr5c\") pod \"glance-db-create-57skz\" (UID: \"b5efc0b6-7441-4f4b-827e-d920c711d076\") " pod="openstack/glance-db-create-57skz" Feb 19 22:56:27 crc kubenswrapper[4795]: I0219 22:56:27.171371 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njq24\" (UniqueName: \"kubernetes.io/projected/e6d0c29a-694d-4afc-ba36-c66fa8fd0328-kube-api-access-njq24\") pod \"glance-c270-account-create-update-m9p4w\" (UID: \"e6d0c29a-694d-4afc-ba36-c66fa8fd0328\") " pod="openstack/glance-c270-account-create-update-m9p4w" Feb 19 22:56:27 crc kubenswrapper[4795]: I0219 22:56:27.171684 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5efc0b6-7441-4f4b-827e-d920c711d076-operator-scripts\") pod \"glance-db-create-57skz\" (UID: 
\"b5efc0b6-7441-4f4b-827e-d920c711d076\") " pod="openstack/glance-db-create-57skz" Feb 19 22:56:27 crc kubenswrapper[4795]: I0219 22:56:27.171729 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjr5c\" (UniqueName: \"kubernetes.io/projected/b5efc0b6-7441-4f4b-827e-d920c711d076-kube-api-access-qjr5c\") pod \"glance-db-create-57skz\" (UID: \"b5efc0b6-7441-4f4b-827e-d920c711d076\") " pod="openstack/glance-db-create-57skz" Feb 19 22:56:27 crc kubenswrapper[4795]: I0219 22:56:27.171778 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6d0c29a-694d-4afc-ba36-c66fa8fd0328-operator-scripts\") pod \"glance-c270-account-create-update-m9p4w\" (UID: \"e6d0c29a-694d-4afc-ba36-c66fa8fd0328\") " pod="openstack/glance-c270-account-create-update-m9p4w" Feb 19 22:56:27 crc kubenswrapper[4795]: I0219 22:56:27.172445 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5efc0b6-7441-4f4b-827e-d920c711d076-operator-scripts\") pod \"glance-db-create-57skz\" (UID: \"b5efc0b6-7441-4f4b-827e-d920c711d076\") " pod="openstack/glance-db-create-57skz" Feb 19 22:56:27 crc kubenswrapper[4795]: I0219 22:56:27.197062 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjr5c\" (UniqueName: \"kubernetes.io/projected/b5efc0b6-7441-4f4b-827e-d920c711d076-kube-api-access-qjr5c\") pod \"glance-db-create-57skz\" (UID: \"b5efc0b6-7441-4f4b-827e-d920c711d076\") " pod="openstack/glance-db-create-57skz" Feb 19 22:56:27 crc kubenswrapper[4795]: I0219 22:56:27.273708 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6d0c29a-694d-4afc-ba36-c66fa8fd0328-operator-scripts\") pod \"glance-c270-account-create-update-m9p4w\" (UID: 
\"e6d0c29a-694d-4afc-ba36-c66fa8fd0328\") " pod="openstack/glance-c270-account-create-update-m9p4w" Feb 19 22:56:27 crc kubenswrapper[4795]: I0219 22:56:27.273818 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njq24\" (UniqueName: \"kubernetes.io/projected/e6d0c29a-694d-4afc-ba36-c66fa8fd0328-kube-api-access-njq24\") pod \"glance-c270-account-create-update-m9p4w\" (UID: \"e6d0c29a-694d-4afc-ba36-c66fa8fd0328\") " pod="openstack/glance-c270-account-create-update-m9p4w" Feb 19 22:56:27 crc kubenswrapper[4795]: I0219 22:56:27.274893 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6d0c29a-694d-4afc-ba36-c66fa8fd0328-operator-scripts\") pod \"glance-c270-account-create-update-m9p4w\" (UID: \"e6d0c29a-694d-4afc-ba36-c66fa8fd0328\") " pod="openstack/glance-c270-account-create-update-m9p4w" Feb 19 22:56:27 crc kubenswrapper[4795]: I0219 22:56:27.280281 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-57skz" Feb 19 22:56:27 crc kubenswrapper[4795]: I0219 22:56:27.293766 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njq24\" (UniqueName: \"kubernetes.io/projected/e6d0c29a-694d-4afc-ba36-c66fa8fd0328-kube-api-access-njq24\") pod \"glance-c270-account-create-update-m9p4w\" (UID: \"e6d0c29a-694d-4afc-ba36-c66fa8fd0328\") " pod="openstack/glance-c270-account-create-update-m9p4w" Feb 19 22:56:27 crc kubenswrapper[4795]: I0219 22:56:27.331415 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-c270-account-create-update-m9p4w" Feb 19 22:56:27 crc kubenswrapper[4795]: I0219 22:56:27.730013 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-57skz"] Feb 19 22:56:27 crc kubenswrapper[4795]: I0219 22:56:27.817455 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-c270-account-create-update-m9p4w"] Feb 19 22:56:28 crc kubenswrapper[4795]: I0219 22:56:28.261645 4795 generic.go:334] "Generic (PLEG): container finished" podID="e6d0c29a-694d-4afc-ba36-c66fa8fd0328" containerID="ce12984ea586896da4a3a2ca9a12c46a23d3b89ee0886c7e9ee2b6ceb73add38" exitCode=0 Feb 19 22:56:28 crc kubenswrapper[4795]: I0219 22:56:28.261723 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-c270-account-create-update-m9p4w" event={"ID":"e6d0c29a-694d-4afc-ba36-c66fa8fd0328","Type":"ContainerDied","Data":"ce12984ea586896da4a3a2ca9a12c46a23d3b89ee0886c7e9ee2b6ceb73add38"} Feb 19 22:56:28 crc kubenswrapper[4795]: I0219 22:56:28.261752 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-c270-account-create-update-m9p4w" event={"ID":"e6d0c29a-694d-4afc-ba36-c66fa8fd0328","Type":"ContainerStarted","Data":"e113f538a1a32a9c0a8de19728d882d6568b7f3bce7fabcc99c168487e7235c9"} Feb 19 22:56:28 crc kubenswrapper[4795]: I0219 22:56:28.271658 4795 generic.go:334] "Generic (PLEG): container finished" podID="b5efc0b6-7441-4f4b-827e-d920c711d076" containerID="406b01e5656ffc8e932bd5ff0af64ce799789e55eeb816cb84ffd89f44da61ef" exitCode=0 Feb 19 22:56:28 crc kubenswrapper[4795]: I0219 22:56:28.271693 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-57skz" event={"ID":"b5efc0b6-7441-4f4b-827e-d920c711d076","Type":"ContainerDied","Data":"406b01e5656ffc8e932bd5ff0af64ce799789e55eeb816cb84ffd89f44da61ef"} Feb 19 22:56:28 crc kubenswrapper[4795]: I0219 22:56:28.271716 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-db-create-57skz" event={"ID":"b5efc0b6-7441-4f4b-827e-d920c711d076","Type":"ContainerStarted","Data":"6b2c41ca69ed4deb69d1b1332d8a3fd6cd1ed218f442a9619c93b3ed3970049a"} Feb 19 22:56:29 crc kubenswrapper[4795]: I0219 22:56:29.646621 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-c270-account-create-update-m9p4w" Feb 19 22:56:29 crc kubenswrapper[4795]: I0219 22:56:29.651522 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-57skz" Feb 19 22:56:29 crc kubenswrapper[4795]: I0219 22:56:29.819327 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njq24\" (UniqueName: \"kubernetes.io/projected/e6d0c29a-694d-4afc-ba36-c66fa8fd0328-kube-api-access-njq24\") pod \"e6d0c29a-694d-4afc-ba36-c66fa8fd0328\" (UID: \"e6d0c29a-694d-4afc-ba36-c66fa8fd0328\") " Feb 19 22:56:29 crc kubenswrapper[4795]: I0219 22:56:29.819454 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6d0c29a-694d-4afc-ba36-c66fa8fd0328-operator-scripts\") pod \"e6d0c29a-694d-4afc-ba36-c66fa8fd0328\" (UID: \"e6d0c29a-694d-4afc-ba36-c66fa8fd0328\") " Feb 19 22:56:29 crc kubenswrapper[4795]: I0219 22:56:29.819474 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5efc0b6-7441-4f4b-827e-d920c711d076-operator-scripts\") pod \"b5efc0b6-7441-4f4b-827e-d920c711d076\" (UID: \"b5efc0b6-7441-4f4b-827e-d920c711d076\") " Feb 19 22:56:29 crc kubenswrapper[4795]: I0219 22:56:29.819543 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjr5c\" (UniqueName: \"kubernetes.io/projected/b5efc0b6-7441-4f4b-827e-d920c711d076-kube-api-access-qjr5c\") pod \"b5efc0b6-7441-4f4b-827e-d920c711d076\" (UID: 
\"b5efc0b6-7441-4f4b-827e-d920c711d076\") " Feb 19 22:56:29 crc kubenswrapper[4795]: I0219 22:56:29.820070 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5efc0b6-7441-4f4b-827e-d920c711d076-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b5efc0b6-7441-4f4b-827e-d920c711d076" (UID: "b5efc0b6-7441-4f4b-827e-d920c711d076"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:56:29 crc kubenswrapper[4795]: I0219 22:56:29.820092 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6d0c29a-694d-4afc-ba36-c66fa8fd0328-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e6d0c29a-694d-4afc-ba36-c66fa8fd0328" (UID: "e6d0c29a-694d-4afc-ba36-c66fa8fd0328"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:56:29 crc kubenswrapper[4795]: I0219 22:56:29.824837 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5efc0b6-7441-4f4b-827e-d920c711d076-kube-api-access-qjr5c" (OuterVolumeSpecName: "kube-api-access-qjr5c") pod "b5efc0b6-7441-4f4b-827e-d920c711d076" (UID: "b5efc0b6-7441-4f4b-827e-d920c711d076"). InnerVolumeSpecName "kube-api-access-qjr5c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:56:29 crc kubenswrapper[4795]: I0219 22:56:29.824929 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6d0c29a-694d-4afc-ba36-c66fa8fd0328-kube-api-access-njq24" (OuterVolumeSpecName: "kube-api-access-njq24") pod "e6d0c29a-694d-4afc-ba36-c66fa8fd0328" (UID: "e6d0c29a-694d-4afc-ba36-c66fa8fd0328"). InnerVolumeSpecName "kube-api-access-njq24". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:56:29 crc kubenswrapper[4795]: I0219 22:56:29.922807 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjr5c\" (UniqueName: \"kubernetes.io/projected/b5efc0b6-7441-4f4b-827e-d920c711d076-kube-api-access-qjr5c\") on node \"crc\" DevicePath \"\"" Feb 19 22:56:29 crc kubenswrapper[4795]: I0219 22:56:29.922845 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-njq24\" (UniqueName: \"kubernetes.io/projected/e6d0c29a-694d-4afc-ba36-c66fa8fd0328-kube-api-access-njq24\") on node \"crc\" DevicePath \"\"" Feb 19 22:56:29 crc kubenswrapper[4795]: I0219 22:56:29.922989 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6d0c29a-694d-4afc-ba36-c66fa8fd0328-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 22:56:29 crc kubenswrapper[4795]: I0219 22:56:29.923001 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5efc0b6-7441-4f4b-827e-d920c711d076-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 22:56:30 crc kubenswrapper[4795]: I0219 22:56:30.289897 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-c270-account-create-update-m9p4w" event={"ID":"e6d0c29a-694d-4afc-ba36-c66fa8fd0328","Type":"ContainerDied","Data":"e113f538a1a32a9c0a8de19728d882d6568b7f3bce7fabcc99c168487e7235c9"} Feb 19 22:56:30 crc kubenswrapper[4795]: I0219 22:56:30.290251 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e113f538a1a32a9c0a8de19728d882d6568b7f3bce7fabcc99c168487e7235c9" Feb 19 22:56:30 crc kubenswrapper[4795]: I0219 22:56:30.289956 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-c270-account-create-update-m9p4w" Feb 19 22:56:30 crc kubenswrapper[4795]: I0219 22:56:30.292119 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-57skz" event={"ID":"b5efc0b6-7441-4f4b-827e-d920c711d076","Type":"ContainerDied","Data":"6b2c41ca69ed4deb69d1b1332d8a3fd6cd1ed218f442a9619c93b3ed3970049a"} Feb 19 22:56:30 crc kubenswrapper[4795]: I0219 22:56:30.292211 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b2c41ca69ed4deb69d1b1332d8a3fd6cd1ed218f442a9619c93b3ed3970049a" Feb 19 22:56:30 crc kubenswrapper[4795]: I0219 22:56:30.292245 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-57skz" Feb 19 22:56:32 crc kubenswrapper[4795]: I0219 22:56:32.248586 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-wrz6p"] Feb 19 22:56:32 crc kubenswrapper[4795]: E0219 22:56:32.248992 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5efc0b6-7441-4f4b-827e-d920c711d076" containerName="mariadb-database-create" Feb 19 22:56:32 crc kubenswrapper[4795]: I0219 22:56:32.249007 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5efc0b6-7441-4f4b-827e-d920c711d076" containerName="mariadb-database-create" Feb 19 22:56:32 crc kubenswrapper[4795]: E0219 22:56:32.249018 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6d0c29a-694d-4afc-ba36-c66fa8fd0328" containerName="mariadb-account-create-update" Feb 19 22:56:32 crc kubenswrapper[4795]: I0219 22:56:32.249027 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6d0c29a-694d-4afc-ba36-c66fa8fd0328" containerName="mariadb-account-create-update" Feb 19 22:56:32 crc kubenswrapper[4795]: I0219 22:56:32.249278 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6d0c29a-694d-4afc-ba36-c66fa8fd0328" containerName="mariadb-account-create-update" 
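The "SyncLoop (PLEG)" entries above record each pod's container lifecycle transitions (ContainerStarted / ContainerDied). As an illustrative sketch only (not part of this log): once a journal dump like this is saved to a file, the PLEG events for a single pod can be isolated with standard grep filters. The path `/tmp/kubelet.log` is an assumption for the example; the sample entry is copied verbatim from the log above.

```shell
# Hypothetical example: save one real entry from the log above to a scratch
# file standing in for a full kubelet journal dump.
cat > /tmp/kubelet.log <<'EOF'
Feb 19 22:56:28 crc kubenswrapper[4795]: I0219 22:56:28.271693 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-57skz" event={"ID":"b5efc0b6-7441-4f4b-827e-d920c711d076","Type":"ContainerDied","Data":"406b01e5656ffc8e932bd5ff0af64ce799789e55eeb816cb84ffd89f44da61ef"}
EOF

# Keep only PLEG lifecycle events, narrow to the pod of interest, then
# extract just the event type field from the structured event payload.
grep 'SyncLoop (PLEG)' /tmp/kubelet.log \
  | grep 'pod="openstack/glance-db-create-57skz"' \
  | grep -o '"Type":"[A-Za-z]*"'
```

On the sample entry this prints `"Type":"ContainerDied"`; against a full dump the same pipeline yields the ordered Started/Died sequence for that pod.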
Feb 19 22:56:32 crc kubenswrapper[4795]: I0219 22:56:32.249304 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5efc0b6-7441-4f4b-827e-d920c711d076" containerName="mariadb-database-create" Feb 19 22:56:32 crc kubenswrapper[4795]: I0219 22:56:32.249931 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-wrz6p" Feb 19 22:56:32 crc kubenswrapper[4795]: I0219 22:56:32.251947 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Feb 19 22:56:32 crc kubenswrapper[4795]: I0219 22:56:32.252103 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-8gwsm" Feb 19 22:56:32 crc kubenswrapper[4795]: I0219 22:56:32.260401 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-wrz6p"] Feb 19 22:56:32 crc kubenswrapper[4795]: I0219 22:56:32.363892 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e95033f-725f-4784-995c-ec7a3b9c24c4-combined-ca-bundle\") pod \"glance-db-sync-wrz6p\" (UID: \"3e95033f-725f-4784-995c-ec7a3b9c24c4\") " pod="openstack/glance-db-sync-wrz6p" Feb 19 22:56:32 crc kubenswrapper[4795]: I0219 22:56:32.363947 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3e95033f-725f-4784-995c-ec7a3b9c24c4-db-sync-config-data\") pod \"glance-db-sync-wrz6p\" (UID: \"3e95033f-725f-4784-995c-ec7a3b9c24c4\") " pod="openstack/glance-db-sync-wrz6p" Feb 19 22:56:32 crc kubenswrapper[4795]: I0219 22:56:32.363972 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e95033f-725f-4784-995c-ec7a3b9c24c4-config-data\") pod \"glance-db-sync-wrz6p\" (UID: 
\"3e95033f-725f-4784-995c-ec7a3b9c24c4\") " pod="openstack/glance-db-sync-wrz6p" Feb 19 22:56:32 crc kubenswrapper[4795]: I0219 22:56:32.364285 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxkb7\" (UniqueName: \"kubernetes.io/projected/3e95033f-725f-4784-995c-ec7a3b9c24c4-kube-api-access-bxkb7\") pod \"glance-db-sync-wrz6p\" (UID: \"3e95033f-725f-4784-995c-ec7a3b9c24c4\") " pod="openstack/glance-db-sync-wrz6p" Feb 19 22:56:32 crc kubenswrapper[4795]: I0219 22:56:32.465885 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxkb7\" (UniqueName: \"kubernetes.io/projected/3e95033f-725f-4784-995c-ec7a3b9c24c4-kube-api-access-bxkb7\") pod \"glance-db-sync-wrz6p\" (UID: \"3e95033f-725f-4784-995c-ec7a3b9c24c4\") " pod="openstack/glance-db-sync-wrz6p" Feb 19 22:56:32 crc kubenswrapper[4795]: I0219 22:56:32.465960 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e95033f-725f-4784-995c-ec7a3b9c24c4-combined-ca-bundle\") pod \"glance-db-sync-wrz6p\" (UID: \"3e95033f-725f-4784-995c-ec7a3b9c24c4\") " pod="openstack/glance-db-sync-wrz6p" Feb 19 22:56:32 crc kubenswrapper[4795]: I0219 22:56:32.465992 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3e95033f-725f-4784-995c-ec7a3b9c24c4-db-sync-config-data\") pod \"glance-db-sync-wrz6p\" (UID: \"3e95033f-725f-4784-995c-ec7a3b9c24c4\") " pod="openstack/glance-db-sync-wrz6p" Feb 19 22:56:32 crc kubenswrapper[4795]: I0219 22:56:32.466012 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e95033f-725f-4784-995c-ec7a3b9c24c4-config-data\") pod \"glance-db-sync-wrz6p\" (UID: \"3e95033f-725f-4784-995c-ec7a3b9c24c4\") " pod="openstack/glance-db-sync-wrz6p" Feb 
19 22:56:32 crc kubenswrapper[4795]: I0219 22:56:32.471454 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3e95033f-725f-4784-995c-ec7a3b9c24c4-db-sync-config-data\") pod \"glance-db-sync-wrz6p\" (UID: \"3e95033f-725f-4784-995c-ec7a3b9c24c4\") " pod="openstack/glance-db-sync-wrz6p" Feb 19 22:56:32 crc kubenswrapper[4795]: I0219 22:56:32.471630 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e95033f-725f-4784-995c-ec7a3b9c24c4-config-data\") pod \"glance-db-sync-wrz6p\" (UID: \"3e95033f-725f-4784-995c-ec7a3b9c24c4\") " pod="openstack/glance-db-sync-wrz6p" Feb 19 22:56:32 crc kubenswrapper[4795]: I0219 22:56:32.480340 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e95033f-725f-4784-995c-ec7a3b9c24c4-combined-ca-bundle\") pod \"glance-db-sync-wrz6p\" (UID: \"3e95033f-725f-4784-995c-ec7a3b9c24c4\") " pod="openstack/glance-db-sync-wrz6p" Feb 19 22:56:32 crc kubenswrapper[4795]: I0219 22:56:32.481388 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxkb7\" (UniqueName: \"kubernetes.io/projected/3e95033f-725f-4784-995c-ec7a3b9c24c4-kube-api-access-bxkb7\") pod \"glance-db-sync-wrz6p\" (UID: \"3e95033f-725f-4784-995c-ec7a3b9c24c4\") " pod="openstack/glance-db-sync-wrz6p" Feb 19 22:56:32 crc kubenswrapper[4795]: I0219 22:56:32.579902 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-wrz6p" Feb 19 22:56:33 crc kubenswrapper[4795]: I0219 22:56:33.106842 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-wrz6p"] Feb 19 22:56:33 crc kubenswrapper[4795]: I0219 22:56:33.316147 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-wrz6p" event={"ID":"3e95033f-725f-4784-995c-ec7a3b9c24c4","Type":"ContainerStarted","Data":"a0569c78b70090d7fd22b175f603c4cf568abb23fa79c1c5271376d6515bd29a"} Feb 19 22:56:34 crc kubenswrapper[4795]: I0219 22:56:34.334069 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-wrz6p" event={"ID":"3e95033f-725f-4784-995c-ec7a3b9c24c4","Type":"ContainerStarted","Data":"b7a9bfd4cc9f1de56acd0e84791ea820651eb87fe9629808f92a9d7ea31fba08"} Feb 19 22:56:34 crc kubenswrapper[4795]: I0219 22:56:34.370614 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-wrz6p" podStartSLOduration=2.37059393 podStartE2EDuration="2.37059393s" podCreationTimestamp="2026-02-19 22:56:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:56:34.367917642 +0000 UTC m=+5305.560435536" watchObservedRunningTime="2026-02-19 22:56:34.37059393 +0000 UTC m=+5305.563111794" Feb 19 22:56:37 crc kubenswrapper[4795]: I0219 22:56:37.356534 4795 generic.go:334] "Generic (PLEG): container finished" podID="3e95033f-725f-4784-995c-ec7a3b9c24c4" containerID="b7a9bfd4cc9f1de56acd0e84791ea820651eb87fe9629808f92a9d7ea31fba08" exitCode=0 Feb 19 22:56:37 crc kubenswrapper[4795]: I0219 22:56:37.356636 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-wrz6p" event={"ID":"3e95033f-725f-4784-995c-ec7a3b9c24c4","Type":"ContainerDied","Data":"b7a9bfd4cc9f1de56acd0e84791ea820651eb87fe9629808f92a9d7ea31fba08"} Feb 19 22:56:38 crc kubenswrapper[4795]: I0219 
22:56:38.796287 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-wrz6p" Feb 19 22:56:38 crc kubenswrapper[4795]: I0219 22:56:38.902652 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e95033f-725f-4784-995c-ec7a3b9c24c4-combined-ca-bundle\") pod \"3e95033f-725f-4784-995c-ec7a3b9c24c4\" (UID: \"3e95033f-725f-4784-995c-ec7a3b9c24c4\") " Feb 19 22:56:38 crc kubenswrapper[4795]: I0219 22:56:38.902705 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e95033f-725f-4784-995c-ec7a3b9c24c4-config-data\") pod \"3e95033f-725f-4784-995c-ec7a3b9c24c4\" (UID: \"3e95033f-725f-4784-995c-ec7a3b9c24c4\") " Feb 19 22:56:38 crc kubenswrapper[4795]: I0219 22:56:38.902794 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxkb7\" (UniqueName: \"kubernetes.io/projected/3e95033f-725f-4784-995c-ec7a3b9c24c4-kube-api-access-bxkb7\") pod \"3e95033f-725f-4784-995c-ec7a3b9c24c4\" (UID: \"3e95033f-725f-4784-995c-ec7a3b9c24c4\") " Feb 19 22:56:38 crc kubenswrapper[4795]: I0219 22:56:38.902928 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3e95033f-725f-4784-995c-ec7a3b9c24c4-db-sync-config-data\") pod \"3e95033f-725f-4784-995c-ec7a3b9c24c4\" (UID: \"3e95033f-725f-4784-995c-ec7a3b9c24c4\") " Feb 19 22:56:38 crc kubenswrapper[4795]: I0219 22:56:38.907276 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e95033f-725f-4784-995c-ec7a3b9c24c4-kube-api-access-bxkb7" (OuterVolumeSpecName: "kube-api-access-bxkb7") pod "3e95033f-725f-4784-995c-ec7a3b9c24c4" (UID: "3e95033f-725f-4784-995c-ec7a3b9c24c4"). InnerVolumeSpecName "kube-api-access-bxkb7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:56:38 crc kubenswrapper[4795]: I0219 22:56:38.916262 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e95033f-725f-4784-995c-ec7a3b9c24c4-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "3e95033f-725f-4784-995c-ec7a3b9c24c4" (UID: "3e95033f-725f-4784-995c-ec7a3b9c24c4"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:56:38 crc kubenswrapper[4795]: I0219 22:56:38.925416 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e95033f-725f-4784-995c-ec7a3b9c24c4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3e95033f-725f-4784-995c-ec7a3b9c24c4" (UID: "3e95033f-725f-4784-995c-ec7a3b9c24c4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:56:38 crc kubenswrapper[4795]: I0219 22:56:38.949223 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e95033f-725f-4784-995c-ec7a3b9c24c4-config-data" (OuterVolumeSpecName: "config-data") pod "3e95033f-725f-4784-995c-ec7a3b9c24c4" (UID: "3e95033f-725f-4784-995c-ec7a3b9c24c4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.004908 4795 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3e95033f-725f-4784-995c-ec7a3b9c24c4-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.004942 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e95033f-725f-4784-995c-ec7a3b9c24c4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.004956 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e95033f-725f-4784-995c-ec7a3b9c24c4-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.005008 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxkb7\" (UniqueName: \"kubernetes.io/projected/3e95033f-725f-4784-995c-ec7a3b9c24c4-kube-api-access-bxkb7\") on node \"crc\" DevicePath \"\"" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.380210 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-wrz6p" event={"ID":"3e95033f-725f-4784-995c-ec7a3b9c24c4","Type":"ContainerDied","Data":"a0569c78b70090d7fd22b175f603c4cf568abb23fa79c1c5271376d6515bd29a"} Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.380312 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0569c78b70090d7fd22b175f603c4cf568abb23fa79c1c5271376d6515bd29a" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.380232 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-wrz6p" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.675465 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 22:56:39 crc kubenswrapper[4795]: E0219 22:56:39.676255 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e95033f-725f-4784-995c-ec7a3b9c24c4" containerName="glance-db-sync" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.676279 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e95033f-725f-4784-995c-ec7a3b9c24c4" containerName="glance-db-sync" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.676504 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e95033f-725f-4784-995c-ec7a3b9c24c4" containerName="glance-db-sync" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.677511 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.688892 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.689304 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.689337 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-8gwsm" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.689543 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.710903 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.765903 4795 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-679b7b556f-vb5wj"] Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.767210 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-679b7b556f-vb5wj" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.778703 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-679b7b556f-vb5wj"] Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.819280 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13c7f8fe-962d-47d0-9607-f121e0c6a38d-logs\") pod \"glance-default-external-api-0\" (UID: \"13c7f8fe-962d-47d0-9607-f121e0c6a38d\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.819361 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13c7f8fe-962d-47d0-9607-f121e0c6a38d-config-data\") pod \"glance-default-external-api-0\" (UID: \"13c7f8fe-962d-47d0-9607-f121e0c6a38d\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.819384 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/13c7f8fe-962d-47d0-9607-f121e0c6a38d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"13c7f8fe-962d-47d0-9607-f121e0c6a38d\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.819406 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/13c7f8fe-962d-47d0-9607-f121e0c6a38d-ceph\") pod \"glance-default-external-api-0\" (UID: \"13c7f8fe-962d-47d0-9607-f121e0c6a38d\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:39 crc 
kubenswrapper[4795]: I0219 22:56:39.819424 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pdqx\" (UniqueName: \"kubernetes.io/projected/13c7f8fe-962d-47d0-9607-f121e0c6a38d-kube-api-access-8pdqx\") pod \"glance-default-external-api-0\" (UID: \"13c7f8fe-962d-47d0-9607-f121e0c6a38d\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.819509 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13c7f8fe-962d-47d0-9607-f121e0c6a38d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"13c7f8fe-962d-47d0-9607-f121e0c6a38d\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.819539 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13c7f8fe-962d-47d0-9607-f121e0c6a38d-scripts\") pod \"glance-default-external-api-0\" (UID: \"13c7f8fe-962d-47d0-9607-f121e0c6a38d\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.862852 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.864714 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.869366 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.874179 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.920525 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pdqx\" (UniqueName: \"kubernetes.io/projected/13c7f8fe-962d-47d0-9607-f121e0c6a38d-kube-api-access-8pdqx\") pod \"glance-default-external-api-0\" (UID: \"13c7f8fe-962d-47d0-9607-f121e0c6a38d\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.920574 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/13c7f8fe-962d-47d0-9607-f121e0c6a38d-ceph\") pod \"glance-default-external-api-0\" (UID: \"13c7f8fe-962d-47d0-9607-f121e0c6a38d\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.920635 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13c7f8fe-962d-47d0-9607-f121e0c6a38d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"13c7f8fe-962d-47d0-9607-f121e0c6a38d\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.920658 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13c7f8fe-962d-47d0-9607-f121e0c6a38d-scripts\") pod \"glance-default-external-api-0\" (UID: \"13c7f8fe-962d-47d0-9607-f121e0c6a38d\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:39 crc 
kubenswrapper[4795]: I0219 22:56:39.920727 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1847cbdb-2b75-48d9-ab0c-db5da5a236a4-config\") pod \"dnsmasq-dns-679b7b556f-vb5wj\" (UID: \"1847cbdb-2b75-48d9-ab0c-db5da5a236a4\") " pod="openstack/dnsmasq-dns-679b7b556f-vb5wj" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.920758 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1847cbdb-2b75-48d9-ab0c-db5da5a236a4-ovsdbserver-nb\") pod \"dnsmasq-dns-679b7b556f-vb5wj\" (UID: \"1847cbdb-2b75-48d9-ab0c-db5da5a236a4\") " pod="openstack/dnsmasq-dns-679b7b556f-vb5wj" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.920800 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13c7f8fe-962d-47d0-9607-f121e0c6a38d-logs\") pod \"glance-default-external-api-0\" (UID: \"13c7f8fe-962d-47d0-9607-f121e0c6a38d\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.920841 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1847cbdb-2b75-48d9-ab0c-db5da5a236a4-dns-svc\") pod \"dnsmasq-dns-679b7b556f-vb5wj\" (UID: \"1847cbdb-2b75-48d9-ab0c-db5da5a236a4\") " pod="openstack/dnsmasq-dns-679b7b556f-vb5wj" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.920869 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1847cbdb-2b75-48d9-ab0c-db5da5a236a4-ovsdbserver-sb\") pod \"dnsmasq-dns-679b7b556f-vb5wj\" (UID: \"1847cbdb-2b75-48d9-ab0c-db5da5a236a4\") " pod="openstack/dnsmasq-dns-679b7b556f-vb5wj" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 
22:56:39.920898 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmn7q\" (UniqueName: \"kubernetes.io/projected/1847cbdb-2b75-48d9-ab0c-db5da5a236a4-kube-api-access-bmn7q\") pod \"dnsmasq-dns-679b7b556f-vb5wj\" (UID: \"1847cbdb-2b75-48d9-ab0c-db5da5a236a4\") " pod="openstack/dnsmasq-dns-679b7b556f-vb5wj" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.920940 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13c7f8fe-962d-47d0-9607-f121e0c6a38d-config-data\") pod \"glance-default-external-api-0\" (UID: \"13c7f8fe-962d-47d0-9607-f121e0c6a38d\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.920965 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/13c7f8fe-962d-47d0-9607-f121e0c6a38d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"13c7f8fe-962d-47d0-9607-f121e0c6a38d\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.921588 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/13c7f8fe-962d-47d0-9607-f121e0c6a38d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"13c7f8fe-962d-47d0-9607-f121e0c6a38d\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.924845 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13c7f8fe-962d-47d0-9607-f121e0c6a38d-logs\") pod \"glance-default-external-api-0\" (UID: \"13c7f8fe-962d-47d0-9607-f121e0c6a38d\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.925019 4795 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/13c7f8fe-962d-47d0-9607-f121e0c6a38d-ceph\") pod \"glance-default-external-api-0\" (UID: \"13c7f8fe-962d-47d0-9607-f121e0c6a38d\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.927064 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13c7f8fe-962d-47d0-9607-f121e0c6a38d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"13c7f8fe-962d-47d0-9607-f121e0c6a38d\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.932894 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13c7f8fe-962d-47d0-9607-f121e0c6a38d-scripts\") pod \"glance-default-external-api-0\" (UID: \"13c7f8fe-962d-47d0-9607-f121e0c6a38d\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.935885 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13c7f8fe-962d-47d0-9607-f121e0c6a38d-config-data\") pod \"glance-default-external-api-0\" (UID: \"13c7f8fe-962d-47d0-9607-f121e0c6a38d\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.943522 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pdqx\" (UniqueName: \"kubernetes.io/projected/13c7f8fe-962d-47d0-9607-f121e0c6a38d-kube-api-access-8pdqx\") pod \"glance-default-external-api-0\" (UID: \"13c7f8fe-962d-47d0-9607-f121e0c6a38d\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:39 crc kubenswrapper[4795]: I0219 22:56:39.997935 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 22:56:40 crc kubenswrapper[4795]: I0219 22:56:40.022128 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/388d272a-cffa-4321-ac91-648accbf6930-config-data\") pod \"glance-default-internal-api-0\" (UID: \"388d272a-cffa-4321-ac91-648accbf6930\") " pod="openstack/glance-default-internal-api-0" Feb 19 22:56:40 crc kubenswrapper[4795]: I0219 22:56:40.022214 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdwdh\" (UniqueName: \"kubernetes.io/projected/388d272a-cffa-4321-ac91-648accbf6930-kube-api-access-kdwdh\") pod \"glance-default-internal-api-0\" (UID: \"388d272a-cffa-4321-ac91-648accbf6930\") " pod="openstack/glance-default-internal-api-0" Feb 19 22:56:40 crc kubenswrapper[4795]: I0219 22:56:40.022244 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/388d272a-cffa-4321-ac91-648accbf6930-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"388d272a-cffa-4321-ac91-648accbf6930\") " pod="openstack/glance-default-internal-api-0" Feb 19 22:56:40 crc kubenswrapper[4795]: I0219 22:56:40.022269 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/388d272a-cffa-4321-ac91-648accbf6930-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"388d272a-cffa-4321-ac91-648accbf6930\") " pod="openstack/glance-default-internal-api-0" Feb 19 22:56:40 crc kubenswrapper[4795]: I0219 22:56:40.022353 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/388d272a-cffa-4321-ac91-648accbf6930-ceph\") pod 
\"glance-default-internal-api-0\" (UID: \"388d272a-cffa-4321-ac91-648accbf6930\") " pod="openstack/glance-default-internal-api-0" Feb 19 22:56:40 crc kubenswrapper[4795]: I0219 22:56:40.022385 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1847cbdb-2b75-48d9-ab0c-db5da5a236a4-config\") pod \"dnsmasq-dns-679b7b556f-vb5wj\" (UID: \"1847cbdb-2b75-48d9-ab0c-db5da5a236a4\") " pod="openstack/dnsmasq-dns-679b7b556f-vb5wj" Feb 19 22:56:40 crc kubenswrapper[4795]: I0219 22:56:40.022407 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/388d272a-cffa-4321-ac91-648accbf6930-logs\") pod \"glance-default-internal-api-0\" (UID: \"388d272a-cffa-4321-ac91-648accbf6930\") " pod="openstack/glance-default-internal-api-0" Feb 19 22:56:40 crc kubenswrapper[4795]: I0219 22:56:40.022428 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1847cbdb-2b75-48d9-ab0c-db5da5a236a4-ovsdbserver-nb\") pod \"dnsmasq-dns-679b7b556f-vb5wj\" (UID: \"1847cbdb-2b75-48d9-ab0c-db5da5a236a4\") " pod="openstack/dnsmasq-dns-679b7b556f-vb5wj" Feb 19 22:56:40 crc kubenswrapper[4795]: I0219 22:56:40.022462 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/388d272a-cffa-4321-ac91-648accbf6930-scripts\") pod \"glance-default-internal-api-0\" (UID: \"388d272a-cffa-4321-ac91-648accbf6930\") " pod="openstack/glance-default-internal-api-0" Feb 19 22:56:40 crc kubenswrapper[4795]: I0219 22:56:40.022496 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1847cbdb-2b75-48d9-ab0c-db5da5a236a4-dns-svc\") pod \"dnsmasq-dns-679b7b556f-vb5wj\" (UID: 
\"1847cbdb-2b75-48d9-ab0c-db5da5a236a4\") " pod="openstack/dnsmasq-dns-679b7b556f-vb5wj" Feb 19 22:56:40 crc kubenswrapper[4795]: I0219 22:56:40.022526 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1847cbdb-2b75-48d9-ab0c-db5da5a236a4-ovsdbserver-sb\") pod \"dnsmasq-dns-679b7b556f-vb5wj\" (UID: \"1847cbdb-2b75-48d9-ab0c-db5da5a236a4\") " pod="openstack/dnsmasq-dns-679b7b556f-vb5wj" Feb 19 22:56:40 crc kubenswrapper[4795]: I0219 22:56:40.022551 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmn7q\" (UniqueName: \"kubernetes.io/projected/1847cbdb-2b75-48d9-ab0c-db5da5a236a4-kube-api-access-bmn7q\") pod \"dnsmasq-dns-679b7b556f-vb5wj\" (UID: \"1847cbdb-2b75-48d9-ab0c-db5da5a236a4\") " pod="openstack/dnsmasq-dns-679b7b556f-vb5wj" Feb 19 22:56:40 crc kubenswrapper[4795]: I0219 22:56:40.023464 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1847cbdb-2b75-48d9-ab0c-db5da5a236a4-config\") pod \"dnsmasq-dns-679b7b556f-vb5wj\" (UID: \"1847cbdb-2b75-48d9-ab0c-db5da5a236a4\") " pod="openstack/dnsmasq-dns-679b7b556f-vb5wj" Feb 19 22:56:40 crc kubenswrapper[4795]: I0219 22:56:40.023647 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1847cbdb-2b75-48d9-ab0c-db5da5a236a4-ovsdbserver-nb\") pod \"dnsmasq-dns-679b7b556f-vb5wj\" (UID: \"1847cbdb-2b75-48d9-ab0c-db5da5a236a4\") " pod="openstack/dnsmasq-dns-679b7b556f-vb5wj" Feb 19 22:56:40 crc kubenswrapper[4795]: I0219 22:56:40.024216 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1847cbdb-2b75-48d9-ab0c-db5da5a236a4-dns-svc\") pod \"dnsmasq-dns-679b7b556f-vb5wj\" (UID: \"1847cbdb-2b75-48d9-ab0c-db5da5a236a4\") " pod="openstack/dnsmasq-dns-679b7b556f-vb5wj" Feb 
19 22:56:40 crc kubenswrapper[4795]: I0219 22:56:40.025282 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1847cbdb-2b75-48d9-ab0c-db5da5a236a4-ovsdbserver-sb\") pod \"dnsmasq-dns-679b7b556f-vb5wj\" (UID: \"1847cbdb-2b75-48d9-ab0c-db5da5a236a4\") " pod="openstack/dnsmasq-dns-679b7b556f-vb5wj" Feb 19 22:56:40 crc kubenswrapper[4795]: I0219 22:56:40.040987 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmn7q\" (UniqueName: \"kubernetes.io/projected/1847cbdb-2b75-48d9-ab0c-db5da5a236a4-kube-api-access-bmn7q\") pod \"dnsmasq-dns-679b7b556f-vb5wj\" (UID: \"1847cbdb-2b75-48d9-ab0c-db5da5a236a4\") " pod="openstack/dnsmasq-dns-679b7b556f-vb5wj" Feb 19 22:56:40 crc kubenswrapper[4795]: I0219 22:56:40.085494 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-679b7b556f-vb5wj" Feb 19 22:56:40 crc kubenswrapper[4795]: I0219 22:56:40.124252 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/388d272a-cffa-4321-ac91-648accbf6930-config-data\") pod \"glance-default-internal-api-0\" (UID: \"388d272a-cffa-4321-ac91-648accbf6930\") " pod="openstack/glance-default-internal-api-0" Feb 19 22:56:40 crc kubenswrapper[4795]: I0219 22:56:40.124542 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdwdh\" (UniqueName: \"kubernetes.io/projected/388d272a-cffa-4321-ac91-648accbf6930-kube-api-access-kdwdh\") pod \"glance-default-internal-api-0\" (UID: \"388d272a-cffa-4321-ac91-648accbf6930\") " pod="openstack/glance-default-internal-api-0" Feb 19 22:56:40 crc kubenswrapper[4795]: I0219 22:56:40.124979 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/388d272a-cffa-4321-ac91-648accbf6930-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"388d272a-cffa-4321-ac91-648accbf6930\") " pod="openstack/glance-default-internal-api-0" Feb 19 22:56:40 crc kubenswrapper[4795]: I0219 22:56:40.128163 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/388d272a-cffa-4321-ac91-648accbf6930-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"388d272a-cffa-4321-ac91-648accbf6930\") " pod="openstack/glance-default-internal-api-0" Feb 19 22:56:40 crc kubenswrapper[4795]: I0219 22:56:40.128722 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/388d272a-cffa-4321-ac91-648accbf6930-ceph\") pod \"glance-default-internal-api-0\" (UID: \"388d272a-cffa-4321-ac91-648accbf6930\") " pod="openstack/glance-default-internal-api-0" Feb 19 22:56:40 crc kubenswrapper[4795]: I0219 22:56:40.128754 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/388d272a-cffa-4321-ac91-648accbf6930-logs\") pod \"glance-default-internal-api-0\" (UID: \"388d272a-cffa-4321-ac91-648accbf6930\") " pod="openstack/glance-default-internal-api-0" Feb 19 22:56:40 crc kubenswrapper[4795]: I0219 22:56:40.128865 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/388d272a-cffa-4321-ac91-648accbf6930-scripts\") pod \"glance-default-internal-api-0\" (UID: \"388d272a-cffa-4321-ac91-648accbf6930\") " pod="openstack/glance-default-internal-api-0" Feb 19 22:56:40 crc kubenswrapper[4795]: I0219 22:56:40.129254 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/388d272a-cffa-4321-ac91-648accbf6930-httpd-run\") pod \"glance-default-internal-api-0\" (UID: 
\"388d272a-cffa-4321-ac91-648accbf6930\") " pod="openstack/glance-default-internal-api-0" Feb 19 22:56:40 crc kubenswrapper[4795]: I0219 22:56:40.129564 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/388d272a-cffa-4321-ac91-648accbf6930-logs\") pod \"glance-default-internal-api-0\" (UID: \"388d272a-cffa-4321-ac91-648accbf6930\") " pod="openstack/glance-default-internal-api-0" Feb 19 22:56:40 crc kubenswrapper[4795]: I0219 22:56:40.133794 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/388d272a-cffa-4321-ac91-648accbf6930-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"388d272a-cffa-4321-ac91-648accbf6930\") " pod="openstack/glance-default-internal-api-0" Feb 19 22:56:40 crc kubenswrapper[4795]: I0219 22:56:40.137156 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/388d272a-cffa-4321-ac91-648accbf6930-ceph\") pod \"glance-default-internal-api-0\" (UID: \"388d272a-cffa-4321-ac91-648accbf6930\") " pod="openstack/glance-default-internal-api-0" Feb 19 22:56:40 crc kubenswrapper[4795]: I0219 22:56:40.142373 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/388d272a-cffa-4321-ac91-648accbf6930-config-data\") pod \"glance-default-internal-api-0\" (UID: \"388d272a-cffa-4321-ac91-648accbf6930\") " pod="openstack/glance-default-internal-api-0" Feb 19 22:56:40 crc kubenswrapper[4795]: I0219 22:56:40.144320 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/388d272a-cffa-4321-ac91-648accbf6930-scripts\") pod \"glance-default-internal-api-0\" (UID: \"388d272a-cffa-4321-ac91-648accbf6930\") " pod="openstack/glance-default-internal-api-0" Feb 19 22:56:40 crc kubenswrapper[4795]: I0219 
22:56:40.146330 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdwdh\" (UniqueName: \"kubernetes.io/projected/388d272a-cffa-4321-ac91-648accbf6930-kube-api-access-kdwdh\") pod \"glance-default-internal-api-0\" (UID: \"388d272a-cffa-4321-ac91-648accbf6930\") " pod="openstack/glance-default-internal-api-0" Feb 19 22:56:40 crc kubenswrapper[4795]: I0219 22:56:40.306058 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 22:56:40 crc kubenswrapper[4795]: I0219 22:56:40.658692 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-679b7b556f-vb5wj"] Feb 19 22:56:40 crc kubenswrapper[4795]: I0219 22:56:40.691109 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 22:56:40 crc kubenswrapper[4795]: I0219 22:56:40.926929 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 22:56:40 crc kubenswrapper[4795]: I0219 22:56:40.986642 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 22:56:40 crc kubenswrapper[4795]: W0219 22:56:40.998104 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod388d272a_cffa_4321_ac91_648accbf6930.slice/crio-1345bf8f0b6f81825f1c42cd0e670dbc13d5673dca7862d996402d988d2684ef WatchSource:0}: Error finding container 1345bf8f0b6f81825f1c42cd0e670dbc13d5673dca7862d996402d988d2684ef: Status 404 returned error can't find the container with id 1345bf8f0b6f81825f1c42cd0e670dbc13d5673dca7862d996402d988d2684ef Feb 19 22:56:41 crc kubenswrapper[4795]: I0219 22:56:41.405785 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"13c7f8fe-962d-47d0-9607-f121e0c6a38d","Type":"ContainerStarted","Data":"d9fe5a9f4a9227e75e14084dc6966e809b6508ab2246c7b847a3dec944648a77"} Feb 19 22:56:41 crc kubenswrapper[4795]: I0219 22:56:41.406279 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"13c7f8fe-962d-47d0-9607-f121e0c6a38d","Type":"ContainerStarted","Data":"cd0b6e7125b0cc4a617c794275fd47b2d7fa67576e3ce216320de639f679c6fb"} Feb 19 22:56:41 crc kubenswrapper[4795]: I0219 22:56:41.415312 4795 generic.go:334] "Generic (PLEG): container finished" podID="1847cbdb-2b75-48d9-ab0c-db5da5a236a4" containerID="23858aaac28bb5c70674b0ae3bbc7222b23e412fd7f763759a9bb0ad432d3261" exitCode=0 Feb 19 22:56:41 crc kubenswrapper[4795]: I0219 22:56:41.415418 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-679b7b556f-vb5wj" event={"ID":"1847cbdb-2b75-48d9-ab0c-db5da5a236a4","Type":"ContainerDied","Data":"23858aaac28bb5c70674b0ae3bbc7222b23e412fd7f763759a9bb0ad432d3261"} Feb 19 22:56:41 crc kubenswrapper[4795]: I0219 22:56:41.415467 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-679b7b556f-vb5wj" event={"ID":"1847cbdb-2b75-48d9-ab0c-db5da5a236a4","Type":"ContainerStarted","Data":"7922c7a8658b3cbc38083041c7e26029e3e327c9354e7a22ada01242a334b1d4"} Feb 19 22:56:41 crc kubenswrapper[4795]: I0219 22:56:41.418518 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"388d272a-cffa-4321-ac91-648accbf6930","Type":"ContainerStarted","Data":"1345bf8f0b6f81825f1c42cd0e670dbc13d5673dca7862d996402d988d2684ef"} Feb 19 22:56:42 crc kubenswrapper[4795]: I0219 22:56:42.428352 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-679b7b556f-vb5wj" event={"ID":"1847cbdb-2b75-48d9-ab0c-db5da5a236a4","Type":"ContainerStarted","Data":"dc882b0d9e65af862350aae0c80a9f7a4b7f8f9b438717336c96a4725a891917"} Feb 19 22:56:42 
crc kubenswrapper[4795]: I0219 22:56:42.429010 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-679b7b556f-vb5wj" Feb 19 22:56:42 crc kubenswrapper[4795]: I0219 22:56:42.430250 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"388d272a-cffa-4321-ac91-648accbf6930","Type":"ContainerStarted","Data":"20a6c485ee4b259fa052bf00a6fec15b69bf0c15e0886c219363551842ab8a35"} Feb 19 22:56:42 crc kubenswrapper[4795]: I0219 22:56:42.430282 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"388d272a-cffa-4321-ac91-648accbf6930","Type":"ContainerStarted","Data":"b7489a68eadfa92edc35cbc01e8e4e0cfcfa9cab7990a08055f841923330ab0f"} Feb 19 22:56:42 crc kubenswrapper[4795]: I0219 22:56:42.431944 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"13c7f8fe-962d-47d0-9607-f121e0c6a38d","Type":"ContainerStarted","Data":"c340315ba1c4cfdce5cc416d9cd197881acf99fddddd73953974503401762d14"} Feb 19 22:56:42 crc kubenswrapper[4795]: I0219 22:56:42.432276 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="13c7f8fe-962d-47d0-9607-f121e0c6a38d" containerName="glance-httpd" containerID="cri-o://c340315ba1c4cfdce5cc416d9cd197881acf99fddddd73953974503401762d14" gracePeriod=30 Feb 19 22:56:42 crc kubenswrapper[4795]: I0219 22:56:42.432280 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="13c7f8fe-962d-47d0-9607-f121e0c6a38d" containerName="glance-log" containerID="cri-o://d9fe5a9f4a9227e75e14084dc6966e809b6508ab2246c7b847a3dec944648a77" gracePeriod=30 Feb 19 22:56:42 crc kubenswrapper[4795]: I0219 22:56:42.454979 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/dnsmasq-dns-679b7b556f-vb5wj" podStartSLOduration=3.454960967 podStartE2EDuration="3.454960967s" podCreationTimestamp="2026-02-19 22:56:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:56:42.448643386 +0000 UTC m=+5313.641161280" watchObservedRunningTime="2026-02-19 22:56:42.454960967 +0000 UTC m=+5313.647478831" Feb 19 22:56:42 crc kubenswrapper[4795]: I0219 22:56:42.477687 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.477666715 podStartE2EDuration="3.477666715s" podCreationTimestamp="2026-02-19 22:56:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:56:42.475180612 +0000 UTC m=+5313.667698476" watchObservedRunningTime="2026-02-19 22:56:42.477666715 +0000 UTC m=+5313.670184579" Feb 19 22:56:42 crc kubenswrapper[4795]: I0219 22:56:42.500059 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.500034894 podStartE2EDuration="3.500034894s" podCreationTimestamp="2026-02-19 22:56:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:56:42.493267852 +0000 UTC m=+5313.685785776" watchObservedRunningTime="2026-02-19 22:56:42.500034894 +0000 UTC m=+5313.692552758" Feb 19 22:56:42 crc kubenswrapper[4795]: I0219 22:56:42.713109 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.046790 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.196604 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8pdqx\" (UniqueName: \"kubernetes.io/projected/13c7f8fe-962d-47d0-9607-f121e0c6a38d-kube-api-access-8pdqx\") pod \"13c7f8fe-962d-47d0-9607-f121e0c6a38d\" (UID: \"13c7f8fe-962d-47d0-9607-f121e0c6a38d\") " Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.196680 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/13c7f8fe-962d-47d0-9607-f121e0c6a38d-ceph\") pod \"13c7f8fe-962d-47d0-9607-f121e0c6a38d\" (UID: \"13c7f8fe-962d-47d0-9607-f121e0c6a38d\") " Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.196730 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/13c7f8fe-962d-47d0-9607-f121e0c6a38d-httpd-run\") pod \"13c7f8fe-962d-47d0-9607-f121e0c6a38d\" (UID: \"13c7f8fe-962d-47d0-9607-f121e0c6a38d\") " Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.196750 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13c7f8fe-962d-47d0-9607-f121e0c6a38d-scripts\") pod \"13c7f8fe-962d-47d0-9607-f121e0c6a38d\" (UID: \"13c7f8fe-962d-47d0-9607-f121e0c6a38d\") " Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.196827 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13c7f8fe-962d-47d0-9607-f121e0c6a38d-combined-ca-bundle\") pod \"13c7f8fe-962d-47d0-9607-f121e0c6a38d\" (UID: \"13c7f8fe-962d-47d0-9607-f121e0c6a38d\") " Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.196843 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/13c7f8fe-962d-47d0-9607-f121e0c6a38d-logs\") pod \"13c7f8fe-962d-47d0-9607-f121e0c6a38d\" (UID: \"13c7f8fe-962d-47d0-9607-f121e0c6a38d\") " Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.196891 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13c7f8fe-962d-47d0-9607-f121e0c6a38d-config-data\") pod \"13c7f8fe-962d-47d0-9607-f121e0c6a38d\" (UID: \"13c7f8fe-962d-47d0-9607-f121e0c6a38d\") " Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.197333 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13c7f8fe-962d-47d0-9607-f121e0c6a38d-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "13c7f8fe-962d-47d0-9607-f121e0c6a38d" (UID: "13c7f8fe-962d-47d0-9607-f121e0c6a38d"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.197510 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13c7f8fe-962d-47d0-9607-f121e0c6a38d-logs" (OuterVolumeSpecName: "logs") pod "13c7f8fe-962d-47d0-9607-f121e0c6a38d" (UID: "13c7f8fe-962d-47d0-9607-f121e0c6a38d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.202868 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13c7f8fe-962d-47d0-9607-f121e0c6a38d-kube-api-access-8pdqx" (OuterVolumeSpecName: "kube-api-access-8pdqx") pod "13c7f8fe-962d-47d0-9607-f121e0c6a38d" (UID: "13c7f8fe-962d-47d0-9607-f121e0c6a38d"). InnerVolumeSpecName "kube-api-access-8pdqx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.216763 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13c7f8fe-962d-47d0-9607-f121e0c6a38d-ceph" (OuterVolumeSpecName: "ceph") pod "13c7f8fe-962d-47d0-9607-f121e0c6a38d" (UID: "13c7f8fe-962d-47d0-9607-f121e0c6a38d"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.216888 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13c7f8fe-962d-47d0-9607-f121e0c6a38d-scripts" (OuterVolumeSpecName: "scripts") pod "13c7f8fe-962d-47d0-9607-f121e0c6a38d" (UID: "13c7f8fe-962d-47d0-9607-f121e0c6a38d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.227297 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13c7f8fe-962d-47d0-9607-f121e0c6a38d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "13c7f8fe-962d-47d0-9607-f121e0c6a38d" (UID: "13c7f8fe-962d-47d0-9607-f121e0c6a38d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.251359 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13c7f8fe-962d-47d0-9607-f121e0c6a38d-config-data" (OuterVolumeSpecName: "config-data") pod "13c7f8fe-962d-47d0-9607-f121e0c6a38d" (UID: "13c7f8fe-962d-47d0-9607-f121e0c6a38d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.299187 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13c7f8fe-962d-47d0-9607-f121e0c6a38d-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.299222 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8pdqx\" (UniqueName: \"kubernetes.io/projected/13c7f8fe-962d-47d0-9607-f121e0c6a38d-kube-api-access-8pdqx\") on node \"crc\" DevicePath \"\"" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.299233 4795 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/13c7f8fe-962d-47d0-9607-f121e0c6a38d-ceph\") on node \"crc\" DevicePath \"\"" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.299240 4795 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/13c7f8fe-962d-47d0-9607-f121e0c6a38d-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.299250 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13c7f8fe-962d-47d0-9607-f121e0c6a38d-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.299260 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13c7f8fe-962d-47d0-9607-f121e0c6a38d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.299269 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13c7f8fe-962d-47d0-9607-f121e0c6a38d-logs\") on node \"crc\" DevicePath \"\"" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.441449 4795 generic.go:334] "Generic (PLEG): container finished" 
podID="13c7f8fe-962d-47d0-9607-f121e0c6a38d" containerID="c340315ba1c4cfdce5cc416d9cd197881acf99fddddd73953974503401762d14" exitCode=0 Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.441491 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.441520 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"13c7f8fe-962d-47d0-9607-f121e0c6a38d","Type":"ContainerDied","Data":"c340315ba1c4cfdce5cc416d9cd197881acf99fddddd73953974503401762d14"} Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.441565 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"13c7f8fe-962d-47d0-9607-f121e0c6a38d","Type":"ContainerDied","Data":"d9fe5a9f4a9227e75e14084dc6966e809b6508ab2246c7b847a3dec944648a77"} Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.441496 4795 generic.go:334] "Generic (PLEG): container finished" podID="13c7f8fe-962d-47d0-9607-f121e0c6a38d" containerID="d9fe5a9f4a9227e75e14084dc6966e809b6508ab2246c7b847a3dec944648a77" exitCode=143 Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.441587 4795 scope.go:117] "RemoveContainer" containerID="c340315ba1c4cfdce5cc416d9cd197881acf99fddddd73953974503401762d14" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.441684 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"13c7f8fe-962d-47d0-9607-f121e0c6a38d","Type":"ContainerDied","Data":"cd0b6e7125b0cc4a617c794275fd47b2d7fa67576e3ce216320de639f679c6fb"} Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.463781 4795 scope.go:117] "RemoveContainer" containerID="d9fe5a9f4a9227e75e14084dc6966e809b6508ab2246c7b847a3dec944648a77" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.477140 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/glance-default-external-api-0"] Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.495291 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.526355 4795 scope.go:117] "RemoveContainer" containerID="c340315ba1c4cfdce5cc416d9cd197881acf99fddddd73953974503401762d14" Feb 19 22:56:43 crc kubenswrapper[4795]: E0219 22:56:43.527422 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c340315ba1c4cfdce5cc416d9cd197881acf99fddddd73953974503401762d14\": container with ID starting with c340315ba1c4cfdce5cc416d9cd197881acf99fddddd73953974503401762d14 not found: ID does not exist" containerID="c340315ba1c4cfdce5cc416d9cd197881acf99fddddd73953974503401762d14" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.527477 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c340315ba1c4cfdce5cc416d9cd197881acf99fddddd73953974503401762d14"} err="failed to get container status \"c340315ba1c4cfdce5cc416d9cd197881acf99fddddd73953974503401762d14\": rpc error: code = NotFound desc = could not find container \"c340315ba1c4cfdce5cc416d9cd197881acf99fddddd73953974503401762d14\": container with ID starting with c340315ba1c4cfdce5cc416d9cd197881acf99fddddd73953974503401762d14 not found: ID does not exist" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.527506 4795 scope.go:117] "RemoveContainer" containerID="d9fe5a9f4a9227e75e14084dc6966e809b6508ab2246c7b847a3dec944648a77" Feb 19 22:56:43 crc kubenswrapper[4795]: E0219 22:56:43.528134 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9fe5a9f4a9227e75e14084dc6966e809b6508ab2246c7b847a3dec944648a77\": container with ID starting with d9fe5a9f4a9227e75e14084dc6966e809b6508ab2246c7b847a3dec944648a77 not found: ID 
does not exist" containerID="d9fe5a9f4a9227e75e14084dc6966e809b6508ab2246c7b847a3dec944648a77" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.528228 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9fe5a9f4a9227e75e14084dc6966e809b6508ab2246c7b847a3dec944648a77"} err="failed to get container status \"d9fe5a9f4a9227e75e14084dc6966e809b6508ab2246c7b847a3dec944648a77\": rpc error: code = NotFound desc = could not find container \"d9fe5a9f4a9227e75e14084dc6966e809b6508ab2246c7b847a3dec944648a77\": container with ID starting with d9fe5a9f4a9227e75e14084dc6966e809b6508ab2246c7b847a3dec944648a77 not found: ID does not exist" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.528264 4795 scope.go:117] "RemoveContainer" containerID="c340315ba1c4cfdce5cc416d9cd197881acf99fddddd73953974503401762d14" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.528626 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c340315ba1c4cfdce5cc416d9cd197881acf99fddddd73953974503401762d14"} err="failed to get container status \"c340315ba1c4cfdce5cc416d9cd197881acf99fddddd73953974503401762d14\": rpc error: code = NotFound desc = could not find container \"c340315ba1c4cfdce5cc416d9cd197881acf99fddddd73953974503401762d14\": container with ID starting with c340315ba1c4cfdce5cc416d9cd197881acf99fddddd73953974503401762d14 not found: ID does not exist" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.528648 4795 scope.go:117] "RemoveContainer" containerID="d9fe5a9f4a9227e75e14084dc6966e809b6508ab2246c7b847a3dec944648a77" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.529045 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9fe5a9f4a9227e75e14084dc6966e809b6508ab2246c7b847a3dec944648a77"} err="failed to get container status \"d9fe5a9f4a9227e75e14084dc6966e809b6508ab2246c7b847a3dec944648a77\": rpc error: code = 
NotFound desc = could not find container \"d9fe5a9f4a9227e75e14084dc6966e809b6508ab2246c7b847a3dec944648a77\": container with ID starting with d9fe5a9f4a9227e75e14084dc6966e809b6508ab2246c7b847a3dec944648a77 not found: ID does not exist" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.531902 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13c7f8fe-962d-47d0-9607-f121e0c6a38d" path="/var/lib/kubelet/pods/13c7f8fe-962d-47d0-9607-f121e0c6a38d/volumes" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.532775 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 22:56:43 crc kubenswrapper[4795]: E0219 22:56:43.533079 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13c7f8fe-962d-47d0-9607-f121e0c6a38d" containerName="glance-httpd" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.533100 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="13c7f8fe-962d-47d0-9607-f121e0c6a38d" containerName="glance-httpd" Feb 19 22:56:43 crc kubenswrapper[4795]: E0219 22:56:43.533119 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13c7f8fe-962d-47d0-9607-f121e0c6a38d" containerName="glance-log" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.533127 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="13c7f8fe-962d-47d0-9607-f121e0c6a38d" containerName="glance-log" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.533378 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="13c7f8fe-962d-47d0-9607-f121e0c6a38d" containerName="glance-log" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.533404 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="13c7f8fe-962d-47d0-9607-f121e0c6a38d" containerName="glance-httpd" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.534884 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 
22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.534996 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.538755 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.706546 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-ceph\") pod \"glance-default-external-api-0\" (UID: \"5ba19509-98fd-4ae4-b9ab-673c27ab8e85\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.706661 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-scripts\") pod \"glance-default-external-api-0\" (UID: \"5ba19509-98fd-4ae4-b9ab-673c27ab8e85\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.706755 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5ba19509-98fd-4ae4-b9ab-673c27ab8e85\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.706789 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-logs\") pod \"glance-default-external-api-0\" (UID: \"5ba19509-98fd-4ae4-b9ab-673c27ab8e85\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.706864 4795 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"5ba19509-98fd-4ae4-b9ab-673c27ab8e85\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.707028 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srnpd\" (UniqueName: \"kubernetes.io/projected/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-kube-api-access-srnpd\") pod \"glance-default-external-api-0\" (UID: \"5ba19509-98fd-4ae4-b9ab-673c27ab8e85\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.707057 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-config-data\") pod \"glance-default-external-api-0\" (UID: \"5ba19509-98fd-4ae4-b9ab-673c27ab8e85\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.808342 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-ceph\") pod \"glance-default-external-api-0\" (UID: \"5ba19509-98fd-4ae4-b9ab-673c27ab8e85\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.808467 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-scripts\") pod \"glance-default-external-api-0\" (UID: \"5ba19509-98fd-4ae4-b9ab-673c27ab8e85\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.808540 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5ba19509-98fd-4ae4-b9ab-673c27ab8e85\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.808591 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-logs\") pod \"glance-default-external-api-0\" (UID: \"5ba19509-98fd-4ae4-b9ab-673c27ab8e85\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.808615 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"5ba19509-98fd-4ae4-b9ab-673c27ab8e85\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.808719 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srnpd\" (UniqueName: \"kubernetes.io/projected/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-kube-api-access-srnpd\") pod \"glance-default-external-api-0\" (UID: \"5ba19509-98fd-4ae4-b9ab-673c27ab8e85\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.808765 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-config-data\") pod \"glance-default-external-api-0\" (UID: \"5ba19509-98fd-4ae4-b9ab-673c27ab8e85\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.809832 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-logs\") pod \"glance-default-external-api-0\" (UID: \"5ba19509-98fd-4ae4-b9ab-673c27ab8e85\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.809856 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5ba19509-98fd-4ae4-b9ab-673c27ab8e85\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.812871 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-ceph\") pod \"glance-default-external-api-0\" (UID: \"5ba19509-98fd-4ae4-b9ab-673c27ab8e85\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.813355 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-config-data\") pod \"glance-default-external-api-0\" (UID: \"5ba19509-98fd-4ae4-b9ab-673c27ab8e85\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.813752 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-scripts\") pod \"glance-default-external-api-0\" (UID: \"5ba19509-98fd-4ae4-b9ab-673c27ab8e85\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.816298 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-combined-ca-bundle\") pod 
\"glance-default-external-api-0\" (UID: \"5ba19509-98fd-4ae4-b9ab-673c27ab8e85\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.829030 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srnpd\" (UniqueName: \"kubernetes.io/projected/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-kube-api-access-srnpd\") pod \"glance-default-external-api-0\" (UID: \"5ba19509-98fd-4ae4-b9ab-673c27ab8e85\") " pod="openstack/glance-default-external-api-0" Feb 19 22:56:43 crc kubenswrapper[4795]: I0219 22:56:43.859953 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 22:56:44 crc kubenswrapper[4795]: I0219 22:56:44.348080 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 22:56:44 crc kubenswrapper[4795]: W0219 22:56:44.349328 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ba19509_98fd_4ae4_b9ab_673c27ab8e85.slice/crio-80a194040524fc06f644957b6fb02ec46c953aefd62778b8fadace961f88d1ac WatchSource:0}: Error finding container 80a194040524fc06f644957b6fb02ec46c953aefd62778b8fadace961f88d1ac: Status 404 returned error can't find the container with id 80a194040524fc06f644957b6fb02ec46c953aefd62778b8fadace961f88d1ac Feb 19 22:56:44 crc kubenswrapper[4795]: I0219 22:56:44.450895 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5ba19509-98fd-4ae4-b9ab-673c27ab8e85","Type":"ContainerStarted","Data":"80a194040524fc06f644957b6fb02ec46c953aefd62778b8fadace961f88d1ac"} Feb 19 22:56:44 crc kubenswrapper[4795]: I0219 22:56:44.451067 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="388d272a-cffa-4321-ac91-648accbf6930" containerName="glance-log" 
containerID="cri-o://b7489a68eadfa92edc35cbc01e8e4e0cfcfa9cab7990a08055f841923330ab0f" gracePeriod=30 Feb 19 22:56:44 crc kubenswrapper[4795]: I0219 22:56:44.451204 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="388d272a-cffa-4321-ac91-648accbf6930" containerName="glance-httpd" containerID="cri-o://20a6c485ee4b259fa052bf00a6fec15b69bf0c15e0886c219363551842ab8a35" gracePeriod=30 Feb 19 22:56:44 crc kubenswrapper[4795]: I0219 22:56:44.924872 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.028761 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdwdh\" (UniqueName: \"kubernetes.io/projected/388d272a-cffa-4321-ac91-648accbf6930-kube-api-access-kdwdh\") pod \"388d272a-cffa-4321-ac91-648accbf6930\" (UID: \"388d272a-cffa-4321-ac91-648accbf6930\") " Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.028845 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/388d272a-cffa-4321-ac91-648accbf6930-combined-ca-bundle\") pod \"388d272a-cffa-4321-ac91-648accbf6930\" (UID: \"388d272a-cffa-4321-ac91-648accbf6930\") " Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.028911 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/388d272a-cffa-4321-ac91-648accbf6930-ceph\") pod \"388d272a-cffa-4321-ac91-648accbf6930\" (UID: \"388d272a-cffa-4321-ac91-648accbf6930\") " Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.028998 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/388d272a-cffa-4321-ac91-648accbf6930-httpd-run\") pod 
\"388d272a-cffa-4321-ac91-648accbf6930\" (UID: \"388d272a-cffa-4321-ac91-648accbf6930\") " Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.029037 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/388d272a-cffa-4321-ac91-648accbf6930-config-data\") pod \"388d272a-cffa-4321-ac91-648accbf6930\" (UID: \"388d272a-cffa-4321-ac91-648accbf6930\") " Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.029061 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/388d272a-cffa-4321-ac91-648accbf6930-logs\") pod \"388d272a-cffa-4321-ac91-648accbf6930\" (UID: \"388d272a-cffa-4321-ac91-648accbf6930\") " Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.029085 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/388d272a-cffa-4321-ac91-648accbf6930-scripts\") pod \"388d272a-cffa-4321-ac91-648accbf6930\" (UID: \"388d272a-cffa-4321-ac91-648accbf6930\") " Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.030011 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/388d272a-cffa-4321-ac91-648accbf6930-logs" (OuterVolumeSpecName: "logs") pod "388d272a-cffa-4321-ac91-648accbf6930" (UID: "388d272a-cffa-4321-ac91-648accbf6930"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.030320 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/388d272a-cffa-4321-ac91-648accbf6930-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "388d272a-cffa-4321-ac91-648accbf6930" (UID: "388d272a-cffa-4321-ac91-648accbf6930"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.034860 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/388d272a-cffa-4321-ac91-648accbf6930-scripts" (OuterVolumeSpecName: "scripts") pod "388d272a-cffa-4321-ac91-648accbf6930" (UID: "388d272a-cffa-4321-ac91-648accbf6930"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.034954 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/388d272a-cffa-4321-ac91-648accbf6930-ceph" (OuterVolumeSpecName: "ceph") pod "388d272a-cffa-4321-ac91-648accbf6930" (UID: "388d272a-cffa-4321-ac91-648accbf6930"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.035593 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/388d272a-cffa-4321-ac91-648accbf6930-kube-api-access-kdwdh" (OuterVolumeSpecName: "kube-api-access-kdwdh") pod "388d272a-cffa-4321-ac91-648accbf6930" (UID: "388d272a-cffa-4321-ac91-648accbf6930"). InnerVolumeSpecName "kube-api-access-kdwdh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.057900 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/388d272a-cffa-4321-ac91-648accbf6930-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "388d272a-cffa-4321-ac91-648accbf6930" (UID: "388d272a-cffa-4321-ac91-648accbf6930"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.091672 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/388d272a-cffa-4321-ac91-648accbf6930-config-data" (OuterVolumeSpecName: "config-data") pod "388d272a-cffa-4321-ac91-648accbf6930" (UID: "388d272a-cffa-4321-ac91-648accbf6930"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.130888 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdwdh\" (UniqueName: \"kubernetes.io/projected/388d272a-cffa-4321-ac91-648accbf6930-kube-api-access-kdwdh\") on node \"crc\" DevicePath \"\"" Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.130925 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/388d272a-cffa-4321-ac91-648accbf6930-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.130934 4795 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/388d272a-cffa-4321-ac91-648accbf6930-ceph\") on node \"crc\" DevicePath \"\"" Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.130943 4795 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/388d272a-cffa-4321-ac91-648accbf6930-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.130952 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/388d272a-cffa-4321-ac91-648accbf6930-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.130959 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/388d272a-cffa-4321-ac91-648accbf6930-logs\") on node \"crc\" DevicePath \"\"" Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.130966 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/388d272a-cffa-4321-ac91-648accbf6930-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.463777 4795 generic.go:334] "Generic (PLEG): container finished" podID="388d272a-cffa-4321-ac91-648accbf6930" containerID="20a6c485ee4b259fa052bf00a6fec15b69bf0c15e0886c219363551842ab8a35" exitCode=0 Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.464151 4795 generic.go:334] "Generic (PLEG): container finished" podID="388d272a-cffa-4321-ac91-648accbf6930" containerID="b7489a68eadfa92edc35cbc01e8e4e0cfcfa9cab7990a08055f841923330ab0f" exitCode=143 Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.464197 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.464194 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"388d272a-cffa-4321-ac91-648accbf6930","Type":"ContainerDied","Data":"20a6c485ee4b259fa052bf00a6fec15b69bf0c15e0886c219363551842ab8a35"} Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.464278 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"388d272a-cffa-4321-ac91-648accbf6930","Type":"ContainerDied","Data":"b7489a68eadfa92edc35cbc01e8e4e0cfcfa9cab7990a08055f841923330ab0f"} Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.464299 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"388d272a-cffa-4321-ac91-648accbf6930","Type":"ContainerDied","Data":"1345bf8f0b6f81825f1c42cd0e670dbc13d5673dca7862d996402d988d2684ef"} 
Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.464324 4795 scope.go:117] "RemoveContainer" containerID="20a6c485ee4b259fa052bf00a6fec15b69bf0c15e0886c219363551842ab8a35" Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.467924 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5ba19509-98fd-4ae4-b9ab-673c27ab8e85","Type":"ContainerStarted","Data":"f4e052417461008a87fe32a59b836852c3e0dfbe3a7cafdcd8e66c0079a03648"} Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.467963 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5ba19509-98fd-4ae4-b9ab-673c27ab8e85","Type":"ContainerStarted","Data":"e79326eb305f46105ac73051eebe6a8a28eedc36e23a4992a65d558674bb4f58"} Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.491567 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=2.491542103 podStartE2EDuration="2.491542103s" podCreationTimestamp="2026-02-19 22:56:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:56:45.486096484 +0000 UTC m=+5316.678614368" watchObservedRunningTime="2026-02-19 22:56:45.491542103 +0000 UTC m=+5316.684059967" Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.503063 4795 scope.go:117] "RemoveContainer" containerID="b7489a68eadfa92edc35cbc01e8e4e0cfcfa9cab7990a08055f841923330ab0f" Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.509672 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.524860 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.532862 4795 scope.go:117] "RemoveContainer" 
containerID="20a6c485ee4b259fa052bf00a6fec15b69bf0c15e0886c219363551842ab8a35" Feb 19 22:56:45 crc kubenswrapper[4795]: E0219 22:56:45.533389 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20a6c485ee4b259fa052bf00a6fec15b69bf0c15e0886c219363551842ab8a35\": container with ID starting with 20a6c485ee4b259fa052bf00a6fec15b69bf0c15e0886c219363551842ab8a35 not found: ID does not exist" containerID="20a6c485ee4b259fa052bf00a6fec15b69bf0c15e0886c219363551842ab8a35" Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.533419 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20a6c485ee4b259fa052bf00a6fec15b69bf0c15e0886c219363551842ab8a35"} err="failed to get container status \"20a6c485ee4b259fa052bf00a6fec15b69bf0c15e0886c219363551842ab8a35\": rpc error: code = NotFound desc = could not find container \"20a6c485ee4b259fa052bf00a6fec15b69bf0c15e0886c219363551842ab8a35\": container with ID starting with 20a6c485ee4b259fa052bf00a6fec15b69bf0c15e0886c219363551842ab8a35 not found: ID does not exist" Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.533438 4795 scope.go:117] "RemoveContainer" containerID="b7489a68eadfa92edc35cbc01e8e4e0cfcfa9cab7990a08055f841923330ab0f" Feb 19 22:56:45 crc kubenswrapper[4795]: E0219 22:56:45.533937 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7489a68eadfa92edc35cbc01e8e4e0cfcfa9cab7990a08055f841923330ab0f\": container with ID starting with b7489a68eadfa92edc35cbc01e8e4e0cfcfa9cab7990a08055f841923330ab0f not found: ID does not exist" containerID="b7489a68eadfa92edc35cbc01e8e4e0cfcfa9cab7990a08055f841923330ab0f" Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.533963 4795 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b7489a68eadfa92edc35cbc01e8e4e0cfcfa9cab7990a08055f841923330ab0f"} err="failed to get container status \"b7489a68eadfa92edc35cbc01e8e4e0cfcfa9cab7990a08055f841923330ab0f\": rpc error: code = NotFound desc = could not find container \"b7489a68eadfa92edc35cbc01e8e4e0cfcfa9cab7990a08055f841923330ab0f\": container with ID starting with b7489a68eadfa92edc35cbc01e8e4e0cfcfa9cab7990a08055f841923330ab0f not found: ID does not exist" Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.533979 4795 scope.go:117] "RemoveContainer" containerID="20a6c485ee4b259fa052bf00a6fec15b69bf0c15e0886c219363551842ab8a35" Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.534256 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20a6c485ee4b259fa052bf00a6fec15b69bf0c15e0886c219363551842ab8a35"} err="failed to get container status \"20a6c485ee4b259fa052bf00a6fec15b69bf0c15e0886c219363551842ab8a35\": rpc error: code = NotFound desc = could not find container \"20a6c485ee4b259fa052bf00a6fec15b69bf0c15e0886c219363551842ab8a35\": container with ID starting with 20a6c485ee4b259fa052bf00a6fec15b69bf0c15e0886c219363551842ab8a35 not found: ID does not exist" Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.534282 4795 scope.go:117] "RemoveContainer" containerID="b7489a68eadfa92edc35cbc01e8e4e0cfcfa9cab7990a08055f841923330ab0f" Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.534672 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7489a68eadfa92edc35cbc01e8e4e0cfcfa9cab7990a08055f841923330ab0f"} err="failed to get container status \"b7489a68eadfa92edc35cbc01e8e4e0cfcfa9cab7990a08055f841923330ab0f\": rpc error: code = NotFound desc = could not find container \"b7489a68eadfa92edc35cbc01e8e4e0cfcfa9cab7990a08055f841923330ab0f\": container with ID starting with b7489a68eadfa92edc35cbc01e8e4e0cfcfa9cab7990a08055f841923330ab0f not found: ID does not 
exist" Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.544089 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 22:56:45 crc kubenswrapper[4795]: E0219 22:56:45.545577 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="388d272a-cffa-4321-ac91-648accbf6930" containerName="glance-log" Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.545612 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="388d272a-cffa-4321-ac91-648accbf6930" containerName="glance-log" Feb 19 22:56:45 crc kubenswrapper[4795]: E0219 22:56:45.545628 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="388d272a-cffa-4321-ac91-648accbf6930" containerName="glance-httpd" Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.545635 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="388d272a-cffa-4321-ac91-648accbf6930" containerName="glance-httpd" Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.545789 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="388d272a-cffa-4321-ac91-648accbf6930" containerName="glance-log" Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.545806 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="388d272a-cffa-4321-ac91-648accbf6930" containerName="glance-httpd" Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.548907 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.553620 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.554311 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.640251 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc\") " pod="openstack/glance-default-internal-api-0" Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.640304 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljtbh\" (UniqueName: \"kubernetes.io/projected/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-kube-api-access-ljtbh\") pod \"glance-default-internal-api-0\" (UID: \"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc\") " pod="openstack/glance-default-internal-api-0" Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.640341 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-ceph\") pod \"glance-default-internal-api-0\" (UID: \"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc\") " pod="openstack/glance-default-internal-api-0" Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.640516 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc\") " 
pod="openstack/glance-default-internal-api-0"
Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.640630 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-logs\") pod \"glance-default-internal-api-0\" (UID: \"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc\") " pod="openstack/glance-default-internal-api-0"
Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.640678 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc\") " pod="openstack/glance-default-internal-api-0"
Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.640772 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc\") " pod="openstack/glance-default-internal-api-0"
Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.742370 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc\") " pod="openstack/glance-default-internal-api-0"
Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.742462 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc\") " pod="openstack/glance-default-internal-api-0"
Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.742488 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljtbh\" (UniqueName: \"kubernetes.io/projected/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-kube-api-access-ljtbh\") pod \"glance-default-internal-api-0\" (UID: \"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc\") " pod="openstack/glance-default-internal-api-0"
Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.742522 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-ceph\") pod \"glance-default-internal-api-0\" (UID: \"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc\") " pod="openstack/glance-default-internal-api-0"
Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.742547 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc\") " pod="openstack/glance-default-internal-api-0"
Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.742577 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-logs\") pod \"glance-default-internal-api-0\" (UID: \"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc\") " pod="openstack/glance-default-internal-api-0"
Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.742594 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc\") " pod="openstack/glance-default-internal-api-0"
Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.744551 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc\") " pod="openstack/glance-default-internal-api-0"
Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.744943 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-logs\") pod \"glance-default-internal-api-0\" (UID: \"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc\") " pod="openstack/glance-default-internal-api-0"
Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.746914 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-ceph\") pod \"glance-default-internal-api-0\" (UID: \"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc\") " pod="openstack/glance-default-internal-api-0"
Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.747396 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc\") " pod="openstack/glance-default-internal-api-0"
Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.747494 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc\") " pod="openstack/glance-default-internal-api-0"
Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.760865 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc\") " pod="openstack/glance-default-internal-api-0"
Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.770137 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljtbh\" (UniqueName: \"kubernetes.io/projected/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-kube-api-access-ljtbh\") pod \"glance-default-internal-api-0\" (UID: \"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc\") " pod="openstack/glance-default-internal-api-0"
Feb 19 22:56:45 crc kubenswrapper[4795]: I0219 22:56:45.863771 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 19 22:56:46 crc kubenswrapper[4795]: I0219 22:56:46.409369 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 19 22:56:46 crc kubenswrapper[4795]: I0219 22:56:46.482056 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc","Type":"ContainerStarted","Data":"5b24832724357e9fe3cd524ab47d5c8ea237d7ac8d26ead3ec336002073cfe70"}
Feb 19 22:56:47 crc kubenswrapper[4795]: I0219 22:56:47.490103 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc","Type":"ContainerStarted","Data":"1b16e4f15dda00c47ebd1d3a052f48e0eb759759e8c054d1aebe9ed9f2d0750b"}
Feb 19 22:56:47 crc kubenswrapper[4795]: I0219 22:56:47.490647 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc","Type":"ContainerStarted","Data":"2881a41d4e90cea66bb06cc15746e467948606aaeb4ab6378ba25eec8d520511"}
Feb 19 22:56:47 crc kubenswrapper[4795]: I0219 22:56:47.517285 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=2.517263849 podStartE2EDuration="2.517263849s" podCreationTimestamp="2026-02-19 22:56:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:56:47.50472692 +0000 UTC m=+5318.697244794" watchObservedRunningTime="2026-02-19 22:56:47.517263849 +0000 UTC m=+5318.709781723"
Feb 19 22:56:47 crc kubenswrapper[4795]: I0219 22:56:47.527204 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="388d272a-cffa-4321-ac91-648accbf6930" path="/var/lib/kubelet/pods/388d272a-cffa-4321-ac91-648accbf6930/volumes"
Feb 19 22:56:50 crc kubenswrapper[4795]: I0219 22:56:50.087729 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-679b7b556f-vb5wj"
Feb 19 22:56:50 crc kubenswrapper[4795]: I0219 22:56:50.203576 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d5d85df8f-clq65"]
Feb 19 22:56:50 crc kubenswrapper[4795]: I0219 22:56:50.203939 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7d5d85df8f-clq65" podUID="8e5a1fbd-4617-434e-8719-12b16bc88b98" containerName="dnsmasq-dns" containerID="cri-o://db3ac61fe2bbba8663fc1cafd3f78c8dde189ed879980b2ca339cc53d3ebb615" gracePeriod=10
Feb 19 22:56:50 crc kubenswrapper[4795]: I0219 22:56:50.518658 4795 generic.go:334] "Generic (PLEG): container finished" podID="8e5a1fbd-4617-434e-8719-12b16bc88b98" containerID="db3ac61fe2bbba8663fc1cafd3f78c8dde189ed879980b2ca339cc53d3ebb615" exitCode=0
Feb 19 22:56:50 crc kubenswrapper[4795]: I0219 22:56:50.518982 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d5d85df8f-clq65" event={"ID":"8e5a1fbd-4617-434e-8719-12b16bc88b98","Type":"ContainerDied","Data":"db3ac61fe2bbba8663fc1cafd3f78c8dde189ed879980b2ca339cc53d3ebb615"}
Feb 19 22:56:50 crc kubenswrapper[4795]: I0219 22:56:50.700593 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d5d85df8f-clq65"
Feb 19 22:56:50 crc kubenswrapper[4795]: I0219 22:56:50.841091 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8e5a1fbd-4617-434e-8719-12b16bc88b98-ovsdbserver-nb\") pod \"8e5a1fbd-4617-434e-8719-12b16bc88b98\" (UID: \"8e5a1fbd-4617-434e-8719-12b16bc88b98\") "
Feb 19 22:56:50 crc kubenswrapper[4795]: I0219 22:56:50.841281 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e5a1fbd-4617-434e-8719-12b16bc88b98-config\") pod \"8e5a1fbd-4617-434e-8719-12b16bc88b98\" (UID: \"8e5a1fbd-4617-434e-8719-12b16bc88b98\") "
Feb 19 22:56:50 crc kubenswrapper[4795]: I0219 22:56:50.841457 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e5a1fbd-4617-434e-8719-12b16bc88b98-dns-svc\") pod \"8e5a1fbd-4617-434e-8719-12b16bc88b98\" (UID: \"8e5a1fbd-4617-434e-8719-12b16bc88b98\") "
Feb 19 22:56:50 crc kubenswrapper[4795]: I0219 22:56:50.841581 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8e5a1fbd-4617-434e-8719-12b16bc88b98-ovsdbserver-sb\") pod \"8e5a1fbd-4617-434e-8719-12b16bc88b98\" (UID: \"8e5a1fbd-4617-434e-8719-12b16bc88b98\") "
Feb 19 22:56:50 crc kubenswrapper[4795]: I0219 22:56:50.841737 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cl457\" (UniqueName: \"kubernetes.io/projected/8e5a1fbd-4617-434e-8719-12b16bc88b98-kube-api-access-cl457\") pod \"8e5a1fbd-4617-434e-8719-12b16bc88b98\" (UID: \"8e5a1fbd-4617-434e-8719-12b16bc88b98\") "
Feb 19 22:56:50 crc kubenswrapper[4795]: I0219 22:56:50.864563 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e5a1fbd-4617-434e-8719-12b16bc88b98-kube-api-access-cl457" (OuterVolumeSpecName: "kube-api-access-cl457") pod "8e5a1fbd-4617-434e-8719-12b16bc88b98" (UID: "8e5a1fbd-4617-434e-8719-12b16bc88b98"). InnerVolumeSpecName "kube-api-access-cl457". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 22:56:50 crc kubenswrapper[4795]: I0219 22:56:50.884880 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e5a1fbd-4617-434e-8719-12b16bc88b98-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8e5a1fbd-4617-434e-8719-12b16bc88b98" (UID: "8e5a1fbd-4617-434e-8719-12b16bc88b98"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 22:56:50 crc kubenswrapper[4795]: I0219 22:56:50.885613 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e5a1fbd-4617-434e-8719-12b16bc88b98-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8e5a1fbd-4617-434e-8719-12b16bc88b98" (UID: "8e5a1fbd-4617-434e-8719-12b16bc88b98"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 22:56:50 crc kubenswrapper[4795]: I0219 22:56:50.904212 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e5a1fbd-4617-434e-8719-12b16bc88b98-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8e5a1fbd-4617-434e-8719-12b16bc88b98" (UID: "8e5a1fbd-4617-434e-8719-12b16bc88b98"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 22:56:50 crc kubenswrapper[4795]: I0219 22:56:50.910709 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e5a1fbd-4617-434e-8719-12b16bc88b98-config" (OuterVolumeSpecName: "config") pod "8e5a1fbd-4617-434e-8719-12b16bc88b98" (UID: "8e5a1fbd-4617-434e-8719-12b16bc88b98"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 22:56:50 crc kubenswrapper[4795]: I0219 22:56:50.943666 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8e5a1fbd-4617-434e-8719-12b16bc88b98-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 19 22:56:50 crc kubenswrapper[4795]: I0219 22:56:50.943702 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e5a1fbd-4617-434e-8719-12b16bc88b98-config\") on node \"crc\" DevicePath \"\""
Feb 19 22:56:50 crc kubenswrapper[4795]: I0219 22:56:50.943712 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e5a1fbd-4617-434e-8719-12b16bc88b98-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 19 22:56:50 crc kubenswrapper[4795]: I0219 22:56:50.943720 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8e5a1fbd-4617-434e-8719-12b16bc88b98-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 19 22:56:50 crc kubenswrapper[4795]: I0219 22:56:50.943731 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cl457\" (UniqueName: \"kubernetes.io/projected/8e5a1fbd-4617-434e-8719-12b16bc88b98-kube-api-access-cl457\") on node \"crc\" DevicePath \"\""
Feb 19 22:56:51 crc kubenswrapper[4795]: I0219 22:56:51.533551 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d5d85df8f-clq65" event={"ID":"8e5a1fbd-4617-434e-8719-12b16bc88b98","Type":"ContainerDied","Data":"7217af4cd4eb00396c1a57059f86739ac95943d868ddbc9e0af3ab209b2339ee"}
Feb 19 22:56:51 crc kubenswrapper[4795]: I0219 22:56:51.533615 4795 scope.go:117] "RemoveContainer" containerID="db3ac61fe2bbba8663fc1cafd3f78c8dde189ed879980b2ca339cc53d3ebb615"
Feb 19 22:56:51 crc kubenswrapper[4795]: I0219 22:56:51.533774 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d5d85df8f-clq65"
Feb 19 22:56:51 crc kubenswrapper[4795]: I0219 22:56:51.563805 4795 scope.go:117] "RemoveContainer" containerID="400d407c08a50e5c293b398a82df6a8f79475760e406fca1855e5558b33cb384"
Feb 19 22:56:51 crc kubenswrapper[4795]: I0219 22:56:51.586194 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d5d85df8f-clq65"]
Feb 19 22:56:51 crc kubenswrapper[4795]: I0219 22:56:51.595185 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7d5d85df8f-clq65"]
Feb 19 22:56:53 crc kubenswrapper[4795]: I0219 22:56:53.527418 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e5a1fbd-4617-434e-8719-12b16bc88b98" path="/var/lib/kubelet/pods/8e5a1fbd-4617-434e-8719-12b16bc88b98/volumes"
Feb 19 22:56:53 crc kubenswrapper[4795]: I0219 22:56:53.860819 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Feb 19 22:56:53 crc kubenswrapper[4795]: I0219 22:56:53.861244 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Feb 19 22:56:53 crc kubenswrapper[4795]: I0219 22:56:53.898759 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Feb 19 22:56:53 crc kubenswrapper[4795]: I0219 22:56:53.907198 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Feb 19 22:56:54 crc kubenswrapper[4795]: I0219 22:56:54.567657 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Feb 19 22:56:54 crc kubenswrapper[4795]: I0219 22:56:54.567701 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Feb 19 22:56:55 crc kubenswrapper[4795]: I0219 22:56:55.864920 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Feb 19 22:56:55 crc kubenswrapper[4795]: I0219 22:56:55.864978 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Feb 19 22:56:55 crc kubenswrapper[4795]: I0219 22:56:55.890968 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Feb 19 22:56:55 crc kubenswrapper[4795]: I0219 22:56:55.900216 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Feb 19 22:56:56 crc kubenswrapper[4795]: I0219 22:56:56.584257 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Feb 19 22:56:56 crc kubenswrapper[4795]: I0219 22:56:56.584670 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Feb 19 22:56:56 crc kubenswrapper[4795]: I0219 22:56:56.607607 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Feb 19 22:56:56 crc kubenswrapper[4795]: I0219 22:56:56.607745 4795 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 19 22:56:56 crc kubenswrapper[4795]: I0219 22:56:56.611544 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Feb 19 22:56:58 crc kubenswrapper[4795]: I0219 22:56:58.581708 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Feb 19 22:56:58 crc kubenswrapper[4795]: I0219 22:56:58.588704 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Feb 19 22:57:06 crc kubenswrapper[4795]: I0219 22:57:06.378327 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-w9m97"]
Feb 19 22:57:06 crc kubenswrapper[4795]: E0219 22:57:06.379437 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e5a1fbd-4617-434e-8719-12b16bc88b98" containerName="init"
Feb 19 22:57:06 crc kubenswrapper[4795]: I0219 22:57:06.379457 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e5a1fbd-4617-434e-8719-12b16bc88b98" containerName="init"
Feb 19 22:57:06 crc kubenswrapper[4795]: E0219 22:57:06.379478 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e5a1fbd-4617-434e-8719-12b16bc88b98" containerName="dnsmasq-dns"
Feb 19 22:57:06 crc kubenswrapper[4795]: I0219 22:57:06.379487 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e5a1fbd-4617-434e-8719-12b16bc88b98" containerName="dnsmasq-dns"
Feb 19 22:57:06 crc kubenswrapper[4795]: I0219 22:57:06.379703 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e5a1fbd-4617-434e-8719-12b16bc88b98" containerName="dnsmasq-dns"
Feb 19 22:57:06 crc kubenswrapper[4795]: I0219 22:57:06.380493 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-w9m97"
Feb 19 22:57:06 crc kubenswrapper[4795]: I0219 22:57:06.408026 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-4561-account-create-update-zf5q8"]
Feb 19 22:57:06 crc kubenswrapper[4795]: I0219 22:57:06.409262 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-4561-account-create-update-zf5q8"
Feb 19 22:57:06 crc kubenswrapper[4795]: I0219 22:57:06.411734 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret"
Feb 19 22:57:06 crc kubenswrapper[4795]: I0219 22:57:06.432041 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-w9m97"]
Feb 19 22:57:06 crc kubenswrapper[4795]: I0219 22:57:06.438883 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-4561-account-create-update-zf5q8"]
Feb 19 22:57:06 crc kubenswrapper[4795]: I0219 22:57:06.468379 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eda8a248-0107-4d34-a02b-6dbf30972c64-operator-scripts\") pod \"placement-4561-account-create-update-zf5q8\" (UID: \"eda8a248-0107-4d34-a02b-6dbf30972c64\") " pod="openstack/placement-4561-account-create-update-zf5q8"
Feb 19 22:57:06 crc kubenswrapper[4795]: I0219 22:57:06.468537 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lm798\" (UniqueName: \"kubernetes.io/projected/eda8a248-0107-4d34-a02b-6dbf30972c64-kube-api-access-lm798\") pod \"placement-4561-account-create-update-zf5q8\" (UID: \"eda8a248-0107-4d34-a02b-6dbf30972c64\") " pod="openstack/placement-4561-account-create-update-zf5q8"
Feb 19 22:57:06 crc kubenswrapper[4795]: I0219 22:57:06.468565 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csjxz\" (UniqueName: \"kubernetes.io/projected/fcc96fc8-80e4-4dda-af2e-91390b6af829-kube-api-access-csjxz\") pod \"placement-db-create-w9m97\" (UID: \"fcc96fc8-80e4-4dda-af2e-91390b6af829\") " pod="openstack/placement-db-create-w9m97"
Feb 19 22:57:06 crc kubenswrapper[4795]: I0219 22:57:06.468584 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fcc96fc8-80e4-4dda-af2e-91390b6af829-operator-scripts\") pod \"placement-db-create-w9m97\" (UID: \"fcc96fc8-80e4-4dda-af2e-91390b6af829\") " pod="openstack/placement-db-create-w9m97"
Feb 19 22:57:06 crc kubenswrapper[4795]: I0219 22:57:06.570211 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lm798\" (UniqueName: \"kubernetes.io/projected/eda8a248-0107-4d34-a02b-6dbf30972c64-kube-api-access-lm798\") pod \"placement-4561-account-create-update-zf5q8\" (UID: \"eda8a248-0107-4d34-a02b-6dbf30972c64\") " pod="openstack/placement-4561-account-create-update-zf5q8"
Feb 19 22:57:06 crc kubenswrapper[4795]: I0219 22:57:06.570261 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csjxz\" (UniqueName: \"kubernetes.io/projected/fcc96fc8-80e4-4dda-af2e-91390b6af829-kube-api-access-csjxz\") pod \"placement-db-create-w9m97\" (UID: \"fcc96fc8-80e4-4dda-af2e-91390b6af829\") " pod="openstack/placement-db-create-w9m97"
Feb 19 22:57:06 crc kubenswrapper[4795]: I0219 22:57:06.570282 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fcc96fc8-80e4-4dda-af2e-91390b6af829-operator-scripts\") pod \"placement-db-create-w9m97\" (UID: \"fcc96fc8-80e4-4dda-af2e-91390b6af829\") " pod="openstack/placement-db-create-w9m97"
Feb 19 22:57:06 crc kubenswrapper[4795]: I0219 22:57:06.570432 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eda8a248-0107-4d34-a02b-6dbf30972c64-operator-scripts\") pod \"placement-4561-account-create-update-zf5q8\" (UID: \"eda8a248-0107-4d34-a02b-6dbf30972c64\") " pod="openstack/placement-4561-account-create-update-zf5q8"
Feb 19 22:57:06 crc kubenswrapper[4795]: I0219 22:57:06.573080 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eda8a248-0107-4d34-a02b-6dbf30972c64-operator-scripts\") pod \"placement-4561-account-create-update-zf5q8\" (UID: \"eda8a248-0107-4d34-a02b-6dbf30972c64\") " pod="openstack/placement-4561-account-create-update-zf5q8"
Feb 19 22:57:06 crc kubenswrapper[4795]: I0219 22:57:06.574410 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fcc96fc8-80e4-4dda-af2e-91390b6af829-operator-scripts\") pod \"placement-db-create-w9m97\" (UID: \"fcc96fc8-80e4-4dda-af2e-91390b6af829\") " pod="openstack/placement-db-create-w9m97"
Feb 19 22:57:06 crc kubenswrapper[4795]: I0219 22:57:06.596013 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lm798\" (UniqueName: \"kubernetes.io/projected/eda8a248-0107-4d34-a02b-6dbf30972c64-kube-api-access-lm798\") pod \"placement-4561-account-create-update-zf5q8\" (UID: \"eda8a248-0107-4d34-a02b-6dbf30972c64\") " pod="openstack/placement-4561-account-create-update-zf5q8"
Feb 19 22:57:06 crc kubenswrapper[4795]: I0219 22:57:06.599023 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csjxz\" (UniqueName: \"kubernetes.io/projected/fcc96fc8-80e4-4dda-af2e-91390b6af829-kube-api-access-csjxz\") pod \"placement-db-create-w9m97\" (UID: \"fcc96fc8-80e4-4dda-af2e-91390b6af829\") " pod="openstack/placement-db-create-w9m97"
Feb 19 22:57:06 crc kubenswrapper[4795]: I0219 22:57:06.701370 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-w9m97"
Feb 19 22:57:06 crc kubenswrapper[4795]: I0219 22:57:06.728836 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-4561-account-create-update-zf5q8"
Feb 19 22:57:07 crc kubenswrapper[4795]: I0219 22:57:07.143033 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-w9m97"]
Feb 19 22:57:07 crc kubenswrapper[4795]: W0219 22:57:07.147620 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfcc96fc8_80e4_4dda_af2e_91390b6af829.slice/crio-acdb38f0dceea7aa953861e7f6df090d7821626340d0baa54d270c8c4eabb860 WatchSource:0}: Error finding container acdb38f0dceea7aa953861e7f6df090d7821626340d0baa54d270c8c4eabb860: Status 404 returned error can't find the container with id acdb38f0dceea7aa953861e7f6df090d7821626340d0baa54d270c8c4eabb860
Feb 19 22:57:07 crc kubenswrapper[4795]: I0219 22:57:07.223635 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-4561-account-create-update-zf5q8"]
Feb 19 22:57:07 crc kubenswrapper[4795]: I0219 22:57:07.678612 4795 generic.go:334] "Generic (PLEG): container finished" podID="fcc96fc8-80e4-4dda-af2e-91390b6af829" containerID="a92e1a832062a3ff2c20dc8cd15ae2f139e651b81902a3e2ee439ec6e20b2123" exitCode=0
Feb 19 22:57:07 crc kubenswrapper[4795]: I0219 22:57:07.678663 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-w9m97" event={"ID":"fcc96fc8-80e4-4dda-af2e-91390b6af829","Type":"ContainerDied","Data":"a92e1a832062a3ff2c20dc8cd15ae2f139e651b81902a3e2ee439ec6e20b2123"}
Feb 19 22:57:07 crc kubenswrapper[4795]: I0219 22:57:07.678981 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-w9m97" event={"ID":"fcc96fc8-80e4-4dda-af2e-91390b6af829","Type":"ContainerStarted","Data":"acdb38f0dceea7aa953861e7f6df090d7821626340d0baa54d270c8c4eabb860"}
Feb 19 22:57:07 crc kubenswrapper[4795]: I0219 22:57:07.681213 4795 generic.go:334] "Generic (PLEG): container finished" podID="eda8a248-0107-4d34-a02b-6dbf30972c64" containerID="dbe34d14d5acf3b90d2e3b7465f76c19cc181a6a6717aade5162b1682beb16f4" exitCode=0
Feb 19 22:57:07 crc kubenswrapper[4795]: I0219 22:57:07.681240 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-4561-account-create-update-zf5q8" event={"ID":"eda8a248-0107-4d34-a02b-6dbf30972c64","Type":"ContainerDied","Data":"dbe34d14d5acf3b90d2e3b7465f76c19cc181a6a6717aade5162b1682beb16f4"}
Feb 19 22:57:07 crc kubenswrapper[4795]: I0219 22:57:07.681253 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-4561-account-create-update-zf5q8" event={"ID":"eda8a248-0107-4d34-a02b-6dbf30972c64","Type":"ContainerStarted","Data":"8ca94684f9202aef28c8b84cf7921d5e25e8941ae1b0a05199c45c26a6f9e78a"}
Feb 19 22:57:09 crc kubenswrapper[4795]: I0219 22:57:09.133221 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-4561-account-create-update-zf5q8"
Feb 19 22:57:09 crc kubenswrapper[4795]: I0219 22:57:09.143710 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-w9m97"
Feb 19 22:57:09 crc kubenswrapper[4795]: I0219 22:57:09.317541 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-csjxz\" (UniqueName: \"kubernetes.io/projected/fcc96fc8-80e4-4dda-af2e-91390b6af829-kube-api-access-csjxz\") pod \"fcc96fc8-80e4-4dda-af2e-91390b6af829\" (UID: \"fcc96fc8-80e4-4dda-af2e-91390b6af829\") "
Feb 19 22:57:09 crc kubenswrapper[4795]: I0219 22:57:09.317688 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fcc96fc8-80e4-4dda-af2e-91390b6af829-operator-scripts\") pod \"fcc96fc8-80e4-4dda-af2e-91390b6af829\" (UID: \"fcc96fc8-80e4-4dda-af2e-91390b6af829\") "
Feb 19 22:57:09 crc kubenswrapper[4795]: I0219 22:57:09.317713 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lm798\" (UniqueName: \"kubernetes.io/projected/eda8a248-0107-4d34-a02b-6dbf30972c64-kube-api-access-lm798\") pod \"eda8a248-0107-4d34-a02b-6dbf30972c64\" (UID: \"eda8a248-0107-4d34-a02b-6dbf30972c64\") "
Feb 19 22:57:09 crc kubenswrapper[4795]: I0219 22:57:09.317771 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eda8a248-0107-4d34-a02b-6dbf30972c64-operator-scripts\") pod \"eda8a248-0107-4d34-a02b-6dbf30972c64\" (UID: \"eda8a248-0107-4d34-a02b-6dbf30972c64\") "
Feb 19 22:57:09 crc kubenswrapper[4795]: I0219 22:57:09.318258 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fcc96fc8-80e4-4dda-af2e-91390b6af829-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fcc96fc8-80e4-4dda-af2e-91390b6af829" (UID: "fcc96fc8-80e4-4dda-af2e-91390b6af829"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 22:57:09 crc kubenswrapper[4795]: I0219 22:57:09.318505 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eda8a248-0107-4d34-a02b-6dbf30972c64-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "eda8a248-0107-4d34-a02b-6dbf30972c64" (UID: "eda8a248-0107-4d34-a02b-6dbf30972c64"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 22:57:09 crc kubenswrapper[4795]: I0219 22:57:09.323400 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcc96fc8-80e4-4dda-af2e-91390b6af829-kube-api-access-csjxz" (OuterVolumeSpecName: "kube-api-access-csjxz") pod "fcc96fc8-80e4-4dda-af2e-91390b6af829" (UID: "fcc96fc8-80e4-4dda-af2e-91390b6af829"). InnerVolumeSpecName "kube-api-access-csjxz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 22:57:09 crc kubenswrapper[4795]: I0219 22:57:09.326336 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eda8a248-0107-4d34-a02b-6dbf30972c64-kube-api-access-lm798" (OuterVolumeSpecName: "kube-api-access-lm798") pod "eda8a248-0107-4d34-a02b-6dbf30972c64" (UID: "eda8a248-0107-4d34-a02b-6dbf30972c64"). InnerVolumeSpecName "kube-api-access-lm798". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 22:57:09 crc kubenswrapper[4795]: I0219 22:57:09.420140 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eda8a248-0107-4d34-a02b-6dbf30972c64-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 22:57:09 crc kubenswrapper[4795]: I0219 22:57:09.420573 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-csjxz\" (UniqueName: \"kubernetes.io/projected/fcc96fc8-80e4-4dda-af2e-91390b6af829-kube-api-access-csjxz\") on node \"crc\" DevicePath \"\""
Feb 19 22:57:09 crc kubenswrapper[4795]: I0219 22:57:09.420591 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fcc96fc8-80e4-4dda-af2e-91390b6af829-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 22:57:09 crc kubenswrapper[4795]: I0219 22:57:09.420602 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lm798\" (UniqueName: \"kubernetes.io/projected/eda8a248-0107-4d34-a02b-6dbf30972c64-kube-api-access-lm798\") on node \"crc\" DevicePath \"\""
Feb 19 22:57:09 crc kubenswrapper[4795]: I0219 22:57:09.700302 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-4561-account-create-update-zf5q8" event={"ID":"eda8a248-0107-4d34-a02b-6dbf30972c64","Type":"ContainerDied","Data":"8ca94684f9202aef28c8b84cf7921d5e25e8941ae1b0a05199c45c26a6f9e78a"}
Feb 19 22:57:09 crc kubenswrapper[4795]: I0219 22:57:09.700352 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8ca94684f9202aef28c8b84cf7921d5e25e8941ae1b0a05199c45c26a6f9e78a"
Feb 19 22:57:09 crc kubenswrapper[4795]: I0219 22:57:09.700434 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-4561-account-create-update-zf5q8"
Feb 19 22:57:09 crc kubenswrapper[4795]: I0219 22:57:09.702683 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-w9m97" event={"ID":"fcc96fc8-80e4-4dda-af2e-91390b6af829","Type":"ContainerDied","Data":"acdb38f0dceea7aa953861e7f6df090d7821626340d0baa54d270c8c4eabb860"}
Feb 19 22:57:09 crc kubenswrapper[4795]: I0219 22:57:09.702713 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="acdb38f0dceea7aa953861e7f6df090d7821626340d0baa54d270c8c4eabb860"
Feb 19 22:57:09 crc kubenswrapper[4795]: I0219 22:57:09.702768 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-w9m97"
Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.708079 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5756cc6d89-g8t6k"]
Feb 19 22:57:11 crc kubenswrapper[4795]: E0219 22:57:11.709815 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eda8a248-0107-4d34-a02b-6dbf30972c64" containerName="mariadb-account-create-update"
Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.709913 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="eda8a248-0107-4d34-a02b-6dbf30972c64" containerName="mariadb-account-create-update"
Feb 19 22:57:11 crc kubenswrapper[4795]: E0219 22:57:11.710007 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcc96fc8-80e4-4dda-af2e-91390b6af829" containerName="mariadb-database-create"
Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.710085 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcc96fc8-80e4-4dda-af2e-91390b6af829" containerName="mariadb-database-create"
Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.710367 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcc96fc8-80e4-4dda-af2e-91390b6af829" containerName="mariadb-database-create"
Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.710440 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="eda8a248-0107-4d34-a02b-6dbf30972c64" containerName="mariadb-account-create-update"
Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.711660 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5756cc6d89-g8t6k"
Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.735206 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5756cc6d89-g8t6k"]
Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.739742 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-srnhx"]
Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.742270 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-srnhx"
Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.745955 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.746055 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.746187 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-lzv2r"
Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.772382 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-srnhx"]
Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.862745 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7286d7ba-7f8c-4f40-a18a-d29af788c344-combined-ca-bundle\") pod \"placement-db-sync-srnhx\" (UID: \"7286d7ba-7f8c-4f40-a18a-d29af788c344\") " pod="openstack/placement-db-sync-srnhx"
Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.862790 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/19cb42f3-600f-4079-9dcd-6ba8697d5778-ovsdbserver-sb\") pod \"dnsmasq-dns-5756cc6d89-g8t6k\" (UID: \"19cb42f3-600f-4079-9dcd-6ba8697d5778\") " pod="openstack/dnsmasq-dns-5756cc6d89-g8t6k"
Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.862815 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nz6z5\" (UniqueName: \"kubernetes.io/projected/19cb42f3-600f-4079-9dcd-6ba8697d5778-kube-api-access-nz6z5\") pod \"dnsmasq-dns-5756cc6d89-g8t6k\" (UID: \"19cb42f3-600f-4079-9dcd-6ba8697d5778\") " pod="openstack/dnsmasq-dns-5756cc6d89-g8t6k"
Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.862989 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7286d7ba-7f8c-4f40-a18a-d29af788c344-scripts\") pod \"placement-db-sync-srnhx\" (UID: \"7286d7ba-7f8c-4f40-a18a-d29af788c344\") " pod="openstack/placement-db-sync-srnhx"
Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.863038 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7286d7ba-7f8c-4f40-a18a-d29af788c344-logs\") pod \"placement-db-sync-srnhx\" (UID: \"7286d7ba-7f8c-4f40-a18a-d29af788c344\") " pod="openstack/placement-db-sync-srnhx"
Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.863118 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/19cb42f3-600f-4079-9dcd-6ba8697d5778-dns-svc\") pod \"dnsmasq-dns-5756cc6d89-g8t6k\" (UID: \"19cb42f3-600f-4079-9dcd-6ba8697d5778\") "
pod="openstack/dnsmasq-dns-5756cc6d89-g8t6k" Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.863310 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ss4fc\" (UniqueName: \"kubernetes.io/projected/7286d7ba-7f8c-4f40-a18a-d29af788c344-kube-api-access-ss4fc\") pod \"placement-db-sync-srnhx\" (UID: \"7286d7ba-7f8c-4f40-a18a-d29af788c344\") " pod="openstack/placement-db-sync-srnhx" Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.863664 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7286d7ba-7f8c-4f40-a18a-d29af788c344-config-data\") pod \"placement-db-sync-srnhx\" (UID: \"7286d7ba-7f8c-4f40-a18a-d29af788c344\") " pod="openstack/placement-db-sync-srnhx" Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.863827 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/19cb42f3-600f-4079-9dcd-6ba8697d5778-ovsdbserver-nb\") pod \"dnsmasq-dns-5756cc6d89-g8t6k\" (UID: \"19cb42f3-600f-4079-9dcd-6ba8697d5778\") " pod="openstack/dnsmasq-dns-5756cc6d89-g8t6k" Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.863919 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19cb42f3-600f-4079-9dcd-6ba8697d5778-config\") pod \"dnsmasq-dns-5756cc6d89-g8t6k\" (UID: \"19cb42f3-600f-4079-9dcd-6ba8697d5778\") " pod="openstack/dnsmasq-dns-5756cc6d89-g8t6k" Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.966092 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7286d7ba-7f8c-4f40-a18a-d29af788c344-config-data\") pod \"placement-db-sync-srnhx\" (UID: \"7286d7ba-7f8c-4f40-a18a-d29af788c344\") " 
pod="openstack/placement-db-sync-srnhx" Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.966151 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/19cb42f3-600f-4079-9dcd-6ba8697d5778-ovsdbserver-nb\") pod \"dnsmasq-dns-5756cc6d89-g8t6k\" (UID: \"19cb42f3-600f-4079-9dcd-6ba8697d5778\") " pod="openstack/dnsmasq-dns-5756cc6d89-g8t6k" Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.966570 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19cb42f3-600f-4079-9dcd-6ba8697d5778-config\") pod \"dnsmasq-dns-5756cc6d89-g8t6k\" (UID: \"19cb42f3-600f-4079-9dcd-6ba8697d5778\") " pod="openstack/dnsmasq-dns-5756cc6d89-g8t6k" Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.967062 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/19cb42f3-600f-4079-9dcd-6ba8697d5778-ovsdbserver-nb\") pod \"dnsmasq-dns-5756cc6d89-g8t6k\" (UID: \"19cb42f3-600f-4079-9dcd-6ba8697d5778\") " pod="openstack/dnsmasq-dns-5756cc6d89-g8t6k" Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.967709 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19cb42f3-600f-4079-9dcd-6ba8697d5778-config\") pod \"dnsmasq-dns-5756cc6d89-g8t6k\" (UID: \"19cb42f3-600f-4079-9dcd-6ba8697d5778\") " pod="openstack/dnsmasq-dns-5756cc6d89-g8t6k" Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.967790 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7286d7ba-7f8c-4f40-a18a-d29af788c344-combined-ca-bundle\") pod \"placement-db-sync-srnhx\" (UID: \"7286d7ba-7f8c-4f40-a18a-d29af788c344\") " pod="openstack/placement-db-sync-srnhx" Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.967816 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/19cb42f3-600f-4079-9dcd-6ba8697d5778-ovsdbserver-sb\") pod \"dnsmasq-dns-5756cc6d89-g8t6k\" (UID: \"19cb42f3-600f-4079-9dcd-6ba8697d5778\") " pod="openstack/dnsmasq-dns-5756cc6d89-g8t6k" Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.968135 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nz6z5\" (UniqueName: \"kubernetes.io/projected/19cb42f3-600f-4079-9dcd-6ba8697d5778-kube-api-access-nz6z5\") pod \"dnsmasq-dns-5756cc6d89-g8t6k\" (UID: \"19cb42f3-600f-4079-9dcd-6ba8697d5778\") " pod="openstack/dnsmasq-dns-5756cc6d89-g8t6k" Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.968201 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7286d7ba-7f8c-4f40-a18a-d29af788c344-scripts\") pod \"placement-db-sync-srnhx\" (UID: \"7286d7ba-7f8c-4f40-a18a-d29af788c344\") " pod="openstack/placement-db-sync-srnhx" Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.968221 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7286d7ba-7f8c-4f40-a18a-d29af788c344-logs\") pod \"placement-db-sync-srnhx\" (UID: \"7286d7ba-7f8c-4f40-a18a-d29af788c344\") " pod="openstack/placement-db-sync-srnhx" Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.968239 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/19cb42f3-600f-4079-9dcd-6ba8697d5778-dns-svc\") pod \"dnsmasq-dns-5756cc6d89-g8t6k\" (UID: \"19cb42f3-600f-4079-9dcd-6ba8697d5778\") " pod="openstack/dnsmasq-dns-5756cc6d89-g8t6k" Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.968278 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ss4fc\" 
(UniqueName: \"kubernetes.io/projected/7286d7ba-7f8c-4f40-a18a-d29af788c344-kube-api-access-ss4fc\") pod \"placement-db-sync-srnhx\" (UID: \"7286d7ba-7f8c-4f40-a18a-d29af788c344\") " pod="openstack/placement-db-sync-srnhx" Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.968665 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/19cb42f3-600f-4079-9dcd-6ba8697d5778-ovsdbserver-sb\") pod \"dnsmasq-dns-5756cc6d89-g8t6k\" (UID: \"19cb42f3-600f-4079-9dcd-6ba8697d5778\") " pod="openstack/dnsmasq-dns-5756cc6d89-g8t6k" Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.969041 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7286d7ba-7f8c-4f40-a18a-d29af788c344-logs\") pod \"placement-db-sync-srnhx\" (UID: \"7286d7ba-7f8c-4f40-a18a-d29af788c344\") " pod="openstack/placement-db-sync-srnhx" Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.969241 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/19cb42f3-600f-4079-9dcd-6ba8697d5778-dns-svc\") pod \"dnsmasq-dns-5756cc6d89-g8t6k\" (UID: \"19cb42f3-600f-4079-9dcd-6ba8697d5778\") " pod="openstack/dnsmasq-dns-5756cc6d89-g8t6k" Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.973722 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7286d7ba-7f8c-4f40-a18a-d29af788c344-scripts\") pod \"placement-db-sync-srnhx\" (UID: \"7286d7ba-7f8c-4f40-a18a-d29af788c344\") " pod="openstack/placement-db-sync-srnhx" Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.986084 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7286d7ba-7f8c-4f40-a18a-d29af788c344-config-data\") pod \"placement-db-sync-srnhx\" (UID: \"7286d7ba-7f8c-4f40-a18a-d29af788c344\") " 
pod="openstack/placement-db-sync-srnhx" Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.986853 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7286d7ba-7f8c-4f40-a18a-d29af788c344-combined-ca-bundle\") pod \"placement-db-sync-srnhx\" (UID: \"7286d7ba-7f8c-4f40-a18a-d29af788c344\") " pod="openstack/placement-db-sync-srnhx" Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.987471 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ss4fc\" (UniqueName: \"kubernetes.io/projected/7286d7ba-7f8c-4f40-a18a-d29af788c344-kube-api-access-ss4fc\") pod \"placement-db-sync-srnhx\" (UID: \"7286d7ba-7f8c-4f40-a18a-d29af788c344\") " pod="openstack/placement-db-sync-srnhx" Feb 19 22:57:11 crc kubenswrapper[4795]: I0219 22:57:11.987655 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nz6z5\" (UniqueName: \"kubernetes.io/projected/19cb42f3-600f-4079-9dcd-6ba8697d5778-kube-api-access-nz6z5\") pod \"dnsmasq-dns-5756cc6d89-g8t6k\" (UID: \"19cb42f3-600f-4079-9dcd-6ba8697d5778\") " pod="openstack/dnsmasq-dns-5756cc6d89-g8t6k" Feb 19 22:57:12 crc kubenswrapper[4795]: I0219 22:57:12.046371 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5756cc6d89-g8t6k" Feb 19 22:57:12 crc kubenswrapper[4795]: I0219 22:57:12.059110 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-srnhx" Feb 19 22:57:12 crc kubenswrapper[4795]: I0219 22:57:12.536777 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5756cc6d89-g8t6k"] Feb 19 22:57:12 crc kubenswrapper[4795]: W0219 22:57:12.539578 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod19cb42f3_600f_4079_9dcd_6ba8697d5778.slice/crio-1fe0f90c3062cc804bcfcc62ea300db4a0ac21188d96c43b0dd98bd184a851d6 WatchSource:0}: Error finding container 1fe0f90c3062cc804bcfcc62ea300db4a0ac21188d96c43b0dd98bd184a851d6: Status 404 returned error can't find the container with id 1fe0f90c3062cc804bcfcc62ea300db4a0ac21188d96c43b0dd98bd184a851d6 Feb 19 22:57:12 crc kubenswrapper[4795]: I0219 22:57:12.620817 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-srnhx"] Feb 19 22:57:12 crc kubenswrapper[4795]: I0219 22:57:12.736317 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-srnhx" event={"ID":"7286d7ba-7f8c-4f40-a18a-d29af788c344","Type":"ContainerStarted","Data":"39f7733084b133f8aced512072edf7f34651de80ae0434cc8b9bc380ae92dc8a"} Feb 19 22:57:12 crc kubenswrapper[4795]: I0219 22:57:12.738448 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5756cc6d89-g8t6k" event={"ID":"19cb42f3-600f-4079-9dcd-6ba8697d5778","Type":"ContainerStarted","Data":"1fe0f90c3062cc804bcfcc62ea300db4a0ac21188d96c43b0dd98bd184a851d6"} Feb 19 22:57:13 crc kubenswrapper[4795]: I0219 22:57:13.752713 4795 generic.go:334] "Generic (PLEG): container finished" podID="19cb42f3-600f-4079-9dcd-6ba8697d5778" containerID="d4afbfb223a6fc5d4544602fd724857bd42e63b98d9bf900ae5cadc88d7398f7" exitCode=0 Feb 19 22:57:13 crc kubenswrapper[4795]: I0219 22:57:13.752812 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5756cc6d89-g8t6k" 
event={"ID":"19cb42f3-600f-4079-9dcd-6ba8697d5778","Type":"ContainerDied","Data":"d4afbfb223a6fc5d4544602fd724857bd42e63b98d9bf900ae5cadc88d7398f7"} Feb 19 22:57:13 crc kubenswrapper[4795]: I0219 22:57:13.756398 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-srnhx" event={"ID":"7286d7ba-7f8c-4f40-a18a-d29af788c344","Type":"ContainerStarted","Data":"2fce75ec54d1aa6599ee05902a3278bc712bcedfa4ab8adb5ae0a48b2003dc71"} Feb 19 22:57:13 crc kubenswrapper[4795]: I0219 22:57:13.824333 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-srnhx" podStartSLOduration=2.8243080689999998 podStartE2EDuration="2.824308069s" podCreationTimestamp="2026-02-19 22:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:57:13.797460696 +0000 UTC m=+5344.989978560" watchObservedRunningTime="2026-02-19 22:57:13.824308069 +0000 UTC m=+5345.016825933" Feb 19 22:57:14 crc kubenswrapper[4795]: I0219 22:57:14.767037 4795 generic.go:334] "Generic (PLEG): container finished" podID="7286d7ba-7f8c-4f40-a18a-d29af788c344" containerID="2fce75ec54d1aa6599ee05902a3278bc712bcedfa4ab8adb5ae0a48b2003dc71" exitCode=0 Feb 19 22:57:14 crc kubenswrapper[4795]: I0219 22:57:14.767135 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-srnhx" event={"ID":"7286d7ba-7f8c-4f40-a18a-d29af788c344","Type":"ContainerDied","Data":"2fce75ec54d1aa6599ee05902a3278bc712bcedfa4ab8adb5ae0a48b2003dc71"} Feb 19 22:57:14 crc kubenswrapper[4795]: I0219 22:57:14.774057 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5756cc6d89-g8t6k" event={"ID":"19cb42f3-600f-4079-9dcd-6ba8697d5778","Type":"ContainerStarted","Data":"8576ec3edc0042901926d1839c03c1f30997ed9e98b34eb8f496f95d2763762e"} Feb 19 22:57:14 crc kubenswrapper[4795]: I0219 22:57:14.775270 4795 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5756cc6d89-g8t6k" Feb 19 22:57:14 crc kubenswrapper[4795]: I0219 22:57:14.829603 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5756cc6d89-g8t6k" podStartSLOduration=3.829572654 podStartE2EDuration="3.829572654s" podCreationTimestamp="2026-02-19 22:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:57:14.818348918 +0000 UTC m=+5346.010866792" watchObservedRunningTime="2026-02-19 22:57:14.829572654 +0000 UTC m=+5346.022090558" Feb 19 22:57:16 crc kubenswrapper[4795]: I0219 22:57:16.216421 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-srnhx" Feb 19 22:57:16 crc kubenswrapper[4795]: I0219 22:57:16.262354 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7286d7ba-7f8c-4f40-a18a-d29af788c344-logs\") pod \"7286d7ba-7f8c-4f40-a18a-d29af788c344\" (UID: \"7286d7ba-7f8c-4f40-a18a-d29af788c344\") " Feb 19 22:57:16 crc kubenswrapper[4795]: I0219 22:57:16.262445 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7286d7ba-7f8c-4f40-a18a-d29af788c344-config-data\") pod \"7286d7ba-7f8c-4f40-a18a-d29af788c344\" (UID: \"7286d7ba-7f8c-4f40-a18a-d29af788c344\") " Feb 19 22:57:16 crc kubenswrapper[4795]: I0219 22:57:16.262470 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ss4fc\" (UniqueName: \"kubernetes.io/projected/7286d7ba-7f8c-4f40-a18a-d29af788c344-kube-api-access-ss4fc\") pod \"7286d7ba-7f8c-4f40-a18a-d29af788c344\" (UID: \"7286d7ba-7f8c-4f40-a18a-d29af788c344\") " Feb 19 22:57:16 crc kubenswrapper[4795]: I0219 22:57:16.262490 4795 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7286d7ba-7f8c-4f40-a18a-d29af788c344-combined-ca-bundle\") pod \"7286d7ba-7f8c-4f40-a18a-d29af788c344\" (UID: \"7286d7ba-7f8c-4f40-a18a-d29af788c344\") " Feb 19 22:57:16 crc kubenswrapper[4795]: I0219 22:57:16.262513 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7286d7ba-7f8c-4f40-a18a-d29af788c344-scripts\") pod \"7286d7ba-7f8c-4f40-a18a-d29af788c344\" (UID: \"7286d7ba-7f8c-4f40-a18a-d29af788c344\") " Feb 19 22:57:16 crc kubenswrapper[4795]: I0219 22:57:16.263970 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7286d7ba-7f8c-4f40-a18a-d29af788c344-logs" (OuterVolumeSpecName: "logs") pod "7286d7ba-7f8c-4f40-a18a-d29af788c344" (UID: "7286d7ba-7f8c-4f40-a18a-d29af788c344"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:57:16 crc kubenswrapper[4795]: I0219 22:57:16.271874 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7286d7ba-7f8c-4f40-a18a-d29af788c344-scripts" (OuterVolumeSpecName: "scripts") pod "7286d7ba-7f8c-4f40-a18a-d29af788c344" (UID: "7286d7ba-7f8c-4f40-a18a-d29af788c344"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:57:16 crc kubenswrapper[4795]: I0219 22:57:16.273698 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7286d7ba-7f8c-4f40-a18a-d29af788c344-kube-api-access-ss4fc" (OuterVolumeSpecName: "kube-api-access-ss4fc") pod "7286d7ba-7f8c-4f40-a18a-d29af788c344" (UID: "7286d7ba-7f8c-4f40-a18a-d29af788c344"). InnerVolumeSpecName "kube-api-access-ss4fc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:57:16 crc kubenswrapper[4795]: I0219 22:57:16.302076 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7286d7ba-7f8c-4f40-a18a-d29af788c344-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7286d7ba-7f8c-4f40-a18a-d29af788c344" (UID: "7286d7ba-7f8c-4f40-a18a-d29af788c344"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:57:16 crc kubenswrapper[4795]: I0219 22:57:16.312505 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7286d7ba-7f8c-4f40-a18a-d29af788c344-config-data" (OuterVolumeSpecName: "config-data") pod "7286d7ba-7f8c-4f40-a18a-d29af788c344" (UID: "7286d7ba-7f8c-4f40-a18a-d29af788c344"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:57:16 crc kubenswrapper[4795]: I0219 22:57:16.364503 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7286d7ba-7f8c-4f40-a18a-d29af788c344-logs\") on node \"crc\" DevicePath \"\"" Feb 19 22:57:16 crc kubenswrapper[4795]: I0219 22:57:16.364540 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7286d7ba-7f8c-4f40-a18a-d29af788c344-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 22:57:16 crc kubenswrapper[4795]: I0219 22:57:16.364550 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ss4fc\" (UniqueName: \"kubernetes.io/projected/7286d7ba-7f8c-4f40-a18a-d29af788c344-kube-api-access-ss4fc\") on node \"crc\" DevicePath \"\"" Feb 19 22:57:16 crc kubenswrapper[4795]: I0219 22:57:16.364562 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7286d7ba-7f8c-4f40-a18a-d29af788c344-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 
22:57:16 crc kubenswrapper[4795]: I0219 22:57:16.364573 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7286d7ba-7f8c-4f40-a18a-d29af788c344-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 22:57:16 crc kubenswrapper[4795]: I0219 22:57:16.792243 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-srnhx" event={"ID":"7286d7ba-7f8c-4f40-a18a-d29af788c344","Type":"ContainerDied","Data":"39f7733084b133f8aced512072edf7f34651de80ae0434cc8b9bc380ae92dc8a"} Feb 19 22:57:16 crc kubenswrapper[4795]: I0219 22:57:16.792314 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="39f7733084b133f8aced512072edf7f34651de80ae0434cc8b9bc380ae92dc8a" Feb 19 22:57:16 crc kubenswrapper[4795]: I0219 22:57:16.793122 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-srnhx" Feb 19 22:57:17 crc kubenswrapper[4795]: I0219 22:57:17.407920 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-764895875b-czlhk"] Feb 19 22:57:17 crc kubenswrapper[4795]: E0219 22:57:17.408680 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7286d7ba-7f8c-4f40-a18a-d29af788c344" containerName="placement-db-sync" Feb 19 22:57:17 crc kubenswrapper[4795]: I0219 22:57:17.408700 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7286d7ba-7f8c-4f40-a18a-d29af788c344" containerName="placement-db-sync" Feb 19 22:57:17 crc kubenswrapper[4795]: I0219 22:57:17.408882 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="7286d7ba-7f8c-4f40-a18a-d29af788c344" containerName="placement-db-sync" Feb 19 22:57:17 crc kubenswrapper[4795]: I0219 22:57:17.409712 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-764895875b-czlhk" Feb 19 22:57:17 crc kubenswrapper[4795]: I0219 22:57:17.416581 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 19 22:57:17 crc kubenswrapper[4795]: I0219 22:57:17.427984 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 19 22:57:17 crc kubenswrapper[4795]: I0219 22:57:17.428149 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-lzv2r" Feb 19 22:57:17 crc kubenswrapper[4795]: I0219 22:57:17.437732 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-764895875b-czlhk"] Feb 19 22:57:17 crc kubenswrapper[4795]: I0219 22:57:17.523007 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4bb335d-ad73-403a-a25f-8e6f33f60ecb-logs\") pod \"placement-764895875b-czlhk\" (UID: \"f4bb335d-ad73-403a-a25f-8e6f33f60ecb\") " pod="openstack/placement-764895875b-czlhk" Feb 19 22:57:17 crc kubenswrapper[4795]: I0219 22:57:17.523229 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4bb335d-ad73-403a-a25f-8e6f33f60ecb-config-data\") pod \"placement-764895875b-czlhk\" (UID: \"f4bb335d-ad73-403a-a25f-8e6f33f60ecb\") " pod="openstack/placement-764895875b-czlhk" Feb 19 22:57:17 crc kubenswrapper[4795]: I0219 22:57:17.523308 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4bb335d-ad73-403a-a25f-8e6f33f60ecb-scripts\") pod \"placement-764895875b-czlhk\" (UID: \"f4bb335d-ad73-403a-a25f-8e6f33f60ecb\") " pod="openstack/placement-764895875b-czlhk" Feb 19 22:57:17 crc kubenswrapper[4795]: I0219 22:57:17.523337 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4bb335d-ad73-403a-a25f-8e6f33f60ecb-combined-ca-bundle\") pod \"placement-764895875b-czlhk\" (UID: \"f4bb335d-ad73-403a-a25f-8e6f33f60ecb\") " pod="openstack/placement-764895875b-czlhk" Feb 19 22:57:17 crc kubenswrapper[4795]: I0219 22:57:17.523392 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjsf8\" (UniqueName: \"kubernetes.io/projected/f4bb335d-ad73-403a-a25f-8e6f33f60ecb-kube-api-access-kjsf8\") pod \"placement-764895875b-czlhk\" (UID: \"f4bb335d-ad73-403a-a25f-8e6f33f60ecb\") " pod="openstack/placement-764895875b-czlhk" Feb 19 22:57:17 crc kubenswrapper[4795]: I0219 22:57:17.625304 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4bb335d-ad73-403a-a25f-8e6f33f60ecb-config-data\") pod \"placement-764895875b-czlhk\" (UID: \"f4bb335d-ad73-403a-a25f-8e6f33f60ecb\") " pod="openstack/placement-764895875b-czlhk" Feb 19 22:57:17 crc kubenswrapper[4795]: I0219 22:57:17.625394 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4bb335d-ad73-403a-a25f-8e6f33f60ecb-scripts\") pod \"placement-764895875b-czlhk\" (UID: \"f4bb335d-ad73-403a-a25f-8e6f33f60ecb\") " pod="openstack/placement-764895875b-czlhk" Feb 19 22:57:17 crc kubenswrapper[4795]: I0219 22:57:17.625416 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4bb335d-ad73-403a-a25f-8e6f33f60ecb-combined-ca-bundle\") pod \"placement-764895875b-czlhk\" (UID: \"f4bb335d-ad73-403a-a25f-8e6f33f60ecb\") " pod="openstack/placement-764895875b-czlhk" Feb 19 22:57:17 crc kubenswrapper[4795]: I0219 22:57:17.625462 4795 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-kjsf8\" (UniqueName: \"kubernetes.io/projected/f4bb335d-ad73-403a-a25f-8e6f33f60ecb-kube-api-access-kjsf8\") pod \"placement-764895875b-czlhk\" (UID: \"f4bb335d-ad73-403a-a25f-8e6f33f60ecb\") " pod="openstack/placement-764895875b-czlhk" Feb 19 22:57:17 crc kubenswrapper[4795]: I0219 22:57:17.625506 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4bb335d-ad73-403a-a25f-8e6f33f60ecb-logs\") pod \"placement-764895875b-czlhk\" (UID: \"f4bb335d-ad73-403a-a25f-8e6f33f60ecb\") " pod="openstack/placement-764895875b-czlhk" Feb 19 22:57:17 crc kubenswrapper[4795]: I0219 22:57:17.628577 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4bb335d-ad73-403a-a25f-8e6f33f60ecb-logs\") pod \"placement-764895875b-czlhk\" (UID: \"f4bb335d-ad73-403a-a25f-8e6f33f60ecb\") " pod="openstack/placement-764895875b-czlhk" Feb 19 22:57:17 crc kubenswrapper[4795]: I0219 22:57:17.632801 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4bb335d-ad73-403a-a25f-8e6f33f60ecb-scripts\") pod \"placement-764895875b-czlhk\" (UID: \"f4bb335d-ad73-403a-a25f-8e6f33f60ecb\") " pod="openstack/placement-764895875b-czlhk" Feb 19 22:57:17 crc kubenswrapper[4795]: I0219 22:57:17.633304 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4bb335d-ad73-403a-a25f-8e6f33f60ecb-combined-ca-bundle\") pod \"placement-764895875b-czlhk\" (UID: \"f4bb335d-ad73-403a-a25f-8e6f33f60ecb\") " pod="openstack/placement-764895875b-czlhk" Feb 19 22:57:17 crc kubenswrapper[4795]: I0219 22:57:17.648030 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4bb335d-ad73-403a-a25f-8e6f33f60ecb-config-data\") pod 
\"placement-764895875b-czlhk\" (UID: \"f4bb335d-ad73-403a-a25f-8e6f33f60ecb\") " pod="openstack/placement-764895875b-czlhk" Feb 19 22:57:17 crc kubenswrapper[4795]: I0219 22:57:17.650974 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjsf8\" (UniqueName: \"kubernetes.io/projected/f4bb335d-ad73-403a-a25f-8e6f33f60ecb-kube-api-access-kjsf8\") pod \"placement-764895875b-czlhk\" (UID: \"f4bb335d-ad73-403a-a25f-8e6f33f60ecb\") " pod="openstack/placement-764895875b-czlhk" Feb 19 22:57:17 crc kubenswrapper[4795]: I0219 22:57:17.734860 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-764895875b-czlhk" Feb 19 22:57:18 crc kubenswrapper[4795]: I0219 22:57:18.236842 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-764895875b-czlhk"] Feb 19 22:57:18 crc kubenswrapper[4795]: I0219 22:57:18.820002 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-764895875b-czlhk" event={"ID":"f4bb335d-ad73-403a-a25f-8e6f33f60ecb","Type":"ContainerStarted","Data":"9476662e8a510c04df37122068019e7f7875343ef01264ebaaa67abfb0d911b4"} Feb 19 22:57:18 crc kubenswrapper[4795]: I0219 22:57:18.820343 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-764895875b-czlhk" event={"ID":"f4bb335d-ad73-403a-a25f-8e6f33f60ecb","Type":"ContainerStarted","Data":"9d11f5865edf18cb515d62ead7b9e3a7372f85f4e5b64b2b743f8320cebeb94c"} Feb 19 22:57:18 crc kubenswrapper[4795]: I0219 22:57:18.820481 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-764895875b-czlhk" Feb 19 22:57:18 crc kubenswrapper[4795]: I0219 22:57:18.820499 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-764895875b-czlhk" event={"ID":"f4bb335d-ad73-403a-a25f-8e6f33f60ecb","Type":"ContainerStarted","Data":"612dc9b36f7f89520aa4f1d4e17d7807b27e3c7ad88948a097356a1f7eb1479f"} Feb 19 22:57:18 crc 
kubenswrapper[4795]: I0219 22:57:18.844613 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-764895875b-czlhk" podStartSLOduration=1.844594591 podStartE2EDuration="1.844594591s" podCreationTimestamp="2026-02-19 22:57:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:57:18.84141541 +0000 UTC m=+5350.033933294" watchObservedRunningTime="2026-02-19 22:57:18.844594591 +0000 UTC m=+5350.037112465" Feb 19 22:57:19 crc kubenswrapper[4795]: I0219 22:57:19.831593 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-764895875b-czlhk" Feb 19 22:57:22 crc kubenswrapper[4795]: I0219 22:57:22.048341 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5756cc6d89-g8t6k" Feb 19 22:57:22 crc kubenswrapper[4795]: I0219 22:57:22.124798 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-679b7b556f-vb5wj"] Feb 19 22:57:22 crc kubenswrapper[4795]: I0219 22:57:22.125598 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-679b7b556f-vb5wj" podUID="1847cbdb-2b75-48d9-ab0c-db5da5a236a4" containerName="dnsmasq-dns" containerID="cri-o://dc882b0d9e65af862350aae0c80a9f7a4b7f8f9b438717336c96a4725a891917" gracePeriod=10 Feb 19 22:57:22 crc kubenswrapper[4795]: I0219 22:57:22.581778 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-679b7b556f-vb5wj" Feb 19 22:57:22 crc kubenswrapper[4795]: I0219 22:57:22.624541 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1847cbdb-2b75-48d9-ab0c-db5da5a236a4-ovsdbserver-nb\") pod \"1847cbdb-2b75-48d9-ab0c-db5da5a236a4\" (UID: \"1847cbdb-2b75-48d9-ab0c-db5da5a236a4\") " Feb 19 22:57:22 crc kubenswrapper[4795]: I0219 22:57:22.624683 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1847cbdb-2b75-48d9-ab0c-db5da5a236a4-dns-svc\") pod \"1847cbdb-2b75-48d9-ab0c-db5da5a236a4\" (UID: \"1847cbdb-2b75-48d9-ab0c-db5da5a236a4\") " Feb 19 22:57:22 crc kubenswrapper[4795]: I0219 22:57:22.624728 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1847cbdb-2b75-48d9-ab0c-db5da5a236a4-config\") pod \"1847cbdb-2b75-48d9-ab0c-db5da5a236a4\" (UID: \"1847cbdb-2b75-48d9-ab0c-db5da5a236a4\") " Feb 19 22:57:22 crc kubenswrapper[4795]: I0219 22:57:22.624752 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmn7q\" (UniqueName: \"kubernetes.io/projected/1847cbdb-2b75-48d9-ab0c-db5da5a236a4-kube-api-access-bmn7q\") pod \"1847cbdb-2b75-48d9-ab0c-db5da5a236a4\" (UID: \"1847cbdb-2b75-48d9-ab0c-db5da5a236a4\") " Feb 19 22:57:22 crc kubenswrapper[4795]: I0219 22:57:22.624806 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1847cbdb-2b75-48d9-ab0c-db5da5a236a4-ovsdbserver-sb\") pod \"1847cbdb-2b75-48d9-ab0c-db5da5a236a4\" (UID: \"1847cbdb-2b75-48d9-ab0c-db5da5a236a4\") " Feb 19 22:57:22 crc kubenswrapper[4795]: I0219 22:57:22.632300 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/1847cbdb-2b75-48d9-ab0c-db5da5a236a4-kube-api-access-bmn7q" (OuterVolumeSpecName: "kube-api-access-bmn7q") pod "1847cbdb-2b75-48d9-ab0c-db5da5a236a4" (UID: "1847cbdb-2b75-48d9-ab0c-db5da5a236a4"). InnerVolumeSpecName "kube-api-access-bmn7q". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:57:22 crc kubenswrapper[4795]: I0219 22:57:22.683855 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1847cbdb-2b75-48d9-ab0c-db5da5a236a4-config" (OuterVolumeSpecName: "config") pod "1847cbdb-2b75-48d9-ab0c-db5da5a236a4" (UID: "1847cbdb-2b75-48d9-ab0c-db5da5a236a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:57:22 crc kubenswrapper[4795]: I0219 22:57:22.686693 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1847cbdb-2b75-48d9-ab0c-db5da5a236a4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1847cbdb-2b75-48d9-ab0c-db5da5a236a4" (UID: "1847cbdb-2b75-48d9-ab0c-db5da5a236a4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:57:22 crc kubenswrapper[4795]: I0219 22:57:22.689523 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1847cbdb-2b75-48d9-ab0c-db5da5a236a4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1847cbdb-2b75-48d9-ab0c-db5da5a236a4" (UID: "1847cbdb-2b75-48d9-ab0c-db5da5a236a4"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:57:22 crc kubenswrapper[4795]: I0219 22:57:22.691516 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1847cbdb-2b75-48d9-ab0c-db5da5a236a4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1847cbdb-2b75-48d9-ab0c-db5da5a236a4" (UID: "1847cbdb-2b75-48d9-ab0c-db5da5a236a4"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:57:22 crc kubenswrapper[4795]: I0219 22:57:22.727017 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1847cbdb-2b75-48d9-ab0c-db5da5a236a4-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 22:57:22 crc kubenswrapper[4795]: I0219 22:57:22.727376 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1847cbdb-2b75-48d9-ab0c-db5da5a236a4-config\") on node \"crc\" DevicePath \"\"" Feb 19 22:57:22 crc kubenswrapper[4795]: I0219 22:57:22.727388 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmn7q\" (UniqueName: \"kubernetes.io/projected/1847cbdb-2b75-48d9-ab0c-db5da5a236a4-kube-api-access-bmn7q\") on node \"crc\" DevicePath \"\"" Feb 19 22:57:22 crc kubenswrapper[4795]: I0219 22:57:22.727400 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1847cbdb-2b75-48d9-ab0c-db5da5a236a4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 22:57:22 crc kubenswrapper[4795]: I0219 22:57:22.727409 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1847cbdb-2b75-48d9-ab0c-db5da5a236a4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 22:57:22 crc kubenswrapper[4795]: I0219 22:57:22.856257 4795 generic.go:334] "Generic (PLEG): container finished" podID="1847cbdb-2b75-48d9-ab0c-db5da5a236a4" containerID="dc882b0d9e65af862350aae0c80a9f7a4b7f8f9b438717336c96a4725a891917" exitCode=0 Feb 19 22:57:22 crc kubenswrapper[4795]: I0219 22:57:22.856306 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-679b7b556f-vb5wj" event={"ID":"1847cbdb-2b75-48d9-ab0c-db5da5a236a4","Type":"ContainerDied","Data":"dc882b0d9e65af862350aae0c80a9f7a4b7f8f9b438717336c96a4725a891917"} Feb 19 22:57:22 crc kubenswrapper[4795]: I0219 
22:57:22.856336 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-679b7b556f-vb5wj" event={"ID":"1847cbdb-2b75-48d9-ab0c-db5da5a236a4","Type":"ContainerDied","Data":"7922c7a8658b3cbc38083041c7e26029e3e327c9354e7a22ada01242a334b1d4"} Feb 19 22:57:22 crc kubenswrapper[4795]: I0219 22:57:22.856353 4795 scope.go:117] "RemoveContainer" containerID="dc882b0d9e65af862350aae0c80a9f7a4b7f8f9b438717336c96a4725a891917" Feb 19 22:57:22 crc kubenswrapper[4795]: I0219 22:57:22.856427 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-679b7b556f-vb5wj" Feb 19 22:57:22 crc kubenswrapper[4795]: I0219 22:57:22.879484 4795 scope.go:117] "RemoveContainer" containerID="23858aaac28bb5c70674b0ae3bbc7222b23e412fd7f763759a9bb0ad432d3261" Feb 19 22:57:22 crc kubenswrapper[4795]: I0219 22:57:22.904264 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-679b7b556f-vb5wj"] Feb 19 22:57:22 crc kubenswrapper[4795]: I0219 22:57:22.910495 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-679b7b556f-vb5wj"] Feb 19 22:57:22 crc kubenswrapper[4795]: I0219 22:57:22.919668 4795 scope.go:117] "RemoveContainer" containerID="dc882b0d9e65af862350aae0c80a9f7a4b7f8f9b438717336c96a4725a891917" Feb 19 22:57:22 crc kubenswrapper[4795]: E0219 22:57:22.920134 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc882b0d9e65af862350aae0c80a9f7a4b7f8f9b438717336c96a4725a891917\": container with ID starting with dc882b0d9e65af862350aae0c80a9f7a4b7f8f9b438717336c96a4725a891917 not found: ID does not exist" containerID="dc882b0d9e65af862350aae0c80a9f7a4b7f8f9b438717336c96a4725a891917" Feb 19 22:57:22 crc kubenswrapper[4795]: I0219 22:57:22.920178 4795 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"dc882b0d9e65af862350aae0c80a9f7a4b7f8f9b438717336c96a4725a891917"} err="failed to get container status \"dc882b0d9e65af862350aae0c80a9f7a4b7f8f9b438717336c96a4725a891917\": rpc error: code = NotFound desc = could not find container \"dc882b0d9e65af862350aae0c80a9f7a4b7f8f9b438717336c96a4725a891917\": container with ID starting with dc882b0d9e65af862350aae0c80a9f7a4b7f8f9b438717336c96a4725a891917 not found: ID does not exist" Feb 19 22:57:22 crc kubenswrapper[4795]: I0219 22:57:22.920200 4795 scope.go:117] "RemoveContainer" containerID="23858aaac28bb5c70674b0ae3bbc7222b23e412fd7f763759a9bb0ad432d3261" Feb 19 22:57:22 crc kubenswrapper[4795]: E0219 22:57:22.920791 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23858aaac28bb5c70674b0ae3bbc7222b23e412fd7f763759a9bb0ad432d3261\": container with ID starting with 23858aaac28bb5c70674b0ae3bbc7222b23e412fd7f763759a9bb0ad432d3261 not found: ID does not exist" containerID="23858aaac28bb5c70674b0ae3bbc7222b23e412fd7f763759a9bb0ad432d3261" Feb 19 22:57:22 crc kubenswrapper[4795]: I0219 22:57:22.920817 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23858aaac28bb5c70674b0ae3bbc7222b23e412fd7f763759a9bb0ad432d3261"} err="failed to get container status \"23858aaac28bb5c70674b0ae3bbc7222b23e412fd7f763759a9bb0ad432d3261\": rpc error: code = NotFound desc = could not find container \"23858aaac28bb5c70674b0ae3bbc7222b23e412fd7f763759a9bb0ad432d3261\": container with ID starting with 23858aaac28bb5c70674b0ae3bbc7222b23e412fd7f763759a9bb0ad432d3261 not found: ID does not exist" Feb 19 22:57:23 crc kubenswrapper[4795]: I0219 22:57:23.524821 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1847cbdb-2b75-48d9-ab0c-db5da5a236a4" path="/var/lib/kubelet/pods/1847cbdb-2b75-48d9-ab0c-db5da5a236a4/volumes" Feb 19 22:57:48 crc kubenswrapper[4795]: I0219 
22:57:48.688636 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-764895875b-czlhk" Feb 19 22:57:49 crc kubenswrapper[4795]: I0219 22:57:49.691734 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-764895875b-czlhk" Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.447007 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-qp45n"] Feb 19 22:58:13 crc kubenswrapper[4795]: E0219 22:58:13.447875 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1847cbdb-2b75-48d9-ab0c-db5da5a236a4" containerName="dnsmasq-dns" Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.447887 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="1847cbdb-2b75-48d9-ab0c-db5da5a236a4" containerName="dnsmasq-dns" Feb 19 22:58:13 crc kubenswrapper[4795]: E0219 22:58:13.447913 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1847cbdb-2b75-48d9-ab0c-db5da5a236a4" containerName="init" Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.447919 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="1847cbdb-2b75-48d9-ab0c-db5da5a236a4" containerName="init" Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.448069 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="1847cbdb-2b75-48d9-ab0c-db5da5a236a4" containerName="dnsmasq-dns" Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.448643 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-qp45n" Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.462720 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-qp45n"] Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.545076 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-x75sd"] Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.546487 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-x75sd" Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.562235 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-x75sd"] Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.588859 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65k7f\" (UniqueName: \"kubernetes.io/projected/c14f4993-80e4-4fbf-a719-22f17750811b-kube-api-access-65k7f\") pod \"nova-api-db-create-qp45n\" (UID: \"c14f4993-80e4-4fbf-a719-22f17750811b\") " pod="openstack/nova-api-db-create-qp45n" Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.589039 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c14f4993-80e4-4fbf-a719-22f17750811b-operator-scripts\") pod \"nova-api-db-create-qp45n\" (UID: \"c14f4993-80e4-4fbf-a719-22f17750811b\") " pod="openstack/nova-api-db-create-qp45n" Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.658573 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-wj996"] Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.659694 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-wj996" Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.668752 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-c94f-account-create-update-rqkwd"] Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.670309 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-c94f-account-create-update-rqkwd" Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.672013 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.681239 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-wj996"] Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.690371 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mv5w2\" (UniqueName: \"kubernetes.io/projected/41276d39-878a-4ed2-879b-2a053340874e-kube-api-access-mv5w2\") pod \"nova-cell0-db-create-x75sd\" (UID: \"41276d39-878a-4ed2-879b-2a053340874e\") " pod="openstack/nova-cell0-db-create-x75sd" Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.690436 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c14f4993-80e4-4fbf-a719-22f17750811b-operator-scripts\") pod \"nova-api-db-create-qp45n\" (UID: \"c14f4993-80e4-4fbf-a719-22f17750811b\") " pod="openstack/nova-api-db-create-qp45n" Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.690516 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65k7f\" (UniqueName: \"kubernetes.io/projected/c14f4993-80e4-4fbf-a719-22f17750811b-kube-api-access-65k7f\") pod \"nova-api-db-create-qp45n\" (UID: \"c14f4993-80e4-4fbf-a719-22f17750811b\") " pod="openstack/nova-api-db-create-qp45n" Feb 19 22:58:13 crc 
kubenswrapper[4795]: I0219 22:58:13.690550 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41276d39-878a-4ed2-879b-2a053340874e-operator-scripts\") pod \"nova-cell0-db-create-x75sd\" (UID: \"41276d39-878a-4ed2-879b-2a053340874e\") " pod="openstack/nova-cell0-db-create-x75sd" Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.693825 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c14f4993-80e4-4fbf-a719-22f17750811b-operator-scripts\") pod \"nova-api-db-create-qp45n\" (UID: \"c14f4993-80e4-4fbf-a719-22f17750811b\") " pod="openstack/nova-api-db-create-qp45n" Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.705783 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-c94f-account-create-update-rqkwd"] Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.715389 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65k7f\" (UniqueName: \"kubernetes.io/projected/c14f4993-80e4-4fbf-a719-22f17750811b-kube-api-access-65k7f\") pod \"nova-api-db-create-qp45n\" (UID: \"c14f4993-80e4-4fbf-a719-22f17750811b\") " pod="openstack/nova-api-db-create-qp45n" Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.766363 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-qp45n" Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.791609 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pdvg\" (UniqueName: \"kubernetes.io/projected/6f811fa4-8fb3-4adc-a9a8-6539dc03494c-kube-api-access-8pdvg\") pod \"nova-cell1-db-create-wj996\" (UID: \"6f811fa4-8fb3-4adc-a9a8-6539dc03494c\") " pod="openstack/nova-cell1-db-create-wj996" Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.791669 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41276d39-878a-4ed2-879b-2a053340874e-operator-scripts\") pod \"nova-cell0-db-create-x75sd\" (UID: \"41276d39-878a-4ed2-879b-2a053340874e\") " pod="openstack/nova-cell0-db-create-x75sd" Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.791763 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f811fa4-8fb3-4adc-a9a8-6539dc03494c-operator-scripts\") pod \"nova-cell1-db-create-wj996\" (UID: \"6f811fa4-8fb3-4adc-a9a8-6539dc03494c\") " pod="openstack/nova-cell1-db-create-wj996" Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.791817 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlgkl\" (UniqueName: \"kubernetes.io/projected/bd4e5010-15f4-499e-8279-9a1b814b5490-kube-api-access-xlgkl\") pod \"nova-api-c94f-account-create-update-rqkwd\" (UID: \"bd4e5010-15f4-499e-8279-9a1b814b5490\") " pod="openstack/nova-api-c94f-account-create-update-rqkwd" Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.791846 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mv5w2\" (UniqueName: \"kubernetes.io/projected/41276d39-878a-4ed2-879b-2a053340874e-kube-api-access-mv5w2\") 
pod \"nova-cell0-db-create-x75sd\" (UID: \"41276d39-878a-4ed2-879b-2a053340874e\") " pod="openstack/nova-cell0-db-create-x75sd" Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.791888 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd4e5010-15f4-499e-8279-9a1b814b5490-operator-scripts\") pod \"nova-api-c94f-account-create-update-rqkwd\" (UID: \"bd4e5010-15f4-499e-8279-9a1b814b5490\") " pod="openstack/nova-api-c94f-account-create-update-rqkwd" Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.792657 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41276d39-878a-4ed2-879b-2a053340874e-operator-scripts\") pod \"nova-cell0-db-create-x75sd\" (UID: \"41276d39-878a-4ed2-879b-2a053340874e\") " pod="openstack/nova-cell0-db-create-x75sd" Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.813525 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mv5w2\" (UniqueName: \"kubernetes.io/projected/41276d39-878a-4ed2-879b-2a053340874e-kube-api-access-mv5w2\") pod \"nova-cell0-db-create-x75sd\" (UID: \"41276d39-878a-4ed2-879b-2a053340874e\") " pod="openstack/nova-cell0-db-create-x75sd" Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.862395 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-x75sd" Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.895617 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-93a1-account-create-update-fsvms"] Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.897819 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-93a1-account-create-update-fsvms" Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.898467 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd4e5010-15f4-499e-8279-9a1b814b5490-operator-scripts\") pod \"nova-api-c94f-account-create-update-rqkwd\" (UID: \"bd4e5010-15f4-499e-8279-9a1b814b5490\") " pod="openstack/nova-api-c94f-account-create-update-rqkwd" Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.898613 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pdvg\" (UniqueName: \"kubernetes.io/projected/6f811fa4-8fb3-4adc-a9a8-6539dc03494c-kube-api-access-8pdvg\") pod \"nova-cell1-db-create-wj996\" (UID: \"6f811fa4-8fb3-4adc-a9a8-6539dc03494c\") " pod="openstack/nova-cell1-db-create-wj996" Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.898780 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f811fa4-8fb3-4adc-a9a8-6539dc03494c-operator-scripts\") pod \"nova-cell1-db-create-wj996\" (UID: \"6f811fa4-8fb3-4adc-a9a8-6539dc03494c\") " pod="openstack/nova-cell1-db-create-wj996" Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.898934 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlgkl\" (UniqueName: \"kubernetes.io/projected/bd4e5010-15f4-499e-8279-9a1b814b5490-kube-api-access-xlgkl\") pod \"nova-api-c94f-account-create-update-rqkwd\" (UID: \"bd4e5010-15f4-499e-8279-9a1b814b5490\") " pod="openstack/nova-api-c94f-account-create-update-rqkwd" Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.900709 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.902707 4795 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f811fa4-8fb3-4adc-a9a8-6539dc03494c-operator-scripts\") pod \"nova-cell1-db-create-wj996\" (UID: \"6f811fa4-8fb3-4adc-a9a8-6539dc03494c\") " pod="openstack/nova-cell1-db-create-wj996" Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.918009 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-93a1-account-create-update-fsvms"] Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.921837 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd4e5010-15f4-499e-8279-9a1b814b5490-operator-scripts\") pod \"nova-api-c94f-account-create-update-rqkwd\" (UID: \"bd4e5010-15f4-499e-8279-9a1b814b5490\") " pod="openstack/nova-api-c94f-account-create-update-rqkwd" Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.926510 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlgkl\" (UniqueName: \"kubernetes.io/projected/bd4e5010-15f4-499e-8279-9a1b814b5490-kube-api-access-xlgkl\") pod \"nova-api-c94f-account-create-update-rqkwd\" (UID: \"bd4e5010-15f4-499e-8279-9a1b814b5490\") " pod="openstack/nova-api-c94f-account-create-update-rqkwd" Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.930257 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pdvg\" (UniqueName: \"kubernetes.io/projected/6f811fa4-8fb3-4adc-a9a8-6539dc03494c-kube-api-access-8pdvg\") pod \"nova-cell1-db-create-wj996\" (UID: \"6f811fa4-8fb3-4adc-a9a8-6539dc03494c\") " pod="openstack/nova-cell1-db-create-wj996" Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.978353 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-wj996" Feb 19 22:58:13 crc kubenswrapper[4795]: I0219 22:58:13.988523 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-c94f-account-create-update-rqkwd" Feb 19 22:58:14 crc kubenswrapper[4795]: I0219 22:58:14.002314 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32478c4a-a97f-4fd3-84f0-a3c221beefe9-operator-scripts\") pod \"nova-cell0-93a1-account-create-update-fsvms\" (UID: \"32478c4a-a97f-4fd3-84f0-a3c221beefe9\") " pod="openstack/nova-cell0-93a1-account-create-update-fsvms" Feb 19 22:58:14 crc kubenswrapper[4795]: I0219 22:58:14.002355 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdgwq\" (UniqueName: \"kubernetes.io/projected/32478c4a-a97f-4fd3-84f0-a3c221beefe9-kube-api-access-pdgwq\") pod \"nova-cell0-93a1-account-create-update-fsvms\" (UID: \"32478c4a-a97f-4fd3-84f0-a3c221beefe9\") " pod="openstack/nova-cell0-93a1-account-create-update-fsvms" Feb 19 22:58:14 crc kubenswrapper[4795]: I0219 22:58:14.064566 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-72a3-account-create-update-bfhbs"] Feb 19 22:58:14 crc kubenswrapper[4795]: I0219 22:58:14.066940 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-72a3-account-create-update-bfhbs" Feb 19 22:58:14 crc kubenswrapper[4795]: I0219 22:58:14.069227 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Feb 19 22:58:14 crc kubenswrapper[4795]: I0219 22:58:14.078094 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-72a3-account-create-update-bfhbs"] Feb 19 22:58:14 crc kubenswrapper[4795]: I0219 22:58:14.104910 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32478c4a-a97f-4fd3-84f0-a3c221beefe9-operator-scripts\") pod \"nova-cell0-93a1-account-create-update-fsvms\" (UID: \"32478c4a-a97f-4fd3-84f0-a3c221beefe9\") " pod="openstack/nova-cell0-93a1-account-create-update-fsvms" Feb 19 22:58:14 crc kubenswrapper[4795]: I0219 22:58:14.104958 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdgwq\" (UniqueName: \"kubernetes.io/projected/32478c4a-a97f-4fd3-84f0-a3c221beefe9-kube-api-access-pdgwq\") pod \"nova-cell0-93a1-account-create-update-fsvms\" (UID: \"32478c4a-a97f-4fd3-84f0-a3c221beefe9\") " pod="openstack/nova-cell0-93a1-account-create-update-fsvms" Feb 19 22:58:14 crc kubenswrapper[4795]: I0219 22:58:14.106041 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32478c4a-a97f-4fd3-84f0-a3c221beefe9-operator-scripts\") pod \"nova-cell0-93a1-account-create-update-fsvms\" (UID: \"32478c4a-a97f-4fd3-84f0-a3c221beefe9\") " pod="openstack/nova-cell0-93a1-account-create-update-fsvms" Feb 19 22:58:14 crc kubenswrapper[4795]: I0219 22:58:14.140298 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdgwq\" (UniqueName: \"kubernetes.io/projected/32478c4a-a97f-4fd3-84f0-a3c221beefe9-kube-api-access-pdgwq\") pod 
\"nova-cell0-93a1-account-create-update-fsvms\" (UID: \"32478c4a-a97f-4fd3-84f0-a3c221beefe9\") " pod="openstack/nova-cell0-93a1-account-create-update-fsvms" Feb 19 22:58:14 crc kubenswrapper[4795]: I0219 22:58:14.206915 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6f38e11-ea05-447d-8564-117c0f589d88-operator-scripts\") pod \"nova-cell1-72a3-account-create-update-bfhbs\" (UID: \"b6f38e11-ea05-447d-8564-117c0f589d88\") " pod="openstack/nova-cell1-72a3-account-create-update-bfhbs" Feb 19 22:58:14 crc kubenswrapper[4795]: I0219 22:58:14.207298 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6sctl\" (UniqueName: \"kubernetes.io/projected/b6f38e11-ea05-447d-8564-117c0f589d88-kube-api-access-6sctl\") pod \"nova-cell1-72a3-account-create-update-bfhbs\" (UID: \"b6f38e11-ea05-447d-8564-117c0f589d88\") " pod="openstack/nova-cell1-72a3-account-create-update-bfhbs" Feb 19 22:58:14 crc kubenswrapper[4795]: I0219 22:58:14.236203 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-qp45n"] Feb 19 22:58:14 crc kubenswrapper[4795]: I0219 22:58:14.279922 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-93a1-account-create-update-fsvms" Feb 19 22:58:14 crc kubenswrapper[4795]: I0219 22:58:14.308601 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6f38e11-ea05-447d-8564-117c0f589d88-operator-scripts\") pod \"nova-cell1-72a3-account-create-update-bfhbs\" (UID: \"b6f38e11-ea05-447d-8564-117c0f589d88\") " pod="openstack/nova-cell1-72a3-account-create-update-bfhbs" Feb 19 22:58:14 crc kubenswrapper[4795]: I0219 22:58:14.308706 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6sctl\" (UniqueName: \"kubernetes.io/projected/b6f38e11-ea05-447d-8564-117c0f589d88-kube-api-access-6sctl\") pod \"nova-cell1-72a3-account-create-update-bfhbs\" (UID: \"b6f38e11-ea05-447d-8564-117c0f589d88\") " pod="openstack/nova-cell1-72a3-account-create-update-bfhbs" Feb 19 22:58:14 crc kubenswrapper[4795]: I0219 22:58:14.309712 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6f38e11-ea05-447d-8564-117c0f589d88-operator-scripts\") pod \"nova-cell1-72a3-account-create-update-bfhbs\" (UID: \"b6f38e11-ea05-447d-8564-117c0f589d88\") " pod="openstack/nova-cell1-72a3-account-create-update-bfhbs" Feb 19 22:58:14 crc kubenswrapper[4795]: I0219 22:58:14.326116 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6sctl\" (UniqueName: \"kubernetes.io/projected/b6f38e11-ea05-447d-8564-117c0f589d88-kube-api-access-6sctl\") pod \"nova-cell1-72a3-account-create-update-bfhbs\" (UID: \"b6f38e11-ea05-447d-8564-117c0f589d88\") " pod="openstack/nova-cell1-72a3-account-create-update-bfhbs" Feb 19 22:58:14 crc kubenswrapper[4795]: I0219 22:58:14.348395 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-wj996"] Feb 19 22:58:14 crc kubenswrapper[4795]: I0219 
22:58:14.351528 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-qp45n" event={"ID":"c14f4993-80e4-4fbf-a719-22f17750811b","Type":"ContainerStarted","Data":"8451213e68f88b50acd285c2a04d0acc0ec369b3f0e6b3da41528231547370ea"} Feb 19 22:58:14 crc kubenswrapper[4795]: W0219 22:58:14.356066 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f811fa4_8fb3_4adc_a9a8_6539dc03494c.slice/crio-60c4b3b880dd656d352efc165eef512c47015e56a27562cc52b105ed22f11634 WatchSource:0}: Error finding container 60c4b3b880dd656d352efc165eef512c47015e56a27562cc52b105ed22f11634: Status 404 returned error can't find the container with id 60c4b3b880dd656d352efc165eef512c47015e56a27562cc52b105ed22f11634 Feb 19 22:58:14 crc kubenswrapper[4795]: I0219 22:58:14.370933 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-x75sd"] Feb 19 22:58:14 crc kubenswrapper[4795]: W0219 22:58:14.376903 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod41276d39_878a_4ed2_879b_2a053340874e.slice/crio-efd278d8aeafc00856f0283302cda53cec21bb4e4b76e85ab39284420a8b19de WatchSource:0}: Error finding container efd278d8aeafc00856f0283302cda53cec21bb4e4b76e85ab39284420a8b19de: Status 404 returned error can't find the container with id efd278d8aeafc00856f0283302cda53cec21bb4e4b76e85ab39284420a8b19de Feb 19 22:58:14 crc kubenswrapper[4795]: I0219 22:58:14.393775 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-72a3-account-create-update-bfhbs" Feb 19 22:58:14 crc kubenswrapper[4795]: I0219 22:58:14.492123 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-c94f-account-create-update-rqkwd"] Feb 19 22:58:14 crc kubenswrapper[4795]: W0219 22:58:14.496977 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd4e5010_15f4_499e_8279_9a1b814b5490.slice/crio-74fe95b965d4f42b627e8a59a18179c1dc0815ee781cc682da8c01cc899b82ae WatchSource:0}: Error finding container 74fe95b965d4f42b627e8a59a18179c1dc0815ee781cc682da8c01cc899b82ae: Status 404 returned error can't find the container with id 74fe95b965d4f42b627e8a59a18179c1dc0815ee781cc682da8c01cc899b82ae Feb 19 22:58:14 crc kubenswrapper[4795]: I0219 22:58:14.726622 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-93a1-account-create-update-fsvms"] Feb 19 22:58:14 crc kubenswrapper[4795]: W0219 22:58:14.736690 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32478c4a_a97f_4fd3_84f0_a3c221beefe9.slice/crio-71d309d67a45889afd20dc4c0e2cf14a0a5ee60ed196b26ebb504301dee4774c WatchSource:0}: Error finding container 71d309d67a45889afd20dc4c0e2cf14a0a5ee60ed196b26ebb504301dee4774c: Status 404 returned error can't find the container with id 71d309d67a45889afd20dc4c0e2cf14a0a5ee60ed196b26ebb504301dee4774c Feb 19 22:58:14 crc kubenswrapper[4795]: I0219 22:58:14.886070 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-72a3-account-create-update-bfhbs"] Feb 19 22:58:15 crc kubenswrapper[4795]: I0219 22:58:15.359149 4795 generic.go:334] "Generic (PLEG): container finished" podID="41276d39-878a-4ed2-879b-2a053340874e" containerID="aa245a9fc5b7b00ccfbfeb6940d75331800d0a653cedd61f24d6b1162fb6f41d" exitCode=0 Feb 19 22:58:15 crc kubenswrapper[4795]: 
I0219 22:58:15.359224 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-x75sd" event={"ID":"41276d39-878a-4ed2-879b-2a053340874e","Type":"ContainerDied","Data":"aa245a9fc5b7b00ccfbfeb6940d75331800d0a653cedd61f24d6b1162fb6f41d"} Feb 19 22:58:15 crc kubenswrapper[4795]: I0219 22:58:15.359293 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-x75sd" event={"ID":"41276d39-878a-4ed2-879b-2a053340874e","Type":"ContainerStarted","Data":"efd278d8aeafc00856f0283302cda53cec21bb4e4b76e85ab39284420a8b19de"} Feb 19 22:58:15 crc kubenswrapper[4795]: I0219 22:58:15.361337 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-93a1-account-create-update-fsvms" event={"ID":"32478c4a-a97f-4fd3-84f0-a3c221beefe9","Type":"ContainerStarted","Data":"886bef23bb00ea44b033b4812e88d91f6efe4ec5933a69343ea4b9bc4fed7503"} Feb 19 22:58:15 crc kubenswrapper[4795]: I0219 22:58:15.361372 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-93a1-account-create-update-fsvms" event={"ID":"32478c4a-a97f-4fd3-84f0-a3c221beefe9","Type":"ContainerStarted","Data":"71d309d67a45889afd20dc4c0e2cf14a0a5ee60ed196b26ebb504301dee4774c"} Feb 19 22:58:15 crc kubenswrapper[4795]: I0219 22:58:15.363020 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-72a3-account-create-update-bfhbs" event={"ID":"b6f38e11-ea05-447d-8564-117c0f589d88","Type":"ContainerStarted","Data":"0a09a6dbfb16ba30386d5f0a4f52128dc79107a79561ffd233f9f60e4e73100d"} Feb 19 22:58:15 crc kubenswrapper[4795]: I0219 22:58:15.363065 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-72a3-account-create-update-bfhbs" event={"ID":"b6f38e11-ea05-447d-8564-117c0f589d88","Type":"ContainerStarted","Data":"b2256e32f6f1b01f32e54b251404e523a23094a46721ff340018bd197207ffa0"} Feb 19 22:58:15 crc kubenswrapper[4795]: I0219 22:58:15.364593 4795 generic.go:334] "Generic 
(PLEG): container finished" podID="c14f4993-80e4-4fbf-a719-22f17750811b" containerID="00e59b7398cdff96aaf8a625c69d45b06909964fc853d0d0c0e9c6a12321c27b" exitCode=0 Feb 19 22:58:15 crc kubenswrapper[4795]: I0219 22:58:15.364666 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-qp45n" event={"ID":"c14f4993-80e4-4fbf-a719-22f17750811b","Type":"ContainerDied","Data":"00e59b7398cdff96aaf8a625c69d45b06909964fc853d0d0c0e9c6a12321c27b"} Feb 19 22:58:15 crc kubenswrapper[4795]: I0219 22:58:15.366200 4795 generic.go:334] "Generic (PLEG): container finished" podID="bd4e5010-15f4-499e-8279-9a1b814b5490" containerID="4ee5bc336806db9668bb12f08ece22f6cb10a6fe341f3f2abd0e9309ef59d764" exitCode=0 Feb 19 22:58:15 crc kubenswrapper[4795]: I0219 22:58:15.366265 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-c94f-account-create-update-rqkwd" event={"ID":"bd4e5010-15f4-499e-8279-9a1b814b5490","Type":"ContainerDied","Data":"4ee5bc336806db9668bb12f08ece22f6cb10a6fe341f3f2abd0e9309ef59d764"} Feb 19 22:58:15 crc kubenswrapper[4795]: I0219 22:58:15.366313 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-c94f-account-create-update-rqkwd" event={"ID":"bd4e5010-15f4-499e-8279-9a1b814b5490","Type":"ContainerStarted","Data":"74fe95b965d4f42b627e8a59a18179c1dc0815ee781cc682da8c01cc899b82ae"} Feb 19 22:58:15 crc kubenswrapper[4795]: I0219 22:58:15.368252 4795 generic.go:334] "Generic (PLEG): container finished" podID="6f811fa4-8fb3-4adc-a9a8-6539dc03494c" containerID="2ef66b4d7c2d7c4390b233ae4586dc431786dee8c34b27c0308b9e8e392e9643" exitCode=0 Feb 19 22:58:15 crc kubenswrapper[4795]: I0219 22:58:15.368297 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-wj996" event={"ID":"6f811fa4-8fb3-4adc-a9a8-6539dc03494c","Type":"ContainerDied","Data":"2ef66b4d7c2d7c4390b233ae4586dc431786dee8c34b27c0308b9e8e392e9643"} Feb 19 22:58:15 crc kubenswrapper[4795]: I0219 
22:58:15.368317 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-wj996" event={"ID":"6f811fa4-8fb3-4adc-a9a8-6539dc03494c","Type":"ContainerStarted","Data":"60c4b3b880dd656d352efc165eef512c47015e56a27562cc52b105ed22f11634"} Feb 19 22:58:15 crc kubenswrapper[4795]: I0219 22:58:15.392681 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-93a1-account-create-update-fsvms" podStartSLOduration=2.392637781 podStartE2EDuration="2.392637781s" podCreationTimestamp="2026-02-19 22:58:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:58:15.391597885 +0000 UTC m=+5406.584115749" watchObservedRunningTime="2026-02-19 22:58:15.392637781 +0000 UTC m=+5406.585155665" Feb 19 22:58:15 crc kubenswrapper[4795]: I0219 22:58:15.428285 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-72a3-account-create-update-bfhbs" podStartSLOduration=1.428261558 podStartE2EDuration="1.428261558s" podCreationTimestamp="2026-02-19 22:58:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:58:15.424465681 +0000 UTC m=+5406.616983555" watchObservedRunningTime="2026-02-19 22:58:15.428261558 +0000 UTC m=+5406.620779432" Feb 19 22:58:15 crc kubenswrapper[4795]: I0219 22:58:15.436537 4795 scope.go:117] "RemoveContainer" containerID="f513386e45a86e8c71f0247f1dc15d3e271452b3797cb4850f3ce460e73e544e" Feb 19 22:58:15 crc kubenswrapper[4795]: I0219 22:58:15.531728 4795 scope.go:117] "RemoveContainer" containerID="ddf33d9929aa340d91607be45827bb09c61b346f1c01e148af46a4a7d07c2f45" Feb 19 22:58:16 crc kubenswrapper[4795]: I0219 22:58:16.377775 4795 generic.go:334] "Generic (PLEG): container finished" podID="b6f38e11-ea05-447d-8564-117c0f589d88" 
containerID="0a09a6dbfb16ba30386d5f0a4f52128dc79107a79561ffd233f9f60e4e73100d" exitCode=0 Feb 19 22:58:16 crc kubenswrapper[4795]: I0219 22:58:16.377986 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-72a3-account-create-update-bfhbs" event={"ID":"b6f38e11-ea05-447d-8564-117c0f589d88","Type":"ContainerDied","Data":"0a09a6dbfb16ba30386d5f0a4f52128dc79107a79561ffd233f9f60e4e73100d"} Feb 19 22:58:16 crc kubenswrapper[4795]: I0219 22:58:16.381711 4795 generic.go:334] "Generic (PLEG): container finished" podID="32478c4a-a97f-4fd3-84f0-a3c221beefe9" containerID="886bef23bb00ea44b033b4812e88d91f6efe4ec5933a69343ea4b9bc4fed7503" exitCode=0 Feb 19 22:58:16 crc kubenswrapper[4795]: I0219 22:58:16.381894 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-93a1-account-create-update-fsvms" event={"ID":"32478c4a-a97f-4fd3-84f0-a3c221beefe9","Type":"ContainerDied","Data":"886bef23bb00ea44b033b4812e88d91f6efe4ec5933a69343ea4b9bc4fed7503"} Feb 19 22:58:16 crc kubenswrapper[4795]: I0219 22:58:16.807183 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-wj996" Feb 19 22:58:16 crc kubenswrapper[4795]: I0219 22:58:16.853204 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f811fa4-8fb3-4adc-a9a8-6539dc03494c-operator-scripts\") pod \"6f811fa4-8fb3-4adc-a9a8-6539dc03494c\" (UID: \"6f811fa4-8fb3-4adc-a9a8-6539dc03494c\") " Feb 19 22:58:16 crc kubenswrapper[4795]: I0219 22:58:16.853375 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8pdvg\" (UniqueName: \"kubernetes.io/projected/6f811fa4-8fb3-4adc-a9a8-6539dc03494c-kube-api-access-8pdvg\") pod \"6f811fa4-8fb3-4adc-a9a8-6539dc03494c\" (UID: \"6f811fa4-8fb3-4adc-a9a8-6539dc03494c\") " Feb 19 22:58:16 crc kubenswrapper[4795]: I0219 22:58:16.857899 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f811fa4-8fb3-4adc-a9a8-6539dc03494c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6f811fa4-8fb3-4adc-a9a8-6539dc03494c" (UID: "6f811fa4-8fb3-4adc-a9a8-6539dc03494c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:58:16 crc kubenswrapper[4795]: I0219 22:58:16.879526 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f811fa4-8fb3-4adc-a9a8-6539dc03494c-kube-api-access-8pdvg" (OuterVolumeSpecName: "kube-api-access-8pdvg") pod "6f811fa4-8fb3-4adc-a9a8-6539dc03494c" (UID: "6f811fa4-8fb3-4adc-a9a8-6539dc03494c"). InnerVolumeSpecName "kube-api-access-8pdvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:58:16 crc kubenswrapper[4795]: I0219 22:58:16.935045 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-qp45n" Feb 19 22:58:16 crc kubenswrapper[4795]: I0219 22:58:16.942514 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-x75sd" Feb 19 22:58:16 crc kubenswrapper[4795]: I0219 22:58:16.949447 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-c94f-account-create-update-rqkwd" Feb 19 22:58:16 crc kubenswrapper[4795]: I0219 22:58:16.963521 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f811fa4-8fb3-4adc-a9a8-6539dc03494c-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:16 crc kubenswrapper[4795]: I0219 22:58:16.964010 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8pdvg\" (UniqueName: \"kubernetes.io/projected/6f811fa4-8fb3-4adc-a9a8-6539dc03494c-kube-api-access-8pdvg\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.064850 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mv5w2\" (UniqueName: \"kubernetes.io/projected/41276d39-878a-4ed2-879b-2a053340874e-kube-api-access-mv5w2\") pod \"41276d39-878a-4ed2-879b-2a053340874e\" (UID: \"41276d39-878a-4ed2-879b-2a053340874e\") " Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.064931 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65k7f\" (UniqueName: \"kubernetes.io/projected/c14f4993-80e4-4fbf-a719-22f17750811b-kube-api-access-65k7f\") pod \"c14f4993-80e4-4fbf-a719-22f17750811b\" (UID: \"c14f4993-80e4-4fbf-a719-22f17750811b\") " Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.065131 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/bd4e5010-15f4-499e-8279-9a1b814b5490-operator-scripts\") pod \"bd4e5010-15f4-499e-8279-9a1b814b5490\" (UID: \"bd4e5010-15f4-499e-8279-9a1b814b5490\") " Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.065231 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41276d39-878a-4ed2-879b-2a053340874e-operator-scripts\") pod \"41276d39-878a-4ed2-879b-2a053340874e\" (UID: \"41276d39-878a-4ed2-879b-2a053340874e\") " Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.065249 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlgkl\" (UniqueName: \"kubernetes.io/projected/bd4e5010-15f4-499e-8279-9a1b814b5490-kube-api-access-xlgkl\") pod \"bd4e5010-15f4-499e-8279-9a1b814b5490\" (UID: \"bd4e5010-15f4-499e-8279-9a1b814b5490\") " Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.065291 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c14f4993-80e4-4fbf-a719-22f17750811b-operator-scripts\") pod \"c14f4993-80e4-4fbf-a719-22f17750811b\" (UID: \"c14f4993-80e4-4fbf-a719-22f17750811b\") " Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.065804 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41276d39-878a-4ed2-879b-2a053340874e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "41276d39-878a-4ed2-879b-2a053340874e" (UID: "41276d39-878a-4ed2-879b-2a053340874e"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.065875 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd4e5010-15f4-499e-8279-9a1b814b5490-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bd4e5010-15f4-499e-8279-9a1b814b5490" (UID: "bd4e5010-15f4-499e-8279-9a1b814b5490"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.065975 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c14f4993-80e4-4fbf-a719-22f17750811b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c14f4993-80e4-4fbf-a719-22f17750811b" (UID: "c14f4993-80e4-4fbf-a719-22f17750811b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.068510 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c14f4993-80e4-4fbf-a719-22f17750811b-kube-api-access-65k7f" (OuterVolumeSpecName: "kube-api-access-65k7f") pod "c14f4993-80e4-4fbf-a719-22f17750811b" (UID: "c14f4993-80e4-4fbf-a719-22f17750811b"). InnerVolumeSpecName "kube-api-access-65k7f". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.069287 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd4e5010-15f4-499e-8279-9a1b814b5490-kube-api-access-xlgkl" (OuterVolumeSpecName: "kube-api-access-xlgkl") pod "bd4e5010-15f4-499e-8279-9a1b814b5490" (UID: "bd4e5010-15f4-499e-8279-9a1b814b5490"). InnerVolumeSpecName "kube-api-access-xlgkl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.070757 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41276d39-878a-4ed2-879b-2a053340874e-kube-api-access-mv5w2" (OuterVolumeSpecName: "kube-api-access-mv5w2") pod "41276d39-878a-4ed2-879b-2a053340874e" (UID: "41276d39-878a-4ed2-879b-2a053340874e"). InnerVolumeSpecName "kube-api-access-mv5w2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.167249 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41276d39-878a-4ed2-879b-2a053340874e-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.167286 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xlgkl\" (UniqueName: \"kubernetes.io/projected/bd4e5010-15f4-499e-8279-9a1b814b5490-kube-api-access-xlgkl\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.167298 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c14f4993-80e4-4fbf-a719-22f17750811b-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.167307 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mv5w2\" (UniqueName: \"kubernetes.io/projected/41276d39-878a-4ed2-879b-2a053340874e-kube-api-access-mv5w2\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.167316 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65k7f\" (UniqueName: \"kubernetes.io/projected/c14f4993-80e4-4fbf-a719-22f17750811b-kube-api-access-65k7f\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.167324 4795 
reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd4e5010-15f4-499e-8279-9a1b814b5490-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.394891 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-x75sd" Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.394895 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-x75sd" event={"ID":"41276d39-878a-4ed2-879b-2a053340874e","Type":"ContainerDied","Data":"efd278d8aeafc00856f0283302cda53cec21bb4e4b76e85ab39284420a8b19de"} Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.394948 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="efd278d8aeafc00856f0283302cda53cec21bb4e4b76e85ab39284420a8b19de" Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.396763 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-qp45n" Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.396767 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-qp45n" event={"ID":"c14f4993-80e4-4fbf-a719-22f17750811b","Type":"ContainerDied","Data":"8451213e68f88b50acd285c2a04d0acc0ec369b3f0e6b3da41528231547370ea"} Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.396809 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8451213e68f88b50acd285c2a04d0acc0ec369b3f0e6b3da41528231547370ea" Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.399657 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-c94f-account-create-update-rqkwd" Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.399650 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-c94f-account-create-update-rqkwd" event={"ID":"bd4e5010-15f4-499e-8279-9a1b814b5490","Type":"ContainerDied","Data":"74fe95b965d4f42b627e8a59a18179c1dc0815ee781cc682da8c01cc899b82ae"} Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.400049 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74fe95b965d4f42b627e8a59a18179c1dc0815ee781cc682da8c01cc899b82ae" Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.401541 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-wj996" Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.403294 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-wj996" event={"ID":"6f811fa4-8fb3-4adc-a9a8-6539dc03494c","Type":"ContainerDied","Data":"60c4b3b880dd656d352efc165eef512c47015e56a27562cc52b105ed22f11634"} Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.403500 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="60c4b3b880dd656d352efc165eef512c47015e56a27562cc52b105ed22f11634" Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.695502 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-93a1-account-create-update-fsvms" Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.778561 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32478c4a-a97f-4fd3-84f0-a3c221beefe9-operator-scripts\") pod \"32478c4a-a97f-4fd3-84f0-a3c221beefe9\" (UID: \"32478c4a-a97f-4fd3-84f0-a3c221beefe9\") " Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.778737 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdgwq\" (UniqueName: \"kubernetes.io/projected/32478c4a-a97f-4fd3-84f0-a3c221beefe9-kube-api-access-pdgwq\") pod \"32478c4a-a97f-4fd3-84f0-a3c221beefe9\" (UID: \"32478c4a-a97f-4fd3-84f0-a3c221beefe9\") " Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.779107 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32478c4a-a97f-4fd3-84f0-a3c221beefe9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "32478c4a-a97f-4fd3-84f0-a3c221beefe9" (UID: "32478c4a-a97f-4fd3-84f0-a3c221beefe9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.779237 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32478c4a-a97f-4fd3-84f0-a3c221beefe9-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.784280 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32478c4a-a97f-4fd3-84f0-a3c221beefe9-kube-api-access-pdgwq" (OuterVolumeSpecName: "kube-api-access-pdgwq") pod "32478c4a-a97f-4fd3-84f0-a3c221beefe9" (UID: "32478c4a-a97f-4fd3-84f0-a3c221beefe9"). InnerVolumeSpecName "kube-api-access-pdgwq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.853911 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-72a3-account-create-update-bfhbs" Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.881209 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdgwq\" (UniqueName: \"kubernetes.io/projected/32478c4a-a97f-4fd3-84f0-a3c221beefe9-kube-api-access-pdgwq\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.982761 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6f38e11-ea05-447d-8564-117c0f589d88-operator-scripts\") pod \"b6f38e11-ea05-447d-8564-117c0f589d88\" (UID: \"b6f38e11-ea05-447d-8564-117c0f589d88\") " Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.982891 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6sctl\" (UniqueName: \"kubernetes.io/projected/b6f38e11-ea05-447d-8564-117c0f589d88-kube-api-access-6sctl\") pod \"b6f38e11-ea05-447d-8564-117c0f589d88\" (UID: \"b6f38e11-ea05-447d-8564-117c0f589d88\") " Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.983536 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6f38e11-ea05-447d-8564-117c0f589d88-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b6f38e11-ea05-447d-8564-117c0f589d88" (UID: "b6f38e11-ea05-447d-8564-117c0f589d88"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:58:17 crc kubenswrapper[4795]: I0219 22:58:17.985994 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6f38e11-ea05-447d-8564-117c0f589d88-kube-api-access-6sctl" (OuterVolumeSpecName: "kube-api-access-6sctl") pod "b6f38e11-ea05-447d-8564-117c0f589d88" (UID: "b6f38e11-ea05-447d-8564-117c0f589d88"). InnerVolumeSpecName "kube-api-access-6sctl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:58:18 crc kubenswrapper[4795]: I0219 22:58:18.085111 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6f38e11-ea05-447d-8564-117c0f589d88-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:18 crc kubenswrapper[4795]: I0219 22:58:18.085144 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6sctl\" (UniqueName: \"kubernetes.io/projected/b6f38e11-ea05-447d-8564-117c0f589d88-kube-api-access-6sctl\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:18 crc kubenswrapper[4795]: I0219 22:58:18.414015 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-93a1-account-create-update-fsvms" event={"ID":"32478c4a-a97f-4fd3-84f0-a3c221beefe9","Type":"ContainerDied","Data":"71d309d67a45889afd20dc4c0e2cf14a0a5ee60ed196b26ebb504301dee4774c"} Feb 19 22:58:18 crc kubenswrapper[4795]: I0219 22:58:18.414072 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="71d309d67a45889afd20dc4c0e2cf14a0a5ee60ed196b26ebb504301dee4774c" Feb 19 22:58:18 crc kubenswrapper[4795]: I0219 22:58:18.414335 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-93a1-account-create-update-fsvms" Feb 19 22:58:18 crc kubenswrapper[4795]: I0219 22:58:18.417977 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-72a3-account-create-update-bfhbs" event={"ID":"b6f38e11-ea05-447d-8564-117c0f589d88","Type":"ContainerDied","Data":"b2256e32f6f1b01f32e54b251404e523a23094a46721ff340018bd197207ffa0"} Feb 19 22:58:18 crc kubenswrapper[4795]: I0219 22:58:18.418038 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2256e32f6f1b01f32e54b251404e523a23094a46721ff340018bd197207ffa0" Feb 19 22:58:18 crc kubenswrapper[4795]: I0219 22:58:18.418122 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-72a3-account-create-update-bfhbs" Feb 19 22:58:19 crc kubenswrapper[4795]: I0219 22:58:19.057310 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-9nmbd"] Feb 19 22:58:19 crc kubenswrapper[4795]: E0219 22:58:19.057981 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c14f4993-80e4-4fbf-a719-22f17750811b" containerName="mariadb-database-create" Feb 19 22:58:19 crc kubenswrapper[4795]: I0219 22:58:19.057998 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="c14f4993-80e4-4fbf-a719-22f17750811b" containerName="mariadb-database-create" Feb 19 22:58:19 crc kubenswrapper[4795]: E0219 22:58:19.058032 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd4e5010-15f4-499e-8279-9a1b814b5490" containerName="mariadb-account-create-update" Feb 19 22:58:19 crc kubenswrapper[4795]: I0219 22:58:19.058040 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd4e5010-15f4-499e-8279-9a1b814b5490" containerName="mariadb-account-create-update" Feb 19 22:58:19 crc kubenswrapper[4795]: E0219 22:58:19.058061 4795 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="32478c4a-a97f-4fd3-84f0-a3c221beefe9" containerName="mariadb-account-create-update" Feb 19 22:58:19 crc kubenswrapper[4795]: I0219 22:58:19.058070 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="32478c4a-a97f-4fd3-84f0-a3c221beefe9" containerName="mariadb-account-create-update" Feb 19 22:58:19 crc kubenswrapper[4795]: E0219 22:58:19.058085 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f811fa4-8fb3-4adc-a9a8-6539dc03494c" containerName="mariadb-database-create" Feb 19 22:58:19 crc kubenswrapper[4795]: I0219 22:58:19.058092 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f811fa4-8fb3-4adc-a9a8-6539dc03494c" containerName="mariadb-database-create" Feb 19 22:58:19 crc kubenswrapper[4795]: E0219 22:58:19.058109 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41276d39-878a-4ed2-879b-2a053340874e" containerName="mariadb-database-create" Feb 19 22:58:19 crc kubenswrapper[4795]: I0219 22:58:19.058116 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="41276d39-878a-4ed2-879b-2a053340874e" containerName="mariadb-database-create" Feb 19 22:58:19 crc kubenswrapper[4795]: E0219 22:58:19.058129 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6f38e11-ea05-447d-8564-117c0f589d88" containerName="mariadb-account-create-update" Feb 19 22:58:19 crc kubenswrapper[4795]: I0219 22:58:19.058137 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6f38e11-ea05-447d-8564-117c0f589d88" containerName="mariadb-account-create-update" Feb 19 22:58:19 crc kubenswrapper[4795]: I0219 22:58:19.058347 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="c14f4993-80e4-4fbf-a719-22f17750811b" containerName="mariadb-database-create" Feb 19 22:58:19 crc kubenswrapper[4795]: I0219 22:58:19.058363 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd4e5010-15f4-499e-8279-9a1b814b5490" containerName="mariadb-account-create-update" Feb 19 22:58:19 crc 
kubenswrapper[4795]: I0219 22:58:19.058378 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="32478c4a-a97f-4fd3-84f0-a3c221beefe9" containerName="mariadb-account-create-update" Feb 19 22:58:19 crc kubenswrapper[4795]: I0219 22:58:19.058392 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6f38e11-ea05-447d-8564-117c0f589d88" containerName="mariadb-account-create-update" Feb 19 22:58:19 crc kubenswrapper[4795]: I0219 22:58:19.058410 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f811fa4-8fb3-4adc-a9a8-6539dc03494c" containerName="mariadb-database-create" Feb 19 22:58:19 crc kubenswrapper[4795]: I0219 22:58:19.058426 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="41276d39-878a-4ed2-879b-2a053340874e" containerName="mariadb-database-create" Feb 19 22:58:19 crc kubenswrapper[4795]: I0219 22:58:19.059133 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-9nmbd" Feb 19 22:58:19 crc kubenswrapper[4795]: I0219 22:58:19.062119 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-ghzps" Feb 19 22:58:19 crc kubenswrapper[4795]: I0219 22:58:19.062151 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 19 22:58:19 crc kubenswrapper[4795]: I0219 22:58:19.062406 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Feb 19 22:58:19 crc kubenswrapper[4795]: I0219 22:58:19.070049 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-9nmbd"] Feb 19 22:58:19 crc kubenswrapper[4795]: I0219 22:58:19.203866 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/303b3e4f-4b2b-4071-b54d-fe4aec3f18f5-scripts\") pod 
\"nova-cell0-conductor-db-sync-9nmbd\" (UID: \"303b3e4f-4b2b-4071-b54d-fe4aec3f18f5\") " pod="openstack/nova-cell0-conductor-db-sync-9nmbd" Feb 19 22:58:19 crc kubenswrapper[4795]: I0219 22:58:19.204136 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/303b3e4f-4b2b-4071-b54d-fe4aec3f18f5-config-data\") pod \"nova-cell0-conductor-db-sync-9nmbd\" (UID: \"303b3e4f-4b2b-4071-b54d-fe4aec3f18f5\") " pod="openstack/nova-cell0-conductor-db-sync-9nmbd" Feb 19 22:58:19 crc kubenswrapper[4795]: I0219 22:58:19.204417 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lplk\" (UniqueName: \"kubernetes.io/projected/303b3e4f-4b2b-4071-b54d-fe4aec3f18f5-kube-api-access-5lplk\") pod \"nova-cell0-conductor-db-sync-9nmbd\" (UID: \"303b3e4f-4b2b-4071-b54d-fe4aec3f18f5\") " pod="openstack/nova-cell0-conductor-db-sync-9nmbd" Feb 19 22:58:19 crc kubenswrapper[4795]: I0219 22:58:19.204493 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/303b3e4f-4b2b-4071-b54d-fe4aec3f18f5-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-9nmbd\" (UID: \"303b3e4f-4b2b-4071-b54d-fe4aec3f18f5\") " pod="openstack/nova-cell0-conductor-db-sync-9nmbd" Feb 19 22:58:19 crc kubenswrapper[4795]: I0219 22:58:19.305507 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/303b3e4f-4b2b-4071-b54d-fe4aec3f18f5-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-9nmbd\" (UID: \"303b3e4f-4b2b-4071-b54d-fe4aec3f18f5\") " pod="openstack/nova-cell0-conductor-db-sync-9nmbd" Feb 19 22:58:19 crc kubenswrapper[4795]: I0219 22:58:19.305557 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/303b3e4f-4b2b-4071-b54d-fe4aec3f18f5-scripts\") pod \"nova-cell0-conductor-db-sync-9nmbd\" (UID: \"303b3e4f-4b2b-4071-b54d-fe4aec3f18f5\") " pod="openstack/nova-cell0-conductor-db-sync-9nmbd" Feb 19 22:58:19 crc kubenswrapper[4795]: I0219 22:58:19.305648 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/303b3e4f-4b2b-4071-b54d-fe4aec3f18f5-config-data\") pod \"nova-cell0-conductor-db-sync-9nmbd\" (UID: \"303b3e4f-4b2b-4071-b54d-fe4aec3f18f5\") " pod="openstack/nova-cell0-conductor-db-sync-9nmbd" Feb 19 22:58:19 crc kubenswrapper[4795]: I0219 22:58:19.305730 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lplk\" (UniqueName: \"kubernetes.io/projected/303b3e4f-4b2b-4071-b54d-fe4aec3f18f5-kube-api-access-5lplk\") pod \"nova-cell0-conductor-db-sync-9nmbd\" (UID: \"303b3e4f-4b2b-4071-b54d-fe4aec3f18f5\") " pod="openstack/nova-cell0-conductor-db-sync-9nmbd" Feb 19 22:58:19 crc kubenswrapper[4795]: I0219 22:58:19.310725 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/303b3e4f-4b2b-4071-b54d-fe4aec3f18f5-scripts\") pod \"nova-cell0-conductor-db-sync-9nmbd\" (UID: \"303b3e4f-4b2b-4071-b54d-fe4aec3f18f5\") " pod="openstack/nova-cell0-conductor-db-sync-9nmbd" Feb 19 22:58:19 crc kubenswrapper[4795]: I0219 22:58:19.310918 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/303b3e4f-4b2b-4071-b54d-fe4aec3f18f5-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-9nmbd\" (UID: \"303b3e4f-4b2b-4071-b54d-fe4aec3f18f5\") " pod="openstack/nova-cell0-conductor-db-sync-9nmbd" Feb 19 22:58:19 crc kubenswrapper[4795]: I0219 22:58:19.311872 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/303b3e4f-4b2b-4071-b54d-fe4aec3f18f5-config-data\") pod \"nova-cell0-conductor-db-sync-9nmbd\" (UID: \"303b3e4f-4b2b-4071-b54d-fe4aec3f18f5\") " pod="openstack/nova-cell0-conductor-db-sync-9nmbd" Feb 19 22:58:19 crc kubenswrapper[4795]: I0219 22:58:19.333930 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lplk\" (UniqueName: \"kubernetes.io/projected/303b3e4f-4b2b-4071-b54d-fe4aec3f18f5-kube-api-access-5lplk\") pod \"nova-cell0-conductor-db-sync-9nmbd\" (UID: \"303b3e4f-4b2b-4071-b54d-fe4aec3f18f5\") " pod="openstack/nova-cell0-conductor-db-sync-9nmbd" Feb 19 22:58:19 crc kubenswrapper[4795]: I0219 22:58:19.385395 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-9nmbd" Feb 19 22:58:19 crc kubenswrapper[4795]: I0219 22:58:19.808998 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-9nmbd"] Feb 19 22:58:19 crc kubenswrapper[4795]: W0219 22:58:19.810667 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod303b3e4f_4b2b_4071_b54d_fe4aec3f18f5.slice/crio-c029b8fb1d770dd06af2884e81d13768a10d32b4265b805c747ad400ca0ceb30 WatchSource:0}: Error finding container c029b8fb1d770dd06af2884e81d13768a10d32b4265b805c747ad400ca0ceb30: Status 404 returned error can't find the container with id c029b8fb1d770dd06af2884e81d13768a10d32b4265b805c747ad400ca0ceb30 Feb 19 22:58:20 crc kubenswrapper[4795]: I0219 22:58:20.459225 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-9nmbd" event={"ID":"303b3e4f-4b2b-4071-b54d-fe4aec3f18f5","Type":"ContainerStarted","Data":"eb9dfa4e109128ae704882fd6740207250d7781752ee79e05d86190dadc20b22"} Feb 19 22:58:20 crc kubenswrapper[4795]: I0219 22:58:20.459281 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-conductor-db-sync-9nmbd" event={"ID":"303b3e4f-4b2b-4071-b54d-fe4aec3f18f5","Type":"ContainerStarted","Data":"c029b8fb1d770dd06af2884e81d13768a10d32b4265b805c747ad400ca0ceb30"} Feb 19 22:58:20 crc kubenswrapper[4795]: I0219 22:58:20.472755 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-9nmbd" podStartSLOduration=1.472734107 podStartE2EDuration="1.472734107s" podCreationTimestamp="2026-02-19 22:58:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:58:20.472402738 +0000 UTC m=+5411.664920632" watchObservedRunningTime="2026-02-19 22:58:20.472734107 +0000 UTC m=+5411.665251971" Feb 19 22:58:25 crc kubenswrapper[4795]: I0219 22:58:25.510355 4795 generic.go:334] "Generic (PLEG): container finished" podID="303b3e4f-4b2b-4071-b54d-fe4aec3f18f5" containerID="eb9dfa4e109128ae704882fd6740207250d7781752ee79e05d86190dadc20b22" exitCode=0 Feb 19 22:58:25 crc kubenswrapper[4795]: I0219 22:58:25.510463 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-9nmbd" event={"ID":"303b3e4f-4b2b-4071-b54d-fe4aec3f18f5","Type":"ContainerDied","Data":"eb9dfa4e109128ae704882fd6740207250d7781752ee79e05d86190dadc20b22"} Feb 19 22:58:26 crc kubenswrapper[4795]: I0219 22:58:26.803699 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-9nmbd" Feb 19 22:58:26 crc kubenswrapper[4795]: I0219 22:58:26.833840 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/303b3e4f-4b2b-4071-b54d-fe4aec3f18f5-combined-ca-bundle\") pod \"303b3e4f-4b2b-4071-b54d-fe4aec3f18f5\" (UID: \"303b3e4f-4b2b-4071-b54d-fe4aec3f18f5\") " Feb 19 22:58:26 crc kubenswrapper[4795]: I0219 22:58:26.833906 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/303b3e4f-4b2b-4071-b54d-fe4aec3f18f5-config-data\") pod \"303b3e4f-4b2b-4071-b54d-fe4aec3f18f5\" (UID: \"303b3e4f-4b2b-4071-b54d-fe4aec3f18f5\") " Feb 19 22:58:26 crc kubenswrapper[4795]: I0219 22:58:26.833955 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/303b3e4f-4b2b-4071-b54d-fe4aec3f18f5-scripts\") pod \"303b3e4f-4b2b-4071-b54d-fe4aec3f18f5\" (UID: \"303b3e4f-4b2b-4071-b54d-fe4aec3f18f5\") " Feb 19 22:58:26 crc kubenswrapper[4795]: I0219 22:58:26.834040 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lplk\" (UniqueName: \"kubernetes.io/projected/303b3e4f-4b2b-4071-b54d-fe4aec3f18f5-kube-api-access-5lplk\") pod \"303b3e4f-4b2b-4071-b54d-fe4aec3f18f5\" (UID: \"303b3e4f-4b2b-4071-b54d-fe4aec3f18f5\") " Feb 19 22:58:26 crc kubenswrapper[4795]: I0219 22:58:26.839052 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/303b3e4f-4b2b-4071-b54d-fe4aec3f18f5-kube-api-access-5lplk" (OuterVolumeSpecName: "kube-api-access-5lplk") pod "303b3e4f-4b2b-4071-b54d-fe4aec3f18f5" (UID: "303b3e4f-4b2b-4071-b54d-fe4aec3f18f5"). InnerVolumeSpecName "kube-api-access-5lplk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:58:26 crc kubenswrapper[4795]: I0219 22:58:26.839318 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/303b3e4f-4b2b-4071-b54d-fe4aec3f18f5-scripts" (OuterVolumeSpecName: "scripts") pod "303b3e4f-4b2b-4071-b54d-fe4aec3f18f5" (UID: "303b3e4f-4b2b-4071-b54d-fe4aec3f18f5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:58:26 crc kubenswrapper[4795]: I0219 22:58:26.857751 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/303b3e4f-4b2b-4071-b54d-fe4aec3f18f5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "303b3e4f-4b2b-4071-b54d-fe4aec3f18f5" (UID: "303b3e4f-4b2b-4071-b54d-fe4aec3f18f5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:58:26 crc kubenswrapper[4795]: I0219 22:58:26.859646 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/303b3e4f-4b2b-4071-b54d-fe4aec3f18f5-config-data" (OuterVolumeSpecName: "config-data") pod "303b3e4f-4b2b-4071-b54d-fe4aec3f18f5" (UID: "303b3e4f-4b2b-4071-b54d-fe4aec3f18f5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:58:26 crc kubenswrapper[4795]: I0219 22:58:26.936350 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5lplk\" (UniqueName: \"kubernetes.io/projected/303b3e4f-4b2b-4071-b54d-fe4aec3f18f5-kube-api-access-5lplk\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:26 crc kubenswrapper[4795]: I0219 22:58:26.936391 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/303b3e4f-4b2b-4071-b54d-fe4aec3f18f5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:26 crc kubenswrapper[4795]: I0219 22:58:26.936406 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/303b3e4f-4b2b-4071-b54d-fe4aec3f18f5-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:26 crc kubenswrapper[4795]: I0219 22:58:26.936418 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/303b3e4f-4b2b-4071-b54d-fe4aec3f18f5-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:27 crc kubenswrapper[4795]: I0219 22:58:27.534751 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-9nmbd" Feb 19 22:58:27 crc kubenswrapper[4795]: I0219 22:58:27.534704 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-9nmbd" event={"ID":"303b3e4f-4b2b-4071-b54d-fe4aec3f18f5","Type":"ContainerDied","Data":"c029b8fb1d770dd06af2884e81d13768a10d32b4265b805c747ad400ca0ceb30"} Feb 19 22:58:27 crc kubenswrapper[4795]: I0219 22:58:27.534877 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c029b8fb1d770dd06af2884e81d13768a10d32b4265b805c747ad400ca0ceb30" Feb 19 22:58:27 crc kubenswrapper[4795]: I0219 22:58:27.607250 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 19 22:58:27 crc kubenswrapper[4795]: E0219 22:58:27.607609 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="303b3e4f-4b2b-4071-b54d-fe4aec3f18f5" containerName="nova-cell0-conductor-db-sync" Feb 19 22:58:27 crc kubenswrapper[4795]: I0219 22:58:27.607621 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="303b3e4f-4b2b-4071-b54d-fe4aec3f18f5" containerName="nova-cell0-conductor-db-sync" Feb 19 22:58:27 crc kubenswrapper[4795]: I0219 22:58:27.607768 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="303b3e4f-4b2b-4071-b54d-fe4aec3f18f5" containerName="nova-cell0-conductor-db-sync" Feb 19 22:58:27 crc kubenswrapper[4795]: I0219 22:58:27.608318 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 19 22:58:27 crc kubenswrapper[4795]: I0219 22:58:27.611540 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 19 22:58:27 crc kubenswrapper[4795]: I0219 22:58:27.613181 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-ghzps" Feb 19 22:58:27 crc kubenswrapper[4795]: I0219 22:58:27.628342 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 19 22:58:27 crc kubenswrapper[4795]: I0219 22:58:27.648819 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9839fd0b-0161-4772-bda3-ddc2914d7e83-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"9839fd0b-0161-4772-bda3-ddc2914d7e83\") " pod="openstack/nova-cell0-conductor-0" Feb 19 22:58:27 crc kubenswrapper[4795]: I0219 22:58:27.649043 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9839fd0b-0161-4772-bda3-ddc2914d7e83-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"9839fd0b-0161-4772-bda3-ddc2914d7e83\") " pod="openstack/nova-cell0-conductor-0" Feb 19 22:58:27 crc kubenswrapper[4795]: I0219 22:58:27.649472 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l54h5\" (UniqueName: \"kubernetes.io/projected/9839fd0b-0161-4772-bda3-ddc2914d7e83-kube-api-access-l54h5\") pod \"nova-cell0-conductor-0\" (UID: \"9839fd0b-0161-4772-bda3-ddc2914d7e83\") " pod="openstack/nova-cell0-conductor-0" Feb 19 22:58:27 crc kubenswrapper[4795]: I0219 22:58:27.751230 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l54h5\" (UniqueName: 
\"kubernetes.io/projected/9839fd0b-0161-4772-bda3-ddc2914d7e83-kube-api-access-l54h5\") pod \"nova-cell0-conductor-0\" (UID: \"9839fd0b-0161-4772-bda3-ddc2914d7e83\") " pod="openstack/nova-cell0-conductor-0" Feb 19 22:58:27 crc kubenswrapper[4795]: I0219 22:58:27.751314 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9839fd0b-0161-4772-bda3-ddc2914d7e83-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"9839fd0b-0161-4772-bda3-ddc2914d7e83\") " pod="openstack/nova-cell0-conductor-0" Feb 19 22:58:27 crc kubenswrapper[4795]: I0219 22:58:27.751386 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9839fd0b-0161-4772-bda3-ddc2914d7e83-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"9839fd0b-0161-4772-bda3-ddc2914d7e83\") " pod="openstack/nova-cell0-conductor-0" Feb 19 22:58:27 crc kubenswrapper[4795]: I0219 22:58:27.758115 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9839fd0b-0161-4772-bda3-ddc2914d7e83-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"9839fd0b-0161-4772-bda3-ddc2914d7e83\") " pod="openstack/nova-cell0-conductor-0" Feb 19 22:58:27 crc kubenswrapper[4795]: I0219 22:58:27.758602 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9839fd0b-0161-4772-bda3-ddc2914d7e83-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"9839fd0b-0161-4772-bda3-ddc2914d7e83\") " pod="openstack/nova-cell0-conductor-0" Feb 19 22:58:27 crc kubenswrapper[4795]: I0219 22:58:27.769152 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l54h5\" (UniqueName: \"kubernetes.io/projected/9839fd0b-0161-4772-bda3-ddc2914d7e83-kube-api-access-l54h5\") pod \"nova-cell0-conductor-0\" (UID: 
\"9839fd0b-0161-4772-bda3-ddc2914d7e83\") " pod="openstack/nova-cell0-conductor-0" Feb 19 22:58:27 crc kubenswrapper[4795]: I0219 22:58:27.927183 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 19 22:58:28 crc kubenswrapper[4795]: W0219 22:58:28.371134 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9839fd0b_0161_4772_bda3_ddc2914d7e83.slice/crio-453e3afffd87b59561c01a967d49235af5b257402e1ea25bffbfe37497656af8 WatchSource:0}: Error finding container 453e3afffd87b59561c01a967d49235af5b257402e1ea25bffbfe37497656af8: Status 404 returned error can't find the container with id 453e3afffd87b59561c01a967d49235af5b257402e1ea25bffbfe37497656af8 Feb 19 22:58:28 crc kubenswrapper[4795]: I0219 22:58:28.372543 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 19 22:58:28 crc kubenswrapper[4795]: I0219 22:58:28.428236 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 22:58:28 crc kubenswrapper[4795]: I0219 22:58:28.428486 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 22:58:28 crc kubenswrapper[4795]: I0219 22:58:28.554231 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" 
event={"ID":"9839fd0b-0161-4772-bda3-ddc2914d7e83","Type":"ContainerStarted","Data":"453e3afffd87b59561c01a967d49235af5b257402e1ea25bffbfe37497656af8"} Feb 19 22:58:28 crc kubenswrapper[4795]: I0219 22:58:28.554861 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Feb 19 22:58:28 crc kubenswrapper[4795]: I0219 22:58:28.579236 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=1.5792107880000001 podStartE2EDuration="1.579210788s" podCreationTimestamp="2026-02-19 22:58:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:58:28.572400455 +0000 UTC m=+5419.764918339" watchObservedRunningTime="2026-02-19 22:58:28.579210788 +0000 UTC m=+5419.771728712" Feb 19 22:58:29 crc kubenswrapper[4795]: I0219 22:58:29.565405 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"9839fd0b-0161-4772-bda3-ddc2914d7e83","Type":"ContainerStarted","Data":"23d5580181fbc629a99f1fcb391f9c4c8a3cc847543b9789039eab2e110e9ef3"} Feb 19 22:58:37 crc kubenswrapper[4795]: I0219 22:58:37.952571 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.507694 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-cdffm"] Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.508990 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-cdffm" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.511804 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.512256 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.529954 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-cdffm"] Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.610790 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.613276 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.614908 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.620755 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.630699 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7314a002-868e-4028-b341-b719a609e21c-scripts\") pod \"nova-cell0-cell-mapping-cdffm\" (UID: \"7314a002-868e-4028-b341-b719a609e21c\") " pod="openstack/nova-cell0-cell-mapping-cdffm" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.630765 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7314a002-868e-4028-b341-b719a609e21c-config-data\") pod \"nova-cell0-cell-mapping-cdffm\" (UID: \"7314a002-868e-4028-b341-b719a609e21c\") " 
pod="openstack/nova-cell0-cell-mapping-cdffm" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.630921 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7314a002-868e-4028-b341-b719a609e21c-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-cdffm\" (UID: \"7314a002-868e-4028-b341-b719a609e21c\") " pod="openstack/nova-cell0-cell-mapping-cdffm" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.631105 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgqqm\" (UniqueName: \"kubernetes.io/projected/7314a002-868e-4028-b341-b719a609e21c-kube-api-access-kgqqm\") pod \"nova-cell0-cell-mapping-cdffm\" (UID: \"7314a002-868e-4028-b341-b719a609e21c\") " pod="openstack/nova-cell0-cell-mapping-cdffm" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.702757 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.703906 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.705908 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.711267 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.733125 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7jkp\" (UniqueName: \"kubernetes.io/projected/53af86cc-b0d0-4ba5-9294-4aaa6cef6c09-kube-api-access-f7jkp\") pod \"nova-cell1-novncproxy-0\" (UID: \"53af86cc-b0d0-4ba5-9294-4aaa6cef6c09\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.733201 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53af86cc-b0d0-4ba5-9294-4aaa6cef6c09-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"53af86cc-b0d0-4ba5-9294-4aaa6cef6c09\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.733227 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7314a002-868e-4028-b341-b719a609e21c-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-cdffm\" (UID: \"7314a002-868e-4028-b341-b719a609e21c\") " pod="openstack/nova-cell0-cell-mapping-cdffm" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.733267 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgqqm\" (UniqueName: \"kubernetes.io/projected/7314a002-868e-4028-b341-b719a609e21c-kube-api-access-kgqqm\") pod \"nova-cell0-cell-mapping-cdffm\" (UID: \"7314a002-868e-4028-b341-b719a609e21c\") " pod="openstack/nova-cell0-cell-mapping-cdffm" 
Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.733312 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7314a002-868e-4028-b341-b719a609e21c-scripts\") pod \"nova-cell0-cell-mapping-cdffm\" (UID: \"7314a002-868e-4028-b341-b719a609e21c\") " pod="openstack/nova-cell0-cell-mapping-cdffm" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.733351 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53af86cc-b0d0-4ba5-9294-4aaa6cef6c09-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"53af86cc-b0d0-4ba5-9294-4aaa6cef6c09\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.733377 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7314a002-868e-4028-b341-b719a609e21c-config-data\") pod \"nova-cell0-cell-mapping-cdffm\" (UID: \"7314a002-868e-4028-b341-b719a609e21c\") " pod="openstack/nova-cell0-cell-mapping-cdffm" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.742660 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7314a002-868e-4028-b341-b719a609e21c-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-cdffm\" (UID: \"7314a002-868e-4028-b341-b719a609e21c\") " pod="openstack/nova-cell0-cell-mapping-cdffm" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.743324 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7314a002-868e-4028-b341-b719a609e21c-scripts\") pod \"nova-cell0-cell-mapping-cdffm\" (UID: \"7314a002-868e-4028-b341-b719a609e21c\") " pod="openstack/nova-cell0-cell-mapping-cdffm" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.750467 4795 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/nova-metadata-0"] Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.751784 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.752145 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7314a002-868e-4028-b341-b719a609e21c-config-data\") pod \"nova-cell0-cell-mapping-cdffm\" (UID: \"7314a002-868e-4028-b341-b719a609e21c\") " pod="openstack/nova-cell0-cell-mapping-cdffm" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.755900 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.774649 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgqqm\" (UniqueName: \"kubernetes.io/projected/7314a002-868e-4028-b341-b719a609e21c-kube-api-access-kgqqm\") pod \"nova-cell0-cell-mapping-cdffm\" (UID: \"7314a002-868e-4028-b341-b719a609e21c\") " pod="openstack/nova-cell0-cell-mapping-cdffm" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.793082 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.826741 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-cdffm" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.833914 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.836279 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53af86cc-b0d0-4ba5-9294-4aaa6cef6c09-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"53af86cc-b0d0-4ba5-9294-4aaa6cef6c09\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.836322 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ce84bcc-7dce-46f4-9ea6-b0f15971eda5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5ce84bcc-7dce-46f4-9ea6-b0f15971eda5\") " pod="openstack/nova-metadata-0" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.836339 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7knpk\" (UniqueName: \"kubernetes.io/projected/5ce84bcc-7dce-46f4-9ea6-b0f15971eda5-kube-api-access-7knpk\") pod \"nova-metadata-0\" (UID: \"5ce84bcc-7dce-46f4-9ea6-b0f15971eda5\") " pod="openstack/nova-metadata-0" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.836393 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ce84bcc-7dce-46f4-9ea6-b0f15971eda5-logs\") pod \"nova-metadata-0\" (UID: \"5ce84bcc-7dce-46f4-9ea6-b0f15971eda5\") " pod="openstack/nova-metadata-0" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.836432 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/5ce84bcc-7dce-46f4-9ea6-b0f15971eda5-config-data\") pod \"nova-metadata-0\" (UID: \"5ce84bcc-7dce-46f4-9ea6-b0f15971eda5\") " pod="openstack/nova-metadata-0" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.836467 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da050a33-d860-4577-9ce8-6d85bbdef95f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"da050a33-d860-4577-9ce8-6d85bbdef95f\") " pod="openstack/nova-scheduler-0" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.836492 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53af86cc-b0d0-4ba5-9294-4aaa6cef6c09-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"53af86cc-b0d0-4ba5-9294-4aaa6cef6c09\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.836511 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da050a33-d860-4577-9ce8-6d85bbdef95f-config-data\") pod \"nova-scheduler-0\" (UID: \"da050a33-d860-4577-9ce8-6d85bbdef95f\") " pod="openstack/nova-scheduler-0" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.836529 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmxc5\" (UniqueName: \"kubernetes.io/projected/da050a33-d860-4577-9ce8-6d85bbdef95f-kube-api-access-pmxc5\") pod \"nova-scheduler-0\" (UID: \"da050a33-d860-4577-9ce8-6d85bbdef95f\") " pod="openstack/nova-scheduler-0" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.836571 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7jkp\" (UniqueName: \"kubernetes.io/projected/53af86cc-b0d0-4ba5-9294-4aaa6cef6c09-kube-api-access-f7jkp\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"53af86cc-b0d0-4ba5-9294-4aaa6cef6c09\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.853938 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.859219 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.873092 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53af86cc-b0d0-4ba5-9294-4aaa6cef6c09-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"53af86cc-b0d0-4ba5-9294-4aaa6cef6c09\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.874799 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7jkp\" (UniqueName: \"kubernetes.io/projected/53af86cc-b0d0-4ba5-9294-4aaa6cef6c09-kube-api-access-f7jkp\") pod \"nova-cell1-novncproxy-0\" (UID: \"53af86cc-b0d0-4ba5-9294-4aaa6cef6c09\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.880070 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53af86cc-b0d0-4ba5-9294-4aaa6cef6c09-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"53af86cc-b0d0-4ba5-9294-4aaa6cef6c09\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.905839 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.935546 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.938090 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1242edbc-6450-4d81-8c77-a15fe928d782-config-data\") pod \"nova-api-0\" (UID: \"1242edbc-6450-4d81-8c77-a15fe928d782\") " pod="openstack/nova-api-0" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.938137 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b966c788c-4sfvg"] Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.938180 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ce84bcc-7dce-46f4-9ea6-b0f15971eda5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5ce84bcc-7dce-46f4-9ea6-b0f15971eda5\") " pod="openstack/nova-metadata-0" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.938197 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7knpk\" (UniqueName: \"kubernetes.io/projected/5ce84bcc-7dce-46f4-9ea6-b0f15971eda5-kube-api-access-7knpk\") pod \"nova-metadata-0\" (UID: \"5ce84bcc-7dce-46f4-9ea6-b0f15971eda5\") " pod="openstack/nova-metadata-0" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.938234 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrbdz\" (UniqueName: \"kubernetes.io/projected/1242edbc-6450-4d81-8c77-a15fe928d782-kube-api-access-nrbdz\") pod \"nova-api-0\" (UID: \"1242edbc-6450-4d81-8c77-a15fe928d782\") " pod="openstack/nova-api-0" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.938261 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ce84bcc-7dce-46f4-9ea6-b0f15971eda5-logs\") pod \"nova-metadata-0\" (UID: 
\"5ce84bcc-7dce-46f4-9ea6-b0f15971eda5\") " pod="openstack/nova-metadata-0" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.938297 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ce84bcc-7dce-46f4-9ea6-b0f15971eda5-config-data\") pod \"nova-metadata-0\" (UID: \"5ce84bcc-7dce-46f4-9ea6-b0f15971eda5\") " pod="openstack/nova-metadata-0" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.938320 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1242edbc-6450-4d81-8c77-a15fe928d782-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1242edbc-6450-4d81-8c77-a15fe928d782\") " pod="openstack/nova-api-0" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.938360 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da050a33-d860-4577-9ce8-6d85bbdef95f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"da050a33-d860-4577-9ce8-6d85bbdef95f\") " pod="openstack/nova-scheduler-0" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.938396 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1242edbc-6450-4d81-8c77-a15fe928d782-logs\") pod \"nova-api-0\" (UID: \"1242edbc-6450-4d81-8c77-a15fe928d782\") " pod="openstack/nova-api-0" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.938415 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da050a33-d860-4577-9ce8-6d85bbdef95f-config-data\") pod \"nova-scheduler-0\" (UID: \"da050a33-d860-4577-9ce8-6d85bbdef95f\") " pod="openstack/nova-scheduler-0" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.938435 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-pmxc5\" (UniqueName: \"kubernetes.io/projected/da050a33-d860-4577-9ce8-6d85bbdef95f-kube-api-access-pmxc5\") pod \"nova-scheduler-0\" (UID: \"da050a33-d860-4577-9ce8-6d85bbdef95f\") " pod="openstack/nova-scheduler-0" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.941462 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ce84bcc-7dce-46f4-9ea6-b0f15971eda5-logs\") pod \"nova-metadata-0\" (UID: \"5ce84bcc-7dce-46f4-9ea6-b0f15971eda5\") " pod="openstack/nova-metadata-0" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.946760 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b966c788c-4sfvg"] Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.947365 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ce84bcc-7dce-46f4-9ea6-b0f15971eda5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5ce84bcc-7dce-46f4-9ea6-b0f15971eda5\") " pod="openstack/nova-metadata-0" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.955060 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmxc5\" (UniqueName: \"kubernetes.io/projected/da050a33-d860-4577-9ce8-6d85bbdef95f-kube-api-access-pmxc5\") pod \"nova-scheduler-0\" (UID: \"da050a33-d860-4577-9ce8-6d85bbdef95f\") " pod="openstack/nova-scheduler-0" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.958327 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b966c788c-4sfvg" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.962997 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da050a33-d860-4577-9ce8-6d85bbdef95f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"da050a33-d860-4577-9ce8-6d85bbdef95f\") " pod="openstack/nova-scheduler-0" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.963545 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ce84bcc-7dce-46f4-9ea6-b0f15971eda5-config-data\") pod \"nova-metadata-0\" (UID: \"5ce84bcc-7dce-46f4-9ea6-b0f15971eda5\") " pod="openstack/nova-metadata-0" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.971704 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da050a33-d860-4577-9ce8-6d85bbdef95f-config-data\") pod \"nova-scheduler-0\" (UID: \"da050a33-d860-4577-9ce8-6d85bbdef95f\") " pod="openstack/nova-scheduler-0" Feb 19 22:58:38 crc kubenswrapper[4795]: I0219 22:58:38.982409 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7knpk\" (UniqueName: \"kubernetes.io/projected/5ce84bcc-7dce-46f4-9ea6-b0f15971eda5-kube-api-access-7knpk\") pod \"nova-metadata-0\" (UID: \"5ce84bcc-7dce-46f4-9ea6-b0f15971eda5\") " pod="openstack/nova-metadata-0" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.028919 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.040299 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrbdz\" (UniqueName: \"kubernetes.io/projected/1242edbc-6450-4d81-8c77-a15fe928d782-kube-api-access-nrbdz\") pod \"nova-api-0\" (UID: \"1242edbc-6450-4d81-8c77-a15fe928d782\") " pod="openstack/nova-api-0" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.040363 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ff79db5-8006-4a3f-bb73-ab6e32d93186-config\") pod \"dnsmasq-dns-6b966c788c-4sfvg\" (UID: \"2ff79db5-8006-4a3f-bb73-ab6e32d93186\") " pod="openstack/dnsmasq-dns-6b966c788c-4sfvg" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.040812 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8j2w\" (UniqueName: \"kubernetes.io/projected/2ff79db5-8006-4a3f-bb73-ab6e32d93186-kube-api-access-q8j2w\") pod \"dnsmasq-dns-6b966c788c-4sfvg\" (UID: \"2ff79db5-8006-4a3f-bb73-ab6e32d93186\") " pod="openstack/dnsmasq-dns-6b966c788c-4sfvg" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.040842 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1242edbc-6450-4d81-8c77-a15fe928d782-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1242edbc-6450-4d81-8c77-a15fe928d782\") " pod="openstack/nova-api-0" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.040880 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2ff79db5-8006-4a3f-bb73-ab6e32d93186-ovsdbserver-nb\") pod \"dnsmasq-dns-6b966c788c-4sfvg\" (UID: \"2ff79db5-8006-4a3f-bb73-ab6e32d93186\") " 
pod="openstack/dnsmasq-dns-6b966c788c-4sfvg" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.040917 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1242edbc-6450-4d81-8c77-a15fe928d782-logs\") pod \"nova-api-0\" (UID: \"1242edbc-6450-4d81-8c77-a15fe928d782\") " pod="openstack/nova-api-0" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.040989 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1242edbc-6450-4d81-8c77-a15fe928d782-config-data\") pod \"nova-api-0\" (UID: \"1242edbc-6450-4d81-8c77-a15fe928d782\") " pod="openstack/nova-api-0" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.041011 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ff79db5-8006-4a3f-bb73-ab6e32d93186-dns-svc\") pod \"dnsmasq-dns-6b966c788c-4sfvg\" (UID: \"2ff79db5-8006-4a3f-bb73-ab6e32d93186\") " pod="openstack/dnsmasq-dns-6b966c788c-4sfvg" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.042321 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2ff79db5-8006-4a3f-bb73-ab6e32d93186-ovsdbserver-sb\") pod \"dnsmasq-dns-6b966c788c-4sfvg\" (UID: \"2ff79db5-8006-4a3f-bb73-ab6e32d93186\") " pod="openstack/dnsmasq-dns-6b966c788c-4sfvg" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.042729 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1242edbc-6450-4d81-8c77-a15fe928d782-logs\") pod \"nova-api-0\" (UID: \"1242edbc-6450-4d81-8c77-a15fe928d782\") " pod="openstack/nova-api-0" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.052686 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/1242edbc-6450-4d81-8c77-a15fe928d782-config-data\") pod \"nova-api-0\" (UID: \"1242edbc-6450-4d81-8c77-a15fe928d782\") " pod="openstack/nova-api-0" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.052837 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1242edbc-6450-4d81-8c77-a15fe928d782-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1242edbc-6450-4d81-8c77-a15fe928d782\") " pod="openstack/nova-api-0" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.075235 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrbdz\" (UniqueName: \"kubernetes.io/projected/1242edbc-6450-4d81-8c77-a15fe928d782-kube-api-access-nrbdz\") pod \"nova-api-0\" (UID: \"1242edbc-6450-4d81-8c77-a15fe928d782\") " pod="openstack/nova-api-0" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.143694 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ff79db5-8006-4a3f-bb73-ab6e32d93186-dns-svc\") pod \"dnsmasq-dns-6b966c788c-4sfvg\" (UID: \"2ff79db5-8006-4a3f-bb73-ab6e32d93186\") " pod="openstack/dnsmasq-dns-6b966c788c-4sfvg" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.144060 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2ff79db5-8006-4a3f-bb73-ab6e32d93186-ovsdbserver-sb\") pod \"dnsmasq-dns-6b966c788c-4sfvg\" (UID: \"2ff79db5-8006-4a3f-bb73-ab6e32d93186\") " pod="openstack/dnsmasq-dns-6b966c788c-4sfvg" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.144099 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ff79db5-8006-4a3f-bb73-ab6e32d93186-config\") pod \"dnsmasq-dns-6b966c788c-4sfvg\" (UID: \"2ff79db5-8006-4a3f-bb73-ab6e32d93186\") 
" pod="openstack/dnsmasq-dns-6b966c788c-4sfvg" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.144143 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8j2w\" (UniqueName: \"kubernetes.io/projected/2ff79db5-8006-4a3f-bb73-ab6e32d93186-kube-api-access-q8j2w\") pod \"dnsmasq-dns-6b966c788c-4sfvg\" (UID: \"2ff79db5-8006-4a3f-bb73-ab6e32d93186\") " pod="openstack/dnsmasq-dns-6b966c788c-4sfvg" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.144186 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2ff79db5-8006-4a3f-bb73-ab6e32d93186-ovsdbserver-nb\") pod \"dnsmasq-dns-6b966c788c-4sfvg\" (UID: \"2ff79db5-8006-4a3f-bb73-ab6e32d93186\") " pod="openstack/dnsmasq-dns-6b966c788c-4sfvg" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.144810 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ff79db5-8006-4a3f-bb73-ab6e32d93186-dns-svc\") pod \"dnsmasq-dns-6b966c788c-4sfvg\" (UID: \"2ff79db5-8006-4a3f-bb73-ab6e32d93186\") " pod="openstack/dnsmasq-dns-6b966c788c-4sfvg" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.145062 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2ff79db5-8006-4a3f-bb73-ab6e32d93186-ovsdbserver-nb\") pod \"dnsmasq-dns-6b966c788c-4sfvg\" (UID: \"2ff79db5-8006-4a3f-bb73-ab6e32d93186\") " pod="openstack/dnsmasq-dns-6b966c788c-4sfvg" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.145194 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2ff79db5-8006-4a3f-bb73-ab6e32d93186-ovsdbserver-sb\") pod \"dnsmasq-dns-6b966c788c-4sfvg\" (UID: \"2ff79db5-8006-4a3f-bb73-ab6e32d93186\") " pod="openstack/dnsmasq-dns-6b966c788c-4sfvg" Feb 19 22:58:39 crc 
kubenswrapper[4795]: I0219 22:58:39.145286 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ff79db5-8006-4a3f-bb73-ab6e32d93186-config\") pod \"dnsmasq-dns-6b966c788c-4sfvg\" (UID: \"2ff79db5-8006-4a3f-bb73-ab6e32d93186\") " pod="openstack/dnsmasq-dns-6b966c788c-4sfvg" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.155099 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.162857 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8j2w\" (UniqueName: \"kubernetes.io/projected/2ff79db5-8006-4a3f-bb73-ab6e32d93186-kube-api-access-q8j2w\") pod \"dnsmasq-dns-6b966c788c-4sfvg\" (UID: \"2ff79db5-8006-4a3f-bb73-ab6e32d93186\") " pod="openstack/dnsmasq-dns-6b966c788c-4sfvg" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.264682 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.279671 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b966c788c-4sfvg" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.361547 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-cdffm"] Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.475251 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 22:58:39 crc kubenswrapper[4795]: W0219 22:58:39.480669 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod53af86cc_b0d0_4ba5_9294_4aaa6cef6c09.slice/crio-3cb295d52ddcf76c340ae392323cec591a88fdaec3fd107d8ed54655b0b89d43 WatchSource:0}: Error finding container 3cb295d52ddcf76c340ae392323cec591a88fdaec3fd107d8ed54655b0b89d43: Status 404 returned error can't find the container with id 3cb295d52ddcf76c340ae392323cec591a88fdaec3fd107d8ed54655b0b89d43 Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.553611 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 22:58:39 crc kubenswrapper[4795]: W0219 22:58:39.570456 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda050a33_d860_4577_9ce8_6d85bbdef95f.slice/crio-0e85b25b71b392efb52bf0edd446550d919136b3f26e10807779e4867fda37bb WatchSource:0}: Error finding container 0e85b25b71b392efb52bf0edd446550d919136b3f26e10807779e4867fda37bb: Status 404 returned error can't find the container with id 0e85b25b71b392efb52bf0edd446550d919136b3f26e10807779e4867fda37bb Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.656073 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7w2k2"] Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.657911 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-7w2k2" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.660478 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.661098 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.666281 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7w2k2"] Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.671439 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"53af86cc-b0d0-4ba5-9294-4aaa6cef6c09","Type":"ContainerStarted","Data":"3cb295d52ddcf76c340ae392323cec591a88fdaec3fd107d8ed54655b0b89d43"} Feb 19 22:58:39 crc kubenswrapper[4795]: W0219 22:58:39.676836 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ce84bcc_7dce_46f4_9ea6_b0f15971eda5.slice/crio-a499dba709ae8a77ef9e5387459f350debf25fd5199f147cbd32741eaacc5a6b WatchSource:0}: Error finding container a499dba709ae8a77ef9e5387459f350debf25fd5199f147cbd32741eaacc5a6b: Status 404 returned error can't find the container with id a499dba709ae8a77ef9e5387459f350debf25fd5199f147cbd32741eaacc5a6b Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.677047 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"da050a33-d860-4577-9ce8-6d85bbdef95f","Type":"ContainerStarted","Data":"0e85b25b71b392efb52bf0edd446550d919136b3f26e10807779e4867fda37bb"} Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.681715 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.685070 4795 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-cdffm" event={"ID":"7314a002-868e-4028-b341-b719a609e21c","Type":"ContainerStarted","Data":"fd9abe25e7c7b1351c86593c296366b3dd62af78525a82ce72038120bd5feb1c"} Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.685121 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-cdffm" event={"ID":"7314a002-868e-4028-b341-b719a609e21c","Type":"ContainerStarted","Data":"8e5ecfafc8caabdbe5e796e27409792620ce0b087abf56978f0301d1a3cdf4ce"} Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.704091 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-cdffm" podStartSLOduration=1.7040706810000001 podStartE2EDuration="1.704070681s" podCreationTimestamp="2026-02-19 22:58:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:58:39.699655469 +0000 UTC m=+5430.892173333" watchObservedRunningTime="2026-02-19 22:58:39.704070681 +0000 UTC m=+5430.896588545" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.763331 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8065cb60-3c91-4fbc-89f1-7d73d11a85e5-scripts\") pod \"nova-cell1-conductor-db-sync-7w2k2\" (UID: \"8065cb60-3c91-4fbc-89f1-7d73d11a85e5\") " pod="openstack/nova-cell1-conductor-db-sync-7w2k2" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.763420 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gh5st\" (UniqueName: \"kubernetes.io/projected/8065cb60-3c91-4fbc-89f1-7d73d11a85e5-kube-api-access-gh5st\") pod \"nova-cell1-conductor-db-sync-7w2k2\" (UID: \"8065cb60-3c91-4fbc-89f1-7d73d11a85e5\") " pod="openstack/nova-cell1-conductor-db-sync-7w2k2" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 
22:58:39.763467 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8065cb60-3c91-4fbc-89f1-7d73d11a85e5-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-7w2k2\" (UID: \"8065cb60-3c91-4fbc-89f1-7d73d11a85e5\") " pod="openstack/nova-cell1-conductor-db-sync-7w2k2" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.763512 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8065cb60-3c91-4fbc-89f1-7d73d11a85e5-config-data\") pod \"nova-cell1-conductor-db-sync-7w2k2\" (UID: \"8065cb60-3c91-4fbc-89f1-7d73d11a85e5\") " pod="openstack/nova-cell1-conductor-db-sync-7w2k2" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.858266 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b966c788c-4sfvg"] Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.867333 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8065cb60-3c91-4fbc-89f1-7d73d11a85e5-scripts\") pod \"nova-cell1-conductor-db-sync-7w2k2\" (UID: \"8065cb60-3c91-4fbc-89f1-7d73d11a85e5\") " pod="openstack/nova-cell1-conductor-db-sync-7w2k2" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.867418 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gh5st\" (UniqueName: \"kubernetes.io/projected/8065cb60-3c91-4fbc-89f1-7d73d11a85e5-kube-api-access-gh5st\") pod \"nova-cell1-conductor-db-sync-7w2k2\" (UID: \"8065cb60-3c91-4fbc-89f1-7d73d11a85e5\") " pod="openstack/nova-cell1-conductor-db-sync-7w2k2" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.867458 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8065cb60-3c91-4fbc-89f1-7d73d11a85e5-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-7w2k2\" (UID: \"8065cb60-3c91-4fbc-89f1-7d73d11a85e5\") " pod="openstack/nova-cell1-conductor-db-sync-7w2k2" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.867496 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8065cb60-3c91-4fbc-89f1-7d73d11a85e5-config-data\") pod \"nova-cell1-conductor-db-sync-7w2k2\" (UID: \"8065cb60-3c91-4fbc-89f1-7d73d11a85e5\") " pod="openstack/nova-cell1-conductor-db-sync-7w2k2" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.868828 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.871906 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8065cb60-3c91-4fbc-89f1-7d73d11a85e5-config-data\") pod \"nova-cell1-conductor-db-sync-7w2k2\" (UID: \"8065cb60-3c91-4fbc-89f1-7d73d11a85e5\") " pod="openstack/nova-cell1-conductor-db-sync-7w2k2" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.876832 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8065cb60-3c91-4fbc-89f1-7d73d11a85e5-scripts\") pod \"nova-cell1-conductor-db-sync-7w2k2\" (UID: \"8065cb60-3c91-4fbc-89f1-7d73d11a85e5\") " pod="openstack/nova-cell1-conductor-db-sync-7w2k2" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.885178 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gh5st\" (UniqueName: \"kubernetes.io/projected/8065cb60-3c91-4fbc-89f1-7d73d11a85e5-kube-api-access-gh5st\") pod \"nova-cell1-conductor-db-sync-7w2k2\" (UID: \"8065cb60-3c91-4fbc-89f1-7d73d11a85e5\") " pod="openstack/nova-cell1-conductor-db-sync-7w2k2" Feb 19 22:58:39 crc kubenswrapper[4795]: I0219 22:58:39.885713 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8065cb60-3c91-4fbc-89f1-7d73d11a85e5-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-7w2k2\" (UID: \"8065cb60-3c91-4fbc-89f1-7d73d11a85e5\") " pod="openstack/nova-cell1-conductor-db-sync-7w2k2"
Feb 19 22:58:40 crc kubenswrapper[4795]: I0219 22:58:40.010852 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-7w2k2"
Feb 19 22:58:40 crc kubenswrapper[4795]: I0219 22:58:40.469032 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7w2k2"]
Feb 19 22:58:40 crc kubenswrapper[4795]: I0219 22:58:40.695829 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5ce84bcc-7dce-46f4-9ea6-b0f15971eda5","Type":"ContainerStarted","Data":"cb5983c6d35380ccf613c286debb3af1df30bb65570107492156109294fb24d8"}
Feb 19 22:58:40 crc kubenswrapper[4795]: I0219 22:58:40.695878 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5ce84bcc-7dce-46f4-9ea6-b0f15971eda5","Type":"ContainerStarted","Data":"1e099de224e3efa46060b80e9a7e4d9002e2f317ca332dea5494f3070c62d20f"}
Feb 19 22:58:40 crc kubenswrapper[4795]: I0219 22:58:40.695891 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5ce84bcc-7dce-46f4-9ea6-b0f15971eda5","Type":"ContainerStarted","Data":"a499dba709ae8a77ef9e5387459f350debf25fd5199f147cbd32741eaacc5a6b"}
Feb 19 22:58:40 crc kubenswrapper[4795]: I0219 22:58:40.697426 4795 generic.go:334] "Generic (PLEG): container finished" podID="2ff79db5-8006-4a3f-bb73-ab6e32d93186" containerID="b1d9f8ec39116ffdbb4a902a6f66f52cc6253d77fc5e2ba0e18cacfaee7d6753" exitCode=0
Feb 19 22:58:40 crc kubenswrapper[4795]: I0219 22:58:40.697515 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b966c788c-4sfvg" event={"ID":"2ff79db5-8006-4a3f-bb73-ab6e32d93186","Type":"ContainerDied","Data":"b1d9f8ec39116ffdbb4a902a6f66f52cc6253d77fc5e2ba0e18cacfaee7d6753"}
Feb 19 22:58:40 crc kubenswrapper[4795]: I0219 22:58:40.697568 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b966c788c-4sfvg" event={"ID":"2ff79db5-8006-4a3f-bb73-ab6e32d93186","Type":"ContainerStarted","Data":"ee318881516d5dfefa76dbf8f513dce5d08620986165c943cfdd920651e34509"}
Feb 19 22:58:40 crc kubenswrapper[4795]: I0219 22:58:40.700458 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"53af86cc-b0d0-4ba5-9294-4aaa6cef6c09","Type":"ContainerStarted","Data":"4cb474362c7411d131561c37cda83e0ffd7519207d531dd518914dff39a66f87"}
Feb 19 22:58:40 crc kubenswrapper[4795]: I0219 22:58:40.703102 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"da050a33-d860-4577-9ce8-6d85bbdef95f","Type":"ContainerStarted","Data":"3d0139e0f9c4cd718f08e91ed33067485c3af22fed1718e1e129fa1a78a292fd"}
Feb 19 22:58:40 crc kubenswrapper[4795]: I0219 22:58:40.705146 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-7w2k2" event={"ID":"8065cb60-3c91-4fbc-89f1-7d73d11a85e5","Type":"ContainerStarted","Data":"eae4ff82798e1d79080d2f3fe94f82eb86492da01dc7f84fe2ae4aff95927ca0"}
Feb 19 22:58:40 crc kubenswrapper[4795]: I0219 22:58:40.705202 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-7w2k2" event={"ID":"8065cb60-3c91-4fbc-89f1-7d73d11a85e5","Type":"ContainerStarted","Data":"d200ea8cc3bce3281f900f0e016d10c8a3032b94daf0e389837f13d6cbef13db"}
Feb 19 22:58:40 crc kubenswrapper[4795]: I0219 22:58:40.707366 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1242edbc-6450-4d81-8c77-a15fe928d782","Type":"ContainerStarted","Data":"88a0e64a704cb7bfa23aba7f7e671c63558b2f8645737875a12f26fc4e84b413"}
Feb 19 22:58:40 crc kubenswrapper[4795]: I0219 22:58:40.707413 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1242edbc-6450-4d81-8c77-a15fe928d782","Type":"ContainerStarted","Data":"7701a2acbdede4a0972b55ecfa4cf55a62c116819a776ea756674fd7df603662"}
Feb 19 22:58:40 crc kubenswrapper[4795]: I0219 22:58:40.707423 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1242edbc-6450-4d81-8c77-a15fe928d782","Type":"ContainerStarted","Data":"977aa2412acf5cdad9aa8c2df443e4a15718820365efcbbb8fd0c0e8dbc6ee3f"}
Feb 19 22:58:40 crc kubenswrapper[4795]: I0219 22:58:40.767411 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.767396383 podStartE2EDuration="2.767396383s" podCreationTimestamp="2026-02-19 22:58:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:58:40.767200748 +0000 UTC m=+5431.959718612" watchObservedRunningTime="2026-02-19 22:58:40.767396383 +0000 UTC m=+5431.959914247"
Feb 19 22:58:40 crc kubenswrapper[4795]: I0219 22:58:40.863461 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.863444768 podStartE2EDuration="2.863444768s" podCreationTimestamp="2026-02-19 22:58:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:58:40.861640342 +0000 UTC m=+5432.054158206" watchObservedRunningTime="2026-02-19 22:58:40.863444768 +0000 UTC m=+5432.055962632"
Feb 19 22:58:40 crc kubenswrapper[4795]: I0219 22:58:40.868788 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-7w2k2" podStartSLOduration=1.868773424 podStartE2EDuration="1.868773424s" podCreationTimestamp="2026-02-19 22:58:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:58:40.810371877 +0000 UTC m=+5432.002889741" watchObservedRunningTime="2026-02-19 22:58:40.868773424 +0000 UTC m=+5432.061291278"
Feb 19 22:58:40 crc kubenswrapper[4795]: I0219 22:58:40.925068 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.925049606 podStartE2EDuration="2.925049606s" podCreationTimestamp="2026-02-19 22:58:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:58:40.899795153 +0000 UTC m=+5432.092313027" watchObservedRunningTime="2026-02-19 22:58:40.925049606 +0000 UTC m=+5432.117567470"
Feb 19 22:58:40 crc kubenswrapper[4795]: I0219 22:58:40.928580 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.9285747559999997 podStartE2EDuration="2.928574756s" podCreationTimestamp="2026-02-19 22:58:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:58:40.924456911 +0000 UTC m=+5432.116974775" watchObservedRunningTime="2026-02-19 22:58:40.928574756 +0000 UTC m=+5432.121092620"
Feb 19 22:58:41 crc kubenswrapper[4795]: I0219 22:58:41.718433 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b966c788c-4sfvg" event={"ID":"2ff79db5-8006-4a3f-bb73-ab6e32d93186","Type":"ContainerStarted","Data":"b9f475bcb8fe7daa110726dc68979a5f8257c170cc549dfc5c151f5b0ece628f"}
Feb 19 22:58:41 crc kubenswrapper[4795]: I0219 22:58:41.720934 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b966c788c-4sfvg"
Feb 19 22:58:41 crc kubenswrapper[4795]: I0219 22:58:41.743266 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b966c788c-4sfvg" podStartSLOduration=3.74324511 podStartE2EDuration="3.74324511s" podCreationTimestamp="2026-02-19 22:58:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:58:41.733530323 +0000 UTC m=+5432.926048187" watchObservedRunningTime="2026-02-19 22:58:41.74324511 +0000 UTC m=+5432.935762974"
Feb 19 22:58:43 crc kubenswrapper[4795]: I0219 22:58:43.735182 4795 generic.go:334] "Generic (PLEG): container finished" podID="8065cb60-3c91-4fbc-89f1-7d73d11a85e5" containerID="eae4ff82798e1d79080d2f3fe94f82eb86492da01dc7f84fe2ae4aff95927ca0" exitCode=0
Feb 19 22:58:43 crc kubenswrapper[4795]: I0219 22:58:43.735253 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-7w2k2" event={"ID":"8065cb60-3c91-4fbc-89f1-7d73d11a85e5","Type":"ContainerDied","Data":"eae4ff82798e1d79080d2f3fe94f82eb86492da01dc7f84fe2ae4aff95927ca0"}
Feb 19 22:58:43 crc kubenswrapper[4795]: I0219 22:58:43.936362 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Feb 19 22:58:44 crc kubenswrapper[4795]: I0219 22:58:44.029194 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Feb 19 22:58:44 crc kubenswrapper[4795]: I0219 22:58:44.155957 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 19 22:58:44 crc kubenswrapper[4795]: I0219 22:58:44.156826 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 19 22:58:44 crc kubenswrapper[4795]: I0219 22:58:44.747227 4795 generic.go:334] "Generic (PLEG): container finished" podID="7314a002-868e-4028-b341-b719a609e21c" containerID="fd9abe25e7c7b1351c86593c296366b3dd62af78525a82ce72038120bd5feb1c" exitCode=0
Feb 19 22:58:44 crc kubenswrapper[4795]: I0219 22:58:44.747549 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-cdffm" event={"ID":"7314a002-868e-4028-b341-b719a609e21c","Type":"ContainerDied","Data":"fd9abe25e7c7b1351c86593c296366b3dd62af78525a82ce72038120bd5feb1c"}
Feb 19 22:58:45 crc kubenswrapper[4795]: I0219 22:58:45.125355 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-7w2k2"
Feb 19 22:58:45 crc kubenswrapper[4795]: I0219 22:58:45.174730 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8065cb60-3c91-4fbc-89f1-7d73d11a85e5-combined-ca-bundle\") pod \"8065cb60-3c91-4fbc-89f1-7d73d11a85e5\" (UID: \"8065cb60-3c91-4fbc-89f1-7d73d11a85e5\") "
Feb 19 22:58:45 crc kubenswrapper[4795]: I0219 22:58:45.174860 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8065cb60-3c91-4fbc-89f1-7d73d11a85e5-config-data\") pod \"8065cb60-3c91-4fbc-89f1-7d73d11a85e5\" (UID: \"8065cb60-3c91-4fbc-89f1-7d73d11a85e5\") "
Feb 19 22:58:45 crc kubenswrapper[4795]: I0219 22:58:45.174890 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gh5st\" (UniqueName: \"kubernetes.io/projected/8065cb60-3c91-4fbc-89f1-7d73d11a85e5-kube-api-access-gh5st\") pod \"8065cb60-3c91-4fbc-89f1-7d73d11a85e5\" (UID: \"8065cb60-3c91-4fbc-89f1-7d73d11a85e5\") "
Feb 19 22:58:45 crc kubenswrapper[4795]: I0219 22:58:45.174987 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8065cb60-3c91-4fbc-89f1-7d73d11a85e5-scripts\") pod \"8065cb60-3c91-4fbc-89f1-7d73d11a85e5\" (UID: \"8065cb60-3c91-4fbc-89f1-7d73d11a85e5\") "
Feb 19 22:58:45 crc kubenswrapper[4795]: I0219 22:58:45.180181 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8065cb60-3c91-4fbc-89f1-7d73d11a85e5-scripts" (OuterVolumeSpecName: "scripts") pod "8065cb60-3c91-4fbc-89f1-7d73d11a85e5" (UID: "8065cb60-3c91-4fbc-89f1-7d73d11a85e5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 22:58:45 crc kubenswrapper[4795]: I0219 22:58:45.180363 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8065cb60-3c91-4fbc-89f1-7d73d11a85e5-kube-api-access-gh5st" (OuterVolumeSpecName: "kube-api-access-gh5st") pod "8065cb60-3c91-4fbc-89f1-7d73d11a85e5" (UID: "8065cb60-3c91-4fbc-89f1-7d73d11a85e5"). InnerVolumeSpecName "kube-api-access-gh5st". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 22:58:45 crc kubenswrapper[4795]: I0219 22:58:45.204484 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8065cb60-3c91-4fbc-89f1-7d73d11a85e5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8065cb60-3c91-4fbc-89f1-7d73d11a85e5" (UID: "8065cb60-3c91-4fbc-89f1-7d73d11a85e5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 22:58:45 crc kubenswrapper[4795]: I0219 22:58:45.205740 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8065cb60-3c91-4fbc-89f1-7d73d11a85e5-config-data" (OuterVolumeSpecName: "config-data") pod "8065cb60-3c91-4fbc-89f1-7d73d11a85e5" (UID: "8065cb60-3c91-4fbc-89f1-7d73d11a85e5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 22:58:45 crc kubenswrapper[4795]: I0219 22:58:45.276902 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8065cb60-3c91-4fbc-89f1-7d73d11a85e5-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 22:58:45 crc kubenswrapper[4795]: I0219 22:58:45.276929 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gh5st\" (UniqueName: \"kubernetes.io/projected/8065cb60-3c91-4fbc-89f1-7d73d11a85e5-kube-api-access-gh5st\") on node \"crc\" DevicePath \"\""
Feb 19 22:58:45 crc kubenswrapper[4795]: I0219 22:58:45.276938 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8065cb60-3c91-4fbc-89f1-7d73d11a85e5-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 22:58:45 crc kubenswrapper[4795]: I0219 22:58:45.276946 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8065cb60-3c91-4fbc-89f1-7d73d11a85e5-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 22:58:45 crc kubenswrapper[4795]: I0219 22:58:45.757852 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-7w2k2"
Feb 19 22:58:45 crc kubenswrapper[4795]: I0219 22:58:45.757863 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-7w2k2" event={"ID":"8065cb60-3c91-4fbc-89f1-7d73d11a85e5","Type":"ContainerDied","Data":"d200ea8cc3bce3281f900f0e016d10c8a3032b94daf0e389837f13d6cbef13db"}
Feb 19 22:58:45 crc kubenswrapper[4795]: I0219 22:58:45.758296 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d200ea8cc3bce3281f900f0e016d10c8a3032b94daf0e389837f13d6cbef13db"
Feb 19 22:58:45 crc kubenswrapper[4795]: I0219 22:58:45.840768 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 19 22:58:45 crc kubenswrapper[4795]: E0219 22:58:45.841266 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8065cb60-3c91-4fbc-89f1-7d73d11a85e5" containerName="nova-cell1-conductor-db-sync"
Feb 19 22:58:45 crc kubenswrapper[4795]: I0219 22:58:45.841287 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="8065cb60-3c91-4fbc-89f1-7d73d11a85e5" containerName="nova-cell1-conductor-db-sync"
Feb 19 22:58:45 crc kubenswrapper[4795]: I0219 22:58:45.841560 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="8065cb60-3c91-4fbc-89f1-7d73d11a85e5" containerName="nova-cell1-conductor-db-sync"
Feb 19 22:58:45 crc kubenswrapper[4795]: I0219 22:58:45.842342 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Feb 19 22:58:45 crc kubenswrapper[4795]: I0219 22:58:45.844685 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Feb 19 22:58:45 crc kubenswrapper[4795]: I0219 22:58:45.890250 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 19 22:58:45 crc kubenswrapper[4795]: I0219 22:58:45.890809 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb0fc8ea-d689-468f-a612-b87c3e63077d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"cb0fc8ea-d689-468f-a612-b87c3e63077d\") " pod="openstack/nova-cell1-conductor-0"
Feb 19 22:58:45 crc kubenswrapper[4795]: I0219 22:58:45.890894 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb0fc8ea-d689-468f-a612-b87c3e63077d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"cb0fc8ea-d689-468f-a612-b87c3e63077d\") " pod="openstack/nova-cell1-conductor-0"
Feb 19 22:58:45 crc kubenswrapper[4795]: I0219 22:58:45.891089 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9np5\" (UniqueName: \"kubernetes.io/projected/cb0fc8ea-d689-468f-a612-b87c3e63077d-kube-api-access-w9np5\") pod \"nova-cell1-conductor-0\" (UID: \"cb0fc8ea-d689-468f-a612-b87c3e63077d\") " pod="openstack/nova-cell1-conductor-0"
Feb 19 22:58:45 crc kubenswrapper[4795]: I0219 22:58:45.996932 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9np5\" (UniqueName: \"kubernetes.io/projected/cb0fc8ea-d689-468f-a612-b87c3e63077d-kube-api-access-w9np5\") pod \"nova-cell1-conductor-0\" (UID: \"cb0fc8ea-d689-468f-a612-b87c3e63077d\") " pod="openstack/nova-cell1-conductor-0"
Feb 19 22:58:45 crc kubenswrapper[4795]: I0219 22:58:45.997058 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb0fc8ea-d689-468f-a612-b87c3e63077d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"cb0fc8ea-d689-468f-a612-b87c3e63077d\") " pod="openstack/nova-cell1-conductor-0"
Feb 19 22:58:45 crc kubenswrapper[4795]: I0219 22:58:45.997144 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb0fc8ea-d689-468f-a612-b87c3e63077d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"cb0fc8ea-d689-468f-a612-b87c3e63077d\") " pod="openstack/nova-cell1-conductor-0"
Feb 19 22:58:46 crc kubenswrapper[4795]: I0219 22:58:46.008933 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb0fc8ea-d689-468f-a612-b87c3e63077d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"cb0fc8ea-d689-468f-a612-b87c3e63077d\") " pod="openstack/nova-cell1-conductor-0"
Feb 19 22:58:46 crc kubenswrapper[4795]: I0219 22:58:46.017374 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9np5\" (UniqueName: \"kubernetes.io/projected/cb0fc8ea-d689-468f-a612-b87c3e63077d-kube-api-access-w9np5\") pod \"nova-cell1-conductor-0\" (UID: \"cb0fc8ea-d689-468f-a612-b87c3e63077d\") " pod="openstack/nova-cell1-conductor-0"
Feb 19 22:58:46 crc kubenswrapper[4795]: I0219 22:58:46.017568 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb0fc8ea-d689-468f-a612-b87c3e63077d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"cb0fc8ea-d689-468f-a612-b87c3e63077d\") " pod="openstack/nova-cell1-conductor-0"
Feb 19 22:58:46 crc kubenswrapper[4795]: I0219 22:58:46.162358 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Feb 19 22:58:46 crc kubenswrapper[4795]: I0219 22:58:46.269372 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-cdffm"
Feb 19 22:58:46 crc kubenswrapper[4795]: I0219 22:58:46.401899 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7314a002-868e-4028-b341-b719a609e21c-scripts\") pod \"7314a002-868e-4028-b341-b719a609e21c\" (UID: \"7314a002-868e-4028-b341-b719a609e21c\") "
Feb 19 22:58:46 crc kubenswrapper[4795]: I0219 22:58:46.402013 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kgqqm\" (UniqueName: \"kubernetes.io/projected/7314a002-868e-4028-b341-b719a609e21c-kube-api-access-kgqqm\") pod \"7314a002-868e-4028-b341-b719a609e21c\" (UID: \"7314a002-868e-4028-b341-b719a609e21c\") "
Feb 19 22:58:46 crc kubenswrapper[4795]: I0219 22:58:46.402297 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7314a002-868e-4028-b341-b719a609e21c-config-data\") pod \"7314a002-868e-4028-b341-b719a609e21c\" (UID: \"7314a002-868e-4028-b341-b719a609e21c\") "
Feb 19 22:58:46 crc kubenswrapper[4795]: I0219 22:58:46.402350 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7314a002-868e-4028-b341-b719a609e21c-combined-ca-bundle\") pod \"7314a002-868e-4028-b341-b719a609e21c\" (UID: \"7314a002-868e-4028-b341-b719a609e21c\") "
Feb 19 22:58:46 crc kubenswrapper[4795]: I0219 22:58:46.406619 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7314a002-868e-4028-b341-b719a609e21c-kube-api-access-kgqqm" (OuterVolumeSpecName: "kube-api-access-kgqqm") pod "7314a002-868e-4028-b341-b719a609e21c" (UID: "7314a002-868e-4028-b341-b719a609e21c"). InnerVolumeSpecName "kube-api-access-kgqqm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 22:58:46 crc kubenswrapper[4795]: I0219 22:58:46.406682 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7314a002-868e-4028-b341-b719a609e21c-scripts" (OuterVolumeSpecName: "scripts") pod "7314a002-868e-4028-b341-b719a609e21c" (UID: "7314a002-868e-4028-b341-b719a609e21c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 22:58:46 crc kubenswrapper[4795]: I0219 22:58:46.428278 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7314a002-868e-4028-b341-b719a609e21c-config-data" (OuterVolumeSpecName: "config-data") pod "7314a002-868e-4028-b341-b719a609e21c" (UID: "7314a002-868e-4028-b341-b719a609e21c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 22:58:46 crc kubenswrapper[4795]: I0219 22:58:46.430225 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7314a002-868e-4028-b341-b719a609e21c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7314a002-868e-4028-b341-b719a609e21c" (UID: "7314a002-868e-4028-b341-b719a609e21c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 22:58:46 crc kubenswrapper[4795]: I0219 22:58:46.504991 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7314a002-868e-4028-b341-b719a609e21c-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 22:58:46 crc kubenswrapper[4795]: I0219 22:58:46.505307 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kgqqm\" (UniqueName: \"kubernetes.io/projected/7314a002-868e-4028-b341-b719a609e21c-kube-api-access-kgqqm\") on node \"crc\" DevicePath \"\""
Feb 19 22:58:46 crc kubenswrapper[4795]: I0219 22:58:46.505320 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7314a002-868e-4028-b341-b719a609e21c-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 22:58:46 crc kubenswrapper[4795]: I0219 22:58:46.505333 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7314a002-868e-4028-b341-b719a609e21c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 22:58:46 crc kubenswrapper[4795]: I0219 22:58:46.576544 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 19 22:58:46 crc kubenswrapper[4795]: I0219 22:58:46.767489 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"cb0fc8ea-d689-468f-a612-b87c3e63077d","Type":"ContainerStarted","Data":"59c0008df42ff2bf7402d8fd2a516b276c718f55c84fb6bb76f4c9c73d8bd314"}
Feb 19 22:58:46 crc kubenswrapper[4795]: I0219 22:58:46.767537 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"cb0fc8ea-d689-468f-a612-b87c3e63077d","Type":"ContainerStarted","Data":"0e3ce7752bf63997ed6557e690f1dee58f948c11df612377dc82c2d8fde569b2"}
Feb 19 22:58:46 crc kubenswrapper[4795]: I0219 22:58:46.767654 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0"
Feb 19 22:58:46 crc kubenswrapper[4795]: I0219 22:58:46.769040 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-cdffm" event={"ID":"7314a002-868e-4028-b341-b719a609e21c","Type":"ContainerDied","Data":"8e5ecfafc8caabdbe5e796e27409792620ce0b087abf56978f0301d1a3cdf4ce"}
Feb 19 22:58:46 crc kubenswrapper[4795]: I0219 22:58:46.769086 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e5ecfafc8caabdbe5e796e27409792620ce0b087abf56978f0301d1a3cdf4ce"
Feb 19 22:58:46 crc kubenswrapper[4795]: I0219 22:58:46.769160 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-cdffm"
Feb 19 22:58:46 crc kubenswrapper[4795]: I0219 22:58:46.797453 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=1.7974353459999999 podStartE2EDuration="1.797435346s" podCreationTimestamp="2026-02-19 22:58:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:58:46.785482392 +0000 UTC m=+5437.978000296" watchObservedRunningTime="2026-02-19 22:58:46.797435346 +0000 UTC m=+5437.989953210"
Feb 19 22:58:46 crc kubenswrapper[4795]: I0219 22:58:46.951502 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 19 22:58:46 crc kubenswrapper[4795]: I0219 22:58:46.951753 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="1242edbc-6450-4d81-8c77-a15fe928d782" containerName="nova-api-log" containerID="cri-o://7701a2acbdede4a0972b55ecfa4cf55a62c116819a776ea756674fd7df603662" gracePeriod=30
Feb 19 22:58:46 crc kubenswrapper[4795]: I0219 22:58:46.951912 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="1242edbc-6450-4d81-8c77-a15fe928d782" containerName="nova-api-api" containerID="cri-o://88a0e64a704cb7bfa23aba7f7e671c63558b2f8645737875a12f26fc4e84b413" gracePeriod=30
Feb 19 22:58:46 crc kubenswrapper[4795]: I0219 22:58:46.972180 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 19 22:58:46 crc kubenswrapper[4795]: I0219 22:58:46.972447 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="da050a33-d860-4577-9ce8-6d85bbdef95f" containerName="nova-scheduler-scheduler" containerID="cri-o://3d0139e0f9c4cd718f08e91ed33067485c3af22fed1718e1e129fa1a78a292fd" gracePeriod=30
Feb 19 22:58:46 crc kubenswrapper[4795]: I0219 22:58:46.995073 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 19 22:58:46 crc kubenswrapper[4795]: I0219 22:58:46.995317 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="5ce84bcc-7dce-46f4-9ea6-b0f15971eda5" containerName="nova-metadata-log" containerID="cri-o://1e099de224e3efa46060b80e9a7e4d9002e2f317ca332dea5494f3070c62d20f" gracePeriod=30
Feb 19 22:58:46 crc kubenswrapper[4795]: I0219 22:58:46.995409 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="5ce84bcc-7dce-46f4-9ea6-b0f15971eda5" containerName="nova-metadata-metadata" containerID="cri-o://cb5983c6d35380ccf613c286debb3af1df30bb65570107492156109294fb24d8" gracePeriod=30
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.505040 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.625007 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrbdz\" (UniqueName: \"kubernetes.io/projected/1242edbc-6450-4d81-8c77-a15fe928d782-kube-api-access-nrbdz\") pod \"1242edbc-6450-4d81-8c77-a15fe928d782\" (UID: \"1242edbc-6450-4d81-8c77-a15fe928d782\") "
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.625114 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1242edbc-6450-4d81-8c77-a15fe928d782-config-data\") pod \"1242edbc-6450-4d81-8c77-a15fe928d782\" (UID: \"1242edbc-6450-4d81-8c77-a15fe928d782\") "
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.625182 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1242edbc-6450-4d81-8c77-a15fe928d782-logs\") pod \"1242edbc-6450-4d81-8c77-a15fe928d782\" (UID: \"1242edbc-6450-4d81-8c77-a15fe928d782\") "
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.625264 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1242edbc-6450-4d81-8c77-a15fe928d782-combined-ca-bundle\") pod \"1242edbc-6450-4d81-8c77-a15fe928d782\" (UID: \"1242edbc-6450-4d81-8c77-a15fe928d782\") "
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.626508 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1242edbc-6450-4d81-8c77-a15fe928d782-logs" (OuterVolumeSpecName: "logs") pod "1242edbc-6450-4d81-8c77-a15fe928d782" (UID: "1242edbc-6450-4d81-8c77-a15fe928d782"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.631779 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1242edbc-6450-4d81-8c77-a15fe928d782-kube-api-access-nrbdz" (OuterVolumeSpecName: "kube-api-access-nrbdz") pod "1242edbc-6450-4d81-8c77-a15fe928d782" (UID: "1242edbc-6450-4d81-8c77-a15fe928d782"). InnerVolumeSpecName "kube-api-access-nrbdz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.654145 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1242edbc-6450-4d81-8c77-a15fe928d782-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1242edbc-6450-4d81-8c77-a15fe928d782" (UID: "1242edbc-6450-4d81-8c77-a15fe928d782"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.657033 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1242edbc-6450-4d81-8c77-a15fe928d782-config-data" (OuterVolumeSpecName: "config-data") pod "1242edbc-6450-4d81-8c77-a15fe928d782" (UID: "1242edbc-6450-4d81-8c77-a15fe928d782"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.694355 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.726594 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7knpk\" (UniqueName: \"kubernetes.io/projected/5ce84bcc-7dce-46f4-9ea6-b0f15971eda5-kube-api-access-7knpk\") pod \"5ce84bcc-7dce-46f4-9ea6-b0f15971eda5\" (UID: \"5ce84bcc-7dce-46f4-9ea6-b0f15971eda5\") "
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.726708 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ce84bcc-7dce-46f4-9ea6-b0f15971eda5-logs\") pod \"5ce84bcc-7dce-46f4-9ea6-b0f15971eda5\" (UID: \"5ce84bcc-7dce-46f4-9ea6-b0f15971eda5\") "
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.726781 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ce84bcc-7dce-46f4-9ea6-b0f15971eda5-config-data\") pod \"5ce84bcc-7dce-46f4-9ea6-b0f15971eda5\" (UID: \"5ce84bcc-7dce-46f4-9ea6-b0f15971eda5\") "
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.726855 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ce84bcc-7dce-46f4-9ea6-b0f15971eda5-combined-ca-bundle\") pod \"5ce84bcc-7dce-46f4-9ea6-b0f15971eda5\" (UID: \"5ce84bcc-7dce-46f4-9ea6-b0f15971eda5\") "
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.727310 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrbdz\" (UniqueName: \"kubernetes.io/projected/1242edbc-6450-4d81-8c77-a15fe928d782-kube-api-access-nrbdz\") on node \"crc\" DevicePath \"\""
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.727330 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1242edbc-6450-4d81-8c77-a15fe928d782-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.727342 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1242edbc-6450-4d81-8c77-a15fe928d782-logs\") on node \"crc\" DevicePath \"\""
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.727353 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1242edbc-6450-4d81-8c77-a15fe928d782-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.727312 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ce84bcc-7dce-46f4-9ea6-b0f15971eda5-logs" (OuterVolumeSpecName: "logs") pod "5ce84bcc-7dce-46f4-9ea6-b0f15971eda5" (UID: "5ce84bcc-7dce-46f4-9ea6-b0f15971eda5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.732487 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ce84bcc-7dce-46f4-9ea6-b0f15971eda5-kube-api-access-7knpk" (OuterVolumeSpecName: "kube-api-access-7knpk") pod "5ce84bcc-7dce-46f4-9ea6-b0f15971eda5" (UID: "5ce84bcc-7dce-46f4-9ea6-b0f15971eda5"). InnerVolumeSpecName "kube-api-access-7knpk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.747525 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ce84bcc-7dce-46f4-9ea6-b0f15971eda5-config-data" (OuterVolumeSpecName: "config-data") pod "5ce84bcc-7dce-46f4-9ea6-b0f15971eda5" (UID: "5ce84bcc-7dce-46f4-9ea6-b0f15971eda5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.752133 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ce84bcc-7dce-46f4-9ea6-b0f15971eda5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5ce84bcc-7dce-46f4-9ea6-b0f15971eda5" (UID: "5ce84bcc-7dce-46f4-9ea6-b0f15971eda5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.779135 4795 generic.go:334] "Generic (PLEG): container finished" podID="1242edbc-6450-4d81-8c77-a15fe928d782" containerID="88a0e64a704cb7bfa23aba7f7e671c63558b2f8645737875a12f26fc4e84b413" exitCode=0
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.779206 4795 generic.go:334] "Generic (PLEG): container finished" podID="1242edbc-6450-4d81-8c77-a15fe928d782" containerID="7701a2acbdede4a0972b55ecfa4cf55a62c116819a776ea756674fd7df603662" exitCode=143
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.779205 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1242edbc-6450-4d81-8c77-a15fe928d782","Type":"ContainerDied","Data":"88a0e64a704cb7bfa23aba7f7e671c63558b2f8645737875a12f26fc4e84b413"}
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.779257 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1242edbc-6450-4d81-8c77-a15fe928d782","Type":"ContainerDied","Data":"7701a2acbdede4a0972b55ecfa4cf55a62c116819a776ea756674fd7df603662"}
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.779268 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1242edbc-6450-4d81-8c77-a15fe928d782","Type":"ContainerDied","Data":"977aa2412acf5cdad9aa8c2df443e4a15718820365efcbbb8fd0c0e8dbc6ee3f"}
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.779286 4795 scope.go:117] "RemoveContainer" containerID="88a0e64a704cb7bfa23aba7f7e671c63558b2f8645737875a12f26fc4e84b413"
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.779223 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.781778 4795 generic.go:334] "Generic (PLEG): container finished" podID="5ce84bcc-7dce-46f4-9ea6-b0f15971eda5" containerID="cb5983c6d35380ccf613c286debb3af1df30bb65570107492156109294fb24d8" exitCode=0
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.781823 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5ce84bcc-7dce-46f4-9ea6-b0f15971eda5","Type":"ContainerDied","Data":"cb5983c6d35380ccf613c286debb3af1df30bb65570107492156109294fb24d8"}
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.781856 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5ce84bcc-7dce-46f4-9ea6-b0f15971eda5","Type":"ContainerDied","Data":"1e099de224e3efa46060b80e9a7e4d9002e2f317ca332dea5494f3070c62d20f"}
Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.781819 4795 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.781834 4795 generic.go:334] "Generic (PLEG): container finished" podID="5ce84bcc-7dce-46f4-9ea6-b0f15971eda5" containerID="1e099de224e3efa46060b80e9a7e4d9002e2f317ca332dea5494f3070c62d20f" exitCode=143 Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.781986 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5ce84bcc-7dce-46f4-9ea6-b0f15971eda5","Type":"ContainerDied","Data":"a499dba709ae8a77ef9e5387459f350debf25fd5199f147cbd32741eaacc5a6b"} Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.804100 4795 scope.go:117] "RemoveContainer" containerID="7701a2acbdede4a0972b55ecfa4cf55a62c116819a776ea756674fd7df603662" Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.837689 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7knpk\" (UniqueName: \"kubernetes.io/projected/5ce84bcc-7dce-46f4-9ea6-b0f15971eda5-kube-api-access-7knpk\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.837717 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ce84bcc-7dce-46f4-9ea6-b0f15971eda5-logs\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.837727 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ce84bcc-7dce-46f4-9ea6-b0f15971eda5-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.837735 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ce84bcc-7dce-46f4-9ea6-b0f15971eda5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.839119 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-metadata-0"] Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.842980 4795 scope.go:117] "RemoveContainer" containerID="88a0e64a704cb7bfa23aba7f7e671c63558b2f8645737875a12f26fc4e84b413" Feb 19 22:58:47 crc kubenswrapper[4795]: E0219 22:58:47.844608 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88a0e64a704cb7bfa23aba7f7e671c63558b2f8645737875a12f26fc4e84b413\": container with ID starting with 88a0e64a704cb7bfa23aba7f7e671c63558b2f8645737875a12f26fc4e84b413 not found: ID does not exist" containerID="88a0e64a704cb7bfa23aba7f7e671c63558b2f8645737875a12f26fc4e84b413" Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.844640 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88a0e64a704cb7bfa23aba7f7e671c63558b2f8645737875a12f26fc4e84b413"} err="failed to get container status \"88a0e64a704cb7bfa23aba7f7e671c63558b2f8645737875a12f26fc4e84b413\": rpc error: code = NotFound desc = could not find container \"88a0e64a704cb7bfa23aba7f7e671c63558b2f8645737875a12f26fc4e84b413\": container with ID starting with 88a0e64a704cb7bfa23aba7f7e671c63558b2f8645737875a12f26fc4e84b413 not found: ID does not exist" Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.844661 4795 scope.go:117] "RemoveContainer" containerID="7701a2acbdede4a0972b55ecfa4cf55a62c116819a776ea756674fd7df603662" Feb 19 22:58:47 crc kubenswrapper[4795]: E0219 22:58:47.848652 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7701a2acbdede4a0972b55ecfa4cf55a62c116819a776ea756674fd7df603662\": container with ID starting with 7701a2acbdede4a0972b55ecfa4cf55a62c116819a776ea756674fd7df603662 not found: ID does not exist" containerID="7701a2acbdede4a0972b55ecfa4cf55a62c116819a776ea756674fd7df603662" Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.848694 4795 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7701a2acbdede4a0972b55ecfa4cf55a62c116819a776ea756674fd7df603662"} err="failed to get container status \"7701a2acbdede4a0972b55ecfa4cf55a62c116819a776ea756674fd7df603662\": rpc error: code = NotFound desc = could not find container \"7701a2acbdede4a0972b55ecfa4cf55a62c116819a776ea756674fd7df603662\": container with ID starting with 7701a2acbdede4a0972b55ecfa4cf55a62c116819a776ea756674fd7df603662 not found: ID does not exist" Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.848720 4795 scope.go:117] "RemoveContainer" containerID="88a0e64a704cb7bfa23aba7f7e671c63558b2f8645737875a12f26fc4e84b413" Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.850300 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88a0e64a704cb7bfa23aba7f7e671c63558b2f8645737875a12f26fc4e84b413"} err="failed to get container status \"88a0e64a704cb7bfa23aba7f7e671c63558b2f8645737875a12f26fc4e84b413\": rpc error: code = NotFound desc = could not find container \"88a0e64a704cb7bfa23aba7f7e671c63558b2f8645737875a12f26fc4e84b413\": container with ID starting with 88a0e64a704cb7bfa23aba7f7e671c63558b2f8645737875a12f26fc4e84b413 not found: ID does not exist" Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.850333 4795 scope.go:117] "RemoveContainer" containerID="7701a2acbdede4a0972b55ecfa4cf55a62c116819a776ea756674fd7df603662" Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.851781 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7701a2acbdede4a0972b55ecfa4cf55a62c116819a776ea756674fd7df603662"} err="failed to get container status \"7701a2acbdede4a0972b55ecfa4cf55a62c116819a776ea756674fd7df603662\": rpc error: code = NotFound desc = could not find container \"7701a2acbdede4a0972b55ecfa4cf55a62c116819a776ea756674fd7df603662\": container with ID starting with 
7701a2acbdede4a0972b55ecfa4cf55a62c116819a776ea756674fd7df603662 not found: ID does not exist" Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.851810 4795 scope.go:117] "RemoveContainer" containerID="cb5983c6d35380ccf613c286debb3af1df30bb65570107492156109294fb24d8" Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.872432 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.887261 4795 scope.go:117] "RemoveContainer" containerID="1e099de224e3efa46060b80e9a7e4d9002e2f317ca332dea5494f3070c62d20f" Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.887382 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.908827 4795 scope.go:117] "RemoveContainer" containerID="cb5983c6d35380ccf613c286debb3af1df30bb65570107492156109294fb24d8" Feb 19 22:58:47 crc kubenswrapper[4795]: E0219 22:58:47.909330 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb5983c6d35380ccf613c286debb3af1df30bb65570107492156109294fb24d8\": container with ID starting with cb5983c6d35380ccf613c286debb3af1df30bb65570107492156109294fb24d8 not found: ID does not exist" containerID="cb5983c6d35380ccf613c286debb3af1df30bb65570107492156109294fb24d8" Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.909363 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb5983c6d35380ccf613c286debb3af1df30bb65570107492156109294fb24d8"} err="failed to get container status \"cb5983c6d35380ccf613c286debb3af1df30bb65570107492156109294fb24d8\": rpc error: code = NotFound desc = could not find container \"cb5983c6d35380ccf613c286debb3af1df30bb65570107492156109294fb24d8\": container with ID starting with cb5983c6d35380ccf613c286debb3af1df30bb65570107492156109294fb24d8 not found: ID does not exist" Feb 19 
22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.909382 4795 scope.go:117] "RemoveContainer" containerID="1e099de224e3efa46060b80e9a7e4d9002e2f317ca332dea5494f3070c62d20f" Feb 19 22:58:47 crc kubenswrapper[4795]: E0219 22:58:47.909697 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e099de224e3efa46060b80e9a7e4d9002e2f317ca332dea5494f3070c62d20f\": container with ID starting with 1e099de224e3efa46060b80e9a7e4d9002e2f317ca332dea5494f3070c62d20f not found: ID does not exist" containerID="1e099de224e3efa46060b80e9a7e4d9002e2f317ca332dea5494f3070c62d20f" Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.909727 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e099de224e3efa46060b80e9a7e4d9002e2f317ca332dea5494f3070c62d20f"} err="failed to get container status \"1e099de224e3efa46060b80e9a7e4d9002e2f317ca332dea5494f3070c62d20f\": rpc error: code = NotFound desc = could not find container \"1e099de224e3efa46060b80e9a7e4d9002e2f317ca332dea5494f3070c62d20f\": container with ID starting with 1e099de224e3efa46060b80e9a7e4d9002e2f317ca332dea5494f3070c62d20f not found: ID does not exist" Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.909742 4795 scope.go:117] "RemoveContainer" containerID="cb5983c6d35380ccf613c286debb3af1df30bb65570107492156109294fb24d8" Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.909961 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb5983c6d35380ccf613c286debb3af1df30bb65570107492156109294fb24d8"} err="failed to get container status \"cb5983c6d35380ccf613c286debb3af1df30bb65570107492156109294fb24d8\": rpc error: code = NotFound desc = could not find container \"cb5983c6d35380ccf613c286debb3af1df30bb65570107492156109294fb24d8\": container with ID starting with cb5983c6d35380ccf613c286debb3af1df30bb65570107492156109294fb24d8 not found: ID does not 
exist" Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.909974 4795 scope.go:117] "RemoveContainer" containerID="1e099de224e3efa46060b80e9a7e4d9002e2f317ca332dea5494f3070c62d20f" Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.910215 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e099de224e3efa46060b80e9a7e4d9002e2f317ca332dea5494f3070c62d20f"} err="failed to get container status \"1e099de224e3efa46060b80e9a7e4d9002e2f317ca332dea5494f3070c62d20f\": rpc error: code = NotFound desc = could not find container \"1e099de224e3efa46060b80e9a7e4d9002e2f317ca332dea5494f3070c62d20f\": container with ID starting with 1e099de224e3efa46060b80e9a7e4d9002e2f317ca332dea5494f3070c62d20f not found: ID does not exist" Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.914980 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.927115 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 19 22:58:47 crc kubenswrapper[4795]: E0219 22:58:47.927563 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ce84bcc-7dce-46f4-9ea6-b0f15971eda5" containerName="nova-metadata-log" Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.927586 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ce84bcc-7dce-46f4-9ea6-b0f15971eda5" containerName="nova-metadata-log" Feb 19 22:58:47 crc kubenswrapper[4795]: E0219 22:58:47.927602 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ce84bcc-7dce-46f4-9ea6-b0f15971eda5" containerName="nova-metadata-metadata" Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.927609 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ce84bcc-7dce-46f4-9ea6-b0f15971eda5" containerName="nova-metadata-metadata" Feb 19 22:58:47 crc kubenswrapper[4795]: E0219 22:58:47.927624 4795 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="7314a002-868e-4028-b341-b719a609e21c" containerName="nova-manage" Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.927631 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7314a002-868e-4028-b341-b719a609e21c" containerName="nova-manage" Feb 19 22:58:47 crc kubenswrapper[4795]: E0219 22:58:47.927640 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1242edbc-6450-4d81-8c77-a15fe928d782" containerName="nova-api-log" Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.927645 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="1242edbc-6450-4d81-8c77-a15fe928d782" containerName="nova-api-log" Feb 19 22:58:47 crc kubenswrapper[4795]: E0219 22:58:47.927667 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1242edbc-6450-4d81-8c77-a15fe928d782" containerName="nova-api-api" Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.927672 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="1242edbc-6450-4d81-8c77-a15fe928d782" containerName="nova-api-api" Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.927834 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ce84bcc-7dce-46f4-9ea6-b0f15971eda5" containerName="nova-metadata-log" Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.927847 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ce84bcc-7dce-46f4-9ea6-b0f15971eda5" containerName="nova-metadata-metadata" Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.927863 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="1242edbc-6450-4d81-8c77-a15fe928d782" containerName="nova-api-log" Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.927873 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="1242edbc-6450-4d81-8c77-a15fe928d782" containerName="nova-api-api" Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.927886 4795 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="7314a002-868e-4028-b341-b719a609e21c" containerName="nova-manage" Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.928849 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.930578 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.943880 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.955070 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.956502 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.960366 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 19 22:58:47 crc kubenswrapper[4795]: I0219 22:58:47.964376 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 19 22:58:48 crc kubenswrapper[4795]: I0219 22:58:48.041454 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2afe060c-5092-4927-9f32-02115c78441b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2afe060c-5092-4927-9f32-02115c78441b\") " pod="openstack/nova-metadata-0" Feb 19 22:58:48 crc kubenswrapper[4795]: I0219 22:58:48.041502 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2afe060c-5092-4927-9f32-02115c78441b-config-data\") pod \"nova-metadata-0\" (UID: \"2afe060c-5092-4927-9f32-02115c78441b\") " pod="openstack/nova-metadata-0" Feb 19 22:58:48 crc kubenswrapper[4795]: I0219 
22:58:48.041693 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbg69\" (UniqueName: \"kubernetes.io/projected/01aec0cd-4f6c-4299-a07d-48bc5b04206c-kube-api-access-cbg69\") pod \"nova-api-0\" (UID: \"01aec0cd-4f6c-4299-a07d-48bc5b04206c\") " pod="openstack/nova-api-0" Feb 19 22:58:48 crc kubenswrapper[4795]: I0219 22:58:48.041772 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2afe060c-5092-4927-9f32-02115c78441b-logs\") pod \"nova-metadata-0\" (UID: \"2afe060c-5092-4927-9f32-02115c78441b\") " pod="openstack/nova-metadata-0" Feb 19 22:58:48 crc kubenswrapper[4795]: I0219 22:58:48.042034 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01aec0cd-4f6c-4299-a07d-48bc5b04206c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"01aec0cd-4f6c-4299-a07d-48bc5b04206c\") " pod="openstack/nova-api-0" Feb 19 22:58:48 crc kubenswrapper[4795]: I0219 22:58:48.042155 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01aec0cd-4f6c-4299-a07d-48bc5b04206c-logs\") pod \"nova-api-0\" (UID: \"01aec0cd-4f6c-4299-a07d-48bc5b04206c\") " pod="openstack/nova-api-0" Feb 19 22:58:48 crc kubenswrapper[4795]: I0219 22:58:48.042250 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvb9v\" (UniqueName: \"kubernetes.io/projected/2afe060c-5092-4927-9f32-02115c78441b-kube-api-access-gvb9v\") pod \"nova-metadata-0\" (UID: \"2afe060c-5092-4927-9f32-02115c78441b\") " pod="openstack/nova-metadata-0" Feb 19 22:58:48 crc kubenswrapper[4795]: I0219 22:58:48.042360 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/01aec0cd-4f6c-4299-a07d-48bc5b04206c-config-data\") pod \"nova-api-0\" (UID: \"01aec0cd-4f6c-4299-a07d-48bc5b04206c\") " pod="openstack/nova-api-0" Feb 19 22:58:48 crc kubenswrapper[4795]: I0219 22:58:48.143860 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01aec0cd-4f6c-4299-a07d-48bc5b04206c-config-data\") pod \"nova-api-0\" (UID: \"01aec0cd-4f6c-4299-a07d-48bc5b04206c\") " pod="openstack/nova-api-0" Feb 19 22:58:48 crc kubenswrapper[4795]: I0219 22:58:48.143946 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2afe060c-5092-4927-9f32-02115c78441b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2afe060c-5092-4927-9f32-02115c78441b\") " pod="openstack/nova-metadata-0" Feb 19 22:58:48 crc kubenswrapper[4795]: I0219 22:58:48.143969 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2afe060c-5092-4927-9f32-02115c78441b-config-data\") pod \"nova-metadata-0\" (UID: \"2afe060c-5092-4927-9f32-02115c78441b\") " pod="openstack/nova-metadata-0" Feb 19 22:58:48 crc kubenswrapper[4795]: I0219 22:58:48.143993 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbg69\" (UniqueName: \"kubernetes.io/projected/01aec0cd-4f6c-4299-a07d-48bc5b04206c-kube-api-access-cbg69\") pod \"nova-api-0\" (UID: \"01aec0cd-4f6c-4299-a07d-48bc5b04206c\") " pod="openstack/nova-api-0" Feb 19 22:58:48 crc kubenswrapper[4795]: I0219 22:58:48.144010 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2afe060c-5092-4927-9f32-02115c78441b-logs\") pod \"nova-metadata-0\" (UID: \"2afe060c-5092-4927-9f32-02115c78441b\") " pod="openstack/nova-metadata-0" Feb 19 22:58:48 crc 
kubenswrapper[4795]: I0219 22:58:48.144056 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01aec0cd-4f6c-4299-a07d-48bc5b04206c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"01aec0cd-4f6c-4299-a07d-48bc5b04206c\") " pod="openstack/nova-api-0" Feb 19 22:58:48 crc kubenswrapper[4795]: I0219 22:58:48.144641 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2afe060c-5092-4927-9f32-02115c78441b-logs\") pod \"nova-metadata-0\" (UID: \"2afe060c-5092-4927-9f32-02115c78441b\") " pod="openstack/nova-metadata-0" Feb 19 22:58:48 crc kubenswrapper[4795]: I0219 22:58:48.144961 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01aec0cd-4f6c-4299-a07d-48bc5b04206c-logs\") pod \"nova-api-0\" (UID: \"01aec0cd-4f6c-4299-a07d-48bc5b04206c\") " pod="openstack/nova-api-0" Feb 19 22:58:48 crc kubenswrapper[4795]: I0219 22:58:48.145070 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvb9v\" (UniqueName: \"kubernetes.io/projected/2afe060c-5092-4927-9f32-02115c78441b-kube-api-access-gvb9v\") pod \"nova-metadata-0\" (UID: \"2afe060c-5092-4927-9f32-02115c78441b\") " pod="openstack/nova-metadata-0" Feb 19 22:58:48 crc kubenswrapper[4795]: I0219 22:58:48.145393 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01aec0cd-4f6c-4299-a07d-48bc5b04206c-logs\") pod \"nova-api-0\" (UID: \"01aec0cd-4f6c-4299-a07d-48bc5b04206c\") " pod="openstack/nova-api-0" Feb 19 22:58:48 crc kubenswrapper[4795]: I0219 22:58:48.148312 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01aec0cd-4f6c-4299-a07d-48bc5b04206c-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"01aec0cd-4f6c-4299-a07d-48bc5b04206c\") " pod="openstack/nova-api-0" Feb 19 22:58:48 crc kubenswrapper[4795]: I0219 22:58:48.148688 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01aec0cd-4f6c-4299-a07d-48bc5b04206c-config-data\") pod \"nova-api-0\" (UID: \"01aec0cd-4f6c-4299-a07d-48bc5b04206c\") " pod="openstack/nova-api-0" Feb 19 22:58:48 crc kubenswrapper[4795]: I0219 22:58:48.149195 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2afe060c-5092-4927-9f32-02115c78441b-config-data\") pod \"nova-metadata-0\" (UID: \"2afe060c-5092-4927-9f32-02115c78441b\") " pod="openstack/nova-metadata-0" Feb 19 22:58:48 crc kubenswrapper[4795]: I0219 22:58:48.149815 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2afe060c-5092-4927-9f32-02115c78441b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2afe060c-5092-4927-9f32-02115c78441b\") " pod="openstack/nova-metadata-0" Feb 19 22:58:48 crc kubenswrapper[4795]: I0219 22:58:48.160345 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvb9v\" (UniqueName: \"kubernetes.io/projected/2afe060c-5092-4927-9f32-02115c78441b-kube-api-access-gvb9v\") pod \"nova-metadata-0\" (UID: \"2afe060c-5092-4927-9f32-02115c78441b\") " pod="openstack/nova-metadata-0" Feb 19 22:58:48 crc kubenswrapper[4795]: I0219 22:58:48.165974 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbg69\" (UniqueName: \"kubernetes.io/projected/01aec0cd-4f6c-4299-a07d-48bc5b04206c-kube-api-access-cbg69\") pod \"nova-api-0\" (UID: \"01aec0cd-4f6c-4299-a07d-48bc5b04206c\") " pod="openstack/nova-api-0" Feb 19 22:58:48 crc kubenswrapper[4795]: I0219 22:58:48.246309 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 22:58:48 crc kubenswrapper[4795]: I0219 22:58:48.273229 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 19 22:58:48 crc kubenswrapper[4795]: I0219 22:58:48.695604 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 22:58:48 crc kubenswrapper[4795]: I0219 22:58:48.755552 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 19 22:58:48 crc kubenswrapper[4795]: I0219 22:58:48.810076 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"01aec0cd-4f6c-4299-a07d-48bc5b04206c","Type":"ContainerStarted","Data":"c46bf84b81c77e4ca078f2984ed1e17e83d62628cf87f4fed723d84f60bf74aa"} Feb 19 22:58:48 crc kubenswrapper[4795]: I0219 22:58:48.811789 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2afe060c-5092-4927-9f32-02115c78441b","Type":"ContainerStarted","Data":"a3e377664d37d778a2b6faa79d4b02ff6f9bf8a82cfec3bee0e75d89770ccd50"} Feb 19 22:58:48 crc kubenswrapper[4795]: I0219 22:58:48.936565 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Feb 19 22:58:48 crc kubenswrapper[4795]: I0219 22:58:48.947972 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Feb 19 22:58:49 crc kubenswrapper[4795]: I0219 22:58:49.281245 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b966c788c-4sfvg" Feb 19 22:58:49 crc kubenswrapper[4795]: I0219 22:58:49.352096 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5756cc6d89-g8t6k"] Feb 19 22:58:49 crc kubenswrapper[4795]: I0219 22:58:49.352364 4795 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/dnsmasq-dns-5756cc6d89-g8t6k" podUID="19cb42f3-600f-4079-9dcd-6ba8697d5778" containerName="dnsmasq-dns" containerID="cri-o://8576ec3edc0042901926d1839c03c1f30997ed9e98b34eb8f496f95d2763762e" gracePeriod=10 Feb 19 22:58:49 crc kubenswrapper[4795]: I0219 22:58:49.523147 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1242edbc-6450-4d81-8c77-a15fe928d782" path="/var/lib/kubelet/pods/1242edbc-6450-4d81-8c77-a15fe928d782/volumes" Feb 19 22:58:49 crc kubenswrapper[4795]: I0219 22:58:49.523977 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ce84bcc-7dce-46f4-9ea6-b0f15971eda5" path="/var/lib/kubelet/pods/5ce84bcc-7dce-46f4-9ea6-b0f15971eda5/volumes" Feb 19 22:58:49 crc kubenswrapper[4795]: I0219 22:58:49.820911 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5756cc6d89-g8t6k" Feb 19 22:58:49 crc kubenswrapper[4795]: I0219 22:58:49.826731 4795 generic.go:334] "Generic (PLEG): container finished" podID="19cb42f3-600f-4079-9dcd-6ba8697d5778" containerID="8576ec3edc0042901926d1839c03c1f30997ed9e98b34eb8f496f95d2763762e" exitCode=0 Feb 19 22:58:49 crc kubenswrapper[4795]: I0219 22:58:49.826795 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5756cc6d89-g8t6k" event={"ID":"19cb42f3-600f-4079-9dcd-6ba8697d5778","Type":"ContainerDied","Data":"8576ec3edc0042901926d1839c03c1f30997ed9e98b34eb8f496f95d2763762e"} Feb 19 22:58:49 crc kubenswrapper[4795]: I0219 22:58:49.826812 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5756cc6d89-g8t6k" Feb 19 22:58:49 crc kubenswrapper[4795]: I0219 22:58:49.826844 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5756cc6d89-g8t6k" event={"ID":"19cb42f3-600f-4079-9dcd-6ba8697d5778","Type":"ContainerDied","Data":"1fe0f90c3062cc804bcfcc62ea300db4a0ac21188d96c43b0dd98bd184a851d6"} Feb 19 22:58:49 crc kubenswrapper[4795]: I0219 22:58:49.826863 4795 scope.go:117] "RemoveContainer" containerID="8576ec3edc0042901926d1839c03c1f30997ed9e98b34eb8f496f95d2763762e" Feb 19 22:58:49 crc kubenswrapper[4795]: I0219 22:58:49.835209 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"01aec0cd-4f6c-4299-a07d-48bc5b04206c","Type":"ContainerStarted","Data":"31f457d4286b799feb10f9adf36e58bdbdddb82beffbf370d21f4ced396d7024"} Feb 19 22:58:49 crc kubenswrapper[4795]: I0219 22:58:49.835269 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"01aec0cd-4f6c-4299-a07d-48bc5b04206c","Type":"ContainerStarted","Data":"385cd5652e31298bb87ed87e9552ea3a1a0c6429e349e8ce075cfcc0f4e1be2c"} Feb 19 22:58:49 crc kubenswrapper[4795]: I0219 22:58:49.842627 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2afe060c-5092-4927-9f32-02115c78441b","Type":"ContainerStarted","Data":"f0890adcd512cdcbdc9b03caf18d2d2d9153afacd9acd8984f390842540c262f"} Feb 19 22:58:49 crc kubenswrapper[4795]: I0219 22:58:49.842684 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2afe060c-5092-4927-9f32-02115c78441b","Type":"ContainerStarted","Data":"8b2273b05afbf5c3d01bf3929f023270d1f40dc023354cbf3f40e844201a5b01"} Feb 19 22:58:49 crc kubenswrapper[4795]: I0219 22:58:49.858394 4795 scope.go:117] "RemoveContainer" containerID="d4afbfb223a6fc5d4544602fd724857bd42e63b98d9bf900ae5cadc88d7398f7" Feb 19 22:58:49 crc kubenswrapper[4795]: I0219 
22:58:49.858544 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Feb 19 22:58:49 crc kubenswrapper[4795]: I0219 22:58:49.863457 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.86343945 podStartE2EDuration="2.86343945s" podCreationTimestamp="2026-02-19 22:58:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:58:49.861925881 +0000 UTC m=+5441.054443765" watchObservedRunningTime="2026-02-19 22:58:49.86343945 +0000 UTC m=+5441.055957314" Feb 19 22:58:49 crc kubenswrapper[4795]: I0219 22:58:49.877141 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/19cb42f3-600f-4079-9dcd-6ba8697d5778-dns-svc\") pod \"19cb42f3-600f-4079-9dcd-6ba8697d5778\" (UID: \"19cb42f3-600f-4079-9dcd-6ba8697d5778\") " Feb 19 22:58:49 crc kubenswrapper[4795]: I0219 22:58:49.877270 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19cb42f3-600f-4079-9dcd-6ba8697d5778-config\") pod \"19cb42f3-600f-4079-9dcd-6ba8697d5778\" (UID: \"19cb42f3-600f-4079-9dcd-6ba8697d5778\") " Feb 19 22:58:49 crc kubenswrapper[4795]: I0219 22:58:49.877391 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nz6z5\" (UniqueName: \"kubernetes.io/projected/19cb42f3-600f-4079-9dcd-6ba8697d5778-kube-api-access-nz6z5\") pod \"19cb42f3-600f-4079-9dcd-6ba8697d5778\" (UID: \"19cb42f3-600f-4079-9dcd-6ba8697d5778\") " Feb 19 22:58:49 crc kubenswrapper[4795]: I0219 22:58:49.877448 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/19cb42f3-600f-4079-9dcd-6ba8697d5778-ovsdbserver-sb\") pod 
\"19cb42f3-600f-4079-9dcd-6ba8697d5778\" (UID: \"19cb42f3-600f-4079-9dcd-6ba8697d5778\") " Feb 19 22:58:49 crc kubenswrapper[4795]: I0219 22:58:49.877515 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/19cb42f3-600f-4079-9dcd-6ba8697d5778-ovsdbserver-nb\") pod \"19cb42f3-600f-4079-9dcd-6ba8697d5778\" (UID: \"19cb42f3-600f-4079-9dcd-6ba8697d5778\") " Feb 19 22:58:49 crc kubenswrapper[4795]: I0219 22:58:49.894075 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19cb42f3-600f-4079-9dcd-6ba8697d5778-kube-api-access-nz6z5" (OuterVolumeSpecName: "kube-api-access-nz6z5") pod "19cb42f3-600f-4079-9dcd-6ba8697d5778" (UID: "19cb42f3-600f-4079-9dcd-6ba8697d5778"). InnerVolumeSpecName "kube-api-access-nz6z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:58:49 crc kubenswrapper[4795]: I0219 22:58:49.921433 4795 scope.go:117] "RemoveContainer" containerID="8576ec3edc0042901926d1839c03c1f30997ed9e98b34eb8f496f95d2763762e" Feb 19 22:58:49 crc kubenswrapper[4795]: E0219 22:58:49.930738 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8576ec3edc0042901926d1839c03c1f30997ed9e98b34eb8f496f95d2763762e\": container with ID starting with 8576ec3edc0042901926d1839c03c1f30997ed9e98b34eb8f496f95d2763762e not found: ID does not exist" containerID="8576ec3edc0042901926d1839c03c1f30997ed9e98b34eb8f496f95d2763762e" Feb 19 22:58:49 crc kubenswrapper[4795]: I0219 22:58:49.930797 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8576ec3edc0042901926d1839c03c1f30997ed9e98b34eb8f496f95d2763762e"} err="failed to get container status \"8576ec3edc0042901926d1839c03c1f30997ed9e98b34eb8f496f95d2763762e\": rpc error: code = NotFound desc = could not find container 
\"8576ec3edc0042901926d1839c03c1f30997ed9e98b34eb8f496f95d2763762e\": container with ID starting with 8576ec3edc0042901926d1839c03c1f30997ed9e98b34eb8f496f95d2763762e not found: ID does not exist" Feb 19 22:58:49 crc kubenswrapper[4795]: I0219 22:58:49.930826 4795 scope.go:117] "RemoveContainer" containerID="d4afbfb223a6fc5d4544602fd724857bd42e63b98d9bf900ae5cadc88d7398f7" Feb 19 22:58:49 crc kubenswrapper[4795]: E0219 22:58:49.934203 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4afbfb223a6fc5d4544602fd724857bd42e63b98d9bf900ae5cadc88d7398f7\": container with ID starting with d4afbfb223a6fc5d4544602fd724857bd42e63b98d9bf900ae5cadc88d7398f7 not found: ID does not exist" containerID="d4afbfb223a6fc5d4544602fd724857bd42e63b98d9bf900ae5cadc88d7398f7" Feb 19 22:58:49 crc kubenswrapper[4795]: I0219 22:58:49.934242 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4afbfb223a6fc5d4544602fd724857bd42e63b98d9bf900ae5cadc88d7398f7"} err="failed to get container status \"d4afbfb223a6fc5d4544602fd724857bd42e63b98d9bf900ae5cadc88d7398f7\": rpc error: code = NotFound desc = could not find container \"d4afbfb223a6fc5d4544602fd724857bd42e63b98d9bf900ae5cadc88d7398f7\": container with ID starting with d4afbfb223a6fc5d4544602fd724857bd42e63b98d9bf900ae5cadc88d7398f7 not found: ID does not exist" Feb 19 22:58:49 crc kubenswrapper[4795]: I0219 22:58:49.953611 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19cb42f3-600f-4079-9dcd-6ba8697d5778-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "19cb42f3-600f-4079-9dcd-6ba8697d5778" (UID: "19cb42f3-600f-4079-9dcd-6ba8697d5778"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:58:49 crc kubenswrapper[4795]: I0219 22:58:49.958361 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.958346085 podStartE2EDuration="2.958346085s" podCreationTimestamp="2026-02-19 22:58:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:58:49.929545072 +0000 UTC m=+5441.122062936" watchObservedRunningTime="2026-02-19 22:58:49.958346085 +0000 UTC m=+5441.150863939" Feb 19 22:58:49 crc kubenswrapper[4795]: I0219 22:58:49.961722 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19cb42f3-600f-4079-9dcd-6ba8697d5778-config" (OuterVolumeSpecName: "config") pod "19cb42f3-600f-4079-9dcd-6ba8697d5778" (UID: "19cb42f3-600f-4079-9dcd-6ba8697d5778"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:58:49 crc kubenswrapper[4795]: I0219 22:58:49.981871 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/19cb42f3-600f-4079-9dcd-6ba8697d5778-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:49 crc kubenswrapper[4795]: I0219 22:58:49.981904 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19cb42f3-600f-4079-9dcd-6ba8697d5778-config\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:49 crc kubenswrapper[4795]: I0219 22:58:49.981902 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19cb42f3-600f-4079-9dcd-6ba8697d5778-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "19cb42f3-600f-4079-9dcd-6ba8697d5778" (UID: "19cb42f3-600f-4079-9dcd-6ba8697d5778"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:58:49 crc kubenswrapper[4795]: I0219 22:58:49.981916 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nz6z5\" (UniqueName: \"kubernetes.io/projected/19cb42f3-600f-4079-9dcd-6ba8697d5778-kube-api-access-nz6z5\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:50 crc kubenswrapper[4795]: I0219 22:58:50.013804 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19cb42f3-600f-4079-9dcd-6ba8697d5778-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "19cb42f3-600f-4079-9dcd-6ba8697d5778" (UID: "19cb42f3-600f-4079-9dcd-6ba8697d5778"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:58:50 crc kubenswrapper[4795]: I0219 22:58:50.083907 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/19cb42f3-600f-4079-9dcd-6ba8697d5778-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:50 crc kubenswrapper[4795]: I0219 22:58:50.083937 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/19cb42f3-600f-4079-9dcd-6ba8697d5778-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:50 crc kubenswrapper[4795]: I0219 22:58:50.166368 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5756cc6d89-g8t6k"] Feb 19 22:58:50 crc kubenswrapper[4795]: I0219 22:58:50.174462 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5756cc6d89-g8t6k"] Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.193678 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.316012 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.404735 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da050a33-d860-4577-9ce8-6d85bbdef95f-combined-ca-bundle\") pod \"da050a33-d860-4577-9ce8-6d85bbdef95f\" (UID: \"da050a33-d860-4577-9ce8-6d85bbdef95f\") " Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.404876 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da050a33-d860-4577-9ce8-6d85bbdef95f-config-data\") pod \"da050a33-d860-4577-9ce8-6d85bbdef95f\" (UID: \"da050a33-d860-4577-9ce8-6d85bbdef95f\") " Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.404952 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmxc5\" (UniqueName: \"kubernetes.io/projected/da050a33-d860-4577-9ce8-6d85bbdef95f-kube-api-access-pmxc5\") pod \"da050a33-d860-4577-9ce8-6d85bbdef95f\" (UID: \"da050a33-d860-4577-9ce8-6d85bbdef95f\") " Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.408145 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da050a33-d860-4577-9ce8-6d85bbdef95f-kube-api-access-pmxc5" (OuterVolumeSpecName: "kube-api-access-pmxc5") pod "da050a33-d860-4577-9ce8-6d85bbdef95f" (UID: "da050a33-d860-4577-9ce8-6d85bbdef95f"). InnerVolumeSpecName "kube-api-access-pmxc5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.437606 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da050a33-d860-4577-9ce8-6d85bbdef95f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "da050a33-d860-4577-9ce8-6d85bbdef95f" (UID: "da050a33-d860-4577-9ce8-6d85bbdef95f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.437621 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da050a33-d860-4577-9ce8-6d85bbdef95f-config-data" (OuterVolumeSpecName: "config-data") pod "da050a33-d860-4577-9ce8-6d85bbdef95f" (UID: "da050a33-d860-4577-9ce8-6d85bbdef95f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.506745 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmxc5\" (UniqueName: \"kubernetes.io/projected/da050a33-d860-4577-9ce8-6d85bbdef95f-kube-api-access-pmxc5\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.506776 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da050a33-d860-4577-9ce8-6d85bbdef95f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.506786 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da050a33-d860-4577-9ce8-6d85bbdef95f-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.525043 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19cb42f3-600f-4079-9dcd-6ba8697d5778" path="/var/lib/kubelet/pods/19cb42f3-600f-4079-9dcd-6ba8697d5778/volumes" Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.605278 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-ltwc9"] Feb 19 22:58:51 crc kubenswrapper[4795]: E0219 22:58:51.605664 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da050a33-d860-4577-9ce8-6d85bbdef95f" containerName="nova-scheduler-scheduler" Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.605681 4795 
state_mem.go:107] "Deleted CPUSet assignment" podUID="da050a33-d860-4577-9ce8-6d85bbdef95f" containerName="nova-scheduler-scheduler" Feb 19 22:58:51 crc kubenswrapper[4795]: E0219 22:58:51.605694 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19cb42f3-600f-4079-9dcd-6ba8697d5778" containerName="dnsmasq-dns" Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.605699 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="19cb42f3-600f-4079-9dcd-6ba8697d5778" containerName="dnsmasq-dns" Feb 19 22:58:51 crc kubenswrapper[4795]: E0219 22:58:51.605708 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19cb42f3-600f-4079-9dcd-6ba8697d5778" containerName="init" Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.605716 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="19cb42f3-600f-4079-9dcd-6ba8697d5778" containerName="init" Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.605884 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="da050a33-d860-4577-9ce8-6d85bbdef95f" containerName="nova-scheduler-scheduler" Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.605896 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="19cb42f3-600f-4079-9dcd-6ba8697d5778" containerName="dnsmasq-dns" Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.606475 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-ltwc9" Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.608943 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.609339 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.627560 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-ltwc9"] Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.709660 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btq7g\" (UniqueName: \"kubernetes.io/projected/0532ff51-023e-4663-9c95-6545236a8fb3-kube-api-access-btq7g\") pod \"nova-cell1-cell-mapping-ltwc9\" (UID: \"0532ff51-023e-4663-9c95-6545236a8fb3\") " pod="openstack/nova-cell1-cell-mapping-ltwc9" Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.709745 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0532ff51-023e-4663-9c95-6545236a8fb3-config-data\") pod \"nova-cell1-cell-mapping-ltwc9\" (UID: \"0532ff51-023e-4663-9c95-6545236a8fb3\") " pod="openstack/nova-cell1-cell-mapping-ltwc9" Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.709801 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0532ff51-023e-4663-9c95-6545236a8fb3-scripts\") pod \"nova-cell1-cell-mapping-ltwc9\" (UID: \"0532ff51-023e-4663-9c95-6545236a8fb3\") " pod="openstack/nova-cell1-cell-mapping-ltwc9" Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.710002 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0532ff51-023e-4663-9c95-6545236a8fb3-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-ltwc9\" (UID: \"0532ff51-023e-4663-9c95-6545236a8fb3\") " pod="openstack/nova-cell1-cell-mapping-ltwc9" Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.812336 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0532ff51-023e-4663-9c95-6545236a8fb3-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-ltwc9\" (UID: \"0532ff51-023e-4663-9c95-6545236a8fb3\") " pod="openstack/nova-cell1-cell-mapping-ltwc9" Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.812447 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btq7g\" (UniqueName: \"kubernetes.io/projected/0532ff51-023e-4663-9c95-6545236a8fb3-kube-api-access-btq7g\") pod \"nova-cell1-cell-mapping-ltwc9\" (UID: \"0532ff51-023e-4663-9c95-6545236a8fb3\") " pod="openstack/nova-cell1-cell-mapping-ltwc9" Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.812475 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0532ff51-023e-4663-9c95-6545236a8fb3-config-data\") pod \"nova-cell1-cell-mapping-ltwc9\" (UID: \"0532ff51-023e-4663-9c95-6545236a8fb3\") " pod="openstack/nova-cell1-cell-mapping-ltwc9" Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.812526 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0532ff51-023e-4663-9c95-6545236a8fb3-scripts\") pod \"nova-cell1-cell-mapping-ltwc9\" (UID: \"0532ff51-023e-4663-9c95-6545236a8fb3\") " pod="openstack/nova-cell1-cell-mapping-ltwc9" Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.817786 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0532ff51-023e-4663-9c95-6545236a8fb3-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-ltwc9\" (UID: \"0532ff51-023e-4663-9c95-6545236a8fb3\") " pod="openstack/nova-cell1-cell-mapping-ltwc9" Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.825119 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0532ff51-023e-4663-9c95-6545236a8fb3-config-data\") pod \"nova-cell1-cell-mapping-ltwc9\" (UID: \"0532ff51-023e-4663-9c95-6545236a8fb3\") " pod="openstack/nova-cell1-cell-mapping-ltwc9" Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.825533 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0532ff51-023e-4663-9c95-6545236a8fb3-scripts\") pod \"nova-cell1-cell-mapping-ltwc9\" (UID: \"0532ff51-023e-4663-9c95-6545236a8fb3\") " pod="openstack/nova-cell1-cell-mapping-ltwc9" Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.834016 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btq7g\" (UniqueName: \"kubernetes.io/projected/0532ff51-023e-4663-9c95-6545236a8fb3-kube-api-access-btq7g\") pod \"nova-cell1-cell-mapping-ltwc9\" (UID: \"0532ff51-023e-4663-9c95-6545236a8fb3\") " pod="openstack/nova-cell1-cell-mapping-ltwc9" Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.875103 4795 generic.go:334] "Generic (PLEG): container finished" podID="da050a33-d860-4577-9ce8-6d85bbdef95f" containerID="3d0139e0f9c4cd718f08e91ed33067485c3af22fed1718e1e129fa1a78a292fd" exitCode=0 Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.875155 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"da050a33-d860-4577-9ce8-6d85bbdef95f","Type":"ContainerDied","Data":"3d0139e0f9c4cd718f08e91ed33067485c3af22fed1718e1e129fa1a78a292fd"} Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.875203 4795 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"da050a33-d860-4577-9ce8-6d85bbdef95f","Type":"ContainerDied","Data":"0e85b25b71b392efb52bf0edd446550d919136b3f26e10807779e4867fda37bb"} Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.875223 4795 scope.go:117] "RemoveContainer" containerID="3d0139e0f9c4cd718f08e91ed33067485c3af22fed1718e1e129fa1a78a292fd" Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.875328 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.917435 4795 scope.go:117] "RemoveContainer" containerID="3d0139e0f9c4cd718f08e91ed33067485c3af22fed1718e1e129fa1a78a292fd" Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.917432 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 22:58:51 crc kubenswrapper[4795]: E0219 22:58:51.918793 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d0139e0f9c4cd718f08e91ed33067485c3af22fed1718e1e129fa1a78a292fd\": container with ID starting with 3d0139e0f9c4cd718f08e91ed33067485c3af22fed1718e1e129fa1a78a292fd not found: ID does not exist" containerID="3d0139e0f9c4cd718f08e91ed33067485c3af22fed1718e1e129fa1a78a292fd" Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.918944 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d0139e0f9c4cd718f08e91ed33067485c3af22fed1718e1e129fa1a78a292fd"} err="failed to get container status \"3d0139e0f9c4cd718f08e91ed33067485c3af22fed1718e1e129fa1a78a292fd\": rpc error: code = NotFound desc = could not find container \"3d0139e0f9c4cd718f08e91ed33067485c3af22fed1718e1e129fa1a78a292fd\": container with ID starting with 3d0139e0f9c4cd718f08e91ed33067485c3af22fed1718e1e129fa1a78a292fd not found: ID does not exist" Feb 19 22:58:51 crc 
kubenswrapper[4795]: I0219 22:58:51.930034 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.940807 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.942230 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.944671 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.951607 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 22:58:51 crc kubenswrapper[4795]: I0219 22:58:51.955184 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-ltwc9" Feb 19 22:58:52 crc kubenswrapper[4795]: I0219 22:58:52.019499 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-556k9\" (UniqueName: \"kubernetes.io/projected/5eee4148-ad3f-42ae-954b-20103c8869e0-kube-api-access-556k9\") pod \"nova-scheduler-0\" (UID: \"5eee4148-ad3f-42ae-954b-20103c8869e0\") " pod="openstack/nova-scheduler-0" Feb 19 22:58:52 crc kubenswrapper[4795]: I0219 22:58:52.019559 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5eee4148-ad3f-42ae-954b-20103c8869e0-config-data\") pod \"nova-scheduler-0\" (UID: \"5eee4148-ad3f-42ae-954b-20103c8869e0\") " pod="openstack/nova-scheduler-0" Feb 19 22:58:52 crc kubenswrapper[4795]: I0219 22:58:52.019593 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5eee4148-ad3f-42ae-954b-20103c8869e0-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5eee4148-ad3f-42ae-954b-20103c8869e0\") " pod="openstack/nova-scheduler-0" Feb 19 22:58:52 crc kubenswrapper[4795]: I0219 22:58:52.120851 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5eee4148-ad3f-42ae-954b-20103c8869e0-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5eee4148-ad3f-42ae-954b-20103c8869e0\") " pod="openstack/nova-scheduler-0" Feb 19 22:58:52 crc kubenswrapper[4795]: I0219 22:58:52.121477 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-556k9\" (UniqueName: \"kubernetes.io/projected/5eee4148-ad3f-42ae-954b-20103c8869e0-kube-api-access-556k9\") pod \"nova-scheduler-0\" (UID: \"5eee4148-ad3f-42ae-954b-20103c8869e0\") " pod="openstack/nova-scheduler-0" Feb 19 22:58:52 crc kubenswrapper[4795]: I0219 22:58:52.121540 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5eee4148-ad3f-42ae-954b-20103c8869e0-config-data\") pod \"nova-scheduler-0\" (UID: \"5eee4148-ad3f-42ae-954b-20103c8869e0\") " pod="openstack/nova-scheduler-0" Feb 19 22:58:52 crc kubenswrapper[4795]: I0219 22:58:52.131306 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5eee4148-ad3f-42ae-954b-20103c8869e0-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5eee4148-ad3f-42ae-954b-20103c8869e0\") " pod="openstack/nova-scheduler-0" Feb 19 22:58:52 crc kubenswrapper[4795]: I0219 22:58:52.131482 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5eee4148-ad3f-42ae-954b-20103c8869e0-config-data\") pod \"nova-scheduler-0\" (UID: \"5eee4148-ad3f-42ae-954b-20103c8869e0\") " 
pod="openstack/nova-scheduler-0" Feb 19 22:58:52 crc kubenswrapper[4795]: I0219 22:58:52.143353 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-556k9\" (UniqueName: \"kubernetes.io/projected/5eee4148-ad3f-42ae-954b-20103c8869e0-kube-api-access-556k9\") pod \"nova-scheduler-0\" (UID: \"5eee4148-ad3f-42ae-954b-20103c8869e0\") " pod="openstack/nova-scheduler-0" Feb 19 22:58:52 crc kubenswrapper[4795]: I0219 22:58:52.261466 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 22:58:52 crc kubenswrapper[4795]: I0219 22:58:52.409309 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-ltwc9"] Feb 19 22:58:52 crc kubenswrapper[4795]: W0219 22:58:52.414665 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0532ff51_023e_4663_9c95_6545236a8fb3.slice/crio-f69afc9e6e4cd179463d288656fef123098b33509c0f349c8088c17d49d1fbf7 WatchSource:0}: Error finding container f69afc9e6e4cd179463d288656fef123098b33509c0f349c8088c17d49d1fbf7: Status 404 returned error can't find the container with id f69afc9e6e4cd179463d288656fef123098b33509c0f349c8088c17d49d1fbf7 Feb 19 22:58:52 crc kubenswrapper[4795]: W0219 22:58:52.668798 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5eee4148_ad3f_42ae_954b_20103c8869e0.slice/crio-6f9fc2a856df4fd02d1f188c7bd17a0205125efe04c5dc6a27afb8ab33d1efad WatchSource:0}: Error finding container 6f9fc2a856df4fd02d1f188c7bd17a0205125efe04c5dc6a27afb8ab33d1efad: Status 404 returned error can't find the container with id 6f9fc2a856df4fd02d1f188c7bd17a0205125efe04c5dc6a27afb8ab33d1efad Feb 19 22:58:52 crc kubenswrapper[4795]: I0219 22:58:52.677726 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 22:58:52 crc 
kubenswrapper[4795]: I0219 22:58:52.884486 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-ltwc9" event={"ID":"0532ff51-023e-4663-9c95-6545236a8fb3","Type":"ContainerStarted","Data":"3e6c87188fd15e9809b882f54d39f3a4062c4abbd3e889c36c86f66a2c644f00"} Feb 19 22:58:52 crc kubenswrapper[4795]: I0219 22:58:52.884530 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-ltwc9" event={"ID":"0532ff51-023e-4663-9c95-6545236a8fb3","Type":"ContainerStarted","Data":"f69afc9e6e4cd179463d288656fef123098b33509c0f349c8088c17d49d1fbf7"} Feb 19 22:58:52 crc kubenswrapper[4795]: I0219 22:58:52.886966 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5eee4148-ad3f-42ae-954b-20103c8869e0","Type":"ContainerStarted","Data":"280220290d29bc48350410df59cfc0bb598766f640d75821796b950e43cea3ef"} Feb 19 22:58:52 crc kubenswrapper[4795]: I0219 22:58:52.887002 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5eee4148-ad3f-42ae-954b-20103c8869e0","Type":"ContainerStarted","Data":"6f9fc2a856df4fd02d1f188c7bd17a0205125efe04c5dc6a27afb8ab33d1efad"} Feb 19 22:58:52 crc kubenswrapper[4795]: I0219 22:58:52.909155 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-ltwc9" podStartSLOduration=1.909136586 podStartE2EDuration="1.909136586s" podCreationTimestamp="2026-02-19 22:58:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:58:52.897208133 +0000 UTC m=+5444.089725997" watchObservedRunningTime="2026-02-19 22:58:52.909136586 +0000 UTC m=+5444.101654440" Feb 19 22:58:52 crc kubenswrapper[4795]: I0219 22:58:52.922868 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.922843815 
podStartE2EDuration="1.922843815s" podCreationTimestamp="2026-02-19 22:58:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:58:52.913187789 +0000 UTC m=+5444.105705653" watchObservedRunningTime="2026-02-19 22:58:52.922843815 +0000 UTC m=+5444.115361679" Feb 19 22:58:53 crc kubenswrapper[4795]: I0219 22:58:53.247518 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 19 22:58:53 crc kubenswrapper[4795]: I0219 22:58:53.247891 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 19 22:58:53 crc kubenswrapper[4795]: I0219 22:58:53.520702 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da050a33-d860-4577-9ce8-6d85bbdef95f" path="/var/lib/kubelet/pods/da050a33-d860-4577-9ce8-6d85bbdef95f/volumes" Feb 19 22:58:57 crc kubenswrapper[4795]: I0219 22:58:57.262510 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 19 22:58:57 crc kubenswrapper[4795]: I0219 22:58:57.946412 4795 generic.go:334] "Generic (PLEG): container finished" podID="0532ff51-023e-4663-9c95-6545236a8fb3" containerID="3e6c87188fd15e9809b882f54d39f3a4062c4abbd3e889c36c86f66a2c644f00" exitCode=0 Feb 19 22:58:57 crc kubenswrapper[4795]: I0219 22:58:57.946513 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-ltwc9" event={"ID":"0532ff51-023e-4663-9c95-6545236a8fb3","Type":"ContainerDied","Data":"3e6c87188fd15e9809b882f54d39f3a4062c4abbd3e889c36c86f66a2c644f00"} Feb 19 22:58:58 crc kubenswrapper[4795]: I0219 22:58:58.247234 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 19 22:58:58 crc kubenswrapper[4795]: I0219 22:58:58.247281 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/nova-metadata-0" Feb 19 22:58:58 crc kubenswrapper[4795]: I0219 22:58:58.274399 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 19 22:58:58 crc kubenswrapper[4795]: I0219 22:58:58.274460 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 19 22:58:58 crc kubenswrapper[4795]: I0219 22:58:58.427043 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 22:58:58 crc kubenswrapper[4795]: I0219 22:58:58.427111 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 22:58:59 crc kubenswrapper[4795]: I0219 22:58:59.294405 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-ltwc9" Feb 19 22:58:59 crc kubenswrapper[4795]: I0219 22:58:59.374967 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0532ff51-023e-4663-9c95-6545236a8fb3-config-data\") pod \"0532ff51-023e-4663-9c95-6545236a8fb3\" (UID: \"0532ff51-023e-4663-9c95-6545236a8fb3\") " Feb 19 22:58:59 crc kubenswrapper[4795]: I0219 22:58:59.375041 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-btq7g\" (UniqueName: \"kubernetes.io/projected/0532ff51-023e-4663-9c95-6545236a8fb3-kube-api-access-btq7g\") pod \"0532ff51-023e-4663-9c95-6545236a8fb3\" (UID: \"0532ff51-023e-4663-9c95-6545236a8fb3\") " Feb 19 22:58:59 crc kubenswrapper[4795]: I0219 22:58:59.375075 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0532ff51-023e-4663-9c95-6545236a8fb3-scripts\") pod \"0532ff51-023e-4663-9c95-6545236a8fb3\" (UID: \"0532ff51-023e-4663-9c95-6545236a8fb3\") " Feb 19 22:58:59 crc kubenswrapper[4795]: I0219 22:58:59.375101 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0532ff51-023e-4663-9c95-6545236a8fb3-combined-ca-bundle\") pod \"0532ff51-023e-4663-9c95-6545236a8fb3\" (UID: \"0532ff51-023e-4663-9c95-6545236a8fb3\") " Feb 19 22:58:59 crc kubenswrapper[4795]: I0219 22:58:59.396697 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0532ff51-023e-4663-9c95-6545236a8fb3-kube-api-access-btq7g" (OuterVolumeSpecName: "kube-api-access-btq7g") pod "0532ff51-023e-4663-9c95-6545236a8fb3" (UID: "0532ff51-023e-4663-9c95-6545236a8fb3"). InnerVolumeSpecName "kube-api-access-btq7g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:58:59 crc kubenswrapper[4795]: I0219 22:58:59.402430 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0532ff51-023e-4663-9c95-6545236a8fb3-scripts" (OuterVolumeSpecName: "scripts") pod "0532ff51-023e-4663-9c95-6545236a8fb3" (UID: "0532ff51-023e-4663-9c95-6545236a8fb3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:58:59 crc kubenswrapper[4795]: I0219 22:58:59.409698 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0532ff51-023e-4663-9c95-6545236a8fb3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0532ff51-023e-4663-9c95-6545236a8fb3" (UID: "0532ff51-023e-4663-9c95-6545236a8fb3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:58:59 crc kubenswrapper[4795]: I0219 22:58:59.414363 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0532ff51-023e-4663-9c95-6545236a8fb3-config-data" (OuterVolumeSpecName: "config-data") pod "0532ff51-023e-4663-9c95-6545236a8fb3" (UID: "0532ff51-023e-4663-9c95-6545236a8fb3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:58:59 crc kubenswrapper[4795]: I0219 22:58:59.416315 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="2afe060c-5092-4927-9f32-02115c78441b" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.65:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 22:58:59 crc kubenswrapper[4795]: I0219 22:58:59.416367 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="01aec0cd-4f6c-4299-a07d-48bc5b04206c" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.66:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 22:58:59 crc kubenswrapper[4795]: I0219 22:58:59.416412 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="2afe060c-5092-4927-9f32-02115c78441b" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.65:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 22:58:59 crc kubenswrapper[4795]: I0219 22:58:59.416676 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="01aec0cd-4f6c-4299-a07d-48bc5b04206c" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.66:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 22:58:59 crc kubenswrapper[4795]: I0219 22:58:59.476958 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0532ff51-023e-4663-9c95-6545236a8fb3-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:59 crc kubenswrapper[4795]: I0219 22:58:59.477299 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-btq7g\" (UniqueName: 
\"kubernetes.io/projected/0532ff51-023e-4663-9c95-6545236a8fb3-kube-api-access-btq7g\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:59 crc kubenswrapper[4795]: I0219 22:58:59.477395 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0532ff51-023e-4663-9c95-6545236a8fb3-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:59 crc kubenswrapper[4795]: I0219 22:58:59.477470 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0532ff51-023e-4663-9c95-6545236a8fb3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 22:58:59 crc kubenswrapper[4795]: I0219 22:58:59.968272 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-ltwc9" event={"ID":"0532ff51-023e-4663-9c95-6545236a8fb3","Type":"ContainerDied","Data":"f69afc9e6e4cd179463d288656fef123098b33509c0f349c8088c17d49d1fbf7"} Feb 19 22:58:59 crc kubenswrapper[4795]: I0219 22:58:59.968722 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f69afc9e6e4cd179463d288656fef123098b33509c0f349c8088c17d49d1fbf7" Feb 19 22:58:59 crc kubenswrapper[4795]: I0219 22:58:59.968415 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-ltwc9" Feb 19 22:59:00 crc kubenswrapper[4795]: I0219 22:59:00.162735 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 19 22:59:00 crc kubenswrapper[4795]: I0219 22:59:00.163019 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="01aec0cd-4f6c-4299-a07d-48bc5b04206c" containerName="nova-api-log" containerID="cri-o://385cd5652e31298bb87ed87e9552ea3a1a0c6429e349e8ce075cfcc0f4e1be2c" gracePeriod=30 Feb 19 22:59:00 crc kubenswrapper[4795]: I0219 22:59:00.163120 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="01aec0cd-4f6c-4299-a07d-48bc5b04206c" containerName="nova-api-api" containerID="cri-o://31f457d4286b799feb10f9adf36e58bdbdddb82beffbf370d21f4ced396d7024" gracePeriod=30 Feb 19 22:59:00 crc kubenswrapper[4795]: I0219 22:59:00.173953 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 22:59:00 crc kubenswrapper[4795]: I0219 22:59:00.174453 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="5eee4148-ad3f-42ae-954b-20103c8869e0" containerName="nova-scheduler-scheduler" containerID="cri-o://280220290d29bc48350410df59cfc0bb598766f640d75821796b950e43cea3ef" gracePeriod=30 Feb 19 22:59:00 crc kubenswrapper[4795]: I0219 22:59:00.230576 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 22:59:00 crc kubenswrapper[4795]: I0219 22:59:00.230920 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="2afe060c-5092-4927-9f32-02115c78441b" containerName="nova-metadata-log" containerID="cri-o://8b2273b05afbf5c3d01bf3929f023270d1f40dc023354cbf3f40e844201a5b01" gracePeriod=30 Feb 19 22:59:00 crc kubenswrapper[4795]: I0219 22:59:00.231727 4795 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="2afe060c-5092-4927-9f32-02115c78441b" containerName="nova-metadata-metadata" containerID="cri-o://f0890adcd512cdcbdc9b03caf18d2d2d9153afacd9acd8984f390842540c262f" gracePeriod=30 Feb 19 22:59:00 crc kubenswrapper[4795]: I0219 22:59:00.979474 4795 generic.go:334] "Generic (PLEG): container finished" podID="01aec0cd-4f6c-4299-a07d-48bc5b04206c" containerID="385cd5652e31298bb87ed87e9552ea3a1a0c6429e349e8ce075cfcc0f4e1be2c" exitCode=143 Feb 19 22:59:00 crc kubenswrapper[4795]: I0219 22:59:00.979570 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"01aec0cd-4f6c-4299-a07d-48bc5b04206c","Type":"ContainerDied","Data":"385cd5652e31298bb87ed87e9552ea3a1a0c6429e349e8ce075cfcc0f4e1be2c"} Feb 19 22:59:00 crc kubenswrapper[4795]: I0219 22:59:00.981914 4795 generic.go:334] "Generic (PLEG): container finished" podID="2afe060c-5092-4927-9f32-02115c78441b" containerID="8b2273b05afbf5c3d01bf3929f023270d1f40dc023354cbf3f40e844201a5b01" exitCode=143 Feb 19 22:59:00 crc kubenswrapper[4795]: I0219 22:59:00.981960 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2afe060c-5092-4927-9f32-02115c78441b","Type":"ContainerDied","Data":"8b2273b05afbf5c3d01bf3929f023270d1f40dc023354cbf3f40e844201a5b01"} Feb 19 22:59:03 crc kubenswrapper[4795]: I0219 22:59:03.794063 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 22:59:03 crc kubenswrapper[4795]: I0219 22:59:03.872768 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvb9v\" (UniqueName: \"kubernetes.io/projected/2afe060c-5092-4927-9f32-02115c78441b-kube-api-access-gvb9v\") pod \"2afe060c-5092-4927-9f32-02115c78441b\" (UID: \"2afe060c-5092-4927-9f32-02115c78441b\") " Feb 19 22:59:03 crc kubenswrapper[4795]: I0219 22:59:03.872821 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2afe060c-5092-4927-9f32-02115c78441b-config-data\") pod \"2afe060c-5092-4927-9f32-02115c78441b\" (UID: \"2afe060c-5092-4927-9f32-02115c78441b\") " Feb 19 22:59:03 crc kubenswrapper[4795]: I0219 22:59:03.873800 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2afe060c-5092-4927-9f32-02115c78441b-combined-ca-bundle\") pod \"2afe060c-5092-4927-9f32-02115c78441b\" (UID: \"2afe060c-5092-4927-9f32-02115c78441b\") " Feb 19 22:59:03 crc kubenswrapper[4795]: I0219 22:59:03.873849 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2afe060c-5092-4927-9f32-02115c78441b-logs\") pod \"2afe060c-5092-4927-9f32-02115c78441b\" (UID: \"2afe060c-5092-4927-9f32-02115c78441b\") " Feb 19 22:59:03 crc kubenswrapper[4795]: I0219 22:59:03.874402 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2afe060c-5092-4927-9f32-02115c78441b-logs" (OuterVolumeSpecName: "logs") pod "2afe060c-5092-4927-9f32-02115c78441b" (UID: "2afe060c-5092-4927-9f32-02115c78441b"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:59:03 crc kubenswrapper[4795]: I0219 22:59:03.879641 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2afe060c-5092-4927-9f32-02115c78441b-kube-api-access-gvb9v" (OuterVolumeSpecName: "kube-api-access-gvb9v") pod "2afe060c-5092-4927-9f32-02115c78441b" (UID: "2afe060c-5092-4927-9f32-02115c78441b"). InnerVolumeSpecName "kube-api-access-gvb9v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:59:03 crc kubenswrapper[4795]: I0219 22:59:03.914395 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2afe060c-5092-4927-9f32-02115c78441b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2afe060c-5092-4927-9f32-02115c78441b" (UID: "2afe060c-5092-4927-9f32-02115c78441b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:59:03 crc kubenswrapper[4795]: I0219 22:59:03.918803 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2afe060c-5092-4927-9f32-02115c78441b-config-data" (OuterVolumeSpecName: "config-data") pod "2afe060c-5092-4927-9f32-02115c78441b" (UID: "2afe060c-5092-4927-9f32-02115c78441b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:59:03 crc kubenswrapper[4795]: I0219 22:59:03.975922 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvb9v\" (UniqueName: \"kubernetes.io/projected/2afe060c-5092-4927-9f32-02115c78441b-kube-api-access-gvb9v\") on node \"crc\" DevicePath \"\"" Feb 19 22:59:03 crc kubenswrapper[4795]: I0219 22:59:03.975956 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2afe060c-5092-4927-9f32-02115c78441b-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 22:59:03 crc kubenswrapper[4795]: I0219 22:59:03.975965 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2afe060c-5092-4927-9f32-02115c78441b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 22:59:03 crc kubenswrapper[4795]: I0219 22:59:03.975973 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2afe060c-5092-4927-9f32-02115c78441b-logs\") on node \"crc\" DevicePath \"\"" Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.017080 4795 generic.go:334] "Generic (PLEG): container finished" podID="2afe060c-5092-4927-9f32-02115c78441b" containerID="f0890adcd512cdcbdc9b03caf18d2d2d9153afacd9acd8984f390842540c262f" exitCode=0 Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.017148 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.017182 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2afe060c-5092-4927-9f32-02115c78441b","Type":"ContainerDied","Data":"f0890adcd512cdcbdc9b03caf18d2d2d9153afacd9acd8984f390842540c262f"} Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.017226 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2afe060c-5092-4927-9f32-02115c78441b","Type":"ContainerDied","Data":"a3e377664d37d778a2b6faa79d4b02ff6f9bf8a82cfec3bee0e75d89770ccd50"} Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.017248 4795 scope.go:117] "RemoveContainer" containerID="f0890adcd512cdcbdc9b03caf18d2d2d9153afacd9acd8984f390842540c262f" Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.018981 4795 generic.go:334] "Generic (PLEG): container finished" podID="5eee4148-ad3f-42ae-954b-20103c8869e0" containerID="280220290d29bc48350410df59cfc0bb598766f640d75821796b950e43cea3ef" exitCode=0 Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.019011 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5eee4148-ad3f-42ae-954b-20103c8869e0","Type":"ContainerDied","Data":"280220290d29bc48350410df59cfc0bb598766f640d75821796b950e43cea3ef"} Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.052442 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.057012 4795 scope.go:117] "RemoveContainer" containerID="8b2273b05afbf5c3d01bf3929f023270d1f40dc023354cbf3f40e844201a5b01" Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.061541 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.116009 4795 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-metadata-0"] Feb 19 22:59:04 crc kubenswrapper[4795]: E0219 22:59:04.118125 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0532ff51-023e-4663-9c95-6545236a8fb3" containerName="nova-manage" Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.118153 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="0532ff51-023e-4663-9c95-6545236a8fb3" containerName="nova-manage" Feb 19 22:59:04 crc kubenswrapper[4795]: E0219 22:59:04.118196 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2afe060c-5092-4927-9f32-02115c78441b" containerName="nova-metadata-log" Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.118206 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2afe060c-5092-4927-9f32-02115c78441b" containerName="nova-metadata-log" Feb 19 22:59:04 crc kubenswrapper[4795]: E0219 22:59:04.118230 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2afe060c-5092-4927-9f32-02115c78441b" containerName="nova-metadata-metadata" Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.118241 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2afe060c-5092-4927-9f32-02115c78441b" containerName="nova-metadata-metadata" Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.118500 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="0532ff51-023e-4663-9c95-6545236a8fb3" containerName="nova-manage" Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.118528 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="2afe060c-5092-4927-9f32-02115c78441b" containerName="nova-metadata-metadata" Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.118544 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="2afe060c-5092-4927-9f32-02115c78441b" containerName="nova-metadata-log" Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.121016 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.124586 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.124950 4795 scope.go:117] "RemoveContainer" containerID="f0890adcd512cdcbdc9b03caf18d2d2d9153afacd9acd8984f390842540c262f" Feb 19 22:59:04 crc kubenswrapper[4795]: E0219 22:59:04.126013 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0890adcd512cdcbdc9b03caf18d2d2d9153afacd9acd8984f390842540c262f\": container with ID starting with f0890adcd512cdcbdc9b03caf18d2d2d9153afacd9acd8984f390842540c262f not found: ID does not exist" containerID="f0890adcd512cdcbdc9b03caf18d2d2d9153afacd9acd8984f390842540c262f" Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.126067 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0890adcd512cdcbdc9b03caf18d2d2d9153afacd9acd8984f390842540c262f"} err="failed to get container status \"f0890adcd512cdcbdc9b03caf18d2d2d9153afacd9acd8984f390842540c262f\": rpc error: code = NotFound desc = could not find container \"f0890adcd512cdcbdc9b03caf18d2d2d9153afacd9acd8984f390842540c262f\": container with ID starting with f0890adcd512cdcbdc9b03caf18d2d2d9153afacd9acd8984f390842540c262f not found: ID does not exist" Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.126111 4795 scope.go:117] "RemoveContainer" containerID="8b2273b05afbf5c3d01bf3929f023270d1f40dc023354cbf3f40e844201a5b01" Feb 19 22:59:04 crc kubenswrapper[4795]: E0219 22:59:04.128181 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b2273b05afbf5c3d01bf3929f023270d1f40dc023354cbf3f40e844201a5b01\": container with ID starting with 8b2273b05afbf5c3d01bf3929f023270d1f40dc023354cbf3f40e844201a5b01 
not found: ID does not exist" containerID="8b2273b05afbf5c3d01bf3929f023270d1f40dc023354cbf3f40e844201a5b01" Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.128215 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b2273b05afbf5c3d01bf3929f023270d1f40dc023354cbf3f40e844201a5b01"} err="failed to get container status \"8b2273b05afbf5c3d01bf3929f023270d1f40dc023354cbf3f40e844201a5b01\": rpc error: code = NotFound desc = could not find container \"8b2273b05afbf5c3d01bf3929f023270d1f40dc023354cbf3f40e844201a5b01\": container with ID starting with 8b2273b05afbf5c3d01bf3929f023270d1f40dc023354cbf3f40e844201a5b01 not found: ID does not exist" Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.149975 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.180065 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4khb7\" (UniqueName: \"kubernetes.io/projected/8c7ab995-e9bc-4bc6-913f-6ceb0efa562a-kube-api-access-4khb7\") pod \"nova-metadata-0\" (UID: \"8c7ab995-e9bc-4bc6-913f-6ceb0efa562a\") " pod="openstack/nova-metadata-0" Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.180146 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c7ab995-e9bc-4bc6-913f-6ceb0efa562a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8c7ab995-e9bc-4bc6-913f-6ceb0efa562a\") " pod="openstack/nova-metadata-0" Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.180225 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c7ab995-e9bc-4bc6-913f-6ceb0efa562a-config-data\") pod \"nova-metadata-0\" (UID: \"8c7ab995-e9bc-4bc6-913f-6ceb0efa562a\") " 
pod="openstack/nova-metadata-0" Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.180421 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c7ab995-e9bc-4bc6-913f-6ceb0efa562a-logs\") pod \"nova-metadata-0\" (UID: \"8c7ab995-e9bc-4bc6-913f-6ceb0efa562a\") " pod="openstack/nova-metadata-0" Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.281633 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4khb7\" (UniqueName: \"kubernetes.io/projected/8c7ab995-e9bc-4bc6-913f-6ceb0efa562a-kube-api-access-4khb7\") pod \"nova-metadata-0\" (UID: \"8c7ab995-e9bc-4bc6-913f-6ceb0efa562a\") " pod="openstack/nova-metadata-0" Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.281733 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c7ab995-e9bc-4bc6-913f-6ceb0efa562a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8c7ab995-e9bc-4bc6-913f-6ceb0efa562a\") " pod="openstack/nova-metadata-0" Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.281793 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c7ab995-e9bc-4bc6-913f-6ceb0efa562a-config-data\") pod \"nova-metadata-0\" (UID: \"8c7ab995-e9bc-4bc6-913f-6ceb0efa562a\") " pod="openstack/nova-metadata-0" Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.281859 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c7ab995-e9bc-4bc6-913f-6ceb0efa562a-logs\") pod \"nova-metadata-0\" (UID: \"8c7ab995-e9bc-4bc6-913f-6ceb0efa562a\") " pod="openstack/nova-metadata-0" Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.282364 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/8c7ab995-e9bc-4bc6-913f-6ceb0efa562a-logs\") pod \"nova-metadata-0\" (UID: \"8c7ab995-e9bc-4bc6-913f-6ceb0efa562a\") " pod="openstack/nova-metadata-0" Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.287895 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c7ab995-e9bc-4bc6-913f-6ceb0efa562a-config-data\") pod \"nova-metadata-0\" (UID: \"8c7ab995-e9bc-4bc6-913f-6ceb0efa562a\") " pod="openstack/nova-metadata-0" Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.288876 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c7ab995-e9bc-4bc6-913f-6ceb0efa562a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8c7ab995-e9bc-4bc6-913f-6ceb0efa562a\") " pod="openstack/nova-metadata-0" Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.301270 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4khb7\" (UniqueName: \"kubernetes.io/projected/8c7ab995-e9bc-4bc6-913f-6ceb0efa562a-kube-api-access-4khb7\") pod \"nova-metadata-0\" (UID: \"8c7ab995-e9bc-4bc6-913f-6ceb0efa562a\") " pod="openstack/nova-metadata-0" Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.319921 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.383418 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5eee4148-ad3f-42ae-954b-20103c8869e0-config-data\") pod \"5eee4148-ad3f-42ae-954b-20103c8869e0\" (UID: \"5eee4148-ad3f-42ae-954b-20103c8869e0\") " Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.383575 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5eee4148-ad3f-42ae-954b-20103c8869e0-combined-ca-bundle\") pod \"5eee4148-ad3f-42ae-954b-20103c8869e0\" (UID: \"5eee4148-ad3f-42ae-954b-20103c8869e0\") " Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.383602 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-556k9\" (UniqueName: \"kubernetes.io/projected/5eee4148-ad3f-42ae-954b-20103c8869e0-kube-api-access-556k9\") pod \"5eee4148-ad3f-42ae-954b-20103c8869e0\" (UID: \"5eee4148-ad3f-42ae-954b-20103c8869e0\") " Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.386455 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5eee4148-ad3f-42ae-954b-20103c8869e0-kube-api-access-556k9" (OuterVolumeSpecName: "kube-api-access-556k9") pod "5eee4148-ad3f-42ae-954b-20103c8869e0" (UID: "5eee4148-ad3f-42ae-954b-20103c8869e0"). InnerVolumeSpecName "kube-api-access-556k9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.404803 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5eee4148-ad3f-42ae-954b-20103c8869e0-config-data" (OuterVolumeSpecName: "config-data") pod "5eee4148-ad3f-42ae-954b-20103c8869e0" (UID: "5eee4148-ad3f-42ae-954b-20103c8869e0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.405893 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5eee4148-ad3f-42ae-954b-20103c8869e0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5eee4148-ad3f-42ae-954b-20103c8869e0" (UID: "5eee4148-ad3f-42ae-954b-20103c8869e0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.449951 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.486660 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5eee4148-ad3f-42ae-954b-20103c8869e0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.486727 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-556k9\" (UniqueName: \"kubernetes.io/projected/5eee4148-ad3f-42ae-954b-20103c8869e0-kube-api-access-556k9\") on node \"crc\" DevicePath \"\"" Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.486743 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5eee4148-ad3f-42ae-954b-20103c8869e0-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.883447 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 22:59:04 crc kubenswrapper[4795]: I0219 22:59:04.994684 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.037543 4795 generic.go:334] "Generic (PLEG): container finished" podID="01aec0cd-4f6c-4299-a07d-48bc5b04206c" containerID="31f457d4286b799feb10f9adf36e58bdbdddb82beffbf370d21f4ced396d7024" exitCode=0 Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.037671 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.038385 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"01aec0cd-4f6c-4299-a07d-48bc5b04206c","Type":"ContainerDied","Data":"31f457d4286b799feb10f9adf36e58bdbdddb82beffbf370d21f4ced396d7024"} Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.038424 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"01aec0cd-4f6c-4299-a07d-48bc5b04206c","Type":"ContainerDied","Data":"c46bf84b81c77e4ca078f2984ed1e17e83d62628cf87f4fed723d84f60bf74aa"} Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.038446 4795 scope.go:117] "RemoveContainer" containerID="31f457d4286b799feb10f9adf36e58bdbdddb82beffbf370d21f4ced396d7024" Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.048782 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5eee4148-ad3f-42ae-954b-20103c8869e0","Type":"ContainerDied","Data":"6f9fc2a856df4fd02d1f188c7bd17a0205125efe04c5dc6a27afb8ab33d1efad"} Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.048830 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.050710 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8c7ab995-e9bc-4bc6-913f-6ceb0efa562a","Type":"ContainerStarted","Data":"6c69b56affcd6e3319b891be2d4392924607b5fdbd45a93af3cf16129ab8f1e4"} Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.076145 4795 scope.go:117] "RemoveContainer" containerID="385cd5652e31298bb87ed87e9552ea3a1a0c6429e349e8ce075cfcc0f4e1be2c" Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.089290 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.100706 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01aec0cd-4f6c-4299-a07d-48bc5b04206c-config-data\") pod \"01aec0cd-4f6c-4299-a07d-48bc5b04206c\" (UID: \"01aec0cd-4f6c-4299-a07d-48bc5b04206c\") " Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.100773 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01aec0cd-4f6c-4299-a07d-48bc5b04206c-combined-ca-bundle\") pod \"01aec0cd-4f6c-4299-a07d-48bc5b04206c\" (UID: \"01aec0cd-4f6c-4299-a07d-48bc5b04206c\") " Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.100812 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01aec0cd-4f6c-4299-a07d-48bc5b04206c-logs\") pod \"01aec0cd-4f6c-4299-a07d-48bc5b04206c\" (UID: \"01aec0cd-4f6c-4299-a07d-48bc5b04206c\") " Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.100962 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbg69\" (UniqueName: 
\"kubernetes.io/projected/01aec0cd-4f6c-4299-a07d-48bc5b04206c-kube-api-access-cbg69\") pod \"01aec0cd-4f6c-4299-a07d-48bc5b04206c\" (UID: \"01aec0cd-4f6c-4299-a07d-48bc5b04206c\") " Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.101399 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01aec0cd-4f6c-4299-a07d-48bc5b04206c-logs" (OuterVolumeSpecName: "logs") pod "01aec0cd-4f6c-4299-a07d-48bc5b04206c" (UID: "01aec0cd-4f6c-4299-a07d-48bc5b04206c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.101816 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01aec0cd-4f6c-4299-a07d-48bc5b04206c-logs\") on node \"crc\" DevicePath \"\"" Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.103885 4795 scope.go:117] "RemoveContainer" containerID="31f457d4286b799feb10f9adf36e58bdbdddb82beffbf370d21f4ced396d7024" Feb 19 22:59:05 crc kubenswrapper[4795]: E0219 22:59:05.104283 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31f457d4286b799feb10f9adf36e58bdbdddb82beffbf370d21f4ced396d7024\": container with ID starting with 31f457d4286b799feb10f9adf36e58bdbdddb82beffbf370d21f4ced396d7024 not found: ID does not exist" containerID="31f457d4286b799feb10f9adf36e58bdbdddb82beffbf370d21f4ced396d7024" Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.104324 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31f457d4286b799feb10f9adf36e58bdbdddb82beffbf370d21f4ced396d7024"} err="failed to get container status \"31f457d4286b799feb10f9adf36e58bdbdddb82beffbf370d21f4ced396d7024\": rpc error: code = NotFound desc = could not find container \"31f457d4286b799feb10f9adf36e58bdbdddb82beffbf370d21f4ced396d7024\": container with ID starting with 
31f457d4286b799feb10f9adf36e58bdbdddb82beffbf370d21f4ced396d7024 not found: ID does not exist" Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.104346 4795 scope.go:117] "RemoveContainer" containerID="385cd5652e31298bb87ed87e9552ea3a1a0c6429e349e8ce075cfcc0f4e1be2c" Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.104473 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01aec0cd-4f6c-4299-a07d-48bc5b04206c-kube-api-access-cbg69" (OuterVolumeSpecName: "kube-api-access-cbg69") pod "01aec0cd-4f6c-4299-a07d-48bc5b04206c" (UID: "01aec0cd-4f6c-4299-a07d-48bc5b04206c"). InnerVolumeSpecName "kube-api-access-cbg69". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:59:05 crc kubenswrapper[4795]: E0219 22:59:05.104669 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"385cd5652e31298bb87ed87e9552ea3a1a0c6429e349e8ce075cfcc0f4e1be2c\": container with ID starting with 385cd5652e31298bb87ed87e9552ea3a1a0c6429e349e8ce075cfcc0f4e1be2c not found: ID does not exist" containerID="385cd5652e31298bb87ed87e9552ea3a1a0c6429e349e8ce075cfcc0f4e1be2c" Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.104701 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"385cd5652e31298bb87ed87e9552ea3a1a0c6429e349e8ce075cfcc0f4e1be2c"} err="failed to get container status \"385cd5652e31298bb87ed87e9552ea3a1a0c6429e349e8ce075cfcc0f4e1be2c\": rpc error: code = NotFound desc = could not find container \"385cd5652e31298bb87ed87e9552ea3a1a0c6429e349e8ce075cfcc0f4e1be2c\": container with ID starting with 385cd5652e31298bb87ed87e9552ea3a1a0c6429e349e8ce075cfcc0f4e1be2c not found: ID does not exist" Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.104719 4795 scope.go:117] "RemoveContainer" containerID="280220290d29bc48350410df59cfc0bb598766f640d75821796b950e43cea3ef" Feb 19 22:59:05 crc 
kubenswrapper[4795]: I0219 22:59:05.109098 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.117297 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 22:59:05 crc kubenswrapper[4795]: E0219 22:59:05.117654 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01aec0cd-4f6c-4299-a07d-48bc5b04206c" containerName="nova-api-api" Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.117670 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="01aec0cd-4f6c-4299-a07d-48bc5b04206c" containerName="nova-api-api" Feb 19 22:59:05 crc kubenswrapper[4795]: E0219 22:59:05.117682 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5eee4148-ad3f-42ae-954b-20103c8869e0" containerName="nova-scheduler-scheduler" Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.117688 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5eee4148-ad3f-42ae-954b-20103c8869e0" containerName="nova-scheduler-scheduler" Feb 19 22:59:05 crc kubenswrapper[4795]: E0219 22:59:05.117699 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01aec0cd-4f6c-4299-a07d-48bc5b04206c" containerName="nova-api-log" Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.117705 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="01aec0cd-4f6c-4299-a07d-48bc5b04206c" containerName="nova-api-log" Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.117873 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="01aec0cd-4f6c-4299-a07d-48bc5b04206c" containerName="nova-api-log" Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.117890 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="5eee4148-ad3f-42ae-954b-20103c8869e0" containerName="nova-scheduler-scheduler" Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.117902 4795 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="01aec0cd-4f6c-4299-a07d-48bc5b04206c" containerName="nova-api-api" Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.118512 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.120604 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.124496 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.125237 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01aec0cd-4f6c-4299-a07d-48bc5b04206c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "01aec0cd-4f6c-4299-a07d-48bc5b04206c" (UID: "01aec0cd-4f6c-4299-a07d-48bc5b04206c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.143041 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01aec0cd-4f6c-4299-a07d-48bc5b04206c-config-data" (OuterVolumeSpecName: "config-data") pod "01aec0cd-4f6c-4299-a07d-48bc5b04206c" (UID: "01aec0cd-4f6c-4299-a07d-48bc5b04206c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.202945 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gp8x\" (UniqueName: \"kubernetes.io/projected/f7e72169-9bdf-4b8f-9f8d-ca2a994287da-kube-api-access-7gp8x\") pod \"nova-scheduler-0\" (UID: \"f7e72169-9bdf-4b8f-9f8d-ca2a994287da\") " pod="openstack/nova-scheduler-0" Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.203043 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7e72169-9bdf-4b8f-9f8d-ca2a994287da-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f7e72169-9bdf-4b8f-9f8d-ca2a994287da\") " pod="openstack/nova-scheduler-0" Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.203078 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7e72169-9bdf-4b8f-9f8d-ca2a994287da-config-data\") pod \"nova-scheduler-0\" (UID: \"f7e72169-9bdf-4b8f-9f8d-ca2a994287da\") " pod="openstack/nova-scheduler-0" Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.203308 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbg69\" (UniqueName: \"kubernetes.io/projected/01aec0cd-4f6c-4299-a07d-48bc5b04206c-kube-api-access-cbg69\") on node \"crc\" DevicePath \"\"" Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.203333 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01aec0cd-4f6c-4299-a07d-48bc5b04206c-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.203344 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01aec0cd-4f6c-4299-a07d-48bc5b04206c-combined-ca-bundle\") on 
node \"crc\" DevicePath \"\"" Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.304725 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gp8x\" (UniqueName: \"kubernetes.io/projected/f7e72169-9bdf-4b8f-9f8d-ca2a994287da-kube-api-access-7gp8x\") pod \"nova-scheduler-0\" (UID: \"f7e72169-9bdf-4b8f-9f8d-ca2a994287da\") " pod="openstack/nova-scheduler-0" Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.304801 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7e72169-9bdf-4b8f-9f8d-ca2a994287da-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f7e72169-9bdf-4b8f-9f8d-ca2a994287da\") " pod="openstack/nova-scheduler-0" Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.304830 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7e72169-9bdf-4b8f-9f8d-ca2a994287da-config-data\") pod \"nova-scheduler-0\" (UID: \"f7e72169-9bdf-4b8f-9f8d-ca2a994287da\") " pod="openstack/nova-scheduler-0" Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.308441 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7e72169-9bdf-4b8f-9f8d-ca2a994287da-config-data\") pod \"nova-scheduler-0\" (UID: \"f7e72169-9bdf-4b8f-9f8d-ca2a994287da\") " pod="openstack/nova-scheduler-0" Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.308466 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7e72169-9bdf-4b8f-9f8d-ca2a994287da-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f7e72169-9bdf-4b8f-9f8d-ca2a994287da\") " pod="openstack/nova-scheduler-0" Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.320452 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-7gp8x\" (UniqueName: \"kubernetes.io/projected/f7e72169-9bdf-4b8f-9f8d-ca2a994287da-kube-api-access-7gp8x\") pod \"nova-scheduler-0\" (UID: \"f7e72169-9bdf-4b8f-9f8d-ca2a994287da\") " pod="openstack/nova-scheduler-0" Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.369631 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.378287 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.389617 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.391108 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.397353 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.406856 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.445532 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.507605 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30b4399f-d971-4131-8243-3c45e8353cdd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"30b4399f-d971-4131-8243-3c45e8353cdd\") " pod="openstack/nova-api-0" Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.507847 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnff6\" (UniqueName: \"kubernetes.io/projected/30b4399f-d971-4131-8243-3c45e8353cdd-kube-api-access-qnff6\") pod \"nova-api-0\" (UID: \"30b4399f-d971-4131-8243-3c45e8353cdd\") " pod="openstack/nova-api-0" Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.507957 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30b4399f-d971-4131-8243-3c45e8353cdd-logs\") pod \"nova-api-0\" (UID: \"30b4399f-d971-4131-8243-3c45e8353cdd\") " pod="openstack/nova-api-0" Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.508044 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30b4399f-d971-4131-8243-3c45e8353cdd-config-data\") pod \"nova-api-0\" (UID: \"30b4399f-d971-4131-8243-3c45e8353cdd\") " pod="openstack/nova-api-0" Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.523621 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01aec0cd-4f6c-4299-a07d-48bc5b04206c" path="/var/lib/kubelet/pods/01aec0cd-4f6c-4299-a07d-48bc5b04206c/volumes" Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.524252 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2afe060c-5092-4927-9f32-02115c78441b" 
path="/var/lib/kubelet/pods/2afe060c-5092-4927-9f32-02115c78441b/volumes" Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.524836 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5eee4148-ad3f-42ae-954b-20103c8869e0" path="/var/lib/kubelet/pods/5eee4148-ad3f-42ae-954b-20103c8869e0/volumes" Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.609854 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnff6\" (UniqueName: \"kubernetes.io/projected/30b4399f-d971-4131-8243-3c45e8353cdd-kube-api-access-qnff6\") pod \"nova-api-0\" (UID: \"30b4399f-d971-4131-8243-3c45e8353cdd\") " pod="openstack/nova-api-0" Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.609936 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30b4399f-d971-4131-8243-3c45e8353cdd-logs\") pod \"nova-api-0\" (UID: \"30b4399f-d971-4131-8243-3c45e8353cdd\") " pod="openstack/nova-api-0" Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.609978 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30b4399f-d971-4131-8243-3c45e8353cdd-config-data\") pod \"nova-api-0\" (UID: \"30b4399f-d971-4131-8243-3c45e8353cdd\") " pod="openstack/nova-api-0" Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.610015 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30b4399f-d971-4131-8243-3c45e8353cdd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"30b4399f-d971-4131-8243-3c45e8353cdd\") " pod="openstack/nova-api-0" Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.610760 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30b4399f-d971-4131-8243-3c45e8353cdd-logs\") pod \"nova-api-0\" (UID: 
\"30b4399f-d971-4131-8243-3c45e8353cdd\") " pod="openstack/nova-api-0" Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.616110 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30b4399f-d971-4131-8243-3c45e8353cdd-config-data\") pod \"nova-api-0\" (UID: \"30b4399f-d971-4131-8243-3c45e8353cdd\") " pod="openstack/nova-api-0" Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.616759 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30b4399f-d971-4131-8243-3c45e8353cdd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"30b4399f-d971-4131-8243-3c45e8353cdd\") " pod="openstack/nova-api-0" Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.624353 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnff6\" (UniqueName: \"kubernetes.io/projected/30b4399f-d971-4131-8243-3c45e8353cdd-kube-api-access-qnff6\") pod \"nova-api-0\" (UID: \"30b4399f-d971-4131-8243-3c45e8353cdd\") " pod="openstack/nova-api-0" Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.713695 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 19 22:59:05 crc kubenswrapper[4795]: I0219 22:59:05.877095 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 22:59:05 crc kubenswrapper[4795]: W0219 22:59:05.880634 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf7e72169_9bdf_4b8f_9f8d_ca2a994287da.slice/crio-914300143cab992575f14bc983146583fd096f4199ceab449ccc8d0f6f188747 WatchSource:0}: Error finding container 914300143cab992575f14bc983146583fd096f4199ceab449ccc8d0f6f188747: Status 404 returned error can't find the container with id 914300143cab992575f14bc983146583fd096f4199ceab449ccc8d0f6f188747 Feb 19 22:59:06 crc kubenswrapper[4795]: I0219 22:59:06.060414 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f7e72169-9bdf-4b8f-9f8d-ca2a994287da","Type":"ContainerStarted","Data":"44d4ca3c4aa7ee9ce5e10ecd6a647b96c7526305cf1a590a9d7e1995ec6eaea5"} Feb 19 22:59:06 crc kubenswrapper[4795]: I0219 22:59:06.060453 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f7e72169-9bdf-4b8f-9f8d-ca2a994287da","Type":"ContainerStarted","Data":"914300143cab992575f14bc983146583fd096f4199ceab449ccc8d0f6f188747"} Feb 19 22:59:06 crc kubenswrapper[4795]: I0219 22:59:06.065310 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8c7ab995-e9bc-4bc6-913f-6ceb0efa562a","Type":"ContainerStarted","Data":"a8b613dee124277cee481b45a71a4e627213532ed99ccd578d83629d833ffbc6"} Feb 19 22:59:06 crc kubenswrapper[4795]: I0219 22:59:06.065356 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8c7ab995-e9bc-4bc6-913f-6ceb0efa562a","Type":"ContainerStarted","Data":"3e5f71386ccfa754a08ae46bc61bdb80afae6d8cdf0cf2814359a3071d8fb32e"} Feb 19 22:59:06 crc kubenswrapper[4795]: 
I0219 22:59:06.085321 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.085305366 podStartE2EDuration="1.085305366s" podCreationTimestamp="2026-02-19 22:59:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:59:06.075141687 +0000 UTC m=+5457.267659561" watchObservedRunningTime="2026-02-19 22:59:06.085305366 +0000 UTC m=+5457.277823230" Feb 19 22:59:06 crc kubenswrapper[4795]: I0219 22:59:06.143741 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.143717403 podStartE2EDuration="2.143717403s" podCreationTimestamp="2026-02-19 22:59:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:59:06.095340781 +0000 UTC m=+5457.287858665" watchObservedRunningTime="2026-02-19 22:59:06.143717403 +0000 UTC m=+5457.336235277" Feb 19 22:59:06 crc kubenswrapper[4795]: I0219 22:59:06.149536 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 19 22:59:06 crc kubenswrapper[4795]: W0219 22:59:06.152729 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod30b4399f_d971_4131_8243_3c45e8353cdd.slice/crio-d7a27d1be53c0ad8661e73984444d15fa7b73baf2b5131526b22e61e61977caf WatchSource:0}: Error finding container d7a27d1be53c0ad8661e73984444d15fa7b73baf2b5131526b22e61e61977caf: Status 404 returned error can't find the container with id d7a27d1be53c0ad8661e73984444d15fa7b73baf2b5131526b22e61e61977caf Feb 19 22:59:07 crc kubenswrapper[4795]: I0219 22:59:07.080221 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"30b4399f-d971-4131-8243-3c45e8353cdd","Type":"ContainerStarted","Data":"6d26cbca7b84b1c79190688e30dc36b02a9a4c86419cdd59664872d8f4e8db9f"} Feb 19 22:59:07 crc kubenswrapper[4795]: I0219 22:59:07.080601 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"30b4399f-d971-4131-8243-3c45e8353cdd","Type":"ContainerStarted","Data":"9f81160ecc73cbe2584f285d4077d91e6e6e7f01ec7eb44fcd8f220d2ecb27ff"} Feb 19 22:59:07 crc kubenswrapper[4795]: I0219 22:59:07.080619 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"30b4399f-d971-4131-8243-3c45e8353cdd","Type":"ContainerStarted","Data":"d7a27d1be53c0ad8661e73984444d15fa7b73baf2b5131526b22e61e61977caf"} Feb 19 22:59:07 crc kubenswrapper[4795]: I0219 22:59:07.132485 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.132458428 podStartE2EDuration="2.132458428s" podCreationTimestamp="2026-02-19 22:59:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:59:07.124835354 +0000 UTC m=+5458.317353268" watchObservedRunningTime="2026-02-19 22:59:07.132458428 +0000 UTC m=+5458.324976312" Feb 19 22:59:07 crc kubenswrapper[4795]: I0219 22:59:07.503628 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-p727f"] Feb 19 22:59:07 crc kubenswrapper[4795]: I0219 22:59:07.520759 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-p727f" Feb 19 22:59:07 crc kubenswrapper[4795]: I0219 22:59:07.548744 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/082b6029-bcb1-409b-b9a4-62f6c1593d5c-utilities\") pod \"community-operators-p727f\" (UID: \"082b6029-bcb1-409b-b9a4-62f6c1593d5c\") " pod="openshift-marketplace/community-operators-p727f" Feb 19 22:59:07 crc kubenswrapper[4795]: I0219 22:59:07.548972 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/082b6029-bcb1-409b-b9a4-62f6c1593d5c-catalog-content\") pod \"community-operators-p727f\" (UID: \"082b6029-bcb1-409b-b9a4-62f6c1593d5c\") " pod="openshift-marketplace/community-operators-p727f" Feb 19 22:59:07 crc kubenswrapper[4795]: I0219 22:59:07.549046 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztkd9\" (UniqueName: \"kubernetes.io/projected/082b6029-bcb1-409b-b9a4-62f6c1593d5c-kube-api-access-ztkd9\") pod \"community-operators-p727f\" (UID: \"082b6029-bcb1-409b-b9a4-62f6c1593d5c\") " pod="openshift-marketplace/community-operators-p727f" Feb 19 22:59:07 crc kubenswrapper[4795]: I0219 22:59:07.569360 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p727f"] Feb 19 22:59:07 crc kubenswrapper[4795]: I0219 22:59:07.651149 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/082b6029-bcb1-409b-b9a4-62f6c1593d5c-utilities\") pod \"community-operators-p727f\" (UID: \"082b6029-bcb1-409b-b9a4-62f6c1593d5c\") " pod="openshift-marketplace/community-operators-p727f" Feb 19 22:59:07 crc kubenswrapper[4795]: I0219 22:59:07.651261 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/082b6029-bcb1-409b-b9a4-62f6c1593d5c-catalog-content\") pod \"community-operators-p727f\" (UID: \"082b6029-bcb1-409b-b9a4-62f6c1593d5c\") " pod="openshift-marketplace/community-operators-p727f" Feb 19 22:59:07 crc kubenswrapper[4795]: I0219 22:59:07.651307 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztkd9\" (UniqueName: \"kubernetes.io/projected/082b6029-bcb1-409b-b9a4-62f6c1593d5c-kube-api-access-ztkd9\") pod \"community-operators-p727f\" (UID: \"082b6029-bcb1-409b-b9a4-62f6c1593d5c\") " pod="openshift-marketplace/community-operators-p727f" Feb 19 22:59:07 crc kubenswrapper[4795]: I0219 22:59:07.651770 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/082b6029-bcb1-409b-b9a4-62f6c1593d5c-catalog-content\") pod \"community-operators-p727f\" (UID: \"082b6029-bcb1-409b-b9a4-62f6c1593d5c\") " pod="openshift-marketplace/community-operators-p727f" Feb 19 22:59:07 crc kubenswrapper[4795]: I0219 22:59:07.651792 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/082b6029-bcb1-409b-b9a4-62f6c1593d5c-utilities\") pod \"community-operators-p727f\" (UID: \"082b6029-bcb1-409b-b9a4-62f6c1593d5c\") " pod="openshift-marketplace/community-operators-p727f" Feb 19 22:59:07 crc kubenswrapper[4795]: I0219 22:59:07.668830 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztkd9\" (UniqueName: \"kubernetes.io/projected/082b6029-bcb1-409b-b9a4-62f6c1593d5c-kube-api-access-ztkd9\") pod \"community-operators-p727f\" (UID: \"082b6029-bcb1-409b-b9a4-62f6c1593d5c\") " pod="openshift-marketplace/community-operators-p727f" Feb 19 22:59:07 crc kubenswrapper[4795]: I0219 22:59:07.852094 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-p727f" Feb 19 22:59:08 crc kubenswrapper[4795]: I0219 22:59:08.333969 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p727f"] Feb 19 22:59:09 crc kubenswrapper[4795]: I0219 22:59:09.105594 4795 generic.go:334] "Generic (PLEG): container finished" podID="082b6029-bcb1-409b-b9a4-62f6c1593d5c" containerID="2bf568d8d6d6ee14a8fb0d6a33e9393118dff3752d865483cdb6f6e46d9abaa2" exitCode=0 Feb 19 22:59:09 crc kubenswrapper[4795]: I0219 22:59:09.105679 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p727f" event={"ID":"082b6029-bcb1-409b-b9a4-62f6c1593d5c","Type":"ContainerDied","Data":"2bf568d8d6d6ee14a8fb0d6a33e9393118dff3752d865483cdb6f6e46d9abaa2"} Feb 19 22:59:09 crc kubenswrapper[4795]: I0219 22:59:09.105733 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p727f" event={"ID":"082b6029-bcb1-409b-b9a4-62f6c1593d5c","Type":"ContainerStarted","Data":"80b6d7fc4851512ae99323cad61b29e71941580cd0bda6065a46f6fe3bd3ed00"} Feb 19 22:59:09 crc kubenswrapper[4795]: I0219 22:59:09.107928 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 22:59:09 crc kubenswrapper[4795]: I0219 22:59:09.450537 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 19 22:59:09 crc kubenswrapper[4795]: I0219 22:59:09.450586 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 19 22:59:10 crc kubenswrapper[4795]: I0219 22:59:10.115323 4795 generic.go:334] "Generic (PLEG): container finished" podID="082b6029-bcb1-409b-b9a4-62f6c1593d5c" containerID="6b72ed317be26fa3ef273dca3faa7a366a13e5079a1e86af64fa68cbb9229019" exitCode=0 Feb 19 22:59:10 crc kubenswrapper[4795]: I0219 22:59:10.115446 4795 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/community-operators-p727f" event={"ID":"082b6029-bcb1-409b-b9a4-62f6c1593d5c","Type":"ContainerDied","Data":"6b72ed317be26fa3ef273dca3faa7a366a13e5079a1e86af64fa68cbb9229019"} Feb 19 22:59:10 crc kubenswrapper[4795]: I0219 22:59:10.445675 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 19 22:59:11 crc kubenswrapper[4795]: I0219 22:59:11.126898 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p727f" event={"ID":"082b6029-bcb1-409b-b9a4-62f6c1593d5c","Type":"ContainerStarted","Data":"7de5f0f6ec1305f429e795a9b79007d4a1f18d5b81b88addbe819227b2d93131"} Feb 19 22:59:11 crc kubenswrapper[4795]: I0219 22:59:11.156062 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-p727f" podStartSLOduration=2.725196786 podStartE2EDuration="4.156040483s" podCreationTimestamp="2026-02-19 22:59:07 +0000 UTC" firstStartedPulling="2026-02-19 22:59:09.107528365 +0000 UTC m=+5460.300046259" lastFinishedPulling="2026-02-19 22:59:10.538372052 +0000 UTC m=+5461.730889956" observedRunningTime="2026-02-19 22:59:11.148582923 +0000 UTC m=+5462.341100797" watchObservedRunningTime="2026-02-19 22:59:11.156040483 +0000 UTC m=+5462.348558347" Feb 19 22:59:14 crc kubenswrapper[4795]: I0219 22:59:14.450468 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 19 22:59:14 crc kubenswrapper[4795]: I0219 22:59:14.451282 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 19 22:59:15 crc kubenswrapper[4795]: I0219 22:59:15.445840 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 19 22:59:15 crc kubenswrapper[4795]: I0219 22:59:15.474680 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openstack/nova-scheduler-0" Feb 19 22:59:15 crc kubenswrapper[4795]: I0219 22:59:15.534282 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="8c7ab995-e9bc-4bc6-913f-6ceb0efa562a" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.69:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 22:59:15 crc kubenswrapper[4795]: I0219 22:59:15.534282 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="8c7ab995-e9bc-4bc6-913f-6ceb0efa562a" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.69:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 22:59:15 crc kubenswrapper[4795]: I0219 22:59:15.601650 4795 scope.go:117] "RemoveContainer" containerID="ccef51520029c7e6d5f912ec1258a3f7218e74073cc8ab1ae05322f167a9a678" Feb 19 22:59:15 crc kubenswrapper[4795]: I0219 22:59:15.714635 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 19 22:59:15 crc kubenswrapper[4795]: I0219 22:59:15.714690 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 19 22:59:16 crc kubenswrapper[4795]: I0219 22:59:16.204916 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 19 22:59:16 crc kubenswrapper[4795]: I0219 22:59:16.797608 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="30b4399f-d971-4131-8243-3c45e8353cdd" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.71:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 22:59:16 crc kubenswrapper[4795]: I0219 22:59:16.797693 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" 
podUID="30b4399f-d971-4131-8243-3c45e8353cdd" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.71:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 22:59:17 crc kubenswrapper[4795]: I0219 22:59:17.853013 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-p727f" Feb 19 22:59:17 crc kubenswrapper[4795]: I0219 22:59:17.853946 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-p727f" Feb 19 22:59:17 crc kubenswrapper[4795]: I0219 22:59:17.927965 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-p727f" Feb 19 22:59:18 crc kubenswrapper[4795]: I0219 22:59:18.240271 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-p727f" Feb 19 22:59:18 crc kubenswrapper[4795]: I0219 22:59:18.305803 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-p727f"] Feb 19 22:59:20 crc kubenswrapper[4795]: I0219 22:59:20.203140 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-p727f" podUID="082b6029-bcb1-409b-b9a4-62f6c1593d5c" containerName="registry-server" containerID="cri-o://7de5f0f6ec1305f429e795a9b79007d4a1f18d5b81b88addbe819227b2d93131" gracePeriod=2 Feb 19 22:59:20 crc kubenswrapper[4795]: I0219 22:59:20.726877 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-p727f" Feb 19 22:59:20 crc kubenswrapper[4795]: I0219 22:59:20.908118 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/082b6029-bcb1-409b-b9a4-62f6c1593d5c-utilities\") pod \"082b6029-bcb1-409b-b9a4-62f6c1593d5c\" (UID: \"082b6029-bcb1-409b-b9a4-62f6c1593d5c\") " Feb 19 22:59:20 crc kubenswrapper[4795]: I0219 22:59:20.908361 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/082b6029-bcb1-409b-b9a4-62f6c1593d5c-catalog-content\") pod \"082b6029-bcb1-409b-b9a4-62f6c1593d5c\" (UID: \"082b6029-bcb1-409b-b9a4-62f6c1593d5c\") " Feb 19 22:59:20 crc kubenswrapper[4795]: I0219 22:59:20.908422 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztkd9\" (UniqueName: \"kubernetes.io/projected/082b6029-bcb1-409b-b9a4-62f6c1593d5c-kube-api-access-ztkd9\") pod \"082b6029-bcb1-409b-b9a4-62f6c1593d5c\" (UID: \"082b6029-bcb1-409b-b9a4-62f6c1593d5c\") " Feb 19 22:59:20 crc kubenswrapper[4795]: I0219 22:59:20.909484 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/082b6029-bcb1-409b-b9a4-62f6c1593d5c-utilities" (OuterVolumeSpecName: "utilities") pod "082b6029-bcb1-409b-b9a4-62f6c1593d5c" (UID: "082b6029-bcb1-409b-b9a4-62f6c1593d5c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:59:20 crc kubenswrapper[4795]: I0219 22:59:20.917230 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/082b6029-bcb1-409b-b9a4-62f6c1593d5c-kube-api-access-ztkd9" (OuterVolumeSpecName: "kube-api-access-ztkd9") pod "082b6029-bcb1-409b-b9a4-62f6c1593d5c" (UID: "082b6029-bcb1-409b-b9a4-62f6c1593d5c"). InnerVolumeSpecName "kube-api-access-ztkd9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:59:20 crc kubenswrapper[4795]: I0219 22:59:20.956666 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/082b6029-bcb1-409b-b9a4-62f6c1593d5c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "082b6029-bcb1-409b-b9a4-62f6c1593d5c" (UID: "082b6029-bcb1-409b-b9a4-62f6c1593d5c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 22:59:21 crc kubenswrapper[4795]: I0219 22:59:21.010211 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/082b6029-bcb1-409b-b9a4-62f6c1593d5c-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 22:59:21 crc kubenswrapper[4795]: I0219 22:59:21.010248 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztkd9\" (UniqueName: \"kubernetes.io/projected/082b6029-bcb1-409b-b9a4-62f6c1593d5c-kube-api-access-ztkd9\") on node \"crc\" DevicePath \"\"" Feb 19 22:59:21 crc kubenswrapper[4795]: I0219 22:59:21.010260 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/082b6029-bcb1-409b-b9a4-62f6c1593d5c-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 22:59:21 crc kubenswrapper[4795]: I0219 22:59:21.222674 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-p727f" Feb 19 22:59:21 crc kubenswrapper[4795]: I0219 22:59:21.222786 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p727f" event={"ID":"082b6029-bcb1-409b-b9a4-62f6c1593d5c","Type":"ContainerDied","Data":"7de5f0f6ec1305f429e795a9b79007d4a1f18d5b81b88addbe819227b2d93131"} Feb 19 22:59:21 crc kubenswrapper[4795]: I0219 22:59:21.222930 4795 scope.go:117] "RemoveContainer" containerID="7de5f0f6ec1305f429e795a9b79007d4a1f18d5b81b88addbe819227b2d93131" Feb 19 22:59:21 crc kubenswrapper[4795]: I0219 22:59:21.222517 4795 generic.go:334] "Generic (PLEG): container finished" podID="082b6029-bcb1-409b-b9a4-62f6c1593d5c" containerID="7de5f0f6ec1305f429e795a9b79007d4a1f18d5b81b88addbe819227b2d93131" exitCode=0 Feb 19 22:59:21 crc kubenswrapper[4795]: I0219 22:59:21.223330 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p727f" event={"ID":"082b6029-bcb1-409b-b9a4-62f6c1593d5c","Type":"ContainerDied","Data":"80b6d7fc4851512ae99323cad61b29e71941580cd0bda6065a46f6fe3bd3ed00"} Feb 19 22:59:21 crc kubenswrapper[4795]: I0219 22:59:21.244958 4795 scope.go:117] "RemoveContainer" containerID="6b72ed317be26fa3ef273dca3faa7a366a13e5079a1e86af64fa68cbb9229019" Feb 19 22:59:21 crc kubenswrapper[4795]: I0219 22:59:21.274423 4795 scope.go:117] "RemoveContainer" containerID="2bf568d8d6d6ee14a8fb0d6a33e9393118dff3752d865483cdb6f6e46d9abaa2" Feb 19 22:59:21 crc kubenswrapper[4795]: I0219 22:59:21.282811 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-p727f"] Feb 19 22:59:21 crc kubenswrapper[4795]: I0219 22:59:21.292983 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-p727f"] Feb 19 22:59:21 crc kubenswrapper[4795]: I0219 22:59:21.335840 4795 scope.go:117] "RemoveContainer" 
containerID="7de5f0f6ec1305f429e795a9b79007d4a1f18d5b81b88addbe819227b2d93131" Feb 19 22:59:21 crc kubenswrapper[4795]: E0219 22:59:21.336436 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7de5f0f6ec1305f429e795a9b79007d4a1f18d5b81b88addbe819227b2d93131\": container with ID starting with 7de5f0f6ec1305f429e795a9b79007d4a1f18d5b81b88addbe819227b2d93131 not found: ID does not exist" containerID="7de5f0f6ec1305f429e795a9b79007d4a1f18d5b81b88addbe819227b2d93131" Feb 19 22:59:21 crc kubenswrapper[4795]: I0219 22:59:21.336505 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7de5f0f6ec1305f429e795a9b79007d4a1f18d5b81b88addbe819227b2d93131"} err="failed to get container status \"7de5f0f6ec1305f429e795a9b79007d4a1f18d5b81b88addbe819227b2d93131\": rpc error: code = NotFound desc = could not find container \"7de5f0f6ec1305f429e795a9b79007d4a1f18d5b81b88addbe819227b2d93131\": container with ID starting with 7de5f0f6ec1305f429e795a9b79007d4a1f18d5b81b88addbe819227b2d93131 not found: ID does not exist" Feb 19 22:59:21 crc kubenswrapper[4795]: I0219 22:59:21.336533 4795 scope.go:117] "RemoveContainer" containerID="6b72ed317be26fa3ef273dca3faa7a366a13e5079a1e86af64fa68cbb9229019" Feb 19 22:59:21 crc kubenswrapper[4795]: E0219 22:59:21.337133 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b72ed317be26fa3ef273dca3faa7a366a13e5079a1e86af64fa68cbb9229019\": container with ID starting with 6b72ed317be26fa3ef273dca3faa7a366a13e5079a1e86af64fa68cbb9229019 not found: ID does not exist" containerID="6b72ed317be26fa3ef273dca3faa7a366a13e5079a1e86af64fa68cbb9229019" Feb 19 22:59:21 crc kubenswrapper[4795]: I0219 22:59:21.337185 4795 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6b72ed317be26fa3ef273dca3faa7a366a13e5079a1e86af64fa68cbb9229019"} err="failed to get container status \"6b72ed317be26fa3ef273dca3faa7a366a13e5079a1e86af64fa68cbb9229019\": rpc error: code = NotFound desc = could not find container \"6b72ed317be26fa3ef273dca3faa7a366a13e5079a1e86af64fa68cbb9229019\": container with ID starting with 6b72ed317be26fa3ef273dca3faa7a366a13e5079a1e86af64fa68cbb9229019 not found: ID does not exist" Feb 19 22:59:21 crc kubenswrapper[4795]: I0219 22:59:21.337218 4795 scope.go:117] "RemoveContainer" containerID="2bf568d8d6d6ee14a8fb0d6a33e9393118dff3752d865483cdb6f6e46d9abaa2" Feb 19 22:59:21 crc kubenswrapper[4795]: E0219 22:59:21.337526 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2bf568d8d6d6ee14a8fb0d6a33e9393118dff3752d865483cdb6f6e46d9abaa2\": container with ID starting with 2bf568d8d6d6ee14a8fb0d6a33e9393118dff3752d865483cdb6f6e46d9abaa2 not found: ID does not exist" containerID="2bf568d8d6d6ee14a8fb0d6a33e9393118dff3752d865483cdb6f6e46d9abaa2" Feb 19 22:59:21 crc kubenswrapper[4795]: I0219 22:59:21.337561 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bf568d8d6d6ee14a8fb0d6a33e9393118dff3752d865483cdb6f6e46d9abaa2"} err="failed to get container status \"2bf568d8d6d6ee14a8fb0d6a33e9393118dff3752d865483cdb6f6e46d9abaa2\": rpc error: code = NotFound desc = could not find container \"2bf568d8d6d6ee14a8fb0d6a33e9393118dff3752d865483cdb6f6e46d9abaa2\": container with ID starting with 2bf568d8d6d6ee14a8fb0d6a33e9393118dff3752d865483cdb6f6e46d9abaa2 not found: ID does not exist" Feb 19 22:59:21 crc kubenswrapper[4795]: I0219 22:59:21.534806 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="082b6029-bcb1-409b-b9a4-62f6c1593d5c" path="/var/lib/kubelet/pods/082b6029-bcb1-409b-b9a4-62f6c1593d5c/volumes" Feb 19 22:59:24 crc kubenswrapper[4795]: I0219 
22:59:24.452687 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 19 22:59:24 crc kubenswrapper[4795]: I0219 22:59:24.453316 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 19 22:59:24 crc kubenswrapper[4795]: I0219 22:59:24.455080 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 19 22:59:24 crc kubenswrapper[4795]: I0219 22:59:24.455870 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 19 22:59:25 crc kubenswrapper[4795]: I0219 22:59:25.718046 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 19 22:59:25 crc kubenswrapper[4795]: I0219 22:59:25.718644 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 19 22:59:25 crc kubenswrapper[4795]: I0219 22:59:25.718888 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 19 22:59:25 crc kubenswrapper[4795]: I0219 22:59:25.721564 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 19 22:59:26 crc kubenswrapper[4795]: I0219 22:59:26.266353 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 19 22:59:26 crc kubenswrapper[4795]: I0219 22:59:26.272520 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 19 22:59:26 crc kubenswrapper[4795]: I0219 22:59:26.477155 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bd886b897-t5785"] Feb 19 22:59:26 crc kubenswrapper[4795]: E0219 22:59:26.477526 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="082b6029-bcb1-409b-b9a4-62f6c1593d5c" containerName="extract-utilities" Feb 19 22:59:26 crc 
kubenswrapper[4795]: I0219 22:59:26.477549 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="082b6029-bcb1-409b-b9a4-62f6c1593d5c" containerName="extract-utilities" Feb 19 22:59:26 crc kubenswrapper[4795]: E0219 22:59:26.477584 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="082b6029-bcb1-409b-b9a4-62f6c1593d5c" containerName="registry-server" Feb 19 22:59:26 crc kubenswrapper[4795]: I0219 22:59:26.477591 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="082b6029-bcb1-409b-b9a4-62f6c1593d5c" containerName="registry-server" Feb 19 22:59:26 crc kubenswrapper[4795]: E0219 22:59:26.477605 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="082b6029-bcb1-409b-b9a4-62f6c1593d5c" containerName="extract-content" Feb 19 22:59:26 crc kubenswrapper[4795]: I0219 22:59:26.477611 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="082b6029-bcb1-409b-b9a4-62f6c1593d5c" containerName="extract-content" Feb 19 22:59:26 crc kubenswrapper[4795]: I0219 22:59:26.482428 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="082b6029-bcb1-409b-b9a4-62f6c1593d5c" containerName="registry-server" Feb 19 22:59:26 crc kubenswrapper[4795]: I0219 22:59:26.483492 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bd886b897-t5785" Feb 19 22:59:26 crc kubenswrapper[4795]: I0219 22:59:26.499553 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bd886b897-t5785"] Feb 19 22:59:26 crc kubenswrapper[4795]: I0219 22:59:26.608271 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmqvw\" (UniqueName: \"kubernetes.io/projected/25125096-9221-46cf-9c10-21242922dc39-kube-api-access-jmqvw\") pod \"dnsmasq-dns-bd886b897-t5785\" (UID: \"25125096-9221-46cf-9c10-21242922dc39\") " pod="openstack/dnsmasq-dns-bd886b897-t5785" Feb 19 22:59:26 crc kubenswrapper[4795]: I0219 22:59:26.608315 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25125096-9221-46cf-9c10-21242922dc39-ovsdbserver-sb\") pod \"dnsmasq-dns-bd886b897-t5785\" (UID: \"25125096-9221-46cf-9c10-21242922dc39\") " pod="openstack/dnsmasq-dns-bd886b897-t5785" Feb 19 22:59:26 crc kubenswrapper[4795]: I0219 22:59:26.608379 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25125096-9221-46cf-9c10-21242922dc39-config\") pod \"dnsmasq-dns-bd886b897-t5785\" (UID: \"25125096-9221-46cf-9c10-21242922dc39\") " pod="openstack/dnsmasq-dns-bd886b897-t5785" Feb 19 22:59:26 crc kubenswrapper[4795]: I0219 22:59:26.608438 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25125096-9221-46cf-9c10-21242922dc39-ovsdbserver-nb\") pod \"dnsmasq-dns-bd886b897-t5785\" (UID: \"25125096-9221-46cf-9c10-21242922dc39\") " pod="openstack/dnsmasq-dns-bd886b897-t5785" Feb 19 22:59:26 crc kubenswrapper[4795]: I0219 22:59:26.608474 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25125096-9221-46cf-9c10-21242922dc39-dns-svc\") pod \"dnsmasq-dns-bd886b897-t5785\" (UID: \"25125096-9221-46cf-9c10-21242922dc39\") " pod="openstack/dnsmasq-dns-bd886b897-t5785" Feb 19 22:59:26 crc kubenswrapper[4795]: I0219 22:59:26.711211 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmqvw\" (UniqueName: \"kubernetes.io/projected/25125096-9221-46cf-9c10-21242922dc39-kube-api-access-jmqvw\") pod \"dnsmasq-dns-bd886b897-t5785\" (UID: \"25125096-9221-46cf-9c10-21242922dc39\") " pod="openstack/dnsmasq-dns-bd886b897-t5785" Feb 19 22:59:26 crc kubenswrapper[4795]: I0219 22:59:26.711285 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25125096-9221-46cf-9c10-21242922dc39-ovsdbserver-sb\") pod \"dnsmasq-dns-bd886b897-t5785\" (UID: \"25125096-9221-46cf-9c10-21242922dc39\") " pod="openstack/dnsmasq-dns-bd886b897-t5785" Feb 19 22:59:26 crc kubenswrapper[4795]: I0219 22:59:26.711356 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25125096-9221-46cf-9c10-21242922dc39-config\") pod \"dnsmasq-dns-bd886b897-t5785\" (UID: \"25125096-9221-46cf-9c10-21242922dc39\") " pod="openstack/dnsmasq-dns-bd886b897-t5785" Feb 19 22:59:26 crc kubenswrapper[4795]: I0219 22:59:26.711418 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25125096-9221-46cf-9c10-21242922dc39-ovsdbserver-nb\") pod \"dnsmasq-dns-bd886b897-t5785\" (UID: \"25125096-9221-46cf-9c10-21242922dc39\") " pod="openstack/dnsmasq-dns-bd886b897-t5785" Feb 19 22:59:26 crc kubenswrapper[4795]: I0219 22:59:26.711459 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/25125096-9221-46cf-9c10-21242922dc39-dns-svc\") pod \"dnsmasq-dns-bd886b897-t5785\" (UID: \"25125096-9221-46cf-9c10-21242922dc39\") " pod="openstack/dnsmasq-dns-bd886b897-t5785" Feb 19 22:59:26 crc kubenswrapper[4795]: I0219 22:59:26.712517 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25125096-9221-46cf-9c10-21242922dc39-config\") pod \"dnsmasq-dns-bd886b897-t5785\" (UID: \"25125096-9221-46cf-9c10-21242922dc39\") " pod="openstack/dnsmasq-dns-bd886b897-t5785" Feb 19 22:59:26 crc kubenswrapper[4795]: I0219 22:59:26.712593 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25125096-9221-46cf-9c10-21242922dc39-ovsdbserver-sb\") pod \"dnsmasq-dns-bd886b897-t5785\" (UID: \"25125096-9221-46cf-9c10-21242922dc39\") " pod="openstack/dnsmasq-dns-bd886b897-t5785" Feb 19 22:59:26 crc kubenswrapper[4795]: I0219 22:59:26.712835 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25125096-9221-46cf-9c10-21242922dc39-dns-svc\") pod \"dnsmasq-dns-bd886b897-t5785\" (UID: \"25125096-9221-46cf-9c10-21242922dc39\") " pod="openstack/dnsmasq-dns-bd886b897-t5785" Feb 19 22:59:26 crc kubenswrapper[4795]: I0219 22:59:26.713095 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25125096-9221-46cf-9c10-21242922dc39-ovsdbserver-nb\") pod \"dnsmasq-dns-bd886b897-t5785\" (UID: \"25125096-9221-46cf-9c10-21242922dc39\") " pod="openstack/dnsmasq-dns-bd886b897-t5785" Feb 19 22:59:26 crc kubenswrapper[4795]: I0219 22:59:26.733882 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmqvw\" (UniqueName: \"kubernetes.io/projected/25125096-9221-46cf-9c10-21242922dc39-kube-api-access-jmqvw\") pod \"dnsmasq-dns-bd886b897-t5785\" (UID: 
\"25125096-9221-46cf-9c10-21242922dc39\") " pod="openstack/dnsmasq-dns-bd886b897-t5785" Feb 19 22:59:26 crc kubenswrapper[4795]: I0219 22:59:26.804651 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bd886b897-t5785" Feb 19 22:59:27 crc kubenswrapper[4795]: I0219 22:59:27.268396 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bd886b897-t5785"] Feb 19 22:59:27 crc kubenswrapper[4795]: W0219 22:59:27.273318 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod25125096_9221_46cf_9c10_21242922dc39.slice/crio-cb323091e0b46dedb3513673e4bcc0af0bdc32ed871a599a68f0f38aa938ff1f WatchSource:0}: Error finding container cb323091e0b46dedb3513673e4bcc0af0bdc32ed871a599a68f0f38aa938ff1f: Status 404 returned error can't find the container with id cb323091e0b46dedb3513673e4bcc0af0bdc32ed871a599a68f0f38aa938ff1f Feb 19 22:59:28 crc kubenswrapper[4795]: I0219 22:59:28.301310 4795 generic.go:334] "Generic (PLEG): container finished" podID="25125096-9221-46cf-9c10-21242922dc39" containerID="1208585385cb268d9e64c92ccccb01209cc9892bf210d3382d48ae16872a3290" exitCode=0 Feb 19 22:59:28 crc kubenswrapper[4795]: I0219 22:59:28.301613 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bd886b897-t5785" event={"ID":"25125096-9221-46cf-9c10-21242922dc39","Type":"ContainerDied","Data":"1208585385cb268d9e64c92ccccb01209cc9892bf210d3382d48ae16872a3290"} Feb 19 22:59:28 crc kubenswrapper[4795]: I0219 22:59:28.302061 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bd886b897-t5785" event={"ID":"25125096-9221-46cf-9c10-21242922dc39","Type":"ContainerStarted","Data":"cb323091e0b46dedb3513673e4bcc0af0bdc32ed871a599a68f0f38aa938ff1f"} Feb 19 22:59:28 crc kubenswrapper[4795]: I0219 22:59:28.428150 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 22:59:28 crc kubenswrapper[4795]: I0219 22:59:28.428224 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 22:59:28 crc kubenswrapper[4795]: I0219 22:59:28.428267 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" Feb 19 22:59:28 crc kubenswrapper[4795]: I0219 22:59:28.428959 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fd820b5d0adc1705d78ec76939670a71a79ca07206c8d0459e23712d0b015f16"} pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 22:59:28 crc kubenswrapper[4795]: I0219 22:59:28.429013 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" containerID="cri-o://fd820b5d0adc1705d78ec76939670a71a79ca07206c8d0459e23712d0b015f16" gracePeriod=600 Feb 19 22:59:29 crc kubenswrapper[4795]: I0219 22:59:29.312609 4795 generic.go:334] "Generic (PLEG): container finished" podID="7591bc58-96f5-486a-8653-0ad93938b019" containerID="fd820b5d0adc1705d78ec76939670a71a79ca07206c8d0459e23712d0b015f16" exitCode=0 Feb 19 22:59:29 crc kubenswrapper[4795]: I0219 22:59:29.313154 4795 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerDied","Data":"fd820b5d0adc1705d78ec76939670a71a79ca07206c8d0459e23712d0b015f16"} Feb 19 22:59:29 crc kubenswrapper[4795]: I0219 22:59:29.313195 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerStarted","Data":"18170a71e3fd2c593ee17e2961f48e8d3663518910bf7cf2c79258393902484c"} Feb 19 22:59:29 crc kubenswrapper[4795]: I0219 22:59:29.313210 4795 scope.go:117] "RemoveContainer" containerID="4bde20d829d39ae7600d60632441672a1a7026beecea3debeab953cdf5de428a" Feb 19 22:59:29 crc kubenswrapper[4795]: I0219 22:59:29.316208 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bd886b897-t5785" event={"ID":"25125096-9221-46cf-9c10-21242922dc39","Type":"ContainerStarted","Data":"6b2bdae1ece62dc20662e376419faadb82e3b42065ea06a9558f606c33c4b7d2"} Feb 19 22:59:29 crc kubenswrapper[4795]: I0219 22:59:29.316866 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-bd886b897-t5785" Feb 19 22:59:29 crc kubenswrapper[4795]: I0219 22:59:29.359133 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-bd886b897-t5785" podStartSLOduration=3.358915398 podStartE2EDuration="3.358915398s" podCreationTimestamp="2026-02-19 22:59:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:59:29.353318126 +0000 UTC m=+5480.545836000" watchObservedRunningTime="2026-02-19 22:59:29.358915398 +0000 UTC m=+5480.551433262" Feb 19 22:59:36 crc kubenswrapper[4795]: I0219 22:59:36.806382 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-bd886b897-t5785" 
Feb 19 22:59:36 crc kubenswrapper[4795]: I0219 22:59:36.878216 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b966c788c-4sfvg"] Feb 19 22:59:36 crc kubenswrapper[4795]: I0219 22:59:36.878500 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b966c788c-4sfvg" podUID="2ff79db5-8006-4a3f-bb73-ab6e32d93186" containerName="dnsmasq-dns" containerID="cri-o://b9f475bcb8fe7daa110726dc68979a5f8257c170cc549dfc5c151f5b0ece628f" gracePeriod=10 Feb 19 22:59:37 crc kubenswrapper[4795]: I0219 22:59:37.392225 4795 generic.go:334] "Generic (PLEG): container finished" podID="2ff79db5-8006-4a3f-bb73-ab6e32d93186" containerID="b9f475bcb8fe7daa110726dc68979a5f8257c170cc549dfc5c151f5b0ece628f" exitCode=0 Feb 19 22:59:37 crc kubenswrapper[4795]: I0219 22:59:37.392623 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b966c788c-4sfvg" event={"ID":"2ff79db5-8006-4a3f-bb73-ab6e32d93186","Type":"ContainerDied","Data":"b9f475bcb8fe7daa110726dc68979a5f8257c170cc549dfc5c151f5b0ece628f"} Feb 19 22:59:37 crc kubenswrapper[4795]: I0219 22:59:37.392655 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b966c788c-4sfvg" event={"ID":"2ff79db5-8006-4a3f-bb73-ab6e32d93186","Type":"ContainerDied","Data":"ee318881516d5dfefa76dbf8f513dce5d08620986165c943cfdd920651e34509"} Feb 19 22:59:37 crc kubenswrapper[4795]: I0219 22:59:37.392668 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee318881516d5dfefa76dbf8f513dce5d08620986165c943cfdd920651e34509" Feb 19 22:59:37 crc kubenswrapper[4795]: I0219 22:59:37.402432 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b966c788c-4sfvg" Feb 19 22:59:37 crc kubenswrapper[4795]: I0219 22:59:37.530398 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2ff79db5-8006-4a3f-bb73-ab6e32d93186-ovsdbserver-sb\") pod \"2ff79db5-8006-4a3f-bb73-ab6e32d93186\" (UID: \"2ff79db5-8006-4a3f-bb73-ab6e32d93186\") " Feb 19 22:59:37 crc kubenswrapper[4795]: I0219 22:59:37.530728 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8j2w\" (UniqueName: \"kubernetes.io/projected/2ff79db5-8006-4a3f-bb73-ab6e32d93186-kube-api-access-q8j2w\") pod \"2ff79db5-8006-4a3f-bb73-ab6e32d93186\" (UID: \"2ff79db5-8006-4a3f-bb73-ab6e32d93186\") " Feb 19 22:59:37 crc kubenswrapper[4795]: I0219 22:59:37.530795 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ff79db5-8006-4a3f-bb73-ab6e32d93186-dns-svc\") pod \"2ff79db5-8006-4a3f-bb73-ab6e32d93186\" (UID: \"2ff79db5-8006-4a3f-bb73-ab6e32d93186\") " Feb 19 22:59:37 crc kubenswrapper[4795]: I0219 22:59:37.530816 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2ff79db5-8006-4a3f-bb73-ab6e32d93186-ovsdbserver-nb\") pod \"2ff79db5-8006-4a3f-bb73-ab6e32d93186\" (UID: \"2ff79db5-8006-4a3f-bb73-ab6e32d93186\") " Feb 19 22:59:37 crc kubenswrapper[4795]: I0219 22:59:37.530858 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ff79db5-8006-4a3f-bb73-ab6e32d93186-config\") pod \"2ff79db5-8006-4a3f-bb73-ab6e32d93186\" (UID: \"2ff79db5-8006-4a3f-bb73-ab6e32d93186\") " Feb 19 22:59:37 crc kubenswrapper[4795]: I0219 22:59:37.543341 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/2ff79db5-8006-4a3f-bb73-ab6e32d93186-kube-api-access-q8j2w" (OuterVolumeSpecName: "kube-api-access-q8j2w") pod "2ff79db5-8006-4a3f-bb73-ab6e32d93186" (UID: "2ff79db5-8006-4a3f-bb73-ab6e32d93186"). InnerVolumeSpecName "kube-api-access-q8j2w". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:59:37 crc kubenswrapper[4795]: I0219 22:59:37.570999 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ff79db5-8006-4a3f-bb73-ab6e32d93186-config" (OuterVolumeSpecName: "config") pod "2ff79db5-8006-4a3f-bb73-ab6e32d93186" (UID: "2ff79db5-8006-4a3f-bb73-ab6e32d93186"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:59:37 crc kubenswrapper[4795]: I0219 22:59:37.576142 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ff79db5-8006-4a3f-bb73-ab6e32d93186-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2ff79db5-8006-4a3f-bb73-ab6e32d93186" (UID: "2ff79db5-8006-4a3f-bb73-ab6e32d93186"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:59:37 crc kubenswrapper[4795]: I0219 22:59:37.577998 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ff79db5-8006-4a3f-bb73-ab6e32d93186-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2ff79db5-8006-4a3f-bb73-ab6e32d93186" (UID: "2ff79db5-8006-4a3f-bb73-ab6e32d93186"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:59:37 crc kubenswrapper[4795]: I0219 22:59:37.584813 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ff79db5-8006-4a3f-bb73-ab6e32d93186-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2ff79db5-8006-4a3f-bb73-ab6e32d93186" (UID: "2ff79db5-8006-4a3f-bb73-ab6e32d93186"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:59:37 crc kubenswrapper[4795]: I0219 22:59:37.632820 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2ff79db5-8006-4a3f-bb73-ab6e32d93186-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 22:59:37 crc kubenswrapper[4795]: I0219 22:59:37.632854 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8j2w\" (UniqueName: \"kubernetes.io/projected/2ff79db5-8006-4a3f-bb73-ab6e32d93186-kube-api-access-q8j2w\") on node \"crc\" DevicePath \"\"" Feb 19 22:59:37 crc kubenswrapper[4795]: I0219 22:59:37.632865 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ff79db5-8006-4a3f-bb73-ab6e32d93186-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 22:59:37 crc kubenswrapper[4795]: I0219 22:59:37.632873 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2ff79db5-8006-4a3f-bb73-ab6e32d93186-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 22:59:37 crc kubenswrapper[4795]: I0219 22:59:37.632883 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ff79db5-8006-4a3f-bb73-ab6e32d93186-config\") on node \"crc\" DevicePath \"\"" Feb 19 22:59:38 crc kubenswrapper[4795]: I0219 22:59:38.401609 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b966c788c-4sfvg" Feb 19 22:59:38 crc kubenswrapper[4795]: I0219 22:59:38.447436 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b966c788c-4sfvg"] Feb 19 22:59:38 crc kubenswrapper[4795]: I0219 22:59:38.459664 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b966c788c-4sfvg"] Feb 19 22:59:38 crc kubenswrapper[4795]: I0219 22:59:38.699361 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-z6zgv"] Feb 19 22:59:38 crc kubenswrapper[4795]: E0219 22:59:38.699787 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ff79db5-8006-4a3f-bb73-ab6e32d93186" containerName="init" Feb 19 22:59:38 crc kubenswrapper[4795]: I0219 22:59:38.699804 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ff79db5-8006-4a3f-bb73-ab6e32d93186" containerName="init" Feb 19 22:59:38 crc kubenswrapper[4795]: E0219 22:59:38.699819 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ff79db5-8006-4a3f-bb73-ab6e32d93186" containerName="dnsmasq-dns" Feb 19 22:59:38 crc kubenswrapper[4795]: I0219 22:59:38.699827 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ff79db5-8006-4a3f-bb73-ab6e32d93186" containerName="dnsmasq-dns" Feb 19 22:59:38 crc kubenswrapper[4795]: I0219 22:59:38.699993 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ff79db5-8006-4a3f-bb73-ab6e32d93186" containerName="dnsmasq-dns" Feb 19 22:59:38 crc kubenswrapper[4795]: I0219 22:59:38.700607 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-z6zgv" Feb 19 22:59:38 crc kubenswrapper[4795]: I0219 22:59:38.719702 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-z6zgv"] Feb 19 22:59:38 crc kubenswrapper[4795]: I0219 22:59:38.734612 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-0a84-account-create-update-pl5gt"] Feb 19 22:59:38 crc kubenswrapper[4795]: I0219 22:59:38.735732 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-0a84-account-create-update-pl5gt" Feb 19 22:59:38 crc kubenswrapper[4795]: I0219 22:59:38.738234 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Feb 19 22:59:38 crc kubenswrapper[4795]: I0219 22:59:38.776250 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-0a84-account-create-update-pl5gt"] Feb 19 22:59:38 crc kubenswrapper[4795]: I0219 22:59:38.856321 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bf9mm\" (UniqueName: \"kubernetes.io/projected/6eae15d3-0be7-4510-9803-a7ad3f947148-kube-api-access-bf9mm\") pod \"cinder-db-create-z6zgv\" (UID: \"6eae15d3-0be7-4510-9803-a7ad3f947148\") " pod="openstack/cinder-db-create-z6zgv" Feb 19 22:59:38 crc kubenswrapper[4795]: I0219 22:59:38.856415 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dhb4\" (UniqueName: \"kubernetes.io/projected/de06c33e-b82b-46eb-964b-4bdd02c94166-kube-api-access-9dhb4\") pod \"cinder-0a84-account-create-update-pl5gt\" (UID: \"de06c33e-b82b-46eb-964b-4bdd02c94166\") " pod="openstack/cinder-0a84-account-create-update-pl5gt" Feb 19 22:59:38 crc kubenswrapper[4795]: I0219 22:59:38.856456 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/de06c33e-b82b-46eb-964b-4bdd02c94166-operator-scripts\") pod \"cinder-0a84-account-create-update-pl5gt\" (UID: \"de06c33e-b82b-46eb-964b-4bdd02c94166\") " pod="openstack/cinder-0a84-account-create-update-pl5gt" Feb 19 22:59:38 crc kubenswrapper[4795]: I0219 22:59:38.856478 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6eae15d3-0be7-4510-9803-a7ad3f947148-operator-scripts\") pod \"cinder-db-create-z6zgv\" (UID: \"6eae15d3-0be7-4510-9803-a7ad3f947148\") " pod="openstack/cinder-db-create-z6zgv" Feb 19 22:59:38 crc kubenswrapper[4795]: I0219 22:59:38.958227 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de06c33e-b82b-46eb-964b-4bdd02c94166-operator-scripts\") pod \"cinder-0a84-account-create-update-pl5gt\" (UID: \"de06c33e-b82b-46eb-964b-4bdd02c94166\") " pod="openstack/cinder-0a84-account-create-update-pl5gt" Feb 19 22:59:38 crc kubenswrapper[4795]: I0219 22:59:38.958276 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6eae15d3-0be7-4510-9803-a7ad3f947148-operator-scripts\") pod \"cinder-db-create-z6zgv\" (UID: \"6eae15d3-0be7-4510-9803-a7ad3f947148\") " pod="openstack/cinder-db-create-z6zgv" Feb 19 22:59:38 crc kubenswrapper[4795]: I0219 22:59:38.958362 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bf9mm\" (UniqueName: \"kubernetes.io/projected/6eae15d3-0be7-4510-9803-a7ad3f947148-kube-api-access-bf9mm\") pod \"cinder-db-create-z6zgv\" (UID: \"6eae15d3-0be7-4510-9803-a7ad3f947148\") " pod="openstack/cinder-db-create-z6zgv" Feb 19 22:59:38 crc kubenswrapper[4795]: I0219 22:59:38.958413 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dhb4\" 
(UniqueName: \"kubernetes.io/projected/de06c33e-b82b-46eb-964b-4bdd02c94166-kube-api-access-9dhb4\") pod \"cinder-0a84-account-create-update-pl5gt\" (UID: \"de06c33e-b82b-46eb-964b-4bdd02c94166\") " pod="openstack/cinder-0a84-account-create-update-pl5gt" Feb 19 22:59:38 crc kubenswrapper[4795]: I0219 22:59:38.958885 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de06c33e-b82b-46eb-964b-4bdd02c94166-operator-scripts\") pod \"cinder-0a84-account-create-update-pl5gt\" (UID: \"de06c33e-b82b-46eb-964b-4bdd02c94166\") " pod="openstack/cinder-0a84-account-create-update-pl5gt" Feb 19 22:59:38 crc kubenswrapper[4795]: I0219 22:59:38.959309 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6eae15d3-0be7-4510-9803-a7ad3f947148-operator-scripts\") pod \"cinder-db-create-z6zgv\" (UID: \"6eae15d3-0be7-4510-9803-a7ad3f947148\") " pod="openstack/cinder-db-create-z6zgv" Feb 19 22:59:38 crc kubenswrapper[4795]: I0219 22:59:38.973820 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dhb4\" (UniqueName: \"kubernetes.io/projected/de06c33e-b82b-46eb-964b-4bdd02c94166-kube-api-access-9dhb4\") pod \"cinder-0a84-account-create-update-pl5gt\" (UID: \"de06c33e-b82b-46eb-964b-4bdd02c94166\") " pod="openstack/cinder-0a84-account-create-update-pl5gt" Feb 19 22:59:38 crc kubenswrapper[4795]: I0219 22:59:38.974442 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bf9mm\" (UniqueName: \"kubernetes.io/projected/6eae15d3-0be7-4510-9803-a7ad3f947148-kube-api-access-bf9mm\") pod \"cinder-db-create-z6zgv\" (UID: \"6eae15d3-0be7-4510-9803-a7ad3f947148\") " pod="openstack/cinder-db-create-z6zgv" Feb 19 22:59:39 crc kubenswrapper[4795]: I0219 22:59:39.016292 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-z6zgv" Feb 19 22:59:39 crc kubenswrapper[4795]: I0219 22:59:39.059101 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-0a84-account-create-update-pl5gt" Feb 19 22:59:39 crc kubenswrapper[4795]: I0219 22:59:39.387340 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-0a84-account-create-update-pl5gt"] Feb 19 22:59:39 crc kubenswrapper[4795]: I0219 22:59:39.410455 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-0a84-account-create-update-pl5gt" event={"ID":"de06c33e-b82b-46eb-964b-4bdd02c94166","Type":"ContainerStarted","Data":"994224a55a32aedbb842e33e8f9a0f61c34196b1a61e74254cb494cf650cadb0"} Feb 19 22:59:39 crc kubenswrapper[4795]: I0219 22:59:39.468617 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-z6zgv"] Feb 19 22:59:39 crc kubenswrapper[4795]: W0219 22:59:39.469907 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6eae15d3_0be7_4510_9803_a7ad3f947148.slice/crio-0fba21faa525c124f9a44b646f17455f1be57821e6a84476adf2f12852386511 WatchSource:0}: Error finding container 0fba21faa525c124f9a44b646f17455f1be57821e6a84476adf2f12852386511: Status 404 returned error can't find the container with id 0fba21faa525c124f9a44b646f17455f1be57821e6a84476adf2f12852386511 Feb 19 22:59:39 crc kubenswrapper[4795]: I0219 22:59:39.522484 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ff79db5-8006-4a3f-bb73-ab6e32d93186" path="/var/lib/kubelet/pods/2ff79db5-8006-4a3f-bb73-ab6e32d93186/volumes" Feb 19 22:59:40 crc kubenswrapper[4795]: I0219 22:59:40.420159 4795 generic.go:334] "Generic (PLEG): container finished" podID="de06c33e-b82b-46eb-964b-4bdd02c94166" containerID="87dbef2ca275f0dcf2d6fcb445371c808cbb03bd9ce8927982987b0e668dfab9" exitCode=0 Feb 19 22:59:40 crc 
kubenswrapper[4795]: I0219 22:59:40.420256 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-0a84-account-create-update-pl5gt" event={"ID":"de06c33e-b82b-46eb-964b-4bdd02c94166","Type":"ContainerDied","Data":"87dbef2ca275f0dcf2d6fcb445371c808cbb03bd9ce8927982987b0e668dfab9"} Feb 19 22:59:40 crc kubenswrapper[4795]: I0219 22:59:40.422987 4795 generic.go:334] "Generic (PLEG): container finished" podID="6eae15d3-0be7-4510-9803-a7ad3f947148" containerID="dc19c0a9eee505e65f65e0357256fb1d1ef9373c082944f9697154c21d026163" exitCode=0 Feb 19 22:59:40 crc kubenswrapper[4795]: I0219 22:59:40.423075 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-z6zgv" event={"ID":"6eae15d3-0be7-4510-9803-a7ad3f947148","Type":"ContainerDied","Data":"dc19c0a9eee505e65f65e0357256fb1d1ef9373c082944f9697154c21d026163"} Feb 19 22:59:40 crc kubenswrapper[4795]: I0219 22:59:40.423116 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-z6zgv" event={"ID":"6eae15d3-0be7-4510-9803-a7ad3f947148","Type":"ContainerStarted","Data":"0fba21faa525c124f9a44b646f17455f1be57821e6a84476adf2f12852386511"} Feb 19 22:59:41 crc kubenswrapper[4795]: I0219 22:59:41.769361 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-0a84-account-create-update-pl5gt" Feb 19 22:59:41 crc kubenswrapper[4795]: I0219 22:59:41.845891 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-z6zgv" Feb 19 22:59:41 crc kubenswrapper[4795]: I0219 22:59:41.907760 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de06c33e-b82b-46eb-964b-4bdd02c94166-operator-scripts\") pod \"de06c33e-b82b-46eb-964b-4bdd02c94166\" (UID: \"de06c33e-b82b-46eb-964b-4bdd02c94166\") " Feb 19 22:59:41 crc kubenswrapper[4795]: I0219 22:59:41.908006 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9dhb4\" (UniqueName: \"kubernetes.io/projected/de06c33e-b82b-46eb-964b-4bdd02c94166-kube-api-access-9dhb4\") pod \"de06c33e-b82b-46eb-964b-4bdd02c94166\" (UID: \"de06c33e-b82b-46eb-964b-4bdd02c94166\") " Feb 19 22:59:41 crc kubenswrapper[4795]: I0219 22:59:41.908255 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de06c33e-b82b-46eb-964b-4bdd02c94166-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "de06c33e-b82b-46eb-964b-4bdd02c94166" (UID: "de06c33e-b82b-46eb-964b-4bdd02c94166"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:59:41 crc kubenswrapper[4795]: I0219 22:59:41.908848 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de06c33e-b82b-46eb-964b-4bdd02c94166-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 22:59:41 crc kubenswrapper[4795]: I0219 22:59:41.913446 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de06c33e-b82b-46eb-964b-4bdd02c94166-kube-api-access-9dhb4" (OuterVolumeSpecName: "kube-api-access-9dhb4") pod "de06c33e-b82b-46eb-964b-4bdd02c94166" (UID: "de06c33e-b82b-46eb-964b-4bdd02c94166"). InnerVolumeSpecName "kube-api-access-9dhb4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:59:42 crc kubenswrapper[4795]: I0219 22:59:42.010315 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf9mm\" (UniqueName: \"kubernetes.io/projected/6eae15d3-0be7-4510-9803-a7ad3f947148-kube-api-access-bf9mm\") pod \"6eae15d3-0be7-4510-9803-a7ad3f947148\" (UID: \"6eae15d3-0be7-4510-9803-a7ad3f947148\") " Feb 19 22:59:42 crc kubenswrapper[4795]: I0219 22:59:42.010401 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6eae15d3-0be7-4510-9803-a7ad3f947148-operator-scripts\") pod \"6eae15d3-0be7-4510-9803-a7ad3f947148\" (UID: \"6eae15d3-0be7-4510-9803-a7ad3f947148\") " Feb 19 22:59:42 crc kubenswrapper[4795]: I0219 22:59:42.010945 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9dhb4\" (UniqueName: \"kubernetes.io/projected/de06c33e-b82b-46eb-964b-4bdd02c94166-kube-api-access-9dhb4\") on node \"crc\" DevicePath \"\"" Feb 19 22:59:42 crc kubenswrapper[4795]: I0219 22:59:42.011017 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6eae15d3-0be7-4510-9803-a7ad3f947148-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6eae15d3-0be7-4510-9803-a7ad3f947148" (UID: "6eae15d3-0be7-4510-9803-a7ad3f947148"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 22:59:42 crc kubenswrapper[4795]: I0219 22:59:42.012870 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6eae15d3-0be7-4510-9803-a7ad3f947148-kube-api-access-bf9mm" (OuterVolumeSpecName: "kube-api-access-bf9mm") pod "6eae15d3-0be7-4510-9803-a7ad3f947148" (UID: "6eae15d3-0be7-4510-9803-a7ad3f947148"). InnerVolumeSpecName "kube-api-access-bf9mm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:59:42 crc kubenswrapper[4795]: I0219 22:59:42.112679 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf9mm\" (UniqueName: \"kubernetes.io/projected/6eae15d3-0be7-4510-9803-a7ad3f947148-kube-api-access-bf9mm\") on node \"crc\" DevicePath \"\"" Feb 19 22:59:42 crc kubenswrapper[4795]: I0219 22:59:42.112713 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6eae15d3-0be7-4510-9803-a7ad3f947148-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 22:59:42 crc kubenswrapper[4795]: I0219 22:59:42.442194 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-0a84-account-create-update-pl5gt" Feb 19 22:59:42 crc kubenswrapper[4795]: I0219 22:59:42.442194 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-0a84-account-create-update-pl5gt" event={"ID":"de06c33e-b82b-46eb-964b-4bdd02c94166","Type":"ContainerDied","Data":"994224a55a32aedbb842e33e8f9a0f61c34196b1a61e74254cb494cf650cadb0"} Feb 19 22:59:42 crc kubenswrapper[4795]: I0219 22:59:42.442310 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="994224a55a32aedbb842e33e8f9a0f61c34196b1a61e74254cb494cf650cadb0" Feb 19 22:59:42 crc kubenswrapper[4795]: I0219 22:59:42.448441 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-z6zgv" event={"ID":"6eae15d3-0be7-4510-9803-a7ad3f947148","Type":"ContainerDied","Data":"0fba21faa525c124f9a44b646f17455f1be57821e6a84476adf2f12852386511"} Feb 19 22:59:42 crc kubenswrapper[4795]: I0219 22:59:42.448464 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0fba21faa525c124f9a44b646f17455f1be57821e6a84476adf2f12852386511" Feb 19 22:59:42 crc kubenswrapper[4795]: I0219 22:59:42.448500 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-z6zgv" Feb 19 22:59:44 crc kubenswrapper[4795]: I0219 22:59:44.050747 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-mn2kc"] Feb 19 22:59:44 crc kubenswrapper[4795]: E0219 22:59:44.054761 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6eae15d3-0be7-4510-9803-a7ad3f947148" containerName="mariadb-database-create" Feb 19 22:59:44 crc kubenswrapper[4795]: I0219 22:59:44.054784 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6eae15d3-0be7-4510-9803-a7ad3f947148" containerName="mariadb-database-create" Feb 19 22:59:44 crc kubenswrapper[4795]: E0219 22:59:44.054832 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de06c33e-b82b-46eb-964b-4bdd02c94166" containerName="mariadb-account-create-update" Feb 19 22:59:44 crc kubenswrapper[4795]: I0219 22:59:44.054842 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="de06c33e-b82b-46eb-964b-4bdd02c94166" containerName="mariadb-account-create-update" Feb 19 22:59:44 crc kubenswrapper[4795]: I0219 22:59:44.055040 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="de06c33e-b82b-46eb-964b-4bdd02c94166" containerName="mariadb-account-create-update" Feb 19 22:59:44 crc kubenswrapper[4795]: I0219 22:59:44.055053 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6eae15d3-0be7-4510-9803-a7ad3f947148" containerName="mariadb-database-create" Feb 19 22:59:44 crc kubenswrapper[4795]: I0219 22:59:44.055776 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-mn2kc" Feb 19 22:59:44 crc kubenswrapper[4795]: I0219 22:59:44.057631 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 19 22:59:44 crc kubenswrapper[4795]: I0219 22:59:44.058809 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 19 22:59:44 crc kubenswrapper[4795]: I0219 22:59:44.059137 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-tv4ht" Feb 19 22:59:44 crc kubenswrapper[4795]: I0219 22:59:44.059943 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-mn2kc"] Feb 19 22:59:44 crc kubenswrapper[4795]: I0219 22:59:44.083597 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/51673183-2fe8-4a11-98f0-dec10081e7fc-etc-machine-id\") pod \"cinder-db-sync-mn2kc\" (UID: \"51673183-2fe8-4a11-98f0-dec10081e7fc\") " pod="openstack/cinder-db-sync-mn2kc" Feb 19 22:59:44 crc kubenswrapper[4795]: I0219 22:59:44.083895 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51673183-2fe8-4a11-98f0-dec10081e7fc-combined-ca-bundle\") pod \"cinder-db-sync-mn2kc\" (UID: \"51673183-2fe8-4a11-98f0-dec10081e7fc\") " pod="openstack/cinder-db-sync-mn2kc" Feb 19 22:59:44 crc kubenswrapper[4795]: I0219 22:59:44.084153 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51673183-2fe8-4a11-98f0-dec10081e7fc-config-data\") pod \"cinder-db-sync-mn2kc\" (UID: \"51673183-2fe8-4a11-98f0-dec10081e7fc\") " pod="openstack/cinder-db-sync-mn2kc" Feb 19 22:59:44 crc kubenswrapper[4795]: I0219 22:59:44.084347 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzxlv\" (UniqueName: \"kubernetes.io/projected/51673183-2fe8-4a11-98f0-dec10081e7fc-kube-api-access-rzxlv\") pod \"cinder-db-sync-mn2kc\" (UID: \"51673183-2fe8-4a11-98f0-dec10081e7fc\") " pod="openstack/cinder-db-sync-mn2kc" Feb 19 22:59:44 crc kubenswrapper[4795]: I0219 22:59:44.084534 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/51673183-2fe8-4a11-98f0-dec10081e7fc-db-sync-config-data\") pod \"cinder-db-sync-mn2kc\" (UID: \"51673183-2fe8-4a11-98f0-dec10081e7fc\") " pod="openstack/cinder-db-sync-mn2kc" Feb 19 22:59:44 crc kubenswrapper[4795]: I0219 22:59:44.084766 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51673183-2fe8-4a11-98f0-dec10081e7fc-scripts\") pod \"cinder-db-sync-mn2kc\" (UID: \"51673183-2fe8-4a11-98f0-dec10081e7fc\") " pod="openstack/cinder-db-sync-mn2kc" Feb 19 22:59:44 crc kubenswrapper[4795]: I0219 22:59:44.185915 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51673183-2fe8-4a11-98f0-dec10081e7fc-scripts\") pod \"cinder-db-sync-mn2kc\" (UID: \"51673183-2fe8-4a11-98f0-dec10081e7fc\") " pod="openstack/cinder-db-sync-mn2kc" Feb 19 22:59:44 crc kubenswrapper[4795]: I0219 22:59:44.186247 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/51673183-2fe8-4a11-98f0-dec10081e7fc-etc-machine-id\") pod \"cinder-db-sync-mn2kc\" (UID: \"51673183-2fe8-4a11-98f0-dec10081e7fc\") " pod="openstack/cinder-db-sync-mn2kc" Feb 19 22:59:44 crc kubenswrapper[4795]: I0219 22:59:44.186335 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/51673183-2fe8-4a11-98f0-dec10081e7fc-etc-machine-id\") pod \"cinder-db-sync-mn2kc\" (UID: \"51673183-2fe8-4a11-98f0-dec10081e7fc\") " pod="openstack/cinder-db-sync-mn2kc" Feb 19 22:59:44 crc kubenswrapper[4795]: I0219 22:59:44.186361 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51673183-2fe8-4a11-98f0-dec10081e7fc-combined-ca-bundle\") pod \"cinder-db-sync-mn2kc\" (UID: \"51673183-2fe8-4a11-98f0-dec10081e7fc\") " pod="openstack/cinder-db-sync-mn2kc" Feb 19 22:59:44 crc kubenswrapper[4795]: I0219 22:59:44.186621 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51673183-2fe8-4a11-98f0-dec10081e7fc-config-data\") pod \"cinder-db-sync-mn2kc\" (UID: \"51673183-2fe8-4a11-98f0-dec10081e7fc\") " pod="openstack/cinder-db-sync-mn2kc" Feb 19 22:59:44 crc kubenswrapper[4795]: I0219 22:59:44.186735 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzxlv\" (UniqueName: \"kubernetes.io/projected/51673183-2fe8-4a11-98f0-dec10081e7fc-kube-api-access-rzxlv\") pod \"cinder-db-sync-mn2kc\" (UID: \"51673183-2fe8-4a11-98f0-dec10081e7fc\") " pod="openstack/cinder-db-sync-mn2kc" Feb 19 22:59:44 crc kubenswrapper[4795]: I0219 22:59:44.186854 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/51673183-2fe8-4a11-98f0-dec10081e7fc-db-sync-config-data\") pod \"cinder-db-sync-mn2kc\" (UID: \"51673183-2fe8-4a11-98f0-dec10081e7fc\") " pod="openstack/cinder-db-sync-mn2kc" Feb 19 22:59:44 crc kubenswrapper[4795]: I0219 22:59:44.191816 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51673183-2fe8-4a11-98f0-dec10081e7fc-scripts\") pod \"cinder-db-sync-mn2kc\" (UID: 
\"51673183-2fe8-4a11-98f0-dec10081e7fc\") " pod="openstack/cinder-db-sync-mn2kc" Feb 19 22:59:44 crc kubenswrapper[4795]: I0219 22:59:44.192006 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51673183-2fe8-4a11-98f0-dec10081e7fc-combined-ca-bundle\") pod \"cinder-db-sync-mn2kc\" (UID: \"51673183-2fe8-4a11-98f0-dec10081e7fc\") " pod="openstack/cinder-db-sync-mn2kc" Feb 19 22:59:44 crc kubenswrapper[4795]: I0219 22:59:44.192237 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51673183-2fe8-4a11-98f0-dec10081e7fc-config-data\") pod \"cinder-db-sync-mn2kc\" (UID: \"51673183-2fe8-4a11-98f0-dec10081e7fc\") " pod="openstack/cinder-db-sync-mn2kc" Feb 19 22:59:44 crc kubenswrapper[4795]: I0219 22:59:44.192875 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/51673183-2fe8-4a11-98f0-dec10081e7fc-db-sync-config-data\") pod \"cinder-db-sync-mn2kc\" (UID: \"51673183-2fe8-4a11-98f0-dec10081e7fc\") " pod="openstack/cinder-db-sync-mn2kc" Feb 19 22:59:44 crc kubenswrapper[4795]: I0219 22:59:44.213252 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzxlv\" (UniqueName: \"kubernetes.io/projected/51673183-2fe8-4a11-98f0-dec10081e7fc-kube-api-access-rzxlv\") pod \"cinder-db-sync-mn2kc\" (UID: \"51673183-2fe8-4a11-98f0-dec10081e7fc\") " pod="openstack/cinder-db-sync-mn2kc" Feb 19 22:59:44 crc kubenswrapper[4795]: I0219 22:59:44.375672 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-mn2kc" Feb 19 22:59:44 crc kubenswrapper[4795]: I0219 22:59:44.862807 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-mn2kc"] Feb 19 22:59:44 crc kubenswrapper[4795]: W0219 22:59:44.867228 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod51673183_2fe8_4a11_98f0_dec10081e7fc.slice/crio-1831d0960e7fb41b83a5d61d8eac44fc4e5fcb9295a67af8e498aee6a42554a2 WatchSource:0}: Error finding container 1831d0960e7fb41b83a5d61d8eac44fc4e5fcb9295a67af8e498aee6a42554a2: Status 404 returned error can't find the container with id 1831d0960e7fb41b83a5d61d8eac44fc4e5fcb9295a67af8e498aee6a42554a2 Feb 19 22:59:45 crc kubenswrapper[4795]: I0219 22:59:45.493549 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-mn2kc" event={"ID":"51673183-2fe8-4a11-98f0-dec10081e7fc","Type":"ContainerStarted","Data":"1831d0960e7fb41b83a5d61d8eac44fc4e5fcb9295a67af8e498aee6a42554a2"} Feb 19 22:59:46 crc kubenswrapper[4795]: I0219 22:59:46.504429 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-mn2kc" event={"ID":"51673183-2fe8-4a11-98f0-dec10081e7fc","Type":"ContainerStarted","Data":"ad3831db9a9f6e12a5ac1eb158cce87c770b2e137cadf64a7ed412c1aedd54c2"} Feb 19 22:59:46 crc kubenswrapper[4795]: I0219 22:59:46.521079 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-mn2kc" podStartSLOduration=2.521062846 podStartE2EDuration="2.521062846s" podCreationTimestamp="2026-02-19 22:59:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:59:46.519399314 +0000 UTC m=+5497.711917198" watchObservedRunningTime="2026-02-19 22:59:46.521062846 +0000 UTC m=+5497.713580710" Feb 19 22:59:48 crc kubenswrapper[4795]: I0219 22:59:48.520543 
4795 generic.go:334] "Generic (PLEG): container finished" podID="51673183-2fe8-4a11-98f0-dec10081e7fc" containerID="ad3831db9a9f6e12a5ac1eb158cce87c770b2e137cadf64a7ed412c1aedd54c2" exitCode=0 Feb 19 22:59:48 crc kubenswrapper[4795]: I0219 22:59:48.520615 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-mn2kc" event={"ID":"51673183-2fe8-4a11-98f0-dec10081e7fc","Type":"ContainerDied","Data":"ad3831db9a9f6e12a5ac1eb158cce87c770b2e137cadf64a7ed412c1aedd54c2"} Feb 19 22:59:49 crc kubenswrapper[4795]: I0219 22:59:49.903622 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-mn2kc" Feb 19 22:59:49 crc kubenswrapper[4795]: I0219 22:59:49.986955 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51673183-2fe8-4a11-98f0-dec10081e7fc-combined-ca-bundle\") pod \"51673183-2fe8-4a11-98f0-dec10081e7fc\" (UID: \"51673183-2fe8-4a11-98f0-dec10081e7fc\") " Feb 19 22:59:49 crc kubenswrapper[4795]: I0219 22:59:49.987049 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/51673183-2fe8-4a11-98f0-dec10081e7fc-db-sync-config-data\") pod \"51673183-2fe8-4a11-98f0-dec10081e7fc\" (UID: \"51673183-2fe8-4a11-98f0-dec10081e7fc\") " Feb 19 22:59:49 crc kubenswrapper[4795]: I0219 22:59:49.987070 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/51673183-2fe8-4a11-98f0-dec10081e7fc-etc-machine-id\") pod \"51673183-2fe8-4a11-98f0-dec10081e7fc\" (UID: \"51673183-2fe8-4a11-98f0-dec10081e7fc\") " Feb 19 22:59:49 crc kubenswrapper[4795]: I0219 22:59:49.987201 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51673183-2fe8-4a11-98f0-dec10081e7fc-scripts\") pod 
\"51673183-2fe8-4a11-98f0-dec10081e7fc\" (UID: \"51673183-2fe8-4a11-98f0-dec10081e7fc\") " Feb 19 22:59:49 crc kubenswrapper[4795]: I0219 22:59:49.987268 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzxlv\" (UniqueName: \"kubernetes.io/projected/51673183-2fe8-4a11-98f0-dec10081e7fc-kube-api-access-rzxlv\") pod \"51673183-2fe8-4a11-98f0-dec10081e7fc\" (UID: \"51673183-2fe8-4a11-98f0-dec10081e7fc\") " Feb 19 22:59:49 crc kubenswrapper[4795]: I0219 22:59:49.987292 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51673183-2fe8-4a11-98f0-dec10081e7fc-config-data\") pod \"51673183-2fe8-4a11-98f0-dec10081e7fc\" (UID: \"51673183-2fe8-4a11-98f0-dec10081e7fc\") " Feb 19 22:59:49 crc kubenswrapper[4795]: I0219 22:59:49.987432 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/51673183-2fe8-4a11-98f0-dec10081e7fc-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "51673183-2fe8-4a11-98f0-dec10081e7fc" (UID: "51673183-2fe8-4a11-98f0-dec10081e7fc"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 22:59:49 crc kubenswrapper[4795]: I0219 22:59:49.987867 4795 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/51673183-2fe8-4a11-98f0-dec10081e7fc-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 19 22:59:49 crc kubenswrapper[4795]: I0219 22:59:49.993443 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51673183-2fe8-4a11-98f0-dec10081e7fc-scripts" (OuterVolumeSpecName: "scripts") pod "51673183-2fe8-4a11-98f0-dec10081e7fc" (UID: "51673183-2fe8-4a11-98f0-dec10081e7fc"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:59:49 crc kubenswrapper[4795]: I0219 22:59:49.996514 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51673183-2fe8-4a11-98f0-dec10081e7fc-kube-api-access-rzxlv" (OuterVolumeSpecName: "kube-api-access-rzxlv") pod "51673183-2fe8-4a11-98f0-dec10081e7fc" (UID: "51673183-2fe8-4a11-98f0-dec10081e7fc"). InnerVolumeSpecName "kube-api-access-rzxlv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 22:59:49 crc kubenswrapper[4795]: I0219 22:59:49.996596 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51673183-2fe8-4a11-98f0-dec10081e7fc-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "51673183-2fe8-4a11-98f0-dec10081e7fc" (UID: "51673183-2fe8-4a11-98f0-dec10081e7fc"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:59:50 crc kubenswrapper[4795]: I0219 22:59:50.012826 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51673183-2fe8-4a11-98f0-dec10081e7fc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "51673183-2fe8-4a11-98f0-dec10081e7fc" (UID: "51673183-2fe8-4a11-98f0-dec10081e7fc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:59:50 crc kubenswrapper[4795]: I0219 22:59:50.044335 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51673183-2fe8-4a11-98f0-dec10081e7fc-config-data" (OuterVolumeSpecName: "config-data") pod "51673183-2fe8-4a11-98f0-dec10081e7fc" (UID: "51673183-2fe8-4a11-98f0-dec10081e7fc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 22:59:50 crc kubenswrapper[4795]: I0219 22:59:50.089759 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51673183-2fe8-4a11-98f0-dec10081e7fc-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 22:59:50 crc kubenswrapper[4795]: I0219 22:59:50.089790 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51673183-2fe8-4a11-98f0-dec10081e7fc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 22:59:50 crc kubenswrapper[4795]: I0219 22:59:50.089800 4795 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/51673183-2fe8-4a11-98f0-dec10081e7fc-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 22:59:50 crc kubenswrapper[4795]: I0219 22:59:50.089810 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51673183-2fe8-4a11-98f0-dec10081e7fc-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 22:59:50 crc kubenswrapper[4795]: I0219 22:59:50.089819 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rzxlv\" (UniqueName: \"kubernetes.io/projected/51673183-2fe8-4a11-98f0-dec10081e7fc-kube-api-access-rzxlv\") on node \"crc\" DevicePath \"\"" Feb 19 22:59:50 crc kubenswrapper[4795]: I0219 22:59:50.542956 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-mn2kc" event={"ID":"51673183-2fe8-4a11-98f0-dec10081e7fc","Type":"ContainerDied","Data":"1831d0960e7fb41b83a5d61d8eac44fc4e5fcb9295a67af8e498aee6a42554a2"} Feb 19 22:59:50 crc kubenswrapper[4795]: I0219 22:59:50.543001 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1831d0960e7fb41b83a5d61d8eac44fc4e5fcb9295a67af8e498aee6a42554a2" Feb 19 22:59:50 crc kubenswrapper[4795]: I0219 22:59:50.543064 4795 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-mn2kc" Feb 19 22:59:50 crc kubenswrapper[4795]: E0219 22:59:50.723327 4795 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod51673183_2fe8_4a11_98f0_dec10081e7fc.slice\": RecentStats: unable to find data in memory cache]" Feb 19 22:59:50 crc kubenswrapper[4795]: I0219 22:59:50.893859 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-9677b4c57-7nn9w"] Feb 19 22:59:50 crc kubenswrapper[4795]: E0219 22:59:50.894625 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51673183-2fe8-4a11-98f0-dec10081e7fc" containerName="cinder-db-sync" Feb 19 22:59:50 crc kubenswrapper[4795]: I0219 22:59:50.894649 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="51673183-2fe8-4a11-98f0-dec10081e7fc" containerName="cinder-db-sync" Feb 19 22:59:50 crc kubenswrapper[4795]: I0219 22:59:50.894867 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="51673183-2fe8-4a11-98f0-dec10081e7fc" containerName="cinder-db-sync" Feb 19 22:59:50 crc kubenswrapper[4795]: I0219 22:59:50.895995 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9677b4c57-7nn9w" Feb 19 22:59:50 crc kubenswrapper[4795]: I0219 22:59:50.905256 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b20710ae-8abe-4d80-8cdf-582fe785e2cc-dns-svc\") pod \"dnsmasq-dns-9677b4c57-7nn9w\" (UID: \"b20710ae-8abe-4d80-8cdf-582fe785e2cc\") " pod="openstack/dnsmasq-dns-9677b4c57-7nn9w" Feb 19 22:59:50 crc kubenswrapper[4795]: I0219 22:59:50.907154 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b20710ae-8abe-4d80-8cdf-582fe785e2cc-config\") pod \"dnsmasq-dns-9677b4c57-7nn9w\" (UID: \"b20710ae-8abe-4d80-8cdf-582fe785e2cc\") " pod="openstack/dnsmasq-dns-9677b4c57-7nn9w" Feb 19 22:59:50 crc kubenswrapper[4795]: I0219 22:59:50.907834 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vc6jd\" (UniqueName: \"kubernetes.io/projected/b20710ae-8abe-4d80-8cdf-582fe785e2cc-kube-api-access-vc6jd\") pod \"dnsmasq-dns-9677b4c57-7nn9w\" (UID: \"b20710ae-8abe-4d80-8cdf-582fe785e2cc\") " pod="openstack/dnsmasq-dns-9677b4c57-7nn9w" Feb 19 22:59:50 crc kubenswrapper[4795]: I0219 22:59:50.907976 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b20710ae-8abe-4d80-8cdf-582fe785e2cc-ovsdbserver-nb\") pod \"dnsmasq-dns-9677b4c57-7nn9w\" (UID: \"b20710ae-8abe-4d80-8cdf-582fe785e2cc\") " pod="openstack/dnsmasq-dns-9677b4c57-7nn9w" Feb 19 22:59:50 crc kubenswrapper[4795]: I0219 22:59:50.908143 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b20710ae-8abe-4d80-8cdf-582fe785e2cc-ovsdbserver-sb\") pod \"dnsmasq-dns-9677b4c57-7nn9w\" (UID: 
\"b20710ae-8abe-4d80-8cdf-582fe785e2cc\") " pod="openstack/dnsmasq-dns-9677b4c57-7nn9w" Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.010191 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b20710ae-8abe-4d80-8cdf-582fe785e2cc-config\") pod \"dnsmasq-dns-9677b4c57-7nn9w\" (UID: \"b20710ae-8abe-4d80-8cdf-582fe785e2cc\") " pod="openstack/dnsmasq-dns-9677b4c57-7nn9w" Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.010269 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vc6jd\" (UniqueName: \"kubernetes.io/projected/b20710ae-8abe-4d80-8cdf-582fe785e2cc-kube-api-access-vc6jd\") pod \"dnsmasq-dns-9677b4c57-7nn9w\" (UID: \"b20710ae-8abe-4d80-8cdf-582fe785e2cc\") " pod="openstack/dnsmasq-dns-9677b4c57-7nn9w" Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.010294 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b20710ae-8abe-4d80-8cdf-582fe785e2cc-ovsdbserver-nb\") pod \"dnsmasq-dns-9677b4c57-7nn9w\" (UID: \"b20710ae-8abe-4d80-8cdf-582fe785e2cc\") " pod="openstack/dnsmasq-dns-9677b4c57-7nn9w" Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.010330 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b20710ae-8abe-4d80-8cdf-582fe785e2cc-ovsdbserver-sb\") pod \"dnsmasq-dns-9677b4c57-7nn9w\" (UID: \"b20710ae-8abe-4d80-8cdf-582fe785e2cc\") " pod="openstack/dnsmasq-dns-9677b4c57-7nn9w" Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.010349 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b20710ae-8abe-4d80-8cdf-582fe785e2cc-dns-svc\") pod \"dnsmasq-dns-9677b4c57-7nn9w\" (UID: \"b20710ae-8abe-4d80-8cdf-582fe785e2cc\") " 
pod="openstack/dnsmasq-dns-9677b4c57-7nn9w" Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.011264 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b20710ae-8abe-4d80-8cdf-582fe785e2cc-dns-svc\") pod \"dnsmasq-dns-9677b4c57-7nn9w\" (UID: \"b20710ae-8abe-4d80-8cdf-582fe785e2cc\") " pod="openstack/dnsmasq-dns-9677b4c57-7nn9w" Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.011345 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b20710ae-8abe-4d80-8cdf-582fe785e2cc-ovsdbserver-nb\") pod \"dnsmasq-dns-9677b4c57-7nn9w\" (UID: \"b20710ae-8abe-4d80-8cdf-582fe785e2cc\") " pod="openstack/dnsmasq-dns-9677b4c57-7nn9w" Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.011513 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b20710ae-8abe-4d80-8cdf-582fe785e2cc-ovsdbserver-sb\") pod \"dnsmasq-dns-9677b4c57-7nn9w\" (UID: \"b20710ae-8abe-4d80-8cdf-582fe785e2cc\") " pod="openstack/dnsmasq-dns-9677b4c57-7nn9w" Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.017225 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b20710ae-8abe-4d80-8cdf-582fe785e2cc-config\") pod \"dnsmasq-dns-9677b4c57-7nn9w\" (UID: \"b20710ae-8abe-4d80-8cdf-582fe785e2cc\") " pod="openstack/dnsmasq-dns-9677b4c57-7nn9w" Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.019832 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9677b4c57-7nn9w"] Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.046003 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vc6jd\" (UniqueName: \"kubernetes.io/projected/b20710ae-8abe-4d80-8cdf-582fe785e2cc-kube-api-access-vc6jd\") pod \"dnsmasq-dns-9677b4c57-7nn9w\" (UID: 
\"b20710ae-8abe-4d80-8cdf-582fe785e2cc\") " pod="openstack/dnsmasq-dns-9677b4c57-7nn9w" Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.130266 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.131723 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.142928 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.143646 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.143924 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.144037 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-tv4ht" Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.144243 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.220821 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3922ed4f-baf9-481e-af8b-b009440dfea2-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3922ed4f-baf9-481e-af8b-b009440dfea2\") " pod="openstack/cinder-api-0" Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.220879 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3922ed4f-baf9-481e-af8b-b009440dfea2-logs\") pod \"cinder-api-0\" (UID: \"3922ed4f-baf9-481e-af8b-b009440dfea2\") " pod="openstack/cinder-api-0" Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 
22:59:51.220930 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3922ed4f-baf9-481e-af8b-b009440dfea2-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3922ed4f-baf9-481e-af8b-b009440dfea2\") " pod="openstack/cinder-api-0" Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.220962 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3922ed4f-baf9-481e-af8b-b009440dfea2-scripts\") pod \"cinder-api-0\" (UID: \"3922ed4f-baf9-481e-af8b-b009440dfea2\") " pod="openstack/cinder-api-0" Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.221072 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3922ed4f-baf9-481e-af8b-b009440dfea2-config-data\") pod \"cinder-api-0\" (UID: \"3922ed4f-baf9-481e-af8b-b009440dfea2\") " pod="openstack/cinder-api-0" Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.221158 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thkh7\" (UniqueName: \"kubernetes.io/projected/3922ed4f-baf9-481e-af8b-b009440dfea2-kube-api-access-thkh7\") pod \"cinder-api-0\" (UID: \"3922ed4f-baf9-481e-af8b-b009440dfea2\") " pod="openstack/cinder-api-0" Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.221201 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3922ed4f-baf9-481e-af8b-b009440dfea2-config-data-custom\") pod \"cinder-api-0\" (UID: \"3922ed4f-baf9-481e-af8b-b009440dfea2\") " pod="openstack/cinder-api-0" Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.222333 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9677b4c57-7nn9w" Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.326105 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thkh7\" (UniqueName: \"kubernetes.io/projected/3922ed4f-baf9-481e-af8b-b009440dfea2-kube-api-access-thkh7\") pod \"cinder-api-0\" (UID: \"3922ed4f-baf9-481e-af8b-b009440dfea2\") " pod="openstack/cinder-api-0" Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.326410 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3922ed4f-baf9-481e-af8b-b009440dfea2-config-data-custom\") pod \"cinder-api-0\" (UID: \"3922ed4f-baf9-481e-af8b-b009440dfea2\") " pod="openstack/cinder-api-0" Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.326491 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3922ed4f-baf9-481e-af8b-b009440dfea2-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3922ed4f-baf9-481e-af8b-b009440dfea2\") " pod="openstack/cinder-api-0" Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.326509 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3922ed4f-baf9-481e-af8b-b009440dfea2-logs\") pod \"cinder-api-0\" (UID: \"3922ed4f-baf9-481e-af8b-b009440dfea2\") " pod="openstack/cinder-api-0" Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.326532 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3922ed4f-baf9-481e-af8b-b009440dfea2-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3922ed4f-baf9-481e-af8b-b009440dfea2\") " pod="openstack/cinder-api-0" Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.326550 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3922ed4f-baf9-481e-af8b-b009440dfea2-scripts\") pod \"cinder-api-0\" (UID: \"3922ed4f-baf9-481e-af8b-b009440dfea2\") " pod="openstack/cinder-api-0" Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.326592 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3922ed4f-baf9-481e-af8b-b009440dfea2-config-data\") pod \"cinder-api-0\" (UID: \"3922ed4f-baf9-481e-af8b-b009440dfea2\") " pod="openstack/cinder-api-0" Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.327455 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3922ed4f-baf9-481e-af8b-b009440dfea2-logs\") pod \"cinder-api-0\" (UID: \"3922ed4f-baf9-481e-af8b-b009440dfea2\") " pod="openstack/cinder-api-0" Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.327516 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3922ed4f-baf9-481e-af8b-b009440dfea2-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3922ed4f-baf9-481e-af8b-b009440dfea2\") " pod="openstack/cinder-api-0" Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.335947 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3922ed4f-baf9-481e-af8b-b009440dfea2-scripts\") pod \"cinder-api-0\" (UID: \"3922ed4f-baf9-481e-af8b-b009440dfea2\") " pod="openstack/cinder-api-0" Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.336286 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3922ed4f-baf9-481e-af8b-b009440dfea2-config-data\") pod \"cinder-api-0\" (UID: \"3922ed4f-baf9-481e-af8b-b009440dfea2\") " pod="openstack/cinder-api-0" Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.336919 4795 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3922ed4f-baf9-481e-af8b-b009440dfea2-config-data-custom\") pod \"cinder-api-0\" (UID: \"3922ed4f-baf9-481e-af8b-b009440dfea2\") " pod="openstack/cinder-api-0" Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.346509 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3922ed4f-baf9-481e-af8b-b009440dfea2-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3922ed4f-baf9-481e-af8b-b009440dfea2\") " pod="openstack/cinder-api-0" Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.357731 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thkh7\" (UniqueName: \"kubernetes.io/projected/3922ed4f-baf9-481e-af8b-b009440dfea2-kube-api-access-thkh7\") pod \"cinder-api-0\" (UID: \"3922ed4f-baf9-481e-af8b-b009440dfea2\") " pod="openstack/cinder-api-0" Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.504829 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 19 22:59:51 crc kubenswrapper[4795]: W0219 22:59:51.771600 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb20710ae_8abe_4d80_8cdf_582fe785e2cc.slice/crio-676d66baf2862b20915b7f3f4da6df9613656d85d389c8b7a38b315539d7e5e8 WatchSource:0}: Error finding container 676d66baf2862b20915b7f3f4da6df9613656d85d389c8b7a38b315539d7e5e8: Status 404 returned error can't find the container with id 676d66baf2862b20915b7f3f4da6df9613656d85d389c8b7a38b315539d7e5e8 Feb 19 22:59:51 crc kubenswrapper[4795]: I0219 22:59:51.780051 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9677b4c57-7nn9w"] Feb 19 22:59:52 crc kubenswrapper[4795]: I0219 22:59:52.040123 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 19 22:59:52 crc kubenswrapper[4795]: W0219 22:59:52.051425 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3922ed4f_baf9_481e_af8b_b009440dfea2.slice/crio-595aeff6e650976b06c39b4880224d655a58680d4787becedbb202c16d33fdcb WatchSource:0}: Error finding container 595aeff6e650976b06c39b4880224d655a58680d4787becedbb202c16d33fdcb: Status 404 returned error can't find the container with id 595aeff6e650976b06c39b4880224d655a58680d4787becedbb202c16d33fdcb Feb 19 22:59:52 crc kubenswrapper[4795]: I0219 22:59:52.564685 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3922ed4f-baf9-481e-af8b-b009440dfea2","Type":"ContainerStarted","Data":"595aeff6e650976b06c39b4880224d655a58680d4787becedbb202c16d33fdcb"} Feb 19 22:59:52 crc kubenswrapper[4795]: I0219 22:59:52.566855 4795 generic.go:334] "Generic (PLEG): container finished" podID="b20710ae-8abe-4d80-8cdf-582fe785e2cc" containerID="b799c6817a13193660264c0e1a6b6bbda173865957b6b9f053b5f17c7df00f19" 
exitCode=0 Feb 19 22:59:52 crc kubenswrapper[4795]: I0219 22:59:52.566950 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9677b4c57-7nn9w" event={"ID":"b20710ae-8abe-4d80-8cdf-582fe785e2cc","Type":"ContainerDied","Data":"b799c6817a13193660264c0e1a6b6bbda173865957b6b9f053b5f17c7df00f19"} Feb 19 22:59:52 crc kubenswrapper[4795]: I0219 22:59:52.566968 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9677b4c57-7nn9w" event={"ID":"b20710ae-8abe-4d80-8cdf-582fe785e2cc","Type":"ContainerStarted","Data":"676d66baf2862b20915b7f3f4da6df9613656d85d389c8b7a38b315539d7e5e8"} Feb 19 22:59:53 crc kubenswrapper[4795]: I0219 22:59:53.577913 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3922ed4f-baf9-481e-af8b-b009440dfea2","Type":"ContainerStarted","Data":"776e2209bad65e1284c6f9e29b06d04a385b79aab9583f4b7309a11a22e8add7"} Feb 19 22:59:53 crc kubenswrapper[4795]: I0219 22:59:53.578372 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 19 22:59:53 crc kubenswrapper[4795]: I0219 22:59:53.578385 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3922ed4f-baf9-481e-af8b-b009440dfea2","Type":"ContainerStarted","Data":"aeb4083d1a03c26f7d2a5602b43820b940e6170cf9c38835c9bf0e75e7c3c930"} Feb 19 22:59:53 crc kubenswrapper[4795]: I0219 22:59:53.580286 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9677b4c57-7nn9w" event={"ID":"b20710ae-8abe-4d80-8cdf-582fe785e2cc","Type":"ContainerStarted","Data":"af38521e1f3eb2665f893506c1a46deb190e71a060f854d967c38b1e5c2dbb46"} Feb 19 22:59:53 crc kubenswrapper[4795]: I0219 22:59:53.580886 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-9677b4c57-7nn9w" Feb 19 22:59:53 crc kubenswrapper[4795]: I0219 22:59:53.597368 4795 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=2.597347517 podStartE2EDuration="2.597347517s" podCreationTimestamp="2026-02-19 22:59:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:59:53.593339034 +0000 UTC m=+5504.785856908" watchObservedRunningTime="2026-02-19 22:59:53.597347517 +0000 UTC m=+5504.789865401" Feb 19 22:59:53 crc kubenswrapper[4795]: I0219 22:59:53.622312 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-9677b4c57-7nn9w" podStartSLOduration=3.622294061 podStartE2EDuration="3.622294061s" podCreationTimestamp="2026-02-19 22:59:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 22:59:53.616708789 +0000 UTC m=+5504.809226643" watchObservedRunningTime="2026-02-19 22:59:53.622294061 +0000 UTC m=+5504.814811925" Feb 19 23:00:00 crc kubenswrapper[4795]: I0219 23:00:00.134889 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525700-psw68"] Feb 19 23:00:00 crc kubenswrapper[4795]: I0219 23:00:00.136820 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525700-psw68" Feb 19 23:00:00 crc kubenswrapper[4795]: I0219 23:00:00.144974 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 19 23:00:00 crc kubenswrapper[4795]: I0219 23:00:00.144974 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 19 23:00:00 crc kubenswrapper[4795]: I0219 23:00:00.151039 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525700-psw68"] Feb 19 23:00:00 crc kubenswrapper[4795]: I0219 23:00:00.241730 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/40381732-f007-4395-b8d1-02b3fc37b091-config-volume\") pod \"collect-profiles-29525700-psw68\" (UID: \"40381732-f007-4395-b8d1-02b3fc37b091\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525700-psw68" Feb 19 23:00:00 crc kubenswrapper[4795]: I0219 23:00:00.241913 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqwqj\" (UniqueName: \"kubernetes.io/projected/40381732-f007-4395-b8d1-02b3fc37b091-kube-api-access-fqwqj\") pod \"collect-profiles-29525700-psw68\" (UID: \"40381732-f007-4395-b8d1-02b3fc37b091\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525700-psw68" Feb 19 23:00:00 crc kubenswrapper[4795]: I0219 23:00:00.242000 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/40381732-f007-4395-b8d1-02b3fc37b091-secret-volume\") pod \"collect-profiles-29525700-psw68\" (UID: \"40381732-f007-4395-b8d1-02b3fc37b091\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29525700-psw68" Feb 19 23:00:00 crc kubenswrapper[4795]: I0219 23:00:00.343444 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/40381732-f007-4395-b8d1-02b3fc37b091-config-volume\") pod \"collect-profiles-29525700-psw68\" (UID: \"40381732-f007-4395-b8d1-02b3fc37b091\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525700-psw68" Feb 19 23:00:00 crc kubenswrapper[4795]: I0219 23:00:00.343806 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqwqj\" (UniqueName: \"kubernetes.io/projected/40381732-f007-4395-b8d1-02b3fc37b091-kube-api-access-fqwqj\") pod \"collect-profiles-29525700-psw68\" (UID: \"40381732-f007-4395-b8d1-02b3fc37b091\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525700-psw68" Feb 19 23:00:00 crc kubenswrapper[4795]: I0219 23:00:00.343988 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/40381732-f007-4395-b8d1-02b3fc37b091-secret-volume\") pod \"collect-profiles-29525700-psw68\" (UID: \"40381732-f007-4395-b8d1-02b3fc37b091\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525700-psw68" Feb 19 23:00:00 crc kubenswrapper[4795]: I0219 23:00:00.344526 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/40381732-f007-4395-b8d1-02b3fc37b091-config-volume\") pod \"collect-profiles-29525700-psw68\" (UID: \"40381732-f007-4395-b8d1-02b3fc37b091\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525700-psw68" Feb 19 23:00:00 crc kubenswrapper[4795]: I0219 23:00:00.353885 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/40381732-f007-4395-b8d1-02b3fc37b091-secret-volume\") pod \"collect-profiles-29525700-psw68\" (UID: \"40381732-f007-4395-b8d1-02b3fc37b091\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525700-psw68" Feb 19 23:00:00 crc kubenswrapper[4795]: I0219 23:00:00.358179 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqwqj\" (UniqueName: \"kubernetes.io/projected/40381732-f007-4395-b8d1-02b3fc37b091-kube-api-access-fqwqj\") pod \"collect-profiles-29525700-psw68\" (UID: \"40381732-f007-4395-b8d1-02b3fc37b091\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525700-psw68" Feb 19 23:00:00 crc kubenswrapper[4795]: I0219 23:00:00.458908 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525700-psw68" Feb 19 23:00:00 crc kubenswrapper[4795]: I0219 23:00:00.921523 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525700-psw68"] Feb 19 23:00:01 crc kubenswrapper[4795]: I0219 23:00:01.231309 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-9677b4c57-7nn9w" Feb 19 23:00:01 crc kubenswrapper[4795]: I0219 23:00:01.303653 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bd886b897-t5785"] Feb 19 23:00:01 crc kubenswrapper[4795]: I0219 23:00:01.303905 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-bd886b897-t5785" podUID="25125096-9221-46cf-9c10-21242922dc39" containerName="dnsmasq-dns" containerID="cri-o://6b2bdae1ece62dc20662e376419faadb82e3b42065ea06a9558f606c33c4b7d2" gracePeriod=10 Feb 19 23:00:01 crc kubenswrapper[4795]: I0219 23:00:01.683637 4795 generic.go:334] "Generic (PLEG): container finished" podID="25125096-9221-46cf-9c10-21242922dc39" 
containerID="6b2bdae1ece62dc20662e376419faadb82e3b42065ea06a9558f606c33c4b7d2" exitCode=0 Feb 19 23:00:01 crc kubenswrapper[4795]: I0219 23:00:01.683718 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bd886b897-t5785" event={"ID":"25125096-9221-46cf-9c10-21242922dc39","Type":"ContainerDied","Data":"6b2bdae1ece62dc20662e376419faadb82e3b42065ea06a9558f606c33c4b7d2"} Feb 19 23:00:01 crc kubenswrapper[4795]: I0219 23:00:01.708953 4795 generic.go:334] "Generic (PLEG): container finished" podID="40381732-f007-4395-b8d1-02b3fc37b091" containerID="f0a7532029d7c277b9d534f1fee42695eb4906e52f264e18f9db0ba49bbbdeb7" exitCode=0 Feb 19 23:00:01 crc kubenswrapper[4795]: I0219 23:00:01.709268 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525700-psw68" event={"ID":"40381732-f007-4395-b8d1-02b3fc37b091","Type":"ContainerDied","Data":"f0a7532029d7c277b9d534f1fee42695eb4906e52f264e18f9db0ba49bbbdeb7"} Feb 19 23:00:01 crc kubenswrapper[4795]: I0219 23:00:01.709293 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525700-psw68" event={"ID":"40381732-f007-4395-b8d1-02b3fc37b091","Type":"ContainerStarted","Data":"8765f9a21122d91b62195625103e3e18985cebc537953bba019e930ab4a19f4c"} Feb 19 23:00:01 crc kubenswrapper[4795]: I0219 23:00:01.919215 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bd886b897-t5785" Feb 19 23:00:02 crc kubenswrapper[4795]: I0219 23:00:02.074695 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25125096-9221-46cf-9c10-21242922dc39-ovsdbserver-nb\") pod \"25125096-9221-46cf-9c10-21242922dc39\" (UID: \"25125096-9221-46cf-9c10-21242922dc39\") " Feb 19 23:00:02 crc kubenswrapper[4795]: I0219 23:00:02.074748 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25125096-9221-46cf-9c10-21242922dc39-dns-svc\") pod \"25125096-9221-46cf-9c10-21242922dc39\" (UID: \"25125096-9221-46cf-9c10-21242922dc39\") " Feb 19 23:00:02 crc kubenswrapper[4795]: I0219 23:00:02.074876 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmqvw\" (UniqueName: \"kubernetes.io/projected/25125096-9221-46cf-9c10-21242922dc39-kube-api-access-jmqvw\") pod \"25125096-9221-46cf-9c10-21242922dc39\" (UID: \"25125096-9221-46cf-9c10-21242922dc39\") " Feb 19 23:00:02 crc kubenswrapper[4795]: I0219 23:00:02.074919 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25125096-9221-46cf-9c10-21242922dc39-ovsdbserver-sb\") pod \"25125096-9221-46cf-9c10-21242922dc39\" (UID: \"25125096-9221-46cf-9c10-21242922dc39\") " Feb 19 23:00:02 crc kubenswrapper[4795]: I0219 23:00:02.074975 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25125096-9221-46cf-9c10-21242922dc39-config\") pod \"25125096-9221-46cf-9c10-21242922dc39\" (UID: \"25125096-9221-46cf-9c10-21242922dc39\") " Feb 19 23:00:02 crc kubenswrapper[4795]: I0219 23:00:02.098315 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/25125096-9221-46cf-9c10-21242922dc39-kube-api-access-jmqvw" (OuterVolumeSpecName: "kube-api-access-jmqvw") pod "25125096-9221-46cf-9c10-21242922dc39" (UID: "25125096-9221-46cf-9c10-21242922dc39"). InnerVolumeSpecName "kube-api-access-jmqvw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:00:02 crc kubenswrapper[4795]: I0219 23:00:02.120593 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25125096-9221-46cf-9c10-21242922dc39-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "25125096-9221-46cf-9c10-21242922dc39" (UID: "25125096-9221-46cf-9c10-21242922dc39"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:00:02 crc kubenswrapper[4795]: I0219 23:00:02.125730 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25125096-9221-46cf-9c10-21242922dc39-config" (OuterVolumeSpecName: "config") pod "25125096-9221-46cf-9c10-21242922dc39" (UID: "25125096-9221-46cf-9c10-21242922dc39"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:00:02 crc kubenswrapper[4795]: I0219 23:00:02.135665 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25125096-9221-46cf-9c10-21242922dc39-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "25125096-9221-46cf-9c10-21242922dc39" (UID: "25125096-9221-46cf-9c10-21242922dc39"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:00:02 crc kubenswrapper[4795]: I0219 23:00:02.144597 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25125096-9221-46cf-9c10-21242922dc39-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "25125096-9221-46cf-9c10-21242922dc39" (UID: "25125096-9221-46cf-9c10-21242922dc39"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:00:02 crc kubenswrapper[4795]: I0219 23:00:02.179599 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25125096-9221-46cf-9c10-21242922dc39-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 23:00:02 crc kubenswrapper[4795]: I0219 23:00:02.179627 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25125096-9221-46cf-9c10-21242922dc39-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 23:00:02 crc kubenswrapper[4795]: I0219 23:00:02.179637 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jmqvw\" (UniqueName: \"kubernetes.io/projected/25125096-9221-46cf-9c10-21242922dc39-kube-api-access-jmqvw\") on node \"crc\" DevicePath \"\"" Feb 19 23:00:02 crc kubenswrapper[4795]: I0219 23:00:02.179647 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25125096-9221-46cf-9c10-21242922dc39-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 23:00:02 crc kubenswrapper[4795]: I0219 23:00:02.179656 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25125096-9221-46cf-9c10-21242922dc39-config\") on node \"crc\" DevicePath \"\"" Feb 19 23:00:02 crc kubenswrapper[4795]: I0219 23:00:02.720736 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bd886b897-t5785" Feb 19 23:00:02 crc kubenswrapper[4795]: I0219 23:00:02.721078 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bd886b897-t5785" event={"ID":"25125096-9221-46cf-9c10-21242922dc39","Type":"ContainerDied","Data":"cb323091e0b46dedb3513673e4bcc0af0bdc32ed871a599a68f0f38aa938ff1f"} Feb 19 23:00:02 crc kubenswrapper[4795]: I0219 23:00:02.721209 4795 scope.go:117] "RemoveContainer" containerID="6b2bdae1ece62dc20662e376419faadb82e3b42065ea06a9558f606c33c4b7d2" Feb 19 23:00:02 crc kubenswrapper[4795]: I0219 23:00:02.759883 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bd886b897-t5785"] Feb 19 23:00:02 crc kubenswrapper[4795]: I0219 23:00:02.770665 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bd886b897-t5785"] Feb 19 23:00:02 crc kubenswrapper[4795]: I0219 23:00:02.772509 4795 scope.go:117] "RemoveContainer" containerID="1208585385cb268d9e64c92ccccb01209cc9892bf210d3382d48ae16872a3290" Feb 19 23:00:02 crc kubenswrapper[4795]: I0219 23:00:02.897444 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 23:00:02 crc kubenswrapper[4795]: I0219 23:00:02.897921 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="f7e72169-9bdf-4b8f-9f8d-ca2a994287da" containerName="nova-scheduler-scheduler" containerID="cri-o://44d4ca3c4aa7ee9ce5e10ecd6a647b96c7526305cf1a590a9d7e1995ec6eaea5" gracePeriod=30 Feb 19 23:00:02 crc kubenswrapper[4795]: I0219 23:00:02.935542 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 19 23:00:02 crc kubenswrapper[4795]: I0219 23:00:02.935917 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="9839fd0b-0161-4772-bda3-ddc2914d7e83" 
containerName="nova-cell0-conductor-conductor" containerID="cri-o://23d5580181fbc629a99f1fcb391f9c4c8a3cc847543b9789039eab2e110e9ef3" gracePeriod=30 Feb 19 23:00:02 crc kubenswrapper[4795]: I0219 23:00:02.950993 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 23:00:02 crc kubenswrapper[4795]: I0219 23:00:02.951262 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="8c7ab995-e9bc-4bc6-913f-6ceb0efa562a" containerName="nova-metadata-log" containerID="cri-o://3e5f71386ccfa754a08ae46bc61bdb80afae6d8cdf0cf2814359a3071d8fb32e" gracePeriod=30 Feb 19 23:00:02 crc kubenswrapper[4795]: I0219 23:00:02.951659 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="8c7ab995-e9bc-4bc6-913f-6ceb0efa562a" containerName="nova-metadata-metadata" containerID="cri-o://a8b613dee124277cee481b45a71a4e627213532ed99ccd578d83629d833ffbc6" gracePeriod=30 Feb 19 23:00:02 crc kubenswrapper[4795]: I0219 23:00:02.961225 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 19 23:00:02 crc kubenswrapper[4795]: I0219 23:00:02.961604 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="30b4399f-d971-4131-8243-3c45e8353cdd" containerName="nova-api-log" containerID="cri-o://9f81160ecc73cbe2584f285d4077d91e6e6e7f01ec7eb44fcd8f220d2ecb27ff" gracePeriod=30 Feb 19 23:00:02 crc kubenswrapper[4795]: I0219 23:00:02.962076 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="30b4399f-d971-4131-8243-3c45e8353cdd" containerName="nova-api-api" containerID="cri-o://6d26cbca7b84b1c79190688e30dc36b02a9a4c86419cdd59664872d8f4e8db9f" gracePeriod=30 Feb 19 23:00:03 crc kubenswrapper[4795]: I0219 23:00:02.997398 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] 
Feb 19 23:00:03 crc kubenswrapper[4795]: I0219 23:00:03.004212 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="53af86cc-b0d0-4ba5-9294-4aaa6cef6c09" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://4cb474362c7411d131561c37cda83e0ffd7519207d531dd518914dff39a66f87" gracePeriod=30 Feb 19 23:00:03 crc kubenswrapper[4795]: I0219 23:00:03.062450 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 19 23:00:03 crc kubenswrapper[4795]: I0219 23:00:03.062628 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="cb0fc8ea-d689-468f-a612-b87c3e63077d" containerName="nova-cell1-conductor-conductor" containerID="cri-o://59c0008df42ff2bf7402d8fd2a516b276c718f55c84fb6bb76f4c9c73d8bd314" gracePeriod=30 Feb 19 23:00:03 crc kubenswrapper[4795]: I0219 23:00:03.170206 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525700-psw68" Feb 19 23:00:03 crc kubenswrapper[4795]: I0219 23:00:03.310234 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/40381732-f007-4395-b8d1-02b3fc37b091-secret-volume\") pod \"40381732-f007-4395-b8d1-02b3fc37b091\" (UID: \"40381732-f007-4395-b8d1-02b3fc37b091\") " Feb 19 23:00:03 crc kubenswrapper[4795]: I0219 23:00:03.310333 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqwqj\" (UniqueName: \"kubernetes.io/projected/40381732-f007-4395-b8d1-02b3fc37b091-kube-api-access-fqwqj\") pod \"40381732-f007-4395-b8d1-02b3fc37b091\" (UID: \"40381732-f007-4395-b8d1-02b3fc37b091\") " Feb 19 23:00:03 crc kubenswrapper[4795]: I0219 23:00:03.310412 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/40381732-f007-4395-b8d1-02b3fc37b091-config-volume\") pod \"40381732-f007-4395-b8d1-02b3fc37b091\" (UID: \"40381732-f007-4395-b8d1-02b3fc37b091\") " Feb 19 23:00:03 crc kubenswrapper[4795]: I0219 23:00:03.311536 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40381732-f007-4395-b8d1-02b3fc37b091-config-volume" (OuterVolumeSpecName: "config-volume") pod "40381732-f007-4395-b8d1-02b3fc37b091" (UID: "40381732-f007-4395-b8d1-02b3fc37b091"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:00:03 crc kubenswrapper[4795]: I0219 23:00:03.319475 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40381732-f007-4395-b8d1-02b3fc37b091-kube-api-access-fqwqj" (OuterVolumeSpecName: "kube-api-access-fqwqj") pod "40381732-f007-4395-b8d1-02b3fc37b091" (UID: "40381732-f007-4395-b8d1-02b3fc37b091"). 
InnerVolumeSpecName "kube-api-access-fqwqj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:00:03 crc kubenswrapper[4795]: I0219 23:00:03.327440 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40381732-f007-4395-b8d1-02b3fc37b091-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "40381732-f007-4395-b8d1-02b3fc37b091" (UID: "40381732-f007-4395-b8d1-02b3fc37b091"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:00:03 crc kubenswrapper[4795]: I0219 23:00:03.412400 4795 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/40381732-f007-4395-b8d1-02b3fc37b091-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 19 23:00:03 crc kubenswrapper[4795]: I0219 23:00:03.412440 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqwqj\" (UniqueName: \"kubernetes.io/projected/40381732-f007-4395-b8d1-02b3fc37b091-kube-api-access-fqwqj\") on node \"crc\" DevicePath \"\"" Feb 19 23:00:03 crc kubenswrapper[4795]: I0219 23:00:03.412454 4795 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/40381732-f007-4395-b8d1-02b3fc37b091-config-volume\") on node \"crc\" DevicePath \"\"" Feb 19 23:00:03 crc kubenswrapper[4795]: I0219 23:00:03.544118 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25125096-9221-46cf-9c10-21242922dc39" path="/var/lib/kubelet/pods/25125096-9221-46cf-9c10-21242922dc39/volumes" Feb 19 23:00:03 crc kubenswrapper[4795]: I0219 23:00:03.738382 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525700-psw68" Feb 19 23:00:03 crc kubenswrapper[4795]: I0219 23:00:03.739453 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525700-psw68" event={"ID":"40381732-f007-4395-b8d1-02b3fc37b091","Type":"ContainerDied","Data":"8765f9a21122d91b62195625103e3e18985cebc537953bba019e930ab4a19f4c"} Feb 19 23:00:03 crc kubenswrapper[4795]: I0219 23:00:03.739508 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8765f9a21122d91b62195625103e3e18985cebc537953bba019e930ab4a19f4c" Feb 19 23:00:03 crc kubenswrapper[4795]: I0219 23:00:03.747730 4795 generic.go:334] "Generic (PLEG): container finished" podID="30b4399f-d971-4131-8243-3c45e8353cdd" containerID="9f81160ecc73cbe2584f285d4077d91e6e6e7f01ec7eb44fcd8f220d2ecb27ff" exitCode=143 Feb 19 23:00:03 crc kubenswrapper[4795]: I0219 23:00:03.747832 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"30b4399f-d971-4131-8243-3c45e8353cdd","Type":"ContainerDied","Data":"9f81160ecc73cbe2584f285d4077d91e6e6e7f01ec7eb44fcd8f220d2ecb27ff"} Feb 19 23:00:03 crc kubenswrapper[4795]: I0219 23:00:03.750547 4795 generic.go:334] "Generic (PLEG): container finished" podID="8c7ab995-e9bc-4bc6-913f-6ceb0efa562a" containerID="3e5f71386ccfa754a08ae46bc61bdb80afae6d8cdf0cf2814359a3071d8fb32e" exitCode=143 Feb 19 23:00:03 crc kubenswrapper[4795]: I0219 23:00:03.750604 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8c7ab995-e9bc-4bc6-913f-6ceb0efa562a","Type":"ContainerDied","Data":"3e5f71386ccfa754a08ae46bc61bdb80afae6d8cdf0cf2814359a3071d8fb32e"} Feb 19 23:00:03 crc kubenswrapper[4795]: I0219 23:00:03.751848 4795 generic.go:334] "Generic (PLEG): container finished" podID="53af86cc-b0d0-4ba5-9294-4aaa6cef6c09" 
containerID="4cb474362c7411d131561c37cda83e0ffd7519207d531dd518914dff39a66f87" exitCode=0 Feb 19 23:00:03 crc kubenswrapper[4795]: I0219 23:00:03.751874 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"53af86cc-b0d0-4ba5-9294-4aaa6cef6c09","Type":"ContainerDied","Data":"4cb474362c7411d131561c37cda83e0ffd7519207d531dd518914dff39a66f87"} Feb 19 23:00:03 crc kubenswrapper[4795]: I0219 23:00:03.751892 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"53af86cc-b0d0-4ba5-9294-4aaa6cef6c09","Type":"ContainerDied","Data":"3cb295d52ddcf76c340ae392323cec591a88fdaec3fd107d8ed54655b0b89d43"} Feb 19 23:00:03 crc kubenswrapper[4795]: I0219 23:00:03.751905 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3cb295d52ddcf76c340ae392323cec591a88fdaec3fd107d8ed54655b0b89d43" Feb 19 23:00:03 crc kubenswrapper[4795]: I0219 23:00:03.766300 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 19 23:00:03 crc kubenswrapper[4795]: I0219 23:00:03.792639 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Feb 19 23:00:03 crc kubenswrapper[4795]: I0219 23:00:03.925012 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7jkp\" (UniqueName: \"kubernetes.io/projected/53af86cc-b0d0-4ba5-9294-4aaa6cef6c09-kube-api-access-f7jkp\") pod \"53af86cc-b0d0-4ba5-9294-4aaa6cef6c09\" (UID: \"53af86cc-b0d0-4ba5-9294-4aaa6cef6c09\") " Feb 19 23:00:03 crc kubenswrapper[4795]: I0219 23:00:03.925086 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53af86cc-b0d0-4ba5-9294-4aaa6cef6c09-config-data\") pod \"53af86cc-b0d0-4ba5-9294-4aaa6cef6c09\" (UID: \"53af86cc-b0d0-4ba5-9294-4aaa6cef6c09\") " Feb 19 23:00:03 crc kubenswrapper[4795]: I0219 23:00:03.925247 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53af86cc-b0d0-4ba5-9294-4aaa6cef6c09-combined-ca-bundle\") pod \"53af86cc-b0d0-4ba5-9294-4aaa6cef6c09\" (UID: \"53af86cc-b0d0-4ba5-9294-4aaa6cef6c09\") " Feb 19 23:00:03 crc kubenswrapper[4795]: I0219 23:00:03.942009 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53af86cc-b0d0-4ba5-9294-4aaa6cef6c09-kube-api-access-f7jkp" (OuterVolumeSpecName: "kube-api-access-f7jkp") pod "53af86cc-b0d0-4ba5-9294-4aaa6cef6c09" (UID: "53af86cc-b0d0-4ba5-9294-4aaa6cef6c09"). InnerVolumeSpecName "kube-api-access-f7jkp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:00:03 crc kubenswrapper[4795]: I0219 23:00:03.962570 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53af86cc-b0d0-4ba5-9294-4aaa6cef6c09-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "53af86cc-b0d0-4ba5-9294-4aaa6cef6c09" (UID: "53af86cc-b0d0-4ba5-9294-4aaa6cef6c09"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:00:03 crc kubenswrapper[4795]: I0219 23:00:03.981181 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53af86cc-b0d0-4ba5-9294-4aaa6cef6c09-config-data" (OuterVolumeSpecName: "config-data") pod "53af86cc-b0d0-4ba5-9294-4aaa6cef6c09" (UID: "53af86cc-b0d0-4ba5-9294-4aaa6cef6c09"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.028627 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53af86cc-b0d0-4ba5-9294-4aaa6cef6c09-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.028748 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53af86cc-b0d0-4ba5-9294-4aaa6cef6c09-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.028817 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f7jkp\" (UniqueName: \"kubernetes.io/projected/53af86cc-b0d0-4ba5-9294-4aaa6cef6c09-kube-api-access-f7jkp\") on node \"crc\" DevicePath \"\"" Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.319235 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525655-2s9rv"] Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 
23:00:04.336517 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525655-2s9rv"] Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.348621 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.439755 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gp8x\" (UniqueName: \"kubernetes.io/projected/f7e72169-9bdf-4b8f-9f8d-ca2a994287da-kube-api-access-7gp8x\") pod \"f7e72169-9bdf-4b8f-9f8d-ca2a994287da\" (UID: \"f7e72169-9bdf-4b8f-9f8d-ca2a994287da\") " Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.439971 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7e72169-9bdf-4b8f-9f8d-ca2a994287da-config-data\") pod \"f7e72169-9bdf-4b8f-9f8d-ca2a994287da\" (UID: \"f7e72169-9bdf-4b8f-9f8d-ca2a994287da\") " Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.440008 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7e72169-9bdf-4b8f-9f8d-ca2a994287da-combined-ca-bundle\") pod \"f7e72169-9bdf-4b8f-9f8d-ca2a994287da\" (UID: \"f7e72169-9bdf-4b8f-9f8d-ca2a994287da\") " Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.450385 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7e72169-9bdf-4b8f-9f8d-ca2a994287da-kube-api-access-7gp8x" (OuterVolumeSpecName: "kube-api-access-7gp8x") pod "f7e72169-9bdf-4b8f-9f8d-ca2a994287da" (UID: "f7e72169-9bdf-4b8f-9f8d-ca2a994287da"). InnerVolumeSpecName "kube-api-access-7gp8x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.466832 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7e72169-9bdf-4b8f-9f8d-ca2a994287da-config-data" (OuterVolumeSpecName: "config-data") pod "f7e72169-9bdf-4b8f-9f8d-ca2a994287da" (UID: "f7e72169-9bdf-4b8f-9f8d-ca2a994287da"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.484907 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7e72169-9bdf-4b8f-9f8d-ca2a994287da-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f7e72169-9bdf-4b8f-9f8d-ca2a994287da" (UID: "f7e72169-9bdf-4b8f-9f8d-ca2a994287da"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.542603 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7e72169-9bdf-4b8f-9f8d-ca2a994287da-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.542642 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7e72169-9bdf-4b8f-9f8d-ca2a994287da-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.542656 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7gp8x\" (UniqueName: \"kubernetes.io/projected/f7e72169-9bdf-4b8f-9f8d-ca2a994287da-kube-api-access-7gp8x\") on node \"crc\" DevicePath \"\"" Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.761964 4795 generic.go:334] "Generic (PLEG): container finished" podID="f7e72169-9bdf-4b8f-9f8d-ca2a994287da" containerID="44d4ca3c4aa7ee9ce5e10ecd6a647b96c7526305cf1a590a9d7e1995ec6eaea5" 
exitCode=0 Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.762013 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f7e72169-9bdf-4b8f-9f8d-ca2a994287da","Type":"ContainerDied","Data":"44d4ca3c4aa7ee9ce5e10ecd6a647b96c7526305cf1a590a9d7e1995ec6eaea5"} Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.762058 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f7e72169-9bdf-4b8f-9f8d-ca2a994287da","Type":"ContainerDied","Data":"914300143cab992575f14bc983146583fd096f4199ceab449ccc8d0f6f188747"} Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.762061 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.762074 4795 scope.go:117] "RemoveContainer" containerID="44d4ca3c4aa7ee9ce5e10ecd6a647b96c7526305cf1a590a9d7e1995ec6eaea5" Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.762070 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.793901 4795 scope.go:117] "RemoveContainer" containerID="44d4ca3c4aa7ee9ce5e10ecd6a647b96c7526305cf1a590a9d7e1995ec6eaea5" Feb 19 23:00:04 crc kubenswrapper[4795]: E0219 23:00:04.794997 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44d4ca3c4aa7ee9ce5e10ecd6a647b96c7526305cf1a590a9d7e1995ec6eaea5\": container with ID starting with 44d4ca3c4aa7ee9ce5e10ecd6a647b96c7526305cf1a590a9d7e1995ec6eaea5 not found: ID does not exist" containerID="44d4ca3c4aa7ee9ce5e10ecd6a647b96c7526305cf1a590a9d7e1995ec6eaea5" Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.795117 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44d4ca3c4aa7ee9ce5e10ecd6a647b96c7526305cf1a590a9d7e1995ec6eaea5"} err="failed to get container status \"44d4ca3c4aa7ee9ce5e10ecd6a647b96c7526305cf1a590a9d7e1995ec6eaea5\": rpc error: code = NotFound desc = could not find container \"44d4ca3c4aa7ee9ce5e10ecd6a647b96c7526305cf1a590a9d7e1995ec6eaea5\": container with ID starting with 44d4ca3c4aa7ee9ce5e10ecd6a647b96c7526305cf1a590a9d7e1995ec6eaea5 not found: ID does not exist" Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.818676 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.847093 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.864740 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.902295 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.914305 4795 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 23:00:04 crc kubenswrapper[4795]: E0219 23:00:04.915077 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40381732-f007-4395-b8d1-02b3fc37b091" containerName="collect-profiles" Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.915098 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="40381732-f007-4395-b8d1-02b3fc37b091" containerName="collect-profiles" Feb 19 23:00:04 crc kubenswrapper[4795]: E0219 23:00:04.915109 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7e72169-9bdf-4b8f-9f8d-ca2a994287da" containerName="nova-scheduler-scheduler" Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.915115 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7e72169-9bdf-4b8f-9f8d-ca2a994287da" containerName="nova-scheduler-scheduler" Feb 19 23:00:04 crc kubenswrapper[4795]: E0219 23:00:04.915130 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25125096-9221-46cf-9c10-21242922dc39" containerName="dnsmasq-dns" Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.915138 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="25125096-9221-46cf-9c10-21242922dc39" containerName="dnsmasq-dns" Feb 19 23:00:04 crc kubenswrapper[4795]: E0219 23:00:04.915151 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25125096-9221-46cf-9c10-21242922dc39" containerName="init" Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.915157 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="25125096-9221-46cf-9c10-21242922dc39" containerName="init" Feb 19 23:00:04 crc kubenswrapper[4795]: E0219 23:00:04.918349 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53af86cc-b0d0-4ba5-9294-4aaa6cef6c09" containerName="nova-cell1-novncproxy-novncproxy" Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.918378 4795 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="53af86cc-b0d0-4ba5-9294-4aaa6cef6c09" containerName="nova-cell1-novncproxy-novncproxy"
Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.918780 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="40381732-f007-4395-b8d1-02b3fc37b091" containerName="collect-profiles"
Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.918922 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7e72169-9bdf-4b8f-9f8d-ca2a994287da" containerName="nova-scheduler-scheduler"
Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.918937 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="53af86cc-b0d0-4ba5-9294-4aaa6cef6c09" containerName="nova-cell1-novncproxy-novncproxy"
Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.918951 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="25125096-9221-46cf-9c10-21242922dc39" containerName="dnsmasq-dns"
Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.919759 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.922003 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.935565 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.948979 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.950152 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.952445 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Feb 19 23:00:04 crc kubenswrapper[4795]: I0219 23:00:04.956528 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.052079 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dpxs\" (UniqueName: \"kubernetes.io/projected/2eb28a2e-eb12-4867-9c26-3416349cc1cc-kube-api-access-4dpxs\") pod \"nova-scheduler-0\" (UID: \"2eb28a2e-eb12-4867-9c26-3416349cc1cc\") " pod="openstack/nova-scheduler-0"
Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.052152 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwh68\" (UniqueName: \"kubernetes.io/projected/a96a8189-2b04-4ce7-908b-3544dc3b7ec4-kube-api-access-xwh68\") pod \"nova-cell1-novncproxy-0\" (UID: \"a96a8189-2b04-4ce7-908b-3544dc3b7ec4\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.052193 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2eb28a2e-eb12-4867-9c26-3416349cc1cc-config-data\") pod \"nova-scheduler-0\" (UID: \"2eb28a2e-eb12-4867-9c26-3416349cc1cc\") " pod="openstack/nova-scheduler-0"
Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.052218 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a96a8189-2b04-4ce7-908b-3544dc3b7ec4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a96a8189-2b04-4ce7-908b-3544dc3b7ec4\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.052235 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a96a8189-2b04-4ce7-908b-3544dc3b7ec4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a96a8189-2b04-4ce7-908b-3544dc3b7ec4\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.052309 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2eb28a2e-eb12-4867-9c26-3416349cc1cc-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2eb28a2e-eb12-4867-9c26-3416349cc1cc\") " pod="openstack/nova-scheduler-0"
Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.153515 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2eb28a2e-eb12-4867-9c26-3416349cc1cc-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2eb28a2e-eb12-4867-9c26-3416349cc1cc\") " pod="openstack/nova-scheduler-0"
Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.153588 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dpxs\" (UniqueName: \"kubernetes.io/projected/2eb28a2e-eb12-4867-9c26-3416349cc1cc-kube-api-access-4dpxs\") pod \"nova-scheduler-0\" (UID: \"2eb28a2e-eb12-4867-9c26-3416349cc1cc\") " pod="openstack/nova-scheduler-0"
Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.153633 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwh68\" (UniqueName: \"kubernetes.io/projected/a96a8189-2b04-4ce7-908b-3544dc3b7ec4-kube-api-access-xwh68\") pod \"nova-cell1-novncproxy-0\" (UID: \"a96a8189-2b04-4ce7-908b-3544dc3b7ec4\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.153674 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2eb28a2e-eb12-4867-9c26-3416349cc1cc-config-data\") pod \"nova-scheduler-0\" (UID: \"2eb28a2e-eb12-4867-9c26-3416349cc1cc\") " pod="openstack/nova-scheduler-0"
Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.153704 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a96a8189-2b04-4ce7-908b-3544dc3b7ec4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a96a8189-2b04-4ce7-908b-3544dc3b7ec4\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.153724 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a96a8189-2b04-4ce7-908b-3544dc3b7ec4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a96a8189-2b04-4ce7-908b-3544dc3b7ec4\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.159682 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a96a8189-2b04-4ce7-908b-3544dc3b7ec4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a96a8189-2b04-4ce7-908b-3544dc3b7ec4\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.161363 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2eb28a2e-eb12-4867-9c26-3416349cc1cc-config-data\") pod \"nova-scheduler-0\" (UID: \"2eb28a2e-eb12-4867-9c26-3416349cc1cc\") " pod="openstack/nova-scheduler-0"
Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.164775 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a96a8189-2b04-4ce7-908b-3544dc3b7ec4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a96a8189-2b04-4ce7-908b-3544dc3b7ec4\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.169592 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dpxs\" (UniqueName: \"kubernetes.io/projected/2eb28a2e-eb12-4867-9c26-3416349cc1cc-kube-api-access-4dpxs\") pod \"nova-scheduler-0\" (UID: \"2eb28a2e-eb12-4867-9c26-3416349cc1cc\") " pod="openstack/nova-scheduler-0"
Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.170447 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwh68\" (UniqueName: \"kubernetes.io/projected/a96a8189-2b04-4ce7-908b-3544dc3b7ec4-kube-api-access-xwh68\") pod \"nova-cell1-novncproxy-0\" (UID: \"a96a8189-2b04-4ce7-908b-3544dc3b7ec4\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.181714 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2eb28a2e-eb12-4867-9c26-3416349cc1cc-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2eb28a2e-eb12-4867-9c26-3416349cc1cc\") " pod="openstack/nova-scheduler-0"
Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.259601 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.325859 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.333521 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.358470 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb0fc8ea-d689-468f-a612-b87c3e63077d-config-data\") pod \"cb0fc8ea-d689-468f-a612-b87c3e63077d\" (UID: \"cb0fc8ea-d689-468f-a612-b87c3e63077d\") "
Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.358619 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9np5\" (UniqueName: \"kubernetes.io/projected/cb0fc8ea-d689-468f-a612-b87c3e63077d-kube-api-access-w9np5\") pod \"cb0fc8ea-d689-468f-a612-b87c3e63077d\" (UID: \"cb0fc8ea-d689-468f-a612-b87c3e63077d\") "
Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.358654 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb0fc8ea-d689-468f-a612-b87c3e63077d-combined-ca-bundle\") pod \"cb0fc8ea-d689-468f-a612-b87c3e63077d\" (UID: \"cb0fc8ea-d689-468f-a612-b87c3e63077d\") "
Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.369039 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb0fc8ea-d689-468f-a612-b87c3e63077d-kube-api-access-w9np5" (OuterVolumeSpecName: "kube-api-access-w9np5") pod "cb0fc8ea-d689-468f-a612-b87c3e63077d" (UID: "cb0fc8ea-d689-468f-a612-b87c3e63077d"). InnerVolumeSpecName "kube-api-access-w9np5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.391592 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb0fc8ea-d689-468f-a612-b87c3e63077d-config-data" (OuterVolumeSpecName: "config-data") pod "cb0fc8ea-d689-468f-a612-b87c3e63077d" (UID: "cb0fc8ea-d689-468f-a612-b87c3e63077d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.405402 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb0fc8ea-d689-468f-a612-b87c3e63077d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cb0fc8ea-d689-468f-a612-b87c3e63077d" (UID: "cb0fc8ea-d689-468f-a612-b87c3e63077d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.461404 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb0fc8ea-d689-468f-a612-b87c3e63077d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.461431 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb0fc8ea-d689-468f-a612-b87c3e63077d-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.461442 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9np5\" (UniqueName: \"kubernetes.io/projected/cb0fc8ea-d689-468f-a612-b87c3e63077d-kube-api-access-w9np5\") on node \"crc\" DevicePath \"\""
Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.526818 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d0ffce6-6c23-4d04-a029-6322d065ff24" path="/var/lib/kubelet/pods/1d0ffce6-6c23-4d04-a029-6322d065ff24/volumes"
Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.527582 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53af86cc-b0d0-4ba5-9294-4aaa6cef6c09" path="/var/lib/kubelet/pods/53af86cc-b0d0-4ba5-9294-4aaa6cef6c09/volumes"
Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.528095 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7e72169-9bdf-4b8f-9f8d-ca2a994287da" path="/var/lib/kubelet/pods/f7e72169-9bdf-4b8f-9f8d-ca2a994287da/volumes"
Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.793077 4795 generic.go:334] "Generic (PLEG): container finished" podID="cb0fc8ea-d689-468f-a612-b87c3e63077d" containerID="59c0008df42ff2bf7402d8fd2a516b276c718f55c84fb6bb76f4c9c73d8bd314" exitCode=0
Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.793612 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"cb0fc8ea-d689-468f-a612-b87c3e63077d","Type":"ContainerDied","Data":"59c0008df42ff2bf7402d8fd2a516b276c718f55c84fb6bb76f4c9c73d8bd314"}
Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.793648 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"cb0fc8ea-d689-468f-a612-b87c3e63077d","Type":"ContainerDied","Data":"0e3ce7752bf63997ed6557e690f1dee58f948c11df612377dc82c2d8fde569b2"}
Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.793670 4795 scope.go:117] "RemoveContainer" containerID="59c0008df42ff2bf7402d8fd2a516b276c718f55c84fb6bb76f4c9c73d8bd314"
Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.793829 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.818652 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.826458 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.844848 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 19 23:00:05 crc kubenswrapper[4795]: E0219 23:00:05.845850 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb0fc8ea-d689-468f-a612-b87c3e63077d" containerName="nova-cell1-conductor-conductor"
Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.852262 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb0fc8ea-d689-468f-a612-b87c3e63077d" containerName="nova-cell1-conductor-conductor"
Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.852674 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb0fc8ea-d689-468f-a612-b87c3e63077d" containerName="nova-cell1-conductor-conductor"
Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.853200 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.853316 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.836487 4795 scope.go:117] "RemoveContainer" containerID="59c0008df42ff2bf7402d8fd2a516b276c718f55c84fb6bb76f4c9c73d8bd314"
Feb 19 23:00:05 crc kubenswrapper[4795]: E0219 23:00:05.854442 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59c0008df42ff2bf7402d8fd2a516b276c718f55c84fb6bb76f4c9c73d8bd314\": container with ID starting with 59c0008df42ff2bf7402d8fd2a516b276c718f55c84fb6bb76f4c9c73d8bd314 not found: ID does not exist" containerID="59c0008df42ff2bf7402d8fd2a516b276c718f55c84fb6bb76f4c9c73d8bd314"
Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.854469 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59c0008df42ff2bf7402d8fd2a516b276c718f55c84fb6bb76f4c9c73d8bd314"} err="failed to get container status \"59c0008df42ff2bf7402d8fd2a516b276c718f55c84fb6bb76f4c9c73d8bd314\": rpc error: code = NotFound desc = could not find container \"59c0008df42ff2bf7402d8fd2a516b276c718f55c84fb6bb76f4c9c73d8bd314\": container with ID starting with 59c0008df42ff2bf7402d8fd2a516b276c718f55c84fb6bb76f4c9c73d8bd314 not found: ID does not exist"
Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.858953 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.861330 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.927563 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.976859 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnl2g\" (UniqueName: \"kubernetes.io/projected/f1a9135a-42ef-42ca-880a-f4f5ffd78a13-kube-api-access-qnl2g\") pod \"nova-cell1-conductor-0\" (UID: \"f1a9135a-42ef-42ca-880a-f4f5ffd78a13\") " pod="openstack/nova-cell1-conductor-0"
Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.976924 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1a9135a-42ef-42ca-880a-f4f5ffd78a13-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f1a9135a-42ef-42ca-880a-f4f5ffd78a13\") " pod="openstack/nova-cell1-conductor-0"
Feb 19 23:00:05 crc kubenswrapper[4795]: I0219 23:00:05.976977 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1a9135a-42ef-42ca-880a-f4f5ffd78a13-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f1a9135a-42ef-42ca-880a-f4f5ffd78a13\") " pod="openstack/nova-cell1-conductor-0"
Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.078453 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1a9135a-42ef-42ca-880a-f4f5ffd78a13-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f1a9135a-42ef-42ca-880a-f4f5ffd78a13\") " pod="openstack/nova-cell1-conductor-0"
Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.078639 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnl2g\" (UniqueName: \"kubernetes.io/projected/f1a9135a-42ef-42ca-880a-f4f5ffd78a13-kube-api-access-qnl2g\") pod \"nova-cell1-conductor-0\" (UID: \"f1a9135a-42ef-42ca-880a-f4f5ffd78a13\") " pod="openstack/nova-cell1-conductor-0"
Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.078708 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1a9135a-42ef-42ca-880a-f4f5ffd78a13-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f1a9135a-42ef-42ca-880a-f4f5ffd78a13\") " pod="openstack/nova-cell1-conductor-0"
Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.086205 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1a9135a-42ef-42ca-880a-f4f5ffd78a13-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f1a9135a-42ef-42ca-880a-f4f5ffd78a13\") " pod="openstack/nova-cell1-conductor-0"
Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.088790 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1a9135a-42ef-42ca-880a-f4f5ffd78a13-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f1a9135a-42ef-42ca-880a-f4f5ffd78a13\") " pod="openstack/nova-cell1-conductor-0"
Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.095343 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnl2g\" (UniqueName: \"kubernetes.io/projected/f1a9135a-42ef-42ca-880a-f4f5ffd78a13-kube-api-access-qnl2g\") pod \"nova-cell1-conductor-0\" (UID: \"f1a9135a-42ef-42ca-880a-f4f5ffd78a13\") " pod="openstack/nova-cell1-conductor-0"
Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.117153 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="8c7ab995-e9bc-4bc6-913f-6ceb0efa562a" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.69:8775/\": read tcp 10.217.0.2:35782->10.217.1.69:8775: read: connection reset by peer"
Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.117250 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="8c7ab995-e9bc-4bc6-913f-6ceb0efa562a" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.69:8775/\": read tcp 10.217.0.2:35772->10.217.1.69:8775: read: connection reset by peer"
Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.126598 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-api-0" podUID="30b4399f-d971-4131-8243-3c45e8353cdd" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.71:8774/\": read tcp 10.217.0.2:47930->10.217.1.71:8774: read: connection reset by peer"
Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.126675 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-api-0" podUID="30b4399f-d971-4131-8243-3c45e8353cdd" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.71:8774/\": read tcp 10.217.0.2:47942->10.217.1.71:8774: read: connection reset by peer"
Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.318140 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.594971 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.678218 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.687673 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4khb7\" (UniqueName: \"kubernetes.io/projected/8c7ab995-e9bc-4bc6-913f-6ceb0efa562a-kube-api-access-4khb7\") pod \"8c7ab995-e9bc-4bc6-913f-6ceb0efa562a\" (UID: \"8c7ab995-e9bc-4bc6-913f-6ceb0efa562a\") "
Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.687727 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c7ab995-e9bc-4bc6-913f-6ceb0efa562a-logs\") pod \"8c7ab995-e9bc-4bc6-913f-6ceb0efa562a\" (UID: \"8c7ab995-e9bc-4bc6-913f-6ceb0efa562a\") "
Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.687763 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c7ab995-e9bc-4bc6-913f-6ceb0efa562a-config-data\") pod \"8c7ab995-e9bc-4bc6-913f-6ceb0efa562a\" (UID: \"8c7ab995-e9bc-4bc6-913f-6ceb0efa562a\") "
Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.687842 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c7ab995-e9bc-4bc6-913f-6ceb0efa562a-combined-ca-bundle\") pod \"8c7ab995-e9bc-4bc6-913f-6ceb0efa562a\" (UID: \"8c7ab995-e9bc-4bc6-913f-6ceb0efa562a\") "
Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.689673 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c7ab995-e9bc-4bc6-913f-6ceb0efa562a-logs" (OuterVolumeSpecName: "logs") pod "8c7ab995-e9bc-4bc6-913f-6ceb0efa562a" (UID: "8c7ab995-e9bc-4bc6-913f-6ceb0efa562a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.695282 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c7ab995-e9bc-4bc6-913f-6ceb0efa562a-kube-api-access-4khb7" (OuterVolumeSpecName: "kube-api-access-4khb7") pod "8c7ab995-e9bc-4bc6-913f-6ceb0efa562a" (UID: "8c7ab995-e9bc-4bc6-913f-6ceb0efa562a"). InnerVolumeSpecName "kube-api-access-4khb7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.721514 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c7ab995-e9bc-4bc6-913f-6ceb0efa562a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8c7ab995-e9bc-4bc6-913f-6ceb0efa562a" (UID: "8c7ab995-e9bc-4bc6-913f-6ceb0efa562a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.757667 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c7ab995-e9bc-4bc6-913f-6ceb0efa562a-config-data" (OuterVolumeSpecName: "config-data") pod "8c7ab995-e9bc-4bc6-913f-6ceb0efa562a" (UID: "8c7ab995-e9bc-4bc6-913f-6ceb0efa562a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.789044 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30b4399f-d971-4131-8243-3c45e8353cdd-config-data\") pod \"30b4399f-d971-4131-8243-3c45e8353cdd\" (UID: \"30b4399f-d971-4131-8243-3c45e8353cdd\") "
Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.789433 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30b4399f-d971-4131-8243-3c45e8353cdd-combined-ca-bundle\") pod \"30b4399f-d971-4131-8243-3c45e8353cdd\" (UID: \"30b4399f-d971-4131-8243-3c45e8353cdd\") "
Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.789597 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qnff6\" (UniqueName: \"kubernetes.io/projected/30b4399f-d971-4131-8243-3c45e8353cdd-kube-api-access-qnff6\") pod \"30b4399f-d971-4131-8243-3c45e8353cdd\" (UID: \"30b4399f-d971-4131-8243-3c45e8353cdd\") "
Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.789742 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30b4399f-d971-4131-8243-3c45e8353cdd-logs\") pod \"30b4399f-d971-4131-8243-3c45e8353cdd\" (UID: \"30b4399f-d971-4131-8243-3c45e8353cdd\") "
Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.790299 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4khb7\" (UniqueName: \"kubernetes.io/projected/8c7ab995-e9bc-4bc6-913f-6ceb0efa562a-kube-api-access-4khb7\") on node \"crc\" DevicePath \"\""
Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.792203 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c7ab995-e9bc-4bc6-913f-6ceb0efa562a-logs\") on node \"crc\" DevicePath \"\""
Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.794158 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c7ab995-e9bc-4bc6-913f-6ceb0efa562a-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.794309 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c7ab995-e9bc-4bc6-913f-6ceb0efa562a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.793273 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30b4399f-d971-4131-8243-3c45e8353cdd-logs" (OuterVolumeSpecName: "logs") pod "30b4399f-d971-4131-8243-3c45e8353cdd" (UID: "30b4399f-d971-4131-8243-3c45e8353cdd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.796294 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30b4399f-d971-4131-8243-3c45e8353cdd-kube-api-access-qnff6" (OuterVolumeSpecName: "kube-api-access-qnff6") pod "30b4399f-d971-4131-8243-3c45e8353cdd" (UID: "30b4399f-d971-4131-8243-3c45e8353cdd"). InnerVolumeSpecName "kube-api-access-qnff6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.809289 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-bd886b897-t5785" podUID="25125096-9221-46cf-9c10-21242922dc39" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.1.73:5353: i/o timeout"
Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.820490 4795 generic.go:334] "Generic (PLEG): container finished" podID="9839fd0b-0161-4772-bda3-ddc2914d7e83" containerID="23d5580181fbc629a99f1fcb391f9c4c8a3cc847543b9789039eab2e110e9ef3" exitCode=0
Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.820543 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"9839fd0b-0161-4772-bda3-ddc2914d7e83","Type":"ContainerDied","Data":"23d5580181fbc629a99f1fcb391f9c4c8a3cc847543b9789039eab2e110e9ef3"}
Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.821752 4795 generic.go:334] "Generic (PLEG): container finished" podID="8c7ab995-e9bc-4bc6-913f-6ceb0efa562a" containerID="a8b613dee124277cee481b45a71a4e627213532ed99ccd578d83629d833ffbc6" exitCode=0
Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.821789 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8c7ab995-e9bc-4bc6-913f-6ceb0efa562a","Type":"ContainerDied","Data":"a8b613dee124277cee481b45a71a4e627213532ed99ccd578d83629d833ffbc6"}
Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.821806 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8c7ab995-e9bc-4bc6-913f-6ceb0efa562a","Type":"ContainerDied","Data":"6c69b56affcd6e3319b891be2d4392924607b5fdbd45a93af3cf16129ab8f1e4"}
Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.821821 4795 scope.go:117] "RemoveContainer" containerID="a8b613dee124277cee481b45a71a4e627213532ed99ccd578d83629d833ffbc6"
Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.821932 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.822421 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30b4399f-d971-4131-8243-3c45e8353cdd-config-data" (OuterVolumeSpecName: "config-data") pod "30b4399f-d971-4131-8243-3c45e8353cdd" (UID: "30b4399f-d971-4131-8243-3c45e8353cdd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.829989 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2eb28a2e-eb12-4867-9c26-3416349cc1cc","Type":"ContainerStarted","Data":"1232c55ba259e6d6fbb743a58698f5bf867025931147d0f23b6c47a0784bdaa2"}
Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.830027 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2eb28a2e-eb12-4867-9c26-3416349cc1cc","Type":"ContainerStarted","Data":"93528f93304bf625c4781fad623cd7f9e1b6953a05d729d20b96627d846cf536"}
Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.830870 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30b4399f-d971-4131-8243-3c45e8353cdd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "30b4399f-d971-4131-8243-3c45e8353cdd" (UID: "30b4399f-d971-4131-8243-3c45e8353cdd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.832769 4795 generic.go:334] "Generic (PLEG): container finished" podID="30b4399f-d971-4131-8243-3c45e8353cdd" containerID="6d26cbca7b84b1c79190688e30dc36b02a9a4c86419cdd59664872d8f4e8db9f" exitCode=0
Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.833118 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.833129 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"30b4399f-d971-4131-8243-3c45e8353cdd","Type":"ContainerDied","Data":"6d26cbca7b84b1c79190688e30dc36b02a9a4c86419cdd59664872d8f4e8db9f"}
Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.833413 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"30b4399f-d971-4131-8243-3c45e8353cdd","Type":"ContainerDied","Data":"d7a27d1be53c0ad8661e73984444d15fa7b73baf2b5131526b22e61e61977caf"}
Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.834596 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a96a8189-2b04-4ce7-908b-3544dc3b7ec4","Type":"ContainerStarted","Data":"4082a6cff09d82d8edbb880cbdfe3a90c43570e40c46f1c1fef04e5446058702"}
Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.834634 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a96a8189-2b04-4ce7-908b-3544dc3b7ec4","Type":"ContainerStarted","Data":"6c4d2b266d4e96be58f9a61951fa9e543bbdb3cc7c893afe0899c3de4f75ba7d"}
Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.843617 4795 scope.go:117] "RemoveContainer" containerID="3e5f71386ccfa754a08ae46bc61bdb80afae6d8cdf0cf2814359a3071d8fb32e"
Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.855841 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.855820261 podStartE2EDuration="2.855820261s" podCreationTimestamp="2026-02-19 23:00:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:00:06.851595334 +0000 UTC m=+5518.044113218" watchObservedRunningTime="2026-02-19 23:00:06.855820261 +0000 UTC m=+5518.048338125"
Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.861479 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.877913 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.8778956730000003 podStartE2EDuration="2.877895673s" podCreationTimestamp="2026-02-19 23:00:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:00:06.87738212 +0000 UTC m=+5518.069899984" watchObservedRunningTime="2026-02-19 23:00:06.877895673 +0000 UTC m=+5518.070413537"
Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.896908 4795 scope.go:117] "RemoveContainer" containerID="a8b613dee124277cee481b45a71a4e627213532ed99ccd578d83629d833ffbc6"
Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.897997 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30b4399f-d971-4131-8243-3c45e8353cdd-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.898034 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qnff6\" (UniqueName: \"kubernetes.io/projected/30b4399f-d971-4131-8243-3c45e8353cdd-kube-api-access-qnff6\") on node \"crc\" DevicePath \"\""
Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.898046 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30b4399f-d971-4131-8243-3c45e8353cdd-logs\") on node \"crc\" DevicePath \"\""
Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.898058 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30b4399f-d971-4131-8243-3c45e8353cdd-config-data\") on node \"crc\" DevicePath \"\""
Feb
19 23:00:06 crc kubenswrapper[4795]: E0219 23:00:06.906786 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8b613dee124277cee481b45a71a4e627213532ed99ccd578d83629d833ffbc6\": container with ID starting with a8b613dee124277cee481b45a71a4e627213532ed99ccd578d83629d833ffbc6 not found: ID does not exist" containerID="a8b613dee124277cee481b45a71a4e627213532ed99ccd578d83629d833ffbc6" Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.906821 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8b613dee124277cee481b45a71a4e627213532ed99ccd578d83629d833ffbc6"} err="failed to get container status \"a8b613dee124277cee481b45a71a4e627213532ed99ccd578d83629d833ffbc6\": rpc error: code = NotFound desc = could not find container \"a8b613dee124277cee481b45a71a4e627213532ed99ccd578d83629d833ffbc6\": container with ID starting with a8b613dee124277cee481b45a71a4e627213532ed99ccd578d83629d833ffbc6 not found: ID does not exist" Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.906845 4795 scope.go:117] "RemoveContainer" containerID="3e5f71386ccfa754a08ae46bc61bdb80afae6d8cdf0cf2814359a3071d8fb32e" Feb 19 23:00:06 crc kubenswrapper[4795]: E0219 23:00:06.914312 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e5f71386ccfa754a08ae46bc61bdb80afae6d8cdf0cf2814359a3071d8fb32e\": container with ID starting with 3e5f71386ccfa754a08ae46bc61bdb80afae6d8cdf0cf2814359a3071d8fb32e not found: ID does not exist" containerID="3e5f71386ccfa754a08ae46bc61bdb80afae6d8cdf0cf2814359a3071d8fb32e" Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.914367 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e5f71386ccfa754a08ae46bc61bdb80afae6d8cdf0cf2814359a3071d8fb32e"} err="failed to get container status 
\"3e5f71386ccfa754a08ae46bc61bdb80afae6d8cdf0cf2814359a3071d8fb32e\": rpc error: code = NotFound desc = could not find container \"3e5f71386ccfa754a08ae46bc61bdb80afae6d8cdf0cf2814359a3071d8fb32e\": container with ID starting with 3e5f71386ccfa754a08ae46bc61bdb80afae6d8cdf0cf2814359a3071d8fb32e not found: ID does not exist" Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.914395 4795 scope.go:117] "RemoveContainer" containerID="6d26cbca7b84b1c79190688e30dc36b02a9a4c86419cdd59664872d8f4e8db9f" Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.922300 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.969082 4795 scope.go:117] "RemoveContainer" containerID="9f81160ecc73cbe2584f285d4077d91e6e6e7f01ec7eb44fcd8f220d2ecb27ff" Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.978101 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 23:00:06 crc kubenswrapper[4795]: I0219 23:00:06.999250 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9839fd0b-0161-4772-bda3-ddc2914d7e83-combined-ca-bundle\") pod \"9839fd0b-0161-4772-bda3-ddc2914d7e83\" (UID: \"9839fd0b-0161-4772-bda3-ddc2914d7e83\") " Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:06.999340 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l54h5\" (UniqueName: \"kubernetes.io/projected/9839fd0b-0161-4772-bda3-ddc2914d7e83-kube-api-access-l54h5\") pod \"9839fd0b-0161-4772-bda3-ddc2914d7e83\" (UID: \"9839fd0b-0161-4772-bda3-ddc2914d7e83\") " Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.004546 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9839fd0b-0161-4772-bda3-ddc2914d7e83-config-data\") pod 
\"9839fd0b-0161-4772-bda3-ddc2914d7e83\" (UID: \"9839fd0b-0161-4772-bda3-ddc2914d7e83\") " Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.008907 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9839fd0b-0161-4772-bda3-ddc2914d7e83-kube-api-access-l54h5" (OuterVolumeSpecName: "kube-api-access-l54h5") pod "9839fd0b-0161-4772-bda3-ddc2914d7e83" (UID: "9839fd0b-0161-4772-bda3-ddc2914d7e83"). InnerVolumeSpecName "kube-api-access-l54h5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.018283 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 19 23:00:07 crc kubenswrapper[4795]: E0219 23:00:07.018669 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9839fd0b-0161-4772-bda3-ddc2914d7e83" containerName="nova-cell0-conductor-conductor" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.018681 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9839fd0b-0161-4772-bda3-ddc2914d7e83" containerName="nova-cell0-conductor-conductor" Feb 19 23:00:07 crc kubenswrapper[4795]: E0219 23:00:07.018698 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30b4399f-d971-4131-8243-3c45e8353cdd" containerName="nova-api-log" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.018706 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="30b4399f-d971-4131-8243-3c45e8353cdd" containerName="nova-api-log" Feb 19 23:00:07 crc kubenswrapper[4795]: E0219 23:00:07.018731 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c7ab995-e9bc-4bc6-913f-6ceb0efa562a" containerName="nova-metadata-log" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.018738 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c7ab995-e9bc-4bc6-913f-6ceb0efa562a" containerName="nova-metadata-log" Feb 19 23:00:07 crc kubenswrapper[4795]: E0219 23:00:07.018754 4795 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c7ab995-e9bc-4bc6-913f-6ceb0efa562a" containerName="nova-metadata-metadata" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.018761 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c7ab995-e9bc-4bc6-913f-6ceb0efa562a" containerName="nova-metadata-metadata" Feb 19 23:00:07 crc kubenswrapper[4795]: E0219 23:00:07.018779 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30b4399f-d971-4131-8243-3c45e8353cdd" containerName="nova-api-api" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.018785 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="30b4399f-d971-4131-8243-3c45e8353cdd" containerName="nova-api-api" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.018937 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c7ab995-e9bc-4bc6-913f-6ceb0efa562a" containerName="nova-metadata-log" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.018950 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="9839fd0b-0161-4772-bda3-ddc2914d7e83" containerName="nova-cell0-conductor-conductor" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.018960 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="30b4399f-d971-4131-8243-3c45e8353cdd" containerName="nova-api-api" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.018975 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c7ab995-e9bc-4bc6-913f-6ceb0efa562a" containerName="nova-metadata-metadata" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.018983 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="30b4399f-d971-4131-8243-3c45e8353cdd" containerName="nova-api-log" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.020551 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.023762 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.033409 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9839fd0b-0161-4772-bda3-ddc2914d7e83-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9839fd0b-0161-4772-bda3-ddc2914d7e83" (UID: "9839fd0b-0161-4772-bda3-ddc2914d7e83"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.036353 4795 scope.go:117] "RemoveContainer" containerID="6d26cbca7b84b1c79190688e30dc36b02a9a4c86419cdd59664872d8f4e8db9f" Feb 19 23:00:07 crc kubenswrapper[4795]: E0219 23:00:07.036933 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d26cbca7b84b1c79190688e30dc36b02a9a4c86419cdd59664872d8f4e8db9f\": container with ID starting with 6d26cbca7b84b1c79190688e30dc36b02a9a4c86419cdd59664872d8f4e8db9f not found: ID does not exist" containerID="6d26cbca7b84b1c79190688e30dc36b02a9a4c86419cdd59664872d8f4e8db9f" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.036959 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d26cbca7b84b1c79190688e30dc36b02a9a4c86419cdd59664872d8f4e8db9f"} err="failed to get container status \"6d26cbca7b84b1c79190688e30dc36b02a9a4c86419cdd59664872d8f4e8db9f\": rpc error: code = NotFound desc = could not find container \"6d26cbca7b84b1c79190688e30dc36b02a9a4c86419cdd59664872d8f4e8db9f\": container with ID starting with 6d26cbca7b84b1c79190688e30dc36b02a9a4c86419cdd59664872d8f4e8db9f not found: ID does not exist" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.036977 4795 scope.go:117] 
"RemoveContainer" containerID="9f81160ecc73cbe2584f285d4077d91e6e6e7f01ec7eb44fcd8f220d2ecb27ff" Feb 19 23:00:07 crc kubenswrapper[4795]: E0219 23:00:07.037443 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f81160ecc73cbe2584f285d4077d91e6e6e7f01ec7eb44fcd8f220d2ecb27ff\": container with ID starting with 9f81160ecc73cbe2584f285d4077d91e6e6e7f01ec7eb44fcd8f220d2ecb27ff not found: ID does not exist" containerID="9f81160ecc73cbe2584f285d4077d91e6e6e7f01ec7eb44fcd8f220d2ecb27ff" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.037466 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f81160ecc73cbe2584f285d4077d91e6e6e7f01ec7eb44fcd8f220d2ecb27ff"} err="failed to get container status \"9f81160ecc73cbe2584f285d4077d91e6e6e7f01ec7eb44fcd8f220d2ecb27ff\": rpc error: code = NotFound desc = could not find container \"9f81160ecc73cbe2584f285d4077d91e6e6e7f01ec7eb44fcd8f220d2ecb27ff\": container with ID starting with 9f81160ecc73cbe2584f285d4077d91e6e6e7f01ec7eb44fcd8f220d2ecb27ff not found: ID does not exist" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.050319 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.060395 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9839fd0b-0161-4772-bda3-ddc2914d7e83-config-data" (OuterVolumeSpecName: "config-data") pod "9839fd0b-0161-4772-bda3-ddc2914d7e83" (UID: "9839fd0b-0161-4772-bda3-ddc2914d7e83"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.067765 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.074777 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.085873 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.087701 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.090206 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.102389 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.106320 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9-config-data\") pod \"nova-metadata-0\" (UID: \"88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9\") " pod="openstack/nova-metadata-0" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.106365 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9-logs\") pod \"nova-metadata-0\" (UID: \"88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9\") " pod="openstack/nova-metadata-0" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.106385 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9-combined-ca-bundle\") pod 
\"nova-metadata-0\" (UID: \"88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9\") " pod="openstack/nova-metadata-0" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.106449 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dq6w\" (UniqueName: \"kubernetes.io/projected/88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9-kube-api-access-2dq6w\") pod \"nova-metadata-0\" (UID: \"88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9\") " pod="openstack/nova-metadata-0" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.106557 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l54h5\" (UniqueName: \"kubernetes.io/projected/9839fd0b-0161-4772-bda3-ddc2914d7e83-kube-api-access-l54h5\") on node \"crc\" DevicePath \"\"" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.106572 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9839fd0b-0161-4772-bda3-ddc2914d7e83-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.106581 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9839fd0b-0161-4772-bda3-ddc2914d7e83-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.141324 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.212078 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9-config-data\") pod \"nova-metadata-0\" (UID: \"88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9\") " pod="openstack/nova-metadata-0" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.212143 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8af13c78-4805-4828-980c-45e1defd94c3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8af13c78-4805-4828-980c-45e1defd94c3\") " pod="openstack/nova-api-0" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.212180 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9-logs\") pod \"nova-metadata-0\" (UID: \"88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9\") " pod="openstack/nova-metadata-0" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.212198 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9\") " pod="openstack/nova-metadata-0" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.212223 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8af13c78-4805-4828-980c-45e1defd94c3-config-data\") pod \"nova-api-0\" (UID: \"8af13c78-4805-4828-980c-45e1defd94c3\") " pod="openstack/nova-api-0" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.212241 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8af13c78-4805-4828-980c-45e1defd94c3-logs\") pod \"nova-api-0\" (UID: \"8af13c78-4805-4828-980c-45e1defd94c3\") " pod="openstack/nova-api-0" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.212326 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dq6w\" (UniqueName: \"kubernetes.io/projected/88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9-kube-api-access-2dq6w\") pod \"nova-metadata-0\" (UID: \"88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9\") " 
pod="openstack/nova-metadata-0" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.212400 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2xch\" (UniqueName: \"kubernetes.io/projected/8af13c78-4805-4828-980c-45e1defd94c3-kube-api-access-r2xch\") pod \"nova-api-0\" (UID: \"8af13c78-4805-4828-980c-45e1defd94c3\") " pod="openstack/nova-api-0" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.212822 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9-logs\") pod \"nova-metadata-0\" (UID: \"88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9\") " pod="openstack/nova-metadata-0" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.218900 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9-config-data\") pod \"nova-metadata-0\" (UID: \"88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9\") " pod="openstack/nova-metadata-0" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.232865 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9\") " pod="openstack/nova-metadata-0" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.255480 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dq6w\" (UniqueName: \"kubernetes.io/projected/88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9-kube-api-access-2dq6w\") pod \"nova-metadata-0\" (UID: \"88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9\") " pod="openstack/nova-metadata-0" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.316179 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2xch\" 
(UniqueName: \"kubernetes.io/projected/8af13c78-4805-4828-980c-45e1defd94c3-kube-api-access-r2xch\") pod \"nova-api-0\" (UID: \"8af13c78-4805-4828-980c-45e1defd94c3\") " pod="openstack/nova-api-0" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.316245 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8af13c78-4805-4828-980c-45e1defd94c3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8af13c78-4805-4828-980c-45e1defd94c3\") " pod="openstack/nova-api-0" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.316272 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8af13c78-4805-4828-980c-45e1defd94c3-config-data\") pod \"nova-api-0\" (UID: \"8af13c78-4805-4828-980c-45e1defd94c3\") " pod="openstack/nova-api-0" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.316288 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8af13c78-4805-4828-980c-45e1defd94c3-logs\") pod \"nova-api-0\" (UID: \"8af13c78-4805-4828-980c-45e1defd94c3\") " pod="openstack/nova-api-0" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.316789 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8af13c78-4805-4828-980c-45e1defd94c3-logs\") pod \"nova-api-0\" (UID: \"8af13c78-4805-4828-980c-45e1defd94c3\") " pod="openstack/nova-api-0" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.324068 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8af13c78-4805-4828-980c-45e1defd94c3-config-data\") pod \"nova-api-0\" (UID: \"8af13c78-4805-4828-980c-45e1defd94c3\") " pod="openstack/nova-api-0" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.338654 4795 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8af13c78-4805-4828-980c-45e1defd94c3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8af13c78-4805-4828-980c-45e1defd94c3\") " pod="openstack/nova-api-0" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.346667 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2xch\" (UniqueName: \"kubernetes.io/projected/8af13c78-4805-4828-980c-45e1defd94c3-kube-api-access-r2xch\") pod \"nova-api-0\" (UID: \"8af13c78-4805-4828-980c-45e1defd94c3\") " pod="openstack/nova-api-0" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.361613 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.409619 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.547207 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30b4399f-d971-4131-8243-3c45e8353cdd" path="/var/lib/kubelet/pods/30b4399f-d971-4131-8243-3c45e8353cdd/volumes" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.547925 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c7ab995-e9bc-4bc6-913f-6ceb0efa562a" path="/var/lib/kubelet/pods/8c7ab995-e9bc-4bc6-913f-6ceb0efa562a/volumes" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.548467 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb0fc8ea-d689-468f-a612-b87c3e63077d" path="/var/lib/kubelet/pods/cb0fc8ea-d689-468f-a612-b87c3e63077d/volumes" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.844348 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" 
event={"ID":"f1a9135a-42ef-42ca-880a-f4f5ffd78a13","Type":"ContainerStarted","Data":"47428fb66fcdd0ecf9a1523a5b7b854d3890395cb0c5cd90338ec63eed539ed1"} Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.844585 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"f1a9135a-42ef-42ca-880a-f4f5ffd78a13","Type":"ContainerStarted","Data":"2adc08c6dd489704d7eddd8052ac3149a11c14f7992162c2dccd22cfce6e5fe5"} Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.846571 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.850824 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"9839fd0b-0161-4772-bda3-ddc2914d7e83","Type":"ContainerDied","Data":"453e3afffd87b59561c01a967d49235af5b257402e1ea25bffbfe37497656af8"} Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.850873 4795 scope.go:117] "RemoveContainer" containerID="23d5580181fbc629a99f1fcb391f9c4c8a3cc847543b9789039eab2e110e9ef3" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.850965 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.870967 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.870944368 podStartE2EDuration="2.870944368s" podCreationTimestamp="2026-02-19 23:00:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:00:07.865842748 +0000 UTC m=+5519.058360612" watchObservedRunningTime="2026-02-19 23:00:07.870944368 +0000 UTC m=+5519.063462232" Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.899980 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 19 23:00:07 crc kubenswrapper[4795]: I0219 23:00:07.916655 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 19 23:00:08 crc kubenswrapper[4795]: I0219 23:00:08.007079 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 23:00:08 crc kubenswrapper[4795]: I0219 23:00:08.021141 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 19 23:00:08 crc kubenswrapper[4795]: I0219 23:00:08.022642 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 19 23:00:08 crc kubenswrapper[4795]: I0219 23:00:08.028551 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 19 23:00:08 crc kubenswrapper[4795]: I0219 23:00:08.031801 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 19 23:00:08 crc kubenswrapper[4795]: I0219 23:00:08.043258 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 19 23:00:08 crc kubenswrapper[4795]: I0219 23:00:08.160392 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mzqk\" (UniqueName: \"kubernetes.io/projected/761a7217-33fa-4d78-8a05-492cbb33f48d-kube-api-access-7mzqk\") pod \"nova-cell0-conductor-0\" (UID: \"761a7217-33fa-4d78-8a05-492cbb33f48d\") " pod="openstack/nova-cell0-conductor-0" Feb 19 23:00:08 crc kubenswrapper[4795]: I0219 23:00:08.160785 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/761a7217-33fa-4d78-8a05-492cbb33f48d-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"761a7217-33fa-4d78-8a05-492cbb33f48d\") " pod="openstack/nova-cell0-conductor-0" Feb 19 23:00:08 crc kubenswrapper[4795]: I0219 23:00:08.160884 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/761a7217-33fa-4d78-8a05-492cbb33f48d-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"761a7217-33fa-4d78-8a05-492cbb33f48d\") " pod="openstack/nova-cell0-conductor-0" Feb 19 23:00:08 crc kubenswrapper[4795]: I0219 23:00:08.264300 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mzqk\" (UniqueName: 
\"kubernetes.io/projected/761a7217-33fa-4d78-8a05-492cbb33f48d-kube-api-access-7mzqk\") pod \"nova-cell0-conductor-0\" (UID: \"761a7217-33fa-4d78-8a05-492cbb33f48d\") " pod="openstack/nova-cell0-conductor-0" Feb 19 23:00:08 crc kubenswrapper[4795]: I0219 23:00:08.264381 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/761a7217-33fa-4d78-8a05-492cbb33f48d-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"761a7217-33fa-4d78-8a05-492cbb33f48d\") " pod="openstack/nova-cell0-conductor-0" Feb 19 23:00:08 crc kubenswrapper[4795]: I0219 23:00:08.264476 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/761a7217-33fa-4d78-8a05-492cbb33f48d-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"761a7217-33fa-4d78-8a05-492cbb33f48d\") " pod="openstack/nova-cell0-conductor-0" Feb 19 23:00:08 crc kubenswrapper[4795]: I0219 23:00:08.273134 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/761a7217-33fa-4d78-8a05-492cbb33f48d-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"761a7217-33fa-4d78-8a05-492cbb33f48d\") " pod="openstack/nova-cell0-conductor-0" Feb 19 23:00:08 crc kubenswrapper[4795]: I0219 23:00:08.277252 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/761a7217-33fa-4d78-8a05-492cbb33f48d-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"761a7217-33fa-4d78-8a05-492cbb33f48d\") " pod="openstack/nova-cell0-conductor-0" Feb 19 23:00:08 crc kubenswrapper[4795]: I0219 23:00:08.286597 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mzqk\" (UniqueName: \"kubernetes.io/projected/761a7217-33fa-4d78-8a05-492cbb33f48d-kube-api-access-7mzqk\") pod \"nova-cell0-conductor-0\" (UID: 
\"761a7217-33fa-4d78-8a05-492cbb33f48d\") " pod="openstack/nova-cell0-conductor-0" Feb 19 23:00:08 crc kubenswrapper[4795]: I0219 23:00:08.377400 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 19 23:00:08 crc kubenswrapper[4795]: I0219 23:00:08.819198 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 19 23:00:08 crc kubenswrapper[4795]: I0219 23:00:08.867936 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9","Type":"ContainerStarted","Data":"fd1e78c709cd03e12813d0773547c4465f0e66a7a0cdec8f42bc3426e77cff3f"} Feb 19 23:00:08 crc kubenswrapper[4795]: I0219 23:00:08.867987 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9","Type":"ContainerStarted","Data":"aa2cfffccfdef39a961925828c69395a846784ec188eaf37747eb10bd8158d2b"} Feb 19 23:00:08 crc kubenswrapper[4795]: I0219 23:00:08.868001 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9","Type":"ContainerStarted","Data":"df7150bd3c379f19d4f935bb8e348093119703bff34dd1ad6781416721057a60"} Feb 19 23:00:08 crc kubenswrapper[4795]: I0219 23:00:08.874120 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8af13c78-4805-4828-980c-45e1defd94c3","Type":"ContainerStarted","Data":"d925bab645dff9712b2a171f2f4482d18d0a0ee4391b990a9a7a705d61ad3def"} Feb 19 23:00:08 crc kubenswrapper[4795]: I0219 23:00:08.874150 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8af13c78-4805-4828-980c-45e1defd94c3","Type":"ContainerStarted","Data":"ad930281e03b5e1a65091cba120348431f474125eb5bc690c29d8ca4f3247c21"} Feb 19 23:00:08 crc kubenswrapper[4795]: I0219 23:00:08.874182 4795 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8af13c78-4805-4828-980c-45e1defd94c3","Type":"ContainerStarted","Data":"7ee07f55b63d65c8ea13f8ca8377dd262c2422aff92a6a2abfe47d6fef72c015"} Feb 19 23:00:08 crc kubenswrapper[4795]: I0219 23:00:08.879070 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"761a7217-33fa-4d78-8a05-492cbb33f48d","Type":"ContainerStarted","Data":"a920c7c53728d52d3ab518fdecf0b6800cb795ab36fabc65145920815940fa68"} Feb 19 23:00:08 crc kubenswrapper[4795]: I0219 23:00:08.895558 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.895539045 podStartE2EDuration="2.895539045s" podCreationTimestamp="2026-02-19 23:00:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:00:08.885097209 +0000 UTC m=+5520.077615073" watchObservedRunningTime="2026-02-19 23:00:08.895539045 +0000 UTC m=+5520.088056919" Feb 19 23:00:08 crc kubenswrapper[4795]: I0219 23:00:08.915522 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.915493983 podStartE2EDuration="2.915493983s" podCreationTimestamp="2026-02-19 23:00:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:00:08.90161435 +0000 UTC m=+5520.094132214" watchObservedRunningTime="2026-02-19 23:00:08.915493983 +0000 UTC m=+5520.108011877" Feb 19 23:00:09 crc kubenswrapper[4795]: I0219 23:00:09.529614 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9839fd0b-0161-4772-bda3-ddc2914d7e83" path="/var/lib/kubelet/pods/9839fd0b-0161-4772-bda3-ddc2914d7e83/volumes" Feb 19 23:00:09 crc kubenswrapper[4795]: I0219 23:00:09.894793 4795 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"761a7217-33fa-4d78-8a05-492cbb33f48d","Type":"ContainerStarted","Data":"2e039fa6d9d7121c3b280b416aad376a4ff38189c9a9b31473131235502fb9b5"} Feb 19 23:00:09 crc kubenswrapper[4795]: I0219 23:00:09.895827 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Feb 19 23:00:09 crc kubenswrapper[4795]: I0219 23:00:09.920399 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.9203789479999998 podStartE2EDuration="2.920378948s" podCreationTimestamp="2026-02-19 23:00:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:00:09.913486022 +0000 UTC m=+5521.106003896" watchObservedRunningTime="2026-02-19 23:00:09.920378948 +0000 UTC m=+5521.112896812" Feb 19 23:00:10 crc kubenswrapper[4795]: I0219 23:00:10.327029 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 19 23:00:10 crc kubenswrapper[4795]: I0219 23:00:10.333973 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 19 23:00:12 crc kubenswrapper[4795]: I0219 23:00:12.362214 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 19 23:00:12 crc kubenswrapper[4795]: I0219 23:00:12.362623 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 19 23:00:15 crc kubenswrapper[4795]: I0219 23:00:15.327673 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 19 23:00:15 crc kubenswrapper[4795]: I0219 23:00:15.334655 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Feb 19 23:00:15 crc 
kubenswrapper[4795]: I0219 23:00:15.345580 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Feb 19 23:00:15 crc kubenswrapper[4795]: I0219 23:00:15.355339 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 19 23:00:15 crc kubenswrapper[4795]: I0219 23:00:15.823704 4795 scope.go:117] "RemoveContainer" containerID="e5dc57de5d860d1b9f4b0c7fa5487f6cf98d60033e436fbbba7b629cc254b689" Feb 19 23:00:15 crc kubenswrapper[4795]: I0219 23:00:15.993709 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Feb 19 23:00:16 crc kubenswrapper[4795]: I0219 23:00:16.045477 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 19 23:00:16 crc kubenswrapper[4795]: I0219 23:00:16.346773 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Feb 19 23:00:17 crc kubenswrapper[4795]: I0219 23:00:17.362423 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 19 23:00:17 crc kubenswrapper[4795]: I0219 23:00:17.362485 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 19 23:00:17 crc kubenswrapper[4795]: I0219 23:00:17.410308 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 19 23:00:17 crc kubenswrapper[4795]: I0219 23:00:17.410363 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 19 23:00:18 crc kubenswrapper[4795]: I0219 23:00:18.407823 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Feb 19 23:00:18 crc kubenswrapper[4795]: I0219 23:00:18.446498 4795 prober.go:107] "Probe failed" probeType="Startup" 
pod="openstack/nova-metadata-0" podUID="88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.83:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 23:00:18 crc kubenswrapper[4795]: I0219 23:00:18.446686 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.83:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 23:00:18 crc kubenswrapper[4795]: I0219 23:00:18.534499 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8af13c78-4805-4828-980c-45e1defd94c3" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.84:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 23:00:18 crc kubenswrapper[4795]: I0219 23:00:18.534513 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8af13c78-4805-4828-980c-45e1defd94c3" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.84:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 23:00:21 crc kubenswrapper[4795]: I0219 23:00:21.595775 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 23:00:21 crc kubenswrapper[4795]: I0219 23:00:21.598411 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 19 23:00:21 crc kubenswrapper[4795]: I0219 23:00:21.600929 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 19 23:00:21 crc kubenswrapper[4795]: I0219 23:00:21.611339 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 23:00:21 crc kubenswrapper[4795]: I0219 23:00:21.750402 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29196316-7b32-486b-a786-10a3912bc206-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"29196316-7b32-486b-a786-10a3912bc206\") " pod="openstack/cinder-scheduler-0" Feb 19 23:00:21 crc kubenswrapper[4795]: I0219 23:00:21.750511 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/29196316-7b32-486b-a786-10a3912bc206-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"29196316-7b32-486b-a786-10a3912bc206\") " pod="openstack/cinder-scheduler-0" Feb 19 23:00:21 crc kubenswrapper[4795]: I0219 23:00:21.750585 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29196316-7b32-486b-a786-10a3912bc206-config-data\") pod \"cinder-scheduler-0\" (UID: \"29196316-7b32-486b-a786-10a3912bc206\") " pod="openstack/cinder-scheduler-0" Feb 19 23:00:21 crc kubenswrapper[4795]: I0219 23:00:21.750675 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pq4nx\" (UniqueName: \"kubernetes.io/projected/29196316-7b32-486b-a786-10a3912bc206-kube-api-access-pq4nx\") pod \"cinder-scheduler-0\" (UID: \"29196316-7b32-486b-a786-10a3912bc206\") " pod="openstack/cinder-scheduler-0" Feb 19 23:00:21 crc kubenswrapper[4795]: 
I0219 23:00:21.750726 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/29196316-7b32-486b-a786-10a3912bc206-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"29196316-7b32-486b-a786-10a3912bc206\") " pod="openstack/cinder-scheduler-0" Feb 19 23:00:21 crc kubenswrapper[4795]: I0219 23:00:21.750765 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29196316-7b32-486b-a786-10a3912bc206-scripts\") pod \"cinder-scheduler-0\" (UID: \"29196316-7b32-486b-a786-10a3912bc206\") " pod="openstack/cinder-scheduler-0" Feb 19 23:00:21 crc kubenswrapper[4795]: I0219 23:00:21.852790 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29196316-7b32-486b-a786-10a3912bc206-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"29196316-7b32-486b-a786-10a3912bc206\") " pod="openstack/cinder-scheduler-0" Feb 19 23:00:21 crc kubenswrapper[4795]: I0219 23:00:21.852848 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/29196316-7b32-486b-a786-10a3912bc206-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"29196316-7b32-486b-a786-10a3912bc206\") " pod="openstack/cinder-scheduler-0" Feb 19 23:00:21 crc kubenswrapper[4795]: I0219 23:00:21.852879 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29196316-7b32-486b-a786-10a3912bc206-config-data\") pod \"cinder-scheduler-0\" (UID: \"29196316-7b32-486b-a786-10a3912bc206\") " pod="openstack/cinder-scheduler-0" Feb 19 23:00:21 crc kubenswrapper[4795]: I0219 23:00:21.852914 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pq4nx\" 
(UniqueName: \"kubernetes.io/projected/29196316-7b32-486b-a786-10a3912bc206-kube-api-access-pq4nx\") pod \"cinder-scheduler-0\" (UID: \"29196316-7b32-486b-a786-10a3912bc206\") " pod="openstack/cinder-scheduler-0" Feb 19 23:00:21 crc kubenswrapper[4795]: I0219 23:00:21.852934 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/29196316-7b32-486b-a786-10a3912bc206-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"29196316-7b32-486b-a786-10a3912bc206\") " pod="openstack/cinder-scheduler-0" Feb 19 23:00:21 crc kubenswrapper[4795]: I0219 23:00:21.852953 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29196316-7b32-486b-a786-10a3912bc206-scripts\") pod \"cinder-scheduler-0\" (UID: \"29196316-7b32-486b-a786-10a3912bc206\") " pod="openstack/cinder-scheduler-0" Feb 19 23:00:21 crc kubenswrapper[4795]: I0219 23:00:21.853506 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/29196316-7b32-486b-a786-10a3912bc206-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"29196316-7b32-486b-a786-10a3912bc206\") " pod="openstack/cinder-scheduler-0" Feb 19 23:00:21 crc kubenswrapper[4795]: I0219 23:00:21.858567 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29196316-7b32-486b-a786-10a3912bc206-scripts\") pod \"cinder-scheduler-0\" (UID: \"29196316-7b32-486b-a786-10a3912bc206\") " pod="openstack/cinder-scheduler-0" Feb 19 23:00:21 crc kubenswrapper[4795]: I0219 23:00:21.859548 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/29196316-7b32-486b-a786-10a3912bc206-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"29196316-7b32-486b-a786-10a3912bc206\") " 
pod="openstack/cinder-scheduler-0" Feb 19 23:00:21 crc kubenswrapper[4795]: I0219 23:00:21.859727 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29196316-7b32-486b-a786-10a3912bc206-config-data\") pod \"cinder-scheduler-0\" (UID: \"29196316-7b32-486b-a786-10a3912bc206\") " pod="openstack/cinder-scheduler-0" Feb 19 23:00:21 crc kubenswrapper[4795]: I0219 23:00:21.861508 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29196316-7b32-486b-a786-10a3912bc206-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"29196316-7b32-486b-a786-10a3912bc206\") " pod="openstack/cinder-scheduler-0" Feb 19 23:00:21 crc kubenswrapper[4795]: I0219 23:00:21.871905 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pq4nx\" (UniqueName: \"kubernetes.io/projected/29196316-7b32-486b-a786-10a3912bc206-kube-api-access-pq4nx\") pod \"cinder-scheduler-0\" (UID: \"29196316-7b32-486b-a786-10a3912bc206\") " pod="openstack/cinder-scheduler-0" Feb 19 23:00:21 crc kubenswrapper[4795]: I0219 23:00:21.967858 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 19 23:00:22 crc kubenswrapper[4795]: I0219 23:00:22.475644 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 23:00:23 crc kubenswrapper[4795]: I0219 23:00:23.071295 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"29196316-7b32-486b-a786-10a3912bc206","Type":"ContainerStarted","Data":"92b489e7d1f8c23a12cd2c3b0232a07d52f20da60c5e7bd656a106e1bbfc254b"} Feb 19 23:00:23 crc kubenswrapper[4795]: I0219 23:00:23.384185 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 19 23:00:23 crc kubenswrapper[4795]: I0219 23:00:23.385116 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="3922ed4f-baf9-481e-af8b-b009440dfea2" containerName="cinder-api-log" containerID="cri-o://aeb4083d1a03c26f7d2a5602b43820b940e6170cf9c38835c9bf0e75e7c3c930" gracePeriod=30 Feb 19 23:00:23 crc kubenswrapper[4795]: I0219 23:00:23.385264 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="3922ed4f-baf9-481e-af8b-b009440dfea2" containerName="cinder-api" containerID="cri-o://776e2209bad65e1284c6f9e29b06d04a385b79aab9583f4b7309a11a22e8add7" gracePeriod=30 Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.081998 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"29196316-7b32-486b-a786-10a3912bc206","Type":"ContainerStarted","Data":"43556d75e94601761021c0d54884cee46c2545d93705f88358280a86bd6460b9"} Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.082347 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"29196316-7b32-486b-a786-10a3912bc206","Type":"ContainerStarted","Data":"d9346b789eea6533075dc5c63a660bdc3b57347c9d222a08726141055f09a5fe"} Feb 19 23:00:24 crc 
kubenswrapper[4795]: I0219 23:00:24.085320 4795 generic.go:334] "Generic (PLEG): container finished" podID="3922ed4f-baf9-481e-af8b-b009440dfea2" containerID="aeb4083d1a03c26f7d2a5602b43820b940e6170cf9c38835c9bf0e75e7c3c930" exitCode=143 Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.085355 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3922ed4f-baf9-481e-af8b-b009440dfea2","Type":"ContainerDied","Data":"aeb4083d1a03c26f7d2a5602b43820b940e6170cf9c38835c9bf0e75e7c3c930"} Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.100524 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"] Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.102426 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.108275 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.125702 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.128724 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.128696538 podStartE2EDuration="3.128696538s" podCreationTimestamp="2026-02-19 23:00:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:00:24.108795251 +0000 UTC m=+5535.301313115" watchObservedRunningTime="2026-02-19 23:00:24.128696538 +0000 UTC m=+5535.321214402" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.195584 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/90e22321-4464-4199-b873-8998821a02ed-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.195661 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/90e22321-4464-4199-b873-8998821a02ed-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.195712 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90e22321-4464-4199-b873-8998821a02ed-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.195737 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/90e22321-4464-4199-b873-8998821a02ed-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.195770 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/90e22321-4464-4199-b873-8998821a02ed-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.195789 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/90e22321-4464-4199-b873-8998821a02ed-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.195810 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/90e22321-4464-4199-b873-8998821a02ed-dev\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.195834 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/90e22321-4464-4199-b873-8998821a02ed-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.196093 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/90e22321-4464-4199-b873-8998821a02ed-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.196154 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/90e22321-4464-4199-b873-8998821a02ed-sys\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.196292 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8xj8\" (UniqueName: 
\"kubernetes.io/projected/90e22321-4464-4199-b873-8998821a02ed-kube-api-access-r8xj8\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.196367 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/90e22321-4464-4199-b873-8998821a02ed-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.196488 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/90e22321-4464-4199-b873-8998821a02ed-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.196632 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/90e22321-4464-4199-b873-8998821a02ed-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.196686 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/90e22321-4464-4199-b873-8998821a02ed-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.196715 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/90e22321-4464-4199-b873-8998821a02ed-run\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.299408 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/90e22321-4464-4199-b873-8998821a02ed-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.299479 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/90e22321-4464-4199-b873-8998821a02ed-run\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.299526 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90e22321-4464-4199-b873-8998821a02ed-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.299576 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/90e22321-4464-4199-b873-8998821a02ed-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.299610 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90e22321-4464-4199-b873-8998821a02ed-scripts\") pod \"cinder-volume-volume1-0\" (UID: 
\"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.299646 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/90e22321-4464-4199-b873-8998821a02ed-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.299681 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/90e22321-4464-4199-b873-8998821a02ed-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.299703 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90e22321-4464-4199-b873-8998821a02ed-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.299725 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/90e22321-4464-4199-b873-8998821a02ed-dev\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.299759 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/90e22321-4464-4199-b873-8998821a02ed-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 
23:00:24.299820 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/90e22321-4464-4199-b873-8998821a02ed-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.299854 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/90e22321-4464-4199-b873-8998821a02ed-sys\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.299899 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8xj8\" (UniqueName: \"kubernetes.io/projected/90e22321-4464-4199-b873-8998821a02ed-kube-api-access-r8xj8\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.299934 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/90e22321-4464-4199-b873-8998821a02ed-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.299984 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/90e22321-4464-4199-b873-8998821a02ed-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.300048 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" 
(UniqueName: \"kubernetes.io/projected/90e22321-4464-4199-b873-8998821a02ed-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.301671 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/90e22321-4464-4199-b873-8998821a02ed-sys\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.301739 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/90e22321-4464-4199-b873-8998821a02ed-dev\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.302042 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/90e22321-4464-4199-b873-8998821a02ed-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.302598 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/90e22321-4464-4199-b873-8998821a02ed-run\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.302629 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/90e22321-4464-4199-b873-8998821a02ed-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 
23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.302643 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/90e22321-4464-4199-b873-8998821a02ed-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.302662 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/90e22321-4464-4199-b873-8998821a02ed-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.302695 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/90e22321-4464-4199-b873-8998821a02ed-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.302715 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/90e22321-4464-4199-b873-8998821a02ed-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.302752 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/90e22321-4464-4199-b873-8998821a02ed-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.307348 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/90e22321-4464-4199-b873-8998821a02ed-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.307727 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/90e22321-4464-4199-b873-8998821a02ed-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.307962 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90e22321-4464-4199-b873-8998821a02ed-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.309264 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90e22321-4464-4199-b873-8998821a02ed-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.314010 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90e22321-4464-4199-b873-8998821a02ed-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.322024 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8xj8\" (UniqueName: \"kubernetes.io/projected/90e22321-4464-4199-b873-8998821a02ed-kube-api-access-r8xj8\") pod \"cinder-volume-volume1-0\" (UID: 
\"90e22321-4464-4199-b873-8998821a02ed\") " pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:24 crc kubenswrapper[4795]: I0219 23:00:24.424925 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.014875 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.017022 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.020994 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.031746 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.112380 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.125192 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/12de80a7-e42b-4768-83d4-0ed7d7490c30-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.125229 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12de80a7-e42b-4768-83d4-0ed7d7490c30-scripts\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.125253 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/12de80a7-e42b-4768-83d4-0ed7d7490c30-config-data\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.125268 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/12de80a7-e42b-4768-83d4-0ed7d7490c30-config-data-custom\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.125303 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/12de80a7-e42b-4768-83d4-0ed7d7490c30-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.125327 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/12de80a7-e42b-4768-83d4-0ed7d7490c30-sys\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.125348 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/12de80a7-e42b-4768-83d4-0ed7d7490c30-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.125382 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/12de80a7-e42b-4768-83d4-0ed7d7490c30-etc-nvme\") pod \"cinder-backup-0\" (UID: 
\"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.125403 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/12de80a7-e42b-4768-83d4-0ed7d7490c30-ceph\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.125423 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/12de80a7-e42b-4768-83d4-0ed7d7490c30-lib-modules\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.125443 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/12de80a7-e42b-4768-83d4-0ed7d7490c30-dev\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.125466 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/12de80a7-e42b-4768-83d4-0ed7d7490c30-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.125480 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12de80a7-e42b-4768-83d4-0ed7d7490c30-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 
23:00:25.125504 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/12de80a7-e42b-4768-83d4-0ed7d7490c30-run\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.125521 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-457zc\" (UniqueName: \"kubernetes.io/projected/12de80a7-e42b-4768-83d4-0ed7d7490c30-kube-api-access-457zc\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.125544 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/12de80a7-e42b-4768-83d4-0ed7d7490c30-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.227554 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/12de80a7-e42b-4768-83d4-0ed7d7490c30-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.227647 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/12de80a7-e42b-4768-83d4-0ed7d7490c30-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.227667 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/12de80a7-e42b-4768-83d4-0ed7d7490c30-scripts\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.227689 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12de80a7-e42b-4768-83d4-0ed7d7490c30-config-data\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.227706 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/12de80a7-e42b-4768-83d4-0ed7d7490c30-config-data-custom\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.227750 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/12de80a7-e42b-4768-83d4-0ed7d7490c30-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.227780 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/12de80a7-e42b-4768-83d4-0ed7d7490c30-sys\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.227801 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/12de80a7-e42b-4768-83d4-0ed7d7490c30-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: 
I0219 23:00:25.227840 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/12de80a7-e42b-4768-83d4-0ed7d7490c30-etc-nvme\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.227863 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/12de80a7-e42b-4768-83d4-0ed7d7490c30-ceph\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.227891 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/12de80a7-e42b-4768-83d4-0ed7d7490c30-lib-modules\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.227924 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/12de80a7-e42b-4768-83d4-0ed7d7490c30-dev\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.227949 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/12de80a7-e42b-4768-83d4-0ed7d7490c30-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.227968 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12de80a7-e42b-4768-83d4-0ed7d7490c30-combined-ca-bundle\") pod 
\"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.227992 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/12de80a7-e42b-4768-83d4-0ed7d7490c30-run\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.228009 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-457zc\" (UniqueName: \"kubernetes.io/projected/12de80a7-e42b-4768-83d4-0ed7d7490c30-kube-api-access-457zc\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.228495 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/12de80a7-e42b-4768-83d4-0ed7d7490c30-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.228656 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/12de80a7-e42b-4768-83d4-0ed7d7490c30-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.228786 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/12de80a7-e42b-4768-83d4-0ed7d7490c30-dev\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.228843 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/12de80a7-e42b-4768-83d4-0ed7d7490c30-lib-modules\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.229081 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/12de80a7-e42b-4768-83d4-0ed7d7490c30-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.229134 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/12de80a7-e42b-4768-83d4-0ed7d7490c30-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.229449 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/12de80a7-e42b-4768-83d4-0ed7d7490c30-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.229478 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/12de80a7-e42b-4768-83d4-0ed7d7490c30-sys\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.229500 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/12de80a7-e42b-4768-83d4-0ed7d7490c30-run\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 
23:00:25.229572 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/12de80a7-e42b-4768-83d4-0ed7d7490c30-etc-nvme\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.236001 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12de80a7-e42b-4768-83d4-0ed7d7490c30-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.236377 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12de80a7-e42b-4768-83d4-0ed7d7490c30-scripts\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.236715 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12de80a7-e42b-4768-83d4-0ed7d7490c30-config-data\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.237213 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/12de80a7-e42b-4768-83d4-0ed7d7490c30-ceph\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.239611 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/12de80a7-e42b-4768-83d4-0ed7d7490c30-config-data-custom\") pod \"cinder-backup-0\" (UID: 
\"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.263188 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-457zc\" (UniqueName: \"kubernetes.io/projected/12de80a7-e42b-4768-83d4-0ed7d7490c30-kube-api-access-457zc\") pod \"cinder-backup-0\" (UID: \"12de80a7-e42b-4768-83d4-0ed7d7490c30\") " pod="openstack/cinder-backup-0" Feb 19 23:00:25 crc kubenswrapper[4795]: I0219 23:00:25.385704 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Feb 19 23:00:26 crc kubenswrapper[4795]: I0219 23:00:26.021215 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-b9kk9"] Feb 19 23:00:26 crc kubenswrapper[4795]: I0219 23:00:26.023966 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b9kk9" Feb 19 23:00:26 crc kubenswrapper[4795]: W0219 23:00:26.041951 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod12de80a7_e42b_4768_83d4_0ed7d7490c30.slice/crio-881e7c70e362f07106c7210992c15ba1c9599247aeb6ae56884f8e6be5569570 WatchSource:0}: Error finding container 881e7c70e362f07106c7210992c15ba1c9599247aeb6ae56884f8e6be5569570: Status 404 returned error can't find the container with id 881e7c70e362f07106c7210992c15ba1c9599247aeb6ae56884f8e6be5569570 Feb 19 23:00:26 crc kubenswrapper[4795]: I0219 23:00:26.060006 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Feb 19 23:00:26 crc kubenswrapper[4795]: I0219 23:00:26.106207 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b9kk9"] Feb 19 23:00:26 crc kubenswrapper[4795]: I0219 23:00:26.116265 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" 
event={"ID":"90e22321-4464-4199-b873-8998821a02ed","Type":"ContainerStarted","Data":"ea7efe1c9e8ab00bffd97273e2e98eee7a5bf08616ffdbda291a2ddac368b77f"} Feb 19 23:00:26 crc kubenswrapper[4795]: I0219 23:00:26.117509 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"12de80a7-e42b-4768-83d4-0ed7d7490c30","Type":"ContainerStarted","Data":"881e7c70e362f07106c7210992c15ba1c9599247aeb6ae56884f8e6be5569570"} Feb 19 23:00:26 crc kubenswrapper[4795]: I0219 23:00:26.147307 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdcxt\" (UniqueName: \"kubernetes.io/projected/020ffbb9-5c2d-4cdd-af08-44c28850c44c-kube-api-access-gdcxt\") pod \"redhat-operators-b9kk9\" (UID: \"020ffbb9-5c2d-4cdd-af08-44c28850c44c\") " pod="openshift-marketplace/redhat-operators-b9kk9" Feb 19 23:00:26 crc kubenswrapper[4795]: I0219 23:00:26.147362 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/020ffbb9-5c2d-4cdd-af08-44c28850c44c-catalog-content\") pod \"redhat-operators-b9kk9\" (UID: \"020ffbb9-5c2d-4cdd-af08-44c28850c44c\") " pod="openshift-marketplace/redhat-operators-b9kk9" Feb 19 23:00:26 crc kubenswrapper[4795]: I0219 23:00:26.147919 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/020ffbb9-5c2d-4cdd-af08-44c28850c44c-utilities\") pod \"redhat-operators-b9kk9\" (UID: \"020ffbb9-5c2d-4cdd-af08-44c28850c44c\") " pod="openshift-marketplace/redhat-operators-b9kk9" Feb 19 23:00:26 crc kubenswrapper[4795]: I0219 23:00:26.249811 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/020ffbb9-5c2d-4cdd-af08-44c28850c44c-utilities\") pod \"redhat-operators-b9kk9\" (UID: 
\"020ffbb9-5c2d-4cdd-af08-44c28850c44c\") " pod="openshift-marketplace/redhat-operators-b9kk9" Feb 19 23:00:26 crc kubenswrapper[4795]: I0219 23:00:26.250411 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdcxt\" (UniqueName: \"kubernetes.io/projected/020ffbb9-5c2d-4cdd-af08-44c28850c44c-kube-api-access-gdcxt\") pod \"redhat-operators-b9kk9\" (UID: \"020ffbb9-5c2d-4cdd-af08-44c28850c44c\") " pod="openshift-marketplace/redhat-operators-b9kk9" Feb 19 23:00:26 crc kubenswrapper[4795]: I0219 23:00:26.250508 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/020ffbb9-5c2d-4cdd-af08-44c28850c44c-catalog-content\") pod \"redhat-operators-b9kk9\" (UID: \"020ffbb9-5c2d-4cdd-af08-44c28850c44c\") " pod="openshift-marketplace/redhat-operators-b9kk9" Feb 19 23:00:26 crc kubenswrapper[4795]: I0219 23:00:26.250428 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/020ffbb9-5c2d-4cdd-af08-44c28850c44c-utilities\") pod \"redhat-operators-b9kk9\" (UID: \"020ffbb9-5c2d-4cdd-af08-44c28850c44c\") " pod="openshift-marketplace/redhat-operators-b9kk9" Feb 19 23:00:26 crc kubenswrapper[4795]: I0219 23:00:26.250979 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/020ffbb9-5c2d-4cdd-af08-44c28850c44c-catalog-content\") pod \"redhat-operators-b9kk9\" (UID: \"020ffbb9-5c2d-4cdd-af08-44c28850c44c\") " pod="openshift-marketplace/redhat-operators-b9kk9" Feb 19 23:00:26 crc kubenswrapper[4795]: I0219 23:00:26.283541 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdcxt\" (UniqueName: \"kubernetes.io/projected/020ffbb9-5c2d-4cdd-af08-44c28850c44c-kube-api-access-gdcxt\") pod \"redhat-operators-b9kk9\" (UID: \"020ffbb9-5c2d-4cdd-af08-44c28850c44c\") " 
pod="openshift-marketplace/redhat-operators-b9kk9" Feb 19 23:00:26 crc kubenswrapper[4795]: I0219 23:00:26.345657 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b9kk9" Feb 19 23:00:26 crc kubenswrapper[4795]: I0219 23:00:26.565814 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="3922ed4f-baf9-481e-af8b-b009440dfea2" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.1.78:8776/healthcheck\": read tcp 10.217.0.2:40670->10.217.1.78:8776: read: connection reset by peer" Feb 19 23:00:26 crc kubenswrapper[4795]: I0219 23:00:26.875459 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b9kk9"] Feb 19 23:00:26 crc kubenswrapper[4795]: I0219 23:00:26.968817 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.066760 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.163572 4795 generic.go:334] "Generic (PLEG): container finished" podID="3922ed4f-baf9-481e-af8b-b009440dfea2" containerID="776e2209bad65e1284c6f9e29b06d04a385b79aab9583f4b7309a11a22e8add7" exitCode=0 Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.163639 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.163665 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3922ed4f-baf9-481e-af8b-b009440dfea2","Type":"ContainerDied","Data":"776e2209bad65e1284c6f9e29b06d04a385b79aab9583f4b7309a11a22e8add7"} Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.165934 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3922ed4f-baf9-481e-af8b-b009440dfea2","Type":"ContainerDied","Data":"595aeff6e650976b06c39b4880224d655a58680d4787becedbb202c16d33fdcb"} Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.166042 4795 scope.go:117] "RemoveContainer" containerID="776e2209bad65e1284c6f9e29b06d04a385b79aab9583f4b7309a11a22e8add7" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.172755 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b9kk9" event={"ID":"020ffbb9-5c2d-4cdd-af08-44c28850c44c","Type":"ContainerStarted","Data":"09bb4d42669be15e952593140fb7fd9a24d0d3ef4f5071cd4b5b653e574c2307"} Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.181407 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"90e22321-4464-4199-b873-8998821a02ed","Type":"ContainerStarted","Data":"457bbb15ecd6c0b5e367713e9b9154013c2947fef4fac9999aba9df45c607652"} Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.181456 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"90e22321-4464-4199-b873-8998821a02ed","Type":"ContainerStarted","Data":"99be7e590f387dd0d638a8fcea28d04ea3256b782be62a9a947c6792d92b775d"} Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.182743 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/3922ed4f-baf9-481e-af8b-b009440dfea2-config-data-custom\") pod \"3922ed4f-baf9-481e-af8b-b009440dfea2\" (UID: \"3922ed4f-baf9-481e-af8b-b009440dfea2\") " Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.182859 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3922ed4f-baf9-481e-af8b-b009440dfea2-combined-ca-bundle\") pod \"3922ed4f-baf9-481e-af8b-b009440dfea2\" (UID: \"3922ed4f-baf9-481e-af8b-b009440dfea2\") " Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.182911 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thkh7\" (UniqueName: \"kubernetes.io/projected/3922ed4f-baf9-481e-af8b-b009440dfea2-kube-api-access-thkh7\") pod \"3922ed4f-baf9-481e-af8b-b009440dfea2\" (UID: \"3922ed4f-baf9-481e-af8b-b009440dfea2\") " Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.182955 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3922ed4f-baf9-481e-af8b-b009440dfea2-config-data\") pod \"3922ed4f-baf9-481e-af8b-b009440dfea2\" (UID: \"3922ed4f-baf9-481e-af8b-b009440dfea2\") " Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.182978 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3922ed4f-baf9-481e-af8b-b009440dfea2-scripts\") pod \"3922ed4f-baf9-481e-af8b-b009440dfea2\" (UID: \"3922ed4f-baf9-481e-af8b-b009440dfea2\") " Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.183039 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3922ed4f-baf9-481e-af8b-b009440dfea2-logs\") pod \"3922ed4f-baf9-481e-af8b-b009440dfea2\" (UID: \"3922ed4f-baf9-481e-af8b-b009440dfea2\") " Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.183058 4795 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3922ed4f-baf9-481e-af8b-b009440dfea2-etc-machine-id\") pod \"3922ed4f-baf9-481e-af8b-b009440dfea2\" (UID: \"3922ed4f-baf9-481e-af8b-b009440dfea2\") " Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.183457 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3922ed4f-baf9-481e-af8b-b009440dfea2-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "3922ed4f-baf9-481e-af8b-b009440dfea2" (UID: "3922ed4f-baf9-481e-af8b-b009440dfea2"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.188424 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3922ed4f-baf9-481e-af8b-b009440dfea2-logs" (OuterVolumeSpecName: "logs") pod "3922ed4f-baf9-481e-af8b-b009440dfea2" (UID: "3922ed4f-baf9-481e-af8b-b009440dfea2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.204305 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3922ed4f-baf9-481e-af8b-b009440dfea2-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3922ed4f-baf9-481e-af8b-b009440dfea2" (UID: "3922ed4f-baf9-481e-af8b-b009440dfea2"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.208440 4795 scope.go:117] "RemoveContainer" containerID="aeb4083d1a03c26f7d2a5602b43820b940e6170cf9c38835c9bf0e75e7c3c930" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.216845 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3922ed4f-baf9-481e-af8b-b009440dfea2-kube-api-access-thkh7" (OuterVolumeSpecName: "kube-api-access-thkh7") pod "3922ed4f-baf9-481e-af8b-b009440dfea2" (UID: "3922ed4f-baf9-481e-af8b-b009440dfea2"). InnerVolumeSpecName "kube-api-access-thkh7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.217649 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3922ed4f-baf9-481e-af8b-b009440dfea2-scripts" (OuterVolumeSpecName: "scripts") pod "3922ed4f-baf9-481e-af8b-b009440dfea2" (UID: "3922ed4f-baf9-481e-af8b-b009440dfea2"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.229386 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=2.496400299 podStartE2EDuration="3.229355514s" podCreationTimestamp="2026-02-19 23:00:24 +0000 UTC" firstStartedPulling="2026-02-19 23:00:25.121030504 +0000 UTC m=+5536.313548368" lastFinishedPulling="2026-02-19 23:00:25.853985719 +0000 UTC m=+5537.046503583" observedRunningTime="2026-02-19 23:00:27.216557799 +0000 UTC m=+5538.409075683" watchObservedRunningTime="2026-02-19 23:00:27.229355514 +0000 UTC m=+5538.421873368" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.242139 4795 scope.go:117] "RemoveContainer" containerID="776e2209bad65e1284c6f9e29b06d04a385b79aab9583f4b7309a11a22e8add7" Feb 19 23:00:27 crc kubenswrapper[4795]: E0219 23:00:27.243879 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"776e2209bad65e1284c6f9e29b06d04a385b79aab9583f4b7309a11a22e8add7\": container with ID starting with 776e2209bad65e1284c6f9e29b06d04a385b79aab9583f4b7309a11a22e8add7 not found: ID does not exist" containerID="776e2209bad65e1284c6f9e29b06d04a385b79aab9583f4b7309a11a22e8add7" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.243922 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"776e2209bad65e1284c6f9e29b06d04a385b79aab9583f4b7309a11a22e8add7"} err="failed to get container status \"776e2209bad65e1284c6f9e29b06d04a385b79aab9583f4b7309a11a22e8add7\": rpc error: code = NotFound desc = could not find container \"776e2209bad65e1284c6f9e29b06d04a385b79aab9583f4b7309a11a22e8add7\": container with ID starting with 776e2209bad65e1284c6f9e29b06d04a385b79aab9583f4b7309a11a22e8add7 not found: ID does not exist" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.243951 4795 scope.go:117] 
"RemoveContainer" containerID="aeb4083d1a03c26f7d2a5602b43820b940e6170cf9c38835c9bf0e75e7c3c930" Feb 19 23:00:27 crc kubenswrapper[4795]: E0219 23:00:27.244646 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aeb4083d1a03c26f7d2a5602b43820b940e6170cf9c38835c9bf0e75e7c3c930\": container with ID starting with aeb4083d1a03c26f7d2a5602b43820b940e6170cf9c38835c9bf0e75e7c3c930 not found: ID does not exist" containerID="aeb4083d1a03c26f7d2a5602b43820b940e6170cf9c38835c9bf0e75e7c3c930" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.244672 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aeb4083d1a03c26f7d2a5602b43820b940e6170cf9c38835c9bf0e75e7c3c930"} err="failed to get container status \"aeb4083d1a03c26f7d2a5602b43820b940e6170cf9c38835c9bf0e75e7c3c930\": rpc error: code = NotFound desc = could not find container \"aeb4083d1a03c26f7d2a5602b43820b940e6170cf9c38835c9bf0e75e7c3c930\": container with ID starting with aeb4083d1a03c26f7d2a5602b43820b940e6170cf9c38835c9bf0e75e7c3c930 not found: ID does not exist" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.287777 4795 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3922ed4f-baf9-481e-af8b-b009440dfea2-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.289493 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thkh7\" (UniqueName: \"kubernetes.io/projected/3922ed4f-baf9-481e-af8b-b009440dfea2-kube-api-access-thkh7\") on node \"crc\" DevicePath \"\"" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.289516 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3922ed4f-baf9-481e-af8b-b009440dfea2-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:00:27 crc 
kubenswrapper[4795]: I0219 23:00:27.289530 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3922ed4f-baf9-481e-af8b-b009440dfea2-logs\") on node \"crc\" DevicePath \"\"" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.289545 4795 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3922ed4f-baf9-481e-af8b-b009440dfea2-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.288548 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3922ed4f-baf9-481e-af8b-b009440dfea2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3922ed4f-baf9-481e-af8b-b009440dfea2" (UID: "3922ed4f-baf9-481e-af8b-b009440dfea2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.340858 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3922ed4f-baf9-481e-af8b-b009440dfea2-config-data" (OuterVolumeSpecName: "config-data") pod "3922ed4f-baf9-481e-af8b-b009440dfea2" (UID: "3922ed4f-baf9-481e-af8b-b009440dfea2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.368704 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.371942 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.386747 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.392700 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3922ed4f-baf9-481e-af8b-b009440dfea2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.392732 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3922ed4f-baf9-481e-af8b-b009440dfea2-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.426536 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.427549 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.432996 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.439510 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.580345 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.611607 4795 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/cinder-api-0"] Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.627574 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 19 23:00:27 crc kubenswrapper[4795]: E0219 23:00:27.628000 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3922ed4f-baf9-481e-af8b-b009440dfea2" containerName="cinder-api-log" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.628013 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="3922ed4f-baf9-481e-af8b-b009440dfea2" containerName="cinder-api-log" Feb 19 23:00:27 crc kubenswrapper[4795]: E0219 23:00:27.628027 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3922ed4f-baf9-481e-af8b-b009440dfea2" containerName="cinder-api" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.628033 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="3922ed4f-baf9-481e-af8b-b009440dfea2" containerName="cinder-api" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.628272 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="3922ed4f-baf9-481e-af8b-b009440dfea2" containerName="cinder-api" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.628290 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="3922ed4f-baf9-481e-af8b-b009440dfea2" containerName="cinder-api-log" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.629261 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.636891 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.639933 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.711038 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/daeb9555-6d76-45ca-b3da-b6dd91c33e00-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"daeb9555-6d76-45ca-b3da-b6dd91c33e00\") " pod="openstack/cinder-api-0" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.711136 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzgqt\" (UniqueName: \"kubernetes.io/projected/daeb9555-6d76-45ca-b3da-b6dd91c33e00-kube-api-access-dzgqt\") pod \"cinder-api-0\" (UID: \"daeb9555-6d76-45ca-b3da-b6dd91c33e00\") " pod="openstack/cinder-api-0" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.711355 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/daeb9555-6d76-45ca-b3da-b6dd91c33e00-etc-machine-id\") pod \"cinder-api-0\" (UID: \"daeb9555-6d76-45ca-b3da-b6dd91c33e00\") " pod="openstack/cinder-api-0" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.711445 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/daeb9555-6d76-45ca-b3da-b6dd91c33e00-logs\") pod \"cinder-api-0\" (UID: \"daeb9555-6d76-45ca-b3da-b6dd91c33e00\") " pod="openstack/cinder-api-0" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.711470 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/daeb9555-6d76-45ca-b3da-b6dd91c33e00-config-data\") pod \"cinder-api-0\" (UID: \"daeb9555-6d76-45ca-b3da-b6dd91c33e00\") " pod="openstack/cinder-api-0" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.711525 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/daeb9555-6d76-45ca-b3da-b6dd91c33e00-config-data-custom\") pod \"cinder-api-0\" (UID: \"daeb9555-6d76-45ca-b3da-b6dd91c33e00\") " pod="openstack/cinder-api-0" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.711603 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/daeb9555-6d76-45ca-b3da-b6dd91c33e00-scripts\") pod \"cinder-api-0\" (UID: \"daeb9555-6d76-45ca-b3da-b6dd91c33e00\") " pod="openstack/cinder-api-0" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.813258 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/daeb9555-6d76-45ca-b3da-b6dd91c33e00-scripts\") pod \"cinder-api-0\" (UID: \"daeb9555-6d76-45ca-b3da-b6dd91c33e00\") " pod="openstack/cinder-api-0" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.813343 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/daeb9555-6d76-45ca-b3da-b6dd91c33e00-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"daeb9555-6d76-45ca-b3da-b6dd91c33e00\") " pod="openstack/cinder-api-0" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.813393 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzgqt\" (UniqueName: \"kubernetes.io/projected/daeb9555-6d76-45ca-b3da-b6dd91c33e00-kube-api-access-dzgqt\") pod 
\"cinder-api-0\" (UID: \"daeb9555-6d76-45ca-b3da-b6dd91c33e00\") " pod="openstack/cinder-api-0" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.813447 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/daeb9555-6d76-45ca-b3da-b6dd91c33e00-etc-machine-id\") pod \"cinder-api-0\" (UID: \"daeb9555-6d76-45ca-b3da-b6dd91c33e00\") " pod="openstack/cinder-api-0" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.813470 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/daeb9555-6d76-45ca-b3da-b6dd91c33e00-logs\") pod \"cinder-api-0\" (UID: \"daeb9555-6d76-45ca-b3da-b6dd91c33e00\") " pod="openstack/cinder-api-0" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.813488 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/daeb9555-6d76-45ca-b3da-b6dd91c33e00-config-data\") pod \"cinder-api-0\" (UID: \"daeb9555-6d76-45ca-b3da-b6dd91c33e00\") " pod="openstack/cinder-api-0" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.813517 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/daeb9555-6d76-45ca-b3da-b6dd91c33e00-config-data-custom\") pod \"cinder-api-0\" (UID: \"daeb9555-6d76-45ca-b3da-b6dd91c33e00\") " pod="openstack/cinder-api-0" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.813927 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/daeb9555-6d76-45ca-b3da-b6dd91c33e00-etc-machine-id\") pod \"cinder-api-0\" (UID: \"daeb9555-6d76-45ca-b3da-b6dd91c33e00\") " pod="openstack/cinder-api-0" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.814195 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/daeb9555-6d76-45ca-b3da-b6dd91c33e00-logs\") pod \"cinder-api-0\" (UID: \"daeb9555-6d76-45ca-b3da-b6dd91c33e00\") " pod="openstack/cinder-api-0" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.819737 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/daeb9555-6d76-45ca-b3da-b6dd91c33e00-scripts\") pod \"cinder-api-0\" (UID: \"daeb9555-6d76-45ca-b3da-b6dd91c33e00\") " pod="openstack/cinder-api-0" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.824755 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/daeb9555-6d76-45ca-b3da-b6dd91c33e00-config-data-custom\") pod \"cinder-api-0\" (UID: \"daeb9555-6d76-45ca-b3da-b6dd91c33e00\") " pod="openstack/cinder-api-0" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.826039 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/daeb9555-6d76-45ca-b3da-b6dd91c33e00-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"daeb9555-6d76-45ca-b3da-b6dd91c33e00\") " pod="openstack/cinder-api-0" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.826846 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/daeb9555-6d76-45ca-b3da-b6dd91c33e00-config-data\") pod \"cinder-api-0\" (UID: \"daeb9555-6d76-45ca-b3da-b6dd91c33e00\") " pod="openstack/cinder-api-0" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.852277 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzgqt\" (UniqueName: \"kubernetes.io/projected/daeb9555-6d76-45ca-b3da-b6dd91c33e00-kube-api-access-dzgqt\") pod \"cinder-api-0\" (UID: \"daeb9555-6d76-45ca-b3da-b6dd91c33e00\") " pod="openstack/cinder-api-0" Feb 19 23:00:27 crc kubenswrapper[4795]: I0219 23:00:27.968398 4795 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 19 23:00:28 crc kubenswrapper[4795]: I0219 23:00:28.222210 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"12de80a7-e42b-4768-83d4-0ed7d7490c30","Type":"ContainerStarted","Data":"1d6f3a131c3730f147c75e6cf092cfe1b2fef75915a932a4a830767b2806bbb3"} Feb 19 23:00:28 crc kubenswrapper[4795]: I0219 23:00:28.222599 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"12de80a7-e42b-4768-83d4-0ed7d7490c30","Type":"ContainerStarted","Data":"539f48c9031cb8e3d4c10faf0cffcb2a7d4258bbee19b5ff93167b6f1554ee43"} Feb 19 23:00:28 crc kubenswrapper[4795]: I0219 23:00:28.232001 4795 generic.go:334] "Generic (PLEG): container finished" podID="020ffbb9-5c2d-4cdd-af08-44c28850c44c" containerID="3604e5b9a639bd35d71757f720a9b88617fedfa662eef3b2876130119bd577e4" exitCode=0 Feb 19 23:00:28 crc kubenswrapper[4795]: I0219 23:00:28.233019 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b9kk9" event={"ID":"020ffbb9-5c2d-4cdd-af08-44c28850c44c","Type":"ContainerDied","Data":"3604e5b9a639bd35d71757f720a9b88617fedfa662eef3b2876130119bd577e4"} Feb 19 23:00:28 crc kubenswrapper[4795]: I0219 23:00:28.235702 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 19 23:00:28 crc kubenswrapper[4795]: I0219 23:00:28.258494 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 19 23:00:28 crc kubenswrapper[4795]: I0219 23:00:28.260801 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 19 23:00:28 crc kubenswrapper[4795]: I0219 23:00:28.272710 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=3.29915732 
podStartE2EDuration="4.272693918s" podCreationTimestamp="2026-02-19 23:00:24 +0000 UTC" firstStartedPulling="2026-02-19 23:00:26.049065774 +0000 UTC m=+5537.241583638" lastFinishedPulling="2026-02-19 23:00:27.022602372 +0000 UTC m=+5538.215120236" observedRunningTime="2026-02-19 23:00:28.257558743 +0000 UTC m=+5539.450076607" watchObservedRunningTime="2026-02-19 23:00:28.272693918 +0000 UTC m=+5539.465211782" Feb 19 23:00:28 crc kubenswrapper[4795]: I0219 23:00:28.529305 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 19 23:00:28 crc kubenswrapper[4795]: I0219 23:00:28.809764 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4vwwf"] Feb 19 23:00:28 crc kubenswrapper[4795]: I0219 23:00:28.815242 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4vwwf" Feb 19 23:00:28 crc kubenswrapper[4795]: I0219 23:00:28.818523 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4vwwf"] Feb 19 23:00:28 crc kubenswrapper[4795]: I0219 23:00:28.950656 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa08f457-a2bb-40ae-afe4-647920d80f5d-catalog-content\") pod \"certified-operators-4vwwf\" (UID: \"aa08f457-a2bb-40ae-afe4-647920d80f5d\") " pod="openshift-marketplace/certified-operators-4vwwf" Feb 19 23:00:28 crc kubenswrapper[4795]: I0219 23:00:28.950779 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa08f457-a2bb-40ae-afe4-647920d80f5d-utilities\") pod \"certified-operators-4vwwf\" (UID: \"aa08f457-a2bb-40ae-afe4-647920d80f5d\") " pod="openshift-marketplace/certified-operators-4vwwf" Feb 19 23:00:28 crc kubenswrapper[4795]: I0219 23:00:28.950832 4795 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvrs6\" (UniqueName: \"kubernetes.io/projected/aa08f457-a2bb-40ae-afe4-647920d80f5d-kube-api-access-zvrs6\") pod \"certified-operators-4vwwf\" (UID: \"aa08f457-a2bb-40ae-afe4-647920d80f5d\") " pod="openshift-marketplace/certified-operators-4vwwf" Feb 19 23:00:29 crc kubenswrapper[4795]: I0219 23:00:29.053250 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa08f457-a2bb-40ae-afe4-647920d80f5d-catalog-content\") pod \"certified-operators-4vwwf\" (UID: \"aa08f457-a2bb-40ae-afe4-647920d80f5d\") " pod="openshift-marketplace/certified-operators-4vwwf" Feb 19 23:00:29 crc kubenswrapper[4795]: I0219 23:00:29.053378 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa08f457-a2bb-40ae-afe4-647920d80f5d-utilities\") pod \"certified-operators-4vwwf\" (UID: \"aa08f457-a2bb-40ae-afe4-647920d80f5d\") " pod="openshift-marketplace/certified-operators-4vwwf" Feb 19 23:00:29 crc kubenswrapper[4795]: I0219 23:00:29.053429 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvrs6\" (UniqueName: \"kubernetes.io/projected/aa08f457-a2bb-40ae-afe4-647920d80f5d-kube-api-access-zvrs6\") pod \"certified-operators-4vwwf\" (UID: \"aa08f457-a2bb-40ae-afe4-647920d80f5d\") " pod="openshift-marketplace/certified-operators-4vwwf" Feb 19 23:00:29 crc kubenswrapper[4795]: I0219 23:00:29.054445 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa08f457-a2bb-40ae-afe4-647920d80f5d-catalog-content\") pod \"certified-operators-4vwwf\" (UID: \"aa08f457-a2bb-40ae-afe4-647920d80f5d\") " pod="openshift-marketplace/certified-operators-4vwwf" Feb 19 23:00:29 crc kubenswrapper[4795]: I0219 
23:00:29.054743 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa08f457-a2bb-40ae-afe4-647920d80f5d-utilities\") pod \"certified-operators-4vwwf\" (UID: \"aa08f457-a2bb-40ae-afe4-647920d80f5d\") " pod="openshift-marketplace/certified-operators-4vwwf" Feb 19 23:00:29 crc kubenswrapper[4795]: I0219 23:00:29.077060 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvrs6\" (UniqueName: \"kubernetes.io/projected/aa08f457-a2bb-40ae-afe4-647920d80f5d-kube-api-access-zvrs6\") pod \"certified-operators-4vwwf\" (UID: \"aa08f457-a2bb-40ae-afe4-647920d80f5d\") " pod="openshift-marketplace/certified-operators-4vwwf" Feb 19 23:00:29 crc kubenswrapper[4795]: I0219 23:00:29.147474 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4vwwf" Feb 19 23:00:29 crc kubenswrapper[4795]: I0219 23:00:29.277200 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"daeb9555-6d76-45ca-b3da-b6dd91c33e00","Type":"ContainerStarted","Data":"e8e06b8be2864d518c9bf61d6a21ee38c0b47470da224147a202924170bba205"} Feb 19 23:00:29 crc kubenswrapper[4795]: I0219 23:00:29.426620 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Feb 19 23:00:29 crc kubenswrapper[4795]: I0219 23:00:29.546365 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3922ed4f-baf9-481e-af8b-b009440dfea2" path="/var/lib/kubelet/pods/3922ed4f-baf9-481e-af8b-b009440dfea2/volumes" Feb 19 23:00:29 crc kubenswrapper[4795]: I0219 23:00:29.777777 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4vwwf"] Feb 19 23:00:30 crc kubenswrapper[4795]: I0219 23:00:30.286631 4795 generic.go:334] "Generic (PLEG): container finished" podID="aa08f457-a2bb-40ae-afe4-647920d80f5d" 
containerID="d94bf9a18a9e2fe63761c24c1da9448bde51def8159c0a24a7add8d212314292" exitCode=0 Feb 19 23:00:30 crc kubenswrapper[4795]: I0219 23:00:30.286738 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4vwwf" event={"ID":"aa08f457-a2bb-40ae-afe4-647920d80f5d","Type":"ContainerDied","Data":"d94bf9a18a9e2fe63761c24c1da9448bde51def8159c0a24a7add8d212314292"} Feb 19 23:00:30 crc kubenswrapper[4795]: I0219 23:00:30.286984 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4vwwf" event={"ID":"aa08f457-a2bb-40ae-afe4-647920d80f5d","Type":"ContainerStarted","Data":"a57329754df0615f27cca643d3caf47c4f91f7d152f100af5abc965130234bef"} Feb 19 23:00:30 crc kubenswrapper[4795]: I0219 23:00:30.298445 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"daeb9555-6d76-45ca-b3da-b6dd91c33e00","Type":"ContainerStarted","Data":"5be777487e5586bd93badf6c2c1b00f6b197adc9ea1d9c8e60728a166616e293"} Feb 19 23:00:30 crc kubenswrapper[4795]: I0219 23:00:30.298489 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"daeb9555-6d76-45ca-b3da-b6dd91c33e00","Type":"ContainerStarted","Data":"9c6fa1de60cfdd76be176b9f9175d3f6eee78ce3e820e7f88108d85834a9ef82"} Feb 19 23:00:30 crc kubenswrapper[4795]: I0219 23:00:30.298562 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 19 23:00:30 crc kubenswrapper[4795]: I0219 23:00:30.301023 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b9kk9" event={"ID":"020ffbb9-5c2d-4cdd-af08-44c28850c44c","Type":"ContainerStarted","Data":"cac79134c9314d800f04bfb3cf858eecee6be20b351d06d3a975449071b3d8fb"} Feb 19 23:00:30 crc kubenswrapper[4795]: I0219 23:00:30.348074 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" 
podStartSLOduration=3.348054119 podStartE2EDuration="3.348054119s" podCreationTimestamp="2026-02-19 23:00:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:00:30.334237267 +0000 UTC m=+5541.526755131" watchObservedRunningTime="2026-02-19 23:00:30.348054119 +0000 UTC m=+5541.540572013"
Feb 19 23:00:30 crc kubenswrapper[4795]: I0219 23:00:30.387780 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0"
Feb 19 23:00:31 crc kubenswrapper[4795]: I0219 23:00:31.326756 4795 generic.go:334] "Generic (PLEG): container finished" podID="020ffbb9-5c2d-4cdd-af08-44c28850c44c" containerID="cac79134c9314d800f04bfb3cf858eecee6be20b351d06d3a975449071b3d8fb" exitCode=0
Feb 19 23:00:31 crc kubenswrapper[4795]: I0219 23:00:31.326898 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b9kk9" event={"ID":"020ffbb9-5c2d-4cdd-af08-44c28850c44c","Type":"ContainerDied","Data":"cac79134c9314d800f04bfb3cf858eecee6be20b351d06d3a975449071b3d8fb"}
Feb 19 23:00:31 crc kubenswrapper[4795]: I0219 23:00:31.350834 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4vwwf" event={"ID":"aa08f457-a2bb-40ae-afe4-647920d80f5d","Type":"ContainerStarted","Data":"10212642ebf7b9d732af19b3e74e14cc1893a2a12c26a1fddef9354851ecf3ac"}
Feb 19 23:00:32 crc kubenswrapper[4795]: I0219 23:00:32.157037 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Feb 19 23:00:32 crc kubenswrapper[4795]: I0219 23:00:32.282909 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 19 23:00:32 crc kubenswrapper[4795]: I0219 23:00:32.363932 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="29196316-7b32-486b-a786-10a3912bc206" containerName="cinder-scheduler" containerID="cri-o://d9346b789eea6533075dc5c63a660bdc3b57347c9d222a08726141055f09a5fe" gracePeriod=30
Feb 19 23:00:32 crc kubenswrapper[4795]: I0219 23:00:32.365065 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b9kk9" event={"ID":"020ffbb9-5c2d-4cdd-af08-44c28850c44c","Type":"ContainerStarted","Data":"073422a5416bd29e48cb4ea48f28db816baf075d9c482ced2680d3a6e1225d6d"}
Feb 19 23:00:32 crc kubenswrapper[4795]: I0219 23:00:32.365696 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="29196316-7b32-486b-a786-10a3912bc206" containerName="probe" containerID="cri-o://43556d75e94601761021c0d54884cee46c2545d93705f88358280a86bd6460b9" gracePeriod=30
Feb 19 23:00:32 crc kubenswrapper[4795]: I0219 23:00:32.411833 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-b9kk9" podStartSLOduration=2.95841042 podStartE2EDuration="6.411812604s" podCreationTimestamp="2026-02-19 23:00:26 +0000 UTC" firstStartedPulling="2026-02-19 23:00:28.247048765 +0000 UTC m=+5539.439566629" lastFinishedPulling="2026-02-19 23:00:31.700450949 +0000 UTC m=+5542.892968813" observedRunningTime="2026-02-19 23:00:32.39554836 +0000 UTC m=+5543.588066234" watchObservedRunningTime="2026-02-19 23:00:32.411812604 +0000 UTC m=+5543.604330468"
Feb 19 23:00:33 crc kubenswrapper[4795]: I0219 23:00:33.374110 4795 generic.go:334] "Generic (PLEG): container finished" podID="29196316-7b32-486b-a786-10a3912bc206" containerID="43556d75e94601761021c0d54884cee46c2545d93705f88358280a86bd6460b9" exitCode=0
Feb 19 23:00:33 crc kubenswrapper[4795]: I0219 23:00:33.374157 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"29196316-7b32-486b-a786-10a3912bc206","Type":"ContainerDied","Data":"43556d75e94601761021c0d54884cee46c2545d93705f88358280a86bd6460b9"}
Feb 19 23:00:33 crc kubenswrapper[4795]: I0219 23:00:33.377146 4795 generic.go:334] "Generic (PLEG): container finished" podID="aa08f457-a2bb-40ae-afe4-647920d80f5d" containerID="10212642ebf7b9d732af19b3e74e14cc1893a2a12c26a1fddef9354851ecf3ac" exitCode=0
Feb 19 23:00:33 crc kubenswrapper[4795]: I0219 23:00:33.377186 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4vwwf" event={"ID":"aa08f457-a2bb-40ae-afe4-647920d80f5d","Type":"ContainerDied","Data":"10212642ebf7b9d732af19b3e74e14cc1893a2a12c26a1fddef9354851ecf3ac"}
Feb 19 23:00:34 crc kubenswrapper[4795]: I0219 23:00:34.648661 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0"
Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.152829 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.223388 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29196316-7b32-486b-a786-10a3912bc206-scripts\") pod \"29196316-7b32-486b-a786-10a3912bc206\" (UID: \"29196316-7b32-486b-a786-10a3912bc206\") "
Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.223710 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29196316-7b32-486b-a786-10a3912bc206-config-data\") pod \"29196316-7b32-486b-a786-10a3912bc206\" (UID: \"29196316-7b32-486b-a786-10a3912bc206\") "
Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.223782 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29196316-7b32-486b-a786-10a3912bc206-combined-ca-bundle\") pod \"29196316-7b32-486b-a786-10a3912bc206\" (UID: \"29196316-7b32-486b-a786-10a3912bc206\") "
Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.223835 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pq4nx\" (UniqueName: \"kubernetes.io/projected/29196316-7b32-486b-a786-10a3912bc206-kube-api-access-pq4nx\") pod \"29196316-7b32-486b-a786-10a3912bc206\" (UID: \"29196316-7b32-486b-a786-10a3912bc206\") "
Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.223895 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/29196316-7b32-486b-a786-10a3912bc206-config-data-custom\") pod \"29196316-7b32-486b-a786-10a3912bc206\" (UID: \"29196316-7b32-486b-a786-10a3912bc206\") "
Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.223921 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/29196316-7b32-486b-a786-10a3912bc206-etc-machine-id\") pod \"29196316-7b32-486b-a786-10a3912bc206\" (UID: \"29196316-7b32-486b-a786-10a3912bc206\") "
Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.224316 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/29196316-7b32-486b-a786-10a3912bc206-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "29196316-7b32-486b-a786-10a3912bc206" (UID: "29196316-7b32-486b-a786-10a3912bc206"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.230821 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29196316-7b32-486b-a786-10a3912bc206-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "29196316-7b32-486b-a786-10a3912bc206" (UID: "29196316-7b32-486b-a786-10a3912bc206"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.230947 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29196316-7b32-486b-a786-10a3912bc206-scripts" (OuterVolumeSpecName: "scripts") pod "29196316-7b32-486b-a786-10a3912bc206" (UID: "29196316-7b32-486b-a786-10a3912bc206"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.241377 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29196316-7b32-486b-a786-10a3912bc206-kube-api-access-pq4nx" (OuterVolumeSpecName: "kube-api-access-pq4nx") pod "29196316-7b32-486b-a786-10a3912bc206" (UID: "29196316-7b32-486b-a786-10a3912bc206"). InnerVolumeSpecName "kube-api-access-pq4nx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.283285 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29196316-7b32-486b-a786-10a3912bc206-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "29196316-7b32-486b-a786-10a3912bc206" (UID: "29196316-7b32-486b-a786-10a3912bc206"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.326321 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29196316-7b32-486b-a786-10a3912bc206-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.326358 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29196316-7b32-486b-a786-10a3912bc206-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.326368 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pq4nx\" (UniqueName: \"kubernetes.io/projected/29196316-7b32-486b-a786-10a3912bc206-kube-api-access-pq4nx\") on node \"crc\" DevicePath \"\""
Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.326379 4795 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/29196316-7b32-486b-a786-10a3912bc206-config-data-custom\") on node \"crc\" DevicePath \"\""
Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.326388 4795 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/29196316-7b32-486b-a786-10a3912bc206-etc-machine-id\") on node \"crc\" DevicePath \"\""
Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.341885 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29196316-7b32-486b-a786-10a3912bc206-config-data" (OuterVolumeSpecName: "config-data") pod "29196316-7b32-486b-a786-10a3912bc206" (UID: "29196316-7b32-486b-a786-10a3912bc206"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.399452 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4vwwf" event={"ID":"aa08f457-a2bb-40ae-afe4-647920d80f5d","Type":"ContainerStarted","Data":"354b28fba47cef5561d9493add6a54eacdc83cf2022b7003621a76bafc4e36e8"}
Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.407450 4795 generic.go:334] "Generic (PLEG): container finished" podID="29196316-7b32-486b-a786-10a3912bc206" containerID="d9346b789eea6533075dc5c63a660bdc3b57347c9d222a08726141055f09a5fe" exitCode=0
Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.407499 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"29196316-7b32-486b-a786-10a3912bc206","Type":"ContainerDied","Data":"d9346b789eea6533075dc5c63a660bdc3b57347c9d222a08726141055f09a5fe"}
Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.407530 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"29196316-7b32-486b-a786-10a3912bc206","Type":"ContainerDied","Data":"92b489e7d1f8c23a12cd2c3b0232a07d52f20da60c5e7bd656a106e1bbfc254b"}
Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.407548 4795 scope.go:117] "RemoveContainer" containerID="43556d75e94601761021c0d54884cee46c2545d93705f88358280a86bd6460b9"
Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.407671 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.428472 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29196316-7b32-486b-a786-10a3912bc206-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.438865 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4vwwf" podStartSLOduration=3.338683081 podStartE2EDuration="7.438845426s" podCreationTimestamp="2026-02-19 23:00:28 +0000 UTC" firstStartedPulling="2026-02-19 23:00:30.289750895 +0000 UTC m=+5541.482268759" lastFinishedPulling="2026-02-19 23:00:34.38991324 +0000 UTC m=+5545.582431104" observedRunningTime="2026-02-19 23:00:35.425368463 +0000 UTC m=+5546.617886327" watchObservedRunningTime="2026-02-19 23:00:35.438845426 +0000 UTC m=+5546.631363290"
Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.458296 4795 scope.go:117] "RemoveContainer" containerID="d9346b789eea6533075dc5c63a660bdc3b57347c9d222a08726141055f09a5fe"
Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.462275 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.476194 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.491941 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 19 23:00:35 crc kubenswrapper[4795]: E0219 23:00:35.492377 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29196316-7b32-486b-a786-10a3912bc206" containerName="cinder-scheduler"
Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.492390 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="29196316-7b32-486b-a786-10a3912bc206" containerName="cinder-scheduler"
Feb 19 23:00:35 crc kubenswrapper[4795]: E0219 23:00:35.492414 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29196316-7b32-486b-a786-10a3912bc206" containerName="probe"
Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.492420 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="29196316-7b32-486b-a786-10a3912bc206" containerName="probe"
Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.492604 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="29196316-7b32-486b-a786-10a3912bc206" containerName="probe"
Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.492631 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="29196316-7b32-486b-a786-10a3912bc206" containerName="cinder-scheduler"
Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.493593 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.497035 4795 scope.go:117] "RemoveContainer" containerID="43556d75e94601761021c0d54884cee46c2545d93705f88358280a86bd6460b9"
Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.497257 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Feb 19 23:00:35 crc kubenswrapper[4795]: E0219 23:00:35.499602 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43556d75e94601761021c0d54884cee46c2545d93705f88358280a86bd6460b9\": container with ID starting with 43556d75e94601761021c0d54884cee46c2545d93705f88358280a86bd6460b9 not found: ID does not exist" containerID="43556d75e94601761021c0d54884cee46c2545d93705f88358280a86bd6460b9"
Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.499647 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43556d75e94601761021c0d54884cee46c2545d93705f88358280a86bd6460b9"} err="failed to get container status \"43556d75e94601761021c0d54884cee46c2545d93705f88358280a86bd6460b9\": rpc error: code = NotFound desc = could not find container \"43556d75e94601761021c0d54884cee46c2545d93705f88358280a86bd6460b9\": container with ID starting with 43556d75e94601761021c0d54884cee46c2545d93705f88358280a86bd6460b9 not found: ID does not exist"
Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.499675 4795 scope.go:117] "RemoveContainer" containerID="d9346b789eea6533075dc5c63a660bdc3b57347c9d222a08726141055f09a5fe"
Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.502849 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 19 23:00:35 crc kubenswrapper[4795]: E0219 23:00:35.503554 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9346b789eea6533075dc5c63a660bdc3b57347c9d222a08726141055f09a5fe\": container with ID starting with d9346b789eea6533075dc5c63a660bdc3b57347c9d222a08726141055f09a5fe not found: ID does not exist" containerID="d9346b789eea6533075dc5c63a660bdc3b57347c9d222a08726141055f09a5fe"
Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.503589 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9346b789eea6533075dc5c63a660bdc3b57347c9d222a08726141055f09a5fe"} err="failed to get container status \"d9346b789eea6533075dc5c63a660bdc3b57347c9d222a08726141055f09a5fe\": rpc error: code = NotFound desc = could not find container \"d9346b789eea6533075dc5c63a660bdc3b57347c9d222a08726141055f09a5fe\": container with ID starting with d9346b789eea6533075dc5c63a660bdc3b57347c9d222a08726141055f09a5fe not found: ID does not exist"
Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.526473 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29196316-7b32-486b-a786-10a3912bc206" path="/var/lib/kubelet/pods/29196316-7b32-486b-a786-10a3912bc206/volumes"
Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.529824 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85502c41-99ab-4a8f-9c36-f4d839b931a1-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"85502c41-99ab-4a8f-9c36-f4d839b931a1\") " pod="openstack/cinder-scheduler-0"
Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.529945 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85502c41-99ab-4a8f-9c36-f4d839b931a1-config-data\") pod \"cinder-scheduler-0\" (UID: \"85502c41-99ab-4a8f-9c36-f4d839b931a1\") " pod="openstack/cinder-scheduler-0"
Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.530049 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/85502c41-99ab-4a8f-9c36-f4d839b931a1-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"85502c41-99ab-4a8f-9c36-f4d839b931a1\") " pod="openstack/cinder-scheduler-0"
Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.530119 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85502c41-99ab-4a8f-9c36-f4d839b931a1-scripts\") pod \"cinder-scheduler-0\" (UID: \"85502c41-99ab-4a8f-9c36-f4d839b931a1\") " pod="openstack/cinder-scheduler-0"
Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.530324 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xddk9\" (UniqueName: \"kubernetes.io/projected/85502c41-99ab-4a8f-9c36-f4d839b931a1-kube-api-access-xddk9\") pod \"cinder-scheduler-0\" (UID: \"85502c41-99ab-4a8f-9c36-f4d839b931a1\") " pod="openstack/cinder-scheduler-0"
Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.530400 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/85502c41-99ab-4a8f-9c36-f4d839b931a1-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"85502c41-99ab-4a8f-9c36-f4d839b931a1\") " pod="openstack/cinder-scheduler-0"
Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.632925 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/85502c41-99ab-4a8f-9c36-f4d839b931a1-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"85502c41-99ab-4a8f-9c36-f4d839b931a1\") " pod="openstack/cinder-scheduler-0"
Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.633100 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85502c41-99ab-4a8f-9c36-f4d839b931a1-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"85502c41-99ab-4a8f-9c36-f4d839b931a1\") " pod="openstack/cinder-scheduler-0"
Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.633194 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85502c41-99ab-4a8f-9c36-f4d839b931a1-config-data\") pod \"cinder-scheduler-0\" (UID: \"85502c41-99ab-4a8f-9c36-f4d839b931a1\") " pod="openstack/cinder-scheduler-0"
Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.633259 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/85502c41-99ab-4a8f-9c36-f4d839b931a1-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"85502c41-99ab-4a8f-9c36-f4d839b931a1\") " pod="openstack/cinder-scheduler-0"
Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.633283 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85502c41-99ab-4a8f-9c36-f4d839b931a1-scripts\") pod \"cinder-scheduler-0\" (UID: \"85502c41-99ab-4a8f-9c36-f4d839b931a1\") " pod="openstack/cinder-scheduler-0"
Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.633334 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xddk9\" (UniqueName: \"kubernetes.io/projected/85502c41-99ab-4a8f-9c36-f4d839b931a1-kube-api-access-xddk9\") pod \"cinder-scheduler-0\" (UID: \"85502c41-99ab-4a8f-9c36-f4d839b931a1\") " pod="openstack/cinder-scheduler-0"
Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.633521 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/85502c41-99ab-4a8f-9c36-f4d839b931a1-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"85502c41-99ab-4a8f-9c36-f4d839b931a1\") " pod="openstack/cinder-scheduler-0"
Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.636985 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85502c41-99ab-4a8f-9c36-f4d839b931a1-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"85502c41-99ab-4a8f-9c36-f4d839b931a1\") " pod="openstack/cinder-scheduler-0"
Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.637763 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85502c41-99ab-4a8f-9c36-f4d839b931a1-config-data\") pod \"cinder-scheduler-0\" (UID: \"85502c41-99ab-4a8f-9c36-f4d839b931a1\") " pod="openstack/cinder-scheduler-0"
Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.642676 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85502c41-99ab-4a8f-9c36-f4d839b931a1-scripts\") pod \"cinder-scheduler-0\" (UID: \"85502c41-99ab-4a8f-9c36-f4d839b931a1\") " pod="openstack/cinder-scheduler-0"
Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.653730 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/85502c41-99ab-4a8f-9c36-f4d839b931a1-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"85502c41-99ab-4a8f-9c36-f4d839b931a1\") " pod="openstack/cinder-scheduler-0"
Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.654892 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xddk9\" (UniqueName: \"kubernetes.io/projected/85502c41-99ab-4a8f-9c36-f4d839b931a1-kube-api-access-xddk9\") pod \"cinder-scheduler-0\" (UID: \"85502c41-99ab-4a8f-9c36-f4d839b931a1\") " pod="openstack/cinder-scheduler-0"
Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.661425 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0"
Feb 19 23:00:35 crc kubenswrapper[4795]: I0219 23:00:35.811600 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Feb 19 23:00:36 crc kubenswrapper[4795]: I0219 23:00:36.346543 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-b9kk9"
Feb 19 23:00:36 crc kubenswrapper[4795]: I0219 23:00:36.346906 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-b9kk9"
Feb 19 23:00:36 crc kubenswrapper[4795]: I0219 23:00:36.363944 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 19 23:00:36 crc kubenswrapper[4795]: I0219 23:00:36.417140 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"85502c41-99ab-4a8f-9c36-f4d839b931a1","Type":"ContainerStarted","Data":"97453bbb3480978fa58406787fe8a850f1853c6b62b3df6b6fa05ef1fc9640f1"}
Feb 19 23:00:37 crc kubenswrapper[4795]: I0219 23:00:37.395037 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-b9kk9" podUID="020ffbb9-5c2d-4cdd-af08-44c28850c44c" containerName="registry-server" probeResult="failure" output=<
Feb 19 23:00:37 crc kubenswrapper[4795]: timeout: failed to connect service ":50051" within 1s
Feb 19 23:00:37 crc kubenswrapper[4795]: >
Feb 19 23:00:37 crc kubenswrapper[4795]: I0219 23:00:37.428752 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"85502c41-99ab-4a8f-9c36-f4d839b931a1","Type":"ContainerStarted","Data":"b590684afc64442bc6607340dcf33bb82f3583beec7f72c755dc77fedc311fdc"}
Feb 19 23:00:38 crc kubenswrapper[4795]: I0219 23:00:38.440207 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"85502c41-99ab-4a8f-9c36-f4d839b931a1","Type":"ContainerStarted","Data":"b01c5ad1bb4374b6ffb7df6128eb4d4719ee00325fb78af74d7e54b6006036b2"}
Feb 19 23:00:38 crc kubenswrapper[4795]: I0219 23:00:38.475110 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.475089992 podStartE2EDuration="3.475089992s" podCreationTimestamp="2026-02-19 23:00:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:00:38.46676016 +0000 UTC m=+5549.659278044" watchObservedRunningTime="2026-02-19 23:00:38.475089992 +0000 UTC m=+5549.667607856"
Feb 19 23:00:39 crc kubenswrapper[4795]: I0219 23:00:39.147628 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4vwwf"
Feb 19 23:00:39 crc kubenswrapper[4795]: I0219 23:00:39.148534 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4vwwf"
Feb 19 23:00:39 crc kubenswrapper[4795]: I0219 23:00:39.198129 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4vwwf"
Feb 19 23:00:39 crc kubenswrapper[4795]: I0219 23:00:39.495559 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4vwwf"
Feb 19 23:00:39 crc kubenswrapper[4795]: I0219 23:00:39.594802 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4vwwf"]
Feb 19 23:00:39 crc kubenswrapper[4795]: I0219 23:00:39.773369 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0"
Feb 19 23:00:40 crc kubenswrapper[4795]: I0219 23:00:40.813320 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Feb 19 23:00:41 crc kubenswrapper[4795]: I0219 23:00:41.464895 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4vwwf" podUID="aa08f457-a2bb-40ae-afe4-647920d80f5d" containerName="registry-server" containerID="cri-o://354b28fba47cef5561d9493add6a54eacdc83cf2022b7003621a76bafc4e36e8" gracePeriod=2
Feb 19 23:00:41 crc kubenswrapper[4795]: I0219 23:00:41.958911 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4vwwf"
Feb 19 23:00:42 crc kubenswrapper[4795]: I0219 23:00:42.068441 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa08f457-a2bb-40ae-afe4-647920d80f5d-utilities\") pod \"aa08f457-a2bb-40ae-afe4-647920d80f5d\" (UID: \"aa08f457-a2bb-40ae-afe4-647920d80f5d\") "
Feb 19 23:00:42 crc kubenswrapper[4795]: I0219 23:00:42.069464 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa08f457-a2bb-40ae-afe4-647920d80f5d-utilities" (OuterVolumeSpecName: "utilities") pod "aa08f457-a2bb-40ae-afe4-647920d80f5d" (UID: "aa08f457-a2bb-40ae-afe4-647920d80f5d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 23:00:42 crc kubenswrapper[4795]: I0219 23:00:42.069579 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa08f457-a2bb-40ae-afe4-647920d80f5d-catalog-content\") pod \"aa08f457-a2bb-40ae-afe4-647920d80f5d\" (UID: \"aa08f457-a2bb-40ae-afe4-647920d80f5d\") "
Feb 19 23:00:42 crc kubenswrapper[4795]: I0219 23:00:42.069860 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvrs6\" (UniqueName: \"kubernetes.io/projected/aa08f457-a2bb-40ae-afe4-647920d80f5d-kube-api-access-zvrs6\") pod \"aa08f457-a2bb-40ae-afe4-647920d80f5d\" (UID: \"aa08f457-a2bb-40ae-afe4-647920d80f5d\") "
Feb 19 23:00:42 crc kubenswrapper[4795]: I0219 23:00:42.070942 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa08f457-a2bb-40ae-afe4-647920d80f5d-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 23:00:42 crc kubenswrapper[4795]: I0219 23:00:42.077189 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa08f457-a2bb-40ae-afe4-647920d80f5d-kube-api-access-zvrs6" (OuterVolumeSpecName: "kube-api-access-zvrs6") pod "aa08f457-a2bb-40ae-afe4-647920d80f5d" (UID: "aa08f457-a2bb-40ae-afe4-647920d80f5d"). InnerVolumeSpecName "kube-api-access-zvrs6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 23:00:42 crc kubenswrapper[4795]: I0219 23:00:42.116614 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa08f457-a2bb-40ae-afe4-647920d80f5d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "aa08f457-a2bb-40ae-afe4-647920d80f5d" (UID: "aa08f457-a2bb-40ae-afe4-647920d80f5d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 23:00:42 crc kubenswrapper[4795]: I0219 23:00:42.173007 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa08f457-a2bb-40ae-afe4-647920d80f5d-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 23:00:42 crc kubenswrapper[4795]: I0219 23:00:42.173038 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvrs6\" (UniqueName: \"kubernetes.io/projected/aa08f457-a2bb-40ae-afe4-647920d80f5d-kube-api-access-zvrs6\") on node \"crc\" DevicePath \"\""
Feb 19 23:00:42 crc kubenswrapper[4795]: I0219 23:00:42.491737 4795 generic.go:334] "Generic (PLEG): container finished" podID="aa08f457-a2bb-40ae-afe4-647920d80f5d" containerID="354b28fba47cef5561d9493add6a54eacdc83cf2022b7003621a76bafc4e36e8" exitCode=0
Feb 19 23:00:42 crc kubenswrapper[4795]: I0219 23:00:42.491831 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4vwwf"
Feb 19 23:00:42 crc kubenswrapper[4795]: I0219 23:00:42.491840 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4vwwf" event={"ID":"aa08f457-a2bb-40ae-afe4-647920d80f5d","Type":"ContainerDied","Data":"354b28fba47cef5561d9493add6a54eacdc83cf2022b7003621a76bafc4e36e8"}
Feb 19 23:00:42 crc kubenswrapper[4795]: I0219 23:00:42.492611 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4vwwf" event={"ID":"aa08f457-a2bb-40ae-afe4-647920d80f5d","Type":"ContainerDied","Data":"a57329754df0615f27cca643d3caf47c4f91f7d152f100af5abc965130234bef"}
Feb 19 23:00:42 crc kubenswrapper[4795]: I0219 23:00:42.492648 4795 scope.go:117] "RemoveContainer" containerID="354b28fba47cef5561d9493add6a54eacdc83cf2022b7003621a76bafc4e36e8"
Feb 19 23:00:42 crc kubenswrapper[4795]: I0219 23:00:42.531230 4795 scope.go:117] "RemoveContainer" containerID="10212642ebf7b9d732af19b3e74e14cc1893a2a12c26a1fddef9354851ecf3ac"
Feb 19 23:00:42 crc kubenswrapper[4795]: I0219 23:00:42.550803 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4vwwf"]
Feb 19 23:00:42 crc kubenswrapper[4795]: I0219 23:00:42.559475 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4vwwf"]
Feb 19 23:00:42 crc kubenswrapper[4795]: I0219 23:00:42.585618 4795 scope.go:117] "RemoveContainer" containerID="d94bf9a18a9e2fe63761c24c1da9448bde51def8159c0a24a7add8d212314292"
Feb 19 23:00:42 crc kubenswrapper[4795]: I0219 23:00:42.624095 4795 scope.go:117] "RemoveContainer" containerID="354b28fba47cef5561d9493add6a54eacdc83cf2022b7003621a76bafc4e36e8"
Feb 19 23:00:42 crc kubenswrapper[4795]: E0219 23:00:42.624655 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"354b28fba47cef5561d9493add6a54eacdc83cf2022b7003621a76bafc4e36e8\": container with ID starting with 354b28fba47cef5561d9493add6a54eacdc83cf2022b7003621a76bafc4e36e8 not found: ID does not exist" containerID="354b28fba47cef5561d9493add6a54eacdc83cf2022b7003621a76bafc4e36e8"
Feb 19 23:00:42 crc kubenswrapper[4795]: I0219 23:00:42.624724 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"354b28fba47cef5561d9493add6a54eacdc83cf2022b7003621a76bafc4e36e8"} err="failed to get container status \"354b28fba47cef5561d9493add6a54eacdc83cf2022b7003621a76bafc4e36e8\": rpc error: code = NotFound desc = could not find container \"354b28fba47cef5561d9493add6a54eacdc83cf2022b7003621a76bafc4e36e8\": container with ID starting with 354b28fba47cef5561d9493add6a54eacdc83cf2022b7003621a76bafc4e36e8 not found: ID does not exist"
Feb 19 23:00:42 crc kubenswrapper[4795]: I0219 23:00:42.624763 4795 scope.go:117] "RemoveContainer" containerID="10212642ebf7b9d732af19b3e74e14cc1893a2a12c26a1fddef9354851ecf3ac"
Feb 19 23:00:42 crc kubenswrapper[4795]: E0219 23:00:42.625240 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10212642ebf7b9d732af19b3e74e14cc1893a2a12c26a1fddef9354851ecf3ac\": container with ID starting with 10212642ebf7b9d732af19b3e74e14cc1893a2a12c26a1fddef9354851ecf3ac not found: ID does not exist" containerID="10212642ebf7b9d732af19b3e74e14cc1893a2a12c26a1fddef9354851ecf3ac"
Feb 19 23:00:42 crc kubenswrapper[4795]: I0219 23:00:42.625335 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10212642ebf7b9d732af19b3e74e14cc1893a2a12c26a1fddef9354851ecf3ac"} err="failed to get container status \"10212642ebf7b9d732af19b3e74e14cc1893a2a12c26a1fddef9354851ecf3ac\": rpc error: code = NotFound desc = could not find container \"10212642ebf7b9d732af19b3e74e14cc1893a2a12c26a1fddef9354851ecf3ac\": container with ID starting with 10212642ebf7b9d732af19b3e74e14cc1893a2a12c26a1fddef9354851ecf3ac not found: ID does not exist"
Feb 19 23:00:42 crc kubenswrapper[4795]: I0219 23:00:42.625414 4795 scope.go:117] "RemoveContainer" containerID="d94bf9a18a9e2fe63761c24c1da9448bde51def8159c0a24a7add8d212314292"
Feb 19 23:00:42 crc kubenswrapper[4795]: E0219 23:00:42.625854 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d94bf9a18a9e2fe63761c24c1da9448bde51def8159c0a24a7add8d212314292\": container with ID starting with d94bf9a18a9e2fe63761c24c1da9448bde51def8159c0a24a7add8d212314292 not found: ID does not exist" containerID="d94bf9a18a9e2fe63761c24c1da9448bde51def8159c0a24a7add8d212314292"
Feb 19 23:00:42 crc kubenswrapper[4795]: I0219 23:00:42.625897 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d94bf9a18a9e2fe63761c24c1da9448bde51def8159c0a24a7add8d212314292"} err="failed to get container status \"d94bf9a18a9e2fe63761c24c1da9448bde51def8159c0a24a7add8d212314292\": rpc error: code = NotFound desc = could not find container \"d94bf9a18a9e2fe63761c24c1da9448bde51def8159c0a24a7add8d212314292\": container with ID starting with d94bf9a18a9e2fe63761c24c1da9448bde51def8159c0a24a7add8d212314292 not found: ID does not exist"
Feb 19 23:00:43 crc kubenswrapper[4795]: I0219 23:00:43.531469 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa08f457-a2bb-40ae-afe4-647920d80f5d" path="/var/lib/kubelet/pods/aa08f457-a2bb-40ae-afe4-647920d80f5d/volumes"
Feb 19 23:00:46 crc kubenswrapper[4795]: I0219 23:00:46.058766 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Feb 19 23:00:46 crc kubenswrapper[4795]: I0219 23:00:46.393020 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-b9kk9"
Feb 19 23:00:46 crc kubenswrapper[4795]: I0219 23:00:46.434860 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-b9kk9"
Feb 19 23:00:46 crc kubenswrapper[4795]: I0219 23:00:46.631408 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-b9kk9"]
Feb 19 23:00:47 crc kubenswrapper[4795]: I0219 23:00:47.533688 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-b9kk9" podUID="020ffbb9-5c2d-4cdd-af08-44c28850c44c" containerName="registry-server" containerID="cri-o://073422a5416bd29e48cb4ea48f28db816baf075d9c482ced2680d3a6e1225d6d" gracePeriod=2
Feb 19 23:00:47 crc kubenswrapper[4795]: I0219 23:00:47.947280 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-b9kk9" Feb 19 23:00:48 crc kubenswrapper[4795]: I0219 23:00:48.097101 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/020ffbb9-5c2d-4cdd-af08-44c28850c44c-catalog-content\") pod \"020ffbb9-5c2d-4cdd-af08-44c28850c44c\" (UID: \"020ffbb9-5c2d-4cdd-af08-44c28850c44c\") " Feb 19 23:00:48 crc kubenswrapper[4795]: I0219 23:00:48.097535 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gdcxt\" (UniqueName: \"kubernetes.io/projected/020ffbb9-5c2d-4cdd-af08-44c28850c44c-kube-api-access-gdcxt\") pod \"020ffbb9-5c2d-4cdd-af08-44c28850c44c\" (UID: \"020ffbb9-5c2d-4cdd-af08-44c28850c44c\") " Feb 19 23:00:48 crc kubenswrapper[4795]: I0219 23:00:48.097662 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/020ffbb9-5c2d-4cdd-af08-44c28850c44c-utilities\") pod \"020ffbb9-5c2d-4cdd-af08-44c28850c44c\" (UID: \"020ffbb9-5c2d-4cdd-af08-44c28850c44c\") " Feb 19 23:00:48 crc kubenswrapper[4795]: I0219 23:00:48.098027 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/020ffbb9-5c2d-4cdd-af08-44c28850c44c-utilities" (OuterVolumeSpecName: "utilities") pod "020ffbb9-5c2d-4cdd-af08-44c28850c44c" (UID: "020ffbb9-5c2d-4cdd-af08-44c28850c44c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:00:48 crc kubenswrapper[4795]: I0219 23:00:48.098306 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/020ffbb9-5c2d-4cdd-af08-44c28850c44c-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 23:00:48 crc kubenswrapper[4795]: I0219 23:00:48.102033 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/020ffbb9-5c2d-4cdd-af08-44c28850c44c-kube-api-access-gdcxt" (OuterVolumeSpecName: "kube-api-access-gdcxt") pod "020ffbb9-5c2d-4cdd-af08-44c28850c44c" (UID: "020ffbb9-5c2d-4cdd-af08-44c28850c44c"). InnerVolumeSpecName "kube-api-access-gdcxt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:00:48 crc kubenswrapper[4795]: I0219 23:00:48.200303 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gdcxt\" (UniqueName: \"kubernetes.io/projected/020ffbb9-5c2d-4cdd-af08-44c28850c44c-kube-api-access-gdcxt\") on node \"crc\" DevicePath \"\"" Feb 19 23:00:48 crc kubenswrapper[4795]: I0219 23:00:48.216104 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/020ffbb9-5c2d-4cdd-af08-44c28850c44c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "020ffbb9-5c2d-4cdd-af08-44c28850c44c" (UID: "020ffbb9-5c2d-4cdd-af08-44c28850c44c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:00:48 crc kubenswrapper[4795]: I0219 23:00:48.301990 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/020ffbb9-5c2d-4cdd-af08-44c28850c44c-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 23:00:48 crc kubenswrapper[4795]: I0219 23:00:48.545696 4795 generic.go:334] "Generic (PLEG): container finished" podID="020ffbb9-5c2d-4cdd-af08-44c28850c44c" containerID="073422a5416bd29e48cb4ea48f28db816baf075d9c482ced2680d3a6e1225d6d" exitCode=0 Feb 19 23:00:48 crc kubenswrapper[4795]: I0219 23:00:48.545742 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b9kk9" event={"ID":"020ffbb9-5c2d-4cdd-af08-44c28850c44c","Type":"ContainerDied","Data":"073422a5416bd29e48cb4ea48f28db816baf075d9c482ced2680d3a6e1225d6d"} Feb 19 23:00:48 crc kubenswrapper[4795]: I0219 23:00:48.545776 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b9kk9" event={"ID":"020ffbb9-5c2d-4cdd-af08-44c28850c44c","Type":"ContainerDied","Data":"09bb4d42669be15e952593140fb7fd9a24d0d3ef4f5071cd4b5b653e574c2307"} Feb 19 23:00:48 crc kubenswrapper[4795]: I0219 23:00:48.545797 4795 scope.go:117] "RemoveContainer" containerID="073422a5416bd29e48cb4ea48f28db816baf075d9c482ced2680d3a6e1225d6d" Feb 19 23:00:48 crc kubenswrapper[4795]: I0219 23:00:48.545947 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-b9kk9" Feb 19 23:00:48 crc kubenswrapper[4795]: I0219 23:00:48.577087 4795 scope.go:117] "RemoveContainer" containerID="cac79134c9314d800f04bfb3cf858eecee6be20b351d06d3a975449071b3d8fb" Feb 19 23:00:48 crc kubenswrapper[4795]: I0219 23:00:48.620265 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-b9kk9"] Feb 19 23:00:48 crc kubenswrapper[4795]: I0219 23:00:48.629046 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-b9kk9"] Feb 19 23:00:48 crc kubenswrapper[4795]: I0219 23:00:48.629673 4795 scope.go:117] "RemoveContainer" containerID="3604e5b9a639bd35d71757f720a9b88617fedfa662eef3b2876130119bd577e4" Feb 19 23:00:48 crc kubenswrapper[4795]: I0219 23:00:48.658250 4795 scope.go:117] "RemoveContainer" containerID="073422a5416bd29e48cb4ea48f28db816baf075d9c482ced2680d3a6e1225d6d" Feb 19 23:00:48 crc kubenswrapper[4795]: E0219 23:00:48.658670 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"073422a5416bd29e48cb4ea48f28db816baf075d9c482ced2680d3a6e1225d6d\": container with ID starting with 073422a5416bd29e48cb4ea48f28db816baf075d9c482ced2680d3a6e1225d6d not found: ID does not exist" containerID="073422a5416bd29e48cb4ea48f28db816baf075d9c482ced2680d3a6e1225d6d" Feb 19 23:00:48 crc kubenswrapper[4795]: I0219 23:00:48.658724 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"073422a5416bd29e48cb4ea48f28db816baf075d9c482ced2680d3a6e1225d6d"} err="failed to get container status \"073422a5416bd29e48cb4ea48f28db816baf075d9c482ced2680d3a6e1225d6d\": rpc error: code = NotFound desc = could not find container \"073422a5416bd29e48cb4ea48f28db816baf075d9c482ced2680d3a6e1225d6d\": container with ID starting with 073422a5416bd29e48cb4ea48f28db816baf075d9c482ced2680d3a6e1225d6d not found: ID does 
not exist" Feb 19 23:00:48 crc kubenswrapper[4795]: I0219 23:00:48.658757 4795 scope.go:117] "RemoveContainer" containerID="cac79134c9314d800f04bfb3cf858eecee6be20b351d06d3a975449071b3d8fb" Feb 19 23:00:48 crc kubenswrapper[4795]: E0219 23:00:48.659203 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cac79134c9314d800f04bfb3cf858eecee6be20b351d06d3a975449071b3d8fb\": container with ID starting with cac79134c9314d800f04bfb3cf858eecee6be20b351d06d3a975449071b3d8fb not found: ID does not exist" containerID="cac79134c9314d800f04bfb3cf858eecee6be20b351d06d3a975449071b3d8fb" Feb 19 23:00:48 crc kubenswrapper[4795]: I0219 23:00:48.659247 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cac79134c9314d800f04bfb3cf858eecee6be20b351d06d3a975449071b3d8fb"} err="failed to get container status \"cac79134c9314d800f04bfb3cf858eecee6be20b351d06d3a975449071b3d8fb\": rpc error: code = NotFound desc = could not find container \"cac79134c9314d800f04bfb3cf858eecee6be20b351d06d3a975449071b3d8fb\": container with ID starting with cac79134c9314d800f04bfb3cf858eecee6be20b351d06d3a975449071b3d8fb not found: ID does not exist" Feb 19 23:00:48 crc kubenswrapper[4795]: I0219 23:00:48.659275 4795 scope.go:117] "RemoveContainer" containerID="3604e5b9a639bd35d71757f720a9b88617fedfa662eef3b2876130119bd577e4" Feb 19 23:00:48 crc kubenswrapper[4795]: E0219 23:00:48.659589 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3604e5b9a639bd35d71757f720a9b88617fedfa662eef3b2876130119bd577e4\": container with ID starting with 3604e5b9a639bd35d71757f720a9b88617fedfa662eef3b2876130119bd577e4 not found: ID does not exist" containerID="3604e5b9a639bd35d71757f720a9b88617fedfa662eef3b2876130119bd577e4" Feb 19 23:00:48 crc kubenswrapper[4795]: I0219 23:00:48.659641 4795 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3604e5b9a639bd35d71757f720a9b88617fedfa662eef3b2876130119bd577e4"} err="failed to get container status \"3604e5b9a639bd35d71757f720a9b88617fedfa662eef3b2876130119bd577e4\": rpc error: code = NotFound desc = could not find container \"3604e5b9a639bd35d71757f720a9b88617fedfa662eef3b2876130119bd577e4\": container with ID starting with 3604e5b9a639bd35d71757f720a9b88617fedfa662eef3b2876130119bd577e4 not found: ID does not exist" Feb 19 23:00:49 crc kubenswrapper[4795]: I0219 23:00:49.532859 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="020ffbb9-5c2d-4cdd-af08-44c28850c44c" path="/var/lib/kubelet/pods/020ffbb9-5c2d-4cdd-af08-44c28850c44c/volumes" Feb 19 23:01:00 crc kubenswrapper[4795]: I0219 23:01:00.161583 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29525701-nmz5v"] Feb 19 23:01:00 crc kubenswrapper[4795]: E0219 23:01:00.162668 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa08f457-a2bb-40ae-afe4-647920d80f5d" containerName="extract-utilities" Feb 19 23:01:00 crc kubenswrapper[4795]: I0219 23:01:00.162690 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa08f457-a2bb-40ae-afe4-647920d80f5d" containerName="extract-utilities" Feb 19 23:01:00 crc kubenswrapper[4795]: E0219 23:01:00.162713 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa08f457-a2bb-40ae-afe4-647920d80f5d" containerName="extract-content" Feb 19 23:01:00 crc kubenswrapper[4795]: I0219 23:01:00.162725 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa08f457-a2bb-40ae-afe4-647920d80f5d" containerName="extract-content" Feb 19 23:01:00 crc kubenswrapper[4795]: E0219 23:01:00.162761 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="020ffbb9-5c2d-4cdd-af08-44c28850c44c" containerName="registry-server" Feb 19 23:01:00 crc kubenswrapper[4795]: I0219 23:01:00.162770 4795 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="020ffbb9-5c2d-4cdd-af08-44c28850c44c" containerName="registry-server" Feb 19 23:01:00 crc kubenswrapper[4795]: E0219 23:01:00.162791 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="020ffbb9-5c2d-4cdd-af08-44c28850c44c" containerName="extract-content" Feb 19 23:01:00 crc kubenswrapper[4795]: I0219 23:01:00.162801 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="020ffbb9-5c2d-4cdd-af08-44c28850c44c" containerName="extract-content" Feb 19 23:01:00 crc kubenswrapper[4795]: E0219 23:01:00.162821 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="020ffbb9-5c2d-4cdd-af08-44c28850c44c" containerName="extract-utilities" Feb 19 23:01:00 crc kubenswrapper[4795]: I0219 23:01:00.162831 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="020ffbb9-5c2d-4cdd-af08-44c28850c44c" containerName="extract-utilities" Feb 19 23:01:00 crc kubenswrapper[4795]: E0219 23:01:00.162847 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa08f457-a2bb-40ae-afe4-647920d80f5d" containerName="registry-server" Feb 19 23:01:00 crc kubenswrapper[4795]: I0219 23:01:00.162855 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa08f457-a2bb-40ae-afe4-647920d80f5d" containerName="registry-server" Feb 19 23:01:00 crc kubenswrapper[4795]: I0219 23:01:00.163060 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="020ffbb9-5c2d-4cdd-af08-44c28850c44c" containerName="registry-server" Feb 19 23:01:00 crc kubenswrapper[4795]: I0219 23:01:00.163118 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa08f457-a2bb-40ae-afe4-647920d80f5d" containerName="registry-server" Feb 19 23:01:00 crc kubenswrapper[4795]: I0219 23:01:00.163994 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29525701-nmz5v" Feb 19 23:01:00 crc kubenswrapper[4795]: I0219 23:01:00.181674 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29525701-nmz5v"] Feb 19 23:01:00 crc kubenswrapper[4795]: I0219 23:01:00.348735 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f2d7932-b11f-4e9b-a6e0-2a9a069a3459-config-data\") pod \"keystone-cron-29525701-nmz5v\" (UID: \"5f2d7932-b11f-4e9b-a6e0-2a9a069a3459\") " pod="openstack/keystone-cron-29525701-nmz5v" Feb 19 23:01:00 crc kubenswrapper[4795]: I0219 23:01:00.349555 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwnlc\" (UniqueName: \"kubernetes.io/projected/5f2d7932-b11f-4e9b-a6e0-2a9a069a3459-kube-api-access-mwnlc\") pod \"keystone-cron-29525701-nmz5v\" (UID: \"5f2d7932-b11f-4e9b-a6e0-2a9a069a3459\") " pod="openstack/keystone-cron-29525701-nmz5v" Feb 19 23:01:00 crc kubenswrapper[4795]: I0219 23:01:00.349625 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f2d7932-b11f-4e9b-a6e0-2a9a069a3459-combined-ca-bundle\") pod \"keystone-cron-29525701-nmz5v\" (UID: \"5f2d7932-b11f-4e9b-a6e0-2a9a069a3459\") " pod="openstack/keystone-cron-29525701-nmz5v" Feb 19 23:01:00 crc kubenswrapper[4795]: I0219 23:01:00.349791 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5f2d7932-b11f-4e9b-a6e0-2a9a069a3459-fernet-keys\") pod \"keystone-cron-29525701-nmz5v\" (UID: \"5f2d7932-b11f-4e9b-a6e0-2a9a069a3459\") " pod="openstack/keystone-cron-29525701-nmz5v" Feb 19 23:01:00 crc kubenswrapper[4795]: I0219 23:01:00.452019 4795 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f2d7932-b11f-4e9b-a6e0-2a9a069a3459-config-data\") pod \"keystone-cron-29525701-nmz5v\" (UID: \"5f2d7932-b11f-4e9b-a6e0-2a9a069a3459\") " pod="openstack/keystone-cron-29525701-nmz5v" Feb 19 23:01:00 crc kubenswrapper[4795]: I0219 23:01:00.452114 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwnlc\" (UniqueName: \"kubernetes.io/projected/5f2d7932-b11f-4e9b-a6e0-2a9a069a3459-kube-api-access-mwnlc\") pod \"keystone-cron-29525701-nmz5v\" (UID: \"5f2d7932-b11f-4e9b-a6e0-2a9a069a3459\") " pod="openstack/keystone-cron-29525701-nmz5v" Feb 19 23:01:00 crc kubenswrapper[4795]: I0219 23:01:00.452134 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f2d7932-b11f-4e9b-a6e0-2a9a069a3459-combined-ca-bundle\") pod \"keystone-cron-29525701-nmz5v\" (UID: \"5f2d7932-b11f-4e9b-a6e0-2a9a069a3459\") " pod="openstack/keystone-cron-29525701-nmz5v" Feb 19 23:01:00 crc kubenswrapper[4795]: I0219 23:01:00.452155 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5f2d7932-b11f-4e9b-a6e0-2a9a069a3459-fernet-keys\") pod \"keystone-cron-29525701-nmz5v\" (UID: \"5f2d7932-b11f-4e9b-a6e0-2a9a069a3459\") " pod="openstack/keystone-cron-29525701-nmz5v" Feb 19 23:01:00 crc kubenswrapper[4795]: I0219 23:01:00.458751 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f2d7932-b11f-4e9b-a6e0-2a9a069a3459-config-data\") pod \"keystone-cron-29525701-nmz5v\" (UID: \"5f2d7932-b11f-4e9b-a6e0-2a9a069a3459\") " pod="openstack/keystone-cron-29525701-nmz5v" Feb 19 23:01:00 crc kubenswrapper[4795]: I0219 23:01:00.458956 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/5f2d7932-b11f-4e9b-a6e0-2a9a069a3459-fernet-keys\") pod \"keystone-cron-29525701-nmz5v\" (UID: \"5f2d7932-b11f-4e9b-a6e0-2a9a069a3459\") " pod="openstack/keystone-cron-29525701-nmz5v" Feb 19 23:01:00 crc kubenswrapper[4795]: I0219 23:01:00.460373 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f2d7932-b11f-4e9b-a6e0-2a9a069a3459-combined-ca-bundle\") pod \"keystone-cron-29525701-nmz5v\" (UID: \"5f2d7932-b11f-4e9b-a6e0-2a9a069a3459\") " pod="openstack/keystone-cron-29525701-nmz5v" Feb 19 23:01:00 crc kubenswrapper[4795]: I0219 23:01:00.469361 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwnlc\" (UniqueName: \"kubernetes.io/projected/5f2d7932-b11f-4e9b-a6e0-2a9a069a3459-kube-api-access-mwnlc\") pod \"keystone-cron-29525701-nmz5v\" (UID: \"5f2d7932-b11f-4e9b-a6e0-2a9a069a3459\") " pod="openstack/keystone-cron-29525701-nmz5v" Feb 19 23:01:00 crc kubenswrapper[4795]: I0219 23:01:00.488476 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29525701-nmz5v" Feb 19 23:01:00 crc kubenswrapper[4795]: I0219 23:01:00.938607 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29525701-nmz5v"] Feb 19 23:01:00 crc kubenswrapper[4795]: W0219 23:01:00.945091 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f2d7932_b11f_4e9b_a6e0_2a9a069a3459.slice/crio-9e80301705c26ed37e88db528f388e314a92c46e04a198b9f0f7512654d0d045 WatchSource:0}: Error finding container 9e80301705c26ed37e88db528f388e314a92c46e04a198b9f0f7512654d0d045: Status 404 returned error can't find the container with id 9e80301705c26ed37e88db528f388e314a92c46e04a198b9f0f7512654d0d045 Feb 19 23:01:01 crc kubenswrapper[4795]: I0219 23:01:01.725869 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29525701-nmz5v" event={"ID":"5f2d7932-b11f-4e9b-a6e0-2a9a069a3459","Type":"ContainerStarted","Data":"928018fdfd9df98dcda0e154657ded64328052c445bfd14bf81fd4a1c8a9f74b"} Feb 19 23:01:01 crc kubenswrapper[4795]: I0219 23:01:01.726243 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29525701-nmz5v" event={"ID":"5f2d7932-b11f-4e9b-a6e0-2a9a069a3459","Type":"ContainerStarted","Data":"9e80301705c26ed37e88db528f388e314a92c46e04a198b9f0f7512654d0d045"} Feb 19 23:01:01 crc kubenswrapper[4795]: I0219 23:01:01.753085 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29525701-nmz5v" podStartSLOduration=1.7530656169999999 podStartE2EDuration="1.753065617s" podCreationTimestamp="2026-02-19 23:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:01:01.745644868 +0000 UTC m=+5572.938162732" watchObservedRunningTime="2026-02-19 23:01:01.753065617 +0000 UTC m=+5572.945583481" Feb 19 23:01:03 
crc kubenswrapper[4795]: I0219 23:01:03.755074 4795 generic.go:334] "Generic (PLEG): container finished" podID="5f2d7932-b11f-4e9b-a6e0-2a9a069a3459" containerID="928018fdfd9df98dcda0e154657ded64328052c445bfd14bf81fd4a1c8a9f74b" exitCode=0 Feb 19 23:01:03 crc kubenswrapper[4795]: I0219 23:01:03.755255 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29525701-nmz5v" event={"ID":"5f2d7932-b11f-4e9b-a6e0-2a9a069a3459","Type":"ContainerDied","Data":"928018fdfd9df98dcda0e154657ded64328052c445bfd14bf81fd4a1c8a9f74b"} Feb 19 23:01:05 crc kubenswrapper[4795]: I0219 23:01:05.092308 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29525701-nmz5v" Feb 19 23:01:05 crc kubenswrapper[4795]: I0219 23:01:05.241583 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5f2d7932-b11f-4e9b-a6e0-2a9a069a3459-fernet-keys\") pod \"5f2d7932-b11f-4e9b-a6e0-2a9a069a3459\" (UID: \"5f2d7932-b11f-4e9b-a6e0-2a9a069a3459\") " Feb 19 23:01:05 crc kubenswrapper[4795]: I0219 23:01:05.241690 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f2d7932-b11f-4e9b-a6e0-2a9a069a3459-config-data\") pod \"5f2d7932-b11f-4e9b-a6e0-2a9a069a3459\" (UID: \"5f2d7932-b11f-4e9b-a6e0-2a9a069a3459\") " Feb 19 23:01:05 crc kubenswrapper[4795]: I0219 23:01:05.241839 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f2d7932-b11f-4e9b-a6e0-2a9a069a3459-combined-ca-bundle\") pod \"5f2d7932-b11f-4e9b-a6e0-2a9a069a3459\" (UID: \"5f2d7932-b11f-4e9b-a6e0-2a9a069a3459\") " Feb 19 23:01:05 crc kubenswrapper[4795]: I0219 23:01:05.241871 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwnlc\" (UniqueName: 
\"kubernetes.io/projected/5f2d7932-b11f-4e9b-a6e0-2a9a069a3459-kube-api-access-mwnlc\") pod \"5f2d7932-b11f-4e9b-a6e0-2a9a069a3459\" (UID: \"5f2d7932-b11f-4e9b-a6e0-2a9a069a3459\") " Feb 19 23:01:05 crc kubenswrapper[4795]: I0219 23:01:05.247152 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f2d7932-b11f-4e9b-a6e0-2a9a069a3459-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "5f2d7932-b11f-4e9b-a6e0-2a9a069a3459" (UID: "5f2d7932-b11f-4e9b-a6e0-2a9a069a3459"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:01:05 crc kubenswrapper[4795]: I0219 23:01:05.247529 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f2d7932-b11f-4e9b-a6e0-2a9a069a3459-kube-api-access-mwnlc" (OuterVolumeSpecName: "kube-api-access-mwnlc") pod "5f2d7932-b11f-4e9b-a6e0-2a9a069a3459" (UID: "5f2d7932-b11f-4e9b-a6e0-2a9a069a3459"). InnerVolumeSpecName "kube-api-access-mwnlc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:01:05 crc kubenswrapper[4795]: I0219 23:01:05.278253 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f2d7932-b11f-4e9b-a6e0-2a9a069a3459-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5f2d7932-b11f-4e9b-a6e0-2a9a069a3459" (UID: "5f2d7932-b11f-4e9b-a6e0-2a9a069a3459"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:01:05 crc kubenswrapper[4795]: I0219 23:01:05.289179 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f2d7932-b11f-4e9b-a6e0-2a9a069a3459-config-data" (OuterVolumeSpecName: "config-data") pod "5f2d7932-b11f-4e9b-a6e0-2a9a069a3459" (UID: "5f2d7932-b11f-4e9b-a6e0-2a9a069a3459"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:01:05 crc kubenswrapper[4795]: I0219 23:01:05.343423 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f2d7932-b11f-4e9b-a6e0-2a9a069a3459-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:01:05 crc kubenswrapper[4795]: I0219 23:01:05.343449 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwnlc\" (UniqueName: \"kubernetes.io/projected/5f2d7932-b11f-4e9b-a6e0-2a9a069a3459-kube-api-access-mwnlc\") on node \"crc\" DevicePath \"\"" Feb 19 23:01:05 crc kubenswrapper[4795]: I0219 23:01:05.343459 4795 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5f2d7932-b11f-4e9b-a6e0-2a9a069a3459-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 19 23:01:05 crc kubenswrapper[4795]: I0219 23:01:05.343469 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f2d7932-b11f-4e9b-a6e0-2a9a069a3459-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 23:01:05 crc kubenswrapper[4795]: I0219 23:01:05.779550 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29525701-nmz5v" event={"ID":"5f2d7932-b11f-4e9b-a6e0-2a9a069a3459","Type":"ContainerDied","Data":"9e80301705c26ed37e88db528f388e314a92c46e04a198b9f0f7512654d0d045"} Feb 19 23:01:05 crc kubenswrapper[4795]: I0219 23:01:05.779946 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e80301705c26ed37e88db528f388e314a92c46e04a198b9f0f7512654d0d045" Feb 19 23:01:05 crc kubenswrapper[4795]: I0219 23:01:05.780215 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29525701-nmz5v" Feb 19 23:01:16 crc kubenswrapper[4795]: I0219 23:01:16.042270 4795 scope.go:117] "RemoveContainer" containerID="da78b2ce5ad3acba003da6948c0acb827331e556f914cb3178cd5862028563a8" Feb 19 23:01:16 crc kubenswrapper[4795]: I0219 23:01:16.075605 4795 scope.go:117] "RemoveContainer" containerID="be2f6f4f8f1c84b00fee1744df7613e5b2cb7c534a7ea1884cb10ab8da51e6b9" Feb 19 23:01:28 crc kubenswrapper[4795]: I0219 23:01:28.427945 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 23:01:28 crc kubenswrapper[4795]: I0219 23:01:28.428807 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 23:01:58 crc kubenswrapper[4795]: I0219 23:01:58.427439 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 23:01:58 crc kubenswrapper[4795]: I0219 23:01:58.429097 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 23:02:22 crc kubenswrapper[4795]: I0219 23:02:22.064446 4795 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-pgc5v"] Feb 19 23:02:22 crc kubenswrapper[4795]: I0219 23:02:22.072843 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-af99-account-create-update-5rdfd"] Feb 19 23:02:22 crc kubenswrapper[4795]: I0219 23:02:22.081832 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-af99-account-create-update-5rdfd"] Feb 19 23:02:22 crc kubenswrapper[4795]: I0219 23:02:22.089202 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-pgc5v"] Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.525671 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="036fd6f7-0c88-4c92-9a98-0a774124c8fd" path="/var/lib/kubelet/pods/036fd6f7-0c88-4c92-9a98-0a774124c8fd/volumes" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.526829 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1e0382a-40d3-42e1-93d3-e5098af1e54f" path="/var/lib/kubelet/pods/d1e0382a-40d3-42e1-93d3-e5098af1e54f/volumes" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.535471 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-knqfl"] Feb 19 23:02:23 crc kubenswrapper[4795]: E0219 23:02:23.535947 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f2d7932-b11f-4e9b-a6e0-2a9a069a3459" containerName="keystone-cron" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.535968 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f2d7932-b11f-4e9b-a6e0-2a9a069a3459" containerName="keystone-cron" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.536298 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f2d7932-b11f-4e9b-a6e0-2a9a069a3459" containerName="keystone-cron" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.537021 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-knqfl" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.541779 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.542265 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-zbt4l" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.549372 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-lrv52"] Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.567311 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-knqfl"] Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.567762 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-lrv52" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.574519 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-lrv52"] Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.602714 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3bbc323f-3f18-42bc-b0d8-12f021d91d6b-scripts\") pod \"ovn-controller-knqfl\" (UID: \"3bbc323f-3f18-42bc-b0d8-12f021d91d6b\") " pod="openstack/ovn-controller-knqfl" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.604404 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3bbc323f-3f18-42bc-b0d8-12f021d91d6b-var-log-ovn\") pod \"ovn-controller-knqfl\" (UID: \"3bbc323f-3f18-42bc-b0d8-12f021d91d6b\") " pod="openstack/ovn-controller-knqfl" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.604652 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-rx4xt\" (UniqueName: \"kubernetes.io/projected/3bbc323f-3f18-42bc-b0d8-12f021d91d6b-kube-api-access-rx4xt\") pod \"ovn-controller-knqfl\" (UID: \"3bbc323f-3f18-42bc-b0d8-12f021d91d6b\") " pod="openstack/ovn-controller-knqfl" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.604837 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3bbc323f-3f18-42bc-b0d8-12f021d91d6b-var-run\") pod \"ovn-controller-knqfl\" (UID: \"3bbc323f-3f18-42bc-b0d8-12f021d91d6b\") " pod="openstack/ovn-controller-knqfl" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.605047 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3bbc323f-3f18-42bc-b0d8-12f021d91d6b-var-run-ovn\") pod \"ovn-controller-knqfl\" (UID: \"3bbc323f-3f18-42bc-b0d8-12f021d91d6b\") " pod="openstack/ovn-controller-knqfl" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.708538 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3bbc323f-3f18-42bc-b0d8-12f021d91d6b-scripts\") pod \"ovn-controller-knqfl\" (UID: \"3bbc323f-3f18-42bc-b0d8-12f021d91d6b\") " pod="openstack/ovn-controller-knqfl" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.708789 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9e98c62c-20fc-462c-9973-2616cb184032-var-run\") pod \"ovn-controller-ovs-lrv52\" (UID: \"9e98c62c-20fc-462c-9973-2616cb184032\") " pod="openstack/ovn-controller-ovs-lrv52" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.708877 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/3bbc323f-3f18-42bc-b0d8-12f021d91d6b-var-log-ovn\") pod \"ovn-controller-knqfl\" (UID: \"3bbc323f-3f18-42bc-b0d8-12f021d91d6b\") " pod="openstack/ovn-controller-knqfl" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.710369 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rx4xt\" (UniqueName: \"kubernetes.io/projected/3bbc323f-3f18-42bc-b0d8-12f021d91d6b-kube-api-access-rx4xt\") pod \"ovn-controller-knqfl\" (UID: \"3bbc323f-3f18-42bc-b0d8-12f021d91d6b\") " pod="openstack/ovn-controller-knqfl" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.710780 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3bbc323f-3f18-42bc-b0d8-12f021d91d6b-var-log-ovn\") pod \"ovn-controller-knqfl\" (UID: \"3bbc323f-3f18-42bc-b0d8-12f021d91d6b\") " pod="openstack/ovn-controller-knqfl" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.711668 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9e98c62c-20fc-462c-9973-2616cb184032-scripts\") pod \"ovn-controller-ovs-lrv52\" (UID: \"9e98c62c-20fc-462c-9973-2616cb184032\") " pod="openstack/ovn-controller-ovs-lrv52" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.711837 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3bbc323f-3f18-42bc-b0d8-12f021d91d6b-var-run\") pod \"ovn-controller-knqfl\" (UID: \"3bbc323f-3f18-42bc-b0d8-12f021d91d6b\") " pod="openstack/ovn-controller-knqfl" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.711960 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/9e98c62c-20fc-462c-9973-2616cb184032-etc-ovs\") pod \"ovn-controller-ovs-lrv52\" (UID: 
\"9e98c62c-20fc-462c-9973-2616cb184032\") " pod="openstack/ovn-controller-ovs-lrv52" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.712049 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/9e98c62c-20fc-462c-9973-2616cb184032-var-log\") pod \"ovn-controller-ovs-lrv52\" (UID: \"9e98c62c-20fc-462c-9973-2616cb184032\") " pod="openstack/ovn-controller-ovs-lrv52" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.712275 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3bbc323f-3f18-42bc-b0d8-12f021d91d6b-var-run-ovn\") pod \"ovn-controller-knqfl\" (UID: \"3bbc323f-3f18-42bc-b0d8-12f021d91d6b\") " pod="openstack/ovn-controller-knqfl" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.712432 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/9e98c62c-20fc-462c-9973-2616cb184032-var-lib\") pod \"ovn-controller-ovs-lrv52\" (UID: \"9e98c62c-20fc-462c-9973-2616cb184032\") " pod="openstack/ovn-controller-ovs-lrv52" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.712510 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4v7wq\" (UniqueName: \"kubernetes.io/projected/9e98c62c-20fc-462c-9973-2616cb184032-kube-api-access-4v7wq\") pod \"ovn-controller-ovs-lrv52\" (UID: \"9e98c62c-20fc-462c-9973-2616cb184032\") " pod="openstack/ovn-controller-ovs-lrv52" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.712755 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3bbc323f-3f18-42bc-b0d8-12f021d91d6b-var-run\") pod \"ovn-controller-knqfl\" (UID: \"3bbc323f-3f18-42bc-b0d8-12f021d91d6b\") " pod="openstack/ovn-controller-knqfl" Feb 19 
23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.712877 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3bbc323f-3f18-42bc-b0d8-12f021d91d6b-var-run-ovn\") pod \"ovn-controller-knqfl\" (UID: \"3bbc323f-3f18-42bc-b0d8-12f021d91d6b\") " pod="openstack/ovn-controller-knqfl" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.713349 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3bbc323f-3f18-42bc-b0d8-12f021d91d6b-scripts\") pod \"ovn-controller-knqfl\" (UID: \"3bbc323f-3f18-42bc-b0d8-12f021d91d6b\") " pod="openstack/ovn-controller-knqfl" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.729928 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rx4xt\" (UniqueName: \"kubernetes.io/projected/3bbc323f-3f18-42bc-b0d8-12f021d91d6b-kube-api-access-rx4xt\") pod \"ovn-controller-knqfl\" (UID: \"3bbc323f-3f18-42bc-b0d8-12f021d91d6b\") " pod="openstack/ovn-controller-knqfl" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.813775 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/9e98c62c-20fc-462c-9973-2616cb184032-etc-ovs\") pod \"ovn-controller-ovs-lrv52\" (UID: \"9e98c62c-20fc-462c-9973-2616cb184032\") " pod="openstack/ovn-controller-ovs-lrv52" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.813823 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/9e98c62c-20fc-462c-9973-2616cb184032-var-log\") pod \"ovn-controller-ovs-lrv52\" (UID: \"9e98c62c-20fc-462c-9973-2616cb184032\") " pod="openstack/ovn-controller-ovs-lrv52" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.813887 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4v7wq\" (UniqueName: 
\"kubernetes.io/projected/9e98c62c-20fc-462c-9973-2616cb184032-kube-api-access-4v7wq\") pod \"ovn-controller-ovs-lrv52\" (UID: \"9e98c62c-20fc-462c-9973-2616cb184032\") " pod="openstack/ovn-controller-ovs-lrv52" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.813908 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/9e98c62c-20fc-462c-9973-2616cb184032-var-lib\") pod \"ovn-controller-ovs-lrv52\" (UID: \"9e98c62c-20fc-462c-9973-2616cb184032\") " pod="openstack/ovn-controller-ovs-lrv52" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.813942 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/9e98c62c-20fc-462c-9973-2616cb184032-etc-ovs\") pod \"ovn-controller-ovs-lrv52\" (UID: \"9e98c62c-20fc-462c-9973-2616cb184032\") " pod="openstack/ovn-controller-ovs-lrv52" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.813948 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9e98c62c-20fc-462c-9973-2616cb184032-var-run\") pod \"ovn-controller-ovs-lrv52\" (UID: \"9e98c62c-20fc-462c-9973-2616cb184032\") " pod="openstack/ovn-controller-ovs-lrv52" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.813988 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9e98c62c-20fc-462c-9973-2616cb184032-var-run\") pod \"ovn-controller-ovs-lrv52\" (UID: \"9e98c62c-20fc-462c-9973-2616cb184032\") " pod="openstack/ovn-controller-ovs-lrv52" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.814031 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/9e98c62c-20fc-462c-9973-2616cb184032-var-log\") pod \"ovn-controller-ovs-lrv52\" (UID: \"9e98c62c-20fc-462c-9973-2616cb184032\") " 
pod="openstack/ovn-controller-ovs-lrv52" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.814037 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9e98c62c-20fc-462c-9973-2616cb184032-scripts\") pod \"ovn-controller-ovs-lrv52\" (UID: \"9e98c62c-20fc-462c-9973-2616cb184032\") " pod="openstack/ovn-controller-ovs-lrv52" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.814320 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/9e98c62c-20fc-462c-9973-2616cb184032-var-lib\") pod \"ovn-controller-ovs-lrv52\" (UID: \"9e98c62c-20fc-462c-9973-2616cb184032\") " pod="openstack/ovn-controller-ovs-lrv52" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.815912 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9e98c62c-20fc-462c-9973-2616cb184032-scripts\") pod \"ovn-controller-ovs-lrv52\" (UID: \"9e98c62c-20fc-462c-9973-2616cb184032\") " pod="openstack/ovn-controller-ovs-lrv52" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.829234 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4v7wq\" (UniqueName: \"kubernetes.io/projected/9e98c62c-20fc-462c-9973-2616cb184032-kube-api-access-4v7wq\") pod \"ovn-controller-ovs-lrv52\" (UID: \"9e98c62c-20fc-462c-9973-2616cb184032\") " pod="openstack/ovn-controller-ovs-lrv52" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.889718 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-knqfl" Feb 19 23:02:23 crc kubenswrapper[4795]: I0219 23:02:23.907545 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-lrv52" Feb 19 23:02:24 crc kubenswrapper[4795]: I0219 23:02:24.370302 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-knqfl"] Feb 19 23:02:24 crc kubenswrapper[4795]: I0219 23:02:24.780128 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-lrv52"] Feb 19 23:02:24 crc kubenswrapper[4795]: W0219 23:02:24.781600 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9e98c62c_20fc_462c_9973_2616cb184032.slice/crio-cf7f93d39e5b96fef8397c2e3eb29d0e53274c9b6760f32bae9d3cedc9e2c9ec WatchSource:0}: Error finding container cf7f93d39e5b96fef8397c2e3eb29d0e53274c9b6760f32bae9d3cedc9e2c9ec: Status 404 returned error can't find the container with id cf7f93d39e5b96fef8397c2e3eb29d0e53274c9b6760f32bae9d3cedc9e2c9ec Feb 19 23:02:24 crc kubenswrapper[4795]: I0219 23:02:24.783661 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-knqfl" event={"ID":"3bbc323f-3f18-42bc-b0d8-12f021d91d6b","Type":"ContainerStarted","Data":"b63d30a586da308b8a8d09e24fdb92d0880e58235af42ed511eb9acb18ec4616"} Feb 19 23:02:24 crc kubenswrapper[4795]: I0219 23:02:24.783695 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-knqfl" event={"ID":"3bbc323f-3f18-42bc-b0d8-12f021d91d6b","Type":"ContainerStarted","Data":"a2192e4b328ebbe549f9817abd6e79e991147dad6ae8a53ca760a3d199231af1"} Feb 19 23:02:24 crc kubenswrapper[4795]: I0219 23:02:24.784361 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-knqfl" Feb 19 23:02:24 crc kubenswrapper[4795]: I0219 23:02:24.808298 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-knqfl" podStartSLOduration=1.808274002 podStartE2EDuration="1.808274002s" podCreationTimestamp="2026-02-19 23:02:23 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:02:24.798290888 +0000 UTC m=+5655.990808752" watchObservedRunningTime="2026-02-19 23:02:24.808274002 +0000 UTC m=+5656.000791876" Feb 19 23:02:25 crc kubenswrapper[4795]: I0219 23:02:25.068281 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-kx9qd"] Feb 19 23:02:25 crc kubenswrapper[4795]: I0219 23:02:25.070055 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-kx9qd" Feb 19 23:02:25 crc kubenswrapper[4795]: I0219 23:02:25.074857 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Feb 19 23:02:25 crc kubenswrapper[4795]: I0219 23:02:25.088247 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-kx9qd"] Feb 19 23:02:25 crc kubenswrapper[4795]: I0219 23:02:25.146331 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/b48804d5-a275-45dd-896c-f35b7a322690-ovn-rundir\") pod \"ovn-controller-metrics-kx9qd\" (UID: \"b48804d5-a275-45dd-896c-f35b7a322690\") " pod="openstack/ovn-controller-metrics-kx9qd" Feb 19 23:02:25 crc kubenswrapper[4795]: I0219 23:02:25.146481 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/b48804d5-a275-45dd-896c-f35b7a322690-ovs-rundir\") pod \"ovn-controller-metrics-kx9qd\" (UID: \"b48804d5-a275-45dd-896c-f35b7a322690\") " pod="openstack/ovn-controller-metrics-kx9qd" Feb 19 23:02:25 crc kubenswrapper[4795]: I0219 23:02:25.146538 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgpd8\" (UniqueName: 
\"kubernetes.io/projected/b48804d5-a275-45dd-896c-f35b7a322690-kube-api-access-wgpd8\") pod \"ovn-controller-metrics-kx9qd\" (UID: \"b48804d5-a275-45dd-896c-f35b7a322690\") " pod="openstack/ovn-controller-metrics-kx9qd" Feb 19 23:02:25 crc kubenswrapper[4795]: I0219 23:02:25.146654 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b48804d5-a275-45dd-896c-f35b7a322690-config\") pod \"ovn-controller-metrics-kx9qd\" (UID: \"b48804d5-a275-45dd-896c-f35b7a322690\") " pod="openstack/ovn-controller-metrics-kx9qd" Feb 19 23:02:25 crc kubenswrapper[4795]: I0219 23:02:25.248642 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/b48804d5-a275-45dd-896c-f35b7a322690-ovn-rundir\") pod \"ovn-controller-metrics-kx9qd\" (UID: \"b48804d5-a275-45dd-896c-f35b7a322690\") " pod="openstack/ovn-controller-metrics-kx9qd" Feb 19 23:02:25 crc kubenswrapper[4795]: I0219 23:02:25.248772 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/b48804d5-a275-45dd-896c-f35b7a322690-ovs-rundir\") pod \"ovn-controller-metrics-kx9qd\" (UID: \"b48804d5-a275-45dd-896c-f35b7a322690\") " pod="openstack/ovn-controller-metrics-kx9qd" Feb 19 23:02:25 crc kubenswrapper[4795]: I0219 23:02:25.248802 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgpd8\" (UniqueName: \"kubernetes.io/projected/b48804d5-a275-45dd-896c-f35b7a322690-kube-api-access-wgpd8\") pod \"ovn-controller-metrics-kx9qd\" (UID: \"b48804d5-a275-45dd-896c-f35b7a322690\") " pod="openstack/ovn-controller-metrics-kx9qd" Feb 19 23:02:25 crc kubenswrapper[4795]: I0219 23:02:25.248865 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/b48804d5-a275-45dd-896c-f35b7a322690-config\") pod \"ovn-controller-metrics-kx9qd\" (UID: \"b48804d5-a275-45dd-896c-f35b7a322690\") " pod="openstack/ovn-controller-metrics-kx9qd" Feb 19 23:02:25 crc kubenswrapper[4795]: I0219 23:02:25.248972 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/b48804d5-a275-45dd-896c-f35b7a322690-ovs-rundir\") pod \"ovn-controller-metrics-kx9qd\" (UID: \"b48804d5-a275-45dd-896c-f35b7a322690\") " pod="openstack/ovn-controller-metrics-kx9qd" Feb 19 23:02:25 crc kubenswrapper[4795]: I0219 23:02:25.248969 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/b48804d5-a275-45dd-896c-f35b7a322690-ovn-rundir\") pod \"ovn-controller-metrics-kx9qd\" (UID: \"b48804d5-a275-45dd-896c-f35b7a322690\") " pod="openstack/ovn-controller-metrics-kx9qd" Feb 19 23:02:25 crc kubenswrapper[4795]: I0219 23:02:25.249743 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b48804d5-a275-45dd-896c-f35b7a322690-config\") pod \"ovn-controller-metrics-kx9qd\" (UID: \"b48804d5-a275-45dd-896c-f35b7a322690\") " pod="openstack/ovn-controller-metrics-kx9qd" Feb 19 23:02:25 crc kubenswrapper[4795]: I0219 23:02:25.281189 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgpd8\" (UniqueName: \"kubernetes.io/projected/b48804d5-a275-45dd-896c-f35b7a322690-kube-api-access-wgpd8\") pod \"ovn-controller-metrics-kx9qd\" (UID: \"b48804d5-a275-45dd-896c-f35b7a322690\") " pod="openstack/ovn-controller-metrics-kx9qd" Feb 19 23:02:25 crc kubenswrapper[4795]: I0219 23:02:25.397192 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-kx9qd" Feb 19 23:02:25 crc kubenswrapper[4795]: I0219 23:02:25.792392 4795 generic.go:334] "Generic (PLEG): container finished" podID="9e98c62c-20fc-462c-9973-2616cb184032" containerID="b00e9c8b111bd9cfeb5dfdb129b307c5979b745ca4ba7e8292aa3f29a3405232" exitCode=0 Feb 19 23:02:25 crc kubenswrapper[4795]: I0219 23:02:25.792448 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-lrv52" event={"ID":"9e98c62c-20fc-462c-9973-2616cb184032","Type":"ContainerDied","Data":"b00e9c8b111bd9cfeb5dfdb129b307c5979b745ca4ba7e8292aa3f29a3405232"} Feb 19 23:02:25 crc kubenswrapper[4795]: I0219 23:02:25.792697 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-lrv52" event={"ID":"9e98c62c-20fc-462c-9973-2616cb184032","Type":"ContainerStarted","Data":"cf7f93d39e5b96fef8397c2e3eb29d0e53274c9b6760f32bae9d3cedc9e2c9ec"} Feb 19 23:02:25 crc kubenswrapper[4795]: I0219 23:02:25.854258 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-kx9qd"] Feb 19 23:02:26 crc kubenswrapper[4795]: I0219 23:02:26.802527 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-kx9qd" event={"ID":"b48804d5-a275-45dd-896c-f35b7a322690","Type":"ContainerStarted","Data":"e8ae1b3ad036414afe3416f20a1f10c4b5931e9c43dc1afd4e11101790124a0d"} Feb 19 23:02:26 crc kubenswrapper[4795]: I0219 23:02:26.803204 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-kx9qd" event={"ID":"b48804d5-a275-45dd-896c-f35b7a322690","Type":"ContainerStarted","Data":"cc310fc1cee67a993981338a5f14330a9d8f36fcd9ccbe97e51c5a20c1afb110"} Feb 19 23:02:26 crc kubenswrapper[4795]: I0219 23:02:26.812452 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-lrv52" 
event={"ID":"9e98c62c-20fc-462c-9973-2616cb184032","Type":"ContainerStarted","Data":"8975fba2cfc268c57f22092b674d4659416031c54a1449b009e4d0297e7c9dbb"} Feb 19 23:02:26 crc kubenswrapper[4795]: I0219 23:02:26.812511 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-lrv52" event={"ID":"9e98c62c-20fc-462c-9973-2616cb184032","Type":"ContainerStarted","Data":"ae4b9c53934ac248ca2b3d73eac1cbc515c2e9bbb8a2d212f4a6d383268f1547"} Feb 19 23:02:26 crc kubenswrapper[4795]: I0219 23:02:26.812787 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-lrv52" Feb 19 23:02:26 crc kubenswrapper[4795]: I0219 23:02:26.812850 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-lrv52" Feb 19 23:02:26 crc kubenswrapper[4795]: I0219 23:02:26.826989 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-kx9qd" podStartSLOduration=2.826973335 podStartE2EDuration="2.826973335s" podCreationTimestamp="2026-02-19 23:02:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:02:26.820124101 +0000 UTC m=+5658.012641965" watchObservedRunningTime="2026-02-19 23:02:26.826973335 +0000 UTC m=+5658.019491199" Feb 19 23:02:26 crc kubenswrapper[4795]: I0219 23:02:26.858888 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-lrv52" podStartSLOduration=3.8587940400000003 podStartE2EDuration="3.85879404s" podCreationTimestamp="2026-02-19 23:02:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:02:26.846953903 +0000 UTC m=+5658.039471777" watchObservedRunningTime="2026-02-19 23:02:26.85879404 +0000 UTC m=+5658.051311904" Feb 19 23:02:28 crc kubenswrapper[4795]: I0219 
23:02:28.427357 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 23:02:28 crc kubenswrapper[4795]: I0219 23:02:28.427727 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 23:02:28 crc kubenswrapper[4795]: I0219 23:02:28.427793 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" Feb 19 23:02:28 crc kubenswrapper[4795]: I0219 23:02:28.428655 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"18170a71e3fd2c593ee17e2961f48e8d3663518910bf7cf2c79258393902484c"} pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 23:02:28 crc kubenswrapper[4795]: I0219 23:02:28.428716 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" containerID="cri-o://18170a71e3fd2c593ee17e2961f48e8d3663518910bf7cf2c79258393902484c" gracePeriod=600 Feb 19 23:02:28 crc kubenswrapper[4795]: E0219 23:02:28.549268 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:02:28 crc kubenswrapper[4795]: I0219 23:02:28.838709 4795 generic.go:334] "Generic (PLEG): container finished" podID="7591bc58-96f5-486a-8653-0ad93938b019" containerID="18170a71e3fd2c593ee17e2961f48e8d3663518910bf7cf2c79258393902484c" exitCode=0 Feb 19 23:02:28 crc kubenswrapper[4795]: I0219 23:02:28.838770 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerDied","Data":"18170a71e3fd2c593ee17e2961f48e8d3663518910bf7cf2c79258393902484c"} Feb 19 23:02:28 crc kubenswrapper[4795]: I0219 23:02:28.838822 4795 scope.go:117] "RemoveContainer" containerID="fd820b5d0adc1705d78ec76939670a71a79ca07206c8d0459e23712d0b015f16" Feb 19 23:02:28 crc kubenswrapper[4795]: I0219 23:02:28.839708 4795 scope.go:117] "RemoveContainer" containerID="18170a71e3fd2c593ee17e2961f48e8d3663518910bf7cf2c79258393902484c" Feb 19 23:02:28 crc kubenswrapper[4795]: E0219 23:02:28.840227 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:02:29 crc kubenswrapper[4795]: I0219 23:02:29.043488 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-bm6ln"] Feb 19 23:02:29 crc kubenswrapper[4795]: I0219 23:02:29.056516 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/keystone-db-sync-bm6ln"] Feb 19 23:02:29 crc kubenswrapper[4795]: I0219 23:02:29.524482 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96625ae6-8eb0-43d0-a180-20c79dfd6717" path="/var/lib/kubelet/pods/96625ae6-8eb0-43d0-a180-20c79dfd6717/volumes" Feb 19 23:02:39 crc kubenswrapper[4795]: I0219 23:02:39.322798 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-db-create-rfntz"] Feb 19 23:02:39 crc kubenswrapper[4795]: I0219 23:02:39.325655 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-rfntz" Feb 19 23:02:39 crc kubenswrapper[4795]: I0219 23:02:39.338988 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-create-rfntz"] Feb 19 23:02:39 crc kubenswrapper[4795]: I0219 23:02:39.386901 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7389820e-b641-4068-b624-af539a234699-operator-scripts\") pod \"octavia-db-create-rfntz\" (UID: \"7389820e-b641-4068-b624-af539a234699\") " pod="openstack/octavia-db-create-rfntz" Feb 19 23:02:39 crc kubenswrapper[4795]: I0219 23:02:39.386971 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twfsc\" (UniqueName: \"kubernetes.io/projected/7389820e-b641-4068-b624-af539a234699-kube-api-access-twfsc\") pod \"octavia-db-create-rfntz\" (UID: \"7389820e-b641-4068-b624-af539a234699\") " pod="openstack/octavia-db-create-rfntz" Feb 19 23:02:39 crc kubenswrapper[4795]: I0219 23:02:39.488696 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twfsc\" (UniqueName: \"kubernetes.io/projected/7389820e-b641-4068-b624-af539a234699-kube-api-access-twfsc\") pod \"octavia-db-create-rfntz\" (UID: \"7389820e-b641-4068-b624-af539a234699\") " pod="openstack/octavia-db-create-rfntz" Feb 19 
23:02:39 crc kubenswrapper[4795]: I0219 23:02:39.488919 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7389820e-b641-4068-b624-af539a234699-operator-scripts\") pod \"octavia-db-create-rfntz\" (UID: \"7389820e-b641-4068-b624-af539a234699\") " pod="openstack/octavia-db-create-rfntz" Feb 19 23:02:39 crc kubenswrapper[4795]: I0219 23:02:39.489947 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7389820e-b641-4068-b624-af539a234699-operator-scripts\") pod \"octavia-db-create-rfntz\" (UID: \"7389820e-b641-4068-b624-af539a234699\") " pod="openstack/octavia-db-create-rfntz" Feb 19 23:02:39 crc kubenswrapper[4795]: I0219 23:02:39.510424 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twfsc\" (UniqueName: \"kubernetes.io/projected/7389820e-b641-4068-b624-af539a234699-kube-api-access-twfsc\") pod \"octavia-db-create-rfntz\" (UID: \"7389820e-b641-4068-b624-af539a234699\") " pod="openstack/octavia-db-create-rfntz" Feb 19 23:02:39 crc kubenswrapper[4795]: I0219 23:02:39.652848 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-rfntz" Feb 19 23:02:40 crc kubenswrapper[4795]: I0219 23:02:40.123776 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-create-rfntz"] Feb 19 23:02:40 crc kubenswrapper[4795]: I0219 23:02:40.577194 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-c873-account-create-update-fggql"] Feb 19 23:02:40 crc kubenswrapper[4795]: I0219 23:02:40.578520 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-c873-account-create-update-fggql" Feb 19 23:02:40 crc kubenswrapper[4795]: I0219 23:02:40.581108 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-db-secret" Feb 19 23:02:40 crc kubenswrapper[4795]: I0219 23:02:40.611446 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qb4qp\" (UniqueName: \"kubernetes.io/projected/9ca0a783-4d18-4d0a-81d8-7cc1970379a9-kube-api-access-qb4qp\") pod \"octavia-c873-account-create-update-fggql\" (UID: \"9ca0a783-4d18-4d0a-81d8-7cc1970379a9\") " pod="openstack/octavia-c873-account-create-update-fggql" Feb 19 23:02:40 crc kubenswrapper[4795]: I0219 23:02:40.611687 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ca0a783-4d18-4d0a-81d8-7cc1970379a9-operator-scripts\") pod \"octavia-c873-account-create-update-fggql\" (UID: \"9ca0a783-4d18-4d0a-81d8-7cc1970379a9\") " pod="openstack/octavia-c873-account-create-update-fggql" Feb 19 23:02:40 crc kubenswrapper[4795]: I0219 23:02:40.642931 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-c873-account-create-update-fggql"] Feb 19 23:02:40 crc kubenswrapper[4795]: I0219 23:02:40.713294 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qb4qp\" (UniqueName: \"kubernetes.io/projected/9ca0a783-4d18-4d0a-81d8-7cc1970379a9-kube-api-access-qb4qp\") pod \"octavia-c873-account-create-update-fggql\" (UID: \"9ca0a783-4d18-4d0a-81d8-7cc1970379a9\") " pod="openstack/octavia-c873-account-create-update-fggql" Feb 19 23:02:40 crc kubenswrapper[4795]: I0219 23:02:40.713422 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ca0a783-4d18-4d0a-81d8-7cc1970379a9-operator-scripts\") pod 
\"octavia-c873-account-create-update-fggql\" (UID: \"9ca0a783-4d18-4d0a-81d8-7cc1970379a9\") " pod="openstack/octavia-c873-account-create-update-fggql" Feb 19 23:02:40 crc kubenswrapper[4795]: I0219 23:02:40.714520 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ca0a783-4d18-4d0a-81d8-7cc1970379a9-operator-scripts\") pod \"octavia-c873-account-create-update-fggql\" (UID: \"9ca0a783-4d18-4d0a-81d8-7cc1970379a9\") " pod="openstack/octavia-c873-account-create-update-fggql" Feb 19 23:02:40 crc kubenswrapper[4795]: I0219 23:02:40.737468 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qb4qp\" (UniqueName: \"kubernetes.io/projected/9ca0a783-4d18-4d0a-81d8-7cc1970379a9-kube-api-access-qb4qp\") pod \"octavia-c873-account-create-update-fggql\" (UID: \"9ca0a783-4d18-4d0a-81d8-7cc1970379a9\") " pod="openstack/octavia-c873-account-create-update-fggql" Feb 19 23:02:40 crc kubenswrapper[4795]: I0219 23:02:40.894132 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-c873-account-create-update-fggql" Feb 19 23:02:40 crc kubenswrapper[4795]: I0219 23:02:40.950858 4795 generic.go:334] "Generic (PLEG): container finished" podID="7389820e-b641-4068-b624-af539a234699" containerID="461ac9425c34a3821048eb55409a0a70365c0acacbfa307ea4409068b90afe68" exitCode=0 Feb 19 23:02:40 crc kubenswrapper[4795]: I0219 23:02:40.950901 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-rfntz" event={"ID":"7389820e-b641-4068-b624-af539a234699","Type":"ContainerDied","Data":"461ac9425c34a3821048eb55409a0a70365c0acacbfa307ea4409068b90afe68"} Feb 19 23:02:40 crc kubenswrapper[4795]: I0219 23:02:40.950924 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-rfntz" event={"ID":"7389820e-b641-4068-b624-af539a234699","Type":"ContainerStarted","Data":"2d1715a61d0ec12b7dcc1eff75ded916d2307e4020ffa4bcd927cf7a2bbbe7c6"} Feb 19 23:02:41 crc kubenswrapper[4795]: I0219 23:02:41.349654 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-c873-account-create-update-fggql"] Feb 19 23:02:41 crc kubenswrapper[4795]: I0219 23:02:41.967097 4795 generic.go:334] "Generic (PLEG): container finished" podID="9ca0a783-4d18-4d0a-81d8-7cc1970379a9" containerID="fe6c050cae9125e8669040d524bc8951c45b9effbc5921d0ee07f69c88d514a2" exitCode=0 Feb 19 23:02:41 crc kubenswrapper[4795]: I0219 23:02:41.967299 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-c873-account-create-update-fggql" event={"ID":"9ca0a783-4d18-4d0a-81d8-7cc1970379a9","Type":"ContainerDied","Data":"fe6c050cae9125e8669040d524bc8951c45b9effbc5921d0ee07f69c88d514a2"} Feb 19 23:02:41 crc kubenswrapper[4795]: I0219 23:02:41.967325 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-c873-account-create-update-fggql" 
event={"ID":"9ca0a783-4d18-4d0a-81d8-7cc1970379a9","Type":"ContainerStarted","Data":"9f0e85eba5f4b32dbdff81ed3ccaef5e2ea7cb994e8d4f47c422be26c462d6b7"} Feb 19 23:02:42 crc kubenswrapper[4795]: I0219 23:02:42.064024 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-tghht"] Feb 19 23:02:42 crc kubenswrapper[4795]: I0219 23:02:42.070973 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-tghht"] Feb 19 23:02:42 crc kubenswrapper[4795]: I0219 23:02:42.398006 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-rfntz" Feb 19 23:02:42 crc kubenswrapper[4795]: I0219 23:02:42.511532 4795 scope.go:117] "RemoveContainer" containerID="18170a71e3fd2c593ee17e2961f48e8d3663518910bf7cf2c79258393902484c" Feb 19 23:02:42 crc kubenswrapper[4795]: E0219 23:02:42.511848 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:02:42 crc kubenswrapper[4795]: I0219 23:02:42.564870 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7389820e-b641-4068-b624-af539a234699-operator-scripts\") pod \"7389820e-b641-4068-b624-af539a234699\" (UID: \"7389820e-b641-4068-b624-af539a234699\") " Feb 19 23:02:42 crc kubenswrapper[4795]: I0219 23:02:42.565065 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twfsc\" (UniqueName: \"kubernetes.io/projected/7389820e-b641-4068-b624-af539a234699-kube-api-access-twfsc\") pod 
\"7389820e-b641-4068-b624-af539a234699\" (UID: \"7389820e-b641-4068-b624-af539a234699\") " Feb 19 23:02:42 crc kubenswrapper[4795]: I0219 23:02:42.565651 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7389820e-b641-4068-b624-af539a234699-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7389820e-b641-4068-b624-af539a234699" (UID: "7389820e-b641-4068-b624-af539a234699"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:02:42 crc kubenswrapper[4795]: I0219 23:02:42.566458 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7389820e-b641-4068-b624-af539a234699-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:02:42 crc kubenswrapper[4795]: I0219 23:02:42.578334 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7389820e-b641-4068-b624-af539a234699-kube-api-access-twfsc" (OuterVolumeSpecName: "kube-api-access-twfsc") pod "7389820e-b641-4068-b624-af539a234699" (UID: "7389820e-b641-4068-b624-af539a234699"). InnerVolumeSpecName "kube-api-access-twfsc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:02:42 crc kubenswrapper[4795]: I0219 23:02:42.668211 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-twfsc\" (UniqueName: \"kubernetes.io/projected/7389820e-b641-4068-b624-af539a234699-kube-api-access-twfsc\") on node \"crc\" DevicePath \"\"" Feb 19 23:02:42 crc kubenswrapper[4795]: I0219 23:02:42.980784 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-create-rfntz" Feb 19 23:02:42 crc kubenswrapper[4795]: I0219 23:02:42.980816 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-rfntz" event={"ID":"7389820e-b641-4068-b624-af539a234699","Type":"ContainerDied","Data":"2d1715a61d0ec12b7dcc1eff75ded916d2307e4020ffa4bcd927cf7a2bbbe7c6"} Feb 19 23:02:42 crc kubenswrapper[4795]: I0219 23:02:42.980891 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d1715a61d0ec12b7dcc1eff75ded916d2307e4020ffa4bcd927cf7a2bbbe7c6" Feb 19 23:02:43 crc kubenswrapper[4795]: I0219 23:02:43.397311 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-c873-account-create-update-fggql" Feb 19 23:02:43 crc kubenswrapper[4795]: I0219 23:02:43.494904 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qb4qp\" (UniqueName: \"kubernetes.io/projected/9ca0a783-4d18-4d0a-81d8-7cc1970379a9-kube-api-access-qb4qp\") pod \"9ca0a783-4d18-4d0a-81d8-7cc1970379a9\" (UID: \"9ca0a783-4d18-4d0a-81d8-7cc1970379a9\") " Feb 19 23:02:43 crc kubenswrapper[4795]: I0219 23:02:43.494965 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ca0a783-4d18-4d0a-81d8-7cc1970379a9-operator-scripts\") pod \"9ca0a783-4d18-4d0a-81d8-7cc1970379a9\" (UID: \"9ca0a783-4d18-4d0a-81d8-7cc1970379a9\") " Feb 19 23:02:43 crc kubenswrapper[4795]: I0219 23:02:43.496367 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ca0a783-4d18-4d0a-81d8-7cc1970379a9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9ca0a783-4d18-4d0a-81d8-7cc1970379a9" (UID: "9ca0a783-4d18-4d0a-81d8-7cc1970379a9"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:02:43 crc kubenswrapper[4795]: I0219 23:02:43.500370 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ca0a783-4d18-4d0a-81d8-7cc1970379a9-kube-api-access-qb4qp" (OuterVolumeSpecName: "kube-api-access-qb4qp") pod "9ca0a783-4d18-4d0a-81d8-7cc1970379a9" (UID: "9ca0a783-4d18-4d0a-81d8-7cc1970379a9"). InnerVolumeSpecName "kube-api-access-qb4qp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:02:43 crc kubenswrapper[4795]: I0219 23:02:43.527296 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe89b6c7-308b-42a8-92a9-da093d6bbae4" path="/var/lib/kubelet/pods/fe89b6c7-308b-42a8-92a9-da093d6bbae4/volumes" Feb 19 23:02:43 crc kubenswrapper[4795]: I0219 23:02:43.596906 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qb4qp\" (UniqueName: \"kubernetes.io/projected/9ca0a783-4d18-4d0a-81d8-7cc1970379a9-kube-api-access-qb4qp\") on node \"crc\" DevicePath \"\"" Feb 19 23:02:43 crc kubenswrapper[4795]: I0219 23:02:43.596966 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ca0a783-4d18-4d0a-81d8-7cc1970379a9-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:02:43 crc kubenswrapper[4795]: I0219 23:02:43.989427 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-c873-account-create-update-fggql" event={"ID":"9ca0a783-4d18-4d0a-81d8-7cc1970379a9","Type":"ContainerDied","Data":"9f0e85eba5f4b32dbdff81ed3ccaef5e2ea7cb994e8d4f47c422be26c462d6b7"} Feb 19 23:02:43 crc kubenswrapper[4795]: I0219 23:02:43.989467 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f0e85eba5f4b32dbdff81ed3ccaef5e2ea7cb994e8d4f47c422be26c462d6b7" Feb 19 23:02:43 crc kubenswrapper[4795]: I0219 23:02:43.989470 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-c873-account-create-update-fggql" Feb 19 23:02:45 crc kubenswrapper[4795]: I0219 23:02:45.795313 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-persistence-db-create-2vlg7"] Feb 19 23:02:45 crc kubenswrapper[4795]: E0219 23:02:45.796096 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ca0a783-4d18-4d0a-81d8-7cc1970379a9" containerName="mariadb-account-create-update" Feb 19 23:02:45 crc kubenswrapper[4795]: I0219 23:02:45.796150 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ca0a783-4d18-4d0a-81d8-7cc1970379a9" containerName="mariadb-account-create-update" Feb 19 23:02:45 crc kubenswrapper[4795]: E0219 23:02:45.796191 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7389820e-b641-4068-b624-af539a234699" containerName="mariadb-database-create" Feb 19 23:02:45 crc kubenswrapper[4795]: I0219 23:02:45.796201 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7389820e-b641-4068-b624-af539a234699" containerName="mariadb-database-create" Feb 19 23:02:45 crc kubenswrapper[4795]: I0219 23:02:45.796468 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ca0a783-4d18-4d0a-81d8-7cc1970379a9" containerName="mariadb-account-create-update" Feb 19 23:02:45 crc kubenswrapper[4795]: I0219 23:02:45.796496 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="7389820e-b641-4068-b624-af539a234699" containerName="mariadb-database-create" Feb 19 23:02:45 crc kubenswrapper[4795]: I0219 23:02:45.797227 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-persistence-db-create-2vlg7" Feb 19 23:02:45 crc kubenswrapper[4795]: I0219 23:02:45.830210 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-persistence-db-create-2vlg7"] Feb 19 23:02:45 crc kubenswrapper[4795]: I0219 23:02:45.933906 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsnmq\" (UniqueName: \"kubernetes.io/projected/2af5a019-2aa4-449d-a1a5-148cbf8a1ffa-kube-api-access-nsnmq\") pod \"octavia-persistence-db-create-2vlg7\" (UID: \"2af5a019-2aa4-449d-a1a5-148cbf8a1ffa\") " pod="openstack/octavia-persistence-db-create-2vlg7" Feb 19 23:02:45 crc kubenswrapper[4795]: I0219 23:02:45.934066 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2af5a019-2aa4-449d-a1a5-148cbf8a1ffa-operator-scripts\") pod \"octavia-persistence-db-create-2vlg7\" (UID: \"2af5a019-2aa4-449d-a1a5-148cbf8a1ffa\") " pod="openstack/octavia-persistence-db-create-2vlg7" Feb 19 23:02:46 crc kubenswrapper[4795]: I0219 23:02:46.035872 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nsnmq\" (UniqueName: \"kubernetes.io/projected/2af5a019-2aa4-449d-a1a5-148cbf8a1ffa-kube-api-access-nsnmq\") pod \"octavia-persistence-db-create-2vlg7\" (UID: \"2af5a019-2aa4-449d-a1a5-148cbf8a1ffa\") " pod="openstack/octavia-persistence-db-create-2vlg7" Feb 19 23:02:46 crc kubenswrapper[4795]: I0219 23:02:46.036097 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2af5a019-2aa4-449d-a1a5-148cbf8a1ffa-operator-scripts\") pod \"octavia-persistence-db-create-2vlg7\" (UID: \"2af5a019-2aa4-449d-a1a5-148cbf8a1ffa\") " pod="openstack/octavia-persistence-db-create-2vlg7" Feb 19 23:02:46 crc kubenswrapper[4795]: I0219 23:02:46.037444 
4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2af5a019-2aa4-449d-a1a5-148cbf8a1ffa-operator-scripts\") pod \"octavia-persistence-db-create-2vlg7\" (UID: \"2af5a019-2aa4-449d-a1a5-148cbf8a1ffa\") " pod="openstack/octavia-persistence-db-create-2vlg7" Feb 19 23:02:46 crc kubenswrapper[4795]: I0219 23:02:46.058825 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsnmq\" (UniqueName: \"kubernetes.io/projected/2af5a019-2aa4-449d-a1a5-148cbf8a1ffa-kube-api-access-nsnmq\") pod \"octavia-persistence-db-create-2vlg7\" (UID: \"2af5a019-2aa4-449d-a1a5-148cbf8a1ffa\") " pod="openstack/octavia-persistence-db-create-2vlg7" Feb 19 23:02:46 crc kubenswrapper[4795]: I0219 23:02:46.119316 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-2vlg7" Feb 19 23:02:46 crc kubenswrapper[4795]: I0219 23:02:46.292995 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-75e1-account-create-update-mq672"] Feb 19 23:02:46 crc kubenswrapper[4795]: I0219 23:02:46.294764 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-75e1-account-create-update-mq672" Feb 19 23:02:46 crc kubenswrapper[4795]: I0219 23:02:46.299240 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-persistence-db-secret" Feb 19 23:02:46 crc kubenswrapper[4795]: I0219 23:02:46.321286 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-75e1-account-create-update-mq672"] Feb 19 23:02:46 crc kubenswrapper[4795]: I0219 23:02:46.444040 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtbjb\" (UniqueName: \"kubernetes.io/projected/0ea8f33a-4a23-4058-a6f1-ccd27d64f1f2-kube-api-access-xtbjb\") pod \"octavia-75e1-account-create-update-mq672\" (UID: \"0ea8f33a-4a23-4058-a6f1-ccd27d64f1f2\") " pod="openstack/octavia-75e1-account-create-update-mq672" Feb 19 23:02:46 crc kubenswrapper[4795]: I0219 23:02:46.444379 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ea8f33a-4a23-4058-a6f1-ccd27d64f1f2-operator-scripts\") pod \"octavia-75e1-account-create-update-mq672\" (UID: \"0ea8f33a-4a23-4058-a6f1-ccd27d64f1f2\") " pod="openstack/octavia-75e1-account-create-update-mq672" Feb 19 23:02:46 crc kubenswrapper[4795]: I0219 23:02:46.546073 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtbjb\" (UniqueName: \"kubernetes.io/projected/0ea8f33a-4a23-4058-a6f1-ccd27d64f1f2-kube-api-access-xtbjb\") pod \"octavia-75e1-account-create-update-mq672\" (UID: \"0ea8f33a-4a23-4058-a6f1-ccd27d64f1f2\") " pod="openstack/octavia-75e1-account-create-update-mq672" Feb 19 23:02:46 crc kubenswrapper[4795]: I0219 23:02:46.546117 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/0ea8f33a-4a23-4058-a6f1-ccd27d64f1f2-operator-scripts\") pod \"octavia-75e1-account-create-update-mq672\" (UID: \"0ea8f33a-4a23-4058-a6f1-ccd27d64f1f2\") " pod="openstack/octavia-75e1-account-create-update-mq672" Feb 19 23:02:46 crc kubenswrapper[4795]: I0219 23:02:46.546844 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ea8f33a-4a23-4058-a6f1-ccd27d64f1f2-operator-scripts\") pod \"octavia-75e1-account-create-update-mq672\" (UID: \"0ea8f33a-4a23-4058-a6f1-ccd27d64f1f2\") " pod="openstack/octavia-75e1-account-create-update-mq672" Feb 19 23:02:46 crc kubenswrapper[4795]: I0219 23:02:46.563650 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtbjb\" (UniqueName: \"kubernetes.io/projected/0ea8f33a-4a23-4058-a6f1-ccd27d64f1f2-kube-api-access-xtbjb\") pod \"octavia-75e1-account-create-update-mq672\" (UID: \"0ea8f33a-4a23-4058-a6f1-ccd27d64f1f2\") " pod="openstack/octavia-75e1-account-create-update-mq672" Feb 19 23:02:46 crc kubenswrapper[4795]: I0219 23:02:46.604790 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-persistence-db-create-2vlg7"] Feb 19 23:02:46 crc kubenswrapper[4795]: W0219 23:02:46.608088 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2af5a019_2aa4_449d_a1a5_148cbf8a1ffa.slice/crio-d39a053ef75b8e4887773a29b0af9d375a1f9f156f0f9ab43541043f16242fbc WatchSource:0}: Error finding container d39a053ef75b8e4887773a29b0af9d375a1f9f156f0f9ab43541043f16242fbc: Status 404 returned error can't find the container with id d39a053ef75b8e4887773a29b0af9d375a1f9f156f0f9ab43541043f16242fbc Feb 19 23:02:46 crc kubenswrapper[4795]: I0219 23:02:46.621682 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-75e1-account-create-update-mq672" Feb 19 23:02:47 crc kubenswrapper[4795]: I0219 23:02:47.014472 4795 generic.go:334] "Generic (PLEG): container finished" podID="2af5a019-2aa4-449d-a1a5-148cbf8a1ffa" containerID="922d8fbba822aa06530b36452ecdfe6cce93b9221523be2934bb7556f81619e3" exitCode=0 Feb 19 23:02:47 crc kubenswrapper[4795]: I0219 23:02:47.014752 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-2vlg7" event={"ID":"2af5a019-2aa4-449d-a1a5-148cbf8a1ffa","Type":"ContainerDied","Data":"922d8fbba822aa06530b36452ecdfe6cce93b9221523be2934bb7556f81619e3"} Feb 19 23:02:47 crc kubenswrapper[4795]: I0219 23:02:47.014774 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-2vlg7" event={"ID":"2af5a019-2aa4-449d-a1a5-148cbf8a1ffa","Type":"ContainerStarted","Data":"d39a053ef75b8e4887773a29b0af9d375a1f9f156f0f9ab43541043f16242fbc"} Feb 19 23:02:47 crc kubenswrapper[4795]: W0219 23:02:47.103359 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ea8f33a_4a23_4058_a6f1_ccd27d64f1f2.slice/crio-b32dc138dfb0dab9a6c37caccc92e45f9ae9b912808aef3bb5c8f22c54256865 WatchSource:0}: Error finding container b32dc138dfb0dab9a6c37caccc92e45f9ae9b912808aef3bb5c8f22c54256865: Status 404 returned error can't find the container with id b32dc138dfb0dab9a6c37caccc92e45f9ae9b912808aef3bb5c8f22c54256865 Feb 19 23:02:47 crc kubenswrapper[4795]: I0219 23:02:47.103501 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-75e1-account-create-update-mq672"] Feb 19 23:02:48 crc kubenswrapper[4795]: I0219 23:02:48.023413 4795 generic.go:334] "Generic (PLEG): container finished" podID="0ea8f33a-4a23-4058-a6f1-ccd27d64f1f2" containerID="8783e1276005ef506acf0881371a237a591f18c217cbbd901f218637b5c95d2c" exitCode=0 Feb 19 23:02:48 crc kubenswrapper[4795]: I0219 
23:02:48.023453 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-75e1-account-create-update-mq672" event={"ID":"0ea8f33a-4a23-4058-a6f1-ccd27d64f1f2","Type":"ContainerDied","Data":"8783e1276005ef506acf0881371a237a591f18c217cbbd901f218637b5c95d2c"} Feb 19 23:02:48 crc kubenswrapper[4795]: I0219 23:02:48.023836 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-75e1-account-create-update-mq672" event={"ID":"0ea8f33a-4a23-4058-a6f1-ccd27d64f1f2","Type":"ContainerStarted","Data":"b32dc138dfb0dab9a6c37caccc92e45f9ae9b912808aef3bb5c8f22c54256865"} Feb 19 23:02:48 crc kubenswrapper[4795]: I0219 23:02:48.386660 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-2vlg7" Feb 19 23:02:48 crc kubenswrapper[4795]: I0219 23:02:48.582412 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nsnmq\" (UniqueName: \"kubernetes.io/projected/2af5a019-2aa4-449d-a1a5-148cbf8a1ffa-kube-api-access-nsnmq\") pod \"2af5a019-2aa4-449d-a1a5-148cbf8a1ffa\" (UID: \"2af5a019-2aa4-449d-a1a5-148cbf8a1ffa\") " Feb 19 23:02:48 crc kubenswrapper[4795]: I0219 23:02:48.582632 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2af5a019-2aa4-449d-a1a5-148cbf8a1ffa-operator-scripts\") pod \"2af5a019-2aa4-449d-a1a5-148cbf8a1ffa\" (UID: \"2af5a019-2aa4-449d-a1a5-148cbf8a1ffa\") " Feb 19 23:02:48 crc kubenswrapper[4795]: I0219 23:02:48.583369 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2af5a019-2aa4-449d-a1a5-148cbf8a1ffa-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2af5a019-2aa4-449d-a1a5-148cbf8a1ffa" (UID: "2af5a019-2aa4-449d-a1a5-148cbf8a1ffa"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:02:48 crc kubenswrapper[4795]: I0219 23:02:48.588561 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2af5a019-2aa4-449d-a1a5-148cbf8a1ffa-kube-api-access-nsnmq" (OuterVolumeSpecName: "kube-api-access-nsnmq") pod "2af5a019-2aa4-449d-a1a5-148cbf8a1ffa" (UID: "2af5a019-2aa4-449d-a1a5-148cbf8a1ffa"). InnerVolumeSpecName "kube-api-access-nsnmq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:02:48 crc kubenswrapper[4795]: I0219 23:02:48.686712 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nsnmq\" (UniqueName: \"kubernetes.io/projected/2af5a019-2aa4-449d-a1a5-148cbf8a1ffa-kube-api-access-nsnmq\") on node \"crc\" DevicePath \"\"" Feb 19 23:02:48 crc kubenswrapper[4795]: I0219 23:02:48.686766 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2af5a019-2aa4-449d-a1a5-148cbf8a1ffa-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:02:49 crc kubenswrapper[4795]: I0219 23:02:49.037462 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-2vlg7" event={"ID":"2af5a019-2aa4-449d-a1a5-148cbf8a1ffa","Type":"ContainerDied","Data":"d39a053ef75b8e4887773a29b0af9d375a1f9f156f0f9ab43541043f16242fbc"} Feb 19 23:02:49 crc kubenswrapper[4795]: I0219 23:02:49.037532 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-2vlg7" Feb 19 23:02:49 crc kubenswrapper[4795]: I0219 23:02:49.037540 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d39a053ef75b8e4887773a29b0af9d375a1f9f156f0f9ab43541043f16242fbc" Feb 19 23:02:49 crc kubenswrapper[4795]: I0219 23:02:49.498002 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-75e1-account-create-update-mq672" Feb 19 23:02:49 crc kubenswrapper[4795]: I0219 23:02:49.504690 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xtbjb\" (UniqueName: \"kubernetes.io/projected/0ea8f33a-4a23-4058-a6f1-ccd27d64f1f2-kube-api-access-xtbjb\") pod \"0ea8f33a-4a23-4058-a6f1-ccd27d64f1f2\" (UID: \"0ea8f33a-4a23-4058-a6f1-ccd27d64f1f2\") " Feb 19 23:02:49 crc kubenswrapper[4795]: I0219 23:02:49.504829 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ea8f33a-4a23-4058-a6f1-ccd27d64f1f2-operator-scripts\") pod \"0ea8f33a-4a23-4058-a6f1-ccd27d64f1f2\" (UID: \"0ea8f33a-4a23-4058-a6f1-ccd27d64f1f2\") " Feb 19 23:02:49 crc kubenswrapper[4795]: I0219 23:02:49.505335 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ea8f33a-4a23-4058-a6f1-ccd27d64f1f2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0ea8f33a-4a23-4058-a6f1-ccd27d64f1f2" (UID: "0ea8f33a-4a23-4058-a6f1-ccd27d64f1f2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:02:49 crc kubenswrapper[4795]: I0219 23:02:49.506012 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ea8f33a-4a23-4058-a6f1-ccd27d64f1f2-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:02:49 crc kubenswrapper[4795]: I0219 23:02:49.510238 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ea8f33a-4a23-4058-a6f1-ccd27d64f1f2-kube-api-access-xtbjb" (OuterVolumeSpecName: "kube-api-access-xtbjb") pod "0ea8f33a-4a23-4058-a6f1-ccd27d64f1f2" (UID: "0ea8f33a-4a23-4058-a6f1-ccd27d64f1f2"). InnerVolumeSpecName "kube-api-access-xtbjb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:02:49 crc kubenswrapper[4795]: I0219 23:02:49.607391 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xtbjb\" (UniqueName: \"kubernetes.io/projected/0ea8f33a-4a23-4058-a6f1-ccd27d64f1f2-kube-api-access-xtbjb\") on node \"crc\" DevicePath \"\"" Feb 19 23:02:50 crc kubenswrapper[4795]: I0219 23:02:50.049671 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-75e1-account-create-update-mq672" event={"ID":"0ea8f33a-4a23-4058-a6f1-ccd27d64f1f2","Type":"ContainerDied","Data":"b32dc138dfb0dab9a6c37caccc92e45f9ae9b912808aef3bb5c8f22c54256865"} Feb 19 23:02:50 crc kubenswrapper[4795]: I0219 23:02:50.049715 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b32dc138dfb0dab9a6c37caccc92e45f9ae9b912808aef3bb5c8f22c54256865" Feb 19 23:02:50 crc kubenswrapper[4795]: I0219 23:02:50.049773 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-75e1-account-create-update-mq672" Feb 19 23:02:52 crc kubenswrapper[4795]: I0219 23:02:52.461954 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-api-556fc55b45-7gxcm"] Feb 19 23:02:52 crc kubenswrapper[4795]: E0219 23:02:52.463065 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2af5a019-2aa4-449d-a1a5-148cbf8a1ffa" containerName="mariadb-database-create" Feb 19 23:02:52 crc kubenswrapper[4795]: I0219 23:02:52.463091 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2af5a019-2aa4-449d-a1a5-148cbf8a1ffa" containerName="mariadb-database-create" Feb 19 23:02:52 crc kubenswrapper[4795]: E0219 23:02:52.463139 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ea8f33a-4a23-4058-a6f1-ccd27d64f1f2" containerName="mariadb-account-create-update" Feb 19 23:02:52 crc kubenswrapper[4795]: I0219 23:02:52.463152 4795 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0ea8f33a-4a23-4058-a6f1-ccd27d64f1f2" containerName="mariadb-account-create-update" Feb 19 23:02:52 crc kubenswrapper[4795]: I0219 23:02:52.463514 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ea8f33a-4a23-4058-a6f1-ccd27d64f1f2" containerName="mariadb-account-create-update" Feb 19 23:02:52 crc kubenswrapper[4795]: I0219 23:02:52.463560 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="2af5a019-2aa4-449d-a1a5-148cbf8a1ffa" containerName="mariadb-database-create" Feb 19 23:02:52 crc kubenswrapper[4795]: I0219 23:02:52.465572 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-api-556fc55b45-7gxcm" Feb 19 23:02:52 crc kubenswrapper[4795]: I0219 23:02:52.468082 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-api-config-data" Feb 19 23:02:52 crc kubenswrapper[4795]: I0219 23:02:52.468533 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-api-scripts" Feb 19 23:02:52 crc kubenswrapper[4795]: I0219 23:02:52.468654 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-octavia-dockercfg-pcc54" Feb 19 23:02:52 crc kubenswrapper[4795]: I0219 23:02:52.469901 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-556fc55b45-7gxcm"] Feb 19 23:02:52 crc kubenswrapper[4795]: I0219 23:02:52.581996 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/c95e6fdb-6007-4490-9572-a2709f8b7daf-config-data-merged\") pod \"octavia-api-556fc55b45-7gxcm\" (UID: \"c95e6fdb-6007-4490-9572-a2709f8b7daf\") " pod="openstack/octavia-api-556fc55b45-7gxcm" Feb 19 23:02:52 crc kubenswrapper[4795]: I0219 23:02:52.582091 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c95e6fdb-6007-4490-9572-a2709f8b7daf-combined-ca-bundle\") pod \"octavia-api-556fc55b45-7gxcm\" (UID: \"c95e6fdb-6007-4490-9572-a2709f8b7daf\") " pod="openstack/octavia-api-556fc55b45-7gxcm" Feb 19 23:02:52 crc kubenswrapper[4795]: I0219 23:02:52.582130 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c95e6fdb-6007-4490-9572-a2709f8b7daf-scripts\") pod \"octavia-api-556fc55b45-7gxcm\" (UID: \"c95e6fdb-6007-4490-9572-a2709f8b7daf\") " pod="openstack/octavia-api-556fc55b45-7gxcm" Feb 19 23:02:52 crc kubenswrapper[4795]: I0219 23:02:52.582328 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c95e6fdb-6007-4490-9572-a2709f8b7daf-config-data\") pod \"octavia-api-556fc55b45-7gxcm\" (UID: \"c95e6fdb-6007-4490-9572-a2709f8b7daf\") " pod="openstack/octavia-api-556fc55b45-7gxcm" Feb 19 23:02:52 crc kubenswrapper[4795]: I0219 23:02:52.582363 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/c95e6fdb-6007-4490-9572-a2709f8b7daf-octavia-run\") pod \"octavia-api-556fc55b45-7gxcm\" (UID: \"c95e6fdb-6007-4490-9572-a2709f8b7daf\") " pod="openstack/octavia-api-556fc55b45-7gxcm" Feb 19 23:02:52 crc kubenswrapper[4795]: I0219 23:02:52.684039 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c95e6fdb-6007-4490-9572-a2709f8b7daf-config-data\") pod \"octavia-api-556fc55b45-7gxcm\" (UID: \"c95e6fdb-6007-4490-9572-a2709f8b7daf\") " pod="openstack/octavia-api-556fc55b45-7gxcm" Feb 19 23:02:52 crc kubenswrapper[4795]: I0219 23:02:52.684088 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"octavia-run\" (UniqueName: 
\"kubernetes.io/empty-dir/c95e6fdb-6007-4490-9572-a2709f8b7daf-octavia-run\") pod \"octavia-api-556fc55b45-7gxcm\" (UID: \"c95e6fdb-6007-4490-9572-a2709f8b7daf\") " pod="openstack/octavia-api-556fc55b45-7gxcm" Feb 19 23:02:52 crc kubenswrapper[4795]: I0219 23:02:52.684132 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/c95e6fdb-6007-4490-9572-a2709f8b7daf-config-data-merged\") pod \"octavia-api-556fc55b45-7gxcm\" (UID: \"c95e6fdb-6007-4490-9572-a2709f8b7daf\") " pod="openstack/octavia-api-556fc55b45-7gxcm" Feb 19 23:02:52 crc kubenswrapper[4795]: I0219 23:02:52.684200 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c95e6fdb-6007-4490-9572-a2709f8b7daf-combined-ca-bundle\") pod \"octavia-api-556fc55b45-7gxcm\" (UID: \"c95e6fdb-6007-4490-9572-a2709f8b7daf\") " pod="openstack/octavia-api-556fc55b45-7gxcm" Feb 19 23:02:52 crc kubenswrapper[4795]: I0219 23:02:52.684226 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c95e6fdb-6007-4490-9572-a2709f8b7daf-scripts\") pod \"octavia-api-556fc55b45-7gxcm\" (UID: \"c95e6fdb-6007-4490-9572-a2709f8b7daf\") " pod="openstack/octavia-api-556fc55b45-7gxcm" Feb 19 23:02:52 crc kubenswrapper[4795]: I0219 23:02:52.685409 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/c95e6fdb-6007-4490-9572-a2709f8b7daf-config-data-merged\") pod \"octavia-api-556fc55b45-7gxcm\" (UID: \"c95e6fdb-6007-4490-9572-a2709f8b7daf\") " pod="openstack/octavia-api-556fc55b45-7gxcm" Feb 19 23:02:52 crc kubenswrapper[4795]: I0219 23:02:52.685572 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"octavia-run\" (UniqueName: 
\"kubernetes.io/empty-dir/c95e6fdb-6007-4490-9572-a2709f8b7daf-octavia-run\") pod \"octavia-api-556fc55b45-7gxcm\" (UID: \"c95e6fdb-6007-4490-9572-a2709f8b7daf\") " pod="openstack/octavia-api-556fc55b45-7gxcm" Feb 19 23:02:52 crc kubenswrapper[4795]: I0219 23:02:52.690235 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c95e6fdb-6007-4490-9572-a2709f8b7daf-combined-ca-bundle\") pod \"octavia-api-556fc55b45-7gxcm\" (UID: \"c95e6fdb-6007-4490-9572-a2709f8b7daf\") " pod="openstack/octavia-api-556fc55b45-7gxcm" Feb 19 23:02:52 crc kubenswrapper[4795]: I0219 23:02:52.691070 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c95e6fdb-6007-4490-9572-a2709f8b7daf-config-data\") pod \"octavia-api-556fc55b45-7gxcm\" (UID: \"c95e6fdb-6007-4490-9572-a2709f8b7daf\") " pod="openstack/octavia-api-556fc55b45-7gxcm" Feb 19 23:02:52 crc kubenswrapper[4795]: I0219 23:02:52.692916 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c95e6fdb-6007-4490-9572-a2709f8b7daf-scripts\") pod \"octavia-api-556fc55b45-7gxcm\" (UID: \"c95e6fdb-6007-4490-9572-a2709f8b7daf\") " pod="openstack/octavia-api-556fc55b45-7gxcm" Feb 19 23:02:52 crc kubenswrapper[4795]: I0219 23:02:52.785824 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-api-556fc55b45-7gxcm" Feb 19 23:02:53 crc kubenswrapper[4795]: I0219 23:02:53.349521 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-556fc55b45-7gxcm"] Feb 19 23:02:54 crc kubenswrapper[4795]: I0219 23:02:54.105022 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-556fc55b45-7gxcm" event={"ID":"c95e6fdb-6007-4490-9572-a2709f8b7daf","Type":"ContainerStarted","Data":"758dffc4eee1d177de011dc64f85b5ca149775230838f4cfd16f8f6799f68752"} Feb 19 23:02:54 crc kubenswrapper[4795]: I0219 23:02:54.511983 4795 scope.go:117] "RemoveContainer" containerID="18170a71e3fd2c593ee17e2961f48e8d3663518910bf7cf2c79258393902484c" Feb 19 23:02:54 crc kubenswrapper[4795]: E0219 23:02:54.512354 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:02:58 crc kubenswrapper[4795]: I0219 23:02:58.953688 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-knqfl" podUID="3bbc323f-3f18-42bc-b0d8-12f021d91d6b" containerName="ovn-controller" probeResult="failure" output=< Feb 19 23:02:58 crc kubenswrapper[4795]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Feb 19 23:02:58 crc kubenswrapper[4795]: > Feb 19 23:02:58 crc kubenswrapper[4795]: I0219 23:02:58.960848 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-lrv52" Feb 19 23:02:58 crc kubenswrapper[4795]: I0219 23:02:58.964244 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/ovn-controller-ovs-lrv52" Feb 19 23:02:59 crc kubenswrapper[4795]: I0219 23:02:59.094223 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-knqfl-config-qq79f"] Feb 19 23:02:59 crc kubenswrapper[4795]: I0219 23:02:59.097985 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-knqfl-config-qq79f" Feb 19 23:02:59 crc kubenswrapper[4795]: I0219 23:02:59.101529 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Feb 19 23:02:59 crc kubenswrapper[4795]: I0219 23:02:59.103091 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d710e8ab-01d7-4137-872b-71a05ac52188-scripts\") pod \"ovn-controller-knqfl-config-qq79f\" (UID: \"d710e8ab-01d7-4137-872b-71a05ac52188\") " pod="openstack/ovn-controller-knqfl-config-qq79f" Feb 19 23:02:59 crc kubenswrapper[4795]: I0219 23:02:59.103131 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-726pc\" (UniqueName: \"kubernetes.io/projected/d710e8ab-01d7-4137-872b-71a05ac52188-kube-api-access-726pc\") pod \"ovn-controller-knqfl-config-qq79f\" (UID: \"d710e8ab-01d7-4137-872b-71a05ac52188\") " pod="openstack/ovn-controller-knqfl-config-qq79f" Feb 19 23:02:59 crc kubenswrapper[4795]: I0219 23:02:59.103175 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d710e8ab-01d7-4137-872b-71a05ac52188-var-run\") pod \"ovn-controller-knqfl-config-qq79f\" (UID: \"d710e8ab-01d7-4137-872b-71a05ac52188\") " pod="openstack/ovn-controller-knqfl-config-qq79f" Feb 19 23:02:59 crc kubenswrapper[4795]: I0219 23:02:59.103192 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" 
(UniqueName: \"kubernetes.io/host-path/d710e8ab-01d7-4137-872b-71a05ac52188-var-run-ovn\") pod \"ovn-controller-knqfl-config-qq79f\" (UID: \"d710e8ab-01d7-4137-872b-71a05ac52188\") " pod="openstack/ovn-controller-knqfl-config-qq79f" Feb 19 23:02:59 crc kubenswrapper[4795]: I0219 23:02:59.103237 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d710e8ab-01d7-4137-872b-71a05ac52188-var-log-ovn\") pod \"ovn-controller-knqfl-config-qq79f\" (UID: \"d710e8ab-01d7-4137-872b-71a05ac52188\") " pod="openstack/ovn-controller-knqfl-config-qq79f" Feb 19 23:02:59 crc kubenswrapper[4795]: I0219 23:02:59.103291 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d710e8ab-01d7-4137-872b-71a05ac52188-additional-scripts\") pod \"ovn-controller-knqfl-config-qq79f\" (UID: \"d710e8ab-01d7-4137-872b-71a05ac52188\") " pod="openstack/ovn-controller-knqfl-config-qq79f" Feb 19 23:02:59 crc kubenswrapper[4795]: I0219 23:02:59.120869 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-knqfl-config-qq79f"] Feb 19 23:02:59 crc kubenswrapper[4795]: I0219 23:02:59.207137 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d710e8ab-01d7-4137-872b-71a05ac52188-var-log-ovn\") pod \"ovn-controller-knqfl-config-qq79f\" (UID: \"d710e8ab-01d7-4137-872b-71a05ac52188\") " pod="openstack/ovn-controller-knqfl-config-qq79f" Feb 19 23:02:59 crc kubenswrapper[4795]: I0219 23:02:59.207285 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d710e8ab-01d7-4137-872b-71a05ac52188-additional-scripts\") pod \"ovn-controller-knqfl-config-qq79f\" (UID: \"d710e8ab-01d7-4137-872b-71a05ac52188\") " 
pod="openstack/ovn-controller-knqfl-config-qq79f" Feb 19 23:02:59 crc kubenswrapper[4795]: I0219 23:02:59.207416 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d710e8ab-01d7-4137-872b-71a05ac52188-scripts\") pod \"ovn-controller-knqfl-config-qq79f\" (UID: \"d710e8ab-01d7-4137-872b-71a05ac52188\") " pod="openstack/ovn-controller-knqfl-config-qq79f" Feb 19 23:02:59 crc kubenswrapper[4795]: I0219 23:02:59.207443 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-726pc\" (UniqueName: \"kubernetes.io/projected/d710e8ab-01d7-4137-872b-71a05ac52188-kube-api-access-726pc\") pod \"ovn-controller-knqfl-config-qq79f\" (UID: \"d710e8ab-01d7-4137-872b-71a05ac52188\") " pod="openstack/ovn-controller-knqfl-config-qq79f" Feb 19 23:02:59 crc kubenswrapper[4795]: I0219 23:02:59.207484 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d710e8ab-01d7-4137-872b-71a05ac52188-var-run\") pod \"ovn-controller-knqfl-config-qq79f\" (UID: \"d710e8ab-01d7-4137-872b-71a05ac52188\") " pod="openstack/ovn-controller-knqfl-config-qq79f" Feb 19 23:02:59 crc kubenswrapper[4795]: I0219 23:02:59.207507 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d710e8ab-01d7-4137-872b-71a05ac52188-var-run-ovn\") pod \"ovn-controller-knqfl-config-qq79f\" (UID: \"d710e8ab-01d7-4137-872b-71a05ac52188\") " pod="openstack/ovn-controller-knqfl-config-qq79f" Feb 19 23:02:59 crc kubenswrapper[4795]: I0219 23:02:59.208662 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d710e8ab-01d7-4137-872b-71a05ac52188-var-log-ovn\") pod \"ovn-controller-knqfl-config-qq79f\" (UID: \"d710e8ab-01d7-4137-872b-71a05ac52188\") " 
pod="openstack/ovn-controller-knqfl-config-qq79f" Feb 19 23:02:59 crc kubenswrapper[4795]: I0219 23:02:59.209145 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d710e8ab-01d7-4137-872b-71a05ac52188-var-run\") pod \"ovn-controller-knqfl-config-qq79f\" (UID: \"d710e8ab-01d7-4137-872b-71a05ac52188\") " pod="openstack/ovn-controller-knqfl-config-qq79f" Feb 19 23:02:59 crc kubenswrapper[4795]: I0219 23:02:59.209213 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d710e8ab-01d7-4137-872b-71a05ac52188-var-run-ovn\") pod \"ovn-controller-knqfl-config-qq79f\" (UID: \"d710e8ab-01d7-4137-872b-71a05ac52188\") " pod="openstack/ovn-controller-knqfl-config-qq79f" Feb 19 23:02:59 crc kubenswrapper[4795]: I0219 23:02:59.211893 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d710e8ab-01d7-4137-872b-71a05ac52188-additional-scripts\") pod \"ovn-controller-knqfl-config-qq79f\" (UID: \"d710e8ab-01d7-4137-872b-71a05ac52188\") " pod="openstack/ovn-controller-knqfl-config-qq79f" Feb 19 23:02:59 crc kubenswrapper[4795]: I0219 23:02:59.213252 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d710e8ab-01d7-4137-872b-71a05ac52188-scripts\") pod \"ovn-controller-knqfl-config-qq79f\" (UID: \"d710e8ab-01d7-4137-872b-71a05ac52188\") " pod="openstack/ovn-controller-knqfl-config-qq79f" Feb 19 23:02:59 crc kubenswrapper[4795]: I0219 23:02:59.234687 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-726pc\" (UniqueName: \"kubernetes.io/projected/d710e8ab-01d7-4137-872b-71a05ac52188-kube-api-access-726pc\") pod \"ovn-controller-knqfl-config-qq79f\" (UID: \"d710e8ab-01d7-4137-872b-71a05ac52188\") " pod="openstack/ovn-controller-knqfl-config-qq79f" Feb 19 
23:02:59 crc kubenswrapper[4795]: I0219 23:02:59.423719 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-knqfl-config-qq79f" Feb 19 23:03:01 crc kubenswrapper[4795]: I0219 23:03:01.536138 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-knqfl-config-qq79f"] Feb 19 23:03:02 crc kubenswrapper[4795]: I0219 23:03:02.195456 4795 generic.go:334] "Generic (PLEG): container finished" podID="c95e6fdb-6007-4490-9572-a2709f8b7daf" containerID="69b94ec375777caa88f05328a06535b8d3ebce689a7af410b0f69a119ac02efd" exitCode=0 Feb 19 23:03:02 crc kubenswrapper[4795]: I0219 23:03:02.195521 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-556fc55b45-7gxcm" event={"ID":"c95e6fdb-6007-4490-9572-a2709f8b7daf","Type":"ContainerDied","Data":"69b94ec375777caa88f05328a06535b8d3ebce689a7af410b0f69a119ac02efd"} Feb 19 23:03:02 crc kubenswrapper[4795]: I0219 23:03:02.200233 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-knqfl-config-qq79f" event={"ID":"d710e8ab-01d7-4137-872b-71a05ac52188","Type":"ContainerStarted","Data":"2096e22df9aade6dc70570abe24d3dbe208045474ec5189f56a5d1beac2fa739"} Feb 19 23:03:02 crc kubenswrapper[4795]: I0219 23:03:02.200300 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-knqfl-config-qq79f" event={"ID":"d710e8ab-01d7-4137-872b-71a05ac52188","Type":"ContainerStarted","Data":"7a8fcb62e8aafde30d2ad214505dba921e7102800f504cadda4d53ca44430291"} Feb 19 23:03:02 crc kubenswrapper[4795]: I0219 23:03:02.260284 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-knqfl-config-qq79f" podStartSLOduration=3.260262863 podStartE2EDuration="3.260262863s" podCreationTimestamp="2026-02-19 23:02:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-02-19 23:03:02.258453631 +0000 UTC m=+5693.450971505" watchObservedRunningTime="2026-02-19 23:03:02.260262863 +0000 UTC m=+5693.452780737" Feb 19 23:03:03 crc kubenswrapper[4795]: I0219 23:03:03.210254 4795 generic.go:334] "Generic (PLEG): container finished" podID="d710e8ab-01d7-4137-872b-71a05ac52188" containerID="2096e22df9aade6dc70570abe24d3dbe208045474ec5189f56a5d1beac2fa739" exitCode=0 Feb 19 23:03:03 crc kubenswrapper[4795]: I0219 23:03:03.210445 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-knqfl-config-qq79f" event={"ID":"d710e8ab-01d7-4137-872b-71a05ac52188","Type":"ContainerDied","Data":"2096e22df9aade6dc70570abe24d3dbe208045474ec5189f56a5d1beac2fa739"} Feb 19 23:03:03 crc kubenswrapper[4795]: I0219 23:03:03.213126 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-556fc55b45-7gxcm" event={"ID":"c95e6fdb-6007-4490-9572-a2709f8b7daf","Type":"ContainerStarted","Data":"a54f0ee6b3807ca88881dac5abb58702b27c13ccaa0b8c0ff4269b867d42f5d8"} Feb 19 23:03:03 crc kubenswrapper[4795]: I0219 23:03:03.213176 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-556fc55b45-7gxcm" event={"ID":"c95e6fdb-6007-4490-9572-a2709f8b7daf","Type":"ContainerStarted","Data":"ac78542103643657605fe59b2edbcd40f711e2b924559ba9abeea6a866861ecc"} Feb 19 23:03:03 crc kubenswrapper[4795]: I0219 23:03:03.214060 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-556fc55b45-7gxcm" Feb 19 23:03:03 crc kubenswrapper[4795]: I0219 23:03:03.214083 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-556fc55b45-7gxcm" Feb 19 23:03:03 crc kubenswrapper[4795]: I0219 23:03:03.257928 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-api-556fc55b45-7gxcm" podStartSLOduration=3.523069343 podStartE2EDuration="11.257902747s" 
podCreationTimestamp="2026-02-19 23:02:52 +0000 UTC" firstStartedPulling="2026-02-19 23:02:53.356058605 +0000 UTC m=+5684.548576469" lastFinishedPulling="2026-02-19 23:03:01.090892009 +0000 UTC m=+5692.283409873" observedRunningTime="2026-02-19 23:03:03.256748484 +0000 UTC m=+5694.449266348" watchObservedRunningTime="2026-02-19 23:03:03.257902747 +0000 UTC m=+5694.450420631" Feb 19 23:03:03 crc kubenswrapper[4795]: I0219 23:03:03.941383 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-knqfl" Feb 19 23:03:04 crc kubenswrapper[4795]: I0219 23:03:04.597580 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-knqfl-config-qq79f" Feb 19 23:03:04 crc kubenswrapper[4795]: I0219 23:03:04.609224 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d710e8ab-01d7-4137-872b-71a05ac52188-var-run-ovn\") pod \"d710e8ab-01d7-4137-872b-71a05ac52188\" (UID: \"d710e8ab-01d7-4137-872b-71a05ac52188\") " Feb 19 23:03:04 crc kubenswrapper[4795]: I0219 23:03:04.609359 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d710e8ab-01d7-4137-872b-71a05ac52188-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "d710e8ab-01d7-4137-872b-71a05ac52188" (UID: "d710e8ab-01d7-4137-872b-71a05ac52188"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 23:03:04 crc kubenswrapper[4795]: I0219 23:03:04.609390 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d710e8ab-01d7-4137-872b-71a05ac52188-scripts\") pod \"d710e8ab-01d7-4137-872b-71a05ac52188\" (UID: \"d710e8ab-01d7-4137-872b-71a05ac52188\") " Feb 19 23:03:04 crc kubenswrapper[4795]: I0219 23:03:04.609539 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d710e8ab-01d7-4137-872b-71a05ac52188-var-run\") pod \"d710e8ab-01d7-4137-872b-71a05ac52188\" (UID: \"d710e8ab-01d7-4137-872b-71a05ac52188\") " Feb 19 23:03:04 crc kubenswrapper[4795]: I0219 23:03:04.609651 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d710e8ab-01d7-4137-872b-71a05ac52188-var-run" (OuterVolumeSpecName: "var-run") pod "d710e8ab-01d7-4137-872b-71a05ac52188" (UID: "d710e8ab-01d7-4137-872b-71a05ac52188"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 23:03:04 crc kubenswrapper[4795]: I0219 23:03:04.609671 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-726pc\" (UniqueName: \"kubernetes.io/projected/d710e8ab-01d7-4137-872b-71a05ac52188-kube-api-access-726pc\") pod \"d710e8ab-01d7-4137-872b-71a05ac52188\" (UID: \"d710e8ab-01d7-4137-872b-71a05ac52188\") " Feb 19 23:03:04 crc kubenswrapper[4795]: I0219 23:03:04.609713 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d710e8ab-01d7-4137-872b-71a05ac52188-additional-scripts\") pod \"d710e8ab-01d7-4137-872b-71a05ac52188\" (UID: \"d710e8ab-01d7-4137-872b-71a05ac52188\") " Feb 19 23:03:04 crc kubenswrapper[4795]: I0219 23:03:04.609745 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d710e8ab-01d7-4137-872b-71a05ac52188-var-log-ovn\") pod \"d710e8ab-01d7-4137-872b-71a05ac52188\" (UID: \"d710e8ab-01d7-4137-872b-71a05ac52188\") " Feb 19 23:03:04 crc kubenswrapper[4795]: I0219 23:03:04.609862 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d710e8ab-01d7-4137-872b-71a05ac52188-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "d710e8ab-01d7-4137-872b-71a05ac52188" (UID: "d710e8ab-01d7-4137-872b-71a05ac52188"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 23:03:04 crc kubenswrapper[4795]: I0219 23:03:04.610212 4795 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d710e8ab-01d7-4137-872b-71a05ac52188-var-log-ovn\") on node \"crc\" DevicePath \"\"" Feb 19 23:03:04 crc kubenswrapper[4795]: I0219 23:03:04.610229 4795 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d710e8ab-01d7-4137-872b-71a05ac52188-var-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 19 23:03:04 crc kubenswrapper[4795]: I0219 23:03:04.610240 4795 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d710e8ab-01d7-4137-872b-71a05ac52188-var-run\") on node \"crc\" DevicePath \"\"" Feb 19 23:03:04 crc kubenswrapper[4795]: I0219 23:03:04.610388 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d710e8ab-01d7-4137-872b-71a05ac52188-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "d710e8ab-01d7-4137-872b-71a05ac52188" (UID: "d710e8ab-01d7-4137-872b-71a05ac52188"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:03:04 crc kubenswrapper[4795]: I0219 23:03:04.610663 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d710e8ab-01d7-4137-872b-71a05ac52188-scripts" (OuterVolumeSpecName: "scripts") pod "d710e8ab-01d7-4137-872b-71a05ac52188" (UID: "d710e8ab-01d7-4137-872b-71a05ac52188"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:03:04 crc kubenswrapper[4795]: I0219 23:03:04.648126 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d710e8ab-01d7-4137-872b-71a05ac52188-kube-api-access-726pc" (OuterVolumeSpecName: "kube-api-access-726pc") pod "d710e8ab-01d7-4137-872b-71a05ac52188" (UID: "d710e8ab-01d7-4137-872b-71a05ac52188"). InnerVolumeSpecName "kube-api-access-726pc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:03:04 crc kubenswrapper[4795]: I0219 23:03:04.713257 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-726pc\" (UniqueName: \"kubernetes.io/projected/d710e8ab-01d7-4137-872b-71a05ac52188-kube-api-access-726pc\") on node \"crc\" DevicePath \"\"" Feb 19 23:03:04 crc kubenswrapper[4795]: I0219 23:03:04.713331 4795 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d710e8ab-01d7-4137-872b-71a05ac52188-additional-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:03:04 crc kubenswrapper[4795]: I0219 23:03:04.713344 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d710e8ab-01d7-4137-872b-71a05ac52188-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:03:05 crc kubenswrapper[4795]: I0219 23:03:05.230653 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-knqfl-config-qq79f" Feb 19 23:03:05 crc kubenswrapper[4795]: I0219 23:03:05.230846 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-knqfl-config-qq79f" event={"ID":"d710e8ab-01d7-4137-872b-71a05ac52188","Type":"ContainerDied","Data":"7a8fcb62e8aafde30d2ad214505dba921e7102800f504cadda4d53ca44430291"} Feb 19 23:03:05 crc kubenswrapper[4795]: I0219 23:03:05.231054 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a8fcb62e8aafde30d2ad214505dba921e7102800f504cadda4d53ca44430291" Feb 19 23:03:05 crc kubenswrapper[4795]: I0219 23:03:05.347045 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-rsyslog-gzbmf"] Feb 19 23:03:05 crc kubenswrapper[4795]: E0219 23:03:05.347524 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d710e8ab-01d7-4137-872b-71a05ac52188" containerName="ovn-config" Feb 19 23:03:05 crc kubenswrapper[4795]: I0219 23:03:05.347541 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d710e8ab-01d7-4137-872b-71a05ac52188" containerName="ovn-config" Feb 19 23:03:05 crc kubenswrapper[4795]: I0219 23:03:05.347714 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d710e8ab-01d7-4137-872b-71a05ac52188" containerName="ovn-config" Feb 19 23:03:05 crc kubenswrapper[4795]: I0219 23:03:05.348626 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-rsyslog-gzbmf" Feb 19 23:03:05 crc kubenswrapper[4795]: I0219 23:03:05.352195 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"octavia-hmport-map" Feb 19 23:03:05 crc kubenswrapper[4795]: I0219 23:03:05.352366 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-rsyslog-scripts" Feb 19 23:03:05 crc kubenswrapper[4795]: I0219 23:03:05.352462 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-rsyslog-config-data" Feb 19 23:03:05 crc kubenswrapper[4795]: I0219 23:03:05.356987 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-gzbmf"] Feb 19 23:03:05 crc kubenswrapper[4795]: I0219 23:03:05.430070 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/b3ffcac3-ee64-440c-983d-67404e5f47fd-config-data-merged\") pod \"octavia-rsyslog-gzbmf\" (UID: \"b3ffcac3-ee64-440c-983d-67404e5f47fd\") " pod="openstack/octavia-rsyslog-gzbmf" Feb 19 23:03:05 crc kubenswrapper[4795]: I0219 23:03:05.430175 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/b3ffcac3-ee64-440c-983d-67404e5f47fd-hm-ports\") pod \"octavia-rsyslog-gzbmf\" (UID: \"b3ffcac3-ee64-440c-983d-67404e5f47fd\") " pod="openstack/octavia-rsyslog-gzbmf" Feb 19 23:03:05 crc kubenswrapper[4795]: I0219 23:03:05.430228 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3ffcac3-ee64-440c-983d-67404e5f47fd-config-data\") pod \"octavia-rsyslog-gzbmf\" (UID: \"b3ffcac3-ee64-440c-983d-67404e5f47fd\") " pod="openstack/octavia-rsyslog-gzbmf" Feb 19 23:03:05 crc kubenswrapper[4795]: I0219 23:03:05.430251 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3ffcac3-ee64-440c-983d-67404e5f47fd-scripts\") pod \"octavia-rsyslog-gzbmf\" (UID: \"b3ffcac3-ee64-440c-983d-67404e5f47fd\") " pod="openstack/octavia-rsyslog-gzbmf" Feb 19 23:03:05 crc kubenswrapper[4795]: I0219 23:03:05.530830 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/b3ffcac3-ee64-440c-983d-67404e5f47fd-hm-ports\") pod \"octavia-rsyslog-gzbmf\" (UID: \"b3ffcac3-ee64-440c-983d-67404e5f47fd\") " pod="openstack/octavia-rsyslog-gzbmf" Feb 19 23:03:05 crc kubenswrapper[4795]: I0219 23:03:05.530901 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3ffcac3-ee64-440c-983d-67404e5f47fd-config-data\") pod \"octavia-rsyslog-gzbmf\" (UID: \"b3ffcac3-ee64-440c-983d-67404e5f47fd\") " pod="openstack/octavia-rsyslog-gzbmf" Feb 19 23:03:05 crc kubenswrapper[4795]: I0219 23:03:05.530930 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3ffcac3-ee64-440c-983d-67404e5f47fd-scripts\") pod \"octavia-rsyslog-gzbmf\" (UID: \"b3ffcac3-ee64-440c-983d-67404e5f47fd\") " pod="openstack/octavia-rsyslog-gzbmf" Feb 19 23:03:05 crc kubenswrapper[4795]: I0219 23:03:05.531032 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/b3ffcac3-ee64-440c-983d-67404e5f47fd-config-data-merged\") pod \"octavia-rsyslog-gzbmf\" (UID: \"b3ffcac3-ee64-440c-983d-67404e5f47fd\") " pod="openstack/octavia-rsyslog-gzbmf" Feb 19 23:03:05 crc kubenswrapper[4795]: I0219 23:03:05.531494 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: 
\"kubernetes.io/empty-dir/b3ffcac3-ee64-440c-983d-67404e5f47fd-config-data-merged\") pod \"octavia-rsyslog-gzbmf\" (UID: \"b3ffcac3-ee64-440c-983d-67404e5f47fd\") " pod="openstack/octavia-rsyslog-gzbmf" Feb 19 23:03:05 crc kubenswrapper[4795]: I0219 23:03:05.532013 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/b3ffcac3-ee64-440c-983d-67404e5f47fd-hm-ports\") pod \"octavia-rsyslog-gzbmf\" (UID: \"b3ffcac3-ee64-440c-983d-67404e5f47fd\") " pod="openstack/octavia-rsyslog-gzbmf" Feb 19 23:03:05 crc kubenswrapper[4795]: I0219 23:03:05.536020 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3ffcac3-ee64-440c-983d-67404e5f47fd-config-data\") pod \"octavia-rsyslog-gzbmf\" (UID: \"b3ffcac3-ee64-440c-983d-67404e5f47fd\") " pod="openstack/octavia-rsyslog-gzbmf" Feb 19 23:03:05 crc kubenswrapper[4795]: I0219 23:03:05.540306 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3ffcac3-ee64-440c-983d-67404e5f47fd-scripts\") pod \"octavia-rsyslog-gzbmf\" (UID: \"b3ffcac3-ee64-440c-983d-67404e5f47fd\") " pod="openstack/octavia-rsyslog-gzbmf" Feb 19 23:03:05 crc kubenswrapper[4795]: I0219 23:03:05.665506 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-rsyslog-gzbmf" Feb 19 23:03:05 crc kubenswrapper[4795]: I0219 23:03:05.697437 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-knqfl-config-qq79f"] Feb 19 23:03:05 crc kubenswrapper[4795]: I0219 23:03:05.706684 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-knqfl-config-qq79f"] Feb 19 23:03:06 crc kubenswrapper[4795]: I0219 23:03:06.374934 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-image-upload-8d4564f8f-pj6cx"] Feb 19 23:03:06 crc kubenswrapper[4795]: I0219 23:03:06.380903 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-8d4564f8f-pj6cx" Feb 19 23:03:06 crc kubenswrapper[4795]: I0219 23:03:06.384324 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-config-data" Feb 19 23:03:06 crc kubenswrapper[4795]: I0219 23:03:06.386448 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-8d4564f8f-pj6cx"] Feb 19 23:03:06 crc kubenswrapper[4795]: I0219 23:03:06.452267 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/04efa30d-5580-4301-8a36-b452e949fcd3-httpd-config\") pod \"octavia-image-upload-8d4564f8f-pj6cx\" (UID: \"04efa30d-5580-4301-8a36-b452e949fcd3\") " pod="openstack/octavia-image-upload-8d4564f8f-pj6cx" Feb 19 23:03:06 crc kubenswrapper[4795]: I0219 23:03:06.452318 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/04efa30d-5580-4301-8a36-b452e949fcd3-amphora-image\") pod \"octavia-image-upload-8d4564f8f-pj6cx\" (UID: \"04efa30d-5580-4301-8a36-b452e949fcd3\") " pod="openstack/octavia-image-upload-8d4564f8f-pj6cx" Feb 19 23:03:06 crc kubenswrapper[4795]: I0219 23:03:06.483391 
4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-gzbmf"] Feb 19 23:03:06 crc kubenswrapper[4795]: I0219 23:03:06.512015 4795 scope.go:117] "RemoveContainer" containerID="18170a71e3fd2c593ee17e2961f48e8d3663518910bf7cf2c79258393902484c" Feb 19 23:03:06 crc kubenswrapper[4795]: E0219 23:03:06.512238 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:03:06 crc kubenswrapper[4795]: I0219 23:03:06.553589 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/04efa30d-5580-4301-8a36-b452e949fcd3-httpd-config\") pod \"octavia-image-upload-8d4564f8f-pj6cx\" (UID: \"04efa30d-5580-4301-8a36-b452e949fcd3\") " pod="openstack/octavia-image-upload-8d4564f8f-pj6cx" Feb 19 23:03:06 crc kubenswrapper[4795]: I0219 23:03:06.553649 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/04efa30d-5580-4301-8a36-b452e949fcd3-amphora-image\") pod \"octavia-image-upload-8d4564f8f-pj6cx\" (UID: \"04efa30d-5580-4301-8a36-b452e949fcd3\") " pod="openstack/octavia-image-upload-8d4564f8f-pj6cx" Feb 19 23:03:06 crc kubenswrapper[4795]: I0219 23:03:06.555424 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/04efa30d-5580-4301-8a36-b452e949fcd3-amphora-image\") pod \"octavia-image-upload-8d4564f8f-pj6cx\" (UID: \"04efa30d-5580-4301-8a36-b452e949fcd3\") " pod="openstack/octavia-image-upload-8d4564f8f-pj6cx" Feb 19 23:03:06 crc 
kubenswrapper[4795]: I0219 23:03:06.567456 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/04efa30d-5580-4301-8a36-b452e949fcd3-httpd-config\") pod \"octavia-image-upload-8d4564f8f-pj6cx\" (UID: \"04efa30d-5580-4301-8a36-b452e949fcd3\") " pod="openstack/octavia-image-upload-8d4564f8f-pj6cx" Feb 19 23:03:06 crc kubenswrapper[4795]: I0219 23:03:06.721015 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-8d4564f8f-pj6cx" Feb 19 23:03:07 crc kubenswrapper[4795]: I0219 23:03:07.221268 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-8d4564f8f-pj6cx"] Feb 19 23:03:07 crc kubenswrapper[4795]: I0219 23:03:07.262337 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-gzbmf" event={"ID":"b3ffcac3-ee64-440c-983d-67404e5f47fd","Type":"ContainerStarted","Data":"12cac13ec5fb25a6171cc270755d648f8163cc8154f582f842720ae0015fbed7"} Feb 19 23:03:07 crc kubenswrapper[4795]: I0219 23:03:07.274031 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-8d4564f8f-pj6cx" event={"ID":"04efa30d-5580-4301-8a36-b452e949fcd3","Type":"ContainerStarted","Data":"9d50e6342b96a4c35e1fe26f38816a55271a38a4cb1281be352ab5ae7f179533"} Feb 19 23:03:07 crc kubenswrapper[4795]: I0219 23:03:07.288523 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-db-sync-z8cdz"] Feb 19 23:03:07 crc kubenswrapper[4795]: I0219 23:03:07.290371 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-sync-z8cdz" Feb 19 23:03:07 crc kubenswrapper[4795]: I0219 23:03:07.301027 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-scripts" Feb 19 23:03:07 crc kubenswrapper[4795]: I0219 23:03:07.317780 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-sync-z8cdz"] Feb 19 23:03:07 crc kubenswrapper[4795]: I0219 23:03:07.473624 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7895d70-3c78-4913-9028-75797e6e1dbd-combined-ca-bundle\") pod \"octavia-db-sync-z8cdz\" (UID: \"c7895d70-3c78-4913-9028-75797e6e1dbd\") " pod="openstack/octavia-db-sync-z8cdz" Feb 19 23:03:07 crc kubenswrapper[4795]: I0219 23:03:07.473696 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7895d70-3c78-4913-9028-75797e6e1dbd-scripts\") pod \"octavia-db-sync-z8cdz\" (UID: \"c7895d70-3c78-4913-9028-75797e6e1dbd\") " pod="openstack/octavia-db-sync-z8cdz" Feb 19 23:03:07 crc kubenswrapper[4795]: I0219 23:03:07.473816 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7895d70-3c78-4913-9028-75797e6e1dbd-config-data\") pod \"octavia-db-sync-z8cdz\" (UID: \"c7895d70-3c78-4913-9028-75797e6e1dbd\") " pod="openstack/octavia-db-sync-z8cdz" Feb 19 23:03:07 crc kubenswrapper[4795]: I0219 23:03:07.473849 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/c7895d70-3c78-4913-9028-75797e6e1dbd-config-data-merged\") pod \"octavia-db-sync-z8cdz\" (UID: \"c7895d70-3c78-4913-9028-75797e6e1dbd\") " pod="openstack/octavia-db-sync-z8cdz" Feb 19 23:03:07 crc kubenswrapper[4795]: I0219 
23:03:07.526324 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d710e8ab-01d7-4137-872b-71a05ac52188" path="/var/lib/kubelet/pods/d710e8ab-01d7-4137-872b-71a05ac52188/volumes" Feb 19 23:03:07 crc kubenswrapper[4795]: I0219 23:03:07.527145 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-gzbmf"] Feb 19 23:03:07 crc kubenswrapper[4795]: I0219 23:03:07.575945 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7895d70-3c78-4913-9028-75797e6e1dbd-config-data\") pod \"octavia-db-sync-z8cdz\" (UID: \"c7895d70-3c78-4913-9028-75797e6e1dbd\") " pod="openstack/octavia-db-sync-z8cdz" Feb 19 23:03:07 crc kubenswrapper[4795]: I0219 23:03:07.576009 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/c7895d70-3c78-4913-9028-75797e6e1dbd-config-data-merged\") pod \"octavia-db-sync-z8cdz\" (UID: \"c7895d70-3c78-4913-9028-75797e6e1dbd\") " pod="openstack/octavia-db-sync-z8cdz" Feb 19 23:03:07 crc kubenswrapper[4795]: I0219 23:03:07.576112 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7895d70-3c78-4913-9028-75797e6e1dbd-combined-ca-bundle\") pod \"octavia-db-sync-z8cdz\" (UID: \"c7895d70-3c78-4913-9028-75797e6e1dbd\") " pod="openstack/octavia-db-sync-z8cdz" Feb 19 23:03:07 crc kubenswrapper[4795]: I0219 23:03:07.576156 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7895d70-3c78-4913-9028-75797e6e1dbd-scripts\") pod \"octavia-db-sync-z8cdz\" (UID: \"c7895d70-3c78-4913-9028-75797e6e1dbd\") " pod="openstack/octavia-db-sync-z8cdz" Feb 19 23:03:07 crc kubenswrapper[4795]: I0219 23:03:07.578426 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/c7895d70-3c78-4913-9028-75797e6e1dbd-config-data-merged\") pod \"octavia-db-sync-z8cdz\" (UID: \"c7895d70-3c78-4913-9028-75797e6e1dbd\") " pod="openstack/octavia-db-sync-z8cdz" Feb 19 23:03:07 crc kubenswrapper[4795]: I0219 23:03:07.583213 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7895d70-3c78-4913-9028-75797e6e1dbd-scripts\") pod \"octavia-db-sync-z8cdz\" (UID: \"c7895d70-3c78-4913-9028-75797e6e1dbd\") " pod="openstack/octavia-db-sync-z8cdz" Feb 19 23:03:07 crc kubenswrapper[4795]: I0219 23:03:07.584110 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7895d70-3c78-4913-9028-75797e6e1dbd-config-data\") pod \"octavia-db-sync-z8cdz\" (UID: \"c7895d70-3c78-4913-9028-75797e6e1dbd\") " pod="openstack/octavia-db-sync-z8cdz" Feb 19 23:03:07 crc kubenswrapper[4795]: I0219 23:03:07.586289 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7895d70-3c78-4913-9028-75797e6e1dbd-combined-ca-bundle\") pod \"octavia-db-sync-z8cdz\" (UID: \"c7895d70-3c78-4913-9028-75797e6e1dbd\") " pod="openstack/octavia-db-sync-z8cdz" Feb 19 23:03:07 crc kubenswrapper[4795]: I0219 23:03:07.619046 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-sync-z8cdz" Feb 19 23:03:08 crc kubenswrapper[4795]: I0219 23:03:08.115525 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-sync-z8cdz"] Feb 19 23:03:08 crc kubenswrapper[4795]: I0219 23:03:08.282720 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-z8cdz" event={"ID":"c7895d70-3c78-4913-9028-75797e6e1dbd","Type":"ContainerStarted","Data":"14cb79bdd1caf585f83eec9d0981b74eaae9b5dab8922c52ff8baeedb37030f3"} Feb 19 23:03:09 crc kubenswrapper[4795]: I0219 23:03:09.321014 4795 generic.go:334] "Generic (PLEG): container finished" podID="c7895d70-3c78-4913-9028-75797e6e1dbd" containerID="700b20fbdb16cdb721d65035d59d19827c546e867ba32ed9f351235ca4bc0246" exitCode=0 Feb 19 23:03:09 crc kubenswrapper[4795]: I0219 23:03:09.321555 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-z8cdz" event={"ID":"c7895d70-3c78-4913-9028-75797e6e1dbd","Type":"ContainerDied","Data":"700b20fbdb16cdb721d65035d59d19827c546e867ba32ed9f351235ca4bc0246"} Feb 19 23:03:10 crc kubenswrapper[4795]: I0219 23:03:10.332526 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-gzbmf" event={"ID":"b3ffcac3-ee64-440c-983d-67404e5f47fd","Type":"ContainerStarted","Data":"4b70afb267e5c9a99026ae17ebb1f8309c23e1b1e7119d4584335d427e2d81e0"} Feb 19 23:03:10 crc kubenswrapper[4795]: I0219 23:03:10.335494 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-z8cdz" event={"ID":"c7895d70-3c78-4913-9028-75797e6e1dbd","Type":"ContainerStarted","Data":"d4c8e37b4453cd9ee00d52e7178d48381bffc05905e787ed7678728d6f9cc0ef"} Feb 19 23:03:10 crc kubenswrapper[4795]: I0219 23:03:10.371746 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-db-sync-z8cdz" podStartSLOduration=3.37172876 podStartE2EDuration="3.37172876s" podCreationTimestamp="2026-02-19 23:03:07 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:03:10.368096687 +0000 UTC m=+5701.560614571" watchObservedRunningTime="2026-02-19 23:03:10.37172876 +0000 UTC m=+5701.564246624" Feb 19 23:03:12 crc kubenswrapper[4795]: I0219 23:03:12.167622 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-556fc55b45-7gxcm" Feb 19 23:03:12 crc kubenswrapper[4795]: I0219 23:03:12.308180 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-556fc55b45-7gxcm" Feb 19 23:03:12 crc kubenswrapper[4795]: I0219 23:03:12.366464 4795 generic.go:334] "Generic (PLEG): container finished" podID="b3ffcac3-ee64-440c-983d-67404e5f47fd" containerID="4b70afb267e5c9a99026ae17ebb1f8309c23e1b1e7119d4584335d427e2d81e0" exitCode=0 Feb 19 23:03:12 crc kubenswrapper[4795]: I0219 23:03:12.368047 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-gzbmf" event={"ID":"b3ffcac3-ee64-440c-983d-67404e5f47fd","Type":"ContainerDied","Data":"4b70afb267e5c9a99026ae17ebb1f8309c23e1b1e7119d4584335d427e2d81e0"} Feb 19 23:03:13 crc kubenswrapper[4795]: I0219 23:03:13.378291 4795 generic.go:334] "Generic (PLEG): container finished" podID="c7895d70-3c78-4913-9028-75797e6e1dbd" containerID="d4c8e37b4453cd9ee00d52e7178d48381bffc05905e787ed7678728d6f9cc0ef" exitCode=0 Feb 19 23:03:13 crc kubenswrapper[4795]: I0219 23:03:13.378360 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-z8cdz" event={"ID":"c7895d70-3c78-4913-9028-75797e6e1dbd","Type":"ContainerDied","Data":"d4c8e37b4453cd9ee00d52e7178d48381bffc05905e787ed7678728d6f9cc0ef"} Feb 19 23:03:15 crc kubenswrapper[4795]: I0219 23:03:15.677484 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-sync-z8cdz" Feb 19 23:03:15 crc kubenswrapper[4795]: I0219 23:03:15.864358 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7895d70-3c78-4913-9028-75797e6e1dbd-config-data\") pod \"c7895d70-3c78-4913-9028-75797e6e1dbd\" (UID: \"c7895d70-3c78-4913-9028-75797e6e1dbd\") " Feb 19 23:03:15 crc kubenswrapper[4795]: I0219 23:03:15.864668 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7895d70-3c78-4913-9028-75797e6e1dbd-combined-ca-bundle\") pod \"c7895d70-3c78-4913-9028-75797e6e1dbd\" (UID: \"c7895d70-3c78-4913-9028-75797e6e1dbd\") " Feb 19 23:03:15 crc kubenswrapper[4795]: I0219 23:03:15.864772 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/c7895d70-3c78-4913-9028-75797e6e1dbd-config-data-merged\") pod \"c7895d70-3c78-4913-9028-75797e6e1dbd\" (UID: \"c7895d70-3c78-4913-9028-75797e6e1dbd\") " Feb 19 23:03:15 crc kubenswrapper[4795]: I0219 23:03:15.864843 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7895d70-3c78-4913-9028-75797e6e1dbd-scripts\") pod \"c7895d70-3c78-4913-9028-75797e6e1dbd\" (UID: \"c7895d70-3c78-4913-9028-75797e6e1dbd\") " Feb 19 23:03:15 crc kubenswrapper[4795]: I0219 23:03:15.872931 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7895d70-3c78-4913-9028-75797e6e1dbd-scripts" (OuterVolumeSpecName: "scripts") pod "c7895d70-3c78-4913-9028-75797e6e1dbd" (UID: "c7895d70-3c78-4913-9028-75797e6e1dbd"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:03:15 crc kubenswrapper[4795]: I0219 23:03:15.925352 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7895d70-3c78-4913-9028-75797e6e1dbd-config-data" (OuterVolumeSpecName: "config-data") pod "c7895d70-3c78-4913-9028-75797e6e1dbd" (UID: "c7895d70-3c78-4913-9028-75797e6e1dbd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:03:15 crc kubenswrapper[4795]: E0219 23:03:15.926096 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/empty-dir/c7895d70-3c78-4913-9028-75797e6e1dbd-config-data-merged podName:c7895d70-3c78-4913-9028-75797e6e1dbd nodeName:}" failed. No retries permitted until 2026-02-19 23:03:16.426063351 +0000 UTC m=+5707.618581225 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "config-data-merged" (UniqueName: "kubernetes.io/empty-dir/c7895d70-3c78-4913-9028-75797e6e1dbd-config-data-merged") pod "c7895d70-3c78-4913-9028-75797e6e1dbd" (UID: "c7895d70-3c78-4913-9028-75797e6e1dbd") : error deleting /var/lib/kubelet/pods/c7895d70-3c78-4913-9028-75797e6e1dbd/volume-subpaths: remove /var/lib/kubelet/pods/c7895d70-3c78-4913-9028-75797e6e1dbd/volume-subpaths: no such file or directory Feb 19 23:03:15 crc kubenswrapper[4795]: I0219 23:03:15.929974 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7895d70-3c78-4913-9028-75797e6e1dbd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c7895d70-3c78-4913-9028-75797e6e1dbd" (UID: "c7895d70-3c78-4913-9028-75797e6e1dbd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:03:15 crc kubenswrapper[4795]: I0219 23:03:15.968653 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7895d70-3c78-4913-9028-75797e6e1dbd-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:03:15 crc kubenswrapper[4795]: I0219 23:03:15.968696 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7895d70-3c78-4913-9028-75797e6e1dbd-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 23:03:15 crc kubenswrapper[4795]: I0219 23:03:15.968705 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7895d70-3c78-4913-9028-75797e6e1dbd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:03:16 crc kubenswrapper[4795]: I0219 23:03:16.262925 4795 scope.go:117] "RemoveContainer" containerID="2119979e9a0784ca3835a55937d4bf0a118f5016d04e68e3c74e02efc1b7df06" Feb 19 23:03:16 crc kubenswrapper[4795]: I0219 23:03:16.344917 4795 scope.go:117] "RemoveContainer" containerID="000fa8f747858d495d2c6d2c850ff5f2717d6c22150f68c950eeae0bca0bd7f7" Feb 19 23:03:16 crc kubenswrapper[4795]: I0219 23:03:16.388541 4795 scope.go:117] "RemoveContainer" containerID="f711a9de7781799454b8f0272a2636f2533360c8ba6ce3a134936f4cf9908d61" Feb 19 23:03:16 crc kubenswrapper[4795]: I0219 23:03:16.411544 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-z8cdz" event={"ID":"c7895d70-3c78-4913-9028-75797e6e1dbd","Type":"ContainerDied","Data":"14cb79bdd1caf585f83eec9d0981b74eaae9b5dab8922c52ff8baeedb37030f3"} Feb 19 23:03:16 crc kubenswrapper[4795]: I0219 23:03:16.411580 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14cb79bdd1caf585f83eec9d0981b74eaae9b5dab8922c52ff8baeedb37030f3" Feb 19 23:03:16 crc kubenswrapper[4795]: I0219 23:03:16.411592 4795 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/octavia-db-sync-z8cdz" Feb 19 23:03:16 crc kubenswrapper[4795]: I0219 23:03:16.420583 4795 scope.go:117] "RemoveContainer" containerID="e2855b43837d8ca73ce1e28756facf9b648ea7b22ec3a7d6bc132133d332b880" Feb 19 23:03:16 crc kubenswrapper[4795]: I0219 23:03:16.477250 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/c7895d70-3c78-4913-9028-75797e6e1dbd-config-data-merged\") pod \"c7895d70-3c78-4913-9028-75797e6e1dbd\" (UID: \"c7895d70-3c78-4913-9028-75797e6e1dbd\") " Feb 19 23:03:16 crc kubenswrapper[4795]: I0219 23:03:16.478201 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7895d70-3c78-4913-9028-75797e6e1dbd-config-data-merged" (OuterVolumeSpecName: "config-data-merged") pod "c7895d70-3c78-4913-9028-75797e6e1dbd" (UID: "c7895d70-3c78-4913-9028-75797e6e1dbd"). InnerVolumeSpecName "config-data-merged". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:03:16 crc kubenswrapper[4795]: I0219 23:03:16.582604 4795 reconciler_common.go:293] "Volume detached for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/c7895d70-3c78-4913-9028-75797e6e1dbd-config-data-merged\") on node \"crc\" DevicePath \"\"" Feb 19 23:03:17 crc kubenswrapper[4795]: I0219 23:03:17.425034 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-gzbmf" event={"ID":"b3ffcac3-ee64-440c-983d-67404e5f47fd","Type":"ContainerStarted","Data":"1bfff6d4fd95fd8d7fa708823512b19ab2b9d4cb0dd8429c0f1342f6642f76b3"} Feb 19 23:03:17 crc kubenswrapper[4795]: I0219 23:03:17.425525 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-rsyslog-gzbmf" Feb 19 23:03:17 crc kubenswrapper[4795]: I0219 23:03:17.427981 4795 generic.go:334] "Generic (PLEG): container finished" podID="04efa30d-5580-4301-8a36-b452e949fcd3" containerID="17e566856695530e17e7832d511ae36f5a1fc5d25825457c5e350446cc676cef" exitCode=0 Feb 19 23:03:17 crc kubenswrapper[4795]: I0219 23:03:17.428034 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-8d4564f8f-pj6cx" event={"ID":"04efa30d-5580-4301-8a36-b452e949fcd3","Type":"ContainerDied","Data":"17e566856695530e17e7832d511ae36f5a1fc5d25825457c5e350446cc676cef"} Feb 19 23:03:17 crc kubenswrapper[4795]: I0219 23:03:17.447365 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-rsyslog-gzbmf" podStartSLOduration=2.6410139409999998 podStartE2EDuration="12.447336978s" podCreationTimestamp="2026-02-19 23:03:05 +0000 UTC" firstStartedPulling="2026-02-19 23:03:06.494408442 +0000 UTC m=+5697.686926306" lastFinishedPulling="2026-02-19 23:03:16.300731479 +0000 UTC m=+5707.493249343" observedRunningTime="2026-02-19 23:03:17.441771639 +0000 UTC m=+5708.634289503" watchObservedRunningTime="2026-02-19 23:03:17.447336978 +0000 UTC 
m=+5708.639854862" Feb 19 23:03:17 crc kubenswrapper[4795]: I0219 23:03:17.512666 4795 scope.go:117] "RemoveContainer" containerID="18170a71e3fd2c593ee17e2961f48e8d3663518910bf7cf2c79258393902484c" Feb 19 23:03:17 crc kubenswrapper[4795]: E0219 23:03:17.512943 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:03:18 crc kubenswrapper[4795]: I0219 23:03:18.440873 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-8d4564f8f-pj6cx" event={"ID":"04efa30d-5580-4301-8a36-b452e949fcd3","Type":"ContainerStarted","Data":"a3c2b90372f120a7eb061548f48c5800cdadbafec3663c23b6dd97d05212f21d"} Feb 19 23:03:18 crc kubenswrapper[4795]: I0219 23:03:18.458507 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-image-upload-8d4564f8f-pj6cx" podStartSLOduration=3.240983285 podStartE2EDuration="12.457989822s" podCreationTimestamp="2026-02-19 23:03:06 +0000 UTC" firstStartedPulling="2026-02-19 23:03:07.244651616 +0000 UTC m=+5698.437169480" lastFinishedPulling="2026-02-19 23:03:16.461658153 +0000 UTC m=+5707.654176017" observedRunningTime="2026-02-19 23:03:18.453906386 +0000 UTC m=+5709.646424250" watchObservedRunningTime="2026-02-19 23:03:18.457989822 +0000 UTC m=+5709.650507706" Feb 19 23:03:31 crc kubenswrapper[4795]: I0219 23:03:31.514669 4795 scope.go:117] "RemoveContainer" containerID="18170a71e3fd2c593ee17e2961f48e8d3663518910bf7cf2c79258393902484c" Feb 19 23:03:31 crc kubenswrapper[4795]: E0219 23:03:31.517581 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:03:35 crc kubenswrapper[4795]: I0219 23:03:35.702400 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-rsyslog-gzbmf" Feb 19 23:03:39 crc kubenswrapper[4795]: I0219 23:03:39.565936 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-image-upload-8d4564f8f-pj6cx"] Feb 19 23:03:39 crc kubenswrapper[4795]: I0219 23:03:39.566726 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/octavia-image-upload-8d4564f8f-pj6cx" podUID="04efa30d-5580-4301-8a36-b452e949fcd3" containerName="octavia-amphora-httpd" containerID="cri-o://a3c2b90372f120a7eb061548f48c5800cdadbafec3663c23b6dd97d05212f21d" gracePeriod=30 Feb 19 23:03:40 crc kubenswrapper[4795]: I0219 23:03:40.165790 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-8d4564f8f-pj6cx" Feb 19 23:03:40 crc kubenswrapper[4795]: I0219 23:03:40.169320 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/04efa30d-5580-4301-8a36-b452e949fcd3-amphora-image\") pod \"04efa30d-5580-4301-8a36-b452e949fcd3\" (UID: \"04efa30d-5580-4301-8a36-b452e949fcd3\") " Feb 19 23:03:40 crc kubenswrapper[4795]: I0219 23:03:40.169477 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/04efa30d-5580-4301-8a36-b452e949fcd3-httpd-config\") pod \"04efa30d-5580-4301-8a36-b452e949fcd3\" (UID: \"04efa30d-5580-4301-8a36-b452e949fcd3\") " Feb 19 23:03:40 crc kubenswrapper[4795]: I0219 23:03:40.218527 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04efa30d-5580-4301-8a36-b452e949fcd3-amphora-image" (OuterVolumeSpecName: "amphora-image") pod "04efa30d-5580-4301-8a36-b452e949fcd3" (UID: "04efa30d-5580-4301-8a36-b452e949fcd3"). InnerVolumeSpecName "amphora-image". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:03:40 crc kubenswrapper[4795]: I0219 23:03:40.222575 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04efa30d-5580-4301-8a36-b452e949fcd3-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "04efa30d-5580-4301-8a36-b452e949fcd3" (UID: "04efa30d-5580-4301-8a36-b452e949fcd3"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:03:40 crc kubenswrapper[4795]: I0219 23:03:40.272095 4795 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/04efa30d-5580-4301-8a36-b452e949fcd3-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 19 23:03:40 crc kubenswrapper[4795]: I0219 23:03:40.272130 4795 reconciler_common.go:293] "Volume detached for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/04efa30d-5580-4301-8a36-b452e949fcd3-amphora-image\") on node \"crc\" DevicePath \"\"" Feb 19 23:03:40 crc kubenswrapper[4795]: I0219 23:03:40.643954 4795 generic.go:334] "Generic (PLEG): container finished" podID="04efa30d-5580-4301-8a36-b452e949fcd3" containerID="a3c2b90372f120a7eb061548f48c5800cdadbafec3663c23b6dd97d05212f21d" exitCode=0 Feb 19 23:03:40 crc kubenswrapper[4795]: I0219 23:03:40.644019 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-8d4564f8f-pj6cx" Feb 19 23:03:40 crc kubenswrapper[4795]: I0219 23:03:40.644039 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-8d4564f8f-pj6cx" event={"ID":"04efa30d-5580-4301-8a36-b452e949fcd3","Type":"ContainerDied","Data":"a3c2b90372f120a7eb061548f48c5800cdadbafec3663c23b6dd97d05212f21d"} Feb 19 23:03:40 crc kubenswrapper[4795]: I0219 23:03:40.646428 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-8d4564f8f-pj6cx" event={"ID":"04efa30d-5580-4301-8a36-b452e949fcd3","Type":"ContainerDied","Data":"9d50e6342b96a4c35e1fe26f38816a55271a38a4cb1281be352ab5ae7f179533"} Feb 19 23:03:40 crc kubenswrapper[4795]: I0219 23:03:40.646450 4795 scope.go:117] "RemoveContainer" containerID="a3c2b90372f120a7eb061548f48c5800cdadbafec3663c23b6dd97d05212f21d" Feb 19 23:03:40 crc kubenswrapper[4795]: I0219 23:03:40.672922 4795 scope.go:117] "RemoveContainer" 
containerID="17e566856695530e17e7832d511ae36f5a1fc5d25825457c5e350446cc676cef" Feb 19 23:03:40 crc kubenswrapper[4795]: I0219 23:03:40.680056 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-image-upload-8d4564f8f-pj6cx"] Feb 19 23:03:40 crc kubenswrapper[4795]: I0219 23:03:40.689218 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-image-upload-8d4564f8f-pj6cx"] Feb 19 23:03:40 crc kubenswrapper[4795]: I0219 23:03:40.702489 4795 scope.go:117] "RemoveContainer" containerID="a3c2b90372f120a7eb061548f48c5800cdadbafec3663c23b6dd97d05212f21d" Feb 19 23:03:40 crc kubenswrapper[4795]: E0219 23:03:40.703139 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3c2b90372f120a7eb061548f48c5800cdadbafec3663c23b6dd97d05212f21d\": container with ID starting with a3c2b90372f120a7eb061548f48c5800cdadbafec3663c23b6dd97d05212f21d not found: ID does not exist" containerID="a3c2b90372f120a7eb061548f48c5800cdadbafec3663c23b6dd97d05212f21d" Feb 19 23:03:40 crc kubenswrapper[4795]: I0219 23:03:40.703202 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3c2b90372f120a7eb061548f48c5800cdadbafec3663c23b6dd97d05212f21d"} err="failed to get container status \"a3c2b90372f120a7eb061548f48c5800cdadbafec3663c23b6dd97d05212f21d\": rpc error: code = NotFound desc = could not find container \"a3c2b90372f120a7eb061548f48c5800cdadbafec3663c23b6dd97d05212f21d\": container with ID starting with a3c2b90372f120a7eb061548f48c5800cdadbafec3663c23b6dd97d05212f21d not found: ID does not exist" Feb 19 23:03:40 crc kubenswrapper[4795]: I0219 23:03:40.703230 4795 scope.go:117] "RemoveContainer" containerID="17e566856695530e17e7832d511ae36f5a1fc5d25825457c5e350446cc676cef" Feb 19 23:03:40 crc kubenswrapper[4795]: E0219 23:03:40.703495 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound 
desc = could not find container \"17e566856695530e17e7832d511ae36f5a1fc5d25825457c5e350446cc676cef\": container with ID starting with 17e566856695530e17e7832d511ae36f5a1fc5d25825457c5e350446cc676cef not found: ID does not exist" containerID="17e566856695530e17e7832d511ae36f5a1fc5d25825457c5e350446cc676cef" Feb 19 23:03:40 crc kubenswrapper[4795]: I0219 23:03:40.703531 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17e566856695530e17e7832d511ae36f5a1fc5d25825457c5e350446cc676cef"} err="failed to get container status \"17e566856695530e17e7832d511ae36f5a1fc5d25825457c5e350446cc676cef\": rpc error: code = NotFound desc = could not find container \"17e566856695530e17e7832d511ae36f5a1fc5d25825457c5e350446cc676cef\": container with ID starting with 17e566856695530e17e7832d511ae36f5a1fc5d25825457c5e350446cc676cef not found: ID does not exist" Feb 19 23:03:41 crc kubenswrapper[4795]: I0219 23:03:41.528326 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04efa30d-5580-4301-8a36-b452e949fcd3" path="/var/lib/kubelet/pods/04efa30d-5580-4301-8a36-b452e949fcd3/volumes" Feb 19 23:03:42 crc kubenswrapper[4795]: I0219 23:03:42.511194 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-image-upload-8d4564f8f-vp2hp"] Feb 19 23:03:42 crc kubenswrapper[4795]: E0219 23:03:42.511734 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04efa30d-5580-4301-8a36-b452e949fcd3" containerName="octavia-amphora-httpd" Feb 19 23:03:42 crc kubenswrapper[4795]: I0219 23:03:42.511750 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="04efa30d-5580-4301-8a36-b452e949fcd3" containerName="octavia-amphora-httpd" Feb 19 23:03:42 crc kubenswrapper[4795]: E0219 23:03:42.511777 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7895d70-3c78-4913-9028-75797e6e1dbd" containerName="init" Feb 19 23:03:42 crc kubenswrapper[4795]: I0219 23:03:42.511787 4795 
state_mem.go:107] "Deleted CPUSet assignment" podUID="c7895d70-3c78-4913-9028-75797e6e1dbd" containerName="init" Feb 19 23:03:42 crc kubenswrapper[4795]: E0219 23:03:42.511809 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7895d70-3c78-4913-9028-75797e6e1dbd" containerName="octavia-db-sync" Feb 19 23:03:42 crc kubenswrapper[4795]: I0219 23:03:42.511818 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7895d70-3c78-4913-9028-75797e6e1dbd" containerName="octavia-db-sync" Feb 19 23:03:42 crc kubenswrapper[4795]: E0219 23:03:42.511838 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04efa30d-5580-4301-8a36-b452e949fcd3" containerName="init" Feb 19 23:03:42 crc kubenswrapper[4795]: I0219 23:03:42.511847 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="04efa30d-5580-4301-8a36-b452e949fcd3" containerName="init" Feb 19 23:03:42 crc kubenswrapper[4795]: I0219 23:03:42.512066 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="04efa30d-5580-4301-8a36-b452e949fcd3" containerName="octavia-amphora-httpd" Feb 19 23:03:42 crc kubenswrapper[4795]: I0219 23:03:42.512079 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7895d70-3c78-4913-9028-75797e6e1dbd" containerName="octavia-db-sync" Feb 19 23:03:42 crc kubenswrapper[4795]: I0219 23:03:42.514553 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-8d4564f8f-vp2hp" Feb 19 23:03:42 crc kubenswrapper[4795]: I0219 23:03:42.517522 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-config-data" Feb 19 23:03:42 crc kubenswrapper[4795]: I0219 23:03:42.542836 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-8d4564f8f-vp2hp"] Feb 19 23:03:42 crc kubenswrapper[4795]: I0219 23:03:42.617665 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/73c5ad0c-a7f2-414d-a1f8-041a807d82b9-amphora-image\") pod \"octavia-image-upload-8d4564f8f-vp2hp\" (UID: \"73c5ad0c-a7f2-414d-a1f8-041a807d82b9\") " pod="openstack/octavia-image-upload-8d4564f8f-vp2hp" Feb 19 23:03:42 crc kubenswrapper[4795]: I0219 23:03:42.617757 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/73c5ad0c-a7f2-414d-a1f8-041a807d82b9-httpd-config\") pod \"octavia-image-upload-8d4564f8f-vp2hp\" (UID: \"73c5ad0c-a7f2-414d-a1f8-041a807d82b9\") " pod="openstack/octavia-image-upload-8d4564f8f-vp2hp" Feb 19 23:03:42 crc kubenswrapper[4795]: I0219 23:03:42.722114 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/73c5ad0c-a7f2-414d-a1f8-041a807d82b9-amphora-image\") pod \"octavia-image-upload-8d4564f8f-vp2hp\" (UID: \"73c5ad0c-a7f2-414d-a1f8-041a807d82b9\") " pod="openstack/octavia-image-upload-8d4564f8f-vp2hp" Feb 19 23:03:42 crc kubenswrapper[4795]: I0219 23:03:42.722199 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/73c5ad0c-a7f2-414d-a1f8-041a807d82b9-httpd-config\") pod \"octavia-image-upload-8d4564f8f-vp2hp\" (UID: \"73c5ad0c-a7f2-414d-a1f8-041a807d82b9\") 
" pod="openstack/octavia-image-upload-8d4564f8f-vp2hp" Feb 19 23:03:42 crc kubenswrapper[4795]: I0219 23:03:42.722854 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/73c5ad0c-a7f2-414d-a1f8-041a807d82b9-amphora-image\") pod \"octavia-image-upload-8d4564f8f-vp2hp\" (UID: \"73c5ad0c-a7f2-414d-a1f8-041a807d82b9\") " pod="openstack/octavia-image-upload-8d4564f8f-vp2hp" Feb 19 23:03:42 crc kubenswrapper[4795]: I0219 23:03:42.731536 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/73c5ad0c-a7f2-414d-a1f8-041a807d82b9-httpd-config\") pod \"octavia-image-upload-8d4564f8f-vp2hp\" (UID: \"73c5ad0c-a7f2-414d-a1f8-041a807d82b9\") " pod="openstack/octavia-image-upload-8d4564f8f-vp2hp" Feb 19 23:03:42 crc kubenswrapper[4795]: I0219 23:03:42.854686 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-8d4564f8f-vp2hp" Feb 19 23:03:43 crc kubenswrapper[4795]: I0219 23:03:43.364147 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-8d4564f8f-vp2hp"] Feb 19 23:03:43 crc kubenswrapper[4795]: I0219 23:03:43.672329 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-8d4564f8f-vp2hp" event={"ID":"73c5ad0c-a7f2-414d-a1f8-041a807d82b9","Type":"ContainerStarted","Data":"3ce38c4982eed68631a5ca80791e20aaf553f425b0dd44ffa40018ca0b479c31"} Feb 19 23:03:44 crc kubenswrapper[4795]: I0219 23:03:44.511877 4795 scope.go:117] "RemoveContainer" containerID="18170a71e3fd2c593ee17e2961f48e8d3663518910bf7cf2c79258393902484c" Feb 19 23:03:44 crc kubenswrapper[4795]: E0219 23:03:44.512660 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:03:44 crc kubenswrapper[4795]: I0219 23:03:44.685190 4795 generic.go:334] "Generic (PLEG): container finished" podID="73c5ad0c-a7f2-414d-a1f8-041a807d82b9" containerID="5dc328a328718ac247257d6a6628622c81badfd14393f0cf9cd608946e9727fe" exitCode=0 Feb 19 23:03:44 crc kubenswrapper[4795]: I0219 23:03:44.685256 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-8d4564f8f-vp2hp" event={"ID":"73c5ad0c-a7f2-414d-a1f8-041a807d82b9","Type":"ContainerDied","Data":"5dc328a328718ac247257d6a6628622c81badfd14393f0cf9cd608946e9727fe"} Feb 19 23:03:45 crc kubenswrapper[4795]: I0219 23:03:45.694888 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-8d4564f8f-vp2hp" event={"ID":"73c5ad0c-a7f2-414d-a1f8-041a807d82b9","Type":"ContainerStarted","Data":"a0a495de52f9506a01aa5e9822d5ed0280e10e8b48c5690a8d17ca946352315f"} Feb 19 23:03:50 crc kubenswrapper[4795]: I0219 23:03:50.940598 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-image-upload-8d4564f8f-vp2hp" podStartSLOduration=8.519854308 podStartE2EDuration="8.940577975s" podCreationTimestamp="2026-02-19 23:03:42 +0000 UTC" firstStartedPulling="2026-02-19 23:03:43.364614037 +0000 UTC m=+5734.557131911" lastFinishedPulling="2026-02-19 23:03:43.785337714 +0000 UTC m=+5734.977855578" observedRunningTime="2026-02-19 23:03:45.720666068 +0000 UTC m=+5736.913183922" watchObservedRunningTime="2026-02-19 23:03:50.940577975 +0000 UTC m=+5742.133095849" Feb 19 23:03:50 crc kubenswrapper[4795]: I0219 23:03:50.944927 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ggcwd"] Feb 19 23:03:50 crc kubenswrapper[4795]: I0219 23:03:50.948651 4795 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ggcwd" Feb 19 23:03:50 crc kubenswrapper[4795]: I0219 23:03:50.958388 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ggcwd"] Feb 19 23:03:51 crc kubenswrapper[4795]: I0219 23:03:51.085461 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/604a8d08-540f-450f-ae5a-f627d2023851-utilities\") pod \"redhat-marketplace-ggcwd\" (UID: \"604a8d08-540f-450f-ae5a-f627d2023851\") " pod="openshift-marketplace/redhat-marketplace-ggcwd" Feb 19 23:03:51 crc kubenswrapper[4795]: I0219 23:03:51.085604 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/604a8d08-540f-450f-ae5a-f627d2023851-catalog-content\") pod \"redhat-marketplace-ggcwd\" (UID: \"604a8d08-540f-450f-ae5a-f627d2023851\") " pod="openshift-marketplace/redhat-marketplace-ggcwd" Feb 19 23:03:51 crc kubenswrapper[4795]: I0219 23:03:51.085629 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dl9s7\" (UniqueName: \"kubernetes.io/projected/604a8d08-540f-450f-ae5a-f627d2023851-kube-api-access-dl9s7\") pod \"redhat-marketplace-ggcwd\" (UID: \"604a8d08-540f-450f-ae5a-f627d2023851\") " pod="openshift-marketplace/redhat-marketplace-ggcwd" Feb 19 23:03:51 crc kubenswrapper[4795]: I0219 23:03:51.187009 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/604a8d08-540f-450f-ae5a-f627d2023851-catalog-content\") pod \"redhat-marketplace-ggcwd\" (UID: \"604a8d08-540f-450f-ae5a-f627d2023851\") " pod="openshift-marketplace/redhat-marketplace-ggcwd" Feb 19 23:03:51 crc kubenswrapper[4795]: I0219 23:03:51.187061 
4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dl9s7\" (UniqueName: \"kubernetes.io/projected/604a8d08-540f-450f-ae5a-f627d2023851-kube-api-access-dl9s7\") pod \"redhat-marketplace-ggcwd\" (UID: \"604a8d08-540f-450f-ae5a-f627d2023851\") " pod="openshift-marketplace/redhat-marketplace-ggcwd" Feb 19 23:03:51 crc kubenswrapper[4795]: I0219 23:03:51.187159 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/604a8d08-540f-450f-ae5a-f627d2023851-utilities\") pod \"redhat-marketplace-ggcwd\" (UID: \"604a8d08-540f-450f-ae5a-f627d2023851\") " pod="openshift-marketplace/redhat-marketplace-ggcwd" Feb 19 23:03:51 crc kubenswrapper[4795]: I0219 23:03:51.187558 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/604a8d08-540f-450f-ae5a-f627d2023851-catalog-content\") pod \"redhat-marketplace-ggcwd\" (UID: \"604a8d08-540f-450f-ae5a-f627d2023851\") " pod="openshift-marketplace/redhat-marketplace-ggcwd" Feb 19 23:03:51 crc kubenswrapper[4795]: I0219 23:03:51.187627 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/604a8d08-540f-450f-ae5a-f627d2023851-utilities\") pod \"redhat-marketplace-ggcwd\" (UID: \"604a8d08-540f-450f-ae5a-f627d2023851\") " pod="openshift-marketplace/redhat-marketplace-ggcwd" Feb 19 23:03:51 crc kubenswrapper[4795]: I0219 23:03:51.212634 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dl9s7\" (UniqueName: \"kubernetes.io/projected/604a8d08-540f-450f-ae5a-f627d2023851-kube-api-access-dl9s7\") pod \"redhat-marketplace-ggcwd\" (UID: \"604a8d08-540f-450f-ae5a-f627d2023851\") " pod="openshift-marketplace/redhat-marketplace-ggcwd" Feb 19 23:03:51 crc kubenswrapper[4795]: I0219 23:03:51.285542 4795 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ggcwd" Feb 19 23:03:51 crc kubenswrapper[4795]: I0219 23:03:51.757356 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ggcwd"] Feb 19 23:03:52 crc kubenswrapper[4795]: I0219 23:03:52.774716 4795 generic.go:334] "Generic (PLEG): container finished" podID="604a8d08-540f-450f-ae5a-f627d2023851" containerID="d902470a19ebe7417abce2843271bb85008f8e47eb2f80f53409ec3c0c5d3d14" exitCode=0 Feb 19 23:03:52 crc kubenswrapper[4795]: I0219 23:03:52.774757 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ggcwd" event={"ID":"604a8d08-540f-450f-ae5a-f627d2023851","Type":"ContainerDied","Data":"d902470a19ebe7417abce2843271bb85008f8e47eb2f80f53409ec3c0c5d3d14"} Feb 19 23:03:52 crc kubenswrapper[4795]: I0219 23:03:52.774962 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ggcwd" event={"ID":"604a8d08-540f-450f-ae5a-f627d2023851","Type":"ContainerStarted","Data":"9d38bc5478406d8bb61b38691f880d15c257a58823105f014b66c85569ab6ae9"} Feb 19 23:03:53 crc kubenswrapper[4795]: I0219 23:03:53.784953 4795 generic.go:334] "Generic (PLEG): container finished" podID="604a8d08-540f-450f-ae5a-f627d2023851" containerID="11f02dc00e80c0fcdb142b7d1dea8181c76ed1f4ee204ff1b653f00c55c62c50" exitCode=0 Feb 19 23:03:53 crc kubenswrapper[4795]: I0219 23:03:53.785030 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ggcwd" event={"ID":"604a8d08-540f-450f-ae5a-f627d2023851","Type":"ContainerDied","Data":"11f02dc00e80c0fcdb142b7d1dea8181c76ed1f4ee204ff1b653f00c55c62c50"} Feb 19 23:03:54 crc kubenswrapper[4795]: I0219 23:03:54.800624 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ggcwd" 
event={"ID":"604a8d08-540f-450f-ae5a-f627d2023851","Type":"ContainerStarted","Data":"eef135ebd487d307fa54d1f6a09d89b69d4371f333a48251e8a62d22b2127115"} Feb 19 23:03:54 crc kubenswrapper[4795]: I0219 23:03:54.839085 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ggcwd" podStartSLOduration=3.400687603 podStartE2EDuration="4.839062664s" podCreationTimestamp="2026-02-19 23:03:50 +0000 UTC" firstStartedPulling="2026-02-19 23:03:52.777508922 +0000 UTC m=+5743.970026786" lastFinishedPulling="2026-02-19 23:03:54.215883993 +0000 UTC m=+5745.408401847" observedRunningTime="2026-02-19 23:03:54.82871624 +0000 UTC m=+5746.021234124" watchObservedRunningTime="2026-02-19 23:03:54.839062664 +0000 UTC m=+5746.031580548" Feb 19 23:03:56 crc kubenswrapper[4795]: I0219 23:03:56.511344 4795 scope.go:117] "RemoveContainer" containerID="18170a71e3fd2c593ee17e2961f48e8d3663518910bf7cf2c79258393902484c" Feb 19 23:03:56 crc kubenswrapper[4795]: E0219 23:03:56.512807 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:04:01 crc kubenswrapper[4795]: I0219 23:04:01.289538 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ggcwd" Feb 19 23:04:01 crc kubenswrapper[4795]: I0219 23:04:01.291232 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ggcwd" Feb 19 23:04:01 crc kubenswrapper[4795]: I0219 23:04:01.353017 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-ggcwd" Feb 19 23:04:01 crc kubenswrapper[4795]: I0219 23:04:01.928493 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ggcwd" Feb 19 23:04:02 crc kubenswrapper[4795]: I0219 23:04:02.007460 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ggcwd"] Feb 19 23:04:03 crc kubenswrapper[4795]: I0219 23:04:03.880315 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ggcwd" podUID="604a8d08-540f-450f-ae5a-f627d2023851" containerName="registry-server" containerID="cri-o://eef135ebd487d307fa54d1f6a09d89b69d4371f333a48251e8a62d22b2127115" gracePeriod=2 Feb 19 23:04:04 crc kubenswrapper[4795]: I0219 23:04:04.355775 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ggcwd" Feb 19 23:04:04 crc kubenswrapper[4795]: I0219 23:04:04.460647 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/604a8d08-540f-450f-ae5a-f627d2023851-catalog-content\") pod \"604a8d08-540f-450f-ae5a-f627d2023851\" (UID: \"604a8d08-540f-450f-ae5a-f627d2023851\") " Feb 19 23:04:04 crc kubenswrapper[4795]: I0219 23:04:04.460865 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/604a8d08-540f-450f-ae5a-f627d2023851-utilities\") pod \"604a8d08-540f-450f-ae5a-f627d2023851\" (UID: \"604a8d08-540f-450f-ae5a-f627d2023851\") " Feb 19 23:04:04 crc kubenswrapper[4795]: I0219 23:04:04.461056 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dl9s7\" (UniqueName: \"kubernetes.io/projected/604a8d08-540f-450f-ae5a-f627d2023851-kube-api-access-dl9s7\") pod 
\"604a8d08-540f-450f-ae5a-f627d2023851\" (UID: \"604a8d08-540f-450f-ae5a-f627d2023851\") " Feb 19 23:04:04 crc kubenswrapper[4795]: I0219 23:04:04.461663 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/604a8d08-540f-450f-ae5a-f627d2023851-utilities" (OuterVolumeSpecName: "utilities") pod "604a8d08-540f-450f-ae5a-f627d2023851" (UID: "604a8d08-540f-450f-ae5a-f627d2023851"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:04:04 crc kubenswrapper[4795]: I0219 23:04:04.466772 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/604a8d08-540f-450f-ae5a-f627d2023851-kube-api-access-dl9s7" (OuterVolumeSpecName: "kube-api-access-dl9s7") pod "604a8d08-540f-450f-ae5a-f627d2023851" (UID: "604a8d08-540f-450f-ae5a-f627d2023851"). InnerVolumeSpecName "kube-api-access-dl9s7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:04:04 crc kubenswrapper[4795]: I0219 23:04:04.482899 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/604a8d08-540f-450f-ae5a-f627d2023851-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "604a8d08-540f-450f-ae5a-f627d2023851" (UID: "604a8d08-540f-450f-ae5a-f627d2023851"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:04:04 crc kubenswrapper[4795]: I0219 23:04:04.563330 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/604a8d08-540f-450f-ae5a-f627d2023851-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 23:04:04 crc kubenswrapper[4795]: I0219 23:04:04.563368 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dl9s7\" (UniqueName: \"kubernetes.io/projected/604a8d08-540f-450f-ae5a-f627d2023851-kube-api-access-dl9s7\") on node \"crc\" DevicePath \"\"" Feb 19 23:04:04 crc kubenswrapper[4795]: I0219 23:04:04.563377 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/604a8d08-540f-450f-ae5a-f627d2023851-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 23:04:04 crc kubenswrapper[4795]: I0219 23:04:04.891565 4795 generic.go:334] "Generic (PLEG): container finished" podID="604a8d08-540f-450f-ae5a-f627d2023851" containerID="eef135ebd487d307fa54d1f6a09d89b69d4371f333a48251e8a62d22b2127115" exitCode=0 Feb 19 23:04:04 crc kubenswrapper[4795]: I0219 23:04:04.891607 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ggcwd" event={"ID":"604a8d08-540f-450f-ae5a-f627d2023851","Type":"ContainerDied","Data":"eef135ebd487d307fa54d1f6a09d89b69d4371f333a48251e8a62d22b2127115"} Feb 19 23:04:04 crc kubenswrapper[4795]: I0219 23:04:04.891660 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ggcwd" event={"ID":"604a8d08-540f-450f-ae5a-f627d2023851","Type":"ContainerDied","Data":"9d38bc5478406d8bb61b38691f880d15c257a58823105f014b66c85569ab6ae9"} Feb 19 23:04:04 crc kubenswrapper[4795]: I0219 23:04:04.891676 4795 scope.go:117] "RemoveContainer" containerID="eef135ebd487d307fa54d1f6a09d89b69d4371f333a48251e8a62d22b2127115" Feb 19 23:04:04 crc kubenswrapper[4795]: I0219 
23:04:04.891811 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ggcwd" Feb 19 23:04:04 crc kubenswrapper[4795]: I0219 23:04:04.918934 4795 scope.go:117] "RemoveContainer" containerID="11f02dc00e80c0fcdb142b7d1dea8181c76ed1f4ee204ff1b653f00c55c62c50" Feb 19 23:04:04 crc kubenswrapper[4795]: I0219 23:04:04.949871 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ggcwd"] Feb 19 23:04:04 crc kubenswrapper[4795]: I0219 23:04:04.966458 4795 scope.go:117] "RemoveContainer" containerID="d902470a19ebe7417abce2843271bb85008f8e47eb2f80f53409ec3c0c5d3d14" Feb 19 23:04:04 crc kubenswrapper[4795]: I0219 23:04:04.972524 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ggcwd"] Feb 19 23:04:04 crc kubenswrapper[4795]: I0219 23:04:04.996243 4795 scope.go:117] "RemoveContainer" containerID="eef135ebd487d307fa54d1f6a09d89b69d4371f333a48251e8a62d22b2127115" Feb 19 23:04:04 crc kubenswrapper[4795]: E0219 23:04:04.996711 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eef135ebd487d307fa54d1f6a09d89b69d4371f333a48251e8a62d22b2127115\": container with ID starting with eef135ebd487d307fa54d1f6a09d89b69d4371f333a48251e8a62d22b2127115 not found: ID does not exist" containerID="eef135ebd487d307fa54d1f6a09d89b69d4371f333a48251e8a62d22b2127115" Feb 19 23:04:04 crc kubenswrapper[4795]: I0219 23:04:04.996803 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eef135ebd487d307fa54d1f6a09d89b69d4371f333a48251e8a62d22b2127115"} err="failed to get container status \"eef135ebd487d307fa54d1f6a09d89b69d4371f333a48251e8a62d22b2127115\": rpc error: code = NotFound desc = could not find container \"eef135ebd487d307fa54d1f6a09d89b69d4371f333a48251e8a62d22b2127115\": container with ID starting with 
eef135ebd487d307fa54d1f6a09d89b69d4371f333a48251e8a62d22b2127115 not found: ID does not exist" Feb 19 23:04:04 crc kubenswrapper[4795]: I0219 23:04:04.996880 4795 scope.go:117] "RemoveContainer" containerID="11f02dc00e80c0fcdb142b7d1dea8181c76ed1f4ee204ff1b653f00c55c62c50" Feb 19 23:04:04 crc kubenswrapper[4795]: E0219 23:04:04.997273 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11f02dc00e80c0fcdb142b7d1dea8181c76ed1f4ee204ff1b653f00c55c62c50\": container with ID starting with 11f02dc00e80c0fcdb142b7d1dea8181c76ed1f4ee204ff1b653f00c55c62c50 not found: ID does not exist" containerID="11f02dc00e80c0fcdb142b7d1dea8181c76ed1f4ee204ff1b653f00c55c62c50" Feb 19 23:04:04 crc kubenswrapper[4795]: I0219 23:04:04.997335 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11f02dc00e80c0fcdb142b7d1dea8181c76ed1f4ee204ff1b653f00c55c62c50"} err="failed to get container status \"11f02dc00e80c0fcdb142b7d1dea8181c76ed1f4ee204ff1b653f00c55c62c50\": rpc error: code = NotFound desc = could not find container \"11f02dc00e80c0fcdb142b7d1dea8181c76ed1f4ee204ff1b653f00c55c62c50\": container with ID starting with 11f02dc00e80c0fcdb142b7d1dea8181c76ed1f4ee204ff1b653f00c55c62c50 not found: ID does not exist" Feb 19 23:04:04 crc kubenswrapper[4795]: I0219 23:04:04.997372 4795 scope.go:117] "RemoveContainer" containerID="d902470a19ebe7417abce2843271bb85008f8e47eb2f80f53409ec3c0c5d3d14" Feb 19 23:04:04 crc kubenswrapper[4795]: E0219 23:04:04.997831 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d902470a19ebe7417abce2843271bb85008f8e47eb2f80f53409ec3c0c5d3d14\": container with ID starting with d902470a19ebe7417abce2843271bb85008f8e47eb2f80f53409ec3c0c5d3d14 not found: ID does not exist" containerID="d902470a19ebe7417abce2843271bb85008f8e47eb2f80f53409ec3c0c5d3d14" Feb 19 23:04:04 crc 
kubenswrapper[4795]: I0219 23:04:04.997907 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d902470a19ebe7417abce2843271bb85008f8e47eb2f80f53409ec3c0c5d3d14"} err="failed to get container status \"d902470a19ebe7417abce2843271bb85008f8e47eb2f80f53409ec3c0c5d3d14\": rpc error: code = NotFound desc = could not find container \"d902470a19ebe7417abce2843271bb85008f8e47eb2f80f53409ec3c0c5d3d14\": container with ID starting with d902470a19ebe7417abce2843271bb85008f8e47eb2f80f53409ec3c0c5d3d14 not found: ID does not exist" Feb 19 23:04:05 crc kubenswrapper[4795]: I0219 23:04:05.522484 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="604a8d08-540f-450f-ae5a-f627d2023851" path="/var/lib/kubelet/pods/604a8d08-540f-450f-ae5a-f627d2023851/volumes" Feb 19 23:04:09 crc kubenswrapper[4795]: I0219 23:04:09.498583 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-healthmanager-g59hr"] Feb 19 23:04:09 crc kubenswrapper[4795]: E0219 23:04:09.502695 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="604a8d08-540f-450f-ae5a-f627d2023851" containerName="extract-content" Feb 19 23:04:09 crc kubenswrapper[4795]: I0219 23:04:09.502741 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="604a8d08-540f-450f-ae5a-f627d2023851" containerName="extract-content" Feb 19 23:04:09 crc kubenswrapper[4795]: E0219 23:04:09.502947 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="604a8d08-540f-450f-ae5a-f627d2023851" containerName="registry-server" Feb 19 23:04:09 crc kubenswrapper[4795]: I0219 23:04:09.502963 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="604a8d08-540f-450f-ae5a-f627d2023851" containerName="registry-server" Feb 19 23:04:09 crc kubenswrapper[4795]: E0219 23:04:09.503012 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="604a8d08-540f-450f-ae5a-f627d2023851" containerName="extract-utilities" Feb 19 23:04:09 
crc kubenswrapper[4795]: I0219 23:04:09.503021 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="604a8d08-540f-450f-ae5a-f627d2023851" containerName="extract-utilities" Feb 19 23:04:09 crc kubenswrapper[4795]: I0219 23:04:09.505905 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="604a8d08-540f-450f-ae5a-f627d2023851" containerName="registry-server" Feb 19 23:04:09 crc kubenswrapper[4795]: I0219 23:04:09.509754 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-healthmanager-g59hr" Feb 19 23:04:09 crc kubenswrapper[4795]: I0219 23:04:09.520761 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-healthmanager-scripts" Feb 19 23:04:09 crc kubenswrapper[4795]: I0219 23:04:09.522884 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-healthmanager-config-data" Feb 19 23:04:09 crc kubenswrapper[4795]: I0219 23:04:09.527495 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-certs-secret" Feb 19 23:04:09 crc kubenswrapper[4795]: I0219 23:04:09.563412 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-g59hr"] Feb 19 23:04:09 crc kubenswrapper[4795]: I0219 23:04:09.657249 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/107db266-c130-4312-be67-ffe75016fd44-scripts\") pod \"octavia-healthmanager-g59hr\" (UID: \"107db266-c130-4312-be67-ffe75016fd44\") " pod="openstack/octavia-healthmanager-g59hr" Feb 19 23:04:09 crc kubenswrapper[4795]: I0219 23:04:09.657292 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/107db266-c130-4312-be67-ffe75016fd44-config-data\") pod \"octavia-healthmanager-g59hr\" (UID: \"107db266-c130-4312-be67-ffe75016fd44\") " 
pod="openstack/octavia-healthmanager-g59hr" Feb 19 23:04:09 crc kubenswrapper[4795]: I0219 23:04:09.657344 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/107db266-c130-4312-be67-ffe75016fd44-combined-ca-bundle\") pod \"octavia-healthmanager-g59hr\" (UID: \"107db266-c130-4312-be67-ffe75016fd44\") " pod="openstack/octavia-healthmanager-g59hr" Feb 19 23:04:09 crc kubenswrapper[4795]: I0219 23:04:09.657385 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/107db266-c130-4312-be67-ffe75016fd44-config-data-merged\") pod \"octavia-healthmanager-g59hr\" (UID: \"107db266-c130-4312-be67-ffe75016fd44\") " pod="openstack/octavia-healthmanager-g59hr" Feb 19 23:04:09 crc kubenswrapper[4795]: I0219 23:04:09.657451 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/107db266-c130-4312-be67-ffe75016fd44-hm-ports\") pod \"octavia-healthmanager-g59hr\" (UID: \"107db266-c130-4312-be67-ffe75016fd44\") " pod="openstack/octavia-healthmanager-g59hr" Feb 19 23:04:09 crc kubenswrapper[4795]: I0219 23:04:09.657482 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/107db266-c130-4312-be67-ffe75016fd44-amphora-certs\") pod \"octavia-healthmanager-g59hr\" (UID: \"107db266-c130-4312-be67-ffe75016fd44\") " pod="openstack/octavia-healthmanager-g59hr" Feb 19 23:04:09 crc kubenswrapper[4795]: I0219 23:04:09.758626 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/107db266-c130-4312-be67-ffe75016fd44-scripts\") pod \"octavia-healthmanager-g59hr\" (UID: \"107db266-c130-4312-be67-ffe75016fd44\") " 
pod="openstack/octavia-healthmanager-g59hr" Feb 19 23:04:09 crc kubenswrapper[4795]: I0219 23:04:09.758919 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/107db266-c130-4312-be67-ffe75016fd44-config-data\") pod \"octavia-healthmanager-g59hr\" (UID: \"107db266-c130-4312-be67-ffe75016fd44\") " pod="openstack/octavia-healthmanager-g59hr" Feb 19 23:04:09 crc kubenswrapper[4795]: I0219 23:04:09.758961 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/107db266-c130-4312-be67-ffe75016fd44-combined-ca-bundle\") pod \"octavia-healthmanager-g59hr\" (UID: \"107db266-c130-4312-be67-ffe75016fd44\") " pod="openstack/octavia-healthmanager-g59hr" Feb 19 23:04:09 crc kubenswrapper[4795]: I0219 23:04:09.759003 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/107db266-c130-4312-be67-ffe75016fd44-config-data-merged\") pod \"octavia-healthmanager-g59hr\" (UID: \"107db266-c130-4312-be67-ffe75016fd44\") " pod="openstack/octavia-healthmanager-g59hr" Feb 19 23:04:09 crc kubenswrapper[4795]: I0219 23:04:09.759069 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/107db266-c130-4312-be67-ffe75016fd44-hm-ports\") pod \"octavia-healthmanager-g59hr\" (UID: \"107db266-c130-4312-be67-ffe75016fd44\") " pod="openstack/octavia-healthmanager-g59hr" Feb 19 23:04:09 crc kubenswrapper[4795]: I0219 23:04:09.759116 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/107db266-c130-4312-be67-ffe75016fd44-amphora-certs\") pod \"octavia-healthmanager-g59hr\" (UID: \"107db266-c130-4312-be67-ffe75016fd44\") " pod="openstack/octavia-healthmanager-g59hr" Feb 19 23:04:09 crc 
kubenswrapper[4795]: I0219 23:04:09.759950 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/107db266-c130-4312-be67-ffe75016fd44-config-data-merged\") pod \"octavia-healthmanager-g59hr\" (UID: \"107db266-c130-4312-be67-ffe75016fd44\") " pod="openstack/octavia-healthmanager-g59hr" Feb 19 23:04:09 crc kubenswrapper[4795]: I0219 23:04:09.760597 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/107db266-c130-4312-be67-ffe75016fd44-hm-ports\") pod \"octavia-healthmanager-g59hr\" (UID: \"107db266-c130-4312-be67-ffe75016fd44\") " pod="openstack/octavia-healthmanager-g59hr" Feb 19 23:04:09 crc kubenswrapper[4795]: I0219 23:04:09.766134 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/107db266-c130-4312-be67-ffe75016fd44-config-data\") pod \"octavia-healthmanager-g59hr\" (UID: \"107db266-c130-4312-be67-ffe75016fd44\") " pod="openstack/octavia-healthmanager-g59hr" Feb 19 23:04:09 crc kubenswrapper[4795]: I0219 23:04:09.769032 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/107db266-c130-4312-be67-ffe75016fd44-combined-ca-bundle\") pod \"octavia-healthmanager-g59hr\" (UID: \"107db266-c130-4312-be67-ffe75016fd44\") " pod="openstack/octavia-healthmanager-g59hr" Feb 19 23:04:09 crc kubenswrapper[4795]: I0219 23:04:09.769136 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/107db266-c130-4312-be67-ffe75016fd44-amphora-certs\") pod \"octavia-healthmanager-g59hr\" (UID: \"107db266-c130-4312-be67-ffe75016fd44\") " pod="openstack/octavia-healthmanager-g59hr" Feb 19 23:04:09 crc kubenswrapper[4795]: I0219 23:04:09.779799 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/107db266-c130-4312-be67-ffe75016fd44-scripts\") pod \"octavia-healthmanager-g59hr\" (UID: \"107db266-c130-4312-be67-ffe75016fd44\") " pod="openstack/octavia-healthmanager-g59hr" Feb 19 23:04:09 crc kubenswrapper[4795]: I0219 23:04:09.856874 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-healthmanager-g59hr" Feb 19 23:04:10 crc kubenswrapper[4795]: I0219 23:04:10.469812 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-g59hr"] Feb 19 23:04:10 crc kubenswrapper[4795]: I0219 23:04:10.945401 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-g59hr" event={"ID":"107db266-c130-4312-be67-ffe75016fd44","Type":"ContainerStarted","Data":"28f69fbcb2e7fa6bbca67e47ba798af54fc3eb6462fa131a9d6bebe6df1bb316"} Feb 19 23:04:11 crc kubenswrapper[4795]: I0219 23:04:11.511719 4795 scope.go:117] "RemoveContainer" containerID="18170a71e3fd2c593ee17e2961f48e8d3663518910bf7cf2c79258393902484c" Feb 19 23:04:11 crc kubenswrapper[4795]: E0219 23:04:11.512397 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:04:11 crc kubenswrapper[4795]: I0219 23:04:11.738287 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-housekeeping-cm4g6"] Feb 19 23:04:11 crc kubenswrapper[4795]: I0219 23:04:11.739978 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-housekeeping-cm4g6" Feb 19 23:04:11 crc kubenswrapper[4795]: I0219 23:04:11.743620 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-housekeeping-config-data" Feb 19 23:04:11 crc kubenswrapper[4795]: I0219 23:04:11.744027 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-housekeeping-scripts" Feb 19 23:04:11 crc kubenswrapper[4795]: I0219 23:04:11.758561 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-housekeeping-cm4g6"] Feb 19 23:04:11 crc kubenswrapper[4795]: I0219 23:04:11.907011 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/7167d9ee-5127-43c9-957a-598d9dcfecb3-config-data-merged\") pod \"octavia-housekeeping-cm4g6\" (UID: \"7167d9ee-5127-43c9-957a-598d9dcfecb3\") " pod="openstack/octavia-housekeeping-cm4g6" Feb 19 23:04:11 crc kubenswrapper[4795]: I0219 23:04:11.907046 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/7167d9ee-5127-43c9-957a-598d9dcfecb3-amphora-certs\") pod \"octavia-housekeeping-cm4g6\" (UID: \"7167d9ee-5127-43c9-957a-598d9dcfecb3\") " pod="openstack/octavia-housekeeping-cm4g6" Feb 19 23:04:11 crc kubenswrapper[4795]: I0219 23:04:11.907077 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7167d9ee-5127-43c9-957a-598d9dcfecb3-combined-ca-bundle\") pod \"octavia-housekeeping-cm4g6\" (UID: \"7167d9ee-5127-43c9-957a-598d9dcfecb3\") " pod="openstack/octavia-housekeeping-cm4g6" Feb 19 23:04:11 crc kubenswrapper[4795]: I0219 23:04:11.907157 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7167d9ee-5127-43c9-957a-598d9dcfecb3-config-data\") pod \"octavia-housekeeping-cm4g6\" (UID: \"7167d9ee-5127-43c9-957a-598d9dcfecb3\") " pod="openstack/octavia-housekeeping-cm4g6" Feb 19 23:04:11 crc kubenswrapper[4795]: I0219 23:04:11.907263 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/7167d9ee-5127-43c9-957a-598d9dcfecb3-hm-ports\") pod \"octavia-housekeeping-cm4g6\" (UID: \"7167d9ee-5127-43c9-957a-598d9dcfecb3\") " pod="openstack/octavia-housekeeping-cm4g6" Feb 19 23:04:11 crc kubenswrapper[4795]: I0219 23:04:11.907287 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7167d9ee-5127-43c9-957a-598d9dcfecb3-scripts\") pod \"octavia-housekeeping-cm4g6\" (UID: \"7167d9ee-5127-43c9-957a-598d9dcfecb3\") " pod="openstack/octavia-housekeeping-cm4g6" Feb 19 23:04:11 crc kubenswrapper[4795]: I0219 23:04:11.957239 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-g59hr" event={"ID":"107db266-c130-4312-be67-ffe75016fd44","Type":"ContainerStarted","Data":"107a2524974dddf0c18c6e83d9afecbe178f926101f7f9b4b0ed708d237cc506"} Feb 19 23:04:12 crc kubenswrapper[4795]: I0219 23:04:12.008720 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/7167d9ee-5127-43c9-957a-598d9dcfecb3-hm-ports\") pod \"octavia-housekeeping-cm4g6\" (UID: \"7167d9ee-5127-43c9-957a-598d9dcfecb3\") " pod="openstack/octavia-housekeeping-cm4g6" Feb 19 23:04:12 crc kubenswrapper[4795]: I0219 23:04:12.008772 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7167d9ee-5127-43c9-957a-598d9dcfecb3-scripts\") pod \"octavia-housekeeping-cm4g6\" (UID: \"7167d9ee-5127-43c9-957a-598d9dcfecb3\") " 
pod="openstack/octavia-housekeeping-cm4g6" Feb 19 23:04:12 crc kubenswrapper[4795]: I0219 23:04:12.009186 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/7167d9ee-5127-43c9-957a-598d9dcfecb3-config-data-merged\") pod \"octavia-housekeeping-cm4g6\" (UID: \"7167d9ee-5127-43c9-957a-598d9dcfecb3\") " pod="openstack/octavia-housekeeping-cm4g6" Feb 19 23:04:12 crc kubenswrapper[4795]: I0219 23:04:12.009217 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/7167d9ee-5127-43c9-957a-598d9dcfecb3-amphora-certs\") pod \"octavia-housekeeping-cm4g6\" (UID: \"7167d9ee-5127-43c9-957a-598d9dcfecb3\") " pod="openstack/octavia-housekeeping-cm4g6" Feb 19 23:04:12 crc kubenswrapper[4795]: I0219 23:04:12.009268 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7167d9ee-5127-43c9-957a-598d9dcfecb3-combined-ca-bundle\") pod \"octavia-housekeeping-cm4g6\" (UID: \"7167d9ee-5127-43c9-957a-598d9dcfecb3\") " pod="openstack/octavia-housekeeping-cm4g6" Feb 19 23:04:12 crc kubenswrapper[4795]: I0219 23:04:12.009301 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7167d9ee-5127-43c9-957a-598d9dcfecb3-config-data\") pod \"octavia-housekeeping-cm4g6\" (UID: \"7167d9ee-5127-43c9-957a-598d9dcfecb3\") " pod="openstack/octavia-housekeeping-cm4g6" Feb 19 23:04:12 crc kubenswrapper[4795]: I0219 23:04:12.009953 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/7167d9ee-5127-43c9-957a-598d9dcfecb3-hm-ports\") pod \"octavia-housekeeping-cm4g6\" (UID: \"7167d9ee-5127-43c9-957a-598d9dcfecb3\") " pod="openstack/octavia-housekeeping-cm4g6" Feb 19 23:04:12 crc kubenswrapper[4795]: I0219 
23:04:12.010013 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/7167d9ee-5127-43c9-957a-598d9dcfecb3-config-data-merged\") pod \"octavia-housekeeping-cm4g6\" (UID: \"7167d9ee-5127-43c9-957a-598d9dcfecb3\") " pod="openstack/octavia-housekeeping-cm4g6" Feb 19 23:04:12 crc kubenswrapper[4795]: I0219 23:04:12.015521 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7167d9ee-5127-43c9-957a-598d9dcfecb3-config-data\") pod \"octavia-housekeeping-cm4g6\" (UID: \"7167d9ee-5127-43c9-957a-598d9dcfecb3\") " pod="openstack/octavia-housekeeping-cm4g6" Feb 19 23:04:12 crc kubenswrapper[4795]: I0219 23:04:12.015690 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/7167d9ee-5127-43c9-957a-598d9dcfecb3-amphora-certs\") pod \"octavia-housekeeping-cm4g6\" (UID: \"7167d9ee-5127-43c9-957a-598d9dcfecb3\") " pod="openstack/octavia-housekeeping-cm4g6" Feb 19 23:04:12 crc kubenswrapper[4795]: I0219 23:04:12.016296 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7167d9ee-5127-43c9-957a-598d9dcfecb3-scripts\") pod \"octavia-housekeeping-cm4g6\" (UID: \"7167d9ee-5127-43c9-957a-598d9dcfecb3\") " pod="openstack/octavia-housekeeping-cm4g6" Feb 19 23:04:12 crc kubenswrapper[4795]: I0219 23:04:12.022919 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7167d9ee-5127-43c9-957a-598d9dcfecb3-combined-ca-bundle\") pod \"octavia-housekeeping-cm4g6\" (UID: \"7167d9ee-5127-43c9-957a-598d9dcfecb3\") " pod="openstack/octavia-housekeeping-cm4g6" Feb 19 23:04:12 crc kubenswrapper[4795]: I0219 23:04:12.056797 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-housekeeping-cm4g6" Feb 19 23:04:12 crc kubenswrapper[4795]: I0219 23:04:12.709476 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-housekeeping-cm4g6"] Feb 19 23:04:12 crc kubenswrapper[4795]: I0219 23:04:12.723212 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 23:04:12 crc kubenswrapper[4795]: I0219 23:04:12.968070 4795 generic.go:334] "Generic (PLEG): container finished" podID="107db266-c130-4312-be67-ffe75016fd44" containerID="107a2524974dddf0c18c6e83d9afecbe178f926101f7f9b4b0ed708d237cc506" exitCode=0 Feb 19 23:04:12 crc kubenswrapper[4795]: I0219 23:04:12.968134 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-g59hr" event={"ID":"107db266-c130-4312-be67-ffe75016fd44","Type":"ContainerDied","Data":"107a2524974dddf0c18c6e83d9afecbe178f926101f7f9b4b0ed708d237cc506"} Feb 19 23:04:12 crc kubenswrapper[4795]: I0219 23:04:12.972277 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-cm4g6" event={"ID":"7167d9ee-5127-43c9-957a-598d9dcfecb3","Type":"ContainerStarted","Data":"bdf026767620b1d93453c9606f54712db3d1d189eae8b3852b6cd91396f98126"} Feb 19 23:04:13 crc kubenswrapper[4795]: I0219 23:04:13.649084 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-worker-pktjg"] Feb 19 23:04:13 crc kubenswrapper[4795]: I0219 23:04:13.651581 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-worker-pktjg" Feb 19 23:04:13 crc kubenswrapper[4795]: I0219 23:04:13.656544 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-worker-scripts" Feb 19 23:04:13 crc kubenswrapper[4795]: I0219 23:04:13.656566 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-worker-config-data" Feb 19 23:04:13 crc kubenswrapper[4795]: I0219 23:04:13.670534 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-worker-pktjg"] Feb 19 23:04:13 crc kubenswrapper[4795]: I0219 23:04:13.754235 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/cfbfd9d0-564e-41d0-8171-5f32f380a3df-amphora-certs\") pod \"octavia-worker-pktjg\" (UID: \"cfbfd9d0-564e-41d0-8171-5f32f380a3df\") " pod="openstack/octavia-worker-pktjg" Feb 19 23:04:13 crc kubenswrapper[4795]: I0219 23:04:13.760583 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/cfbfd9d0-564e-41d0-8171-5f32f380a3df-config-data-merged\") pod \"octavia-worker-pktjg\" (UID: \"cfbfd9d0-564e-41d0-8171-5f32f380a3df\") " pod="openstack/octavia-worker-pktjg" Feb 19 23:04:13 crc kubenswrapper[4795]: I0219 23:04:13.760736 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfbfd9d0-564e-41d0-8171-5f32f380a3df-scripts\") pod \"octavia-worker-pktjg\" (UID: \"cfbfd9d0-564e-41d0-8171-5f32f380a3df\") " pod="openstack/octavia-worker-pktjg" Feb 19 23:04:13 crc kubenswrapper[4795]: I0219 23:04:13.761002 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/cfbfd9d0-564e-41d0-8171-5f32f380a3df-hm-ports\") pod 
\"octavia-worker-pktjg\" (UID: \"cfbfd9d0-564e-41d0-8171-5f32f380a3df\") " pod="openstack/octavia-worker-pktjg" Feb 19 23:04:13 crc kubenswrapper[4795]: I0219 23:04:13.761129 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfbfd9d0-564e-41d0-8171-5f32f380a3df-combined-ca-bundle\") pod \"octavia-worker-pktjg\" (UID: \"cfbfd9d0-564e-41d0-8171-5f32f380a3df\") " pod="openstack/octavia-worker-pktjg" Feb 19 23:04:13 crc kubenswrapper[4795]: I0219 23:04:13.761283 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfbfd9d0-564e-41d0-8171-5f32f380a3df-config-data\") pod \"octavia-worker-pktjg\" (UID: \"cfbfd9d0-564e-41d0-8171-5f32f380a3df\") " pod="openstack/octavia-worker-pktjg" Feb 19 23:04:13 crc kubenswrapper[4795]: I0219 23:04:13.863311 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfbfd9d0-564e-41d0-8171-5f32f380a3df-config-data\") pod \"octavia-worker-pktjg\" (UID: \"cfbfd9d0-564e-41d0-8171-5f32f380a3df\") " pod="openstack/octavia-worker-pktjg" Feb 19 23:04:13 crc kubenswrapper[4795]: I0219 23:04:13.863388 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/cfbfd9d0-564e-41d0-8171-5f32f380a3df-amphora-certs\") pod \"octavia-worker-pktjg\" (UID: \"cfbfd9d0-564e-41d0-8171-5f32f380a3df\") " pod="openstack/octavia-worker-pktjg" Feb 19 23:04:13 crc kubenswrapper[4795]: I0219 23:04:13.863413 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/cfbfd9d0-564e-41d0-8171-5f32f380a3df-config-data-merged\") pod \"octavia-worker-pktjg\" (UID: \"cfbfd9d0-564e-41d0-8171-5f32f380a3df\") " 
pod="openstack/octavia-worker-pktjg" Feb 19 23:04:13 crc kubenswrapper[4795]: I0219 23:04:13.863447 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfbfd9d0-564e-41d0-8171-5f32f380a3df-scripts\") pod \"octavia-worker-pktjg\" (UID: \"cfbfd9d0-564e-41d0-8171-5f32f380a3df\") " pod="openstack/octavia-worker-pktjg" Feb 19 23:04:13 crc kubenswrapper[4795]: I0219 23:04:13.864122 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/cfbfd9d0-564e-41d0-8171-5f32f380a3df-config-data-merged\") pod \"octavia-worker-pktjg\" (UID: \"cfbfd9d0-564e-41d0-8171-5f32f380a3df\") " pod="openstack/octavia-worker-pktjg" Feb 19 23:04:13 crc kubenswrapper[4795]: I0219 23:04:13.864619 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/cfbfd9d0-564e-41d0-8171-5f32f380a3df-hm-ports\") pod \"octavia-worker-pktjg\" (UID: \"cfbfd9d0-564e-41d0-8171-5f32f380a3df\") " pod="openstack/octavia-worker-pktjg" Feb 19 23:04:13 crc kubenswrapper[4795]: I0219 23:04:13.865598 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/cfbfd9d0-564e-41d0-8171-5f32f380a3df-hm-ports\") pod \"octavia-worker-pktjg\" (UID: \"cfbfd9d0-564e-41d0-8171-5f32f380a3df\") " pod="openstack/octavia-worker-pktjg" Feb 19 23:04:13 crc kubenswrapper[4795]: I0219 23:04:13.865737 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfbfd9d0-564e-41d0-8171-5f32f380a3df-combined-ca-bundle\") pod \"octavia-worker-pktjg\" (UID: \"cfbfd9d0-564e-41d0-8171-5f32f380a3df\") " pod="openstack/octavia-worker-pktjg" Feb 19 23:04:13 crc kubenswrapper[4795]: I0219 23:04:13.868943 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfbfd9d0-564e-41d0-8171-5f32f380a3df-combined-ca-bundle\") pod \"octavia-worker-pktjg\" (UID: \"cfbfd9d0-564e-41d0-8171-5f32f380a3df\") " pod="openstack/octavia-worker-pktjg" Feb 19 23:04:13 crc kubenswrapper[4795]: I0219 23:04:13.872550 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfbfd9d0-564e-41d0-8171-5f32f380a3df-config-data\") pod \"octavia-worker-pktjg\" (UID: \"cfbfd9d0-564e-41d0-8171-5f32f380a3df\") " pod="openstack/octavia-worker-pktjg" Feb 19 23:04:13 crc kubenswrapper[4795]: I0219 23:04:13.873156 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfbfd9d0-564e-41d0-8171-5f32f380a3df-scripts\") pod \"octavia-worker-pktjg\" (UID: \"cfbfd9d0-564e-41d0-8171-5f32f380a3df\") " pod="openstack/octavia-worker-pktjg" Feb 19 23:04:13 crc kubenswrapper[4795]: I0219 23:04:13.874394 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/cfbfd9d0-564e-41d0-8171-5f32f380a3df-amphora-certs\") pod \"octavia-worker-pktjg\" (UID: \"cfbfd9d0-564e-41d0-8171-5f32f380a3df\") " pod="openstack/octavia-worker-pktjg" Feb 19 23:04:13 crc kubenswrapper[4795]: I0219 23:04:13.992087 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-worker-pktjg" Feb 19 23:04:14 crc kubenswrapper[4795]: I0219 23:04:14.004643 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-g59hr" event={"ID":"107db266-c130-4312-be67-ffe75016fd44","Type":"ContainerStarted","Data":"4ae2b472afcfd0f46d4752b9c1fda3f7a0a7b5b43c040a89febfab65f4737409"} Feb 19 23:04:14 crc kubenswrapper[4795]: I0219 23:04:14.005329 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-healthmanager-g59hr" Feb 19 23:04:14 crc kubenswrapper[4795]: I0219 23:04:14.032776 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-healthmanager-g59hr" podStartSLOduration=5.032758171 podStartE2EDuration="5.032758171s" podCreationTimestamp="2026-02-19 23:04:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:04:14.02112709 +0000 UTC m=+5765.213644954" watchObservedRunningTime="2026-02-19 23:04:14.032758171 +0000 UTC m=+5765.225276035" Feb 19 23:04:14 crc kubenswrapper[4795]: I0219 23:04:14.747400 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-worker-pktjg"] Feb 19 23:04:14 crc kubenswrapper[4795]: W0219 23:04:14.762235 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcfbfd9d0_564e_41d0_8171_5f32f380a3df.slice/crio-e179c81ae7b94c11e346ae5700a062a7f4fba241804b5adacc27b71f989fb28e WatchSource:0}: Error finding container e179c81ae7b94c11e346ae5700a062a7f4fba241804b5adacc27b71f989fb28e: Status 404 returned error can't find the container with id e179c81ae7b94c11e346ae5700a062a7f4fba241804b5adacc27b71f989fb28e Feb 19 23:04:14 crc kubenswrapper[4795]: I0219 23:04:14.835146 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-g59hr"] Feb 19 23:04:15 crc 
kubenswrapper[4795]: I0219 23:04:15.016132 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-cm4g6" event={"ID":"7167d9ee-5127-43c9-957a-598d9dcfecb3","Type":"ContainerStarted","Data":"1accc9d526d6fda6c3e60fa5ba08048673c2630dfb31c567d1cbf3b6e78ba1c9"} Feb 19 23:04:15 crc kubenswrapper[4795]: I0219 23:04:15.018129 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-pktjg" event={"ID":"cfbfd9d0-564e-41d0-8171-5f32f380a3df","Type":"ContainerStarted","Data":"e179c81ae7b94c11e346ae5700a062a7f4fba241804b5adacc27b71f989fb28e"} Feb 19 23:04:16 crc kubenswrapper[4795]: I0219 23:04:16.033139 4795 generic.go:334] "Generic (PLEG): container finished" podID="7167d9ee-5127-43c9-957a-598d9dcfecb3" containerID="1accc9d526d6fda6c3e60fa5ba08048673c2630dfb31c567d1cbf3b6e78ba1c9" exitCode=0 Feb 19 23:04:16 crc kubenswrapper[4795]: I0219 23:04:16.033330 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-cm4g6" event={"ID":"7167d9ee-5127-43c9-957a-598d9dcfecb3","Type":"ContainerDied","Data":"1accc9d526d6fda6c3e60fa5ba08048673c2630dfb31c567d1cbf3b6e78ba1c9"} Feb 19 23:04:17 crc kubenswrapper[4795]: I0219 23:04:17.044774 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-housekeeping-cm4g6" Feb 19 23:04:17 crc kubenswrapper[4795]: I0219 23:04:17.071220 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-housekeeping-cm4g6" podStartSLOduration=4.623196604 podStartE2EDuration="6.071203387s" podCreationTimestamp="2026-02-19 23:04:11 +0000 UTC" firstStartedPulling="2026-02-19 23:04:12.72275659 +0000 UTC m=+5763.915274494" lastFinishedPulling="2026-02-19 23:04:14.170763413 +0000 UTC m=+5765.363281277" observedRunningTime="2026-02-19 23:04:17.067520242 +0000 UTC m=+5768.260038126" watchObservedRunningTime="2026-02-19 23:04:17.071203387 +0000 UTC m=+5768.263721251" Feb 19 23:04:18 crc 
kubenswrapper[4795]: I0219 23:04:18.054297 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-cm4g6" event={"ID":"7167d9ee-5127-43c9-957a-598d9dcfecb3","Type":"ContainerStarted","Data":"ddead98e84d71253645b8249ecc7cd9d813a41ec08a5fb7c30e015c4c879ed6e"} Feb 19 23:04:18 crc kubenswrapper[4795]: I0219 23:04:18.055767 4795 generic.go:334] "Generic (PLEG): container finished" podID="cfbfd9d0-564e-41d0-8171-5f32f380a3df" containerID="7b24faaea42b781c594aaea87b574d69779fd11ee1b15c67f94b3fb5af59435a" exitCode=0 Feb 19 23:04:18 crc kubenswrapper[4795]: I0219 23:04:18.055811 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-pktjg" event={"ID":"cfbfd9d0-564e-41d0-8171-5f32f380a3df","Type":"ContainerDied","Data":"7b24faaea42b781c594aaea87b574d69779fd11ee1b15c67f94b3fb5af59435a"} Feb 19 23:04:20 crc kubenswrapper[4795]: I0219 23:04:20.074944 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-pktjg" event={"ID":"cfbfd9d0-564e-41d0-8171-5f32f380a3df","Type":"ContainerStarted","Data":"b3ab6e660a9f4d36d1c9e3c306b6ac94d1db18c70c8b4cb1f60cbf46841f718f"} Feb 19 23:04:20 crc kubenswrapper[4795]: I0219 23:04:20.075653 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-worker-pktjg" Feb 19 23:04:20 crc kubenswrapper[4795]: I0219 23:04:20.108452 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-worker-pktjg" podStartSLOduration=5.292556709 podStartE2EDuration="7.108434319s" podCreationTimestamp="2026-02-19 23:04:13 +0000 UTC" firstStartedPulling="2026-02-19 23:04:14.765548438 +0000 UTC m=+5765.958066292" lastFinishedPulling="2026-02-19 23:04:16.581426028 +0000 UTC m=+5767.773943902" observedRunningTime="2026-02-19 23:04:20.098971481 +0000 UTC m=+5771.291489375" watchObservedRunningTime="2026-02-19 23:04:20.108434319 +0000 UTC m=+5771.300952183" Feb 19 23:04:22 crc kubenswrapper[4795]: I0219 
23:04:22.513498 4795 scope.go:117] "RemoveContainer" containerID="18170a71e3fd2c593ee17e2961f48e8d3663518910bf7cf2c79258393902484c" Feb 19 23:04:22 crc kubenswrapper[4795]: E0219 23:04:22.514354 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:04:24 crc kubenswrapper[4795]: I0219 23:04:24.883189 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-healthmanager-g59hr" Feb 19 23:04:27 crc kubenswrapper[4795]: I0219 23:04:27.093984 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-housekeeping-cm4g6" Feb 19 23:04:29 crc kubenswrapper[4795]: I0219 23:04:29.034709 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-worker-pktjg" Feb 19 23:04:37 crc kubenswrapper[4795]: I0219 23:04:37.512270 4795 scope.go:117] "RemoveContainer" containerID="18170a71e3fd2c593ee17e2961f48e8d3663518910bf7cf2c79258393902484c" Feb 19 23:04:37 crc kubenswrapper[4795]: E0219 23:04:37.513130 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.521499 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-79bb8b759c-wxdkj"] Feb 19 
23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.523567 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-79bb8b759c-wxdkj" Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.529255 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-79bb8b759c-wxdkj"] Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.531957 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.531961 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.532424 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-s4gpl" Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.532631 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.570946 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.571591 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="5ba19509-98fd-4ae4-b9ab-673c27ab8e85" containerName="glance-log" containerID="cri-o://e79326eb305f46105ac73051eebe6a8a28eedc36e23a4992a65d558674bb4f58" gracePeriod=30 Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.573181 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="5ba19509-98fd-4ae4-b9ab-673c27ab8e85" containerName="glance-httpd" containerID="cri-o://f4e052417461008a87fe32a59b836852c3e0dfbe3a7cafdcd8e66c0079a03648" gracePeriod=30 Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.638727 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/glance-default-internal-api-0"] Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.641662 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="f24f09d2-d9ef-4930-b1ce-284e1a0e61cc" containerName="glance-log" containerID="cri-o://2881a41d4e90cea66bb06cc15746e467948606aaeb4ab6378ba25eec8d520511" gracePeriod=30 Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.641683 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="f24f09d2-d9ef-4930-b1ce-284e1a0e61cc" containerName="glance-httpd" containerID="cri-o://1b16e4f15dda00c47ebd1d3a052f48e0eb759759e8c054d1aebe9ed9f2d0750b" gracePeriod=30 Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.690769 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwmwt\" (UniqueName: \"kubernetes.io/projected/4ab2c6ea-997b-4147-a0de-5e3989980973-kube-api-access-hwmwt\") pod \"horizon-79bb8b759c-wxdkj\" (UID: \"4ab2c6ea-997b-4147-a0de-5e3989980973\") " pod="openstack/horizon-79bb8b759c-wxdkj" Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.690858 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4ab2c6ea-997b-4147-a0de-5e3989980973-config-data\") pod \"horizon-79bb8b759c-wxdkj\" (UID: \"4ab2c6ea-997b-4147-a0de-5e3989980973\") " pod="openstack/horizon-79bb8b759c-wxdkj" Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.690955 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4ab2c6ea-997b-4147-a0de-5e3989980973-horizon-secret-key\") pod \"horizon-79bb8b759c-wxdkj\" (UID: \"4ab2c6ea-997b-4147-a0de-5e3989980973\") " pod="openstack/horizon-79bb8b759c-wxdkj" Feb 19 23:04:45 
crc kubenswrapper[4795]: I0219 23:04:45.691000 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4ab2c6ea-997b-4147-a0de-5e3989980973-scripts\") pod \"horizon-79bb8b759c-wxdkj\" (UID: \"4ab2c6ea-997b-4147-a0de-5e3989980973\") " pod="openstack/horizon-79bb8b759c-wxdkj" Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.691084 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ab2c6ea-997b-4147-a0de-5e3989980973-logs\") pod \"horizon-79bb8b759c-wxdkj\" (UID: \"4ab2c6ea-997b-4147-a0de-5e3989980973\") " pod="openstack/horizon-79bb8b759c-wxdkj" Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.694159 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-844c94496f-brkd8"] Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.696997 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-844c94496f-brkd8" Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.724090 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-844c94496f-brkd8"] Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.793325 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6xf7\" (UniqueName: \"kubernetes.io/projected/fb48bdd6-abf1-4115-8357-79c56555d51b-kube-api-access-p6xf7\") pod \"horizon-844c94496f-brkd8\" (UID: \"fb48bdd6-abf1-4115-8357-79c56555d51b\") " pod="openstack/horizon-844c94496f-brkd8" Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.793393 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4ab2c6ea-997b-4147-a0de-5e3989980973-scripts\") pod \"horizon-79bb8b759c-wxdkj\" (UID: \"4ab2c6ea-997b-4147-a0de-5e3989980973\") " pod="openstack/horizon-79bb8b759c-wxdkj" Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.793446 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fb48bdd6-abf1-4115-8357-79c56555d51b-scripts\") pod \"horizon-844c94496f-brkd8\" (UID: \"fb48bdd6-abf1-4115-8357-79c56555d51b\") " pod="openstack/horizon-844c94496f-brkd8" Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.793504 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb48bdd6-abf1-4115-8357-79c56555d51b-logs\") pod \"horizon-844c94496f-brkd8\" (UID: \"fb48bdd6-abf1-4115-8357-79c56555d51b\") " pod="openstack/horizon-844c94496f-brkd8" Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.793533 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/fb48bdd6-abf1-4115-8357-79c56555d51b-horizon-secret-key\") pod \"horizon-844c94496f-brkd8\" (UID: \"fb48bdd6-abf1-4115-8357-79c56555d51b\") " pod="openstack/horizon-844c94496f-brkd8" Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.793558 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ab2c6ea-997b-4147-a0de-5e3989980973-logs\") pod \"horizon-79bb8b759c-wxdkj\" (UID: \"4ab2c6ea-997b-4147-a0de-5e3989980973\") " pod="openstack/horizon-79bb8b759c-wxdkj" Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.793585 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fb48bdd6-abf1-4115-8357-79c56555d51b-config-data\") pod \"horizon-844c94496f-brkd8\" (UID: \"fb48bdd6-abf1-4115-8357-79c56555d51b\") " pod="openstack/horizon-844c94496f-brkd8" Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.793626 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwmwt\" (UniqueName: \"kubernetes.io/projected/4ab2c6ea-997b-4147-a0de-5e3989980973-kube-api-access-hwmwt\") pod \"horizon-79bb8b759c-wxdkj\" (UID: \"4ab2c6ea-997b-4147-a0de-5e3989980973\") " pod="openstack/horizon-79bb8b759c-wxdkj" Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.793698 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4ab2c6ea-997b-4147-a0de-5e3989980973-config-data\") pod \"horizon-79bb8b759c-wxdkj\" (UID: \"4ab2c6ea-997b-4147-a0de-5e3989980973\") " pod="openstack/horizon-79bb8b759c-wxdkj" Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.793773 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4ab2c6ea-997b-4147-a0de-5e3989980973-horizon-secret-key\") pod 
\"horizon-79bb8b759c-wxdkj\" (UID: \"4ab2c6ea-997b-4147-a0de-5e3989980973\") " pod="openstack/horizon-79bb8b759c-wxdkj" Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.794503 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ab2c6ea-997b-4147-a0de-5e3989980973-logs\") pod \"horizon-79bb8b759c-wxdkj\" (UID: \"4ab2c6ea-997b-4147-a0de-5e3989980973\") " pod="openstack/horizon-79bb8b759c-wxdkj" Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.795721 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4ab2c6ea-997b-4147-a0de-5e3989980973-config-data\") pod \"horizon-79bb8b759c-wxdkj\" (UID: \"4ab2c6ea-997b-4147-a0de-5e3989980973\") " pod="openstack/horizon-79bb8b759c-wxdkj" Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.796312 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4ab2c6ea-997b-4147-a0de-5e3989980973-scripts\") pod \"horizon-79bb8b759c-wxdkj\" (UID: \"4ab2c6ea-997b-4147-a0de-5e3989980973\") " pod="openstack/horizon-79bb8b759c-wxdkj" Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.802431 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4ab2c6ea-997b-4147-a0de-5e3989980973-horizon-secret-key\") pod \"horizon-79bb8b759c-wxdkj\" (UID: \"4ab2c6ea-997b-4147-a0de-5e3989980973\") " pod="openstack/horizon-79bb8b759c-wxdkj" Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.816592 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwmwt\" (UniqueName: \"kubernetes.io/projected/4ab2c6ea-997b-4147-a0de-5e3989980973-kube-api-access-hwmwt\") pod \"horizon-79bb8b759c-wxdkj\" (UID: \"4ab2c6ea-997b-4147-a0de-5e3989980973\") " pod="openstack/horizon-79bb8b759c-wxdkj" Feb 19 23:04:45 crc 
kubenswrapper[4795]: I0219 23:04:45.851640 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-79bb8b759c-wxdkj" Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.899196 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fb48bdd6-abf1-4115-8357-79c56555d51b-scripts\") pod \"horizon-844c94496f-brkd8\" (UID: \"fb48bdd6-abf1-4115-8357-79c56555d51b\") " pod="openstack/horizon-844c94496f-brkd8" Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.899787 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb48bdd6-abf1-4115-8357-79c56555d51b-logs\") pod \"horizon-844c94496f-brkd8\" (UID: \"fb48bdd6-abf1-4115-8357-79c56555d51b\") " pod="openstack/horizon-844c94496f-brkd8" Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.899811 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fb48bdd6-abf1-4115-8357-79c56555d51b-horizon-secret-key\") pod \"horizon-844c94496f-brkd8\" (UID: \"fb48bdd6-abf1-4115-8357-79c56555d51b\") " pod="openstack/horizon-844c94496f-brkd8" Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.899834 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fb48bdd6-abf1-4115-8357-79c56555d51b-config-data\") pod \"horizon-844c94496f-brkd8\" (UID: \"fb48bdd6-abf1-4115-8357-79c56555d51b\") " pod="openstack/horizon-844c94496f-brkd8" Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.899952 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6xf7\" (UniqueName: \"kubernetes.io/projected/fb48bdd6-abf1-4115-8357-79c56555d51b-kube-api-access-p6xf7\") pod \"horizon-844c94496f-brkd8\" (UID: 
\"fb48bdd6-abf1-4115-8357-79c56555d51b\") " pod="openstack/horizon-844c94496f-brkd8" Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.900095 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fb48bdd6-abf1-4115-8357-79c56555d51b-scripts\") pod \"horizon-844c94496f-brkd8\" (UID: \"fb48bdd6-abf1-4115-8357-79c56555d51b\") " pod="openstack/horizon-844c94496f-brkd8" Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.901439 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb48bdd6-abf1-4115-8357-79c56555d51b-logs\") pod \"horizon-844c94496f-brkd8\" (UID: \"fb48bdd6-abf1-4115-8357-79c56555d51b\") " pod="openstack/horizon-844c94496f-brkd8" Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.903022 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fb48bdd6-abf1-4115-8357-79c56555d51b-config-data\") pod \"horizon-844c94496f-brkd8\" (UID: \"fb48bdd6-abf1-4115-8357-79c56555d51b\") " pod="openstack/horizon-844c94496f-brkd8" Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.907772 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fb48bdd6-abf1-4115-8357-79c56555d51b-horizon-secret-key\") pod \"horizon-844c94496f-brkd8\" (UID: \"fb48bdd6-abf1-4115-8357-79c56555d51b\") " pod="openstack/horizon-844c94496f-brkd8" Feb 19 23:04:45 crc kubenswrapper[4795]: I0219 23:04:45.915643 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6xf7\" (UniqueName: \"kubernetes.io/projected/fb48bdd6-abf1-4115-8357-79c56555d51b-kube-api-access-p6xf7\") pod \"horizon-844c94496f-brkd8\" (UID: \"fb48bdd6-abf1-4115-8357-79c56555d51b\") " pod="openstack/horizon-844c94496f-brkd8" Feb 19 23:04:46 crc kubenswrapper[4795]: I0219 23:04:46.108623 4795 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-844c94496f-brkd8" Feb 19 23:04:46 crc kubenswrapper[4795]: I0219 23:04:46.337281 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-79bb8b759c-wxdkj"] Feb 19 23:04:46 crc kubenswrapper[4795]: I0219 23:04:46.352026 4795 generic.go:334] "Generic (PLEG): container finished" podID="5ba19509-98fd-4ae4-b9ab-673c27ab8e85" containerID="e79326eb305f46105ac73051eebe6a8a28eedc36e23a4992a65d558674bb4f58" exitCode=143 Feb 19 23:04:46 crc kubenswrapper[4795]: I0219 23:04:46.352101 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5ba19509-98fd-4ae4-b9ab-673c27ab8e85","Type":"ContainerDied","Data":"e79326eb305f46105ac73051eebe6a8a28eedc36e23a4992a65d558674bb4f58"} Feb 19 23:04:46 crc kubenswrapper[4795]: I0219 23:04:46.355964 4795 generic.go:334] "Generic (PLEG): container finished" podID="f24f09d2-d9ef-4930-b1ce-284e1a0e61cc" containerID="2881a41d4e90cea66bb06cc15746e467948606aaeb4ab6378ba25eec8d520511" exitCode=143 Feb 19 23:04:46 crc kubenswrapper[4795]: I0219 23:04:46.355999 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc","Type":"ContainerDied","Data":"2881a41d4e90cea66bb06cc15746e467948606aaeb4ab6378ba25eec8d520511"} Feb 19 23:04:46 crc kubenswrapper[4795]: W0219 23:04:46.393558 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4ab2c6ea_997b_4147_a0de_5e3989980973.slice/crio-79805600d19571dc97c2bb6f4d14290da224c25127f75720d4481d84a77147c0 WatchSource:0}: Error finding container 79805600d19571dc97c2bb6f4d14290da224c25127f75720d4481d84a77147c0: Status 404 returned error can't find the container with id 79805600d19571dc97c2bb6f4d14290da224c25127f75720d4481d84a77147c0 Feb 19 23:04:46 crc kubenswrapper[4795]: 
I0219 23:04:46.393752 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-548bf4c685-852ql"] Feb 19 23:04:46 crc kubenswrapper[4795]: I0219 23:04:46.395644 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-548bf4c685-852ql" Feb 19 23:04:46 crc kubenswrapper[4795]: I0219 23:04:46.423301 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-79bb8b759c-wxdkj"] Feb 19 23:04:46 crc kubenswrapper[4795]: I0219 23:04:46.434143 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-548bf4c685-852ql"] Feb 19 23:04:46 crc kubenswrapper[4795]: I0219 23:04:46.514225 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/353e54a0-06cb-4876-af76-78bcd1bb3a22-scripts\") pod \"horizon-548bf4c685-852ql\" (UID: \"353e54a0-06cb-4876-af76-78bcd1bb3a22\") " pod="openstack/horizon-548bf4c685-852ql" Feb 19 23:04:46 crc kubenswrapper[4795]: I0219 23:04:46.514295 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/353e54a0-06cb-4876-af76-78bcd1bb3a22-config-data\") pod \"horizon-548bf4c685-852ql\" (UID: \"353e54a0-06cb-4876-af76-78bcd1bb3a22\") " pod="openstack/horizon-548bf4c685-852ql" Feb 19 23:04:46 crc kubenswrapper[4795]: I0219 23:04:46.514332 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/353e54a0-06cb-4876-af76-78bcd1bb3a22-logs\") pod \"horizon-548bf4c685-852ql\" (UID: \"353e54a0-06cb-4876-af76-78bcd1bb3a22\") " pod="openstack/horizon-548bf4c685-852ql" Feb 19 23:04:46 crc kubenswrapper[4795]: I0219 23:04:46.514582 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/353e54a0-06cb-4876-af76-78bcd1bb3a22-horizon-secret-key\") pod \"horizon-548bf4c685-852ql\" (UID: \"353e54a0-06cb-4876-af76-78bcd1bb3a22\") " pod="openstack/horizon-548bf4c685-852ql" Feb 19 23:04:46 crc kubenswrapper[4795]: I0219 23:04:46.514892 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svjbq\" (UniqueName: \"kubernetes.io/projected/353e54a0-06cb-4876-af76-78bcd1bb3a22-kube-api-access-svjbq\") pod \"horizon-548bf4c685-852ql\" (UID: \"353e54a0-06cb-4876-af76-78bcd1bb3a22\") " pod="openstack/horizon-548bf4c685-852ql" Feb 19 23:04:46 crc kubenswrapper[4795]: I0219 23:04:46.617042 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svjbq\" (UniqueName: \"kubernetes.io/projected/353e54a0-06cb-4876-af76-78bcd1bb3a22-kube-api-access-svjbq\") pod \"horizon-548bf4c685-852ql\" (UID: \"353e54a0-06cb-4876-af76-78bcd1bb3a22\") " pod="openstack/horizon-548bf4c685-852ql" Feb 19 23:04:46 crc kubenswrapper[4795]: I0219 23:04:46.617137 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/353e54a0-06cb-4876-af76-78bcd1bb3a22-scripts\") pod \"horizon-548bf4c685-852ql\" (UID: \"353e54a0-06cb-4876-af76-78bcd1bb3a22\") " pod="openstack/horizon-548bf4c685-852ql" Feb 19 23:04:46 crc kubenswrapper[4795]: I0219 23:04:46.617207 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/353e54a0-06cb-4876-af76-78bcd1bb3a22-config-data\") pod \"horizon-548bf4c685-852ql\" (UID: \"353e54a0-06cb-4876-af76-78bcd1bb3a22\") " pod="openstack/horizon-548bf4c685-852ql" Feb 19 23:04:46 crc kubenswrapper[4795]: I0219 23:04:46.617263 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/353e54a0-06cb-4876-af76-78bcd1bb3a22-logs\") pod 
\"horizon-548bf4c685-852ql\" (UID: \"353e54a0-06cb-4876-af76-78bcd1bb3a22\") " pod="openstack/horizon-548bf4c685-852ql" Feb 19 23:04:46 crc kubenswrapper[4795]: I0219 23:04:46.617379 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/353e54a0-06cb-4876-af76-78bcd1bb3a22-horizon-secret-key\") pod \"horizon-548bf4c685-852ql\" (UID: \"353e54a0-06cb-4876-af76-78bcd1bb3a22\") " pod="openstack/horizon-548bf4c685-852ql" Feb 19 23:04:46 crc kubenswrapper[4795]: I0219 23:04:46.618148 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/353e54a0-06cb-4876-af76-78bcd1bb3a22-scripts\") pod \"horizon-548bf4c685-852ql\" (UID: \"353e54a0-06cb-4876-af76-78bcd1bb3a22\") " pod="openstack/horizon-548bf4c685-852ql" Feb 19 23:04:46 crc kubenswrapper[4795]: I0219 23:04:46.618393 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/353e54a0-06cb-4876-af76-78bcd1bb3a22-logs\") pod \"horizon-548bf4c685-852ql\" (UID: \"353e54a0-06cb-4876-af76-78bcd1bb3a22\") " pod="openstack/horizon-548bf4c685-852ql" Feb 19 23:04:46 crc kubenswrapper[4795]: I0219 23:04:46.618785 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/353e54a0-06cb-4876-af76-78bcd1bb3a22-config-data\") pod \"horizon-548bf4c685-852ql\" (UID: \"353e54a0-06cb-4876-af76-78bcd1bb3a22\") " pod="openstack/horizon-548bf4c685-852ql" Feb 19 23:04:46 crc kubenswrapper[4795]: I0219 23:04:46.624669 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/353e54a0-06cb-4876-af76-78bcd1bb3a22-horizon-secret-key\") pod \"horizon-548bf4c685-852ql\" (UID: \"353e54a0-06cb-4876-af76-78bcd1bb3a22\") " pod="openstack/horizon-548bf4c685-852ql" Feb 19 23:04:46 crc 
kubenswrapper[4795]: I0219 23:04:46.637527 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svjbq\" (UniqueName: \"kubernetes.io/projected/353e54a0-06cb-4876-af76-78bcd1bb3a22-kube-api-access-svjbq\") pod \"horizon-548bf4c685-852ql\" (UID: \"353e54a0-06cb-4876-af76-78bcd1bb3a22\") " pod="openstack/horizon-548bf4c685-852ql" Feb 19 23:04:46 crc kubenswrapper[4795]: I0219 23:04:46.705157 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-844c94496f-brkd8"] Feb 19 23:04:46 crc kubenswrapper[4795]: W0219 23:04:46.713908 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb48bdd6_abf1_4115_8357_79c56555d51b.slice/crio-a28eb35b4d570794611599731f49604cc11a5fca584746030e04c1f904b4cff5 WatchSource:0}: Error finding container a28eb35b4d570794611599731f49604cc11a5fca584746030e04c1f904b4cff5: Status 404 returned error can't find the container with id a28eb35b4d570794611599731f49604cc11a5fca584746030e04c1f904b4cff5 Feb 19 23:04:46 crc kubenswrapper[4795]: I0219 23:04:46.747949 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-548bf4c685-852ql" Feb 19 23:04:47 crc kubenswrapper[4795]: I0219 23:04:47.230914 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-548bf4c685-852ql"] Feb 19 23:04:47 crc kubenswrapper[4795]: W0219 23:04:47.234903 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod353e54a0_06cb_4876_af76_78bcd1bb3a22.slice/crio-582b797ed9c1b4bfe8384f717fb4dee523c0a67a75d13d2ff254fdbeb84f1617 WatchSource:0}: Error finding container 582b797ed9c1b4bfe8384f717fb4dee523c0a67a75d13d2ff254fdbeb84f1617: Status 404 returned error can't find the container with id 582b797ed9c1b4bfe8384f717fb4dee523c0a67a75d13d2ff254fdbeb84f1617 Feb 19 23:04:47 crc kubenswrapper[4795]: I0219 23:04:47.368355 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-79bb8b759c-wxdkj" event={"ID":"4ab2c6ea-997b-4147-a0de-5e3989980973","Type":"ContainerStarted","Data":"79805600d19571dc97c2bb6f4d14290da224c25127f75720d4481d84a77147c0"} Feb 19 23:04:47 crc kubenswrapper[4795]: I0219 23:04:47.369892 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-844c94496f-brkd8" event={"ID":"fb48bdd6-abf1-4115-8357-79c56555d51b","Type":"ContainerStarted","Data":"a28eb35b4d570794611599731f49604cc11a5fca584746030e04c1f904b4cff5"} Feb 19 23:04:47 crc kubenswrapper[4795]: I0219 23:04:47.371172 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-548bf4c685-852ql" event={"ID":"353e54a0-06cb-4876-af76-78bcd1bb3a22","Type":"ContainerStarted","Data":"582b797ed9c1b4bfe8384f717fb4dee523c0a67a75d13d2ff254fdbeb84f1617"} Feb 19 23:04:48 crc kubenswrapper[4795]: I0219 23:04:48.847507 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="f24f09d2-d9ef-4930-b1ce-284e1a0e61cc" containerName="glance-log" probeResult="failure" output="Get 
\"http://10.217.1.43:9292/healthcheck\": read tcp 10.217.0.2:55430->10.217.1.43:9292: read: connection reset by peer" Feb 19 23:04:48 crc kubenswrapper[4795]: I0219 23:04:48.847542 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="f24f09d2-d9ef-4930-b1ce-284e1a0e61cc" containerName="glance-httpd" probeResult="failure" output="Get \"http://10.217.1.43:9292/healthcheck\": read tcp 10.217.0.2:55422->10.217.1.43:9292: read: connection reset by peer" Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.350790 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.399019 4795 generic.go:334] "Generic (PLEG): container finished" podID="5ba19509-98fd-4ae4-b9ab-673c27ab8e85" containerID="f4e052417461008a87fe32a59b836852c3e0dfbe3a7cafdcd8e66c0079a03648" exitCode=0 Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.399068 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5ba19509-98fd-4ae4-b9ab-673c27ab8e85","Type":"ContainerDied","Data":"f4e052417461008a87fe32a59b836852c3e0dfbe3a7cafdcd8e66c0079a03648"} Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.399107 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.399131 4795 scope.go:117] "RemoveContainer" containerID="f4e052417461008a87fe32a59b836852c3e0dfbe3a7cafdcd8e66c0079a03648" Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.399119 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5ba19509-98fd-4ae4-b9ab-673c27ab8e85","Type":"ContainerDied","Data":"80a194040524fc06f644957b6fb02ec46c953aefd62778b8fadace961f88d1ac"} Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.407708 4795 generic.go:334] "Generic (PLEG): container finished" podID="f24f09d2-d9ef-4930-b1ce-284e1a0e61cc" containerID="1b16e4f15dda00c47ebd1d3a052f48e0eb759759e8c054d1aebe9ed9f2d0750b" exitCode=0 Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.407747 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc","Type":"ContainerDied","Data":"1b16e4f15dda00c47ebd1d3a052f48e0eb759759e8c054d1aebe9ed9f2d0750b"} Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.442865 4795 scope.go:117] "RemoveContainer" containerID="e79326eb305f46105ac73051eebe6a8a28eedc36e23a4992a65d558674bb4f58" Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.479057 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srnpd\" (UniqueName: \"kubernetes.io/projected/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-kube-api-access-srnpd\") pod \"5ba19509-98fd-4ae4-b9ab-673c27ab8e85\" (UID: \"5ba19509-98fd-4ae4-b9ab-673c27ab8e85\") " Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.479154 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-httpd-run\") pod \"5ba19509-98fd-4ae4-b9ab-673c27ab8e85\" (UID: 
\"5ba19509-98fd-4ae4-b9ab-673c27ab8e85\") " Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.479316 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-config-data\") pod \"5ba19509-98fd-4ae4-b9ab-673c27ab8e85\" (UID: \"5ba19509-98fd-4ae4-b9ab-673c27ab8e85\") " Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.479527 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-scripts\") pod \"5ba19509-98fd-4ae4-b9ab-673c27ab8e85\" (UID: \"5ba19509-98fd-4ae4-b9ab-673c27ab8e85\") " Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.479565 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-logs\") pod \"5ba19509-98fd-4ae4-b9ab-673c27ab8e85\" (UID: \"5ba19509-98fd-4ae4-b9ab-673c27ab8e85\") " Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.479618 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-ceph\") pod \"5ba19509-98fd-4ae4-b9ab-673c27ab8e85\" (UID: \"5ba19509-98fd-4ae4-b9ab-673c27ab8e85\") " Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.479712 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-combined-ca-bundle\") pod \"5ba19509-98fd-4ae4-b9ab-673c27ab8e85\" (UID: \"5ba19509-98fd-4ae4-b9ab-673c27ab8e85\") " Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.481599 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-httpd-run" (OuterVolumeSpecName: 
"httpd-run") pod "5ba19509-98fd-4ae4-b9ab-673c27ab8e85" (UID: "5ba19509-98fd-4ae4-b9ab-673c27ab8e85"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.481893 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-logs" (OuterVolumeSpecName: "logs") pod "5ba19509-98fd-4ae4-b9ab-673c27ab8e85" (UID: "5ba19509-98fd-4ae4-b9ab-673c27ab8e85"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.485610 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-scripts" (OuterVolumeSpecName: "scripts") pod "5ba19509-98fd-4ae4-b9ab-673c27ab8e85" (UID: "5ba19509-98fd-4ae4-b9ab-673c27ab8e85"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.485949 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-ceph" (OuterVolumeSpecName: "ceph") pod "5ba19509-98fd-4ae4-b9ab-673c27ab8e85" (UID: "5ba19509-98fd-4ae4-b9ab-673c27ab8e85"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.495561 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-kube-api-access-srnpd" (OuterVolumeSpecName: "kube-api-access-srnpd") pod "5ba19509-98fd-4ae4-b9ab-673c27ab8e85" (UID: "5ba19509-98fd-4ae4-b9ab-673c27ab8e85"). InnerVolumeSpecName "kube-api-access-srnpd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.519197 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5ba19509-98fd-4ae4-b9ab-673c27ab8e85" (UID: "5ba19509-98fd-4ae4-b9ab-673c27ab8e85"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.547753 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-config-data" (OuterVolumeSpecName: "config-data") pod "5ba19509-98fd-4ae4-b9ab-673c27ab8e85" (UID: "5ba19509-98fd-4ae4-b9ab-673c27ab8e85"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.568457 4795 scope.go:117] "RemoveContainer" containerID="f4e052417461008a87fe32a59b836852c3e0dfbe3a7cafdcd8e66c0079a03648" Feb 19 23:04:49 crc kubenswrapper[4795]: E0219 23:04:49.569223 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4e052417461008a87fe32a59b836852c3e0dfbe3a7cafdcd8e66c0079a03648\": container with ID starting with f4e052417461008a87fe32a59b836852c3e0dfbe3a7cafdcd8e66c0079a03648 not found: ID does not exist" containerID="f4e052417461008a87fe32a59b836852c3e0dfbe3a7cafdcd8e66c0079a03648" Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.569262 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4e052417461008a87fe32a59b836852c3e0dfbe3a7cafdcd8e66c0079a03648"} err="failed to get container status \"f4e052417461008a87fe32a59b836852c3e0dfbe3a7cafdcd8e66c0079a03648\": rpc error: code = NotFound desc = could not find container 
\"f4e052417461008a87fe32a59b836852c3e0dfbe3a7cafdcd8e66c0079a03648\": container with ID starting with f4e052417461008a87fe32a59b836852c3e0dfbe3a7cafdcd8e66c0079a03648 not found: ID does not exist" Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.569291 4795 scope.go:117] "RemoveContainer" containerID="e79326eb305f46105ac73051eebe6a8a28eedc36e23a4992a65d558674bb4f58" Feb 19 23:04:49 crc kubenswrapper[4795]: E0219 23:04:49.569952 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e79326eb305f46105ac73051eebe6a8a28eedc36e23a4992a65d558674bb4f58\": container with ID starting with e79326eb305f46105ac73051eebe6a8a28eedc36e23a4992a65d558674bb4f58 not found: ID does not exist" containerID="e79326eb305f46105ac73051eebe6a8a28eedc36e23a4992a65d558674bb4f58" Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.569985 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e79326eb305f46105ac73051eebe6a8a28eedc36e23a4992a65d558674bb4f58"} err="failed to get container status \"e79326eb305f46105ac73051eebe6a8a28eedc36e23a4992a65d558674bb4f58\": rpc error: code = NotFound desc = could not find container \"e79326eb305f46105ac73051eebe6a8a28eedc36e23a4992a65d558674bb4f58\": container with ID starting with e79326eb305f46105ac73051eebe6a8a28eedc36e23a4992a65d558674bb4f58 not found: ID does not exist" Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.581852 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.581880 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-logs\") on node \"crc\" DevicePath \"\"" Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.581888 
4795 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-ceph\") on node \"crc\" DevicePath \"\"" Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.581899 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.581908 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srnpd\" (UniqueName: \"kubernetes.io/projected/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-kube-api-access-srnpd\") on node \"crc\" DevicePath \"\"" Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.581916 4795 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.581925 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ba19509-98fd-4ae4-b9ab-673c27ab8e85-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.737223 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.747870 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.767467 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 23:04:49 crc kubenswrapper[4795]: E0219 23:04:49.768122 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ba19509-98fd-4ae4-b9ab-673c27ab8e85" containerName="glance-log" Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 
23:04:49.768157 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ba19509-98fd-4ae4-b9ab-673c27ab8e85" containerName="glance-log" Feb 19 23:04:49 crc kubenswrapper[4795]: E0219 23:04:49.768247 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ba19509-98fd-4ae4-b9ab-673c27ab8e85" containerName="glance-httpd" Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.768262 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ba19509-98fd-4ae4-b9ab-673c27ab8e85" containerName="glance-httpd" Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.768631 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ba19509-98fd-4ae4-b9ab-673c27ab8e85" containerName="glance-log" Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.768690 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ba19509-98fd-4ae4-b9ab-673c27ab8e85" containerName="glance-httpd" Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.770446 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.774791 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.788031 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.888685 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd4ac280-c0e4-46e3-95c8-5e051c96f32e-config-data\") pod \"glance-default-external-api-0\" (UID: \"bd4ac280-c0e4-46e3-95c8-5e051c96f32e\") " pod="openstack/glance-default-external-api-0" Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.889054 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bd4ac280-c0e4-46e3-95c8-5e051c96f32e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"bd4ac280-c0e4-46e3-95c8-5e051c96f32e\") " pod="openstack/glance-default-external-api-0" Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.889233 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd4ac280-c0e4-46e3-95c8-5e051c96f32e-scripts\") pod \"glance-default-external-api-0\" (UID: \"bd4ac280-c0e4-46e3-95c8-5e051c96f32e\") " pod="openstack/glance-default-external-api-0" Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.889265 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd4ac280-c0e4-46e3-95c8-5e051c96f32e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"bd4ac280-c0e4-46e3-95c8-5e051c96f32e\") " 
pod="openstack/glance-default-external-api-0" Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.889327 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqsqt\" (UniqueName: \"kubernetes.io/projected/bd4ac280-c0e4-46e3-95c8-5e051c96f32e-kube-api-access-bqsqt\") pod \"glance-default-external-api-0\" (UID: \"bd4ac280-c0e4-46e3-95c8-5e051c96f32e\") " pod="openstack/glance-default-external-api-0" Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.889366 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd4ac280-c0e4-46e3-95c8-5e051c96f32e-logs\") pod \"glance-default-external-api-0\" (UID: \"bd4ac280-c0e4-46e3-95c8-5e051c96f32e\") " pod="openstack/glance-default-external-api-0" Feb 19 23:04:49 crc kubenswrapper[4795]: I0219 23:04:49.889386 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/bd4ac280-c0e4-46e3-95c8-5e051c96f32e-ceph\") pod \"glance-default-external-api-0\" (UID: \"bd4ac280-c0e4-46e3-95c8-5e051c96f32e\") " pod="openstack/glance-default-external-api-0" Feb 19 23:04:50 crc kubenswrapper[4795]: I0219 23:04:50.009100 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd4ac280-c0e4-46e3-95c8-5e051c96f32e-scripts\") pod \"glance-default-external-api-0\" (UID: \"bd4ac280-c0e4-46e3-95c8-5e051c96f32e\") " pod="openstack/glance-default-external-api-0" Feb 19 23:04:50 crc kubenswrapper[4795]: I0219 23:04:50.009150 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd4ac280-c0e4-46e3-95c8-5e051c96f32e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"bd4ac280-c0e4-46e3-95c8-5e051c96f32e\") " 
pod="openstack/glance-default-external-api-0" Feb 19 23:04:50 crc kubenswrapper[4795]: I0219 23:04:50.009202 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqsqt\" (UniqueName: \"kubernetes.io/projected/bd4ac280-c0e4-46e3-95c8-5e051c96f32e-kube-api-access-bqsqt\") pod \"glance-default-external-api-0\" (UID: \"bd4ac280-c0e4-46e3-95c8-5e051c96f32e\") " pod="openstack/glance-default-external-api-0" Feb 19 23:04:50 crc kubenswrapper[4795]: I0219 23:04:50.009244 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd4ac280-c0e4-46e3-95c8-5e051c96f32e-logs\") pod \"glance-default-external-api-0\" (UID: \"bd4ac280-c0e4-46e3-95c8-5e051c96f32e\") " pod="openstack/glance-default-external-api-0" Feb 19 23:04:50 crc kubenswrapper[4795]: I0219 23:04:50.009281 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/bd4ac280-c0e4-46e3-95c8-5e051c96f32e-ceph\") pod \"glance-default-external-api-0\" (UID: \"bd4ac280-c0e4-46e3-95c8-5e051c96f32e\") " pod="openstack/glance-default-external-api-0" Feb 19 23:04:50 crc kubenswrapper[4795]: I0219 23:04:50.009326 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd4ac280-c0e4-46e3-95c8-5e051c96f32e-config-data\") pod \"glance-default-external-api-0\" (UID: \"bd4ac280-c0e4-46e3-95c8-5e051c96f32e\") " pod="openstack/glance-default-external-api-0" Feb 19 23:04:50 crc kubenswrapper[4795]: I0219 23:04:50.009348 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bd4ac280-c0e4-46e3-95c8-5e051c96f32e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"bd4ac280-c0e4-46e3-95c8-5e051c96f32e\") " pod="openstack/glance-default-external-api-0" Feb 19 23:04:50 crc kubenswrapper[4795]: 
I0219 23:04:50.016873 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bd4ac280-c0e4-46e3-95c8-5e051c96f32e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"bd4ac280-c0e4-46e3-95c8-5e051c96f32e\") " pod="openstack/glance-default-external-api-0" Feb 19 23:04:50 crc kubenswrapper[4795]: I0219 23:04:50.016911 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd4ac280-c0e4-46e3-95c8-5e051c96f32e-logs\") pod \"glance-default-external-api-0\" (UID: \"bd4ac280-c0e4-46e3-95c8-5e051c96f32e\") " pod="openstack/glance-default-external-api-0" Feb 19 23:04:50 crc kubenswrapper[4795]: I0219 23:04:50.017571 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd4ac280-c0e4-46e3-95c8-5e051c96f32e-scripts\") pod \"glance-default-external-api-0\" (UID: \"bd4ac280-c0e4-46e3-95c8-5e051c96f32e\") " pod="openstack/glance-default-external-api-0" Feb 19 23:04:50 crc kubenswrapper[4795]: I0219 23:04:50.023695 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/bd4ac280-c0e4-46e3-95c8-5e051c96f32e-ceph\") pod \"glance-default-external-api-0\" (UID: \"bd4ac280-c0e4-46e3-95c8-5e051c96f32e\") " pod="openstack/glance-default-external-api-0" Feb 19 23:04:50 crc kubenswrapper[4795]: I0219 23:04:50.035398 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd4ac280-c0e4-46e3-95c8-5e051c96f32e-config-data\") pod \"glance-default-external-api-0\" (UID: \"bd4ac280-c0e4-46e3-95c8-5e051c96f32e\") " pod="openstack/glance-default-external-api-0" Feb 19 23:04:50 crc kubenswrapper[4795]: I0219 23:04:50.036068 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqsqt\" (UniqueName: 
\"kubernetes.io/projected/bd4ac280-c0e4-46e3-95c8-5e051c96f32e-kube-api-access-bqsqt\") pod \"glance-default-external-api-0\" (UID: \"bd4ac280-c0e4-46e3-95c8-5e051c96f32e\") " pod="openstack/glance-default-external-api-0" Feb 19 23:04:50 crc kubenswrapper[4795]: I0219 23:04:50.036744 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd4ac280-c0e4-46e3-95c8-5e051c96f32e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"bd4ac280-c0e4-46e3-95c8-5e051c96f32e\") " pod="openstack/glance-default-external-api-0" Feb 19 23:04:50 crc kubenswrapper[4795]: I0219 23:04:50.121718 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 23:04:50 crc kubenswrapper[4795]: I0219 23:04:50.512367 4795 scope.go:117] "RemoveContainer" containerID="18170a71e3fd2c593ee17e2961f48e8d3663518910bf7cf2c79258393902484c" Feb 19 23:04:50 crc kubenswrapper[4795]: E0219 23:04:50.512583 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:04:51 crc kubenswrapper[4795]: I0219 23:04:51.526588 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ba19509-98fd-4ae4-b9ab-673c27ab8e85" path="/var/lib/kubelet/pods/5ba19509-98fd-4ae4-b9ab-673c27ab8e85/volumes" Feb 19 23:04:54 crc kubenswrapper[4795]: I0219 23:04:54.795905 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 23:04:54 crc kubenswrapper[4795]: I0219 23:04:54.927798 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-httpd-run\") pod \"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc\" (UID: \"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc\") " Feb 19 23:04:54 crc kubenswrapper[4795]: I0219 23:04:54.927872 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-combined-ca-bundle\") pod \"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc\" (UID: \"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc\") " Feb 19 23:04:54 crc kubenswrapper[4795]: I0219 23:04:54.927898 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-logs\") pod \"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc\" (UID: \"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc\") " Feb 19 23:04:54 crc kubenswrapper[4795]: I0219 23:04:54.927947 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljtbh\" (UniqueName: \"kubernetes.io/projected/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-kube-api-access-ljtbh\") pod \"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc\" (UID: \"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc\") " Feb 19 23:04:54 crc kubenswrapper[4795]: I0219 23:04:54.928003 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-config-data\") pod \"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc\" (UID: \"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc\") " Feb 19 23:04:54 crc kubenswrapper[4795]: I0219 23:04:54.928077 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/projected/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-ceph\") pod \"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc\" (UID: \"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc\") " Feb 19 23:04:54 crc kubenswrapper[4795]: I0219 23:04:54.928137 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-scripts\") pod \"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc\" (UID: \"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc\") " Feb 19 23:04:54 crc kubenswrapper[4795]: I0219 23:04:54.939667 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-logs" (OuterVolumeSpecName: "logs") pod "f24f09d2-d9ef-4930-b1ce-284e1a0e61cc" (UID: "f24f09d2-d9ef-4930-b1ce-284e1a0e61cc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:04:54 crc kubenswrapper[4795]: I0219 23:04:54.939906 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f24f09d2-d9ef-4930-b1ce-284e1a0e61cc" (UID: "f24f09d2-d9ef-4930-b1ce-284e1a0e61cc"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:04:54 crc kubenswrapper[4795]: I0219 23:04:54.988447 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-scripts" (OuterVolumeSpecName: "scripts") pod "f24f09d2-d9ef-4930-b1ce-284e1a0e61cc" (UID: "f24f09d2-d9ef-4930-b1ce-284e1a0e61cc"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:04:54 crc kubenswrapper[4795]: I0219 23:04:54.988638 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-ceph" (OuterVolumeSpecName: "ceph") pod "f24f09d2-d9ef-4930-b1ce-284e1a0e61cc" (UID: "f24f09d2-d9ef-4930-b1ce-284e1a0e61cc"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:04:54 crc kubenswrapper[4795]: I0219 23:04:54.994573 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-kube-api-access-ljtbh" (OuterVolumeSpecName: "kube-api-access-ljtbh") pod "f24f09d2-d9ef-4930-b1ce-284e1a0e61cc" (UID: "f24f09d2-d9ef-4930-b1ce-284e1a0e61cc"). InnerVolumeSpecName "kube-api-access-ljtbh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.031559 4795 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.031592 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-logs\") on node \"crc\" DevicePath \"\"" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.031603 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljtbh\" (UniqueName: \"kubernetes.io/projected/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-kube-api-access-ljtbh\") on node \"crc\" DevicePath \"\"" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.031614 4795 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-ceph\") on node \"crc\" DevicePath \"\"" Feb 19 23:04:55 crc 
kubenswrapper[4795]: I0219 23:04:55.031625 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.101340 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f24f09d2-d9ef-4930-b1ce-284e1a0e61cc" (UID: "f24f09d2-d9ef-4930-b1ce-284e1a0e61cc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.132789 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.165492 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-config-data" (OuterVolumeSpecName: "config-data") pod "f24f09d2-d9ef-4930-b1ce-284e1a0e61cc" (UID: "f24f09d2-d9ef-4930-b1ce-284e1a0e61cc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.235678 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.331062 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.461029 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-548bf4c685-852ql" event={"ID":"353e54a0-06cb-4876-af76-78bcd1bb3a22","Type":"ContainerStarted","Data":"1bd2543092bc2960614f9063a0b09c79b76c851d29dedd3bce8880af20588398"} Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.461451 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-548bf4c685-852ql" event={"ID":"353e54a0-06cb-4876-af76-78bcd1bb3a22","Type":"ContainerStarted","Data":"b946e5db8d3a52db0f1f9054f0099688c365289ec5bd26ac448c71a734370a40"} Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.466048 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-79bb8b759c-wxdkj" event={"ID":"4ab2c6ea-997b-4147-a0de-5e3989980973","Type":"ContainerStarted","Data":"aba2ee7ae1998f0274365ca81f59652ac2f73a1ce0c84373e76ed4aca4b68378"} Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.466115 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-79bb8b759c-wxdkj" event={"ID":"4ab2c6ea-997b-4147-a0de-5e3989980973","Type":"ContainerStarted","Data":"b76880969cbea856154bb0e3062a422d765928c77fe52c693f3423f812d36b6f"} Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.466198 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-79bb8b759c-wxdkj" podUID="4ab2c6ea-997b-4147-a0de-5e3989980973" containerName="horizon-log" 
containerID="cri-o://b76880969cbea856154bb0e3062a422d765928c77fe52c693f3423f812d36b6f" gracePeriod=30 Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.466261 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-79bb8b759c-wxdkj" podUID="4ab2c6ea-997b-4147-a0de-5e3989980973" containerName="horizon" containerID="cri-o://aba2ee7ae1998f0274365ca81f59652ac2f73a1ce0c84373e76ed4aca4b68378" gracePeriod=30 Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.469534 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-844c94496f-brkd8" event={"ID":"fb48bdd6-abf1-4115-8357-79c56555d51b","Type":"ContainerStarted","Data":"098124b166de0fbf41c5cb05a877f55202dbab6d10c56f71e5220e37b6da5149"} Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.469596 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-844c94496f-brkd8" event={"ID":"fb48bdd6-abf1-4115-8357-79c56555d51b","Type":"ContainerStarted","Data":"66e07672d0b0039c92e5a7bf2db03f18fc25ea4f8132e4dd9bc20c658abaf8bd"} Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.472179 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f24f09d2-d9ef-4930-b1ce-284e1a0e61cc","Type":"ContainerDied","Data":"5b24832724357e9fe3cd524ab47d5c8ea237d7ac8d26ead3ec336002073cfe70"} Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.472243 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.472260 4795 scope.go:117] "RemoveContainer" containerID="1b16e4f15dda00c47ebd1d3a052f48e0eb759759e8c054d1aebe9ed9f2d0750b" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.474198 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bd4ac280-c0e4-46e3-95c8-5e051c96f32e","Type":"ContainerStarted","Data":"8982d489f3ba63018dcae7db8988f20bf8ac06ffc172b54c1199a4804d4202c6"} Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.491522 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-548bf4c685-852ql" podStartSLOduration=2.005485808 podStartE2EDuration="9.491496029s" podCreationTimestamp="2026-02-19 23:04:46 +0000 UTC" firstStartedPulling="2026-02-19 23:04:47.238309064 +0000 UTC m=+5798.430826928" lastFinishedPulling="2026-02-19 23:04:54.724319245 +0000 UTC m=+5805.916837149" observedRunningTime="2026-02-19 23:04:55.485081337 +0000 UTC m=+5806.677599221" watchObservedRunningTime="2026-02-19 23:04:55.491496029 +0000 UTC m=+5806.684013893" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.516502 4795 scope.go:117] "RemoveContainer" containerID="2881a41d4e90cea66bb06cc15746e467948606aaeb4ab6378ba25eec8d520511" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.522939 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-844c94496f-brkd8" podStartSLOduration=2.519107114 podStartE2EDuration="10.522919492s" podCreationTimestamp="2026-02-19 23:04:45 +0000 UTC" firstStartedPulling="2026-02-19 23:04:46.717462131 +0000 UTC m=+5797.909979995" lastFinishedPulling="2026-02-19 23:04:54.721274499 +0000 UTC m=+5805.913792373" observedRunningTime="2026-02-19 23:04:55.509148791 +0000 UTC m=+5806.701666665" watchObservedRunningTime="2026-02-19 23:04:55.522919492 +0000 UTC m=+5806.715437356" Feb 19 
23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.536960 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-79bb8b759c-wxdkj" podStartSLOduration=2.239893859 podStartE2EDuration="10.536937351s" podCreationTimestamp="2026-02-19 23:04:45 +0000 UTC" firstStartedPulling="2026-02-19 23:04:46.40183764 +0000 UTC m=+5797.594355504" lastFinishedPulling="2026-02-19 23:04:54.698881112 +0000 UTC m=+5805.891398996" observedRunningTime="2026-02-19 23:04:55.534073199 +0000 UTC m=+5806.726591083" watchObservedRunningTime="2026-02-19 23:04:55.536937351 +0000 UTC m=+5806.729455215" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.583674 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.604460 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.614983 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 23:04:55 crc kubenswrapper[4795]: E0219 23:04:55.618139 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f24f09d2-d9ef-4930-b1ce-284e1a0e61cc" containerName="glance-httpd" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.618352 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f24f09d2-d9ef-4930-b1ce-284e1a0e61cc" containerName="glance-httpd" Feb 19 23:04:55 crc kubenswrapper[4795]: E0219 23:04:55.618462 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f24f09d2-d9ef-4930-b1ce-284e1a0e61cc" containerName="glance-log" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.618550 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f24f09d2-d9ef-4930-b1ce-284e1a0e61cc" containerName="glance-log" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.620150 4795 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="f24f09d2-d9ef-4930-b1ce-284e1a0e61cc" containerName="glance-httpd" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.620281 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f24f09d2-d9ef-4930-b1ce-284e1a0e61cc" containerName="glance-log" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.625614 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.641527 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.646895 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.749351 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7d41b06-abb7-4a30-a29c-3b9d66706d8f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b7d41b06-abb7-4a30-a29c-3b9d66706d8f\") " pod="openstack/glance-default-internal-api-0" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.749413 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7d41b06-abb7-4a30-a29c-3b9d66706d8f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b7d41b06-abb7-4a30-a29c-3b9d66706d8f\") " pod="openstack/glance-default-internal-api-0" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.749441 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/b7d41b06-abb7-4a30-a29c-3b9d66706d8f-ceph\") pod \"glance-default-internal-api-0\" (UID: \"b7d41b06-abb7-4a30-a29c-3b9d66706d8f\") " 
pod="openstack/glance-default-internal-api-0" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.749470 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7d41b06-abb7-4a30-a29c-3b9d66706d8f-logs\") pod \"glance-default-internal-api-0\" (UID: \"b7d41b06-abb7-4a30-a29c-3b9d66706d8f\") " pod="openstack/glance-default-internal-api-0" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.749536 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrdch\" (UniqueName: \"kubernetes.io/projected/b7d41b06-abb7-4a30-a29c-3b9d66706d8f-kube-api-access-wrdch\") pod \"glance-default-internal-api-0\" (UID: \"b7d41b06-abb7-4a30-a29c-3b9d66706d8f\") " pod="openstack/glance-default-internal-api-0" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.749557 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7d41b06-abb7-4a30-a29c-3b9d66706d8f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b7d41b06-abb7-4a30-a29c-3b9d66706d8f\") " pod="openstack/glance-default-internal-api-0" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.749582 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b7d41b06-abb7-4a30-a29c-3b9d66706d8f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b7d41b06-abb7-4a30-a29c-3b9d66706d8f\") " pod="openstack/glance-default-internal-api-0" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.851783 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-79bb8b759c-wxdkj" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.852967 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrdch\" 
(UniqueName: \"kubernetes.io/projected/b7d41b06-abb7-4a30-a29c-3b9d66706d8f-kube-api-access-wrdch\") pod \"glance-default-internal-api-0\" (UID: \"b7d41b06-abb7-4a30-a29c-3b9d66706d8f\") " pod="openstack/glance-default-internal-api-0" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.853114 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7d41b06-abb7-4a30-a29c-3b9d66706d8f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b7d41b06-abb7-4a30-a29c-3b9d66706d8f\") " pod="openstack/glance-default-internal-api-0" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.853253 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b7d41b06-abb7-4a30-a29c-3b9d66706d8f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b7d41b06-abb7-4a30-a29c-3b9d66706d8f\") " pod="openstack/glance-default-internal-api-0" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.853529 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7d41b06-abb7-4a30-a29c-3b9d66706d8f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b7d41b06-abb7-4a30-a29c-3b9d66706d8f\") " pod="openstack/glance-default-internal-api-0" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.853705 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7d41b06-abb7-4a30-a29c-3b9d66706d8f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b7d41b06-abb7-4a30-a29c-3b9d66706d8f\") " pod="openstack/glance-default-internal-api-0" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.853846 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/projected/b7d41b06-abb7-4a30-a29c-3b9d66706d8f-ceph\") pod \"glance-default-internal-api-0\" (UID: \"b7d41b06-abb7-4a30-a29c-3b9d66706d8f\") " pod="openstack/glance-default-internal-api-0" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.853986 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7d41b06-abb7-4a30-a29c-3b9d66706d8f-logs\") pod \"glance-default-internal-api-0\" (UID: \"b7d41b06-abb7-4a30-a29c-3b9d66706d8f\") " pod="openstack/glance-default-internal-api-0" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.854085 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b7d41b06-abb7-4a30-a29c-3b9d66706d8f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b7d41b06-abb7-4a30-a29c-3b9d66706d8f\") " pod="openstack/glance-default-internal-api-0" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.854360 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7d41b06-abb7-4a30-a29c-3b9d66706d8f-logs\") pod \"glance-default-internal-api-0\" (UID: \"b7d41b06-abb7-4a30-a29c-3b9d66706d8f\") " pod="openstack/glance-default-internal-api-0" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.860015 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7d41b06-abb7-4a30-a29c-3b9d66706d8f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b7d41b06-abb7-4a30-a29c-3b9d66706d8f\") " pod="openstack/glance-default-internal-api-0" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.862666 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7d41b06-abb7-4a30-a29c-3b9d66706d8f-scripts\") pod \"glance-default-internal-api-0\" (UID: 
\"b7d41b06-abb7-4a30-a29c-3b9d66706d8f\") " pod="openstack/glance-default-internal-api-0" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.864350 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7d41b06-abb7-4a30-a29c-3b9d66706d8f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b7d41b06-abb7-4a30-a29c-3b9d66706d8f\") " pod="openstack/glance-default-internal-api-0" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.870468 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/b7d41b06-abb7-4a30-a29c-3b9d66706d8f-ceph\") pod \"glance-default-internal-api-0\" (UID: \"b7d41b06-abb7-4a30-a29c-3b9d66706d8f\") " pod="openstack/glance-default-internal-api-0" Feb 19 23:04:55 crc kubenswrapper[4795]: I0219 23:04:55.878877 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrdch\" (UniqueName: \"kubernetes.io/projected/b7d41b06-abb7-4a30-a29c-3b9d66706d8f-kube-api-access-wrdch\") pod \"glance-default-internal-api-0\" (UID: \"b7d41b06-abb7-4a30-a29c-3b9d66706d8f\") " pod="openstack/glance-default-internal-api-0" Feb 19 23:04:56 crc kubenswrapper[4795]: I0219 23:04:56.016283 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 23:04:56 crc kubenswrapper[4795]: I0219 23:04:56.108715 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-844c94496f-brkd8" Feb 19 23:04:56 crc kubenswrapper[4795]: I0219 23:04:56.108765 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-844c94496f-brkd8" Feb 19 23:04:56 crc kubenswrapper[4795]: I0219 23:04:56.491397 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bd4ac280-c0e4-46e3-95c8-5e051c96f32e","Type":"ContainerStarted","Data":"31fabd4d7a61a9a58d9ad9aa5dc3005e8d81ec5a949d4fa86467ff0dd972f904"} Feb 19 23:04:56 crc kubenswrapper[4795]: I0219 23:04:56.628221 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 23:04:56 crc kubenswrapper[4795]: I0219 23:04:56.749956 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-548bf4c685-852ql" Feb 19 23:04:56 crc kubenswrapper[4795]: I0219 23:04:56.751926 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-548bf4c685-852ql" Feb 19 23:04:57 crc kubenswrapper[4795]: I0219 23:04:57.501972 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b7d41b06-abb7-4a30-a29c-3b9d66706d8f","Type":"ContainerStarted","Data":"d7187e932a3c630957fc932dcb0516a262cbd3e096ff9d008d16d55591e9d7a2"} Feb 19 23:04:57 crc kubenswrapper[4795]: I0219 23:04:57.502404 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b7d41b06-abb7-4a30-a29c-3b9d66706d8f","Type":"ContainerStarted","Data":"0830d7fc6b4ddd1fb3f0c43eedd5e0ef6164946ae562a47ceef1d3553930bb86"} Feb 19 23:04:57 crc kubenswrapper[4795]: I0219 23:04:57.505728 4795 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bd4ac280-c0e4-46e3-95c8-5e051c96f32e","Type":"ContainerStarted","Data":"1d9e33caaf030e67fdfb0c6e7e0023ecd6c71ed6e4784f944811f7b33d74d1c4"} Feb 19 23:04:57 crc kubenswrapper[4795]: I0219 23:04:57.528431 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f24f09d2-d9ef-4930-b1ce-284e1a0e61cc" path="/var/lib/kubelet/pods/f24f09d2-d9ef-4930-b1ce-284e1a0e61cc/volumes" Feb 19 23:04:57 crc kubenswrapper[4795]: I0219 23:04:57.536000 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=8.535981636 podStartE2EDuration="8.535981636s" podCreationTimestamp="2026-02-19 23:04:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:04:57.531624492 +0000 UTC m=+5808.724142456" watchObservedRunningTime="2026-02-19 23:04:57.535981636 +0000 UTC m=+5808.728499500" Feb 19 23:04:58 crc kubenswrapper[4795]: I0219 23:04:58.529499 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b7d41b06-abb7-4a30-a29c-3b9d66706d8f","Type":"ContainerStarted","Data":"f28f0381bb04b7582c79e7da10be4145ec5a60513881210191f53ef170db2ca4"} Feb 19 23:04:58 crc kubenswrapper[4795]: I0219 23:04:58.572191 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.572145215 podStartE2EDuration="3.572145215s" podCreationTimestamp="2026-02-19 23:04:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:04:58.550414237 +0000 UTC m=+5809.742932111" watchObservedRunningTime="2026-02-19 23:04:58.572145215 +0000 UTC m=+5809.764663099" Feb 19 23:05:00 crc kubenswrapper[4795]: I0219 23:05:00.122991 4795 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 19 23:05:00 crc kubenswrapper[4795]: I0219 23:05:00.124577 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 19 23:05:00 crc kubenswrapper[4795]: I0219 23:05:00.178725 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 19 23:05:00 crc kubenswrapper[4795]: I0219 23:05:00.184086 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 19 23:05:00 crc kubenswrapper[4795]: I0219 23:05:00.545857 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 19 23:05:00 crc kubenswrapper[4795]: I0219 23:05:00.545908 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 19 23:05:01 crc kubenswrapper[4795]: I0219 23:05:01.524415 4795 scope.go:117] "RemoveContainer" containerID="18170a71e3fd2c593ee17e2961f48e8d3663518910bf7cf2c79258393902484c" Feb 19 23:05:01 crc kubenswrapper[4795]: E0219 23:05:01.525244 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:05:03 crc kubenswrapper[4795]: I0219 23:05:03.537758 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 19 23:05:03 crc kubenswrapper[4795]: I0219 23:05:03.572001 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/glance-default-external-api-0" Feb 19 23:05:04 crc kubenswrapper[4795]: I0219 23:05:04.047173 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-v9rbk"] Feb 19 23:05:04 crc kubenswrapper[4795]: I0219 23:05:04.062649 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-b1d9-account-create-update-bfmjz"] Feb 19 23:05:04 crc kubenswrapper[4795]: I0219 23:05:04.071371 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-b1d9-account-create-update-bfmjz"] Feb 19 23:05:04 crc kubenswrapper[4795]: I0219 23:05:04.079745 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-v9rbk"] Feb 19 23:05:05 crc kubenswrapper[4795]: I0219 23:05:05.523358 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4ca6125-46fa-4dd9-8d20-3816b6c09066" path="/var/lib/kubelet/pods/a4ca6125-46fa-4dd9-8d20-3816b6c09066/volumes" Feb 19 23:05:05 crc kubenswrapper[4795]: I0219 23:05:05.525412 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c13f05e4-27de-4750-bb9d-008e3a0be0c7" path="/var/lib/kubelet/pods/c13f05e4-27de-4750-bb9d-008e3a0be0c7/volumes" Feb 19 23:05:06 crc kubenswrapper[4795]: I0219 23:05:06.017446 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 19 23:05:06 crc kubenswrapper[4795]: I0219 23:05:06.018698 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 19 23:05:06 crc kubenswrapper[4795]: I0219 23:05:06.071748 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 19 23:05:06 crc kubenswrapper[4795]: I0219 23:05:06.076391 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 19 23:05:06 crc kubenswrapper[4795]: 
I0219 23:05:06.112700 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-844c94496f-brkd8" podUID="fb48bdd6-abf1-4115-8357-79c56555d51b" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.112:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.112:8080: connect: connection refused" Feb 19 23:05:06 crc kubenswrapper[4795]: I0219 23:05:06.602828 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 19 23:05:06 crc kubenswrapper[4795]: I0219 23:05:06.602875 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 19 23:05:06 crc kubenswrapper[4795]: I0219 23:05:06.750285 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-548bf4c685-852ql" podUID="353e54a0-06cb-4876-af76-78bcd1bb3a22" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.113:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.113:8080: connect: connection refused" Feb 19 23:05:08 crc kubenswrapper[4795]: I0219 23:05:08.640850 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 19 23:05:08 crc kubenswrapper[4795]: I0219 23:05:08.641316 4795 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 19 23:05:08 crc kubenswrapper[4795]: I0219 23:05:08.732509 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 19 23:05:10 crc kubenswrapper[4795]: I0219 23:05:10.033861 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-f5h94"] Feb 19 23:05:10 crc kubenswrapper[4795]: I0219 23:05:10.045142 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-f5h94"] Feb 19 23:05:11 crc kubenswrapper[4795]: I0219 23:05:11.533231 4795 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ace73a97-1b52-4187-a035-df7a08266bab" path="/var/lib/kubelet/pods/ace73a97-1b52-4187-a035-df7a08266bab/volumes" Feb 19 23:05:16 crc kubenswrapper[4795]: I0219 23:05:16.512271 4795 scope.go:117] "RemoveContainer" containerID="18170a71e3fd2c593ee17e2961f48e8d3663518910bf7cf2c79258393902484c" Feb 19 23:05:16 crc kubenswrapper[4795]: E0219 23:05:16.512884 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:05:16 crc kubenswrapper[4795]: I0219 23:05:16.688497 4795 scope.go:117] "RemoveContainer" containerID="b9f475bcb8fe7daa110726dc68979a5f8257c170cc549dfc5c151f5b0ece628f" Feb 19 23:05:16 crc kubenswrapper[4795]: I0219 23:05:16.731456 4795 scope.go:117] "RemoveContainer" containerID="4cb474362c7411d131561c37cda83e0ffd7519207d531dd518914dff39a66f87" Feb 19 23:05:16 crc kubenswrapper[4795]: I0219 23:05:16.760583 4795 scope.go:117] "RemoveContainer" containerID="c70a8662daa881e0007310348920cbd44bed7e794700942de323e6f34a2f57fc" Feb 19 23:05:16 crc kubenswrapper[4795]: I0219 23:05:16.789439 4795 scope.go:117] "RemoveContainer" containerID="1ad516f68056dfaa3dffe45049adde7607d436761c153d38593aeeda7b4036af" Feb 19 23:05:16 crc kubenswrapper[4795]: I0219 23:05:16.871832 4795 scope.go:117] "RemoveContainer" containerID="88b5a89b19e9675d8f8f4f6be28cd30648da8baae5b54e90d50ee586416168ce" Feb 19 23:05:16 crc kubenswrapper[4795]: I0219 23:05:16.911037 4795 scope.go:117] "RemoveContainer" containerID="b1d9f8ec39116ffdbb4a902a6f66f52cc6253d77fc5e2ba0e18cacfaee7d6753" Feb 19 23:05:17 crc kubenswrapper[4795]: I0219 23:05:17.884571 4795 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-844c94496f-brkd8" Feb 19 23:05:18 crc kubenswrapper[4795]: I0219 23:05:18.576649 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-548bf4c685-852ql" Feb 19 23:05:19 crc kubenswrapper[4795]: I0219 23:05:19.494910 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-844c94496f-brkd8" Feb 19 23:05:20 crc kubenswrapper[4795]: I0219 23:05:20.253952 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-548bf4c685-852ql" Feb 19 23:05:20 crc kubenswrapper[4795]: I0219 23:05:20.356610 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-844c94496f-brkd8"] Feb 19 23:05:20 crc kubenswrapper[4795]: I0219 23:05:20.356831 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-844c94496f-brkd8" podUID="fb48bdd6-abf1-4115-8357-79c56555d51b" containerName="horizon-log" containerID="cri-o://66e07672d0b0039c92e5a7bf2db03f18fc25ea4f8132e4dd9bc20c658abaf8bd" gracePeriod=30 Feb 19 23:05:20 crc kubenswrapper[4795]: I0219 23:05:20.357350 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-844c94496f-brkd8" podUID="fb48bdd6-abf1-4115-8357-79c56555d51b" containerName="horizon" containerID="cri-o://098124b166de0fbf41c5cb05a877f55202dbab6d10c56f71e5220e37b6da5149" gracePeriod=30 Feb 19 23:05:23 crc kubenswrapper[4795]: I0219 23:05:23.765890 4795 generic.go:334] "Generic (PLEG): container finished" podID="fb48bdd6-abf1-4115-8357-79c56555d51b" containerID="098124b166de0fbf41c5cb05a877f55202dbab6d10c56f71e5220e37b6da5149" exitCode=0 Feb 19 23:05:23 crc kubenswrapper[4795]: I0219 23:05:23.766020 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-844c94496f-brkd8" 
event={"ID":"fb48bdd6-abf1-4115-8357-79c56555d51b","Type":"ContainerDied","Data":"098124b166de0fbf41c5cb05a877f55202dbab6d10c56f71e5220e37b6da5149"} Feb 19 23:05:25 crc kubenswrapper[4795]: I0219 23:05:25.787650 4795 generic.go:334] "Generic (PLEG): container finished" podID="4ab2c6ea-997b-4147-a0de-5e3989980973" containerID="aba2ee7ae1998f0274365ca81f59652ac2f73a1ce0c84373e76ed4aca4b68378" exitCode=137 Feb 19 23:05:25 crc kubenswrapper[4795]: I0219 23:05:25.788178 4795 generic.go:334] "Generic (PLEG): container finished" podID="4ab2c6ea-997b-4147-a0de-5e3989980973" containerID="b76880969cbea856154bb0e3062a422d765928c77fe52c693f3423f812d36b6f" exitCode=137 Feb 19 23:05:25 crc kubenswrapper[4795]: I0219 23:05:25.788203 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-79bb8b759c-wxdkj" event={"ID":"4ab2c6ea-997b-4147-a0de-5e3989980973","Type":"ContainerDied","Data":"aba2ee7ae1998f0274365ca81f59652ac2f73a1ce0c84373e76ed4aca4b68378"} Feb 19 23:05:25 crc kubenswrapper[4795]: I0219 23:05:25.788231 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-79bb8b759c-wxdkj" event={"ID":"4ab2c6ea-997b-4147-a0de-5e3989980973","Type":"ContainerDied","Data":"b76880969cbea856154bb0e3062a422d765928c77fe52c693f3423f812d36b6f"} Feb 19 23:05:25 crc kubenswrapper[4795]: I0219 23:05:25.889036 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-79bb8b759c-wxdkj" Feb 19 23:05:26 crc kubenswrapper[4795]: I0219 23:05:26.065674 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4ab2c6ea-997b-4147-a0de-5e3989980973-scripts\") pod \"4ab2c6ea-997b-4147-a0de-5e3989980973\" (UID: \"4ab2c6ea-997b-4147-a0de-5e3989980973\") " Feb 19 23:05:26 crc kubenswrapper[4795]: I0219 23:05:26.065741 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwmwt\" (UniqueName: \"kubernetes.io/projected/4ab2c6ea-997b-4147-a0de-5e3989980973-kube-api-access-hwmwt\") pod \"4ab2c6ea-997b-4147-a0de-5e3989980973\" (UID: \"4ab2c6ea-997b-4147-a0de-5e3989980973\") " Feb 19 23:05:26 crc kubenswrapper[4795]: I0219 23:05:26.065804 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4ab2c6ea-997b-4147-a0de-5e3989980973-horizon-secret-key\") pod \"4ab2c6ea-997b-4147-a0de-5e3989980973\" (UID: \"4ab2c6ea-997b-4147-a0de-5e3989980973\") " Feb 19 23:05:26 crc kubenswrapper[4795]: I0219 23:05:26.065883 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4ab2c6ea-997b-4147-a0de-5e3989980973-config-data\") pod \"4ab2c6ea-997b-4147-a0de-5e3989980973\" (UID: \"4ab2c6ea-997b-4147-a0de-5e3989980973\") " Feb 19 23:05:26 crc kubenswrapper[4795]: I0219 23:05:26.065902 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ab2c6ea-997b-4147-a0de-5e3989980973-logs\") pod \"4ab2c6ea-997b-4147-a0de-5e3989980973\" (UID: \"4ab2c6ea-997b-4147-a0de-5e3989980973\") " Feb 19 23:05:26 crc kubenswrapper[4795]: I0219 23:05:26.066749 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/4ab2c6ea-997b-4147-a0de-5e3989980973-logs" (OuterVolumeSpecName: "logs") pod "4ab2c6ea-997b-4147-a0de-5e3989980973" (UID: "4ab2c6ea-997b-4147-a0de-5e3989980973"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:05:26 crc kubenswrapper[4795]: I0219 23:05:26.071240 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ab2c6ea-997b-4147-a0de-5e3989980973-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "4ab2c6ea-997b-4147-a0de-5e3989980973" (UID: "4ab2c6ea-997b-4147-a0de-5e3989980973"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:05:26 crc kubenswrapper[4795]: I0219 23:05:26.071259 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ab2c6ea-997b-4147-a0de-5e3989980973-kube-api-access-hwmwt" (OuterVolumeSpecName: "kube-api-access-hwmwt") pod "4ab2c6ea-997b-4147-a0de-5e3989980973" (UID: "4ab2c6ea-997b-4147-a0de-5e3989980973"). InnerVolumeSpecName "kube-api-access-hwmwt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:05:26 crc kubenswrapper[4795]: I0219 23:05:26.090187 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ab2c6ea-997b-4147-a0de-5e3989980973-scripts" (OuterVolumeSpecName: "scripts") pod "4ab2c6ea-997b-4147-a0de-5e3989980973" (UID: "4ab2c6ea-997b-4147-a0de-5e3989980973"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:05:26 crc kubenswrapper[4795]: I0219 23:05:26.098232 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ab2c6ea-997b-4147-a0de-5e3989980973-config-data" (OuterVolumeSpecName: "config-data") pod "4ab2c6ea-997b-4147-a0de-5e3989980973" (UID: "4ab2c6ea-997b-4147-a0de-5e3989980973"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:05:26 crc kubenswrapper[4795]: I0219 23:05:26.110115 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-844c94496f-brkd8" podUID="fb48bdd6-abf1-4115-8357-79c56555d51b" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.112:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.112:8080: connect: connection refused" Feb 19 23:05:26 crc kubenswrapper[4795]: I0219 23:05:26.168560 4795 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4ab2c6ea-997b-4147-a0de-5e3989980973-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 19 23:05:26 crc kubenswrapper[4795]: I0219 23:05:26.168601 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4ab2c6ea-997b-4147-a0de-5e3989980973-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 23:05:26 crc kubenswrapper[4795]: I0219 23:05:26.168611 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ab2c6ea-997b-4147-a0de-5e3989980973-logs\") on node \"crc\" DevicePath \"\"" Feb 19 23:05:26 crc kubenswrapper[4795]: I0219 23:05:26.168672 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4ab2c6ea-997b-4147-a0de-5e3989980973-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:05:26 crc kubenswrapper[4795]: I0219 23:05:26.168686 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hwmwt\" (UniqueName: \"kubernetes.io/projected/4ab2c6ea-997b-4147-a0de-5e3989980973-kube-api-access-hwmwt\") on node \"crc\" DevicePath \"\"" Feb 19 23:05:26 crc kubenswrapper[4795]: I0219 23:05:26.797321 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-79bb8b759c-wxdkj" 
event={"ID":"4ab2c6ea-997b-4147-a0de-5e3989980973","Type":"ContainerDied","Data":"79805600d19571dc97c2bb6f4d14290da224c25127f75720d4481d84a77147c0"} Feb 19 23:05:26 crc kubenswrapper[4795]: I0219 23:05:26.797704 4795 scope.go:117] "RemoveContainer" containerID="aba2ee7ae1998f0274365ca81f59652ac2f73a1ce0c84373e76ed4aca4b68378" Feb 19 23:05:26 crc kubenswrapper[4795]: I0219 23:05:26.797389 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-79bb8b759c-wxdkj" Feb 19 23:05:26 crc kubenswrapper[4795]: I0219 23:05:26.832707 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-79bb8b759c-wxdkj"] Feb 19 23:05:26 crc kubenswrapper[4795]: I0219 23:05:26.841900 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-79bb8b759c-wxdkj"] Feb 19 23:05:26 crc kubenswrapper[4795]: I0219 23:05:26.968817 4795 scope.go:117] "RemoveContainer" containerID="b76880969cbea856154bb0e3062a422d765928c77fe52c693f3423f812d36b6f" Feb 19 23:05:27 crc kubenswrapper[4795]: I0219 23:05:27.524382 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ab2c6ea-997b-4147-a0de-5e3989980973" path="/var/lib/kubelet/pods/4ab2c6ea-997b-4147-a0de-5e3989980973/volumes" Feb 19 23:05:31 crc kubenswrapper[4795]: I0219 23:05:31.511814 4795 scope.go:117] "RemoveContainer" containerID="18170a71e3fd2c593ee17e2961f48e8d3663518910bf7cf2c79258393902484c" Feb 19 23:05:31 crc kubenswrapper[4795]: E0219 23:05:31.515026 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:05:36 crc kubenswrapper[4795]: I0219 23:05:36.109663 
4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-844c94496f-brkd8" podUID="fb48bdd6-abf1-4115-8357-79c56555d51b" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.112:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.112:8080: connect: connection refused" Feb 19 23:05:40 crc kubenswrapper[4795]: I0219 23:05:40.048922 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-f69d-account-create-update-gbq6r"] Feb 19 23:05:40 crc kubenswrapper[4795]: I0219 23:05:40.061511 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-xbwb6"] Feb 19 23:05:40 crc kubenswrapper[4795]: I0219 23:05:40.070671 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-f69d-account-create-update-gbq6r"] Feb 19 23:05:40 crc kubenswrapper[4795]: I0219 23:05:40.080036 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-xbwb6"] Feb 19 23:05:41 crc kubenswrapper[4795]: I0219 23:05:41.523927 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="635044d2-10e8-457c-b03e-9507a500c7fe" path="/var/lib/kubelet/pods/635044d2-10e8-457c-b03e-9507a500c7fe/volumes" Feb 19 23:05:41 crc kubenswrapper[4795]: I0219 23:05:41.525292 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0369c6f-517b-44b8-968a-a3408c6044d6" path="/var/lib/kubelet/pods/c0369c6f-517b-44b8-968a-a3408c6044d6/volumes" Feb 19 23:05:42 crc kubenswrapper[4795]: I0219 23:05:42.511827 4795 scope.go:117] "RemoveContainer" containerID="18170a71e3fd2c593ee17e2961f48e8d3663518910bf7cf2c79258393902484c" Feb 19 23:05:42 crc kubenswrapper[4795]: E0219 23:05:42.512481 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:05:46 crc kubenswrapper[4795]: I0219 23:05:46.109570 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-844c94496f-brkd8" podUID="fb48bdd6-abf1-4115-8357-79c56555d51b" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.112:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.112:8080: connect: connection refused" Feb 19 23:05:46 crc kubenswrapper[4795]: I0219 23:05:46.110016 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-844c94496f-brkd8" Feb 19 23:05:48 crc kubenswrapper[4795]: I0219 23:05:48.078004 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-p5rjh"] Feb 19 23:05:48 crc kubenswrapper[4795]: I0219 23:05:48.088643 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-p5rjh"] Feb 19 23:05:49 crc kubenswrapper[4795]: I0219 23:05:49.535320 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ad0e107-d857-4118-9582-5039b45f1ec8" path="/var/lib/kubelet/pods/9ad0e107-d857-4118-9582-5039b45f1ec8/volumes" Feb 19 23:05:50 crc kubenswrapper[4795]: I0219 23:05:50.853756 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-844c94496f-brkd8" Feb 19 23:05:50 crc kubenswrapper[4795]: I0219 23:05:50.977394 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fb48bdd6-abf1-4115-8357-79c56555d51b-horizon-secret-key\") pod \"fb48bdd6-abf1-4115-8357-79c56555d51b\" (UID: \"fb48bdd6-abf1-4115-8357-79c56555d51b\") " Feb 19 23:05:50 crc kubenswrapper[4795]: I0219 23:05:50.977454 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fb48bdd6-abf1-4115-8357-79c56555d51b-scripts\") pod \"fb48bdd6-abf1-4115-8357-79c56555d51b\" (UID: \"fb48bdd6-abf1-4115-8357-79c56555d51b\") " Feb 19 23:05:50 crc kubenswrapper[4795]: I0219 23:05:50.977546 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6xf7\" (UniqueName: \"kubernetes.io/projected/fb48bdd6-abf1-4115-8357-79c56555d51b-kube-api-access-p6xf7\") pod \"fb48bdd6-abf1-4115-8357-79c56555d51b\" (UID: \"fb48bdd6-abf1-4115-8357-79c56555d51b\") " Feb 19 23:05:50 crc kubenswrapper[4795]: I0219 23:05:50.977611 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fb48bdd6-abf1-4115-8357-79c56555d51b-config-data\") pod \"fb48bdd6-abf1-4115-8357-79c56555d51b\" (UID: \"fb48bdd6-abf1-4115-8357-79c56555d51b\") " Feb 19 23:05:50 crc kubenswrapper[4795]: I0219 23:05:50.977680 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb48bdd6-abf1-4115-8357-79c56555d51b-logs\") pod \"fb48bdd6-abf1-4115-8357-79c56555d51b\" (UID: \"fb48bdd6-abf1-4115-8357-79c56555d51b\") " Feb 19 23:05:50 crc kubenswrapper[4795]: I0219 23:05:50.978134 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/fb48bdd6-abf1-4115-8357-79c56555d51b-logs" (OuterVolumeSpecName: "logs") pod "fb48bdd6-abf1-4115-8357-79c56555d51b" (UID: "fb48bdd6-abf1-4115-8357-79c56555d51b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:05:50 crc kubenswrapper[4795]: I0219 23:05:50.978718 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb48bdd6-abf1-4115-8357-79c56555d51b-logs\") on node \"crc\" DevicePath \"\"" Feb 19 23:05:50 crc kubenswrapper[4795]: I0219 23:05:50.982868 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb48bdd6-abf1-4115-8357-79c56555d51b-kube-api-access-p6xf7" (OuterVolumeSpecName: "kube-api-access-p6xf7") pod "fb48bdd6-abf1-4115-8357-79c56555d51b" (UID: "fb48bdd6-abf1-4115-8357-79c56555d51b"). InnerVolumeSpecName "kube-api-access-p6xf7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:05:50 crc kubenswrapper[4795]: I0219 23:05:50.982893 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb48bdd6-abf1-4115-8357-79c56555d51b-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "fb48bdd6-abf1-4115-8357-79c56555d51b" (UID: "fb48bdd6-abf1-4115-8357-79c56555d51b"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:05:51 crc kubenswrapper[4795]: I0219 23:05:51.001749 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb48bdd6-abf1-4115-8357-79c56555d51b-config-data" (OuterVolumeSpecName: "config-data") pod "fb48bdd6-abf1-4115-8357-79c56555d51b" (UID: "fb48bdd6-abf1-4115-8357-79c56555d51b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:05:51 crc kubenswrapper[4795]: I0219 23:05:51.003904 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb48bdd6-abf1-4115-8357-79c56555d51b-scripts" (OuterVolumeSpecName: "scripts") pod "fb48bdd6-abf1-4115-8357-79c56555d51b" (UID: "fb48bdd6-abf1-4115-8357-79c56555d51b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:05:51 crc kubenswrapper[4795]: I0219 23:05:51.042692 4795 generic.go:334] "Generic (PLEG): container finished" podID="fb48bdd6-abf1-4115-8357-79c56555d51b" containerID="66e07672d0b0039c92e5a7bf2db03f18fc25ea4f8132e4dd9bc20c658abaf8bd" exitCode=137 Feb 19 23:05:51 crc kubenswrapper[4795]: I0219 23:05:51.042785 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-844c94496f-brkd8" event={"ID":"fb48bdd6-abf1-4115-8357-79c56555d51b","Type":"ContainerDied","Data":"66e07672d0b0039c92e5a7bf2db03f18fc25ea4f8132e4dd9bc20c658abaf8bd"} Feb 19 23:05:51 crc kubenswrapper[4795]: I0219 23:05:51.043058 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-844c94496f-brkd8" event={"ID":"fb48bdd6-abf1-4115-8357-79c56555d51b","Type":"ContainerDied","Data":"a28eb35b4d570794611599731f49604cc11a5fca584746030e04c1f904b4cff5"} Feb 19 23:05:51 crc kubenswrapper[4795]: I0219 23:05:51.043127 4795 scope.go:117] "RemoveContainer" containerID="098124b166de0fbf41c5cb05a877f55202dbab6d10c56f71e5220e37b6da5149" Feb 19 23:05:51 crc kubenswrapper[4795]: I0219 23:05:51.042815 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-844c94496f-brkd8" Feb 19 23:05:51 crc kubenswrapper[4795]: I0219 23:05:51.080682 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fb48bdd6-abf1-4115-8357-79c56555d51b-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 23:05:51 crc kubenswrapper[4795]: I0219 23:05:51.080994 4795 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fb48bdd6-abf1-4115-8357-79c56555d51b-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 19 23:05:51 crc kubenswrapper[4795]: I0219 23:05:51.081009 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fb48bdd6-abf1-4115-8357-79c56555d51b-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:05:51 crc kubenswrapper[4795]: I0219 23:05:51.081021 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6xf7\" (UniqueName: \"kubernetes.io/projected/fb48bdd6-abf1-4115-8357-79c56555d51b-kube-api-access-p6xf7\") on node \"crc\" DevicePath \"\"" Feb 19 23:05:51 crc kubenswrapper[4795]: I0219 23:05:51.081769 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-844c94496f-brkd8"] Feb 19 23:05:51 crc kubenswrapper[4795]: I0219 23:05:51.091305 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-844c94496f-brkd8"] Feb 19 23:05:51 crc kubenswrapper[4795]: I0219 23:05:51.235059 4795 scope.go:117] "RemoveContainer" containerID="66e07672d0b0039c92e5a7bf2db03f18fc25ea4f8132e4dd9bc20c658abaf8bd" Feb 19 23:05:51 crc kubenswrapper[4795]: I0219 23:05:51.257452 4795 scope.go:117] "RemoveContainer" containerID="098124b166de0fbf41c5cb05a877f55202dbab6d10c56f71e5220e37b6da5149" Feb 19 23:05:51 crc kubenswrapper[4795]: E0219 23:05:51.257936 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"098124b166de0fbf41c5cb05a877f55202dbab6d10c56f71e5220e37b6da5149\": container with ID starting with 098124b166de0fbf41c5cb05a877f55202dbab6d10c56f71e5220e37b6da5149 not found: ID does not exist" containerID="098124b166de0fbf41c5cb05a877f55202dbab6d10c56f71e5220e37b6da5149" Feb 19 23:05:51 crc kubenswrapper[4795]: I0219 23:05:51.257980 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"098124b166de0fbf41c5cb05a877f55202dbab6d10c56f71e5220e37b6da5149"} err="failed to get container status \"098124b166de0fbf41c5cb05a877f55202dbab6d10c56f71e5220e37b6da5149\": rpc error: code = NotFound desc = could not find container \"098124b166de0fbf41c5cb05a877f55202dbab6d10c56f71e5220e37b6da5149\": container with ID starting with 098124b166de0fbf41c5cb05a877f55202dbab6d10c56f71e5220e37b6da5149 not found: ID does not exist" Feb 19 23:05:51 crc kubenswrapper[4795]: I0219 23:05:51.258006 4795 scope.go:117] "RemoveContainer" containerID="66e07672d0b0039c92e5a7bf2db03f18fc25ea4f8132e4dd9bc20c658abaf8bd" Feb 19 23:05:51 crc kubenswrapper[4795]: E0219 23:05:51.258484 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66e07672d0b0039c92e5a7bf2db03f18fc25ea4f8132e4dd9bc20c658abaf8bd\": container with ID starting with 66e07672d0b0039c92e5a7bf2db03f18fc25ea4f8132e4dd9bc20c658abaf8bd not found: ID does not exist" containerID="66e07672d0b0039c92e5a7bf2db03f18fc25ea4f8132e4dd9bc20c658abaf8bd" Feb 19 23:05:51 crc kubenswrapper[4795]: I0219 23:05:51.258516 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66e07672d0b0039c92e5a7bf2db03f18fc25ea4f8132e4dd9bc20c658abaf8bd"} err="failed to get container status \"66e07672d0b0039c92e5a7bf2db03f18fc25ea4f8132e4dd9bc20c658abaf8bd\": rpc error: code = NotFound desc = could not find container \"66e07672d0b0039c92e5a7bf2db03f18fc25ea4f8132e4dd9bc20c658abaf8bd\": container 
with ID starting with 66e07672d0b0039c92e5a7bf2db03f18fc25ea4f8132e4dd9bc20c658abaf8bd not found: ID does not exist" Feb 19 23:05:51 crc kubenswrapper[4795]: I0219 23:05:51.528940 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb48bdd6-abf1-4115-8357-79c56555d51b" path="/var/lib/kubelet/pods/fb48bdd6-abf1-4115-8357-79c56555d51b/volumes" Feb 19 23:05:55 crc kubenswrapper[4795]: I0219 23:05:55.512219 4795 scope.go:117] "RemoveContainer" containerID="18170a71e3fd2c593ee17e2961f48e8d3663518910bf7cf2c79258393902484c" Feb 19 23:05:55 crc kubenswrapper[4795]: E0219 23:05:55.514062 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:06:03 crc kubenswrapper[4795]: I0219 23:06:03.372690 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6f75767dd9-c8js2"] Feb 19 23:06:03 crc kubenswrapper[4795]: E0219 23:06:03.373676 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ab2c6ea-997b-4147-a0de-5e3989980973" containerName="horizon" Feb 19 23:06:03 crc kubenswrapper[4795]: I0219 23:06:03.373695 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ab2c6ea-997b-4147-a0de-5e3989980973" containerName="horizon" Feb 19 23:06:03 crc kubenswrapper[4795]: E0219 23:06:03.373724 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb48bdd6-abf1-4115-8357-79c56555d51b" containerName="horizon" Feb 19 23:06:03 crc kubenswrapper[4795]: I0219 23:06:03.373732 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb48bdd6-abf1-4115-8357-79c56555d51b" containerName="horizon" Feb 19 23:06:03 crc kubenswrapper[4795]: 
E0219 23:06:03.373751 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ab2c6ea-997b-4147-a0de-5e3989980973" containerName="horizon-log" Feb 19 23:06:03 crc kubenswrapper[4795]: I0219 23:06:03.373759 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ab2c6ea-997b-4147-a0de-5e3989980973" containerName="horizon-log" Feb 19 23:06:03 crc kubenswrapper[4795]: E0219 23:06:03.373792 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb48bdd6-abf1-4115-8357-79c56555d51b" containerName="horizon-log" Feb 19 23:06:03 crc kubenswrapper[4795]: I0219 23:06:03.373799 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb48bdd6-abf1-4115-8357-79c56555d51b" containerName="horizon-log" Feb 19 23:06:03 crc kubenswrapper[4795]: I0219 23:06:03.374008 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ab2c6ea-997b-4147-a0de-5e3989980973" containerName="horizon-log" Feb 19 23:06:03 crc kubenswrapper[4795]: I0219 23:06:03.374035 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb48bdd6-abf1-4115-8357-79c56555d51b" containerName="horizon" Feb 19 23:06:03 crc kubenswrapper[4795]: I0219 23:06:03.374049 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ab2c6ea-997b-4147-a0de-5e3989980973" containerName="horizon" Feb 19 23:06:03 crc kubenswrapper[4795]: I0219 23:06:03.374069 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb48bdd6-abf1-4115-8357-79c56555d51b" containerName="horizon-log" Feb 19 23:06:03 crc kubenswrapper[4795]: I0219 23:06:03.375376 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6f75767dd9-c8js2" Feb 19 23:06:03 crc kubenswrapper[4795]: I0219 23:06:03.391548 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6f75767dd9-c8js2"] Feb 19 23:06:03 crc kubenswrapper[4795]: I0219 23:06:03.513981 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvzlz\" (UniqueName: \"kubernetes.io/projected/53ce70ba-9e61-4dbd-b858-7059c82eed67-kube-api-access-hvzlz\") pod \"horizon-6f75767dd9-c8js2\" (UID: \"53ce70ba-9e61-4dbd-b858-7059c82eed67\") " pod="openstack/horizon-6f75767dd9-c8js2" Feb 19 23:06:03 crc kubenswrapper[4795]: I0219 23:06:03.514415 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/53ce70ba-9e61-4dbd-b858-7059c82eed67-scripts\") pod \"horizon-6f75767dd9-c8js2\" (UID: \"53ce70ba-9e61-4dbd-b858-7059c82eed67\") " pod="openstack/horizon-6f75767dd9-c8js2" Feb 19 23:06:03 crc kubenswrapper[4795]: I0219 23:06:03.514455 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/53ce70ba-9e61-4dbd-b858-7059c82eed67-horizon-secret-key\") pod \"horizon-6f75767dd9-c8js2\" (UID: \"53ce70ba-9e61-4dbd-b858-7059c82eed67\") " pod="openstack/horizon-6f75767dd9-c8js2" Feb 19 23:06:03 crc kubenswrapper[4795]: I0219 23:06:03.514518 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53ce70ba-9e61-4dbd-b858-7059c82eed67-logs\") pod \"horizon-6f75767dd9-c8js2\" (UID: \"53ce70ba-9e61-4dbd-b858-7059c82eed67\") " pod="openstack/horizon-6f75767dd9-c8js2" Feb 19 23:06:03 crc kubenswrapper[4795]: I0219 23:06:03.514602 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/configmap/53ce70ba-9e61-4dbd-b858-7059c82eed67-config-data\") pod \"horizon-6f75767dd9-c8js2\" (UID: \"53ce70ba-9e61-4dbd-b858-7059c82eed67\") " pod="openstack/horizon-6f75767dd9-c8js2" Feb 19 23:06:03 crc kubenswrapper[4795]: I0219 23:06:03.616193 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/53ce70ba-9e61-4dbd-b858-7059c82eed67-scripts\") pod \"horizon-6f75767dd9-c8js2\" (UID: \"53ce70ba-9e61-4dbd-b858-7059c82eed67\") " pod="openstack/horizon-6f75767dd9-c8js2" Feb 19 23:06:03 crc kubenswrapper[4795]: I0219 23:06:03.616304 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/53ce70ba-9e61-4dbd-b858-7059c82eed67-horizon-secret-key\") pod \"horizon-6f75767dd9-c8js2\" (UID: \"53ce70ba-9e61-4dbd-b858-7059c82eed67\") " pod="openstack/horizon-6f75767dd9-c8js2" Feb 19 23:06:03 crc kubenswrapper[4795]: I0219 23:06:03.616380 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53ce70ba-9e61-4dbd-b858-7059c82eed67-logs\") pod \"horizon-6f75767dd9-c8js2\" (UID: \"53ce70ba-9e61-4dbd-b858-7059c82eed67\") " pod="openstack/horizon-6f75767dd9-c8js2" Feb 19 23:06:03 crc kubenswrapper[4795]: I0219 23:06:03.616444 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/53ce70ba-9e61-4dbd-b858-7059c82eed67-config-data\") pod \"horizon-6f75767dd9-c8js2\" (UID: \"53ce70ba-9e61-4dbd-b858-7059c82eed67\") " pod="openstack/horizon-6f75767dd9-c8js2" Feb 19 23:06:03 crc kubenswrapper[4795]: I0219 23:06:03.616545 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvzlz\" (UniqueName: \"kubernetes.io/projected/53ce70ba-9e61-4dbd-b858-7059c82eed67-kube-api-access-hvzlz\") pod 
\"horizon-6f75767dd9-c8js2\" (UID: \"53ce70ba-9e61-4dbd-b858-7059c82eed67\") " pod="openstack/horizon-6f75767dd9-c8js2" Feb 19 23:06:03 crc kubenswrapper[4795]: I0219 23:06:03.617766 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53ce70ba-9e61-4dbd-b858-7059c82eed67-logs\") pod \"horizon-6f75767dd9-c8js2\" (UID: \"53ce70ba-9e61-4dbd-b858-7059c82eed67\") " pod="openstack/horizon-6f75767dd9-c8js2" Feb 19 23:06:03 crc kubenswrapper[4795]: I0219 23:06:03.617897 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/53ce70ba-9e61-4dbd-b858-7059c82eed67-scripts\") pod \"horizon-6f75767dd9-c8js2\" (UID: \"53ce70ba-9e61-4dbd-b858-7059c82eed67\") " pod="openstack/horizon-6f75767dd9-c8js2" Feb 19 23:06:03 crc kubenswrapper[4795]: I0219 23:06:03.619434 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/53ce70ba-9e61-4dbd-b858-7059c82eed67-config-data\") pod \"horizon-6f75767dd9-c8js2\" (UID: \"53ce70ba-9e61-4dbd-b858-7059c82eed67\") " pod="openstack/horizon-6f75767dd9-c8js2" Feb 19 23:06:03 crc kubenswrapper[4795]: I0219 23:06:03.633078 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/53ce70ba-9e61-4dbd-b858-7059c82eed67-horizon-secret-key\") pod \"horizon-6f75767dd9-c8js2\" (UID: \"53ce70ba-9e61-4dbd-b858-7059c82eed67\") " pod="openstack/horizon-6f75767dd9-c8js2" Feb 19 23:06:03 crc kubenswrapper[4795]: I0219 23:06:03.633237 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvzlz\" (UniqueName: \"kubernetes.io/projected/53ce70ba-9e61-4dbd-b858-7059c82eed67-kube-api-access-hvzlz\") pod \"horizon-6f75767dd9-c8js2\" (UID: \"53ce70ba-9e61-4dbd-b858-7059c82eed67\") " pod="openstack/horizon-6f75767dd9-c8js2" Feb 19 23:06:03 crc 
kubenswrapper[4795]: I0219 23:06:03.700702 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6f75767dd9-c8js2" Feb 19 23:06:04 crc kubenswrapper[4795]: I0219 23:06:04.162506 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6f75767dd9-c8js2"] Feb 19 23:06:04 crc kubenswrapper[4795]: I0219 23:06:04.179885 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6f75767dd9-c8js2" event={"ID":"53ce70ba-9e61-4dbd-b858-7059c82eed67","Type":"ContainerStarted","Data":"2ea9022ede0c3e50baa39c2347e5ce3c1030986712d1ff78b282a6219767af88"} Feb 19 23:06:05 crc kubenswrapper[4795]: I0219 23:06:05.005101 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-pvsmv"] Feb 19 23:06:05 crc kubenswrapper[4795]: I0219 23:06:05.007393 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-pvsmv" Feb 19 23:06:05 crc kubenswrapper[4795]: I0219 23:06:05.018320 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-pvsmv"] Feb 19 23:06:05 crc kubenswrapper[4795]: I0219 23:06:05.109938 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-4351-account-create-update-7cp54"] Feb 19 23:06:05 crc kubenswrapper[4795]: I0219 23:06:05.111715 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-4351-account-create-update-7cp54" Feb 19 23:06:05 crc kubenswrapper[4795]: I0219 23:06:05.114455 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret" Feb 19 23:06:05 crc kubenswrapper[4795]: I0219 23:06:05.119584 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-4351-account-create-update-7cp54"] Feb 19 23:06:05 crc kubenswrapper[4795]: I0219 23:06:05.149521 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ce3f6fb-8688-4e53-8d30-e6c7edbf5636-operator-scripts\") pod \"heat-db-create-pvsmv\" (UID: \"3ce3f6fb-8688-4e53-8d30-e6c7edbf5636\") " pod="openstack/heat-db-create-pvsmv" Feb 19 23:06:05 crc kubenswrapper[4795]: I0219 23:06:05.149616 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5znf\" (UniqueName: \"kubernetes.io/projected/3ce3f6fb-8688-4e53-8d30-e6c7edbf5636-kube-api-access-w5znf\") pod \"heat-db-create-pvsmv\" (UID: \"3ce3f6fb-8688-4e53-8d30-e6c7edbf5636\") " pod="openstack/heat-db-create-pvsmv" Feb 19 23:06:05 crc kubenswrapper[4795]: I0219 23:06:05.196483 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6f75767dd9-c8js2" event={"ID":"53ce70ba-9e61-4dbd-b858-7059c82eed67","Type":"ContainerStarted","Data":"1f44a51809ccf2a01357de51840ced27533940ee96c83e26d97d8fba5a6f387b"} Feb 19 23:06:05 crc kubenswrapper[4795]: I0219 23:06:05.196527 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6f75767dd9-c8js2" event={"ID":"53ce70ba-9e61-4dbd-b858-7059c82eed67","Type":"ContainerStarted","Data":"dd4dcaf29f9033593907f9d3d6a4d67d87ce9f60ff1b51bf67c29e49681ba566"} Feb 19 23:06:05 crc kubenswrapper[4795]: I0219 23:06:05.218762 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/horizon-6f75767dd9-c8js2" podStartSLOduration=2.2187418819999998 podStartE2EDuration="2.218741882s" podCreationTimestamp="2026-02-19 23:06:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:06:05.210888393 +0000 UTC m=+5876.403406277" watchObservedRunningTime="2026-02-19 23:06:05.218741882 +0000 UTC m=+5876.411259746" Feb 19 23:06:05 crc kubenswrapper[4795]: I0219 23:06:05.251532 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dm24c\" (UniqueName: \"kubernetes.io/projected/c7049350-2c57-49c2-aef7-b9f0bd28abfc-kube-api-access-dm24c\") pod \"heat-4351-account-create-update-7cp54\" (UID: \"c7049350-2c57-49c2-aef7-b9f0bd28abfc\") " pod="openstack/heat-4351-account-create-update-7cp54" Feb 19 23:06:05 crc kubenswrapper[4795]: I0219 23:06:05.251592 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5znf\" (UniqueName: \"kubernetes.io/projected/3ce3f6fb-8688-4e53-8d30-e6c7edbf5636-kube-api-access-w5znf\") pod \"heat-db-create-pvsmv\" (UID: \"3ce3f6fb-8688-4e53-8d30-e6c7edbf5636\") " pod="openstack/heat-db-create-pvsmv" Feb 19 23:06:05 crc kubenswrapper[4795]: I0219 23:06:05.251677 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7049350-2c57-49c2-aef7-b9f0bd28abfc-operator-scripts\") pod \"heat-4351-account-create-update-7cp54\" (UID: \"c7049350-2c57-49c2-aef7-b9f0bd28abfc\") " pod="openstack/heat-4351-account-create-update-7cp54" Feb 19 23:06:05 crc kubenswrapper[4795]: I0219 23:06:05.251937 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ce3f6fb-8688-4e53-8d30-e6c7edbf5636-operator-scripts\") pod \"heat-db-create-pvsmv\" (UID: 
\"3ce3f6fb-8688-4e53-8d30-e6c7edbf5636\") " pod="openstack/heat-db-create-pvsmv" Feb 19 23:06:05 crc kubenswrapper[4795]: I0219 23:06:05.252691 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ce3f6fb-8688-4e53-8d30-e6c7edbf5636-operator-scripts\") pod \"heat-db-create-pvsmv\" (UID: \"3ce3f6fb-8688-4e53-8d30-e6c7edbf5636\") " pod="openstack/heat-db-create-pvsmv" Feb 19 23:06:05 crc kubenswrapper[4795]: I0219 23:06:05.275685 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5znf\" (UniqueName: \"kubernetes.io/projected/3ce3f6fb-8688-4e53-8d30-e6c7edbf5636-kube-api-access-w5znf\") pod \"heat-db-create-pvsmv\" (UID: \"3ce3f6fb-8688-4e53-8d30-e6c7edbf5636\") " pod="openstack/heat-db-create-pvsmv" Feb 19 23:06:05 crc kubenswrapper[4795]: I0219 23:06:05.328147 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-pvsmv" Feb 19 23:06:05 crc kubenswrapper[4795]: I0219 23:06:05.353559 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dm24c\" (UniqueName: \"kubernetes.io/projected/c7049350-2c57-49c2-aef7-b9f0bd28abfc-kube-api-access-dm24c\") pod \"heat-4351-account-create-update-7cp54\" (UID: \"c7049350-2c57-49c2-aef7-b9f0bd28abfc\") " pod="openstack/heat-4351-account-create-update-7cp54" Feb 19 23:06:05 crc kubenswrapper[4795]: I0219 23:06:05.353631 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7049350-2c57-49c2-aef7-b9f0bd28abfc-operator-scripts\") pod \"heat-4351-account-create-update-7cp54\" (UID: \"c7049350-2c57-49c2-aef7-b9f0bd28abfc\") " pod="openstack/heat-4351-account-create-update-7cp54" Feb 19 23:06:05 crc kubenswrapper[4795]: I0219 23:06:05.354621 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/c7049350-2c57-49c2-aef7-b9f0bd28abfc-operator-scripts\") pod \"heat-4351-account-create-update-7cp54\" (UID: \"c7049350-2c57-49c2-aef7-b9f0bd28abfc\") " pod="openstack/heat-4351-account-create-update-7cp54" Feb 19 23:06:05 crc kubenswrapper[4795]: I0219 23:06:05.379721 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dm24c\" (UniqueName: \"kubernetes.io/projected/c7049350-2c57-49c2-aef7-b9f0bd28abfc-kube-api-access-dm24c\") pod \"heat-4351-account-create-update-7cp54\" (UID: \"c7049350-2c57-49c2-aef7-b9f0bd28abfc\") " pod="openstack/heat-4351-account-create-update-7cp54" Feb 19 23:06:05 crc kubenswrapper[4795]: I0219 23:06:05.428671 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-4351-account-create-update-7cp54" Feb 19 23:06:06 crc kubenswrapper[4795]: I0219 23:06:06.385845 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-pvsmv"] Feb 19 23:06:06 crc kubenswrapper[4795]: W0219 23:06:06.446599 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc7049350_2c57_49c2_aef7_b9f0bd28abfc.slice/crio-f7fd4e88e17a64a3d63d66068c160f61502e2f8de22f456fb4f0e646af6f9a6f WatchSource:0}: Error finding container f7fd4e88e17a64a3d63d66068c160f61502e2f8de22f456fb4f0e646af6f9a6f: Status 404 returned error can't find the container with id f7fd4e88e17a64a3d63d66068c160f61502e2f8de22f456fb4f0e646af6f9a6f Feb 19 23:06:06 crc kubenswrapper[4795]: I0219 23:06:06.447665 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-4351-account-create-update-7cp54"] Feb 19 23:06:07 crc kubenswrapper[4795]: I0219 23:06:07.220516 4795 generic.go:334] "Generic (PLEG): container finished" podID="3ce3f6fb-8688-4e53-8d30-e6c7edbf5636" containerID="394778543b7d586f5d57eb0c386c1afcba774c0cbc8858b3c036d9d786189525" exitCode=0 Feb 19 23:06:07 crc 
kubenswrapper[4795]: I0219 23:06:07.220556 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-pvsmv" event={"ID":"3ce3f6fb-8688-4e53-8d30-e6c7edbf5636","Type":"ContainerDied","Data":"394778543b7d586f5d57eb0c386c1afcba774c0cbc8858b3c036d9d786189525"} Feb 19 23:06:07 crc kubenswrapper[4795]: I0219 23:06:07.220944 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-pvsmv" event={"ID":"3ce3f6fb-8688-4e53-8d30-e6c7edbf5636","Type":"ContainerStarted","Data":"83fe7779207b6996d4e3030b5609b7839791250e025c39eae68cc633d632e6ed"} Feb 19 23:06:07 crc kubenswrapper[4795]: I0219 23:06:07.223348 4795 generic.go:334] "Generic (PLEG): container finished" podID="c7049350-2c57-49c2-aef7-b9f0bd28abfc" containerID="be3b7d10ee8dbba79631201a8d5da4057d99e4383e860152ba77db4992ff52fc" exitCode=0 Feb 19 23:06:07 crc kubenswrapper[4795]: I0219 23:06:07.223417 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-4351-account-create-update-7cp54" event={"ID":"c7049350-2c57-49c2-aef7-b9f0bd28abfc","Type":"ContainerDied","Data":"be3b7d10ee8dbba79631201a8d5da4057d99e4383e860152ba77db4992ff52fc"} Feb 19 23:06:07 crc kubenswrapper[4795]: I0219 23:06:07.223443 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-4351-account-create-update-7cp54" event={"ID":"c7049350-2c57-49c2-aef7-b9f0bd28abfc","Type":"ContainerStarted","Data":"f7fd4e88e17a64a3d63d66068c160f61502e2f8de22f456fb4f0e646af6f9a6f"} Feb 19 23:06:08 crc kubenswrapper[4795]: I0219 23:06:08.512342 4795 scope.go:117] "RemoveContainer" containerID="18170a71e3fd2c593ee17e2961f48e8d3663518910bf7cf2c79258393902484c" Feb 19 23:06:08 crc kubenswrapper[4795]: E0219 23:06:08.513070 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:06:08 crc kubenswrapper[4795]: I0219 23:06:08.671517 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-pvsmv" Feb 19 23:06:08 crc kubenswrapper[4795]: I0219 23:06:08.678060 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-4351-account-create-update-7cp54" Feb 19 23:06:08 crc kubenswrapper[4795]: I0219 23:06:08.718046 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ce3f6fb-8688-4e53-8d30-e6c7edbf5636-operator-scripts\") pod \"3ce3f6fb-8688-4e53-8d30-e6c7edbf5636\" (UID: \"3ce3f6fb-8688-4e53-8d30-e6c7edbf5636\") " Feb 19 23:06:08 crc kubenswrapper[4795]: I0219 23:06:08.718428 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5znf\" (UniqueName: \"kubernetes.io/projected/3ce3f6fb-8688-4e53-8d30-e6c7edbf5636-kube-api-access-w5znf\") pod \"3ce3f6fb-8688-4e53-8d30-e6c7edbf5636\" (UID: \"3ce3f6fb-8688-4e53-8d30-e6c7edbf5636\") " Feb 19 23:06:08 crc kubenswrapper[4795]: I0219 23:06:08.718859 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ce3f6fb-8688-4e53-8d30-e6c7edbf5636-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3ce3f6fb-8688-4e53-8d30-e6c7edbf5636" (UID: "3ce3f6fb-8688-4e53-8d30-e6c7edbf5636"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 23:06:08 crc kubenswrapper[4795]: I0219 23:06:08.719020 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ce3f6fb-8688-4e53-8d30-e6c7edbf5636-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 23:06:08 crc kubenswrapper[4795]: I0219 23:06:08.723500 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ce3f6fb-8688-4e53-8d30-e6c7edbf5636-kube-api-access-w5znf" (OuterVolumeSpecName: "kube-api-access-w5znf") pod "3ce3f6fb-8688-4e53-8d30-e6c7edbf5636" (UID: "3ce3f6fb-8688-4e53-8d30-e6c7edbf5636"). InnerVolumeSpecName "kube-api-access-w5znf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 23:06:08 crc kubenswrapper[4795]: I0219 23:06:08.820550 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dm24c\" (UniqueName: \"kubernetes.io/projected/c7049350-2c57-49c2-aef7-b9f0bd28abfc-kube-api-access-dm24c\") pod \"c7049350-2c57-49c2-aef7-b9f0bd28abfc\" (UID: \"c7049350-2c57-49c2-aef7-b9f0bd28abfc\") "
Feb 19 23:06:08 crc kubenswrapper[4795]: I0219 23:06:08.821388 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7049350-2c57-49c2-aef7-b9f0bd28abfc-operator-scripts\") pod \"c7049350-2c57-49c2-aef7-b9f0bd28abfc\" (UID: \"c7049350-2c57-49c2-aef7-b9f0bd28abfc\") "
Feb 19 23:06:08 crc kubenswrapper[4795]: I0219 23:06:08.821857 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7049350-2c57-49c2-aef7-b9f0bd28abfc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c7049350-2c57-49c2-aef7-b9f0bd28abfc" (UID: "c7049350-2c57-49c2-aef7-b9f0bd28abfc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 23:06:08 crc kubenswrapper[4795]: I0219 23:06:08.822772 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5znf\" (UniqueName: \"kubernetes.io/projected/3ce3f6fb-8688-4e53-8d30-e6c7edbf5636-kube-api-access-w5znf\") on node \"crc\" DevicePath \"\""
Feb 19 23:06:08 crc kubenswrapper[4795]: I0219 23:06:08.822962 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7049350-2c57-49c2-aef7-b9f0bd28abfc-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 23:06:08 crc kubenswrapper[4795]: I0219 23:06:08.823846 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7049350-2c57-49c2-aef7-b9f0bd28abfc-kube-api-access-dm24c" (OuterVolumeSpecName: "kube-api-access-dm24c") pod "c7049350-2c57-49c2-aef7-b9f0bd28abfc" (UID: "c7049350-2c57-49c2-aef7-b9f0bd28abfc"). InnerVolumeSpecName "kube-api-access-dm24c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 23:06:08 crc kubenswrapper[4795]: I0219 23:06:08.925944 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dm24c\" (UniqueName: \"kubernetes.io/projected/c7049350-2c57-49c2-aef7-b9f0bd28abfc-kube-api-access-dm24c\") on node \"crc\" DevicePath \"\""
Feb 19 23:06:09 crc kubenswrapper[4795]: I0219 23:06:09.247430 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-4351-account-create-update-7cp54" event={"ID":"c7049350-2c57-49c2-aef7-b9f0bd28abfc","Type":"ContainerDied","Data":"f7fd4e88e17a64a3d63d66068c160f61502e2f8de22f456fb4f0e646af6f9a6f"}
Feb 19 23:06:09 crc kubenswrapper[4795]: I0219 23:06:09.247483 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7fd4e88e17a64a3d63d66068c160f61502e2f8de22f456fb4f0e646af6f9a6f"
Feb 19 23:06:09 crc kubenswrapper[4795]: I0219 23:06:09.247559 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-4351-account-create-update-7cp54"
Feb 19 23:06:09 crc kubenswrapper[4795]: I0219 23:06:09.252957 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-pvsmv" event={"ID":"3ce3f6fb-8688-4e53-8d30-e6c7edbf5636","Type":"ContainerDied","Data":"83fe7779207b6996d4e3030b5609b7839791250e025c39eae68cc633d632e6ed"}
Feb 19 23:06:09 crc kubenswrapper[4795]: I0219 23:06:09.252998 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83fe7779207b6996d4e3030b5609b7839791250e025c39eae68cc633d632e6ed"
Feb 19 23:06:09 crc kubenswrapper[4795]: I0219 23:06:09.253048 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-pvsmv"
Feb 19 23:06:10 crc kubenswrapper[4795]: I0219 23:06:10.326743 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-wlhqm"]
Feb 19 23:06:10 crc kubenswrapper[4795]: E0219 23:06:10.327438 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ce3f6fb-8688-4e53-8d30-e6c7edbf5636" containerName="mariadb-database-create"
Feb 19 23:06:10 crc kubenswrapper[4795]: I0219 23:06:10.327449 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ce3f6fb-8688-4e53-8d30-e6c7edbf5636" containerName="mariadb-database-create"
Feb 19 23:06:10 crc kubenswrapper[4795]: E0219 23:06:10.327463 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7049350-2c57-49c2-aef7-b9f0bd28abfc" containerName="mariadb-account-create-update"
Feb 19 23:06:10 crc kubenswrapper[4795]: I0219 23:06:10.327468 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7049350-2c57-49c2-aef7-b9f0bd28abfc" containerName="mariadb-account-create-update"
Feb 19 23:06:10 crc kubenswrapper[4795]: I0219 23:06:10.327643 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7049350-2c57-49c2-aef7-b9f0bd28abfc" containerName="mariadb-account-create-update"
Feb 19 23:06:10 crc kubenswrapper[4795]: I0219 23:06:10.327657 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ce3f6fb-8688-4e53-8d30-e6c7edbf5636" containerName="mariadb-database-create"
Feb 19 23:06:10 crc kubenswrapper[4795]: I0219 23:06:10.329630 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-wlhqm"
Feb 19 23:06:10 crc kubenswrapper[4795]: I0219 23:06:10.335705 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-67jp9"
Feb 19 23:06:10 crc kubenswrapper[4795]: I0219 23:06:10.335731 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data"
Feb 19 23:06:10 crc kubenswrapper[4795]: I0219 23:06:10.347884 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-wlhqm"]
Feb 19 23:06:10 crc kubenswrapper[4795]: I0219 23:06:10.461532 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nf4f6\" (UniqueName: \"kubernetes.io/projected/497c4c82-13ae-430c-83bd-1f1c4d4683e4-kube-api-access-nf4f6\") pod \"heat-db-sync-wlhqm\" (UID: \"497c4c82-13ae-430c-83bd-1f1c4d4683e4\") " pod="openstack/heat-db-sync-wlhqm"
Feb 19 23:06:10 crc kubenswrapper[4795]: I0219 23:06:10.461788 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/497c4c82-13ae-430c-83bd-1f1c4d4683e4-combined-ca-bundle\") pod \"heat-db-sync-wlhqm\" (UID: \"497c4c82-13ae-430c-83bd-1f1c4d4683e4\") " pod="openstack/heat-db-sync-wlhqm"
Feb 19 23:06:10 crc kubenswrapper[4795]: I0219 23:06:10.461833 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/497c4c82-13ae-430c-83bd-1f1c4d4683e4-config-data\") pod \"heat-db-sync-wlhqm\" (UID: \"497c4c82-13ae-430c-83bd-1f1c4d4683e4\") " pod="openstack/heat-db-sync-wlhqm"
Feb 19 23:06:10 crc kubenswrapper[4795]: I0219 23:06:10.563923 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nf4f6\" (UniqueName: \"kubernetes.io/projected/497c4c82-13ae-430c-83bd-1f1c4d4683e4-kube-api-access-nf4f6\") pod \"heat-db-sync-wlhqm\" (UID: \"497c4c82-13ae-430c-83bd-1f1c4d4683e4\") " pod="openstack/heat-db-sync-wlhqm"
Feb 19 23:06:10 crc kubenswrapper[4795]: I0219 23:06:10.564053 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/497c4c82-13ae-430c-83bd-1f1c4d4683e4-combined-ca-bundle\") pod \"heat-db-sync-wlhqm\" (UID: \"497c4c82-13ae-430c-83bd-1f1c4d4683e4\") " pod="openstack/heat-db-sync-wlhqm"
Feb 19 23:06:10 crc kubenswrapper[4795]: I0219 23:06:10.564076 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/497c4c82-13ae-430c-83bd-1f1c4d4683e4-config-data\") pod \"heat-db-sync-wlhqm\" (UID: \"497c4c82-13ae-430c-83bd-1f1c4d4683e4\") " pod="openstack/heat-db-sync-wlhqm"
Feb 19 23:06:10 crc kubenswrapper[4795]: I0219 23:06:10.568994 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/497c4c82-13ae-430c-83bd-1f1c4d4683e4-combined-ca-bundle\") pod \"heat-db-sync-wlhqm\" (UID: \"497c4c82-13ae-430c-83bd-1f1c4d4683e4\") " pod="openstack/heat-db-sync-wlhqm"
Feb 19 23:06:10 crc kubenswrapper[4795]: I0219 23:06:10.581239 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/497c4c82-13ae-430c-83bd-1f1c4d4683e4-config-data\") pod \"heat-db-sync-wlhqm\" (UID: \"497c4c82-13ae-430c-83bd-1f1c4d4683e4\") " pod="openstack/heat-db-sync-wlhqm"
Feb 19 23:06:10 crc kubenswrapper[4795]: I0219 23:06:10.581360 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nf4f6\" (UniqueName: \"kubernetes.io/projected/497c4c82-13ae-430c-83bd-1f1c4d4683e4-kube-api-access-nf4f6\") pod \"heat-db-sync-wlhqm\" (UID: \"497c4c82-13ae-430c-83bd-1f1c4d4683e4\") " pod="openstack/heat-db-sync-wlhqm"
Feb 19 23:06:10 crc kubenswrapper[4795]: I0219 23:06:10.664791 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-wlhqm"
Feb 19 23:06:11 crc kubenswrapper[4795]: I0219 23:06:11.124614 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-wlhqm"]
Feb 19 23:06:11 crc kubenswrapper[4795]: W0219 23:06:11.125123 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod497c4c82_13ae_430c_83bd_1f1c4d4683e4.slice/crio-1f14b1d9f8ec956e2f15e91e3acd1feb1bf1a9e896176fbddc75b273110a404a WatchSource:0}: Error finding container 1f14b1d9f8ec956e2f15e91e3acd1feb1bf1a9e896176fbddc75b273110a404a: Status 404 returned error can't find the container with id 1f14b1d9f8ec956e2f15e91e3acd1feb1bf1a9e896176fbddc75b273110a404a
Feb 19 23:06:11 crc kubenswrapper[4795]: I0219 23:06:11.285853 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-wlhqm" event={"ID":"497c4c82-13ae-430c-83bd-1f1c4d4683e4","Type":"ContainerStarted","Data":"1f14b1d9f8ec956e2f15e91e3acd1feb1bf1a9e896176fbddc75b273110a404a"}
Feb 19 23:06:13 crc kubenswrapper[4795]: I0219 23:06:13.701661 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6f75767dd9-c8js2"
Feb 19 23:06:13 crc kubenswrapper[4795]: I0219 23:06:13.702002 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-6f75767dd9-c8js2"
Feb 19 23:06:17 crc kubenswrapper[4795]: I0219 23:06:17.100277 4795 scope.go:117] "RemoveContainer" containerID="247acf34bea1d6df95accdf51f9a624b45c32153a0b9fa97bc69d2358a9601e8"
Feb 19 23:06:17 crc kubenswrapper[4795]: I0219 23:06:17.120736 4795 scope.go:117] "RemoveContainer" containerID="82b2542be6e058170f19505c42c4114714e88b52257f1c0e99664dda5eb05f78"
Feb 19 23:06:17 crc kubenswrapper[4795]: I0219 23:06:17.169786 4795 scope.go:117] "RemoveContainer" containerID="a01a032519bdc879e7f051388c3f7e0f8289504340bef813d7d1864b844d8771"
Feb 19 23:06:17 crc kubenswrapper[4795]: I0219 23:06:17.357568 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-wlhqm" event={"ID":"497c4c82-13ae-430c-83bd-1f1c4d4683e4","Type":"ContainerStarted","Data":"26cce21cbd7189a101a6a725aa7d2a769b17bb5f8957270015deb5068ba381a3"}
Feb 19 23:06:17 crc kubenswrapper[4795]: I0219 23:06:17.386031 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-wlhqm" podStartSLOduration=1.585032536 podStartE2EDuration="7.386006896s" podCreationTimestamp="2026-02-19 23:06:10 +0000 UTC" firstStartedPulling="2026-02-19 23:06:11.128320828 +0000 UTC m=+5882.320838692" lastFinishedPulling="2026-02-19 23:06:16.929295178 +0000 UTC m=+5888.121813052" observedRunningTime="2026-02-19 23:06:17.373366509 +0000 UTC m=+5888.565884384" watchObservedRunningTime="2026-02-19 23:06:17.386006896 +0000 UTC m=+5888.578524780"
Feb 19 23:06:20 crc kubenswrapper[4795]: I0219 23:06:20.387454 4795 generic.go:334] "Generic (PLEG): container finished" podID="497c4c82-13ae-430c-83bd-1f1c4d4683e4" containerID="26cce21cbd7189a101a6a725aa7d2a769b17bb5f8957270015deb5068ba381a3" exitCode=0
Feb 19 23:06:20 crc kubenswrapper[4795]: I0219 23:06:20.387588 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-wlhqm" event={"ID":"497c4c82-13ae-430c-83bd-1f1c4d4683e4","Type":"ContainerDied","Data":"26cce21cbd7189a101a6a725aa7d2a769b17bb5f8957270015deb5068ba381a3"}
Feb 19 23:06:20 crc kubenswrapper[4795]: I0219 23:06:20.511962 4795 scope.go:117] "RemoveContainer" containerID="18170a71e3fd2c593ee17e2961f48e8d3663518910bf7cf2c79258393902484c"
Feb 19 23:06:20 crc kubenswrapper[4795]: E0219 23:06:20.512400 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 23:06:21 crc kubenswrapper[4795]: I0219 23:06:21.828857 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-wlhqm"
Feb 19 23:06:21 crc kubenswrapper[4795]: I0219 23:06:21.987358 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nf4f6\" (UniqueName: \"kubernetes.io/projected/497c4c82-13ae-430c-83bd-1f1c4d4683e4-kube-api-access-nf4f6\") pod \"497c4c82-13ae-430c-83bd-1f1c4d4683e4\" (UID: \"497c4c82-13ae-430c-83bd-1f1c4d4683e4\") "
Feb 19 23:06:21 crc kubenswrapper[4795]: I0219 23:06:21.987548 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/497c4c82-13ae-430c-83bd-1f1c4d4683e4-config-data\") pod \"497c4c82-13ae-430c-83bd-1f1c4d4683e4\" (UID: \"497c4c82-13ae-430c-83bd-1f1c4d4683e4\") "
Feb 19 23:06:21 crc kubenswrapper[4795]: I0219 23:06:21.987768 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/497c4c82-13ae-430c-83bd-1f1c4d4683e4-combined-ca-bundle\") pod \"497c4c82-13ae-430c-83bd-1f1c4d4683e4\" (UID: \"497c4c82-13ae-430c-83bd-1f1c4d4683e4\") "
Feb 19 23:06:21 crc kubenswrapper[4795]: I0219 23:06:21.997140 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/497c4c82-13ae-430c-83bd-1f1c4d4683e4-kube-api-access-nf4f6" (OuterVolumeSpecName: "kube-api-access-nf4f6") pod "497c4c82-13ae-430c-83bd-1f1c4d4683e4" (UID: "497c4c82-13ae-430c-83bd-1f1c4d4683e4"). InnerVolumeSpecName "kube-api-access-nf4f6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 23:06:22 crc kubenswrapper[4795]: I0219 23:06:22.020699 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/497c4c82-13ae-430c-83bd-1f1c4d4683e4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "497c4c82-13ae-430c-83bd-1f1c4d4683e4" (UID: "497c4c82-13ae-430c-83bd-1f1c4d4683e4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:06:22 crc kubenswrapper[4795]: I0219 23:06:22.065405 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/497c4c82-13ae-430c-83bd-1f1c4d4683e4-config-data" (OuterVolumeSpecName: "config-data") pod "497c4c82-13ae-430c-83bd-1f1c4d4683e4" (UID: "497c4c82-13ae-430c-83bd-1f1c4d4683e4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:06:22 crc kubenswrapper[4795]: I0219 23:06:22.089917 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/497c4c82-13ae-430c-83bd-1f1c4d4683e4-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 23:06:22 crc kubenswrapper[4795]: I0219 23:06:22.090413 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/497c4c82-13ae-430c-83bd-1f1c4d4683e4-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 23:06:22 crc kubenswrapper[4795]: I0219 23:06:22.090606 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nf4f6\" (UniqueName: \"kubernetes.io/projected/497c4c82-13ae-430c-83bd-1f1c4d4683e4-kube-api-access-nf4f6\") on node \"crc\" DevicePath \"\""
Feb 19 23:06:22 crc kubenswrapper[4795]: I0219 23:06:22.406973 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-wlhqm" event={"ID":"497c4c82-13ae-430c-83bd-1f1c4d4683e4","Type":"ContainerDied","Data":"1f14b1d9f8ec956e2f15e91e3acd1feb1bf1a9e896176fbddc75b273110a404a"}
Feb 19 23:06:22 crc kubenswrapper[4795]: I0219 23:06:22.407032 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f14b1d9f8ec956e2f15e91e3acd1feb1bf1a9e896176fbddc75b273110a404a"
Feb 19 23:06:22 crc kubenswrapper[4795]: I0219 23:06:22.407071 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-wlhqm"
Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.505960 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-54b48c7f4c-97pnj"]
Feb 19 23:06:23 crc kubenswrapper[4795]: E0219 23:06:23.508902 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="497c4c82-13ae-430c-83bd-1f1c4d4683e4" containerName="heat-db-sync"
Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.508931 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="497c4c82-13ae-430c-83bd-1f1c4d4683e4" containerName="heat-db-sync"
Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.509214 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="497c4c82-13ae-430c-83bd-1f1c4d4683e4" containerName="heat-db-sync"
Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.510183 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-54b48c7f4c-97pnj"
Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.517858 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data"
Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.517891 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-67jp9"
Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.518148 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data"
Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.529112 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-54b48c7f4c-97pnj"]
Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.621748 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a380f130-e904-41e8-90e2-93bdeb0615d6-config-data-custom\") pod \"heat-engine-54b48c7f4c-97pnj\" (UID: \"a380f130-e904-41e8-90e2-93bdeb0615d6\") " pod="openstack/heat-engine-54b48c7f4c-97pnj"
Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.622142 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a380f130-e904-41e8-90e2-93bdeb0615d6-combined-ca-bundle\") pod \"heat-engine-54b48c7f4c-97pnj\" (UID: \"a380f130-e904-41e8-90e2-93bdeb0615d6\") " pod="openstack/heat-engine-54b48c7f4c-97pnj"
Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.622219 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a380f130-e904-41e8-90e2-93bdeb0615d6-config-data\") pod \"heat-engine-54b48c7f4c-97pnj\" (UID: \"a380f130-e904-41e8-90e2-93bdeb0615d6\") " pod="openstack/heat-engine-54b48c7f4c-97pnj"
Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.622260 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xx2s\" (UniqueName: \"kubernetes.io/projected/a380f130-e904-41e8-90e2-93bdeb0615d6-kube-api-access-5xx2s\") pod \"heat-engine-54b48c7f4c-97pnj\" (UID: \"a380f130-e904-41e8-90e2-93bdeb0615d6\") " pod="openstack/heat-engine-54b48c7f4c-97pnj"
Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.691488 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-6565dd9f4d-w85dm"]
Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.692957 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-6565dd9f4d-w85dm"
Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.696199 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data"
Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.723983 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-6565dd9f4d-w85dm"]
Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.725176 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a380f130-e904-41e8-90e2-93bdeb0615d6-config-data-custom\") pod \"heat-engine-54b48c7f4c-97pnj\" (UID: \"a380f130-e904-41e8-90e2-93bdeb0615d6\") " pod="openstack/heat-engine-54b48c7f4c-97pnj"
Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.725313 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a380f130-e904-41e8-90e2-93bdeb0615d6-combined-ca-bundle\") pod \"heat-engine-54b48c7f4c-97pnj\" (UID: \"a380f130-e904-41e8-90e2-93bdeb0615d6\") " pod="openstack/heat-engine-54b48c7f4c-97pnj"
Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.725344 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a380f130-e904-41e8-90e2-93bdeb0615d6-config-data\") pod \"heat-engine-54b48c7f4c-97pnj\" (UID: \"a380f130-e904-41e8-90e2-93bdeb0615d6\") " pod="openstack/heat-engine-54b48c7f4c-97pnj"
Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.725367 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xx2s\" (UniqueName: \"kubernetes.io/projected/a380f130-e904-41e8-90e2-93bdeb0615d6-kube-api-access-5xx2s\") pod \"heat-engine-54b48c7f4c-97pnj\" (UID: \"a380f130-e904-41e8-90e2-93bdeb0615d6\") " pod="openstack/heat-engine-54b48c7f4c-97pnj"
Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.763296 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a380f130-e904-41e8-90e2-93bdeb0615d6-combined-ca-bundle\") pod \"heat-engine-54b48c7f4c-97pnj\" (UID: \"a380f130-e904-41e8-90e2-93bdeb0615d6\") " pod="openstack/heat-engine-54b48c7f4c-97pnj"
Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.766613 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xx2s\" (UniqueName: \"kubernetes.io/projected/a380f130-e904-41e8-90e2-93bdeb0615d6-kube-api-access-5xx2s\") pod \"heat-engine-54b48c7f4c-97pnj\" (UID: \"a380f130-e904-41e8-90e2-93bdeb0615d6\") " pod="openstack/heat-engine-54b48c7f4c-97pnj"
Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.777916 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a380f130-e904-41e8-90e2-93bdeb0615d6-config-data-custom\") pod \"heat-engine-54b48c7f4c-97pnj\" (UID: \"a380f130-e904-41e8-90e2-93bdeb0615d6\") " pod="openstack/heat-engine-54b48c7f4c-97pnj"
Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.822885 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-57679899bc-rj6x7"]
Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.826150 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a380f130-e904-41e8-90e2-93bdeb0615d6-config-data\") pod \"heat-engine-54b48c7f4c-97pnj\" (UID: \"a380f130-e904-41e8-90e2-93bdeb0615d6\") " pod="openstack/heat-engine-54b48c7f4c-97pnj"
Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.839333 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94ea6e46-bacd-40ca-bce9-0f28656581af-config-data\") pod \"heat-cfnapi-6565dd9f4d-w85dm\" (UID: \"94ea6e46-bacd-40ca-bce9-0f28656581af\") " pod="openstack/heat-cfnapi-6565dd9f4d-w85dm"
Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.839394 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94ea6e46-bacd-40ca-bce9-0f28656581af-combined-ca-bundle\") pod \"heat-cfnapi-6565dd9f4d-w85dm\" (UID: \"94ea6e46-bacd-40ca-bce9-0f28656581af\") " pod="openstack/heat-cfnapi-6565dd9f4d-w85dm"
Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.839416 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdcgf\" (UniqueName: \"kubernetes.io/projected/94ea6e46-bacd-40ca-bce9-0f28656581af-kube-api-access-fdcgf\") pod \"heat-cfnapi-6565dd9f4d-w85dm\" (UID: \"94ea6e46-bacd-40ca-bce9-0f28656581af\") " pod="openstack/heat-cfnapi-6565dd9f4d-w85dm"
Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.839499 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/94ea6e46-bacd-40ca-bce9-0f28656581af-config-data-custom\") pod \"heat-cfnapi-6565dd9f4d-w85dm\" (UID: \"94ea6e46-bacd-40ca-bce9-0f28656581af\") " pod="openstack/heat-cfnapi-6565dd9f4d-w85dm"
Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.840411 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-57679899bc-rj6x7"
Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.842371 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-54b48c7f4c-97pnj"
Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.843069 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data"
Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.873978 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-57679899bc-rj6x7"]
Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.942063 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94ea6e46-bacd-40ca-bce9-0f28656581af-combined-ca-bundle\") pod \"heat-cfnapi-6565dd9f4d-w85dm\" (UID: \"94ea6e46-bacd-40ca-bce9-0f28656581af\") " pod="openstack/heat-cfnapi-6565dd9f4d-w85dm"
Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.942128 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdcgf\" (UniqueName: \"kubernetes.io/projected/94ea6e46-bacd-40ca-bce9-0f28656581af-kube-api-access-fdcgf\") pod \"heat-cfnapi-6565dd9f4d-w85dm\" (UID: \"94ea6e46-bacd-40ca-bce9-0f28656581af\") " pod="openstack/heat-cfnapi-6565dd9f4d-w85dm"
Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.942218 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-248z5\" (UniqueName: \"kubernetes.io/projected/058c5b61-3ec2-4a88-bea8-59843d00750c-kube-api-access-248z5\") pod \"heat-api-57679899bc-rj6x7\" (UID: \"058c5b61-3ec2-4a88-bea8-59843d00750c\") " pod="openstack/heat-api-57679899bc-rj6x7"
Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.942299 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/058c5b61-3ec2-4a88-bea8-59843d00750c-config-data\") pod \"heat-api-57679899bc-rj6x7\" (UID: \"058c5b61-3ec2-4a88-bea8-59843d00750c\") " pod="openstack/heat-api-57679899bc-rj6x7"
Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.942397 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/058c5b61-3ec2-4a88-bea8-59843d00750c-config-data-custom\") pod \"heat-api-57679899bc-rj6x7\" (UID: \"058c5b61-3ec2-4a88-bea8-59843d00750c\") " pod="openstack/heat-api-57679899bc-rj6x7"
Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.942554 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/94ea6e46-bacd-40ca-bce9-0f28656581af-config-data-custom\") pod \"heat-cfnapi-6565dd9f4d-w85dm\" (UID: \"94ea6e46-bacd-40ca-bce9-0f28656581af\") " pod="openstack/heat-cfnapi-6565dd9f4d-w85dm"
Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.942623 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/058c5b61-3ec2-4a88-bea8-59843d00750c-combined-ca-bundle\") pod \"heat-api-57679899bc-rj6x7\" (UID: \"058c5b61-3ec2-4a88-bea8-59843d00750c\") " pod="openstack/heat-api-57679899bc-rj6x7"
Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.942919 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94ea6e46-bacd-40ca-bce9-0f28656581af-config-data\") pod \"heat-cfnapi-6565dd9f4d-w85dm\" (UID: \"94ea6e46-bacd-40ca-bce9-0f28656581af\") " pod="openstack/heat-cfnapi-6565dd9f4d-w85dm"
Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.948534 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94ea6e46-bacd-40ca-bce9-0f28656581af-config-data\") pod \"heat-cfnapi-6565dd9f4d-w85dm\" (UID: \"94ea6e46-bacd-40ca-bce9-0f28656581af\") " pod="openstack/heat-cfnapi-6565dd9f4d-w85dm"
Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.960184 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/94ea6e46-bacd-40ca-bce9-0f28656581af-config-data-custom\") pod \"heat-cfnapi-6565dd9f4d-w85dm\" (UID: \"94ea6e46-bacd-40ca-bce9-0f28656581af\") " pod="openstack/heat-cfnapi-6565dd9f4d-w85dm"
Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.964787 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94ea6e46-bacd-40ca-bce9-0f28656581af-combined-ca-bundle\") pod \"heat-cfnapi-6565dd9f4d-w85dm\" (UID: \"94ea6e46-bacd-40ca-bce9-0f28656581af\") " pod="openstack/heat-cfnapi-6565dd9f4d-w85dm"
Feb 19 23:06:23 crc kubenswrapper[4795]: I0219 23:06:23.965343 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdcgf\" (UniqueName: \"kubernetes.io/projected/94ea6e46-bacd-40ca-bce9-0f28656581af-kube-api-access-fdcgf\") pod \"heat-cfnapi-6565dd9f4d-w85dm\" (UID: \"94ea6e46-bacd-40ca-bce9-0f28656581af\") " pod="openstack/heat-cfnapi-6565dd9f4d-w85dm"
Feb 19 23:06:24 crc kubenswrapper[4795]: I0219 23:06:24.016115 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-6565dd9f4d-w85dm"
Feb 19 23:06:24 crc kubenswrapper[4795]: I0219 23:06:24.044767 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/058c5b61-3ec2-4a88-bea8-59843d00750c-combined-ca-bundle\") pod \"heat-api-57679899bc-rj6x7\" (UID: \"058c5b61-3ec2-4a88-bea8-59843d00750c\") " pod="openstack/heat-api-57679899bc-rj6x7"
Feb 19 23:06:24 crc kubenswrapper[4795]: I0219 23:06:24.045289 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-248z5\" (UniqueName: \"kubernetes.io/projected/058c5b61-3ec2-4a88-bea8-59843d00750c-kube-api-access-248z5\") pod \"heat-api-57679899bc-rj6x7\" (UID: \"058c5b61-3ec2-4a88-bea8-59843d00750c\") " pod="openstack/heat-api-57679899bc-rj6x7"
Feb 19 23:06:24 crc kubenswrapper[4795]: I0219 23:06:24.045324 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/058c5b61-3ec2-4a88-bea8-59843d00750c-config-data\") pod \"heat-api-57679899bc-rj6x7\" (UID: \"058c5b61-3ec2-4a88-bea8-59843d00750c\") " pod="openstack/heat-api-57679899bc-rj6x7"
Feb 19 23:06:24 crc kubenswrapper[4795]: I0219 23:06:24.045347 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/058c5b61-3ec2-4a88-bea8-59843d00750c-config-data-custom\") pod \"heat-api-57679899bc-rj6x7\" (UID: \"058c5b61-3ec2-4a88-bea8-59843d00750c\") " pod="openstack/heat-api-57679899bc-rj6x7"
Feb 19 23:06:24 crc kubenswrapper[4795]: I0219 23:06:24.052044 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/058c5b61-3ec2-4a88-bea8-59843d00750c-combined-ca-bundle\") pod \"heat-api-57679899bc-rj6x7\" (UID: \"058c5b61-3ec2-4a88-bea8-59843d00750c\") " pod="openstack/heat-api-57679899bc-rj6x7"
Feb 19 23:06:24 crc kubenswrapper[4795]: I0219 23:06:24.055344 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/058c5b61-3ec2-4a88-bea8-59843d00750c-config-data\") pod \"heat-api-57679899bc-rj6x7\" (UID: \"058c5b61-3ec2-4a88-bea8-59843d00750c\") " pod="openstack/heat-api-57679899bc-rj6x7"
Feb 19 23:06:24 crc kubenswrapper[4795]: I0219 23:06:24.056536 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/058c5b61-3ec2-4a88-bea8-59843d00750c-config-data-custom\") pod \"heat-api-57679899bc-rj6x7\" (UID: \"058c5b61-3ec2-4a88-bea8-59843d00750c\") " pod="openstack/heat-api-57679899bc-rj6x7"
Feb 19 23:06:24 crc kubenswrapper[4795]: I0219 23:06:24.061222 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-248z5\" (UniqueName: \"kubernetes.io/projected/058c5b61-3ec2-4a88-bea8-59843d00750c-kube-api-access-248z5\") pod \"heat-api-57679899bc-rj6x7\" (UID: \"058c5b61-3ec2-4a88-bea8-59843d00750c\") " pod="openstack/heat-api-57679899bc-rj6x7"
Feb 19 23:06:24 crc kubenswrapper[4795]: I0219 23:06:24.342345 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-57679899bc-rj6x7"
Feb 19 23:06:24 crc kubenswrapper[4795]: I0219 23:06:24.361674 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-54b48c7f4c-97pnj"]
Feb 19 23:06:24 crc kubenswrapper[4795]: W0219 23:06:24.374099 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda380f130_e904_41e8_90e2_93bdeb0615d6.slice/crio-42e93ff6f262ded38d30a4699b66ef3748d0559062d10f8e7d7630ce3a9f97a2 WatchSource:0}: Error finding container 42e93ff6f262ded38d30a4699b66ef3748d0559062d10f8e7d7630ce3a9f97a2: Status 404 returned error can't find the container with id 42e93ff6f262ded38d30a4699b66ef3748d0559062d10f8e7d7630ce3a9f97a2
Feb 19 23:06:24 crc kubenswrapper[4795]: I0219 23:06:24.435498 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-54b48c7f4c-97pnj" event={"ID":"a380f130-e904-41e8-90e2-93bdeb0615d6","Type":"ContainerStarted","Data":"42e93ff6f262ded38d30a4699b66ef3748d0559062d10f8e7d7630ce3a9f97a2"}
Feb 19 23:06:24 crc kubenswrapper[4795]: I0219 23:06:24.502100 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-6565dd9f4d-w85dm"]
Feb 19 23:06:24 crc kubenswrapper[4795]: I0219 23:06:24.824179 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-57679899bc-rj6x7"]
Feb 19 23:06:25 crc kubenswrapper[4795]: I0219 23:06:25.450739 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-57679899bc-rj6x7" event={"ID":"058c5b61-3ec2-4a88-bea8-59843d00750c","Type":"ContainerStarted","Data":"7c2569b462a1849b91272409407428ae61f0f85c98af819c9c77c8800b7411e2"}
Feb 19 23:06:25 crc kubenswrapper[4795]: I0219 23:06:25.453026 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6565dd9f4d-w85dm" event={"ID":"94ea6e46-bacd-40ca-bce9-0f28656581af","Type":"ContainerStarted","Data":"cf7360edbf9d31e08708caf5995d7041976a8fb9fdd9b6931540bad10df2f87c"}
Feb 19 23:06:25 crc kubenswrapper[4795]: I0219 23:06:25.455763 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-54b48c7f4c-97pnj" event={"ID":"a380f130-e904-41e8-90e2-93bdeb0615d6","Type":"ContainerStarted","Data":"28e9017fdd45a316b9c170a0903ec6ce2597a64e8457cb18d9b0c1a4e16d2538"}
Feb 19 23:06:25 crc kubenswrapper[4795]: I0219 23:06:25.455893 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-54b48c7f4c-97pnj"
Feb 19 23:06:25 crc kubenswrapper[4795]: I0219 23:06:25.474516 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-54b48c7f4c-97pnj" podStartSLOduration=2.474494172 podStartE2EDuration="2.474494172s" podCreationTimestamp="2026-02-19 23:06:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:06:25.469265923 +0000 UTC m=+5896.661783817" watchObservedRunningTime="2026-02-19 23:06:25.474494172 +0000 UTC m=+5896.667012046"
Feb 19 23:06:25 crc kubenswrapper[4795]: I0219 23:06:25.653199 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-6f75767dd9-c8js2"
Feb 19 23:06:27 crc kubenswrapper[4795]: I0219 23:06:27.475606 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-57679899bc-rj6x7" event={"ID":"058c5b61-3ec2-4a88-bea8-59843d00750c","Type":"ContainerStarted","Data":"7b7d0aeb0dbc4b713268e62f34080da6273676dd4d7d2d299501d03356596277"}
Feb 19 23:06:27 crc kubenswrapper[4795]: I0219 23:06:27.476211 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-57679899bc-rj6x7"
Feb 19 23:06:27 crc kubenswrapper[4795]: I0219 23:06:27.477792 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod"
pod="openstack/heat-cfnapi-6565dd9f4d-w85dm" event={"ID":"94ea6e46-bacd-40ca-bce9-0f28656581af","Type":"ContainerStarted","Data":"9783318731f4eddc5ba92ad6da9e864b2d90e3fc020d553614c899f2e97786e4"} Feb 19 23:06:27 crc kubenswrapper[4795]: I0219 23:06:27.477918 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-6565dd9f4d-w85dm" Feb 19 23:06:27 crc kubenswrapper[4795]: I0219 23:06:27.492556 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-57679899bc-rj6x7" podStartSLOduration=2.662771808 podStartE2EDuration="4.492541657s" podCreationTimestamp="2026-02-19 23:06:23 +0000 UTC" firstStartedPulling="2026-02-19 23:06:24.81680057 +0000 UTC m=+5896.009318434" lastFinishedPulling="2026-02-19 23:06:26.646570419 +0000 UTC m=+5897.839088283" observedRunningTime="2026-02-19 23:06:27.488206212 +0000 UTC m=+5898.680724076" watchObservedRunningTime="2026-02-19 23:06:27.492541657 +0000 UTC m=+5898.685059521" Feb 19 23:06:27 crc kubenswrapper[4795]: I0219 23:06:27.513051 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-6565dd9f4d-w85dm" podStartSLOduration=2.386660232 podStartE2EDuration="4.513034573s" podCreationTimestamp="2026-02-19 23:06:23 +0000 UTC" firstStartedPulling="2026-02-19 23:06:24.517043594 +0000 UTC m=+5895.709561458" lastFinishedPulling="2026-02-19 23:06:26.643417935 +0000 UTC m=+5897.835935799" observedRunningTime="2026-02-19 23:06:27.508894263 +0000 UTC m=+5898.701412127" watchObservedRunningTime="2026-02-19 23:06:27.513034573 +0000 UTC m=+5898.705552437" Feb 19 23:06:27 crc kubenswrapper[4795]: I0219 23:06:27.535089 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-6f75767dd9-c8js2" Feb 19 23:06:27 crc kubenswrapper[4795]: I0219 23:06:27.606059 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-548bf4c685-852ql"] Feb 19 23:06:27 crc 
kubenswrapper[4795]: I0219 23:06:27.606352 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-548bf4c685-852ql" podUID="353e54a0-06cb-4876-af76-78bcd1bb3a22" containerName="horizon-log" containerID="cri-o://b946e5db8d3a52db0f1f9054f0099688c365289ec5bd26ac448c71a734370a40" gracePeriod=30 Feb 19 23:06:27 crc kubenswrapper[4795]: I0219 23:06:27.606956 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-548bf4c685-852ql" podUID="353e54a0-06cb-4876-af76-78bcd1bb3a22" containerName="horizon" containerID="cri-o://1bd2543092bc2960614f9063a0b09c79b76c851d29dedd3bce8880af20588398" gracePeriod=30 Feb 19 23:06:30 crc kubenswrapper[4795]: I0219 23:06:30.056256 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-57skz"] Feb 19 23:06:30 crc kubenswrapper[4795]: I0219 23:06:30.067309 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-c270-account-create-update-m9p4w"] Feb 19 23:06:30 crc kubenswrapper[4795]: I0219 23:06:30.075587 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-c270-account-create-update-m9p4w"] Feb 19 23:06:30 crc kubenswrapper[4795]: I0219 23:06:30.084801 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-57skz"] Feb 19 23:06:31 crc kubenswrapper[4795]: I0219 23:06:31.510282 4795 generic.go:334] "Generic (PLEG): container finished" podID="353e54a0-06cb-4876-af76-78bcd1bb3a22" containerID="1bd2543092bc2960614f9063a0b09c79b76c851d29dedd3bce8880af20588398" exitCode=0 Feb 19 23:06:31 crc kubenswrapper[4795]: I0219 23:06:31.510341 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-548bf4c685-852ql" event={"ID":"353e54a0-06cb-4876-af76-78bcd1bb3a22","Type":"ContainerDied","Data":"1bd2543092bc2960614f9063a0b09c79b76c851d29dedd3bce8880af20588398"} Feb 19 23:06:31 crc kubenswrapper[4795]: I0219 23:06:31.528707 4795 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5efc0b6-7441-4f4b-827e-d920c711d076" path="/var/lib/kubelet/pods/b5efc0b6-7441-4f4b-827e-d920c711d076/volumes" Feb 19 23:06:31 crc kubenswrapper[4795]: I0219 23:06:31.529321 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6d0c29a-694d-4afc-ba36-c66fa8fd0328" path="/var/lib/kubelet/pods/e6d0c29a-694d-4afc-ba36-c66fa8fd0328/volumes" Feb 19 23:06:35 crc kubenswrapper[4795]: I0219 23:06:35.456635 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-6565dd9f4d-w85dm" Feb 19 23:06:35 crc kubenswrapper[4795]: I0219 23:06:35.512009 4795 scope.go:117] "RemoveContainer" containerID="18170a71e3fd2c593ee17e2961f48e8d3663518910bf7cf2c79258393902484c" Feb 19 23:06:35 crc kubenswrapper[4795]: E0219 23:06:35.512288 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:06:35 crc kubenswrapper[4795]: I0219 23:06:35.922377 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-57679899bc-rj6x7" Feb 19 23:06:36 crc kubenswrapper[4795]: I0219 23:06:36.748742 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-548bf4c685-852ql" podUID="353e54a0-06cb-4876-af76-78bcd1bb3a22" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.113:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.113:8080: connect: connection refused" Feb 19 23:06:39 crc kubenswrapper[4795]: I0219 23:06:39.032830 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/glance-db-sync-wrz6p"] Feb 19 23:06:39 crc kubenswrapper[4795]: I0219 23:06:39.042665 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-wrz6p"] Feb 19 23:06:39 crc kubenswrapper[4795]: I0219 23:06:39.522430 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e95033f-725f-4784-995c-ec7a3b9c24c4" path="/var/lib/kubelet/pods/3e95033f-725f-4784-995c-ec7a3b9c24c4/volumes" Feb 19 23:06:43 crc kubenswrapper[4795]: I0219 23:06:43.888228 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-54b48c7f4c-97pnj" Feb 19 23:06:46 crc kubenswrapper[4795]: I0219 23:06:46.748778 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-548bf4c685-852ql" podUID="353e54a0-06cb-4876-af76-78bcd1bb3a22" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.113:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.113:8080: connect: connection refused" Feb 19 23:06:48 crc kubenswrapper[4795]: I0219 23:06:48.512324 4795 scope.go:117] "RemoveContainer" containerID="18170a71e3fd2c593ee17e2961f48e8d3663518910bf7cf2c79258393902484c" Feb 19 23:06:48 crc kubenswrapper[4795]: E0219 23:06:48.513128 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:06:53 crc kubenswrapper[4795]: I0219 23:06:53.866624 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08fpfwk"] Feb 19 23:06:53 crc kubenswrapper[4795]: I0219 23:06:53.869690 4795 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08fpfwk" Feb 19 23:06:53 crc kubenswrapper[4795]: I0219 23:06:53.871662 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 19 23:06:53 crc kubenswrapper[4795]: I0219 23:06:53.894896 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08fpfwk"] Feb 19 23:06:53 crc kubenswrapper[4795]: I0219 23:06:53.997732 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dg59p\" (UniqueName: \"kubernetes.io/projected/42ff3cee-7522-42a5-8cc9-b52b30d45220-kube-api-access-dg59p\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08fpfwk\" (UID: \"42ff3cee-7522-42a5-8cc9-b52b30d45220\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08fpfwk" Feb 19 23:06:53 crc kubenswrapper[4795]: I0219 23:06:53.998133 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/42ff3cee-7522-42a5-8cc9-b52b30d45220-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08fpfwk\" (UID: \"42ff3cee-7522-42a5-8cc9-b52b30d45220\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08fpfwk" Feb 19 23:06:53 crc kubenswrapper[4795]: I0219 23:06:53.998348 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/42ff3cee-7522-42a5-8cc9-b52b30d45220-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08fpfwk\" (UID: \"42ff3cee-7522-42a5-8cc9-b52b30d45220\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08fpfwk" Feb 19 23:06:54 crc 
kubenswrapper[4795]: I0219 23:06:54.100354 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dg59p\" (UniqueName: \"kubernetes.io/projected/42ff3cee-7522-42a5-8cc9-b52b30d45220-kube-api-access-dg59p\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08fpfwk\" (UID: \"42ff3cee-7522-42a5-8cc9-b52b30d45220\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08fpfwk" Feb 19 23:06:54 crc kubenswrapper[4795]: I0219 23:06:54.100395 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/42ff3cee-7522-42a5-8cc9-b52b30d45220-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08fpfwk\" (UID: \"42ff3cee-7522-42a5-8cc9-b52b30d45220\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08fpfwk" Feb 19 23:06:54 crc kubenswrapper[4795]: I0219 23:06:54.100550 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/42ff3cee-7522-42a5-8cc9-b52b30d45220-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08fpfwk\" (UID: \"42ff3cee-7522-42a5-8cc9-b52b30d45220\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08fpfwk" Feb 19 23:06:54 crc kubenswrapper[4795]: I0219 23:06:54.100993 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/42ff3cee-7522-42a5-8cc9-b52b30d45220-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08fpfwk\" (UID: \"42ff3cee-7522-42a5-8cc9-b52b30d45220\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08fpfwk" Feb 19 23:06:54 crc kubenswrapper[4795]: I0219 23:06:54.101189 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/42ff3cee-7522-42a5-8cc9-b52b30d45220-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08fpfwk\" (UID: \"42ff3cee-7522-42a5-8cc9-b52b30d45220\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08fpfwk" Feb 19 23:06:54 crc kubenswrapper[4795]: I0219 23:06:54.122286 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dg59p\" (UniqueName: \"kubernetes.io/projected/42ff3cee-7522-42a5-8cc9-b52b30d45220-kube-api-access-dg59p\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08fpfwk\" (UID: \"42ff3cee-7522-42a5-8cc9-b52b30d45220\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08fpfwk" Feb 19 23:06:54 crc kubenswrapper[4795]: I0219 23:06:54.198068 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08fpfwk" Feb 19 23:06:54 crc kubenswrapper[4795]: I0219 23:06:54.630058 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08fpfwk"] Feb 19 23:06:54 crc kubenswrapper[4795]: I0219 23:06:54.747486 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08fpfwk" event={"ID":"42ff3cee-7522-42a5-8cc9-b52b30d45220","Type":"ContainerStarted","Data":"d11bd090c6a8a61659e1b16a0e54fc14ff621007a0c7499ea67644b6ce764c7f"} Feb 19 23:06:55 crc kubenswrapper[4795]: I0219 23:06:55.765390 4795 generic.go:334] "Generic (PLEG): container finished" podID="42ff3cee-7522-42a5-8cc9-b52b30d45220" containerID="1b77aac585ca3cfe29b317bf1fd045a5d63798719b946034f72426232115ecf7" exitCode=0 Feb 19 23:06:55 crc kubenswrapper[4795]: I0219 23:06:55.765431 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08fpfwk" event={"ID":"42ff3cee-7522-42a5-8cc9-b52b30d45220","Type":"ContainerDied","Data":"1b77aac585ca3cfe29b317bf1fd045a5d63798719b946034f72426232115ecf7"} Feb 19 23:06:56 crc kubenswrapper[4795]: I0219 23:06:56.749000 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-548bf4c685-852ql" podUID="353e54a0-06cb-4876-af76-78bcd1bb3a22" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.113:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.113:8080: connect: connection refused" Feb 19 23:06:56 crc kubenswrapper[4795]: I0219 23:06:56.749439 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-548bf4c685-852ql" Feb 19 23:06:57 crc kubenswrapper[4795]: I0219 23:06:57.795647 4795 generic.go:334] "Generic (PLEG): container finished" podID="353e54a0-06cb-4876-af76-78bcd1bb3a22" containerID="b946e5db8d3a52db0f1f9054f0099688c365289ec5bd26ac448c71a734370a40" exitCode=137 Feb 19 23:06:57 crc kubenswrapper[4795]: I0219 23:06:57.795958 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-548bf4c685-852ql" event={"ID":"353e54a0-06cb-4876-af76-78bcd1bb3a22","Type":"ContainerDied","Data":"b946e5db8d3a52db0f1f9054f0099688c365289ec5bd26ac448c71a734370a40"} Feb 19 23:06:57 crc kubenswrapper[4795]: I0219 23:06:57.799033 4795 generic.go:334] "Generic (PLEG): container finished" podID="42ff3cee-7522-42a5-8cc9-b52b30d45220" containerID="7325a5f3066d7e846cc85a6bea1dd01396196730c91dbd4e526befb9417a41b7" exitCode=0 Feb 19 23:06:57 crc kubenswrapper[4795]: I0219 23:06:57.799087 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08fpfwk" event={"ID":"42ff3cee-7522-42a5-8cc9-b52b30d45220","Type":"ContainerDied","Data":"7325a5f3066d7e846cc85a6bea1dd01396196730c91dbd4e526befb9417a41b7"} Feb 19 
23:06:58 crc kubenswrapper[4795]: I0219 23:06:58.063196 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-548bf4c685-852ql" Feb 19 23:06:58 crc kubenswrapper[4795]: I0219 23:06:58.201056 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/353e54a0-06cb-4876-af76-78bcd1bb3a22-horizon-secret-key\") pod \"353e54a0-06cb-4876-af76-78bcd1bb3a22\" (UID: \"353e54a0-06cb-4876-af76-78bcd1bb3a22\") " Feb 19 23:06:58 crc kubenswrapper[4795]: I0219 23:06:58.201178 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/353e54a0-06cb-4876-af76-78bcd1bb3a22-scripts\") pod \"353e54a0-06cb-4876-af76-78bcd1bb3a22\" (UID: \"353e54a0-06cb-4876-af76-78bcd1bb3a22\") " Feb 19 23:06:58 crc kubenswrapper[4795]: I0219 23:06:58.201255 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/353e54a0-06cb-4876-af76-78bcd1bb3a22-logs\") pod \"353e54a0-06cb-4876-af76-78bcd1bb3a22\" (UID: \"353e54a0-06cb-4876-af76-78bcd1bb3a22\") " Feb 19 23:06:58 crc kubenswrapper[4795]: I0219 23:06:58.201278 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/353e54a0-06cb-4876-af76-78bcd1bb3a22-config-data\") pod \"353e54a0-06cb-4876-af76-78bcd1bb3a22\" (UID: \"353e54a0-06cb-4876-af76-78bcd1bb3a22\") " Feb 19 23:06:58 crc kubenswrapper[4795]: I0219 23:06:58.201393 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-svjbq\" (UniqueName: \"kubernetes.io/projected/353e54a0-06cb-4876-af76-78bcd1bb3a22-kube-api-access-svjbq\") pod \"353e54a0-06cb-4876-af76-78bcd1bb3a22\" (UID: \"353e54a0-06cb-4876-af76-78bcd1bb3a22\") " Feb 19 23:06:58 crc kubenswrapper[4795]: I0219 23:06:58.201923 4795 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/353e54a0-06cb-4876-af76-78bcd1bb3a22-logs" (OuterVolumeSpecName: "logs") pod "353e54a0-06cb-4876-af76-78bcd1bb3a22" (UID: "353e54a0-06cb-4876-af76-78bcd1bb3a22"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:06:58 crc kubenswrapper[4795]: I0219 23:06:58.215399 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/353e54a0-06cb-4876-af76-78bcd1bb3a22-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "353e54a0-06cb-4876-af76-78bcd1bb3a22" (UID: "353e54a0-06cb-4876-af76-78bcd1bb3a22"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:06:58 crc kubenswrapper[4795]: I0219 23:06:58.221932 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/353e54a0-06cb-4876-af76-78bcd1bb3a22-kube-api-access-svjbq" (OuterVolumeSpecName: "kube-api-access-svjbq") pod "353e54a0-06cb-4876-af76-78bcd1bb3a22" (UID: "353e54a0-06cb-4876-af76-78bcd1bb3a22"). InnerVolumeSpecName "kube-api-access-svjbq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:06:58 crc kubenswrapper[4795]: I0219 23:06:58.227380 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/353e54a0-06cb-4876-af76-78bcd1bb3a22-scripts" (OuterVolumeSpecName: "scripts") pod "353e54a0-06cb-4876-af76-78bcd1bb3a22" (UID: "353e54a0-06cb-4876-af76-78bcd1bb3a22"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:06:58 crc kubenswrapper[4795]: I0219 23:06:58.266269 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/353e54a0-06cb-4876-af76-78bcd1bb3a22-config-data" (OuterVolumeSpecName: "config-data") pod "353e54a0-06cb-4876-af76-78bcd1bb3a22" (UID: "353e54a0-06cb-4876-af76-78bcd1bb3a22"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:06:58 crc kubenswrapper[4795]: I0219 23:06:58.303960 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-svjbq\" (UniqueName: \"kubernetes.io/projected/353e54a0-06cb-4876-af76-78bcd1bb3a22-kube-api-access-svjbq\") on node \"crc\" DevicePath \"\"" Feb 19 23:06:58 crc kubenswrapper[4795]: I0219 23:06:58.303998 4795 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/353e54a0-06cb-4876-af76-78bcd1bb3a22-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 19 23:06:58 crc kubenswrapper[4795]: I0219 23:06:58.304008 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/353e54a0-06cb-4876-af76-78bcd1bb3a22-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:06:58 crc kubenswrapper[4795]: I0219 23:06:58.304018 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/353e54a0-06cb-4876-af76-78bcd1bb3a22-logs\") on node \"crc\" DevicePath \"\"" Feb 19 23:06:58 crc kubenswrapper[4795]: I0219 23:06:58.304026 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/353e54a0-06cb-4876-af76-78bcd1bb3a22-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 23:06:58 crc kubenswrapper[4795]: I0219 23:06:58.810832 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-548bf4c685-852ql" 
event={"ID":"353e54a0-06cb-4876-af76-78bcd1bb3a22","Type":"ContainerDied","Data":"582b797ed9c1b4bfe8384f717fb4dee523c0a67a75d13d2ff254fdbeb84f1617"} Feb 19 23:06:58 crc kubenswrapper[4795]: I0219 23:06:58.810865 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-548bf4c685-852ql" Feb 19 23:06:58 crc kubenswrapper[4795]: I0219 23:06:58.810897 4795 scope.go:117] "RemoveContainer" containerID="1bd2543092bc2960614f9063a0b09c79b76c851d29dedd3bce8880af20588398" Feb 19 23:06:58 crc kubenswrapper[4795]: I0219 23:06:58.814832 4795 generic.go:334] "Generic (PLEG): container finished" podID="42ff3cee-7522-42a5-8cc9-b52b30d45220" containerID="89a80ef894aa5f55ca88f1134be53fdc59b583e2230ceb9fdaf35bfbb0fe8774" exitCode=0 Feb 19 23:06:58 crc kubenswrapper[4795]: I0219 23:06:58.814884 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08fpfwk" event={"ID":"42ff3cee-7522-42a5-8cc9-b52b30d45220","Type":"ContainerDied","Data":"89a80ef894aa5f55ca88f1134be53fdc59b583e2230ceb9fdaf35bfbb0fe8774"} Feb 19 23:06:58 crc kubenswrapper[4795]: I0219 23:06:58.870317 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-548bf4c685-852ql"] Feb 19 23:06:58 crc kubenswrapper[4795]: I0219 23:06:58.880668 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-548bf4c685-852ql"] Feb 19 23:06:59 crc kubenswrapper[4795]: I0219 23:06:59.027138 4795 scope.go:117] "RemoveContainer" containerID="b946e5db8d3a52db0f1f9054f0099688c365289ec5bd26ac448c71a734370a40" Feb 19 23:06:59 crc kubenswrapper[4795]: I0219 23:06:59.525714 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="353e54a0-06cb-4876-af76-78bcd1bb3a22" path="/var/lib/kubelet/pods/353e54a0-06cb-4876-af76-78bcd1bb3a22/volumes" Feb 19 23:07:00 crc kubenswrapper[4795]: I0219 23:07:00.183419 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08fpfwk" Feb 19 23:07:00 crc kubenswrapper[4795]: I0219 23:07:00.345553 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/42ff3cee-7522-42a5-8cc9-b52b30d45220-util\") pod \"42ff3cee-7522-42a5-8cc9-b52b30d45220\" (UID: \"42ff3cee-7522-42a5-8cc9-b52b30d45220\") " Feb 19 23:07:00 crc kubenswrapper[4795]: I0219 23:07:00.345693 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/42ff3cee-7522-42a5-8cc9-b52b30d45220-bundle\") pod \"42ff3cee-7522-42a5-8cc9-b52b30d45220\" (UID: \"42ff3cee-7522-42a5-8cc9-b52b30d45220\") " Feb 19 23:07:00 crc kubenswrapper[4795]: I0219 23:07:00.345732 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dg59p\" (UniqueName: \"kubernetes.io/projected/42ff3cee-7522-42a5-8cc9-b52b30d45220-kube-api-access-dg59p\") pod \"42ff3cee-7522-42a5-8cc9-b52b30d45220\" (UID: \"42ff3cee-7522-42a5-8cc9-b52b30d45220\") " Feb 19 23:07:00 crc kubenswrapper[4795]: I0219 23:07:00.350601 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42ff3cee-7522-42a5-8cc9-b52b30d45220-bundle" (OuterVolumeSpecName: "bundle") pod "42ff3cee-7522-42a5-8cc9-b52b30d45220" (UID: "42ff3cee-7522-42a5-8cc9-b52b30d45220"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:07:00 crc kubenswrapper[4795]: I0219 23:07:00.353312 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42ff3cee-7522-42a5-8cc9-b52b30d45220-kube-api-access-dg59p" (OuterVolumeSpecName: "kube-api-access-dg59p") pod "42ff3cee-7522-42a5-8cc9-b52b30d45220" (UID: "42ff3cee-7522-42a5-8cc9-b52b30d45220"). InnerVolumeSpecName "kube-api-access-dg59p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:07:00 crc kubenswrapper[4795]: I0219 23:07:00.357868 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42ff3cee-7522-42a5-8cc9-b52b30d45220-util" (OuterVolumeSpecName: "util") pod "42ff3cee-7522-42a5-8cc9-b52b30d45220" (UID: "42ff3cee-7522-42a5-8cc9-b52b30d45220"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:07:00 crc kubenswrapper[4795]: I0219 23:07:00.447934 4795 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/42ff3cee-7522-42a5-8cc9-b52b30d45220-util\") on node \"crc\" DevicePath \"\"" Feb 19 23:07:00 crc kubenswrapper[4795]: I0219 23:07:00.447973 4795 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/42ff3cee-7522-42a5-8cc9-b52b30d45220-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:07:00 crc kubenswrapper[4795]: I0219 23:07:00.447987 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dg59p\" (UniqueName: \"kubernetes.io/projected/42ff3cee-7522-42a5-8cc9-b52b30d45220-kube-api-access-dg59p\") on node \"crc\" DevicePath \"\"" Feb 19 23:07:00 crc kubenswrapper[4795]: I0219 23:07:00.839113 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08fpfwk" event={"ID":"42ff3cee-7522-42a5-8cc9-b52b30d45220","Type":"ContainerDied","Data":"d11bd090c6a8a61659e1b16a0e54fc14ff621007a0c7499ea67644b6ce764c7f"} Feb 19 23:07:00 crc kubenswrapper[4795]: I0219 23:07:00.839160 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d11bd090c6a8a61659e1b16a0e54fc14ff621007a0c7499ea67644b6ce764c7f" Feb 19 23:07:00 crc kubenswrapper[4795]: I0219 23:07:00.839230 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08fpfwk" Feb 19 23:07:03 crc kubenswrapper[4795]: I0219 23:07:03.511606 4795 scope.go:117] "RemoveContainer" containerID="18170a71e3fd2c593ee17e2961f48e8d3663518910bf7cf2c79258393902484c" Feb 19 23:07:03 crc kubenswrapper[4795]: E0219 23:07:03.512184 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:07:10 crc kubenswrapper[4795]: I0219 23:07:10.056512 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-w9m97"] Feb 19 23:07:10 crc kubenswrapper[4795]: I0219 23:07:10.065472 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-4561-account-create-update-zf5q8"] Feb 19 23:07:10 crc kubenswrapper[4795]: I0219 23:07:10.073945 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-w9m97"] Feb 19 23:07:10 crc kubenswrapper[4795]: I0219 23:07:10.083000 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-4561-account-create-update-zf5q8"] Feb 19 23:07:10 crc kubenswrapper[4795]: I0219 23:07:10.886052 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-s52kw"] Feb 19 23:07:10 crc kubenswrapper[4795]: E0219 23:07:10.887953 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="353e54a0-06cb-4876-af76-78bcd1bb3a22" containerName="horizon" Feb 19 23:07:10 crc kubenswrapper[4795]: I0219 23:07:10.888074 4795 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="353e54a0-06cb-4876-af76-78bcd1bb3a22" containerName="horizon" Feb 19 23:07:10 crc kubenswrapper[4795]: E0219 23:07:10.888205 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42ff3cee-7522-42a5-8cc9-b52b30d45220" containerName="pull" Feb 19 23:07:10 crc kubenswrapper[4795]: I0219 23:07:10.888291 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="42ff3cee-7522-42a5-8cc9-b52b30d45220" containerName="pull" Feb 19 23:07:10 crc kubenswrapper[4795]: E0219 23:07:10.888604 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42ff3cee-7522-42a5-8cc9-b52b30d45220" containerName="util" Feb 19 23:07:10 crc kubenswrapper[4795]: I0219 23:07:10.888703 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="42ff3cee-7522-42a5-8cc9-b52b30d45220" containerName="util" Feb 19 23:07:10 crc kubenswrapper[4795]: E0219 23:07:10.888789 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42ff3cee-7522-42a5-8cc9-b52b30d45220" containerName="extract" Feb 19 23:07:10 crc kubenswrapper[4795]: I0219 23:07:10.888867 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="42ff3cee-7522-42a5-8cc9-b52b30d45220" containerName="extract" Feb 19 23:07:10 crc kubenswrapper[4795]: E0219 23:07:10.888964 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="353e54a0-06cb-4876-af76-78bcd1bb3a22" containerName="horizon-log" Feb 19 23:07:10 crc kubenswrapper[4795]: I0219 23:07:10.889038 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="353e54a0-06cb-4876-af76-78bcd1bb3a22" containerName="horizon-log" Feb 19 23:07:10 crc kubenswrapper[4795]: I0219 23:07:10.889802 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="42ff3cee-7522-42a5-8cc9-b52b30d45220" containerName="extract" Feb 19 23:07:10 crc kubenswrapper[4795]: I0219 23:07:10.889912 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="353e54a0-06cb-4876-af76-78bcd1bb3a22" containerName="horizon-log" Feb 19 23:07:10 
crc kubenswrapper[4795]: I0219 23:07:10.890006 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="353e54a0-06cb-4876-af76-78bcd1bb3a22" containerName="horizon" Feb 19 23:07:10 crc kubenswrapper[4795]: I0219 23:07:10.912017 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-s52kw" Feb 19 23:07:10 crc kubenswrapper[4795]: I0219 23:07:10.916301 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-4787d" Feb 19 23:07:10 crc kubenswrapper[4795]: I0219 23:07:10.918609 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Feb 19 23:07:10 crc kubenswrapper[4795]: I0219 23:07:10.918986 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Feb 19 23:07:10 crc kubenswrapper[4795]: I0219 23:07:10.920904 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-s52kw"] Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.030477 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-c6bdf87db-ssgfb"] Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.031906 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-c6bdf87db-ssgfb" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.037191 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.037650 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-tfsnj" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.048736 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-c6bdf87db-l58fd"] Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.050183 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-c6bdf87db-l58fd" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.081299 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-c6bdf87db-ssgfb"] Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.099324 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-c6bdf87db-l58fd"] Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.104955 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87vck\" (UniqueName: \"kubernetes.io/projected/0a29e309-2974-42a7-afd9-c77d17f414d0-kube-api-access-87vck\") pod \"obo-prometheus-operator-68bc856cb9-s52kw\" (UID: \"0a29e309-2974-42a7-afd9-c77d17f414d0\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-s52kw" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.186424 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-ls2vk"] Feb 
19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.188668 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-ls2vk" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.192361 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.192645 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-48rz4" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.205821 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-ls2vk"] Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.206104 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fb0e3807-a209-43ca-a245-64283a1d021f-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-c6bdf87db-ssgfb\" (UID: \"fb0e3807-a209-43ca-a245-64283a1d021f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-c6bdf87db-ssgfb" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.206371 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/658abf91-1e8b-4182-998f-76d3ed17b836-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-c6bdf87db-l58fd\" (UID: \"658abf91-1e8b-4182-998f-76d3ed17b836\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-c6bdf87db-l58fd" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.206433 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fb0e3807-a209-43ca-a245-64283a1d021f-apiservice-cert\") pod 
\"obo-prometheus-operator-admission-webhook-c6bdf87db-ssgfb\" (UID: \"fb0e3807-a209-43ca-a245-64283a1d021f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-c6bdf87db-ssgfb" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.206719 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/658abf91-1e8b-4182-998f-76d3ed17b836-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-c6bdf87db-l58fd\" (UID: \"658abf91-1e8b-4182-998f-76d3ed17b836\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-c6bdf87db-l58fd" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.206792 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87vck\" (UniqueName: \"kubernetes.io/projected/0a29e309-2974-42a7-afd9-c77d17f414d0-kube-api-access-87vck\") pod \"obo-prometheus-operator-68bc856cb9-s52kw\" (UID: \"0a29e309-2974-42a7-afd9-c77d17f414d0\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-s52kw" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.232069 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87vck\" (UniqueName: \"kubernetes.io/projected/0a29e309-2974-42a7-afd9-c77d17f414d0-kube-api-access-87vck\") pod \"obo-prometheus-operator-68bc856cb9-s52kw\" (UID: \"0a29e309-2974-42a7-afd9-c77d17f414d0\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-s52kw" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.245564 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-s52kw" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.308346 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fb0e3807-a209-43ca-a245-64283a1d021f-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-c6bdf87db-ssgfb\" (UID: \"fb0e3807-a209-43ca-a245-64283a1d021f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-c6bdf87db-ssgfb" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.308453 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/5d03bcc6-aa94-401a-9a3b-4970f64537cd-observability-operator-tls\") pod \"observability-operator-59bdc8b94-ls2vk\" (UID: \"5d03bcc6-aa94-401a-9a3b-4970f64537cd\") " pod="openshift-operators/observability-operator-59bdc8b94-ls2vk" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.308496 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/658abf91-1e8b-4182-998f-76d3ed17b836-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-c6bdf87db-l58fd\" (UID: \"658abf91-1e8b-4182-998f-76d3ed17b836\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-c6bdf87db-l58fd" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.308562 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fb0e3807-a209-43ca-a245-64283a1d021f-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-c6bdf87db-ssgfb\" (UID: \"fb0e3807-a209-43ca-a245-64283a1d021f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-c6bdf87db-ssgfb" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.308605 4795 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6vnc\" (UniqueName: \"kubernetes.io/projected/5d03bcc6-aa94-401a-9a3b-4970f64537cd-kube-api-access-w6vnc\") pod \"observability-operator-59bdc8b94-ls2vk\" (UID: \"5d03bcc6-aa94-401a-9a3b-4970f64537cd\") " pod="openshift-operators/observability-operator-59bdc8b94-ls2vk" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.308669 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/658abf91-1e8b-4182-998f-76d3ed17b836-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-c6bdf87db-l58fd\" (UID: \"658abf91-1e8b-4182-998f-76d3ed17b836\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-c6bdf87db-l58fd" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.311804 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-w6sln"] Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.315312 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fb0e3807-a209-43ca-a245-64283a1d021f-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-c6bdf87db-ssgfb\" (UID: \"fb0e3807-a209-43ca-a245-64283a1d021f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-c6bdf87db-ssgfb" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.316643 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/658abf91-1e8b-4182-998f-76d3ed17b836-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-c6bdf87db-l58fd\" (UID: \"658abf91-1e8b-4182-998f-76d3ed17b836\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-c6bdf87db-l58fd" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.317638 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fb0e3807-a209-43ca-a245-64283a1d021f-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-c6bdf87db-ssgfb\" (UID: \"fb0e3807-a209-43ca-a245-64283a1d021f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-c6bdf87db-ssgfb" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.320320 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-w6sln" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.329152 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-w6sln"] Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.331795 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/658abf91-1e8b-4182-998f-76d3ed17b836-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-c6bdf87db-l58fd\" (UID: \"658abf91-1e8b-4182-998f-76d3ed17b836\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-c6bdf87db-l58fd" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.336968 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-qztjr" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.349943 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-c6bdf87db-ssgfb" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.375306 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-c6bdf87db-l58fd" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.413981 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6vnc\" (UniqueName: \"kubernetes.io/projected/5d03bcc6-aa94-401a-9a3b-4970f64537cd-kube-api-access-w6vnc\") pod \"observability-operator-59bdc8b94-ls2vk\" (UID: \"5d03bcc6-aa94-401a-9a3b-4970f64537cd\") " pod="openshift-operators/observability-operator-59bdc8b94-ls2vk" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.414388 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/5d03bcc6-aa94-401a-9a3b-4970f64537cd-observability-operator-tls\") pod \"observability-operator-59bdc8b94-ls2vk\" (UID: \"5d03bcc6-aa94-401a-9a3b-4970f64537cd\") " pod="openshift-operators/observability-operator-59bdc8b94-ls2vk" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.423374 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/5d03bcc6-aa94-401a-9a3b-4970f64537cd-observability-operator-tls\") pod \"observability-operator-59bdc8b94-ls2vk\" (UID: \"5d03bcc6-aa94-401a-9a3b-4970f64537cd\") " pod="openshift-operators/observability-operator-59bdc8b94-ls2vk" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.456962 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6vnc\" (UniqueName: \"kubernetes.io/projected/5d03bcc6-aa94-401a-9a3b-4970f64537cd-kube-api-access-w6vnc\") pod \"observability-operator-59bdc8b94-ls2vk\" (UID: \"5d03bcc6-aa94-401a-9a3b-4970f64537cd\") " pod="openshift-operators/observability-operator-59bdc8b94-ls2vk" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.505691 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-ls2vk" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.518784 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnv7x\" (UniqueName: \"kubernetes.io/projected/e8a44bc2-38b3-4e8d-ab95-e5e6f861b13b-kube-api-access-nnv7x\") pod \"perses-operator-5bf474d74f-w6sln\" (UID: \"e8a44bc2-38b3-4e8d-ab95-e5e6f861b13b\") " pod="openshift-operators/perses-operator-5bf474d74f-w6sln" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.518968 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/e8a44bc2-38b3-4e8d-ab95-e5e6f861b13b-openshift-service-ca\") pod \"perses-operator-5bf474d74f-w6sln\" (UID: \"e8a44bc2-38b3-4e8d-ab95-e5e6f861b13b\") " pod="openshift-operators/perses-operator-5bf474d74f-w6sln" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.573472 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eda8a248-0107-4d34-a02b-6dbf30972c64" path="/var/lib/kubelet/pods/eda8a248-0107-4d34-a02b-6dbf30972c64/volumes" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.574946 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcc96fc8-80e4-4dda-af2e-91390b6af829" path="/var/lib/kubelet/pods/fcc96fc8-80e4-4dda-af2e-91390b6af829/volumes" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.642821 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnv7x\" (UniqueName: \"kubernetes.io/projected/e8a44bc2-38b3-4e8d-ab95-e5e6f861b13b-kube-api-access-nnv7x\") pod \"perses-operator-5bf474d74f-w6sln\" (UID: \"e8a44bc2-38b3-4e8d-ab95-e5e6f861b13b\") " pod="openshift-operators/perses-operator-5bf474d74f-w6sln" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.643256 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/e8a44bc2-38b3-4e8d-ab95-e5e6f861b13b-openshift-service-ca\") pod \"perses-operator-5bf474d74f-w6sln\" (UID: \"e8a44bc2-38b3-4e8d-ab95-e5e6f861b13b\") " pod="openshift-operators/perses-operator-5bf474d74f-w6sln" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.644093 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/e8a44bc2-38b3-4e8d-ab95-e5e6f861b13b-openshift-service-ca\") pod \"perses-operator-5bf474d74f-w6sln\" (UID: \"e8a44bc2-38b3-4e8d-ab95-e5e6f861b13b\") " pod="openshift-operators/perses-operator-5bf474d74f-w6sln" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.673977 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnv7x\" (UniqueName: \"kubernetes.io/projected/e8a44bc2-38b3-4e8d-ab95-e5e6f861b13b-kube-api-access-nnv7x\") pod \"perses-operator-5bf474d74f-w6sln\" (UID: \"e8a44bc2-38b3-4e8d-ab95-e5e6f861b13b\") " pod="openshift-operators/perses-operator-5bf474d74f-w6sln" Feb 19 23:07:11 crc kubenswrapper[4795]: I0219 23:07:11.801600 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-w6sln" Feb 19 23:07:12 crc kubenswrapper[4795]: I0219 23:07:12.042506 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-s52kw"] Feb 19 23:07:12 crc kubenswrapper[4795]: W0219 23:07:12.058885 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a29e309_2974_42a7_afd9_c77d17f414d0.slice/crio-8a5309c92897f41bee68a6bab50234526b7efd5a8bc86ab390db5fb079f34e38 WatchSource:0}: Error finding container 8a5309c92897f41bee68a6bab50234526b7efd5a8bc86ab390db5fb079f34e38: Status 404 returned error can't find the container with id 8a5309c92897f41bee68a6bab50234526b7efd5a8bc86ab390db5fb079f34e38 Feb 19 23:07:12 crc kubenswrapper[4795]: W0219 23:07:12.180391 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod658abf91_1e8b_4182_998f_76d3ed17b836.slice/crio-13b9004e1c9a968a77ed9603a1c5f1f794c4e77bb4ee543a655ae6d8441de5cf WatchSource:0}: Error finding container 13b9004e1c9a968a77ed9603a1c5f1f794c4e77bb4ee543a655ae6d8441de5cf: Status 404 returned error can't find the container with id 13b9004e1c9a968a77ed9603a1c5f1f794c4e77bb4ee543a655ae6d8441de5cf Feb 19 23:07:12 crc kubenswrapper[4795]: I0219 23:07:12.189106 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-c6bdf87db-ssgfb"] Feb 19 23:07:12 crc kubenswrapper[4795]: I0219 23:07:12.202565 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-c6bdf87db-l58fd"] Feb 19 23:07:12 crc kubenswrapper[4795]: I0219 23:07:12.347703 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-ls2vk"] Feb 19 23:07:12 crc kubenswrapper[4795]: W0219 
23:07:12.353738 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d03bcc6_aa94_401a_9a3b_4970f64537cd.slice/crio-61354bba44c68b07c36a9ad14176942036125815a1d4e68bdcc378739045a92c WatchSource:0}: Error finding container 61354bba44c68b07c36a9ad14176942036125815a1d4e68bdcc378739045a92c: Status 404 returned error can't find the container with id 61354bba44c68b07c36a9ad14176942036125815a1d4e68bdcc378739045a92c Feb 19 23:07:12 crc kubenswrapper[4795]: I0219 23:07:12.460323 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-w6sln"] Feb 19 23:07:12 crc kubenswrapper[4795]: I0219 23:07:12.995007 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-s52kw" event={"ID":"0a29e309-2974-42a7-afd9-c77d17f414d0","Type":"ContainerStarted","Data":"8a5309c92897f41bee68a6bab50234526b7efd5a8bc86ab390db5fb079f34e38"} Feb 19 23:07:13 crc kubenswrapper[4795]: I0219 23:07:13.012040 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-c6bdf87db-l58fd" event={"ID":"658abf91-1e8b-4182-998f-76d3ed17b836","Type":"ContainerStarted","Data":"13b9004e1c9a968a77ed9603a1c5f1f794c4e77bb4ee543a655ae6d8441de5cf"} Feb 19 23:07:13 crc kubenswrapper[4795]: I0219 23:07:13.013298 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-c6bdf87db-ssgfb" event={"ID":"fb0e3807-a209-43ca-a245-64283a1d021f","Type":"ContainerStarted","Data":"defa1ed0cd56e005c852167667d4d7f1341390b5bd35c25ff6a70d1ae1f69160"} Feb 19 23:07:13 crc kubenswrapper[4795]: I0219 23:07:13.015486 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-w6sln" 
event={"ID":"e8a44bc2-38b3-4e8d-ab95-e5e6f861b13b","Type":"ContainerStarted","Data":"a4a35f9a7530a173b815a5a8fbaf856d5a542972a107e0bf7eb3da7d799df193"} Feb 19 23:07:13 crc kubenswrapper[4795]: I0219 23:07:13.016554 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-ls2vk" event={"ID":"5d03bcc6-aa94-401a-9a3b-4970f64537cd","Type":"ContainerStarted","Data":"61354bba44c68b07c36a9ad14176942036125815a1d4e68bdcc378739045a92c"} Feb 19 23:07:14 crc kubenswrapper[4795]: I0219 23:07:14.512586 4795 scope.go:117] "RemoveContainer" containerID="18170a71e3fd2c593ee17e2961f48e8d3663518910bf7cf2c79258393902484c" Feb 19 23:07:14 crc kubenswrapper[4795]: E0219 23:07:14.513152 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:07:17 crc kubenswrapper[4795]: I0219 23:07:17.042482 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-srnhx"] Feb 19 23:07:17 crc kubenswrapper[4795]: I0219 23:07:17.050739 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-srnhx"] Feb 19 23:07:17 crc kubenswrapper[4795]: I0219 23:07:17.355387 4795 scope.go:117] "RemoveContainer" containerID="a92e1a832062a3ff2c20dc8cd15ae2f139e651b81902a3e2ee439ec6e20b2123" Feb 19 23:07:17 crc kubenswrapper[4795]: I0219 23:07:17.718701 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7286d7ba-7f8c-4f40-a18a-d29af788c344" path="/var/lib/kubelet/pods/7286d7ba-7f8c-4f40-a18a-d29af788c344/volumes" Feb 19 23:07:24 crc kubenswrapper[4795]: I0219 23:07:24.784658 4795 scope.go:117] 
"RemoveContainer" containerID="b7a9bfd4cc9f1de56acd0e84791ea820651eb87fe9629808f92a9d7ea31fba08" Feb 19 23:07:24 crc kubenswrapper[4795]: I0219 23:07:24.850400 4795 scope.go:117] "RemoveContainer" containerID="ce12984ea586896da4a3a2ca9a12c46a23d3b89ee0886c7e9ee2b6ceb73add38" Feb 19 23:07:24 crc kubenswrapper[4795]: I0219 23:07:24.965942 4795 scope.go:117] "RemoveContainer" containerID="2fce75ec54d1aa6599ee05902a3278bc712bcedfa4ab8adb5ae0a48b2003dc71" Feb 19 23:07:25 crc kubenswrapper[4795]: I0219 23:07:25.073800 4795 scope.go:117] "RemoveContainer" containerID="406b01e5656ffc8e932bd5ff0af64ce799789e55eeb816cb84ffd89f44da61ef" Feb 19 23:07:25 crc kubenswrapper[4795]: I0219 23:07:25.173581 4795 scope.go:117] "RemoveContainer" containerID="dbe34d14d5acf3b90d2e3b7465f76c19cc181a6a6717aade5162b1682beb16f4" Feb 19 23:07:26 crc kubenswrapper[4795]: I0219 23:07:26.207575 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-c6bdf87db-ssgfb" event={"ID":"fb0e3807-a209-43ca-a245-64283a1d021f","Type":"ContainerStarted","Data":"682b8f555a3cbdac5c847614e8e8f922fae5c657c652b1d0d67c7b3900ca001c"} Feb 19 23:07:26 crc kubenswrapper[4795]: I0219 23:07:26.208996 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-w6sln" event={"ID":"e8a44bc2-38b3-4e8d-ab95-e5e6f861b13b","Type":"ContainerStarted","Data":"22e157950a34b88b6beab18352898d9a491eafe4a19446a758769a2c448fa1a3"} Feb 19 23:07:26 crc kubenswrapper[4795]: I0219 23:07:26.209064 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-w6sln" Feb 19 23:07:26 crc kubenswrapper[4795]: I0219 23:07:26.210074 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-s52kw" 
event={"ID":"0a29e309-2974-42a7-afd9-c77d17f414d0","Type":"ContainerStarted","Data":"cc7d6b7a70dbc038ef0a36ea561b3ce2b71efb27a24befaf68ed450c51a65c2f"} Feb 19 23:07:26 crc kubenswrapper[4795]: I0219 23:07:26.211426 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-c6bdf87db-l58fd" event={"ID":"658abf91-1e8b-4182-998f-76d3ed17b836","Type":"ContainerStarted","Data":"54ad2b3df4352c8c6d60ccf5835f2c0cb6d61d704d51733ce8f5dbbeca69fa26"} Feb 19 23:07:26 crc kubenswrapper[4795]: I0219 23:07:26.212546 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-ls2vk" event={"ID":"5d03bcc6-aa94-401a-9a3b-4970f64537cd","Type":"ContainerStarted","Data":"5161f4538ca8f54a75b3d76362bad3b3b48c240663435771edeff5a8bdf806ad"} Feb 19 23:07:26 crc kubenswrapper[4795]: I0219 23:07:26.212895 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-ls2vk" Feb 19 23:07:26 crc kubenswrapper[4795]: I0219 23:07:26.220764 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-ls2vk" Feb 19 23:07:26 crc kubenswrapper[4795]: I0219 23:07:26.236761 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-c6bdf87db-ssgfb" podStartSLOduration=2.57395792 podStartE2EDuration="15.236740175s" podCreationTimestamp="2026-02-19 23:07:11 +0000 UTC" firstStartedPulling="2026-02-19 23:07:12.188810146 +0000 UTC m=+5943.381328010" lastFinishedPulling="2026-02-19 23:07:24.851592401 +0000 UTC m=+5956.044110265" observedRunningTime="2026-02-19 23:07:26.222009483 +0000 UTC m=+5957.414527347" watchObservedRunningTime="2026-02-19 23:07:26.236740175 +0000 UTC m=+5957.429258039" Feb 19 23:07:26 crc kubenswrapper[4795]: I0219 23:07:26.250252 4795 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-c6bdf87db-l58fd" podStartSLOduration=2.512897112 podStartE2EDuration="15.250235565s" podCreationTimestamp="2026-02-19 23:07:11 +0000 UTC" firstStartedPulling="2026-02-19 23:07:12.186927535 +0000 UTC m=+5943.379445399" lastFinishedPulling="2026-02-19 23:07:24.924265998 +0000 UTC m=+5956.116783852" observedRunningTime="2026-02-19 23:07:26.244208044 +0000 UTC m=+5957.436725908" watchObservedRunningTime="2026-02-19 23:07:26.250235565 +0000 UTC m=+5957.442753429" Feb 19 23:07:26 crc kubenswrapper[4795]: I0219 23:07:26.314939 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-w6sln" podStartSLOduration=2.92790728 podStartE2EDuration="15.314919778s" podCreationTimestamp="2026-02-19 23:07:11 +0000 UTC" firstStartedPulling="2026-02-19 23:07:12.463560896 +0000 UTC m=+5943.656078760" lastFinishedPulling="2026-02-19 23:07:24.850573394 +0000 UTC m=+5956.043091258" observedRunningTime="2026-02-19 23:07:26.301686566 +0000 UTC m=+5957.494204430" watchObservedRunningTime="2026-02-19 23:07:26.314919778 +0000 UTC m=+5957.507437642" Feb 19 23:07:26 crc kubenswrapper[4795]: I0219 23:07:26.374183 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-s52kw" podStartSLOduration=3.561444336 podStartE2EDuration="16.374153676s" podCreationTimestamp="2026-02-19 23:07:10 +0000 UTC" firstStartedPulling="2026-02-19 23:07:12.061670859 +0000 UTC m=+5943.254188723" lastFinishedPulling="2026-02-19 23:07:24.874380209 +0000 UTC m=+5956.066898063" observedRunningTime="2026-02-19 23:07:26.342479893 +0000 UTC m=+5957.534997757" watchObservedRunningTime="2026-02-19 23:07:26.374153676 +0000 UTC m=+5957.566671540" Feb 19 23:07:26 crc kubenswrapper[4795]: I0219 23:07:26.380627 4795 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-ls2vk" podStartSLOduration=2.770883836 podStartE2EDuration="15.380614999s" podCreationTimestamp="2026-02-19 23:07:11 +0000 UTC" firstStartedPulling="2026-02-19 23:07:12.355187098 +0000 UTC m=+5943.547704962" lastFinishedPulling="2026-02-19 23:07:24.964918261 +0000 UTC m=+5956.157436125" observedRunningTime="2026-02-19 23:07:26.372715748 +0000 UTC m=+5957.565233612" watchObservedRunningTime="2026-02-19 23:07:26.380614999 +0000 UTC m=+5957.573132853" Feb 19 23:07:28 crc kubenswrapper[4795]: I0219 23:07:28.511536 4795 scope.go:117] "RemoveContainer" containerID="18170a71e3fd2c593ee17e2961f48e8d3663518910bf7cf2c79258393902484c" Feb 19 23:07:29 crc kubenswrapper[4795]: I0219 23:07:29.239842 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerStarted","Data":"b0fe96e51c4e702a3b8fdcd9d997ef35d626772b563d6c998f84fa6863685a9d"} Feb 19 23:07:31 crc kubenswrapper[4795]: I0219 23:07:31.805005 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-w6sln" Feb 19 23:07:34 crc kubenswrapper[4795]: I0219 23:07:34.624265 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Feb 19 23:07:34 crc kubenswrapper[4795]: I0219 23:07:34.625903 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="54e90f84-703c-41b3-85c2-dd4ce9e3a968" containerName="openstackclient" containerID="cri-o://3e0c8bd8a5e5ff3ecf9344765c7baeaf282ae456818a7075cdf1b64ce1d085bd" gracePeriod=2 Feb 19 23:07:34 crc kubenswrapper[4795]: I0219 23:07:34.636985 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Feb 19 23:07:34 crc kubenswrapper[4795]: I0219 23:07:34.703624 4795 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 19 23:07:34 crc kubenswrapper[4795]: E0219 23:07:34.704346 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54e90f84-703c-41b3-85c2-dd4ce9e3a968" containerName="openstackclient" Feb 19 23:07:34 crc kubenswrapper[4795]: I0219 23:07:34.704363 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="54e90f84-703c-41b3-85c2-dd4ce9e3a968" containerName="openstackclient" Feb 19 23:07:34 crc kubenswrapper[4795]: I0219 23:07:34.704567 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="54e90f84-703c-41b3-85c2-dd4ce9e3a968" containerName="openstackclient" Feb 19 23:07:34 crc kubenswrapper[4795]: I0219 23:07:34.705226 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 19 23:07:34 crc kubenswrapper[4795]: I0219 23:07:34.710380 4795 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="54e90f84-703c-41b3-85c2-dd4ce9e3a968" podUID="f1d06b1e-9114-47b8-913d-86144f6314c3" Feb 19 23:07:34 crc kubenswrapper[4795]: I0219 23:07:34.737262 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 19 23:07:34 crc kubenswrapper[4795]: I0219 23:07:34.854204 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 23:07:34 crc kubenswrapper[4795]: I0219 23:07:34.856319 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 19 23:07:34 crc kubenswrapper[4795]: I0219 23:07:34.865598 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-l7j8w" Feb 19 23:07:34 crc kubenswrapper[4795]: I0219 23:07:34.866353 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f1d06b1e-9114-47b8-913d-86144f6314c3-openstack-config-secret\") pod \"openstackclient\" (UID: \"f1d06b1e-9114-47b8-913d-86144f6314c3\") " pod="openstack/openstackclient" Feb 19 23:07:34 crc kubenswrapper[4795]: I0219 23:07:34.866431 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4b94s\" (UniqueName: \"kubernetes.io/projected/f1d06b1e-9114-47b8-913d-86144f6314c3-kube-api-access-4b94s\") pod \"openstackclient\" (UID: \"f1d06b1e-9114-47b8-913d-86144f6314c3\") " pod="openstack/openstackclient" Feb 19 23:07:34 crc kubenswrapper[4795]: I0219 23:07:34.866469 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f1d06b1e-9114-47b8-913d-86144f6314c3-openstack-config\") pod \"openstackclient\" (UID: \"f1d06b1e-9114-47b8-913d-86144f6314c3\") " pod="openstack/openstackclient" Feb 19 23:07:34 crc kubenswrapper[4795]: I0219 23:07:34.875713 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 23:07:34 crc kubenswrapper[4795]: I0219 23:07:34.969419 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f1d06b1e-9114-47b8-913d-86144f6314c3-openstack-config-secret\") pod \"openstackclient\" (UID: \"f1d06b1e-9114-47b8-913d-86144f6314c3\") " pod="openstack/openstackclient" Feb 19 23:07:34 crc kubenswrapper[4795]: 
I0219 23:07:34.969473 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4b94s\" (UniqueName: \"kubernetes.io/projected/f1d06b1e-9114-47b8-913d-86144f6314c3-kube-api-access-4b94s\") pod \"openstackclient\" (UID: \"f1d06b1e-9114-47b8-913d-86144f6314c3\") " pod="openstack/openstackclient" Feb 19 23:07:34 crc kubenswrapper[4795]: I0219 23:07:34.969500 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f1d06b1e-9114-47b8-913d-86144f6314c3-openstack-config\") pod \"openstackclient\" (UID: \"f1d06b1e-9114-47b8-913d-86144f6314c3\") " pod="openstack/openstackclient" Feb 19 23:07:34 crc kubenswrapper[4795]: I0219 23:07:34.969541 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqtks\" (UniqueName: \"kubernetes.io/projected/5a2a47de-c40d-40c9-8556-ea7033a4033b-kube-api-access-zqtks\") pod \"kube-state-metrics-0\" (UID: \"5a2a47de-c40d-40c9-8556-ea7033a4033b\") " pod="openstack/kube-state-metrics-0" Feb 19 23:07:34 crc kubenswrapper[4795]: I0219 23:07:34.974111 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f1d06b1e-9114-47b8-913d-86144f6314c3-openstack-config\") pod \"openstackclient\" (UID: \"f1d06b1e-9114-47b8-913d-86144f6314c3\") " pod="openstack/openstackclient" Feb 19 23:07:34 crc kubenswrapper[4795]: I0219 23:07:34.978372 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f1d06b1e-9114-47b8-913d-86144f6314c3-openstack-config-secret\") pod \"openstackclient\" (UID: \"f1d06b1e-9114-47b8-913d-86144f6314c3\") " pod="openstack/openstackclient" Feb 19 23:07:35 crc kubenswrapper[4795]: I0219 23:07:35.035472 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4b94s\" 
(UniqueName: \"kubernetes.io/projected/f1d06b1e-9114-47b8-913d-86144f6314c3-kube-api-access-4b94s\") pod \"openstackclient\" (UID: \"f1d06b1e-9114-47b8-913d-86144f6314c3\") " pod="openstack/openstackclient" Feb 19 23:07:35 crc kubenswrapper[4795]: I0219 23:07:35.071733 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqtks\" (UniqueName: \"kubernetes.io/projected/5a2a47de-c40d-40c9-8556-ea7033a4033b-kube-api-access-zqtks\") pod \"kube-state-metrics-0\" (UID: \"5a2a47de-c40d-40c9-8556-ea7033a4033b\") " pod="openstack/kube-state-metrics-0" Feb 19 23:07:35 crc kubenswrapper[4795]: I0219 23:07:35.115961 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqtks\" (UniqueName: \"kubernetes.io/projected/5a2a47de-c40d-40c9-8556-ea7033a4033b-kube-api-access-zqtks\") pod \"kube-state-metrics-0\" (UID: \"5a2a47de-c40d-40c9-8556-ea7033a4033b\") " pod="openstack/kube-state-metrics-0" Feb 19 23:07:35 crc kubenswrapper[4795]: I0219 23:07:35.199726 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 19 23:07:35 crc kubenswrapper[4795]: I0219 23:07:35.329488 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 19 23:07:35 crc kubenswrapper[4795]: I0219 23:07:35.641640 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/alertmanager-metric-storage-0"] Feb 19 23:07:35 crc kubenswrapper[4795]: I0219 23:07:35.667621 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Feb 19 23:07:35 crc kubenswrapper[4795]: I0219 23:07:35.667713 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Feb 19 23:07:35 crc kubenswrapper[4795]: I0219 23:07:35.688688 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-cluster-tls-config" Feb 19 23:07:35 crc kubenswrapper[4795]: I0219 23:07:35.688774 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-generated" Feb 19 23:07:35 crc kubenswrapper[4795]: I0219 23:07:35.688937 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-tls-assets-0" Feb 19 23:07:35 crc kubenswrapper[4795]: I0219 23:07:35.688686 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-alertmanager-dockercfg-dlgzw" Feb 19 23:07:35 crc kubenswrapper[4795]: I0219 23:07:35.688695 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-web-config" Feb 19 23:07:35 crc kubenswrapper[4795]: I0219 23:07:35.808396 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b5b60b6d-7ecf-424d-a297-f98fae5ef0a3-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"b5b60b6d-7ecf-424d-a297-f98fae5ef0a3\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 23:07:35 crc kubenswrapper[4795]: I0219 23:07:35.808500 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/b5b60b6d-7ecf-424d-a297-f98fae5ef0a3-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"b5b60b6d-7ecf-424d-a297-f98fae5ef0a3\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 23:07:35 crc kubenswrapper[4795]: I0219 23:07:35.808599 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b5b60b6d-7ecf-424d-a297-f98fae5ef0a3-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"b5b60b6d-7ecf-424d-a297-f98fae5ef0a3\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 23:07:35 crc kubenswrapper[4795]: I0219 23:07:35.808634 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b5b60b6d-7ecf-424d-a297-f98fae5ef0a3-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"b5b60b6d-7ecf-424d-a297-f98fae5ef0a3\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 23:07:35 crc kubenswrapper[4795]: I0219 23:07:35.808656 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b5b60b6d-7ecf-424d-a297-f98fae5ef0a3-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"b5b60b6d-7ecf-424d-a297-f98fae5ef0a3\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 23:07:35 crc kubenswrapper[4795]: I0219 23:07:35.808700 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/b5b60b6d-7ecf-424d-a297-f98fae5ef0a3-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"b5b60b6d-7ecf-424d-a297-f98fae5ef0a3\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 23:07:35 crc kubenswrapper[4795]: I0219 23:07:35.808746 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mznjs\" (UniqueName: \"kubernetes.io/projected/b5b60b6d-7ecf-424d-a297-f98fae5ef0a3-kube-api-access-mznjs\") pod \"alertmanager-metric-storage-0\" (UID: \"b5b60b6d-7ecf-424d-a297-f98fae5ef0a3\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 23:07:35 crc kubenswrapper[4795]: I0219 23:07:35.910672 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/b5b60b6d-7ecf-424d-a297-f98fae5ef0a3-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"b5b60b6d-7ecf-424d-a297-f98fae5ef0a3\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 23:07:35 crc kubenswrapper[4795]: I0219 23:07:35.910742 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mznjs\" (UniqueName: \"kubernetes.io/projected/b5b60b6d-7ecf-424d-a297-f98fae5ef0a3-kube-api-access-mznjs\") pod \"alertmanager-metric-storage-0\" (UID: \"b5b60b6d-7ecf-424d-a297-f98fae5ef0a3\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 23:07:35 crc kubenswrapper[4795]: I0219 23:07:35.910785 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b5b60b6d-7ecf-424d-a297-f98fae5ef0a3-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"b5b60b6d-7ecf-424d-a297-f98fae5ef0a3\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 23:07:35 crc kubenswrapper[4795]: I0219 23:07:35.910844 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/b5b60b6d-7ecf-424d-a297-f98fae5ef0a3-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"b5b60b6d-7ecf-424d-a297-f98fae5ef0a3\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 23:07:35 crc kubenswrapper[4795]: I0219 23:07:35.910890 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b5b60b6d-7ecf-424d-a297-f98fae5ef0a3-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"b5b60b6d-7ecf-424d-a297-f98fae5ef0a3\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 23:07:35 crc kubenswrapper[4795]: I0219 23:07:35.910918 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b5b60b6d-7ecf-424d-a297-f98fae5ef0a3-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"b5b60b6d-7ecf-424d-a297-f98fae5ef0a3\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 23:07:35 crc kubenswrapper[4795]: I0219 23:07:35.910934 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b5b60b6d-7ecf-424d-a297-f98fae5ef0a3-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"b5b60b6d-7ecf-424d-a297-f98fae5ef0a3\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 23:07:35 crc kubenswrapper[4795]: I0219 23:07:35.912254 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/b5b60b6d-7ecf-424d-a297-f98fae5ef0a3-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"b5b60b6d-7ecf-424d-a297-f98fae5ef0a3\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 23:07:35 crc kubenswrapper[4795]: I0219 23:07:35.917259 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b5b60b6d-7ecf-424d-a297-f98fae5ef0a3-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"b5b60b6d-7ecf-424d-a297-f98fae5ef0a3\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 23:07:35 crc kubenswrapper[4795]: I0219 23:07:35.920544 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b5b60b6d-7ecf-424d-a297-f98fae5ef0a3-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"b5b60b6d-7ecf-424d-a297-f98fae5ef0a3\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 23:07:35 crc kubenswrapper[4795]: I0219 23:07:35.930820 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/b5b60b6d-7ecf-424d-a297-f98fae5ef0a3-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"b5b60b6d-7ecf-424d-a297-f98fae5ef0a3\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 23:07:35 crc kubenswrapper[4795]: I0219 23:07:35.933049 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b5b60b6d-7ecf-424d-a297-f98fae5ef0a3-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"b5b60b6d-7ecf-424d-a297-f98fae5ef0a3\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 23:07:35 crc kubenswrapper[4795]: I0219 23:07:35.935507 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mznjs\" (UniqueName: \"kubernetes.io/projected/b5b60b6d-7ecf-424d-a297-f98fae5ef0a3-kube-api-access-mznjs\") pod \"alertmanager-metric-storage-0\" (UID: \"b5b60b6d-7ecf-424d-a297-f98fae5ef0a3\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 23:07:35 crc kubenswrapper[4795]: I0219 23:07:35.949071 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b5b60b6d-7ecf-424d-a297-f98fae5ef0a3-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"b5b60b6d-7ecf-424d-a297-f98fae5ef0a3\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.021664 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.075600 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.277395 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.279601 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.303352 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.303884 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.303484 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.303553 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.310075 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.310190 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.315943 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.338687 4795 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"metric-storage-prometheus-dockercfg-n4gjc" Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.387738 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.387961 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5a2a47de-c40d-40c9-8556-ea7033a4033b","Type":"ContainerStarted","Data":"28e7859be678b09a2f3d8d670d0b4e0ab24a6551e502e177072a9b339370025d"} Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.408990 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.489430 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/281b5fc0-7da4-4d5a-89d4-b073b1500865-config\") pod \"prometheus-metric-storage-0\" (UID: \"281b5fc0-7da4-4d5a-89d4-b073b1500865\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.489554 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/281b5fc0-7da4-4d5a-89d4-b073b1500865-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"281b5fc0-7da4-4d5a-89d4-b073b1500865\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.489604 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/281b5fc0-7da4-4d5a-89d4-b073b1500865-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"281b5fc0-7da4-4d5a-89d4-b073b1500865\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:07:36 crc 
kubenswrapper[4795]: I0219 23:07:36.489638 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/281b5fc0-7da4-4d5a-89d4-b073b1500865-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"281b5fc0-7da4-4d5a-89d4-b073b1500865\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.489706 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/281b5fc0-7da4-4d5a-89d4-b073b1500865-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"281b5fc0-7da4-4d5a-89d4-b073b1500865\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.489753 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-50ab6ab7-e7d1-4664-a405-e36dbc6d1dc7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-50ab6ab7-e7d1-4664-a405-e36dbc6d1dc7\") pod \"prometheus-metric-storage-0\" (UID: \"281b5fc0-7da4-4d5a-89d4-b073b1500865\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.489834 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/281b5fc0-7da4-4d5a-89d4-b073b1500865-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"281b5fc0-7da4-4d5a-89d4-b073b1500865\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.489881 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/281b5fc0-7da4-4d5a-89d4-b073b1500865-prometheus-metric-storage-rulefiles-1\") pod 
\"prometheus-metric-storage-0\" (UID: \"281b5fc0-7da4-4d5a-89d4-b073b1500865\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.489949 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/281b5fc0-7da4-4d5a-89d4-b073b1500865-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"281b5fc0-7da4-4d5a-89d4-b073b1500865\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.489995 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zsf5\" (UniqueName: \"kubernetes.io/projected/281b5fc0-7da4-4d5a-89d4-b073b1500865-kube-api-access-4zsf5\") pod \"prometheus-metric-storage-0\" (UID: \"281b5fc0-7da4-4d5a-89d4-b073b1500865\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.605829 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/281b5fc0-7da4-4d5a-89d4-b073b1500865-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"281b5fc0-7da4-4d5a-89d4-b073b1500865\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.606297 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/281b5fc0-7da4-4d5a-89d4-b073b1500865-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"281b5fc0-7da4-4d5a-89d4-b073b1500865\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.606430 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/281b5fc0-7da4-4d5a-89d4-b073b1500865-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"281b5fc0-7da4-4d5a-89d4-b073b1500865\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.606523 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zsf5\" (UniqueName: \"kubernetes.io/projected/281b5fc0-7da4-4d5a-89d4-b073b1500865-kube-api-access-4zsf5\") pod \"prometheus-metric-storage-0\" (UID: \"281b5fc0-7da4-4d5a-89d4-b073b1500865\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.606636 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/281b5fc0-7da4-4d5a-89d4-b073b1500865-config\") pod \"prometheus-metric-storage-0\" (UID: \"281b5fc0-7da4-4d5a-89d4-b073b1500865\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.606748 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/281b5fc0-7da4-4d5a-89d4-b073b1500865-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"281b5fc0-7da4-4d5a-89d4-b073b1500865\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.606802 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/281b5fc0-7da4-4d5a-89d4-b073b1500865-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"281b5fc0-7da4-4d5a-89d4-b073b1500865\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.606839 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/281b5fc0-7da4-4d5a-89d4-b073b1500865-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"281b5fc0-7da4-4d5a-89d4-b073b1500865\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.606919 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/281b5fc0-7da4-4d5a-89d4-b073b1500865-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"281b5fc0-7da4-4d5a-89d4-b073b1500865\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.606980 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-50ab6ab7-e7d1-4664-a405-e36dbc6d1dc7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-50ab6ab7-e7d1-4664-a405-e36dbc6d1dc7\") pod \"prometheus-metric-storage-0\" (UID: \"281b5fc0-7da4-4d5a-89d4-b073b1500865\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.610025 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/281b5fc0-7da4-4d5a-89d4-b073b1500865-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"281b5fc0-7da4-4d5a-89d4-b073b1500865\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.614459 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/281b5fc0-7da4-4d5a-89d4-b073b1500865-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"281b5fc0-7da4-4d5a-89d4-b073b1500865\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 
23:07:36.617854 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/281b5fc0-7da4-4d5a-89d4-b073b1500865-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"281b5fc0-7da4-4d5a-89d4-b073b1500865\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.622431 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/281b5fc0-7da4-4d5a-89d4-b073b1500865-config\") pod \"prometheus-metric-storage-0\" (UID: \"281b5fc0-7da4-4d5a-89d4-b073b1500865\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.622741 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/281b5fc0-7da4-4d5a-89d4-b073b1500865-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"281b5fc0-7da4-4d5a-89d4-b073b1500865\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.632882 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/281b5fc0-7da4-4d5a-89d4-b073b1500865-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"281b5fc0-7da4-4d5a-89d4-b073b1500865\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.634213 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/281b5fc0-7da4-4d5a-89d4-b073b1500865-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"281b5fc0-7da4-4d5a-89d4-b073b1500865\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.635782 4795 csi_attacher.go:380] 
kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.635833 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-50ab6ab7-e7d1-4664-a405-e36dbc6d1dc7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-50ab6ab7-e7d1-4664-a405-e36dbc6d1dc7\") pod \"prometheus-metric-storage-0\" (UID: \"281b5fc0-7da4-4d5a-89d4-b073b1500865\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/0e98ca8a4306f45b0c207e0b22ce20c78efdfec2f28e0669d30e68d17890be1f/globalmount\"" pod="openstack/prometheus-metric-storage-0" Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.650334 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/281b5fc0-7da4-4d5a-89d4-b073b1500865-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"281b5fc0-7da4-4d5a-89d4-b073b1500865\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.651977 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zsf5\" (UniqueName: \"kubernetes.io/projected/281b5fc0-7da4-4d5a-89d4-b073b1500865-kube-api-access-4zsf5\") pod \"prometheus-metric-storage-0\" (UID: \"281b5fc0-7da4-4d5a-89d4-b073b1500865\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.686304 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-50ab6ab7-e7d1-4664-a405-e36dbc6d1dc7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-50ab6ab7-e7d1-4664-a405-e36dbc6d1dc7\") pod \"prometheus-metric-storage-0\" (UID: \"281b5fc0-7da4-4d5a-89d4-b073b1500865\") " pod="openstack/prometheus-metric-storage-0" Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.961973 4795 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/alertmanager-metric-storage-0"] Feb 19 23:07:36 crc kubenswrapper[4795]: I0219 23:07:36.991371 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 19 23:07:37 crc kubenswrapper[4795]: I0219 23:07:37.334593 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 19 23:07:37 crc kubenswrapper[4795]: I0219 23:07:37.400705 4795 generic.go:334] "Generic (PLEG): container finished" podID="54e90f84-703c-41b3-85c2-dd4ce9e3a968" containerID="3e0c8bd8a5e5ff3ecf9344765c7baeaf282ae456818a7075cdf1b64ce1d085bd" exitCode=137 Feb 19 23:07:37 crc kubenswrapper[4795]: I0219 23:07:37.400783 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 19 23:07:37 crc kubenswrapper[4795]: I0219 23:07:37.400816 4795 scope.go:117] "RemoveContainer" containerID="3e0c8bd8a5e5ff3ecf9344765c7baeaf282ae456818a7075cdf1b64ce1d085bd" Feb 19 23:07:37 crc kubenswrapper[4795]: I0219 23:07:37.402940 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"b5b60b6d-7ecf-424d-a297-f98fae5ef0a3","Type":"ContainerStarted","Data":"91039ea6f3bcf8ab62a299a40c821405c3ef84455f4d7cafb422b0ca09dbe4d0"} Feb 19 23:07:37 crc kubenswrapper[4795]: I0219 23:07:37.407961 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"f1d06b1e-9114-47b8-913d-86144f6314c3","Type":"ContainerStarted","Data":"5f9b3bfde2c60041dc2613027af3cb7a93b83c6359ae775ac8a5b59a4fbe841d"} Feb 19 23:07:37 crc kubenswrapper[4795]: I0219 23:07:37.424254 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/54e90f84-703c-41b3-85c2-dd4ce9e3a968-openstack-config\") pod \"54e90f84-703c-41b3-85c2-dd4ce9e3a968\" (UID: 
\"54e90f84-703c-41b3-85c2-dd4ce9e3a968\") " Feb 19 23:07:37 crc kubenswrapper[4795]: I0219 23:07:37.424619 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/54e90f84-703c-41b3-85c2-dd4ce9e3a968-openstack-config-secret\") pod \"54e90f84-703c-41b3-85c2-dd4ce9e3a968\" (UID: \"54e90f84-703c-41b3-85c2-dd4ce9e3a968\") " Feb 19 23:07:37 crc kubenswrapper[4795]: I0219 23:07:37.424847 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8sfjd\" (UniqueName: \"kubernetes.io/projected/54e90f84-703c-41b3-85c2-dd4ce9e3a968-kube-api-access-8sfjd\") pod \"54e90f84-703c-41b3-85c2-dd4ce9e3a968\" (UID: \"54e90f84-703c-41b3-85c2-dd4ce9e3a968\") " Feb 19 23:07:37 crc kubenswrapper[4795]: I0219 23:07:37.436338 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54e90f84-703c-41b3-85c2-dd4ce9e3a968-kube-api-access-8sfjd" (OuterVolumeSpecName: "kube-api-access-8sfjd") pod "54e90f84-703c-41b3-85c2-dd4ce9e3a968" (UID: "54e90f84-703c-41b3-85c2-dd4ce9e3a968"). InnerVolumeSpecName "kube-api-access-8sfjd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:07:37 crc kubenswrapper[4795]: I0219 23:07:37.477641 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54e90f84-703c-41b3-85c2-dd4ce9e3a968-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "54e90f84-703c-41b3-85c2-dd4ce9e3a968" (UID: "54e90f84-703c-41b3-85c2-dd4ce9e3a968"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:07:37 crc kubenswrapper[4795]: I0219 23:07:37.494715 4795 scope.go:117] "RemoveContainer" containerID="3e0c8bd8a5e5ff3ecf9344765c7baeaf282ae456818a7075cdf1b64ce1d085bd" Feb 19 23:07:37 crc kubenswrapper[4795]: E0219 23:07:37.495699 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e0c8bd8a5e5ff3ecf9344765c7baeaf282ae456818a7075cdf1b64ce1d085bd\": container with ID starting with 3e0c8bd8a5e5ff3ecf9344765c7baeaf282ae456818a7075cdf1b64ce1d085bd not found: ID does not exist" containerID="3e0c8bd8a5e5ff3ecf9344765c7baeaf282ae456818a7075cdf1b64ce1d085bd" Feb 19 23:07:37 crc kubenswrapper[4795]: I0219 23:07:37.495735 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e0c8bd8a5e5ff3ecf9344765c7baeaf282ae456818a7075cdf1b64ce1d085bd"} err="failed to get container status \"3e0c8bd8a5e5ff3ecf9344765c7baeaf282ae456818a7075cdf1b64ce1d085bd\": rpc error: code = NotFound desc = could not find container \"3e0c8bd8a5e5ff3ecf9344765c7baeaf282ae456818a7075cdf1b64ce1d085bd\": container with ID starting with 3e0c8bd8a5e5ff3ecf9344765c7baeaf282ae456818a7075cdf1b64ce1d085bd not found: ID does not exist" Feb 19 23:07:37 crc kubenswrapper[4795]: I0219 23:07:37.518817 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54e90f84-703c-41b3-85c2-dd4ce9e3a968-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "54e90f84-703c-41b3-85c2-dd4ce9e3a968" (UID: "54e90f84-703c-41b3-85c2-dd4ce9e3a968"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:07:37 crc kubenswrapper[4795]: I0219 23:07:37.524780 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54e90f84-703c-41b3-85c2-dd4ce9e3a968" path="/var/lib/kubelet/pods/54e90f84-703c-41b3-85c2-dd4ce9e3a968/volumes" Feb 19 23:07:37 crc kubenswrapper[4795]: I0219 23:07:37.530568 4795 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/54e90f84-703c-41b3-85c2-dd4ce9e3a968-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 19 23:07:37 crc kubenswrapper[4795]: I0219 23:07:37.530602 4795 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/54e90f84-703c-41b3-85c2-dd4ce9e3a968-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Feb 19 23:07:37 crc kubenswrapper[4795]: I0219 23:07:37.530613 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8sfjd\" (UniqueName: \"kubernetes.io/projected/54e90f84-703c-41b3-85c2-dd4ce9e3a968-kube-api-access-8sfjd\") on node \"crc\" DevicePath \"\"" Feb 19 23:07:37 crc kubenswrapper[4795]: I0219 23:07:37.617883 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 19 23:07:38 crc kubenswrapper[4795]: I0219 23:07:38.421262 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5a2a47de-c40d-40c9-8556-ea7033a4033b","Type":"ContainerStarted","Data":"1d898d6c38d43a61747bc511598e0c38b71f457124d422748f7fc0a5fd54852b"} Feb 19 23:07:38 crc kubenswrapper[4795]: I0219 23:07:38.421448 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 19 23:07:38 crc kubenswrapper[4795]: I0219 23:07:38.423122 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"281b5fc0-7da4-4d5a-89d4-b073b1500865","Type":"ContainerStarted","Data":"04c67ed8f0cf7f1b0e556d1df003bd0033d604fd36cf7640910b6dc261b03a92"} Feb 19 23:07:38 crc kubenswrapper[4795]: I0219 23:07:38.425335 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"f1d06b1e-9114-47b8-913d-86144f6314c3","Type":"ContainerStarted","Data":"54c3c15f47eb5e449e2d5319739718bd7136c1784750f7f835b55b200ef136a6"} Feb 19 23:07:38 crc kubenswrapper[4795]: I0219 23:07:38.443661 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=3.80503997 podStartE2EDuration="4.443642644s" podCreationTimestamp="2026-02-19 23:07:34 +0000 UTC" firstStartedPulling="2026-02-19 23:07:36.130818326 +0000 UTC m=+5967.323336190" lastFinishedPulling="2026-02-19 23:07:36.769421 +0000 UTC m=+5967.961938864" observedRunningTime="2026-02-19 23:07:38.43897002 +0000 UTC m=+5969.631487884" watchObservedRunningTime="2026-02-19 23:07:38.443642644 +0000 UTC m=+5969.636160508" Feb 19 23:07:38 crc kubenswrapper[4795]: I0219 23:07:38.461425 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=4.4614032869999996 podStartE2EDuration="4.461403287s" podCreationTimestamp="2026-02-19 23:07:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:07:38.455082989 +0000 UTC m=+5969.647600853" watchObservedRunningTime="2026-02-19 23:07:38.461403287 +0000 UTC m=+5969.653921141" Feb 19 23:07:42 crc kubenswrapper[4795]: I0219 23:07:42.468774 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"281b5fc0-7da4-4d5a-89d4-b073b1500865","Type":"ContainerStarted","Data":"95b763a5281f060aa043098421142c64a5a0e1f503f589832c897f8922788c08"} Feb 19 23:07:44 crc kubenswrapper[4795]: I0219 
23:07:44.489989 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"b5b60b6d-7ecf-424d-a297-f98fae5ef0a3","Type":"ContainerStarted","Data":"33132c914d6072c08b89eaca99297423ffe42a3393bab4b9400028bc91539b90"} Feb 19 23:07:45 crc kubenswrapper[4795]: I0219 23:07:45.208304 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 19 23:07:50 crc kubenswrapper[4795]: I0219 23:07:50.560105 4795 generic.go:334] "Generic (PLEG): container finished" podID="b5b60b6d-7ecf-424d-a297-f98fae5ef0a3" containerID="33132c914d6072c08b89eaca99297423ffe42a3393bab4b9400028bc91539b90" exitCode=0 Feb 19 23:07:50 crc kubenswrapper[4795]: I0219 23:07:50.560222 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"b5b60b6d-7ecf-424d-a297-f98fae5ef0a3","Type":"ContainerDied","Data":"33132c914d6072c08b89eaca99297423ffe42a3393bab4b9400028bc91539b90"} Feb 19 23:07:50 crc kubenswrapper[4795]: I0219 23:07:50.564383 4795 generic.go:334] "Generic (PLEG): container finished" podID="281b5fc0-7da4-4d5a-89d4-b073b1500865" containerID="95b763a5281f060aa043098421142c64a5a0e1f503f589832c897f8922788c08" exitCode=0 Feb 19 23:07:50 crc kubenswrapper[4795]: I0219 23:07:50.564475 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"281b5fc0-7da4-4d5a-89d4-b073b1500865","Type":"ContainerDied","Data":"95b763a5281f060aa043098421142c64a5a0e1f503f589832c897f8922788c08"} Feb 19 23:07:53 crc kubenswrapper[4795]: I0219 23:07:53.601053 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"b5b60b6d-7ecf-424d-a297-f98fae5ef0a3","Type":"ContainerStarted","Data":"035dced6dc1d8c609f3f4f83f0e988bba429ccd5ba721ef5396e1f2e114bdb62"} Feb 19 23:07:56 crc kubenswrapper[4795]: I0219 23:07:56.627079 4795 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"281b5fc0-7da4-4d5a-89d4-b073b1500865","Type":"ContainerStarted","Data":"460ba294f22d0be78bd36903159370630e46c54d8b61d72126070926df32cbbf"} Feb 19 23:07:56 crc kubenswrapper[4795]: I0219 23:07:56.630750 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"b5b60b6d-7ecf-424d-a297-f98fae5ef0a3","Type":"ContainerStarted","Data":"64ab6545c5dbcb6ef2aa8c68dc95e4d32067b0113778a132e51b79ca3dde9d3e"} Feb 19 23:07:56 crc kubenswrapper[4795]: I0219 23:07:56.630983 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/alertmanager-metric-storage-0" Feb 19 23:07:56 crc kubenswrapper[4795]: I0219 23:07:56.634696 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/alertmanager-metric-storage-0" Feb 19 23:07:56 crc kubenswrapper[4795]: I0219 23:07:56.654463 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/alertmanager-metric-storage-0" podStartSLOduration=5.550604469 podStartE2EDuration="21.654444062s" podCreationTimestamp="2026-02-19 23:07:35 +0000 UTC" firstStartedPulling="2026-02-19 23:07:36.994818085 +0000 UTC m=+5968.187335949" lastFinishedPulling="2026-02-19 23:07:53.098657678 +0000 UTC m=+5984.291175542" observedRunningTime="2026-02-19 23:07:56.64984637 +0000 UTC m=+5987.842364234" watchObservedRunningTime="2026-02-19 23:07:56.654444062 +0000 UTC m=+5987.846961926" Feb 19 23:07:59 crc kubenswrapper[4795]: I0219 23:07:59.662401 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"281b5fc0-7da4-4d5a-89d4-b073b1500865","Type":"ContainerStarted","Data":"c11350d4307defa8241541c8c4e814d6c6de396113561f904b3198d0aad466ed"} Feb 19 23:08:04 crc kubenswrapper[4795]: I0219 23:08:04.721122 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"281b5fc0-7da4-4d5a-89d4-b073b1500865","Type":"ContainerStarted","Data":"a464a72fbdb723c6703282b6c1410035b218194056585f1670956cbf17f81b82"} Feb 19 23:08:04 crc kubenswrapper[4795]: I0219 23:08:04.756486 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=3.517215225 podStartE2EDuration="29.756461229s" podCreationTimestamp="2026-02-19 23:07:35 +0000 UTC" firstStartedPulling="2026-02-19 23:07:37.626192085 +0000 UTC m=+5968.818709949" lastFinishedPulling="2026-02-19 23:08:03.865438089 +0000 UTC m=+5995.057955953" observedRunningTime="2026-02-19 23:08:04.746239696 +0000 UTC m=+5995.938757550" watchObservedRunningTime="2026-02-19 23:08:04.756461229 +0000 UTC m=+5995.948979113" Feb 19 23:08:06 crc kubenswrapper[4795]: I0219 23:08:06.991808 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Feb 19 23:08:06 crc kubenswrapper[4795]: I0219 23:08:06.992393 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Feb 19 23:08:06 crc kubenswrapper[4795]: I0219 23:08:06.996285 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Feb 19 23:08:07 crc kubenswrapper[4795]: I0219 23:08:07.750436 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Feb 19 23:08:08 crc kubenswrapper[4795]: I0219 23:08:08.743910 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 19 23:08:08 crc kubenswrapper[4795]: I0219 23:08:08.748346 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 23:08:08 crc kubenswrapper[4795]: I0219 23:08:08.751172 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 19 23:08:08 crc kubenswrapper[4795]: I0219 23:08:08.751180 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 19 23:08:08 crc kubenswrapper[4795]: I0219 23:08:08.767134 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 23:08:08 crc kubenswrapper[4795]: I0219 23:08:08.785263 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed07c74b-0313-43e7-a031-024966ef2734-config-data\") pod \"ceilometer-0\" (UID: \"ed07c74b-0313-43e7-a031-024966ef2734\") " pod="openstack/ceilometer-0" Feb 19 23:08:08 crc kubenswrapper[4795]: I0219 23:08:08.785397 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbt7n\" (UniqueName: \"kubernetes.io/projected/ed07c74b-0313-43e7-a031-024966ef2734-kube-api-access-hbt7n\") pod \"ceilometer-0\" (UID: \"ed07c74b-0313-43e7-a031-024966ef2734\") " pod="openstack/ceilometer-0" Feb 19 23:08:08 crc kubenswrapper[4795]: I0219 23:08:08.785481 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ed07c74b-0313-43e7-a031-024966ef2734-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ed07c74b-0313-43e7-a031-024966ef2734\") " pod="openstack/ceilometer-0" Feb 19 23:08:08 crc kubenswrapper[4795]: I0219 23:08:08.785585 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed07c74b-0313-43e7-a031-024966ef2734-scripts\") pod \"ceilometer-0\" (UID: \"ed07c74b-0313-43e7-a031-024966ef2734\") " 
pod="openstack/ceilometer-0" Feb 19 23:08:08 crc kubenswrapper[4795]: I0219 23:08:08.785611 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed07c74b-0313-43e7-a031-024966ef2734-run-httpd\") pod \"ceilometer-0\" (UID: \"ed07c74b-0313-43e7-a031-024966ef2734\") " pod="openstack/ceilometer-0" Feb 19 23:08:08 crc kubenswrapper[4795]: I0219 23:08:08.785707 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed07c74b-0313-43e7-a031-024966ef2734-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ed07c74b-0313-43e7-a031-024966ef2734\") " pod="openstack/ceilometer-0" Feb 19 23:08:08 crc kubenswrapper[4795]: I0219 23:08:08.785744 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed07c74b-0313-43e7-a031-024966ef2734-log-httpd\") pod \"ceilometer-0\" (UID: \"ed07c74b-0313-43e7-a031-024966ef2734\") " pod="openstack/ceilometer-0" Feb 19 23:08:08 crc kubenswrapper[4795]: I0219 23:08:08.887036 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed07c74b-0313-43e7-a031-024966ef2734-config-data\") pod \"ceilometer-0\" (UID: \"ed07c74b-0313-43e7-a031-024966ef2734\") " pod="openstack/ceilometer-0" Feb 19 23:08:08 crc kubenswrapper[4795]: I0219 23:08:08.887108 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbt7n\" (UniqueName: \"kubernetes.io/projected/ed07c74b-0313-43e7-a031-024966ef2734-kube-api-access-hbt7n\") pod \"ceilometer-0\" (UID: \"ed07c74b-0313-43e7-a031-024966ef2734\") " pod="openstack/ceilometer-0" Feb 19 23:08:08 crc kubenswrapper[4795]: I0219 23:08:08.887152 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ed07c74b-0313-43e7-a031-024966ef2734-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ed07c74b-0313-43e7-a031-024966ef2734\") " pod="openstack/ceilometer-0" Feb 19 23:08:08 crc kubenswrapper[4795]: I0219 23:08:08.887214 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed07c74b-0313-43e7-a031-024966ef2734-scripts\") pod \"ceilometer-0\" (UID: \"ed07c74b-0313-43e7-a031-024966ef2734\") " pod="openstack/ceilometer-0" Feb 19 23:08:08 crc kubenswrapper[4795]: I0219 23:08:08.887230 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed07c74b-0313-43e7-a031-024966ef2734-run-httpd\") pod \"ceilometer-0\" (UID: \"ed07c74b-0313-43e7-a031-024966ef2734\") " pod="openstack/ceilometer-0" Feb 19 23:08:08 crc kubenswrapper[4795]: I0219 23:08:08.887266 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed07c74b-0313-43e7-a031-024966ef2734-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ed07c74b-0313-43e7-a031-024966ef2734\") " pod="openstack/ceilometer-0" Feb 19 23:08:08 crc kubenswrapper[4795]: I0219 23:08:08.887283 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed07c74b-0313-43e7-a031-024966ef2734-log-httpd\") pod \"ceilometer-0\" (UID: \"ed07c74b-0313-43e7-a031-024966ef2734\") " pod="openstack/ceilometer-0" Feb 19 23:08:08 crc kubenswrapper[4795]: I0219 23:08:08.887698 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed07c74b-0313-43e7-a031-024966ef2734-log-httpd\") pod \"ceilometer-0\" (UID: \"ed07c74b-0313-43e7-a031-024966ef2734\") " pod="openstack/ceilometer-0" Feb 19 23:08:08 crc kubenswrapper[4795]: 
I0219 23:08:08.889025 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed07c74b-0313-43e7-a031-024966ef2734-run-httpd\") pod \"ceilometer-0\" (UID: \"ed07c74b-0313-43e7-a031-024966ef2734\") " pod="openstack/ceilometer-0" Feb 19 23:08:08 crc kubenswrapper[4795]: I0219 23:08:08.892920 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed07c74b-0313-43e7-a031-024966ef2734-scripts\") pod \"ceilometer-0\" (UID: \"ed07c74b-0313-43e7-a031-024966ef2734\") " pod="openstack/ceilometer-0" Feb 19 23:08:08 crc kubenswrapper[4795]: I0219 23:08:08.893020 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed07c74b-0313-43e7-a031-024966ef2734-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ed07c74b-0313-43e7-a031-024966ef2734\") " pod="openstack/ceilometer-0" Feb 19 23:08:08 crc kubenswrapper[4795]: I0219 23:08:08.894397 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed07c74b-0313-43e7-a031-024966ef2734-config-data\") pod \"ceilometer-0\" (UID: \"ed07c74b-0313-43e7-a031-024966ef2734\") " pod="openstack/ceilometer-0" Feb 19 23:08:08 crc kubenswrapper[4795]: I0219 23:08:08.909822 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ed07c74b-0313-43e7-a031-024966ef2734-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ed07c74b-0313-43e7-a031-024966ef2734\") " pod="openstack/ceilometer-0" Feb 19 23:08:08 crc kubenswrapper[4795]: I0219 23:08:08.917838 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbt7n\" (UniqueName: \"kubernetes.io/projected/ed07c74b-0313-43e7-a031-024966ef2734-kube-api-access-hbt7n\") pod \"ceilometer-0\" (UID: 
\"ed07c74b-0313-43e7-a031-024966ef2734\") " pod="openstack/ceilometer-0" Feb 19 23:08:09 crc kubenswrapper[4795]: I0219 23:08:09.068178 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 23:08:09 crc kubenswrapper[4795]: I0219 23:08:09.567337 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 23:08:09 crc kubenswrapper[4795]: I0219 23:08:09.774375 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed07c74b-0313-43e7-a031-024966ef2734","Type":"ContainerStarted","Data":"92493a20710869063a50df759376e4f92368ef4022d75d392f1e23b18e170ed8"} Feb 19 23:08:10 crc kubenswrapper[4795]: I0219 23:08:10.785915 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed07c74b-0313-43e7-a031-024966ef2734","Type":"ContainerStarted","Data":"34b6d357921348772dce07cbb59f4cb709626c264269c9f5fbc76d546a352213"} Feb 19 23:08:11 crc kubenswrapper[4795]: I0219 23:08:11.799830 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed07c74b-0313-43e7-a031-024966ef2734","Type":"ContainerStarted","Data":"793df9462bcfd2f1844bb37dfd196c29bbd8324d8e35261f3d76e625871728fa"} Feb 19 23:08:12 crc kubenswrapper[4795]: I0219 23:08:12.811510 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed07c74b-0313-43e7-a031-024966ef2734","Type":"ContainerStarted","Data":"cb6a9b96dedb3f21e3e969a05bf643085cb53bcccf6ec35d67ad3e864971ea31"} Feb 19 23:08:14 crc kubenswrapper[4795]: I0219 23:08:14.837451 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed07c74b-0313-43e7-a031-024966ef2734","Type":"ContainerStarted","Data":"887744f5aee90def91c51a51f9bae27976967caa200c8c26baf9917953c0a305"} Feb 19 23:08:14 crc kubenswrapper[4795]: I0219 23:08:14.838390 4795 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/ceilometer-0" Feb 19 23:08:14 crc kubenswrapper[4795]: I0219 23:08:14.868663 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.262240424 podStartE2EDuration="6.86863768s" podCreationTimestamp="2026-02-19 23:08:08 +0000 UTC" firstStartedPulling="2026-02-19 23:08:09.578188649 +0000 UTC m=+6000.770706513" lastFinishedPulling="2026-02-19 23:08:14.184585905 +0000 UTC m=+6005.377103769" observedRunningTime="2026-02-19 23:08:14.858357826 +0000 UTC m=+6006.050875690" watchObservedRunningTime="2026-02-19 23:08:14.86863768 +0000 UTC m=+6006.061155554" Feb 19 23:08:17 crc kubenswrapper[4795]: I0219 23:08:17.047681 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-qp45n"] Feb 19 23:08:17 crc kubenswrapper[4795]: I0219 23:08:17.056670 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-c94f-account-create-update-rqkwd"] Feb 19 23:08:17 crc kubenswrapper[4795]: I0219 23:08:17.065779 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-x75sd"] Feb 19 23:08:17 crc kubenswrapper[4795]: I0219 23:08:17.074982 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-c94f-account-create-update-rqkwd"] Feb 19 23:08:17 crc kubenswrapper[4795]: I0219 23:08:17.084236 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-wj996"] Feb 19 23:08:17 crc kubenswrapper[4795]: I0219 23:08:17.093268 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-qp45n"] Feb 19 23:08:17 crc kubenswrapper[4795]: I0219 23:08:17.102208 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-x75sd"] Feb 19 23:08:17 crc kubenswrapper[4795]: I0219 23:08:17.110869 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-wj996"] Feb 19 
23:08:17 crc kubenswrapper[4795]: I0219 23:08:17.538510 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41276d39-878a-4ed2-879b-2a053340874e" path="/var/lib/kubelet/pods/41276d39-878a-4ed2-879b-2a053340874e/volumes" Feb 19 23:08:17 crc kubenswrapper[4795]: I0219 23:08:17.539926 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f811fa4-8fb3-4adc-a9a8-6539dc03494c" path="/var/lib/kubelet/pods/6f811fa4-8fb3-4adc-a9a8-6539dc03494c/volumes" Feb 19 23:08:17 crc kubenswrapper[4795]: I0219 23:08:17.541581 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd4e5010-15f4-499e-8279-9a1b814b5490" path="/var/lib/kubelet/pods/bd4e5010-15f4-499e-8279-9a1b814b5490/volumes" Feb 19 23:08:17 crc kubenswrapper[4795]: I0219 23:08:17.542885 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c14f4993-80e4-4fbf-a719-22f17750811b" path="/var/lib/kubelet/pods/c14f4993-80e4-4fbf-a719-22f17750811b/volumes" Feb 19 23:08:18 crc kubenswrapper[4795]: I0219 23:08:18.035071 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-72a3-account-create-update-bfhbs"] Feb 19 23:08:18 crc kubenswrapper[4795]: I0219 23:08:18.048131 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-93a1-account-create-update-fsvms"] Feb 19 23:08:18 crc kubenswrapper[4795]: I0219 23:08:18.062338 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-93a1-account-create-update-fsvms"] Feb 19 23:08:18 crc kubenswrapper[4795]: I0219 23:08:18.077201 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-72a3-account-create-update-bfhbs"] Feb 19 23:08:19 crc kubenswrapper[4795]: I0219 23:08:19.528082 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32478c4a-a97f-4fd3-84f0-a3c221beefe9" path="/var/lib/kubelet/pods/32478c4a-a97f-4fd3-84f0-a3c221beefe9/volumes" Feb 19 23:08:19 crc kubenswrapper[4795]: 
I0219 23:08:19.528959 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6f38e11-ea05-447d-8564-117c0f589d88" path="/var/lib/kubelet/pods/b6f38e11-ea05-447d-8564-117c0f589d88/volumes" Feb 19 23:08:20 crc kubenswrapper[4795]: I0219 23:08:20.985958 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-jll2l"] Feb 19 23:08:20 crc kubenswrapper[4795]: I0219 23:08:20.989318 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-jll2l" Feb 19 23:08:20 crc kubenswrapper[4795]: I0219 23:08:20.994338 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-jll2l"] Feb 19 23:08:21 crc kubenswrapper[4795]: I0219 23:08:21.101972 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-aa86-account-create-update-n8xdq"] Feb 19 23:08:21 crc kubenswrapper[4795]: I0219 23:08:21.103682 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-aa86-account-create-update-n8xdq" Feb 19 23:08:21 crc kubenswrapper[4795]: I0219 23:08:21.106667 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret" Feb 19 23:08:21 crc kubenswrapper[4795]: I0219 23:08:21.119069 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-aa86-account-create-update-n8xdq"] Feb 19 23:08:21 crc kubenswrapper[4795]: I0219 23:08:21.133947 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8p4w\" (UniqueName: \"kubernetes.io/projected/d3ad8082-2f7c-4c51-ac6d-f6121f30d0c8-kube-api-access-z8p4w\") pod \"aodh-db-create-jll2l\" (UID: \"d3ad8082-2f7c-4c51-ac6d-f6121f30d0c8\") " pod="openstack/aodh-db-create-jll2l" Feb 19 23:08:21 crc kubenswrapper[4795]: I0219 23:08:21.134128 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/d3ad8082-2f7c-4c51-ac6d-f6121f30d0c8-operator-scripts\") pod \"aodh-db-create-jll2l\" (UID: \"d3ad8082-2f7c-4c51-ac6d-f6121f30d0c8\") " pod="openstack/aodh-db-create-jll2l" Feb 19 23:08:21 crc kubenswrapper[4795]: I0219 23:08:21.239374 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3ad8082-2f7c-4c51-ac6d-f6121f30d0c8-operator-scripts\") pod \"aodh-db-create-jll2l\" (UID: \"d3ad8082-2f7c-4c51-ac6d-f6121f30d0c8\") " pod="openstack/aodh-db-create-jll2l" Feb 19 23:08:21 crc kubenswrapper[4795]: I0219 23:08:21.239548 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8p4w\" (UniqueName: \"kubernetes.io/projected/d3ad8082-2f7c-4c51-ac6d-f6121f30d0c8-kube-api-access-z8p4w\") pod \"aodh-db-create-jll2l\" (UID: \"d3ad8082-2f7c-4c51-ac6d-f6121f30d0c8\") " pod="openstack/aodh-db-create-jll2l" Feb 19 23:08:21 crc kubenswrapper[4795]: I0219 23:08:21.239594 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3086733-54e4-4041-9896-88f6df519492-operator-scripts\") pod \"aodh-aa86-account-create-update-n8xdq\" (UID: \"d3086733-54e4-4041-9896-88f6df519492\") " pod="openstack/aodh-aa86-account-create-update-n8xdq" Feb 19 23:08:21 crc kubenswrapper[4795]: I0219 23:08:21.239656 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mpb5\" (UniqueName: \"kubernetes.io/projected/d3086733-54e4-4041-9896-88f6df519492-kube-api-access-6mpb5\") pod \"aodh-aa86-account-create-update-n8xdq\" (UID: \"d3086733-54e4-4041-9896-88f6df519492\") " pod="openstack/aodh-aa86-account-create-update-n8xdq" Feb 19 23:08:21 crc kubenswrapper[4795]: I0219 23:08:21.240137 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/d3ad8082-2f7c-4c51-ac6d-f6121f30d0c8-operator-scripts\") pod \"aodh-db-create-jll2l\" (UID: \"d3ad8082-2f7c-4c51-ac6d-f6121f30d0c8\") " pod="openstack/aodh-db-create-jll2l" Feb 19 23:08:21 crc kubenswrapper[4795]: I0219 23:08:21.265440 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8p4w\" (UniqueName: \"kubernetes.io/projected/d3ad8082-2f7c-4c51-ac6d-f6121f30d0c8-kube-api-access-z8p4w\") pod \"aodh-db-create-jll2l\" (UID: \"d3ad8082-2f7c-4c51-ac6d-f6121f30d0c8\") " pod="openstack/aodh-db-create-jll2l" Feb 19 23:08:21 crc kubenswrapper[4795]: I0219 23:08:21.341347 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3086733-54e4-4041-9896-88f6df519492-operator-scripts\") pod \"aodh-aa86-account-create-update-n8xdq\" (UID: \"d3086733-54e4-4041-9896-88f6df519492\") " pod="openstack/aodh-aa86-account-create-update-n8xdq" Feb 19 23:08:21 crc kubenswrapper[4795]: I0219 23:08:21.341428 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mpb5\" (UniqueName: \"kubernetes.io/projected/d3086733-54e4-4041-9896-88f6df519492-kube-api-access-6mpb5\") pod \"aodh-aa86-account-create-update-n8xdq\" (UID: \"d3086733-54e4-4041-9896-88f6df519492\") " pod="openstack/aodh-aa86-account-create-update-n8xdq" Feb 19 23:08:21 crc kubenswrapper[4795]: I0219 23:08:21.342451 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3086733-54e4-4041-9896-88f6df519492-operator-scripts\") pod \"aodh-aa86-account-create-update-n8xdq\" (UID: \"d3086733-54e4-4041-9896-88f6df519492\") " pod="openstack/aodh-aa86-account-create-update-n8xdq" Feb 19 23:08:21 crc kubenswrapper[4795]: I0219 23:08:21.356891 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-jll2l" Feb 19 23:08:21 crc kubenswrapper[4795]: I0219 23:08:21.364210 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mpb5\" (UniqueName: \"kubernetes.io/projected/d3086733-54e4-4041-9896-88f6df519492-kube-api-access-6mpb5\") pod \"aodh-aa86-account-create-update-n8xdq\" (UID: \"d3086733-54e4-4041-9896-88f6df519492\") " pod="openstack/aodh-aa86-account-create-update-n8xdq" Feb 19 23:08:21 crc kubenswrapper[4795]: I0219 23:08:21.422122 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-aa86-account-create-update-n8xdq" Feb 19 23:08:21 crc kubenswrapper[4795]: W0219 23:08:21.904845 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3ad8082_2f7c_4c51_ac6d_f6121f30d0c8.slice/crio-ea3c9853658bd62f59c739bcb14432fcbfeb30913ffcdb5f1d8e595caecd4713 WatchSource:0}: Error finding container ea3c9853658bd62f59c739bcb14432fcbfeb30913ffcdb5f1d8e595caecd4713: Status 404 returned error can't find the container with id ea3c9853658bd62f59c739bcb14432fcbfeb30913ffcdb5f1d8e595caecd4713 Feb 19 23:08:21 crc kubenswrapper[4795]: I0219 23:08:21.906786 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-jll2l"] Feb 19 23:08:22 crc kubenswrapper[4795]: I0219 23:08:22.017298 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-aa86-account-create-update-n8xdq"] Feb 19 23:08:22 crc kubenswrapper[4795]: W0219 23:08:22.018027 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3086733_54e4_4041_9896_88f6df519492.slice/crio-1092d0b418e78e9ed7e6f5759bd83db0728ea07945e94a26a13e157313e6ba59 WatchSource:0}: Error finding container 1092d0b418e78e9ed7e6f5759bd83db0728ea07945e94a26a13e157313e6ba59: Status 404 returned error can't find the 
container with id 1092d0b418e78e9ed7e6f5759bd83db0728ea07945e94a26a13e157313e6ba59 Feb 19 23:08:22 crc kubenswrapper[4795]: I0219 23:08:22.929839 4795 generic.go:334] "Generic (PLEG): container finished" podID="d3086733-54e4-4041-9896-88f6df519492" containerID="49e34c186890a85782e2cf1f05ffa268eb8918522484c1ad1e090793c43863fa" exitCode=0 Feb 19 23:08:22 crc kubenswrapper[4795]: I0219 23:08:22.929949 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-aa86-account-create-update-n8xdq" event={"ID":"d3086733-54e4-4041-9896-88f6df519492","Type":"ContainerDied","Data":"49e34c186890a85782e2cf1f05ffa268eb8918522484c1ad1e090793c43863fa"} Feb 19 23:08:22 crc kubenswrapper[4795]: I0219 23:08:22.930211 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-aa86-account-create-update-n8xdq" event={"ID":"d3086733-54e4-4041-9896-88f6df519492","Type":"ContainerStarted","Data":"1092d0b418e78e9ed7e6f5759bd83db0728ea07945e94a26a13e157313e6ba59"} Feb 19 23:08:22 crc kubenswrapper[4795]: I0219 23:08:22.931777 4795 generic.go:334] "Generic (PLEG): container finished" podID="d3ad8082-2f7c-4c51-ac6d-f6121f30d0c8" containerID="6653ee424e57053f6b0308afd986889a06d2949741f5136682c4b14068b5b724" exitCode=0 Feb 19 23:08:22 crc kubenswrapper[4795]: I0219 23:08:22.931837 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-jll2l" event={"ID":"d3ad8082-2f7c-4c51-ac6d-f6121f30d0c8","Type":"ContainerDied","Data":"6653ee424e57053f6b0308afd986889a06d2949741f5136682c4b14068b5b724"} Feb 19 23:08:22 crc kubenswrapper[4795]: I0219 23:08:22.931864 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-jll2l" event={"ID":"d3ad8082-2f7c-4c51-ac6d-f6121f30d0c8","Type":"ContainerStarted","Data":"ea3c9853658bd62f59c739bcb14432fcbfeb30913ffcdb5f1d8e595caecd4713"} Feb 19 23:08:24 crc kubenswrapper[4795]: I0219 23:08:24.353513 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-aa86-account-create-update-n8xdq" Feb 19 23:08:24 crc kubenswrapper[4795]: I0219 23:08:24.360039 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-jll2l" Feb 19 23:08:24 crc kubenswrapper[4795]: I0219 23:08:24.408857 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3086733-54e4-4041-9896-88f6df519492-operator-scripts\") pod \"d3086733-54e4-4041-9896-88f6df519492\" (UID: \"d3086733-54e4-4041-9896-88f6df519492\") " Feb 19 23:08:24 crc kubenswrapper[4795]: I0219 23:08:24.409018 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mpb5\" (UniqueName: \"kubernetes.io/projected/d3086733-54e4-4041-9896-88f6df519492-kube-api-access-6mpb5\") pod \"d3086733-54e4-4041-9896-88f6df519492\" (UID: \"d3086733-54e4-4041-9896-88f6df519492\") " Feb 19 23:08:24 crc kubenswrapper[4795]: I0219 23:08:24.409435 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3086733-54e4-4041-9896-88f6df519492-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d3086733-54e4-4041-9896-88f6df519492" (UID: "d3086733-54e4-4041-9896-88f6df519492"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:08:24 crc kubenswrapper[4795]: I0219 23:08:24.409908 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3086733-54e4-4041-9896-88f6df519492-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:08:24 crc kubenswrapper[4795]: I0219 23:08:24.417395 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3086733-54e4-4041-9896-88f6df519492-kube-api-access-6mpb5" (OuterVolumeSpecName: "kube-api-access-6mpb5") pod "d3086733-54e4-4041-9896-88f6df519492" (UID: "d3086733-54e4-4041-9896-88f6df519492"). InnerVolumeSpecName "kube-api-access-6mpb5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:08:24 crc kubenswrapper[4795]: I0219 23:08:24.511750 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3ad8082-2f7c-4c51-ac6d-f6121f30d0c8-operator-scripts\") pod \"d3ad8082-2f7c-4c51-ac6d-f6121f30d0c8\" (UID: \"d3ad8082-2f7c-4c51-ac6d-f6121f30d0c8\") " Feb 19 23:08:24 crc kubenswrapper[4795]: I0219 23:08:24.511893 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8p4w\" (UniqueName: \"kubernetes.io/projected/d3ad8082-2f7c-4c51-ac6d-f6121f30d0c8-kube-api-access-z8p4w\") pod \"d3ad8082-2f7c-4c51-ac6d-f6121f30d0c8\" (UID: \"d3ad8082-2f7c-4c51-ac6d-f6121f30d0c8\") " Feb 19 23:08:24 crc kubenswrapper[4795]: I0219 23:08:24.512549 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3ad8082-2f7c-4c51-ac6d-f6121f30d0c8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d3ad8082-2f7c-4c51-ac6d-f6121f30d0c8" (UID: "d3ad8082-2f7c-4c51-ac6d-f6121f30d0c8"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:08:24 crc kubenswrapper[4795]: I0219 23:08:24.512816 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mpb5\" (UniqueName: \"kubernetes.io/projected/d3086733-54e4-4041-9896-88f6df519492-kube-api-access-6mpb5\") on node \"crc\" DevicePath \"\"" Feb 19 23:08:24 crc kubenswrapper[4795]: I0219 23:08:24.512835 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3ad8082-2f7c-4c51-ac6d-f6121f30d0c8-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:08:24 crc kubenswrapper[4795]: I0219 23:08:24.515051 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3ad8082-2f7c-4c51-ac6d-f6121f30d0c8-kube-api-access-z8p4w" (OuterVolumeSpecName: "kube-api-access-z8p4w") pod "d3ad8082-2f7c-4c51-ac6d-f6121f30d0c8" (UID: "d3ad8082-2f7c-4c51-ac6d-f6121f30d0c8"). InnerVolumeSpecName "kube-api-access-z8p4w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:08:24 crc kubenswrapper[4795]: I0219 23:08:24.614501 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8p4w\" (UniqueName: \"kubernetes.io/projected/d3ad8082-2f7c-4c51-ac6d-f6121f30d0c8-kube-api-access-z8p4w\") on node \"crc\" DevicePath \"\"" Feb 19 23:08:24 crc kubenswrapper[4795]: I0219 23:08:24.958787 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-aa86-account-create-update-n8xdq" event={"ID":"d3086733-54e4-4041-9896-88f6df519492","Type":"ContainerDied","Data":"1092d0b418e78e9ed7e6f5759bd83db0728ea07945e94a26a13e157313e6ba59"} Feb 19 23:08:24 crc kubenswrapper[4795]: I0219 23:08:24.958826 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1092d0b418e78e9ed7e6f5759bd83db0728ea07945e94a26a13e157313e6ba59" Feb 19 23:08:24 crc kubenswrapper[4795]: I0219 23:08:24.959122 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-aa86-account-create-update-n8xdq" Feb 19 23:08:24 crc kubenswrapper[4795]: I0219 23:08:24.960248 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-jll2l" event={"ID":"d3ad8082-2f7c-4c51-ac6d-f6121f30d0c8","Type":"ContainerDied","Data":"ea3c9853658bd62f59c739bcb14432fcbfeb30913ffcdb5f1d8e595caecd4713"} Feb 19 23:08:24 crc kubenswrapper[4795]: I0219 23:08:24.960268 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea3c9853658bd62f59c739bcb14432fcbfeb30913ffcdb5f1d8e595caecd4713" Feb 19 23:08:24 crc kubenswrapper[4795]: I0219 23:08:24.960395 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-jll2l" Feb 19 23:08:25 crc kubenswrapper[4795]: I0219 23:08:25.440728 4795 scope.go:117] "RemoveContainer" containerID="2ef66b4d7c2d7c4390b233ae4586dc431786dee8c34b27c0308b9e8e392e9643" Feb 19 23:08:25 crc kubenswrapper[4795]: I0219 23:08:25.466836 4795 scope.go:117] "RemoveContainer" containerID="4ee5bc336806db9668bb12f08ece22f6cb10a6fe341f3f2abd0e9309ef59d764" Feb 19 23:08:25 crc kubenswrapper[4795]: I0219 23:08:25.530099 4795 scope.go:117] "RemoveContainer" containerID="886bef23bb00ea44b033b4812e88d91f6efe4ec5933a69343ea4b9bc4fed7503" Feb 19 23:08:25 crc kubenswrapper[4795]: I0219 23:08:25.579263 4795 scope.go:117] "RemoveContainer" containerID="00e59b7398cdff96aaf8a625c69d45b06909964fc853d0d0c0e9c6a12321c27b" Feb 19 23:08:25 crc kubenswrapper[4795]: I0219 23:08:25.618825 4795 scope.go:117] "RemoveContainer" containerID="aa245a9fc5b7b00ccfbfeb6940d75331800d0a653cedd61f24d6b1162fb6f41d" Feb 19 23:08:25 crc kubenswrapper[4795]: I0219 23:08:25.666045 4795 scope.go:117] "RemoveContainer" containerID="0a09a6dbfb16ba30386d5f0a4f52128dc79107a79561ffd233f9f60e4e73100d" Feb 19 23:08:26 crc kubenswrapper[4795]: I0219 23:08:26.590738 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-q2lkk"] Feb 19 23:08:26 crc kubenswrapper[4795]: E0219 23:08:26.591664 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3ad8082-2f7c-4c51-ac6d-f6121f30d0c8" containerName="mariadb-database-create" Feb 19 23:08:26 crc kubenswrapper[4795]: I0219 23:08:26.591680 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3ad8082-2f7c-4c51-ac6d-f6121f30d0c8" containerName="mariadb-database-create" Feb 19 23:08:26 crc kubenswrapper[4795]: E0219 23:08:26.591701 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3086733-54e4-4041-9896-88f6df519492" containerName="mariadb-account-create-update" Feb 19 23:08:26 crc kubenswrapper[4795]: I0219 23:08:26.591709 4795 
state_mem.go:107] "Deleted CPUSet assignment" podUID="d3086733-54e4-4041-9896-88f6df519492" containerName="mariadb-account-create-update" Feb 19 23:08:26 crc kubenswrapper[4795]: I0219 23:08:26.591946 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3086733-54e4-4041-9896-88f6df519492" containerName="mariadb-account-create-update" Feb 19 23:08:26 crc kubenswrapper[4795]: I0219 23:08:26.591978 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3ad8082-2f7c-4c51-ac6d-f6121f30d0c8" containerName="mariadb-database-create" Feb 19 23:08:26 crc kubenswrapper[4795]: I0219 23:08:26.593071 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-q2lkk" Feb 19 23:08:26 crc kubenswrapper[4795]: I0219 23:08:26.595091 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Feb 19 23:08:26 crc kubenswrapper[4795]: I0219 23:08:26.595348 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 19 23:08:26 crc kubenswrapper[4795]: I0219 23:08:26.595547 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Feb 19 23:08:26 crc kubenswrapper[4795]: I0219 23:08:26.595863 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-7c4hw" Feb 19 23:08:26 crc kubenswrapper[4795]: I0219 23:08:26.603501 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-q2lkk"] Feb 19 23:08:26 crc kubenswrapper[4795]: I0219 23:08:26.683718 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4rc2\" (UniqueName: \"kubernetes.io/projected/e1953fbb-b558-497f-b889-62b41f35e4b4-kube-api-access-t4rc2\") pod \"aodh-db-sync-q2lkk\" (UID: \"e1953fbb-b558-497f-b889-62b41f35e4b4\") " pod="openstack/aodh-db-sync-q2lkk" Feb 19 23:08:26 crc kubenswrapper[4795]: I0219 
23:08:26.683802 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1953fbb-b558-497f-b889-62b41f35e4b4-scripts\") pod \"aodh-db-sync-q2lkk\" (UID: \"e1953fbb-b558-497f-b889-62b41f35e4b4\") " pod="openstack/aodh-db-sync-q2lkk" Feb 19 23:08:26 crc kubenswrapper[4795]: I0219 23:08:26.683835 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1953fbb-b558-497f-b889-62b41f35e4b4-config-data\") pod \"aodh-db-sync-q2lkk\" (UID: \"e1953fbb-b558-497f-b889-62b41f35e4b4\") " pod="openstack/aodh-db-sync-q2lkk" Feb 19 23:08:26 crc kubenswrapper[4795]: I0219 23:08:26.683971 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1953fbb-b558-497f-b889-62b41f35e4b4-combined-ca-bundle\") pod \"aodh-db-sync-q2lkk\" (UID: \"e1953fbb-b558-497f-b889-62b41f35e4b4\") " pod="openstack/aodh-db-sync-q2lkk" Feb 19 23:08:26 crc kubenswrapper[4795]: I0219 23:08:26.786056 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1953fbb-b558-497f-b889-62b41f35e4b4-combined-ca-bundle\") pod \"aodh-db-sync-q2lkk\" (UID: \"e1953fbb-b558-497f-b889-62b41f35e4b4\") " pod="openstack/aodh-db-sync-q2lkk" Feb 19 23:08:26 crc kubenswrapper[4795]: I0219 23:08:26.786193 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4rc2\" (UniqueName: \"kubernetes.io/projected/e1953fbb-b558-497f-b889-62b41f35e4b4-kube-api-access-t4rc2\") pod \"aodh-db-sync-q2lkk\" (UID: \"e1953fbb-b558-497f-b889-62b41f35e4b4\") " pod="openstack/aodh-db-sync-q2lkk" Feb 19 23:08:26 crc kubenswrapper[4795]: I0219 23:08:26.786277 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/e1953fbb-b558-497f-b889-62b41f35e4b4-scripts\") pod \"aodh-db-sync-q2lkk\" (UID: \"e1953fbb-b558-497f-b889-62b41f35e4b4\") " pod="openstack/aodh-db-sync-q2lkk" Feb 19 23:08:26 crc kubenswrapper[4795]: I0219 23:08:26.786317 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1953fbb-b558-497f-b889-62b41f35e4b4-config-data\") pod \"aodh-db-sync-q2lkk\" (UID: \"e1953fbb-b558-497f-b889-62b41f35e4b4\") " pod="openstack/aodh-db-sync-q2lkk" Feb 19 23:08:26 crc kubenswrapper[4795]: I0219 23:08:26.792658 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1953fbb-b558-497f-b889-62b41f35e4b4-combined-ca-bundle\") pod \"aodh-db-sync-q2lkk\" (UID: \"e1953fbb-b558-497f-b889-62b41f35e4b4\") " pod="openstack/aodh-db-sync-q2lkk" Feb 19 23:08:26 crc kubenswrapper[4795]: I0219 23:08:26.792985 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1953fbb-b558-497f-b889-62b41f35e4b4-config-data\") pod \"aodh-db-sync-q2lkk\" (UID: \"e1953fbb-b558-497f-b889-62b41f35e4b4\") " pod="openstack/aodh-db-sync-q2lkk" Feb 19 23:08:26 crc kubenswrapper[4795]: I0219 23:08:26.793282 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1953fbb-b558-497f-b889-62b41f35e4b4-scripts\") pod \"aodh-db-sync-q2lkk\" (UID: \"e1953fbb-b558-497f-b889-62b41f35e4b4\") " pod="openstack/aodh-db-sync-q2lkk" Feb 19 23:08:26 crc kubenswrapper[4795]: I0219 23:08:26.803111 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4rc2\" (UniqueName: \"kubernetes.io/projected/e1953fbb-b558-497f-b889-62b41f35e4b4-kube-api-access-t4rc2\") pod \"aodh-db-sync-q2lkk\" (UID: \"e1953fbb-b558-497f-b889-62b41f35e4b4\") " 
pod="openstack/aodh-db-sync-q2lkk" Feb 19 23:08:26 crc kubenswrapper[4795]: I0219 23:08:26.926810 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-q2lkk" Feb 19 23:08:27 crc kubenswrapper[4795]: I0219 23:08:27.047949 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-9nmbd"] Feb 19 23:08:27 crc kubenswrapper[4795]: I0219 23:08:27.059948 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-9nmbd"] Feb 19 23:08:27 crc kubenswrapper[4795]: I0219 23:08:27.400881 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-q2lkk"] Feb 19 23:08:27 crc kubenswrapper[4795]: W0219 23:08:27.403205 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode1953fbb_b558_497f_b889_62b41f35e4b4.slice/crio-dfc73f4cc52325e3cfd8fb3d09922cc6d90931e71284877d05ba85bec5c67e8b WatchSource:0}: Error finding container dfc73f4cc52325e3cfd8fb3d09922cc6d90931e71284877d05ba85bec5c67e8b: Status 404 returned error can't find the container with id dfc73f4cc52325e3cfd8fb3d09922cc6d90931e71284877d05ba85bec5c67e8b Feb 19 23:08:27 crc kubenswrapper[4795]: I0219 23:08:27.523452 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="303b3e4f-4b2b-4071-b54d-fe4aec3f18f5" path="/var/lib/kubelet/pods/303b3e4f-4b2b-4071-b54d-fe4aec3f18f5/volumes" Feb 19 23:08:27 crc kubenswrapper[4795]: I0219 23:08:27.997850 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-q2lkk" event={"ID":"e1953fbb-b558-497f-b889-62b41f35e4b4","Type":"ContainerStarted","Data":"dfc73f4cc52325e3cfd8fb3d09922cc6d90931e71284877d05ba85bec5c67e8b"} Feb 19 23:08:31 crc kubenswrapper[4795]: I0219 23:08:31.039235 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-q2lkk" 
event={"ID":"e1953fbb-b558-497f-b889-62b41f35e4b4","Type":"ContainerStarted","Data":"47bda409c5d13cad14cf54811039c8544c7b3358bb1de45385c9a60e61a75704"} Feb 19 23:08:31 crc kubenswrapper[4795]: I0219 23:08:31.055731 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-q2lkk" podStartSLOduration=1.680619766 podStartE2EDuration="5.055715838s" podCreationTimestamp="2026-02-19 23:08:26 +0000 UTC" firstStartedPulling="2026-02-19 23:08:27.405362644 +0000 UTC m=+6018.597880508" lastFinishedPulling="2026-02-19 23:08:30.780458706 +0000 UTC m=+6021.972976580" observedRunningTime="2026-02-19 23:08:31.051622609 +0000 UTC m=+6022.244140463" watchObservedRunningTime="2026-02-19 23:08:31.055715838 +0000 UTC m=+6022.248233702" Feb 19 23:08:33 crc kubenswrapper[4795]: I0219 23:08:33.061216 4795 generic.go:334] "Generic (PLEG): container finished" podID="e1953fbb-b558-497f-b889-62b41f35e4b4" containerID="47bda409c5d13cad14cf54811039c8544c7b3358bb1de45385c9a60e61a75704" exitCode=0 Feb 19 23:08:33 crc kubenswrapper[4795]: I0219 23:08:33.061326 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-q2lkk" event={"ID":"e1953fbb-b558-497f-b889-62b41f35e4b4","Type":"ContainerDied","Data":"47bda409c5d13cad14cf54811039c8544c7b3358bb1de45385c9a60e61a75704"} Feb 19 23:08:34 crc kubenswrapper[4795]: I0219 23:08:34.485151 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-q2lkk" Feb 19 23:08:34 crc kubenswrapper[4795]: I0219 23:08:34.554974 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1953fbb-b558-497f-b889-62b41f35e4b4-config-data\") pod \"e1953fbb-b558-497f-b889-62b41f35e4b4\" (UID: \"e1953fbb-b558-497f-b889-62b41f35e4b4\") " Feb 19 23:08:34 crc kubenswrapper[4795]: I0219 23:08:34.555754 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1953fbb-b558-497f-b889-62b41f35e4b4-combined-ca-bundle\") pod \"e1953fbb-b558-497f-b889-62b41f35e4b4\" (UID: \"e1953fbb-b558-497f-b889-62b41f35e4b4\") " Feb 19 23:08:34 crc kubenswrapper[4795]: I0219 23:08:34.560258 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4rc2\" (UniqueName: \"kubernetes.io/projected/e1953fbb-b558-497f-b889-62b41f35e4b4-kube-api-access-t4rc2\") pod \"e1953fbb-b558-497f-b889-62b41f35e4b4\" (UID: \"e1953fbb-b558-497f-b889-62b41f35e4b4\") " Feb 19 23:08:34 crc kubenswrapper[4795]: I0219 23:08:34.560366 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1953fbb-b558-497f-b889-62b41f35e4b4-scripts\") pod \"e1953fbb-b558-497f-b889-62b41f35e4b4\" (UID: \"e1953fbb-b558-497f-b889-62b41f35e4b4\") " Feb 19 23:08:34 crc kubenswrapper[4795]: I0219 23:08:34.605372 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1953fbb-b558-497f-b889-62b41f35e4b4-kube-api-access-t4rc2" (OuterVolumeSpecName: "kube-api-access-t4rc2") pod "e1953fbb-b558-497f-b889-62b41f35e4b4" (UID: "e1953fbb-b558-497f-b889-62b41f35e4b4"). InnerVolumeSpecName "kube-api-access-t4rc2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:08:34 crc kubenswrapper[4795]: I0219 23:08:34.629372 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1953fbb-b558-497f-b889-62b41f35e4b4-scripts" (OuterVolumeSpecName: "scripts") pod "e1953fbb-b558-497f-b889-62b41f35e4b4" (UID: "e1953fbb-b558-497f-b889-62b41f35e4b4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:08:34 crc kubenswrapper[4795]: I0219 23:08:34.669495 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t4rc2\" (UniqueName: \"kubernetes.io/projected/e1953fbb-b558-497f-b889-62b41f35e4b4-kube-api-access-t4rc2\") on node \"crc\" DevicePath \"\"" Feb 19 23:08:34 crc kubenswrapper[4795]: I0219 23:08:34.669522 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1953fbb-b558-497f-b889-62b41f35e4b4-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:08:34 crc kubenswrapper[4795]: I0219 23:08:34.682316 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1953fbb-b558-497f-b889-62b41f35e4b4-config-data" (OuterVolumeSpecName: "config-data") pod "e1953fbb-b558-497f-b889-62b41f35e4b4" (UID: "e1953fbb-b558-497f-b889-62b41f35e4b4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:08:34 crc kubenswrapper[4795]: I0219 23:08:34.716341 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1953fbb-b558-497f-b889-62b41f35e4b4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e1953fbb-b558-497f-b889-62b41f35e4b4" (UID: "e1953fbb-b558-497f-b889-62b41f35e4b4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:08:34 crc kubenswrapper[4795]: I0219 23:08:34.770654 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1953fbb-b558-497f-b889-62b41f35e4b4-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 23:08:34 crc kubenswrapper[4795]: I0219 23:08:34.770876 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1953fbb-b558-497f-b889-62b41f35e4b4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:08:35 crc kubenswrapper[4795]: I0219 23:08:35.079523 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-q2lkk" event={"ID":"e1953fbb-b558-497f-b889-62b41f35e4b4","Type":"ContainerDied","Data":"dfc73f4cc52325e3cfd8fb3d09922cc6d90931e71284877d05ba85bec5c67e8b"} Feb 19 23:08:35 crc kubenswrapper[4795]: I0219 23:08:35.079563 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dfc73f4cc52325e3cfd8fb3d09922cc6d90931e71284877d05ba85bec5c67e8b" Feb 19 23:08:35 crc kubenswrapper[4795]: I0219 23:08:35.079618 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-q2lkk" Feb 19 23:08:36 crc kubenswrapper[4795]: I0219 23:08:36.040072 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Feb 19 23:08:36 crc kubenswrapper[4795]: E0219 23:08:36.041188 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1953fbb-b558-497f-b889-62b41f35e4b4" containerName="aodh-db-sync" Feb 19 23:08:36 crc kubenswrapper[4795]: I0219 23:08:36.041211 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1953fbb-b558-497f-b889-62b41f35e4b4" containerName="aodh-db-sync" Feb 19 23:08:36 crc kubenswrapper[4795]: I0219 23:08:36.041628 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1953fbb-b558-497f-b889-62b41f35e4b4" containerName="aodh-db-sync" Feb 19 23:08:36 crc kubenswrapper[4795]: I0219 23:08:36.045204 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Feb 19 23:08:36 crc kubenswrapper[4795]: I0219 23:08:36.047333 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Feb 19 23:08:36 crc kubenswrapper[4795]: I0219 23:08:36.047906 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Feb 19 23:08:36 crc kubenswrapper[4795]: I0219 23:08:36.052615 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Feb 19 23:08:36 crc kubenswrapper[4795]: I0219 23:08:36.055717 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-7c4hw" Feb 19 23:08:36 crc kubenswrapper[4795]: I0219 23:08:36.096492 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acb98719-7401-4241-8361-070eb67879c7-combined-ca-bundle\") pod \"aodh-0\" (UID: \"acb98719-7401-4241-8361-070eb67879c7\") " pod="openstack/aodh-0" Feb 19 23:08:36 crc 
kubenswrapper[4795]: I0219 23:08:36.096695 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acb98719-7401-4241-8361-070eb67879c7-config-data\") pod \"aodh-0\" (UID: \"acb98719-7401-4241-8361-070eb67879c7\") " pod="openstack/aodh-0" Feb 19 23:08:36 crc kubenswrapper[4795]: I0219 23:08:36.096726 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4mnq\" (UniqueName: \"kubernetes.io/projected/acb98719-7401-4241-8361-070eb67879c7-kube-api-access-p4mnq\") pod \"aodh-0\" (UID: \"acb98719-7401-4241-8361-070eb67879c7\") " pod="openstack/aodh-0" Feb 19 23:08:36 crc kubenswrapper[4795]: I0219 23:08:36.096845 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/acb98719-7401-4241-8361-070eb67879c7-scripts\") pod \"aodh-0\" (UID: \"acb98719-7401-4241-8361-070eb67879c7\") " pod="openstack/aodh-0" Feb 19 23:08:36 crc kubenswrapper[4795]: I0219 23:08:36.198572 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/acb98719-7401-4241-8361-070eb67879c7-scripts\") pod \"aodh-0\" (UID: \"acb98719-7401-4241-8361-070eb67879c7\") " pod="openstack/aodh-0" Feb 19 23:08:36 crc kubenswrapper[4795]: I0219 23:08:36.198670 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acb98719-7401-4241-8361-070eb67879c7-combined-ca-bundle\") pod \"aodh-0\" (UID: \"acb98719-7401-4241-8361-070eb67879c7\") " pod="openstack/aodh-0" Feb 19 23:08:36 crc kubenswrapper[4795]: I0219 23:08:36.198758 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acb98719-7401-4241-8361-070eb67879c7-config-data\") pod 
\"aodh-0\" (UID: \"acb98719-7401-4241-8361-070eb67879c7\") " pod="openstack/aodh-0" Feb 19 23:08:36 crc kubenswrapper[4795]: I0219 23:08:36.198775 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4mnq\" (UniqueName: \"kubernetes.io/projected/acb98719-7401-4241-8361-070eb67879c7-kube-api-access-p4mnq\") pod \"aodh-0\" (UID: \"acb98719-7401-4241-8361-070eb67879c7\") " pod="openstack/aodh-0" Feb 19 23:08:36 crc kubenswrapper[4795]: I0219 23:08:36.213220 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acb98719-7401-4241-8361-070eb67879c7-config-data\") pod \"aodh-0\" (UID: \"acb98719-7401-4241-8361-070eb67879c7\") " pod="openstack/aodh-0" Feb 19 23:08:36 crc kubenswrapper[4795]: I0219 23:08:36.213257 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acb98719-7401-4241-8361-070eb67879c7-combined-ca-bundle\") pod \"aodh-0\" (UID: \"acb98719-7401-4241-8361-070eb67879c7\") " pod="openstack/aodh-0" Feb 19 23:08:36 crc kubenswrapper[4795]: I0219 23:08:36.214344 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/acb98719-7401-4241-8361-070eb67879c7-scripts\") pod \"aodh-0\" (UID: \"acb98719-7401-4241-8361-070eb67879c7\") " pod="openstack/aodh-0" Feb 19 23:08:36 crc kubenswrapper[4795]: I0219 23:08:36.228010 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4mnq\" (UniqueName: \"kubernetes.io/projected/acb98719-7401-4241-8361-070eb67879c7-kube-api-access-p4mnq\") pod \"aodh-0\" (UID: \"acb98719-7401-4241-8361-070eb67879c7\") " pod="openstack/aodh-0" Feb 19 23:08:36 crc kubenswrapper[4795]: I0219 23:08:36.364944 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Feb 19 23:08:36 crc kubenswrapper[4795]: I0219 23:08:36.882062 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Feb 19 23:08:37 crc kubenswrapper[4795]: I0219 23:08:37.100621 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"acb98719-7401-4241-8361-070eb67879c7","Type":"ContainerStarted","Data":"b252e6ed8510a7cc2263a5cc42510d0a516ac8a803923d4988abacc34acf10da"} Feb 19 23:08:38 crc kubenswrapper[4795]: I0219 23:08:38.119887 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"acb98719-7401-4241-8361-070eb67879c7","Type":"ContainerStarted","Data":"7e8468251057892f31f06e7a69a5f13721746fb21c7868cf95003c9fe84705bf"} Feb 19 23:08:38 crc kubenswrapper[4795]: I0219 23:08:38.171996 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 23:08:38 crc kubenswrapper[4795]: I0219 23:08:38.172371 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ed07c74b-0313-43e7-a031-024966ef2734" containerName="ceilometer-central-agent" containerID="cri-o://34b6d357921348772dce07cbb59f4cb709626c264269c9f5fbc76d546a352213" gracePeriod=30 Feb 19 23:08:38 crc kubenswrapper[4795]: I0219 23:08:38.172569 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ed07c74b-0313-43e7-a031-024966ef2734" containerName="sg-core" containerID="cri-o://cb6a9b96dedb3f21e3e969a05bf643085cb53bcccf6ec35d67ad3e864971ea31" gracePeriod=30 Feb 19 23:08:38 crc kubenswrapper[4795]: I0219 23:08:38.172743 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ed07c74b-0313-43e7-a031-024966ef2734" containerName="proxy-httpd" containerID="cri-o://887744f5aee90def91c51a51f9bae27976967caa200c8c26baf9917953c0a305" gracePeriod=30 Feb 19 23:08:38 crc 
kubenswrapper[4795]: I0219 23:08:38.172808 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ed07c74b-0313-43e7-a031-024966ef2734" containerName="ceilometer-notification-agent" containerID="cri-o://793df9462bcfd2f1844bb37dfd196c29bbd8324d8e35261f3d76e625871728fa" gracePeriod=30 Feb 19 23:08:38 crc kubenswrapper[4795]: I0219 23:08:38.183968 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="ed07c74b-0313-43e7-a031-024966ef2734" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.1.133:3000/\": EOF" Feb 19 23:08:39 crc kubenswrapper[4795]: I0219 23:08:39.069081 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="ed07c74b-0313-43e7-a031-024966ef2734" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.1.133:3000/\": dial tcp 10.217.1.133:3000: connect: connection refused" Feb 19 23:08:39 crc kubenswrapper[4795]: I0219 23:08:39.146587 4795 generic.go:334] "Generic (PLEG): container finished" podID="ed07c74b-0313-43e7-a031-024966ef2734" containerID="887744f5aee90def91c51a51f9bae27976967caa200c8c26baf9917953c0a305" exitCode=0 Feb 19 23:08:39 crc kubenswrapper[4795]: I0219 23:08:39.146927 4795 generic.go:334] "Generic (PLEG): container finished" podID="ed07c74b-0313-43e7-a031-024966ef2734" containerID="cb6a9b96dedb3f21e3e969a05bf643085cb53bcccf6ec35d67ad3e864971ea31" exitCode=2 Feb 19 23:08:39 crc kubenswrapper[4795]: I0219 23:08:39.146935 4795 generic.go:334] "Generic (PLEG): container finished" podID="ed07c74b-0313-43e7-a031-024966ef2734" containerID="34b6d357921348772dce07cbb59f4cb709626c264269c9f5fbc76d546a352213" exitCode=0 Feb 19 23:08:39 crc kubenswrapper[4795]: I0219 23:08:39.146728 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"ed07c74b-0313-43e7-a031-024966ef2734","Type":"ContainerDied","Data":"887744f5aee90def91c51a51f9bae27976967caa200c8c26baf9917953c0a305"} Feb 19 23:08:39 crc kubenswrapper[4795]: I0219 23:08:39.146972 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed07c74b-0313-43e7-a031-024966ef2734","Type":"ContainerDied","Data":"cb6a9b96dedb3f21e3e969a05bf643085cb53bcccf6ec35d67ad3e864971ea31"} Feb 19 23:08:39 crc kubenswrapper[4795]: I0219 23:08:39.146986 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed07c74b-0313-43e7-a031-024966ef2734","Type":"ContainerDied","Data":"34b6d357921348772dce07cbb59f4cb709626c264269c9f5fbc76d546a352213"} Feb 19 23:08:40 crc kubenswrapper[4795]: I0219 23:08:40.168193 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"acb98719-7401-4241-8361-070eb67879c7","Type":"ContainerStarted","Data":"170a213f21aa36634496b5342c45671959ff7f7b76474d1dc214499f846811df"} Feb 19 23:08:41 crc kubenswrapper[4795]: I0219 23:08:41.177750 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"acb98719-7401-4241-8361-070eb67879c7","Type":"ContainerStarted","Data":"6788bc52af3b2c8a2628b250d81f0e5ca90417eb4d2797555b72ee77ae358ffb"} Feb 19 23:08:42 crc kubenswrapper[4795]: I0219 23:08:42.190388 4795 generic.go:334] "Generic (PLEG): container finished" podID="ed07c74b-0313-43e7-a031-024966ef2734" containerID="793df9462bcfd2f1844bb37dfd196c29bbd8324d8e35261f3d76e625871728fa" exitCode=0 Feb 19 23:08:42 crc kubenswrapper[4795]: I0219 23:08:42.190479 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed07c74b-0313-43e7-a031-024966ef2734","Type":"ContainerDied","Data":"793df9462bcfd2f1844bb37dfd196c29bbd8324d8e35261f3d76e625871728fa"} Feb 19 23:08:42 crc kubenswrapper[4795]: I0219 23:08:42.673330 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 23:08:42 crc kubenswrapper[4795]: I0219 23:08:42.756776 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed07c74b-0313-43e7-a031-024966ef2734-run-httpd\") pod \"ed07c74b-0313-43e7-a031-024966ef2734\" (UID: \"ed07c74b-0313-43e7-a031-024966ef2734\") " Feb 19 23:08:42 crc kubenswrapper[4795]: I0219 23:08:42.757066 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed07c74b-0313-43e7-a031-024966ef2734-config-data\") pod \"ed07c74b-0313-43e7-a031-024966ef2734\" (UID: \"ed07c74b-0313-43e7-a031-024966ef2734\") " Feb 19 23:08:42 crc kubenswrapper[4795]: I0219 23:08:42.757100 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed07c74b-0313-43e7-a031-024966ef2734-log-httpd\") pod \"ed07c74b-0313-43e7-a031-024966ef2734\" (UID: \"ed07c74b-0313-43e7-a031-024966ef2734\") " Feb 19 23:08:42 crc kubenswrapper[4795]: I0219 23:08:42.757135 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ed07c74b-0313-43e7-a031-024966ef2734-sg-core-conf-yaml\") pod \"ed07c74b-0313-43e7-a031-024966ef2734\" (UID: \"ed07c74b-0313-43e7-a031-024966ef2734\") " Feb 19 23:08:42 crc kubenswrapper[4795]: I0219 23:08:42.757176 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hbt7n\" (UniqueName: \"kubernetes.io/projected/ed07c74b-0313-43e7-a031-024966ef2734-kube-api-access-hbt7n\") pod \"ed07c74b-0313-43e7-a031-024966ef2734\" (UID: \"ed07c74b-0313-43e7-a031-024966ef2734\") " Feb 19 23:08:42 crc kubenswrapper[4795]: I0219 23:08:42.757202 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/ed07c74b-0313-43e7-a031-024966ef2734-scripts\") pod \"ed07c74b-0313-43e7-a031-024966ef2734\" (UID: \"ed07c74b-0313-43e7-a031-024966ef2734\") " Feb 19 23:08:42 crc kubenswrapper[4795]: I0219 23:08:42.757258 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed07c74b-0313-43e7-a031-024966ef2734-combined-ca-bundle\") pod \"ed07c74b-0313-43e7-a031-024966ef2734\" (UID: \"ed07c74b-0313-43e7-a031-024966ef2734\") " Feb 19 23:08:42 crc kubenswrapper[4795]: I0219 23:08:42.757672 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed07c74b-0313-43e7-a031-024966ef2734-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ed07c74b-0313-43e7-a031-024966ef2734" (UID: "ed07c74b-0313-43e7-a031-024966ef2734"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:08:42 crc kubenswrapper[4795]: I0219 23:08:42.757897 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed07c74b-0313-43e7-a031-024966ef2734-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ed07c74b-0313-43e7-a031-024966ef2734" (UID: "ed07c74b-0313-43e7-a031-024966ef2734"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:08:42 crc kubenswrapper[4795]: I0219 23:08:42.763082 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed07c74b-0313-43e7-a031-024966ef2734-scripts" (OuterVolumeSpecName: "scripts") pod "ed07c74b-0313-43e7-a031-024966ef2734" (UID: "ed07c74b-0313-43e7-a031-024966ef2734"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:08:42 crc kubenswrapper[4795]: I0219 23:08:42.765106 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed07c74b-0313-43e7-a031-024966ef2734-kube-api-access-hbt7n" (OuterVolumeSpecName: "kube-api-access-hbt7n") pod "ed07c74b-0313-43e7-a031-024966ef2734" (UID: "ed07c74b-0313-43e7-a031-024966ef2734"). InnerVolumeSpecName "kube-api-access-hbt7n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:08:42 crc kubenswrapper[4795]: I0219 23:08:42.794208 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed07c74b-0313-43e7-a031-024966ef2734-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ed07c74b-0313-43e7-a031-024966ef2734" (UID: "ed07c74b-0313-43e7-a031-024966ef2734"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:08:42 crc kubenswrapper[4795]: I0219 23:08:42.834598 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed07c74b-0313-43e7-a031-024966ef2734-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ed07c74b-0313-43e7-a031-024966ef2734" (UID: "ed07c74b-0313-43e7-a031-024966ef2734"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:08:42 crc kubenswrapper[4795]: I0219 23:08:42.859415 4795 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed07c74b-0313-43e7-a031-024966ef2734-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 23:08:42 crc kubenswrapper[4795]: I0219 23:08:42.859456 4795 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ed07c74b-0313-43e7-a031-024966ef2734-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 19 23:08:42 crc kubenswrapper[4795]: I0219 23:08:42.859471 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hbt7n\" (UniqueName: \"kubernetes.io/projected/ed07c74b-0313-43e7-a031-024966ef2734-kube-api-access-hbt7n\") on node \"crc\" DevicePath \"\"" Feb 19 23:08:42 crc kubenswrapper[4795]: I0219 23:08:42.859486 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed07c74b-0313-43e7-a031-024966ef2734-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:08:42 crc kubenswrapper[4795]: I0219 23:08:42.859499 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed07c74b-0313-43e7-a031-024966ef2734-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:08:42 crc kubenswrapper[4795]: I0219 23:08:42.859510 4795 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed07c74b-0313-43e7-a031-024966ef2734-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 23:08:42 crc kubenswrapper[4795]: I0219 23:08:42.861127 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed07c74b-0313-43e7-a031-024966ef2734-config-data" (OuterVolumeSpecName: "config-data") pod "ed07c74b-0313-43e7-a031-024966ef2734" (UID: "ed07c74b-0313-43e7-a031-024966ef2734"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:08:42 crc kubenswrapper[4795]: I0219 23:08:42.962149 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed07c74b-0313-43e7-a031-024966ef2734-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.201438 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed07c74b-0313-43e7-a031-024966ef2734","Type":"ContainerDied","Data":"92493a20710869063a50df759376e4f92368ef4022d75d392f1e23b18e170ed8"} Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.201479 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.201537 4795 scope.go:117] "RemoveContainer" containerID="887744f5aee90def91c51a51f9bae27976967caa200c8c26baf9917953c0a305" Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.207190 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"acb98719-7401-4241-8361-070eb67879c7","Type":"ContainerStarted","Data":"64a671ad808d96934ccf0bf7b3291555f64cee0d7425b8a291a73bf4f07607d7"} Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.242016 4795 scope.go:117] "RemoveContainer" containerID="cb6a9b96dedb3f21e3e969a05bf643085cb53bcccf6ec35d67ad3e864971ea31" Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.262842 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=1.83291366 podStartE2EDuration="7.262815244s" podCreationTimestamp="2026-02-19 23:08:36 +0000 UTC" firstStartedPulling="2026-02-19 23:08:36.885680243 +0000 UTC m=+6028.078198117" lastFinishedPulling="2026-02-19 23:08:42.315581837 +0000 UTC m=+6033.508099701" observedRunningTime="2026-02-19 23:08:43.23300613 +0000 UTC m=+6034.425523994" 
watchObservedRunningTime="2026-02-19 23:08:43.262815244 +0000 UTC m=+6034.455333108" Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.287922 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.318489 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.337031 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 19 23:08:43 crc kubenswrapper[4795]: E0219 23:08:43.337678 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed07c74b-0313-43e7-a031-024966ef2734" containerName="proxy-httpd" Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.337704 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed07c74b-0313-43e7-a031-024966ef2734" containerName="proxy-httpd" Feb 19 23:08:43 crc kubenswrapper[4795]: E0219 23:08:43.337733 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed07c74b-0313-43e7-a031-024966ef2734" containerName="ceilometer-notification-agent" Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.337742 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed07c74b-0313-43e7-a031-024966ef2734" containerName="ceilometer-notification-agent" Feb 19 23:08:43 crc kubenswrapper[4795]: E0219 23:08:43.337756 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed07c74b-0313-43e7-a031-024966ef2734" containerName="ceilometer-central-agent" Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.337767 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed07c74b-0313-43e7-a031-024966ef2734" containerName="ceilometer-central-agent" Feb 19 23:08:43 crc kubenswrapper[4795]: E0219 23:08:43.337783 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed07c74b-0313-43e7-a031-024966ef2734" containerName="sg-core" Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 
23:08:43.337791 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed07c74b-0313-43e7-a031-024966ef2734" containerName="sg-core" Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.338078 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed07c74b-0313-43e7-a031-024966ef2734" containerName="ceilometer-central-agent" Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.338107 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed07c74b-0313-43e7-a031-024966ef2734" containerName="ceilometer-notification-agent" Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.338132 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed07c74b-0313-43e7-a031-024966ef2734" containerName="sg-core" Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.338145 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed07c74b-0313-43e7-a031-024966ef2734" containerName="proxy-httpd" Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.340713 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.343595 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.344021 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.361549 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.376306 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7d7882a-86ad-4b48-8280-af2ced7c6807-config-data\") pod \"ceilometer-0\" (UID: \"d7d7882a-86ad-4b48-8280-af2ced7c6807\") " pod="openstack/ceilometer-0" Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.376429 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d7d7882a-86ad-4b48-8280-af2ced7c6807-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d7d7882a-86ad-4b48-8280-af2ced7c6807\") " pod="openstack/ceilometer-0" Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.376456 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7d7882a-86ad-4b48-8280-af2ced7c6807-scripts\") pod \"ceilometer-0\" (UID: \"d7d7882a-86ad-4b48-8280-af2ced7c6807\") " pod="openstack/ceilometer-0" Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.376506 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7d7882a-86ad-4b48-8280-af2ced7c6807-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d7d7882a-86ad-4b48-8280-af2ced7c6807\") " 
pod="openstack/ceilometer-0" Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.376569 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d7d7882a-86ad-4b48-8280-af2ced7c6807-run-httpd\") pod \"ceilometer-0\" (UID: \"d7d7882a-86ad-4b48-8280-af2ced7c6807\") " pod="openstack/ceilometer-0" Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.376767 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d7d7882a-86ad-4b48-8280-af2ced7c6807-log-httpd\") pod \"ceilometer-0\" (UID: \"d7d7882a-86ad-4b48-8280-af2ced7c6807\") " pod="openstack/ceilometer-0" Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.376849 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stfh6\" (UniqueName: \"kubernetes.io/projected/d7d7882a-86ad-4b48-8280-af2ced7c6807-kube-api-access-stfh6\") pod \"ceilometer-0\" (UID: \"d7d7882a-86ad-4b48-8280-af2ced7c6807\") " pod="openstack/ceilometer-0" Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.377456 4795 scope.go:117] "RemoveContainer" containerID="793df9462bcfd2f1844bb37dfd196c29bbd8324d8e35261f3d76e625871728fa" Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.424487 4795 scope.go:117] "RemoveContainer" containerID="34b6d357921348772dce07cbb59f4cb709626c264269c9f5fbc76d546a352213" Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.478948 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d7d7882a-86ad-4b48-8280-af2ced7c6807-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d7d7882a-86ad-4b48-8280-af2ced7c6807\") " pod="openstack/ceilometer-0" Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.479010 4795 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7d7882a-86ad-4b48-8280-af2ced7c6807-scripts\") pod \"ceilometer-0\" (UID: \"d7d7882a-86ad-4b48-8280-af2ced7c6807\") " pod="openstack/ceilometer-0" Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.479051 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7d7882a-86ad-4b48-8280-af2ced7c6807-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d7d7882a-86ad-4b48-8280-af2ced7c6807\") " pod="openstack/ceilometer-0" Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.479095 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d7d7882a-86ad-4b48-8280-af2ced7c6807-run-httpd\") pod \"ceilometer-0\" (UID: \"d7d7882a-86ad-4b48-8280-af2ced7c6807\") " pod="openstack/ceilometer-0" Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.480437 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d7d7882a-86ad-4b48-8280-af2ced7c6807-log-httpd\") pod \"ceilometer-0\" (UID: \"d7d7882a-86ad-4b48-8280-af2ced7c6807\") " pod="openstack/ceilometer-0" Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.480478 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stfh6\" (UniqueName: \"kubernetes.io/projected/d7d7882a-86ad-4b48-8280-af2ced7c6807-kube-api-access-stfh6\") pod \"ceilometer-0\" (UID: \"d7d7882a-86ad-4b48-8280-af2ced7c6807\") " pod="openstack/ceilometer-0" Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.480515 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7d7882a-86ad-4b48-8280-af2ced7c6807-config-data\") pod \"ceilometer-0\" (UID: \"d7d7882a-86ad-4b48-8280-af2ced7c6807\") " pod="openstack/ceilometer-0" Feb 19 
23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.481286 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d7d7882a-86ad-4b48-8280-af2ced7c6807-run-httpd\") pod \"ceilometer-0\" (UID: \"d7d7882a-86ad-4b48-8280-af2ced7c6807\") " pod="openstack/ceilometer-0" Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.482379 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d7d7882a-86ad-4b48-8280-af2ced7c6807-log-httpd\") pod \"ceilometer-0\" (UID: \"d7d7882a-86ad-4b48-8280-af2ced7c6807\") " pod="openstack/ceilometer-0" Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.484249 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d7d7882a-86ad-4b48-8280-af2ced7c6807-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d7d7882a-86ad-4b48-8280-af2ced7c6807\") " pod="openstack/ceilometer-0" Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.484310 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7d7882a-86ad-4b48-8280-af2ced7c6807-scripts\") pod \"ceilometer-0\" (UID: \"d7d7882a-86ad-4b48-8280-af2ced7c6807\") " pod="openstack/ceilometer-0" Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.486319 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7d7882a-86ad-4b48-8280-af2ced7c6807-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d7d7882a-86ad-4b48-8280-af2ced7c6807\") " pod="openstack/ceilometer-0" Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.487984 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7d7882a-86ad-4b48-8280-af2ced7c6807-config-data\") pod \"ceilometer-0\" (UID: 
\"d7d7882a-86ad-4b48-8280-af2ced7c6807\") " pod="openstack/ceilometer-0" Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.500052 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stfh6\" (UniqueName: \"kubernetes.io/projected/d7d7882a-86ad-4b48-8280-af2ced7c6807-kube-api-access-stfh6\") pod \"ceilometer-0\" (UID: \"d7d7882a-86ad-4b48-8280-af2ced7c6807\") " pod="openstack/ceilometer-0" Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.524668 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed07c74b-0313-43e7-a031-024966ef2734" path="/var/lib/kubelet/pods/ed07c74b-0313-43e7-a031-024966ef2734/volumes" Feb 19 23:08:43 crc kubenswrapper[4795]: I0219 23:08:43.670974 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 23:08:44 crc kubenswrapper[4795]: I0219 23:08:44.314315 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 23:08:45 crc kubenswrapper[4795]: I0219 23:08:45.048440 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7w2k2"] Feb 19 23:08:45 crc kubenswrapper[4795]: I0219 23:08:45.061485 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7w2k2"] Feb 19 23:08:45 crc kubenswrapper[4795]: I0219 23:08:45.244876 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d7d7882a-86ad-4b48-8280-af2ced7c6807","Type":"ContainerStarted","Data":"44a3b39a44dc6cf8dbe9434bdd68e38ca2c4f2db3b91cc4b6a6862525ce949fc"} Feb 19 23:08:45 crc kubenswrapper[4795]: I0219 23:08:45.244912 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d7d7882a-86ad-4b48-8280-af2ced7c6807","Type":"ContainerStarted","Data":"2b454f16774527928b9df78521f93b16c0055d0b08e28e664f0730245cc4e62b"} Feb 19 23:08:45 crc kubenswrapper[4795]: I0219 
23:08:45.551469 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8065cb60-3c91-4fbc-89f1-7d73d11a85e5" path="/var/lib/kubelet/pods/8065cb60-3c91-4fbc-89f1-7d73d11a85e5/volumes"
Feb 19 23:08:46 crc kubenswrapper[4795]: I0219 23:08:46.050089 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-cdffm"]
Feb 19 23:08:46 crc kubenswrapper[4795]: I0219 23:08:46.067607 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-cdffm"]
Feb 19 23:08:46 crc kubenswrapper[4795]: I0219 23:08:46.256524 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d7d7882a-86ad-4b48-8280-af2ced7c6807","Type":"ContainerStarted","Data":"807c413b53a5b1af9be38437a438f28044e30b65d6e5738bee5968445dc73a94"}
Feb 19 23:08:46 crc kubenswrapper[4795]: I0219 23:08:46.256878 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d7d7882a-86ad-4b48-8280-af2ced7c6807","Type":"ContainerStarted","Data":"ca8c993b6a51e98c0db74360e21684626b8c5fc48a0421fcdb15ff26685a9b39"}
Feb 19 23:08:47 crc kubenswrapper[4795]: I0219 23:08:47.524642 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7314a002-868e-4028-b341-b719a609e21c" path="/var/lib/kubelet/pods/7314a002-868e-4028-b341-b719a609e21c/volumes"
Feb 19 23:08:48 crc kubenswrapper[4795]: I0219 23:08:48.296702 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d7d7882a-86ad-4b48-8280-af2ced7c6807","Type":"ContainerStarted","Data":"32823af9fc2a8e52e61f0605bdd2b17739e02e0d4106a20b34de910c13396fbb"}
Feb 19 23:08:48 crc kubenswrapper[4795]: I0219 23:08:48.296975 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Feb 19 23:08:48 crc kubenswrapper[4795]: I0219 23:08:48.334452 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.104368068 podStartE2EDuration="5.334424654s" podCreationTimestamp="2026-02-19 23:08:43 +0000 UTC" firstStartedPulling="2026-02-19 23:08:44.326134323 +0000 UTC m=+6035.518652187" lastFinishedPulling="2026-02-19 23:08:47.556190909 +0000 UTC m=+6038.748708773" observedRunningTime="2026-02-19 23:08:48.322920937 +0000 UTC m=+6039.515438811" watchObservedRunningTime="2026-02-19 23:08:48.334424654 +0000 UTC m=+6039.526942518"
Feb 19 23:08:49 crc kubenswrapper[4795]: I0219 23:08:49.154570 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-4647-account-create-update-f7k78"]
Feb 19 23:08:49 crc kubenswrapper[4795]: I0219 23:08:49.156228 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-4647-account-create-update-f7k78"
Feb 19 23:08:49 crc kubenswrapper[4795]: I0219 23:08:49.160381 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-db-secret"
Feb 19 23:08:49 crc kubenswrapper[4795]: I0219 23:08:49.168598 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-4647-account-create-update-f7k78"]
Feb 19 23:08:49 crc kubenswrapper[4795]: I0219 23:08:49.180151 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-create-657cv"]
Feb 19 23:08:49 crc kubenswrapper[4795]: I0219 23:08:49.185619 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-657cv"
Feb 19 23:08:49 crc kubenswrapper[4795]: I0219 23:08:49.230674 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a6dde49-b59e-4a4a-ad84-6386aa1dc6ce-operator-scripts\") pod \"manila-4647-account-create-update-f7k78\" (UID: \"4a6dde49-b59e-4a4a-ad84-6386aa1dc6ce\") " pod="openstack/manila-4647-account-create-update-f7k78"
Feb 19 23:08:49 crc kubenswrapper[4795]: I0219 23:08:49.230781 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cr7xp\" (UniqueName: \"kubernetes.io/projected/4a6dde49-b59e-4a4a-ad84-6386aa1dc6ce-kube-api-access-cr7xp\") pod \"manila-4647-account-create-update-f7k78\" (UID: \"4a6dde49-b59e-4a4a-ad84-6386aa1dc6ce\") " pod="openstack/manila-4647-account-create-update-f7k78"
Feb 19 23:08:49 crc kubenswrapper[4795]: I0219 23:08:49.236705 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-657cv"]
Feb 19 23:08:49 crc kubenswrapper[4795]: I0219 23:08:49.333466 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s52pg\" (UniqueName: \"kubernetes.io/projected/eadcaa3c-623a-409a-b735-2a38854c8036-kube-api-access-s52pg\") pod \"manila-db-create-657cv\" (UID: \"eadcaa3c-623a-409a-b735-2a38854c8036\") " pod="openstack/manila-db-create-657cv"
Feb 19 23:08:49 crc kubenswrapper[4795]: I0219 23:08:49.333529 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a6dde49-b59e-4a4a-ad84-6386aa1dc6ce-operator-scripts\") pod \"manila-4647-account-create-update-f7k78\" (UID: \"4a6dde49-b59e-4a4a-ad84-6386aa1dc6ce\") " pod="openstack/manila-4647-account-create-update-f7k78"
Feb 19 23:08:49 crc kubenswrapper[4795]: I0219 23:08:49.333586 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eadcaa3c-623a-409a-b735-2a38854c8036-operator-scripts\") pod \"manila-db-create-657cv\" (UID: \"eadcaa3c-623a-409a-b735-2a38854c8036\") " pod="openstack/manila-db-create-657cv"
Feb 19 23:08:49 crc kubenswrapper[4795]: I0219 23:08:49.333614 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cr7xp\" (UniqueName: \"kubernetes.io/projected/4a6dde49-b59e-4a4a-ad84-6386aa1dc6ce-kube-api-access-cr7xp\") pod \"manila-4647-account-create-update-f7k78\" (UID: \"4a6dde49-b59e-4a4a-ad84-6386aa1dc6ce\") " pod="openstack/manila-4647-account-create-update-f7k78"
Feb 19 23:08:49 crc kubenswrapper[4795]: I0219 23:08:49.334736 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a6dde49-b59e-4a4a-ad84-6386aa1dc6ce-operator-scripts\") pod \"manila-4647-account-create-update-f7k78\" (UID: \"4a6dde49-b59e-4a4a-ad84-6386aa1dc6ce\") " pod="openstack/manila-4647-account-create-update-f7k78"
Feb 19 23:08:49 crc kubenswrapper[4795]: I0219 23:08:49.375804 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cr7xp\" (UniqueName: \"kubernetes.io/projected/4a6dde49-b59e-4a4a-ad84-6386aa1dc6ce-kube-api-access-cr7xp\") pod \"manila-4647-account-create-update-f7k78\" (UID: \"4a6dde49-b59e-4a4a-ad84-6386aa1dc6ce\") " pod="openstack/manila-4647-account-create-update-f7k78"
Feb 19 23:08:49 crc kubenswrapper[4795]: I0219 23:08:49.435544 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eadcaa3c-623a-409a-b735-2a38854c8036-operator-scripts\") pod \"manila-db-create-657cv\" (UID: \"eadcaa3c-623a-409a-b735-2a38854c8036\") " pod="openstack/manila-db-create-657cv"
Feb 19 23:08:49 crc kubenswrapper[4795]: I0219 23:08:49.435715 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s52pg\" (UniqueName: \"kubernetes.io/projected/eadcaa3c-623a-409a-b735-2a38854c8036-kube-api-access-s52pg\") pod \"manila-db-create-657cv\" (UID: \"eadcaa3c-623a-409a-b735-2a38854c8036\") " pod="openstack/manila-db-create-657cv"
Feb 19 23:08:49 crc kubenswrapper[4795]: I0219 23:08:49.437327 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eadcaa3c-623a-409a-b735-2a38854c8036-operator-scripts\") pod \"manila-db-create-657cv\" (UID: \"eadcaa3c-623a-409a-b735-2a38854c8036\") " pod="openstack/manila-db-create-657cv"
Feb 19 23:08:49 crc kubenswrapper[4795]: I0219 23:08:49.453722 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s52pg\" (UniqueName: \"kubernetes.io/projected/eadcaa3c-623a-409a-b735-2a38854c8036-kube-api-access-s52pg\") pod \"manila-db-create-657cv\" (UID: \"eadcaa3c-623a-409a-b735-2a38854c8036\") " pod="openstack/manila-db-create-657cv"
Feb 19 23:08:49 crc kubenswrapper[4795]: I0219 23:08:49.473141 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-4647-account-create-update-f7k78"
Feb 19 23:08:49 crc kubenswrapper[4795]: I0219 23:08:49.516048 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-657cv"
Feb 19 23:08:49 crc kubenswrapper[4795]: I0219 23:08:49.996639 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-4647-account-create-update-f7k78"]
Feb 19 23:08:50 crc kubenswrapper[4795]: W0219 23:08:50.001571 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a6dde49_b59e_4a4a_ad84_6386aa1dc6ce.slice/crio-0786c807ea4bdfb3b6185a4c22709a17c9d628ec80bed4391f9f91b63673b9b7 WatchSource:0}: Error finding container 0786c807ea4bdfb3b6185a4c22709a17c9d628ec80bed4391f9f91b63673b9b7: Status 404 returned error can't find the container with id 0786c807ea4bdfb3b6185a4c22709a17c9d628ec80bed4391f9f91b63673b9b7
Feb 19 23:08:50 crc kubenswrapper[4795]: I0219 23:08:50.148742 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-657cv"]
Feb 19 23:08:50 crc kubenswrapper[4795]: W0219 23:08:50.148859 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeadcaa3c_623a_409a_b735_2a38854c8036.slice/crio-602d5deb01290c1783ead482081b843b246c4fe53c5dc8800e837fd62c4812ff WatchSource:0}: Error finding container 602d5deb01290c1783ead482081b843b246c4fe53c5dc8800e837fd62c4812ff: Status 404 returned error can't find the container with id 602d5deb01290c1783ead482081b843b246c4fe53c5dc8800e837fd62c4812ff
Feb 19 23:08:50 crc kubenswrapper[4795]: I0219 23:08:50.335568 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-4647-account-create-update-f7k78" event={"ID":"4a6dde49-b59e-4a4a-ad84-6386aa1dc6ce","Type":"ContainerStarted","Data":"74c8a86b466b15b56eddcfa8140418aa598bd95ce7f01643ab2976a1bd25cfe5"}
Feb 19 23:08:50 crc kubenswrapper[4795]: I0219 23:08:50.335612 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-4647-account-create-update-f7k78" event={"ID":"4a6dde49-b59e-4a4a-ad84-6386aa1dc6ce","Type":"ContainerStarted","Data":"0786c807ea4bdfb3b6185a4c22709a17c9d628ec80bed4391f9f91b63673b9b7"}
Feb 19 23:08:50 crc kubenswrapper[4795]: I0219 23:08:50.337271 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-657cv" event={"ID":"eadcaa3c-623a-409a-b735-2a38854c8036","Type":"ContainerStarted","Data":"2537d15feb1a2c6f922f8d39bb6ed8ef442cb362fc714c8399a87d6d69b4784d"}
Feb 19 23:08:50 crc kubenswrapper[4795]: I0219 23:08:50.337314 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-657cv" event={"ID":"eadcaa3c-623a-409a-b735-2a38854c8036","Type":"ContainerStarted","Data":"602d5deb01290c1783ead482081b843b246c4fe53c5dc8800e837fd62c4812ff"}
Feb 19 23:08:50 crc kubenswrapper[4795]: I0219 23:08:50.353026 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-4647-account-create-update-f7k78" podStartSLOduration=1.353008083 podStartE2EDuration="1.353008083s" podCreationTimestamp="2026-02-19 23:08:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:08:50.350572838 +0000 UTC m=+6041.543090692" watchObservedRunningTime="2026-02-19 23:08:50.353008083 +0000 UTC m=+6041.545525947"
Feb 19 23:08:50 crc kubenswrapper[4795]: I0219 23:08:50.390532 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-create-657cv" podStartSLOduration=1.390513532 podStartE2EDuration="1.390513532s" podCreationTimestamp="2026-02-19 23:08:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:08:50.367749716 +0000 UTC m=+6041.560267590" watchObservedRunningTime="2026-02-19 23:08:50.390513532 +0000 UTC m=+6041.583031396"
Feb 19 23:08:51 crc kubenswrapper[4795]: I0219 23:08:51.351518 4795 generic.go:334] "Generic (PLEG): container finished" podID="4a6dde49-b59e-4a4a-ad84-6386aa1dc6ce" containerID="74c8a86b466b15b56eddcfa8140418aa598bd95ce7f01643ab2976a1bd25cfe5" exitCode=0
Feb 19 23:08:51 crc kubenswrapper[4795]: I0219 23:08:51.352395 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-4647-account-create-update-f7k78" event={"ID":"4a6dde49-b59e-4a4a-ad84-6386aa1dc6ce","Type":"ContainerDied","Data":"74c8a86b466b15b56eddcfa8140418aa598bd95ce7f01643ab2976a1bd25cfe5"}
Feb 19 23:08:51 crc kubenswrapper[4795]: I0219 23:08:51.354245 4795 generic.go:334] "Generic (PLEG): container finished" podID="eadcaa3c-623a-409a-b735-2a38854c8036" containerID="2537d15feb1a2c6f922f8d39bb6ed8ef442cb362fc714c8399a87d6d69b4784d" exitCode=0
Feb 19 23:08:51 crc kubenswrapper[4795]: I0219 23:08:51.354293 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-657cv" event={"ID":"eadcaa3c-623a-409a-b735-2a38854c8036","Type":"ContainerDied","Data":"2537d15feb1a2c6f922f8d39bb6ed8ef442cb362fc714c8399a87d6d69b4784d"}
Feb 19 23:08:52 crc kubenswrapper[4795]: I0219 23:08:52.957478 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-657cv"
Feb 19 23:08:52 crc kubenswrapper[4795]: I0219 23:08:52.961844 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-4647-account-create-update-f7k78"
Feb 19 23:08:53 crc kubenswrapper[4795]: I0219 23:08:53.126629 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s52pg\" (UniqueName: \"kubernetes.io/projected/eadcaa3c-623a-409a-b735-2a38854c8036-kube-api-access-s52pg\") pod \"eadcaa3c-623a-409a-b735-2a38854c8036\" (UID: \"eadcaa3c-623a-409a-b735-2a38854c8036\") "
Feb 19 23:08:53 crc kubenswrapper[4795]: I0219 23:08:53.127037 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a6dde49-b59e-4a4a-ad84-6386aa1dc6ce-operator-scripts\") pod \"4a6dde49-b59e-4a4a-ad84-6386aa1dc6ce\" (UID: \"4a6dde49-b59e-4a4a-ad84-6386aa1dc6ce\") "
Feb 19 23:08:53 crc kubenswrapper[4795]: I0219 23:08:53.127078 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cr7xp\" (UniqueName: \"kubernetes.io/projected/4a6dde49-b59e-4a4a-ad84-6386aa1dc6ce-kube-api-access-cr7xp\") pod \"4a6dde49-b59e-4a4a-ad84-6386aa1dc6ce\" (UID: \"4a6dde49-b59e-4a4a-ad84-6386aa1dc6ce\") "
Feb 19 23:08:53 crc kubenswrapper[4795]: I0219 23:08:53.127104 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eadcaa3c-623a-409a-b735-2a38854c8036-operator-scripts\") pod \"eadcaa3c-623a-409a-b735-2a38854c8036\" (UID: \"eadcaa3c-623a-409a-b735-2a38854c8036\") "
Feb 19 23:08:53 crc kubenswrapper[4795]: I0219 23:08:53.127561 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a6dde49-b59e-4a4a-ad84-6386aa1dc6ce-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4a6dde49-b59e-4a4a-ad84-6386aa1dc6ce" (UID: "4a6dde49-b59e-4a4a-ad84-6386aa1dc6ce"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 23:08:53 crc kubenswrapper[4795]: I0219 23:08:53.127630 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eadcaa3c-623a-409a-b735-2a38854c8036-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "eadcaa3c-623a-409a-b735-2a38854c8036" (UID: "eadcaa3c-623a-409a-b735-2a38854c8036"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 23:08:53 crc kubenswrapper[4795]: I0219 23:08:53.132559 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a6dde49-b59e-4a4a-ad84-6386aa1dc6ce-kube-api-access-cr7xp" (OuterVolumeSpecName: "kube-api-access-cr7xp") pod "4a6dde49-b59e-4a4a-ad84-6386aa1dc6ce" (UID: "4a6dde49-b59e-4a4a-ad84-6386aa1dc6ce"). InnerVolumeSpecName "kube-api-access-cr7xp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 23:08:53 crc kubenswrapper[4795]: I0219 23:08:53.132980 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eadcaa3c-623a-409a-b735-2a38854c8036-kube-api-access-s52pg" (OuterVolumeSpecName: "kube-api-access-s52pg") pod "eadcaa3c-623a-409a-b735-2a38854c8036" (UID: "eadcaa3c-623a-409a-b735-2a38854c8036"). InnerVolumeSpecName "kube-api-access-s52pg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 23:08:53 crc kubenswrapper[4795]: I0219 23:08:53.228950 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a6dde49-b59e-4a4a-ad84-6386aa1dc6ce-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 23:08:53 crc kubenswrapper[4795]: I0219 23:08:53.228987 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cr7xp\" (UniqueName: \"kubernetes.io/projected/4a6dde49-b59e-4a4a-ad84-6386aa1dc6ce-kube-api-access-cr7xp\") on node \"crc\" DevicePath \"\""
Feb 19 23:08:53 crc kubenswrapper[4795]: I0219 23:08:53.228997 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eadcaa3c-623a-409a-b735-2a38854c8036-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 23:08:53 crc kubenswrapper[4795]: I0219 23:08:53.229007 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s52pg\" (UniqueName: \"kubernetes.io/projected/eadcaa3c-623a-409a-b735-2a38854c8036-kube-api-access-s52pg\") on node \"crc\" DevicePath \"\""
Feb 19 23:08:53 crc kubenswrapper[4795]: I0219 23:08:53.376471 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-4647-account-create-update-f7k78"
Feb 19 23:08:53 crc kubenswrapper[4795]: I0219 23:08:53.376286 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-4647-account-create-update-f7k78" event={"ID":"4a6dde49-b59e-4a4a-ad84-6386aa1dc6ce","Type":"ContainerDied","Data":"0786c807ea4bdfb3b6185a4c22709a17c9d628ec80bed4391f9f91b63673b9b7"}
Feb 19 23:08:53 crc kubenswrapper[4795]: I0219 23:08:53.378297 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0786c807ea4bdfb3b6185a4c22709a17c9d628ec80bed4391f9f91b63673b9b7"
Feb 19 23:08:53 crc kubenswrapper[4795]: I0219 23:08:53.386584 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-657cv" event={"ID":"eadcaa3c-623a-409a-b735-2a38854c8036","Type":"ContainerDied","Data":"602d5deb01290c1783ead482081b843b246c4fe53c5dc8800e837fd62c4812ff"}
Feb 19 23:08:53 crc kubenswrapper[4795]: I0219 23:08:53.386623 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="602d5deb01290c1783ead482081b843b246c4fe53c5dc8800e837fd62c4812ff"
Feb 19 23:08:53 crc kubenswrapper[4795]: I0219 23:08:53.386686 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-657cv"
Feb 19 23:08:54 crc kubenswrapper[4795]: I0219 23:08:54.630065 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-sync-8fzxs"]
Feb 19 23:08:54 crc kubenswrapper[4795]: E0219 23:08:54.632271 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a6dde49-b59e-4a4a-ad84-6386aa1dc6ce" containerName="mariadb-account-create-update"
Feb 19 23:08:54 crc kubenswrapper[4795]: I0219 23:08:54.632294 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a6dde49-b59e-4a4a-ad84-6386aa1dc6ce" containerName="mariadb-account-create-update"
Feb 19 23:08:54 crc kubenswrapper[4795]: E0219 23:08:54.632337 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eadcaa3c-623a-409a-b735-2a38854c8036" containerName="mariadb-database-create"
Feb 19 23:08:54 crc kubenswrapper[4795]: I0219 23:08:54.632348 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="eadcaa3c-623a-409a-b735-2a38854c8036" containerName="mariadb-database-create"
Feb 19 23:08:54 crc kubenswrapper[4795]: I0219 23:08:54.632615 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a6dde49-b59e-4a4a-ad84-6386aa1dc6ce" containerName="mariadb-account-create-update"
Feb 19 23:08:54 crc kubenswrapper[4795]: I0219 23:08:54.632660 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="eadcaa3c-623a-409a-b735-2a38854c8036" containerName="mariadb-database-create"
Feb 19 23:08:54 crc kubenswrapper[4795]: I0219 23:08:54.633606 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-8fzxs"
Feb 19 23:08:54 crc kubenswrapper[4795]: I0219 23:08:54.636342 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-g5f8g"
Feb 19 23:08:54 crc kubenswrapper[4795]: I0219 23:08:54.642258 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-8fzxs"]
Feb 19 23:08:54 crc kubenswrapper[4795]: I0219 23:08:54.642349 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data"
Feb 19 23:08:54 crc kubenswrapper[4795]: I0219 23:08:54.771924 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/bd19113b-623e-4f3e-8392-09968a5d71f9-job-config-data\") pod \"manila-db-sync-8fzxs\" (UID: \"bd19113b-623e-4f3e-8392-09968a5d71f9\") " pod="openstack/manila-db-sync-8fzxs"
Feb 19 23:08:54 crc kubenswrapper[4795]: I0219 23:08:54.772034 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-674wp\" (UniqueName: \"kubernetes.io/projected/bd19113b-623e-4f3e-8392-09968a5d71f9-kube-api-access-674wp\") pod \"manila-db-sync-8fzxs\" (UID: \"bd19113b-623e-4f3e-8392-09968a5d71f9\") " pod="openstack/manila-db-sync-8fzxs"
Feb 19 23:08:54 crc kubenswrapper[4795]: I0219 23:08:54.772136 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd19113b-623e-4f3e-8392-09968a5d71f9-combined-ca-bundle\") pod \"manila-db-sync-8fzxs\" (UID: \"bd19113b-623e-4f3e-8392-09968a5d71f9\") " pod="openstack/manila-db-sync-8fzxs"
Feb 19 23:08:54 crc kubenswrapper[4795]: I0219 23:08:54.772152 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd19113b-623e-4f3e-8392-09968a5d71f9-config-data\") pod \"manila-db-sync-8fzxs\" (UID: \"bd19113b-623e-4f3e-8392-09968a5d71f9\") " pod="openstack/manila-db-sync-8fzxs"
Feb 19 23:08:54 crc kubenswrapper[4795]: I0219 23:08:54.873655 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd19113b-623e-4f3e-8392-09968a5d71f9-combined-ca-bundle\") pod \"manila-db-sync-8fzxs\" (UID: \"bd19113b-623e-4f3e-8392-09968a5d71f9\") " pod="openstack/manila-db-sync-8fzxs"
Feb 19 23:08:54 crc kubenswrapper[4795]: I0219 23:08:54.873976 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd19113b-623e-4f3e-8392-09968a5d71f9-config-data\") pod \"manila-db-sync-8fzxs\" (UID: \"bd19113b-623e-4f3e-8392-09968a5d71f9\") " pod="openstack/manila-db-sync-8fzxs"
Feb 19 23:08:54 crc kubenswrapper[4795]: I0219 23:08:54.874059 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/bd19113b-623e-4f3e-8392-09968a5d71f9-job-config-data\") pod \"manila-db-sync-8fzxs\" (UID: \"bd19113b-623e-4f3e-8392-09968a5d71f9\") " pod="openstack/manila-db-sync-8fzxs"
Feb 19 23:08:54 crc kubenswrapper[4795]: I0219 23:08:54.874158 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-674wp\" (UniqueName: \"kubernetes.io/projected/bd19113b-623e-4f3e-8392-09968a5d71f9-kube-api-access-674wp\") pod \"manila-db-sync-8fzxs\" (UID: \"bd19113b-623e-4f3e-8392-09968a5d71f9\") " pod="openstack/manila-db-sync-8fzxs"
Feb 19 23:08:54 crc kubenswrapper[4795]: I0219 23:08:54.887049 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd19113b-623e-4f3e-8392-09968a5d71f9-combined-ca-bundle\") pod \"manila-db-sync-8fzxs\" (UID: \"bd19113b-623e-4f3e-8392-09968a5d71f9\") " pod="openstack/manila-db-sync-8fzxs"
Feb 19 23:08:54 crc kubenswrapper[4795]: I0219 23:08:54.887053 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/bd19113b-623e-4f3e-8392-09968a5d71f9-job-config-data\") pod \"manila-db-sync-8fzxs\" (UID: \"bd19113b-623e-4f3e-8392-09968a5d71f9\") " pod="openstack/manila-db-sync-8fzxs"
Feb 19 23:08:54 crc kubenswrapper[4795]: I0219 23:08:54.887278 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd19113b-623e-4f3e-8392-09968a5d71f9-config-data\") pod \"manila-db-sync-8fzxs\" (UID: \"bd19113b-623e-4f3e-8392-09968a5d71f9\") " pod="openstack/manila-db-sync-8fzxs"
Feb 19 23:08:54 crc kubenswrapper[4795]: I0219 23:08:54.889515 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-674wp\" (UniqueName: \"kubernetes.io/projected/bd19113b-623e-4f3e-8392-09968a5d71f9-kube-api-access-674wp\") pod \"manila-db-sync-8fzxs\" (UID: \"bd19113b-623e-4f3e-8392-09968a5d71f9\") " pod="openstack/manila-db-sync-8fzxs"
Feb 19 23:08:54 crc kubenswrapper[4795]: I0219 23:08:54.962628 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-8fzxs"
Feb 19 23:08:55 crc kubenswrapper[4795]: I0219 23:08:55.760241 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-8fzxs"]
Feb 19 23:08:56 crc kubenswrapper[4795]: I0219 23:08:56.412596 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-8fzxs" event={"ID":"bd19113b-623e-4f3e-8392-09968a5d71f9","Type":"ContainerStarted","Data":"2de2f6bee0fa48983ed9cf80a52a1fcd93154a98d917c415b7634301a3151ebf"}
Feb 19 23:08:59 crc kubenswrapper[4795]: I0219 23:08:59.033149 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-ltwc9"]
Feb 19 23:08:59 crc kubenswrapper[4795]: I0219 23:08:59.047084 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-ltwc9"]
Feb 19 23:08:59 crc kubenswrapper[4795]: I0219 23:08:59.526047 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0532ff51-023e-4663-9c95-6545236a8fb3" path="/var/lib/kubelet/pods/0532ff51-023e-4663-9c95-6545236a8fb3/volumes"
Feb 19 23:09:01 crc kubenswrapper[4795]: I0219 23:09:01.459911 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-8fzxs" event={"ID":"bd19113b-623e-4f3e-8392-09968a5d71f9","Type":"ContainerStarted","Data":"a15bc43318a5f56769ebc9469003d0ddb425e341f35a3faca3d202f2707e3b2d"}
Feb 19 23:09:01 crc kubenswrapper[4795]: I0219 23:09:01.483976 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-sync-8fzxs" podStartSLOduration=2.917938107 podStartE2EDuration="7.483959205s" podCreationTimestamp="2026-02-19 23:08:54 +0000 UTC" firstStartedPulling="2026-02-19 23:08:55.775250543 +0000 UTC m=+6046.967768407" lastFinishedPulling="2026-02-19 23:09:00.341271641 +0000 UTC m=+6051.533789505" observedRunningTime="2026-02-19 23:09:01.476538398 +0000 UTC m=+6052.669056272" watchObservedRunningTime="2026-02-19 23:09:01.483959205 +0000 UTC m=+6052.676477069"
Feb 19 23:09:03 crc kubenswrapper[4795]: I0219 23:09:03.478431 4795 generic.go:334] "Generic (PLEG): container finished" podID="bd19113b-623e-4f3e-8392-09968a5d71f9" containerID="a15bc43318a5f56769ebc9469003d0ddb425e341f35a3faca3d202f2707e3b2d" exitCode=0
Feb 19 23:09:03 crc kubenswrapper[4795]: I0219 23:09:03.478591 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-8fzxs" event={"ID":"bd19113b-623e-4f3e-8392-09968a5d71f9","Type":"ContainerDied","Data":"a15bc43318a5f56769ebc9469003d0ddb425e341f35a3faca3d202f2707e3b2d"}
Feb 19 23:09:04 crc kubenswrapper[4795]: I0219 23:09:04.960635 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-8fzxs"
Feb 19 23:09:05 crc kubenswrapper[4795]: I0219 23:09:05.109967 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/bd19113b-623e-4f3e-8392-09968a5d71f9-job-config-data\") pod \"bd19113b-623e-4f3e-8392-09968a5d71f9\" (UID: \"bd19113b-623e-4f3e-8392-09968a5d71f9\") "
Feb 19 23:09:05 crc kubenswrapper[4795]: I0219 23:09:05.110141 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-674wp\" (UniqueName: \"kubernetes.io/projected/bd19113b-623e-4f3e-8392-09968a5d71f9-kube-api-access-674wp\") pod \"bd19113b-623e-4f3e-8392-09968a5d71f9\" (UID: \"bd19113b-623e-4f3e-8392-09968a5d71f9\") "
Feb 19 23:09:05 crc kubenswrapper[4795]: I0219 23:09:05.110186 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd19113b-623e-4f3e-8392-09968a5d71f9-combined-ca-bundle\") pod \"bd19113b-623e-4f3e-8392-09968a5d71f9\" (UID: \"bd19113b-623e-4f3e-8392-09968a5d71f9\") "
Feb 19 23:09:05 crc kubenswrapper[4795]: I0219 23:09:05.110229 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd19113b-623e-4f3e-8392-09968a5d71f9-config-data\") pod \"bd19113b-623e-4f3e-8392-09968a5d71f9\" (UID: \"bd19113b-623e-4f3e-8392-09968a5d71f9\") "
Feb 19 23:09:05 crc kubenswrapper[4795]: I0219 23:09:05.115956 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd19113b-623e-4f3e-8392-09968a5d71f9-kube-api-access-674wp" (OuterVolumeSpecName: "kube-api-access-674wp") pod "bd19113b-623e-4f3e-8392-09968a5d71f9" (UID: "bd19113b-623e-4f3e-8392-09968a5d71f9"). InnerVolumeSpecName "kube-api-access-674wp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 23:09:05 crc kubenswrapper[4795]: I0219 23:09:05.116265 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd19113b-623e-4f3e-8392-09968a5d71f9-job-config-data" (OuterVolumeSpecName: "job-config-data") pod "bd19113b-623e-4f3e-8392-09968a5d71f9" (UID: "bd19113b-623e-4f3e-8392-09968a5d71f9"). InnerVolumeSpecName "job-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:09:05 crc kubenswrapper[4795]: I0219 23:09:05.117756 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd19113b-623e-4f3e-8392-09968a5d71f9-config-data" (OuterVolumeSpecName: "config-data") pod "bd19113b-623e-4f3e-8392-09968a5d71f9" (UID: "bd19113b-623e-4f3e-8392-09968a5d71f9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:09:05 crc kubenswrapper[4795]: I0219 23:09:05.139841 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd19113b-623e-4f3e-8392-09968a5d71f9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bd19113b-623e-4f3e-8392-09968a5d71f9" (UID: "bd19113b-623e-4f3e-8392-09968a5d71f9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:09:05 crc kubenswrapper[4795]: I0219 23:09:05.212981 4795 reconciler_common.go:293] "Volume detached for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/bd19113b-623e-4f3e-8392-09968a5d71f9-job-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 23:09:05 crc kubenswrapper[4795]: I0219 23:09:05.213193 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-674wp\" (UniqueName: \"kubernetes.io/projected/bd19113b-623e-4f3e-8392-09968a5d71f9-kube-api-access-674wp\") on node \"crc\" DevicePath \"\""
Feb 19 23:09:05 crc kubenswrapper[4795]: I0219 23:09:05.213263 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd19113b-623e-4f3e-8392-09968a5d71f9-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 23:09:05 crc kubenswrapper[4795]: I0219 23:09:05.213347 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd19113b-623e-4f3e-8392-09968a5d71f9-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 23:09:05 crc kubenswrapper[4795]: I0219 23:09:05.497924 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-8fzxs" event={"ID":"bd19113b-623e-4f3e-8392-09968a5d71f9","Type":"ContainerDied","Data":"2de2f6bee0fa48983ed9cf80a52a1fcd93154a98d917c415b7634301a3151ebf"}
Feb 19 23:09:05 crc kubenswrapper[4795]: I0219 23:09:05.497961 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2de2f6bee0fa48983ed9cf80a52a1fcd93154a98d917c415b7634301a3151ebf"
Feb 19 23:09:05 crc kubenswrapper[4795]: I0219 23:09:05.497967 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-8fzxs"
Feb 19 23:09:05 crc kubenswrapper[4795]: I0219 23:09:05.813782 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"]
Feb 19 23:09:05 crc kubenswrapper[4795]: E0219 23:09:05.814788 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd19113b-623e-4f3e-8392-09968a5d71f9" containerName="manila-db-sync"
Feb 19 23:09:05 crc kubenswrapper[4795]: I0219 23:09:05.814809 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd19113b-623e-4f3e-8392-09968a5d71f9" containerName="manila-db-sync"
Feb 19 23:09:05 crc kubenswrapper[4795]: I0219 23:09:05.825607 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd19113b-623e-4f3e-8392-09968a5d71f9" containerName="manila-db-sync"
Feb 19 23:09:05 crc kubenswrapper[4795]: I0219 23:09:05.827956 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0"
Feb 19 23:09:05 crc kubenswrapper[4795]: I0219 23:09:05.837342 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data"
Feb 19 23:09:05 crc kubenswrapper[4795]: I0219 23:09:05.837933 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scripts"
Feb 19 23:09:05 crc kubenswrapper[4795]: I0219 23:09:05.838399 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data"
Feb 19 23:09:05 crc kubenswrapper[4795]: I0219 23:09:05.844316 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-g5f8g"
Feb 19 23:09:05 crc kubenswrapper[4795]: I0219 23:09:05.873912 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"]
Feb 19 23:09:05 crc kubenswrapper[4795]: I0219 23:09:05.916129 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"]
Feb 19 23:09:05 crc kubenswrapper[4795]: I0219 23:09:05.920579 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0"
Feb 19 23:09:05 crc kubenswrapper[4795]: I0219 23:09:05.923828 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data"
Feb 19 23:09:05 crc kubenswrapper[4795]: I0219 23:09:05.930046 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"]
Feb 19 23:09:05 crc kubenswrapper[4795]: I0219 23:09:05.933567 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25554074-26bb-4b62-a1f9-dac4cd6308b4-scripts\") pod \"manila-scheduler-0\" (UID: \"25554074-26bb-4b62-a1f9-dac4cd6308b4\") " pod="openstack/manila-scheduler-0"
Feb 19 23:09:05 crc kubenswrapper[4795]: I0219 23:09:05.933787 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25554074-26bb-4b62-a1f9-dac4cd6308b4-config-data\") pod \"manila-scheduler-0\" (UID: \"25554074-26bb-4b62-a1f9-dac4cd6308b4\") " pod="openstack/manila-scheduler-0"
Feb 19 23:09:05 crc kubenswrapper[4795]: I0219 23:09:05.933996 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/25554074-26bb-4b62-a1f9-dac4cd6308b4-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"25554074-26bb-4b62-a1f9-dac4cd6308b4\") " pod="openstack/manila-scheduler-0"
Feb 19 23:09:05 crc kubenswrapper[4795]: I0219 23:09:05.936305 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25554074-26bb-4b62-a1f9-dac4cd6308b4-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"25554074-26bb-4b62-a1f9-dac4cd6308b4\") " pod="openstack/manila-scheduler-0"
Feb 19 23:09:05 crc kubenswrapper[4795]: I0219 23:09:05.936465 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g28d5\" (UniqueName: \"kubernetes.io/projected/25554074-26bb-4b62-a1f9-dac4cd6308b4-kube-api-access-g28d5\") pod \"manila-scheduler-0\" (UID: \"25554074-26bb-4b62-a1f9-dac4cd6308b4\") " pod="openstack/manila-scheduler-0"
Feb 19 23:09:05 crc kubenswrapper[4795]: I0219 23:09:05.936555 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/25554074-26bb-4b62-a1f9-dac4cd6308b4-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"25554074-26bb-4b62-a1f9-dac4cd6308b4\") " pod="openstack/manila-scheduler-0"
Feb 19 23:09:05 crc kubenswrapper[4795]: I0219 23:09:05.953797 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6c4f55b9f-ltwn7"]
Feb 19 23:09:05 crc kubenswrapper[4795]: I0219 23:09:05.956340 4795 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/dnsmasq-dns-6c4f55b9f-ltwn7" Feb 19 23:09:05 crc kubenswrapper[4795]: I0219 23:09:05.963293 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c4f55b9f-ltwn7"] Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.039183 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864\") " pod="openstack/manila-share-share1-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.039229 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wdb8\" (UniqueName: \"kubernetes.io/projected/30effec6-7cdf-4ef1-b828-ff6327bb6bce-kube-api-access-6wdb8\") pod \"dnsmasq-dns-6c4f55b9f-ltwn7\" (UID: \"30effec6-7cdf-4ef1-b828-ff6327bb6bce\") " pod="openstack/dnsmasq-dns-6c4f55b9f-ltwn7" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.039256 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25554074-26bb-4b62-a1f9-dac4cd6308b4-config-data\") pod \"manila-scheduler-0\" (UID: \"25554074-26bb-4b62-a1f9-dac4cd6308b4\") " pod="openstack/manila-scheduler-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.039274 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864\") " pod="openstack/manila-share-share1-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.039289 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/30effec6-7cdf-4ef1-b828-ff6327bb6bce-ovsdbserver-nb\") pod \"dnsmasq-dns-6c4f55b9f-ltwn7\" (UID: \"30effec6-7cdf-4ef1-b828-ff6327bb6bce\") " pod="openstack/dnsmasq-dns-6c4f55b9f-ltwn7" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.039331 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864-config-data\") pod \"manila-share-share1-0\" (UID: \"48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864\") " pod="openstack/manila-share-share1-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.039359 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/30effec6-7cdf-4ef1-b828-ff6327bb6bce-dns-svc\") pod \"dnsmasq-dns-6c4f55b9f-ltwn7\" (UID: \"30effec6-7cdf-4ef1-b828-ff6327bb6bce\") " pod="openstack/dnsmasq-dns-6c4f55b9f-ltwn7" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.039374 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30effec6-7cdf-4ef1-b828-ff6327bb6bce-config\") pod \"dnsmasq-dns-6c4f55b9f-ltwn7\" (UID: \"30effec6-7cdf-4ef1-b828-ff6327bb6bce\") " pod="openstack/dnsmasq-dns-6c4f55b9f-ltwn7" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.039440 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/25554074-26bb-4b62-a1f9-dac4cd6308b4-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"25554074-26bb-4b62-a1f9-dac4cd6308b4\") " pod="openstack/manila-scheduler-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.039461 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/30effec6-7cdf-4ef1-b828-ff6327bb6bce-ovsdbserver-sb\") pod \"dnsmasq-dns-6c4f55b9f-ltwn7\" (UID: \"30effec6-7cdf-4ef1-b828-ff6327bb6bce\") " pod="openstack/dnsmasq-dns-6c4f55b9f-ltwn7" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.039484 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25554074-26bb-4b62-a1f9-dac4cd6308b4-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"25554074-26bb-4b62-a1f9-dac4cd6308b4\") " pod="openstack/manila-scheduler-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.039519 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864\") " pod="openstack/manila-share-share1-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.039539 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g28d5\" (UniqueName: \"kubernetes.io/projected/25554074-26bb-4b62-a1f9-dac4cd6308b4-kube-api-access-g28d5\") pod \"manila-scheduler-0\" (UID: \"25554074-26bb-4b62-a1f9-dac4cd6308b4\") " pod="openstack/manila-scheduler-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.039554 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/25554074-26bb-4b62-a1f9-dac4cd6308b4-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"25554074-26bb-4b62-a1f9-dac4cd6308b4\") " pod="openstack/manila-scheduler-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.039581 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4nzg\" (UniqueName: 
\"kubernetes.io/projected/48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864-kube-api-access-c4nzg\") pod \"manila-share-share1-0\" (UID: \"48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864\") " pod="openstack/manila-share-share1-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.039622 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864-ceph\") pod \"manila-share-share1-0\" (UID: \"48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864\") " pod="openstack/manila-share-share1-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.039654 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864-scripts\") pod \"manila-share-share1-0\" (UID: \"48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864\") " pod="openstack/manila-share-share1-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.039669 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25554074-26bb-4b62-a1f9-dac4cd6308b4-scripts\") pod \"manila-scheduler-0\" (UID: \"25554074-26bb-4b62-a1f9-dac4cd6308b4\") " pod="openstack/manila-scheduler-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.039687 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864\") " pod="openstack/manila-share-share1-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.040102 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/25554074-26bb-4b62-a1f9-dac4cd6308b4-etc-machine-id\") pod \"manila-scheduler-0\" (UID: 
\"25554074-26bb-4b62-a1f9-dac4cd6308b4\") " pod="openstack/manila-scheduler-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.046571 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25554074-26bb-4b62-a1f9-dac4cd6308b4-scripts\") pod \"manila-scheduler-0\" (UID: \"25554074-26bb-4b62-a1f9-dac4cd6308b4\") " pod="openstack/manila-scheduler-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.047071 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/25554074-26bb-4b62-a1f9-dac4cd6308b4-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"25554074-26bb-4b62-a1f9-dac4cd6308b4\") " pod="openstack/manila-scheduler-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.047344 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25554074-26bb-4b62-a1f9-dac4cd6308b4-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"25554074-26bb-4b62-a1f9-dac4cd6308b4\") " pod="openstack/manila-scheduler-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.047789 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25554074-26bb-4b62-a1f9-dac4cd6308b4-config-data\") pod \"manila-scheduler-0\" (UID: \"25554074-26bb-4b62-a1f9-dac4cd6308b4\") " pod="openstack/manila-scheduler-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.058684 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g28d5\" (UniqueName: \"kubernetes.io/projected/25554074-26bb-4b62-a1f9-dac4cd6308b4-kube-api-access-g28d5\") pod \"manila-scheduler-0\" (UID: \"25554074-26bb-4b62-a1f9-dac4cd6308b4\") " pod="openstack/manila-scheduler-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.095249 4795 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/manila-api-0"] Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.097253 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.100839 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.119185 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.141942 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864-ceph\") pod \"manila-share-share1-0\" (UID: \"48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864\") " pod="openstack/manila-share-share1-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.142272 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864-scripts\") pod \"manila-share-share1-0\" (UID: \"48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864\") " pod="openstack/manila-share-share1-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.142387 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864\") " pod="openstack/manila-share-share1-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.142502 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864\") " pod="openstack/manila-share-share1-0" Feb 19 23:09:06 
crc kubenswrapper[4795]: I0219 23:09:06.142698 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wdb8\" (UniqueName: \"kubernetes.io/projected/30effec6-7cdf-4ef1-b828-ff6327bb6bce-kube-api-access-6wdb8\") pod \"dnsmasq-dns-6c4f55b9f-ltwn7\" (UID: \"30effec6-7cdf-4ef1-b828-ff6327bb6bce\") " pod="openstack/dnsmasq-dns-6c4f55b9f-ltwn7" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.142793 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864\") " pod="openstack/manila-share-share1-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.142897 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/30effec6-7cdf-4ef1-b828-ff6327bb6bce-ovsdbserver-nb\") pod \"dnsmasq-dns-6c4f55b9f-ltwn7\" (UID: \"30effec6-7cdf-4ef1-b828-ff6327bb6bce\") " pod="openstack/dnsmasq-dns-6c4f55b9f-ltwn7" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.143009 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864-config-data\") pod \"manila-share-share1-0\" (UID: \"48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864\") " pod="openstack/manila-share-share1-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.143112 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/30effec6-7cdf-4ef1-b828-ff6327bb6bce-dns-svc\") pod \"dnsmasq-dns-6c4f55b9f-ltwn7\" (UID: \"30effec6-7cdf-4ef1-b828-ff6327bb6bce\") " pod="openstack/dnsmasq-dns-6c4f55b9f-ltwn7" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.143233 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30effec6-7cdf-4ef1-b828-ff6327bb6bce-config\") pod \"dnsmasq-dns-6c4f55b9f-ltwn7\" (UID: \"30effec6-7cdf-4ef1-b828-ff6327bb6bce\") " pod="openstack/dnsmasq-dns-6c4f55b9f-ltwn7" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.143389 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/30effec6-7cdf-4ef1-b828-ff6327bb6bce-ovsdbserver-sb\") pod \"dnsmasq-dns-6c4f55b9f-ltwn7\" (UID: \"30effec6-7cdf-4ef1-b828-ff6327bb6bce\") " pod="openstack/dnsmasq-dns-6c4f55b9f-ltwn7" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.143512 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864\") " pod="openstack/manila-share-share1-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.143622 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4nzg\" (UniqueName: \"kubernetes.io/projected/48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864-kube-api-access-c4nzg\") pod \"manila-share-share1-0\" (UID: \"48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864\") " pod="openstack/manila-share-share1-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.145275 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/30effec6-7cdf-4ef1-b828-ff6327bb6bce-ovsdbserver-nb\") pod \"dnsmasq-dns-6c4f55b9f-ltwn7\" (UID: \"30effec6-7cdf-4ef1-b828-ff6327bb6bce\") " pod="openstack/dnsmasq-dns-6c4f55b9f-ltwn7" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.146392 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: 
\"kubernetes.io/host-path/48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864\") " pod="openstack/manila-share-share1-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.150014 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864-config-data\") pod \"manila-share-share1-0\" (UID: \"48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864\") " pod="openstack/manila-share-share1-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.150788 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/30effec6-7cdf-4ef1-b828-ff6327bb6bce-dns-svc\") pod \"dnsmasq-dns-6c4f55b9f-ltwn7\" (UID: \"30effec6-7cdf-4ef1-b828-ff6327bb6bce\") " pod="openstack/dnsmasq-dns-6c4f55b9f-ltwn7" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.151394 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30effec6-7cdf-4ef1-b828-ff6327bb6bce-config\") pod \"dnsmasq-dns-6c4f55b9f-ltwn7\" (UID: \"30effec6-7cdf-4ef1-b828-ff6327bb6bce\") " pod="openstack/dnsmasq-dns-6c4f55b9f-ltwn7" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.152354 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864\") " pod="openstack/manila-share-share1-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.153270 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864\") " 
pod="openstack/manila-share-share1-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.153443 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/30effec6-7cdf-4ef1-b828-ff6327bb6bce-ovsdbserver-sb\") pod \"dnsmasq-dns-6c4f55b9f-ltwn7\" (UID: \"30effec6-7cdf-4ef1-b828-ff6327bb6bce\") " pod="openstack/dnsmasq-dns-6c4f55b9f-ltwn7" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.162703 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.164897 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864-ceph\") pod \"manila-share-share1-0\" (UID: \"48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864\") " pod="openstack/manila-share-share1-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.170435 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864-scripts\") pod \"manila-share-share1-0\" (UID: \"48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864\") " pod="openstack/manila-share-share1-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.170961 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864\") " pod="openstack/manila-share-share1-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.178955 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4nzg\" (UniqueName: \"kubernetes.io/projected/48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864-kube-api-access-c4nzg\") pod \"manila-share-share1-0\" (UID: 
\"48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864\") " pod="openstack/manila-share-share1-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.185071 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wdb8\" (UniqueName: \"kubernetes.io/projected/30effec6-7cdf-4ef1-b828-ff6327bb6bce-kube-api-access-6wdb8\") pod \"dnsmasq-dns-6c4f55b9f-ltwn7\" (UID: \"30effec6-7cdf-4ef1-b828-ff6327bb6bce\") " pod="openstack/dnsmasq-dns-6c4f55b9f-ltwn7" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.246631 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b7cef1f6-95e4-4ccd-8a2a-49c27373a96d-etc-machine-id\") pod \"manila-api-0\" (UID: \"b7cef1f6-95e4-4ccd-8a2a-49c27373a96d\") " pod="openstack/manila-api-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.246686 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b7cef1f6-95e4-4ccd-8a2a-49c27373a96d-config-data-custom\") pod \"manila-api-0\" (UID: \"b7cef1f6-95e4-4ccd-8a2a-49c27373a96d\") " pod="openstack/manila-api-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.246916 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7cef1f6-95e4-4ccd-8a2a-49c27373a96d-logs\") pod \"manila-api-0\" (UID: \"b7cef1f6-95e4-4ccd-8a2a-49c27373a96d\") " pod="openstack/manila-api-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.247051 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7cef1f6-95e4-4ccd-8a2a-49c27373a96d-scripts\") pod \"manila-api-0\" (UID: \"b7cef1f6-95e4-4ccd-8a2a-49c27373a96d\") " pod="openstack/manila-api-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 
23:09:06.247282 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7cef1f6-95e4-4ccd-8a2a-49c27373a96d-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"b7cef1f6-95e4-4ccd-8a2a-49c27373a96d\") " pod="openstack/manila-api-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.247497 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7cef1f6-95e4-4ccd-8a2a-49c27373a96d-config-data\") pod \"manila-api-0\" (UID: \"b7cef1f6-95e4-4ccd-8a2a-49c27373a96d\") " pod="openstack/manila-api-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.247608 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77q29\" (UniqueName: \"kubernetes.io/projected/b7cef1f6-95e4-4ccd-8a2a-49c27373a96d-kube-api-access-77q29\") pod \"manila-api-0\" (UID: \"b7cef1f6-95e4-4ccd-8a2a-49c27373a96d\") " pod="openstack/manila-api-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.262193 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.279236 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c4f55b9f-ltwn7" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.349554 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7cef1f6-95e4-4ccd-8a2a-49c27373a96d-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"b7cef1f6-95e4-4ccd-8a2a-49c27373a96d\") " pod="openstack/manila-api-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.349668 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7cef1f6-95e4-4ccd-8a2a-49c27373a96d-config-data\") pod \"manila-api-0\" (UID: \"b7cef1f6-95e4-4ccd-8a2a-49c27373a96d\") " pod="openstack/manila-api-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.349711 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77q29\" (UniqueName: \"kubernetes.io/projected/b7cef1f6-95e4-4ccd-8a2a-49c27373a96d-kube-api-access-77q29\") pod \"manila-api-0\" (UID: \"b7cef1f6-95e4-4ccd-8a2a-49c27373a96d\") " pod="openstack/manila-api-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.349787 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b7cef1f6-95e4-4ccd-8a2a-49c27373a96d-etc-machine-id\") pod \"manila-api-0\" (UID: \"b7cef1f6-95e4-4ccd-8a2a-49c27373a96d\") " pod="openstack/manila-api-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.349810 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b7cef1f6-95e4-4ccd-8a2a-49c27373a96d-config-data-custom\") pod \"manila-api-0\" (UID: \"b7cef1f6-95e4-4ccd-8a2a-49c27373a96d\") " pod="openstack/manila-api-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.349886 4795 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7cef1f6-95e4-4ccd-8a2a-49c27373a96d-logs\") pod \"manila-api-0\" (UID: \"b7cef1f6-95e4-4ccd-8a2a-49c27373a96d\") " pod="openstack/manila-api-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.349917 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7cef1f6-95e4-4ccd-8a2a-49c27373a96d-scripts\") pod \"manila-api-0\" (UID: \"b7cef1f6-95e4-4ccd-8a2a-49c27373a96d\") " pod="openstack/manila-api-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.352058 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7cef1f6-95e4-4ccd-8a2a-49c27373a96d-logs\") pod \"manila-api-0\" (UID: \"b7cef1f6-95e4-4ccd-8a2a-49c27373a96d\") " pod="openstack/manila-api-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.352131 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b7cef1f6-95e4-4ccd-8a2a-49c27373a96d-etc-machine-id\") pod \"manila-api-0\" (UID: \"b7cef1f6-95e4-4ccd-8a2a-49c27373a96d\") " pod="openstack/manila-api-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.356943 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7cef1f6-95e4-4ccd-8a2a-49c27373a96d-scripts\") pod \"manila-api-0\" (UID: \"b7cef1f6-95e4-4ccd-8a2a-49c27373a96d\") " pod="openstack/manila-api-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.359243 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7cef1f6-95e4-4ccd-8a2a-49c27373a96d-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"b7cef1f6-95e4-4ccd-8a2a-49c27373a96d\") " pod="openstack/manila-api-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.372978 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b7cef1f6-95e4-4ccd-8a2a-49c27373a96d-config-data-custom\") pod \"manila-api-0\" (UID: \"b7cef1f6-95e4-4ccd-8a2a-49c27373a96d\") " pod="openstack/manila-api-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.373620 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7cef1f6-95e4-4ccd-8a2a-49c27373a96d-config-data\") pod \"manila-api-0\" (UID: \"b7cef1f6-95e4-4ccd-8a2a-49c27373a96d\") " pod="openstack/manila-api-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.392865 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77q29\" (UniqueName: \"kubernetes.io/projected/b7cef1f6-95e4-4ccd-8a2a-49c27373a96d-kube-api-access-77q29\") pod \"manila-api-0\" (UID: \"b7cef1f6-95e4-4ccd-8a2a-49c27373a96d\") " pod="openstack/manila-api-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.593522 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Feb 19 23:09:06 crc kubenswrapper[4795]: I0219 23:09:06.731939 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Feb 19 23:09:06 crc kubenswrapper[4795]: W0219 23:09:06.760309 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod25554074_26bb_4b62_a1f9_dac4cd6308b4.slice/crio-3340f030743b87c44179e06785a9f331f79071662606164b653198513ed17653 WatchSource:0}: Error finding container 3340f030743b87c44179e06785a9f331f79071662606164b653198513ed17653: Status 404 returned error can't find the container with id 3340f030743b87c44179e06785a9f331f79071662606164b653198513ed17653 Feb 19 23:09:07 crc kubenswrapper[4795]: I0219 23:09:07.103218 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Feb 19 23:09:07 crc kubenswrapper[4795]: I0219 23:09:07.114914 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c4f55b9f-ltwn7"] Feb 19 23:09:07 crc kubenswrapper[4795]: I0219 23:09:07.187995 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Feb 19 23:09:07 crc kubenswrapper[4795]: I0219 23:09:07.566304 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c4f55b9f-ltwn7" event={"ID":"30effec6-7cdf-4ef1-b828-ff6327bb6bce","Type":"ContainerStarted","Data":"78d47d6b1b605f27ba7148a48207891b6a8dfb70fcf1cbffd1a75f4272077818"} Feb 19 23:09:07 crc kubenswrapper[4795]: I0219 23:09:07.572981 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"25554074-26bb-4b62-a1f9-dac4cd6308b4","Type":"ContainerStarted","Data":"3340f030743b87c44179e06785a9f331f79071662606164b653198513ed17653"} Feb 19 23:09:07 crc kubenswrapper[4795]: I0219 23:09:07.575193 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" 
event={"ID":"48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864","Type":"ContainerStarted","Data":"12303e2812dd7f41d5986b133f2354eb3e2eb37ec80e42c2761c4a2c65a83170"} Feb 19 23:09:07 crc kubenswrapper[4795]: I0219 23:09:07.577232 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"b7cef1f6-95e4-4ccd-8a2a-49c27373a96d","Type":"ContainerStarted","Data":"16c2bad66bbd5e23781823b78ccb5fbbafc69f6c84985671756edddcf7e88b1a"} Feb 19 23:09:08 crc kubenswrapper[4795]: I0219 23:09:08.599807 4795 generic.go:334] "Generic (PLEG): container finished" podID="30effec6-7cdf-4ef1-b828-ff6327bb6bce" containerID="1b8149ccac8d7737b0dd0a0f63b35ed22eb45afb237991b5cd443f37a9b00a52" exitCode=0 Feb 19 23:09:08 crc kubenswrapper[4795]: I0219 23:09:08.599888 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c4f55b9f-ltwn7" event={"ID":"30effec6-7cdf-4ef1-b828-ff6327bb6bce","Type":"ContainerDied","Data":"1b8149ccac8d7737b0dd0a0f63b35ed22eb45afb237991b5cd443f37a9b00a52"} Feb 19 23:09:08 crc kubenswrapper[4795]: I0219 23:09:08.625003 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"25554074-26bb-4b62-a1f9-dac4cd6308b4","Type":"ContainerStarted","Data":"80b774f5e004853fd442b3883c009e31328c0024b5ff1f0bf808913283df8cec"} Feb 19 23:09:08 crc kubenswrapper[4795]: I0219 23:09:08.625065 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"25554074-26bb-4b62-a1f9-dac4cd6308b4","Type":"ContainerStarted","Data":"197912b640f4bcb72c1bc7f3064ff9382adb0f54d892c2e831f7536ed5c2c540"} Feb 19 23:09:08 crc kubenswrapper[4795]: I0219 23:09:08.640571 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"b7cef1f6-95e4-4ccd-8a2a-49c27373a96d","Type":"ContainerStarted","Data":"619ee3a23c0cee7450cc7808138eb296572fce554dc64ef68c9f4a217ae21bb1"} Feb 19 23:09:08 crc kubenswrapper[4795]: I0219 23:09:08.643420 4795 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"b7cef1f6-95e4-4ccd-8a2a-49c27373a96d","Type":"ContainerStarted","Data":"9a37176c13c4409f76767df52c8daaeb1ed37ac49259ce43926381ed6dda07df"} Feb 19 23:09:08 crc kubenswrapper[4795]: I0219 23:09:08.643467 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Feb 19 23:09:08 crc kubenswrapper[4795]: I0219 23:09:08.676325 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=3.051637834 podStartE2EDuration="3.676304216s" podCreationTimestamp="2026-02-19 23:09:05 +0000 UTC" firstStartedPulling="2026-02-19 23:09:06.767394058 +0000 UTC m=+6057.959911922" lastFinishedPulling="2026-02-19 23:09:07.39206044 +0000 UTC m=+6058.584578304" observedRunningTime="2026-02-19 23:09:08.662070157 +0000 UTC m=+6059.854588021" watchObservedRunningTime="2026-02-19 23:09:08.676304216 +0000 UTC m=+6059.868822080" Feb 19 23:09:08 crc kubenswrapper[4795]: I0219 23:09:08.741999 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=2.7419829460000003 podStartE2EDuration="2.741982946s" podCreationTimestamp="2026-02-19 23:09:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:09:08.741459582 +0000 UTC m=+6059.933977446" watchObservedRunningTime="2026-02-19 23:09:08.741982946 +0000 UTC m=+6059.934500800" Feb 19 23:09:09 crc kubenswrapper[4795]: I0219 23:09:09.656718 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c4f55b9f-ltwn7" event={"ID":"30effec6-7cdf-4ef1-b828-ff6327bb6bce","Type":"ContainerStarted","Data":"971e5c0737888f0489bc10f6f20325ece6b9f13c0a3485472032845f358aba5b"} Feb 19 23:09:09 crc kubenswrapper[4795]: I0219 23:09:09.683069 4795 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/dnsmasq-dns-6c4f55b9f-ltwn7" podStartSLOduration=4.683050758 podStartE2EDuration="4.683050758s" podCreationTimestamp="2026-02-19 23:09:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:09:09.673677188 +0000 UTC m=+6060.866195052" watchObservedRunningTime="2026-02-19 23:09:09.683050758 +0000 UTC m=+6060.875568622" Feb 19 23:09:10 crc kubenswrapper[4795]: I0219 23:09:10.672018 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6c4f55b9f-ltwn7" Feb 19 23:09:13 crc kubenswrapper[4795]: I0219 23:09:13.683756 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 19 23:09:15 crc kubenswrapper[4795]: I0219 23:09:15.766682 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864","Type":"ContainerStarted","Data":"9258f71dca60325ce87ec0600932b5afe531b726018c02e457dd257e1fd79f11"} Feb 19 23:09:15 crc kubenswrapper[4795]: I0219 23:09:15.767238 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864","Type":"ContainerStarted","Data":"8b3ee7d3c7c3b6df6ad6d84f9683cbddbc63cf20f7beb14ba96413da1c664f62"} Feb 19 23:09:15 crc kubenswrapper[4795]: I0219 23:09:15.791341 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=3.580446182 podStartE2EDuration="10.791325256s" podCreationTimestamp="2026-02-19 23:09:05 +0000 UTC" firstStartedPulling="2026-02-19 23:09:07.098491999 +0000 UTC m=+6058.291009863" lastFinishedPulling="2026-02-19 23:09:14.309371073 +0000 UTC m=+6065.501888937" observedRunningTime="2026-02-19 23:09:15.790102883 +0000 UTC m=+6066.982620767" watchObservedRunningTime="2026-02-19 23:09:15.791325256 
+0000 UTC m=+6066.983843120" Feb 19 23:09:16 crc kubenswrapper[4795]: I0219 23:09:16.162938 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Feb 19 23:09:16 crc kubenswrapper[4795]: I0219 23:09:16.272389 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Feb 19 23:09:16 crc kubenswrapper[4795]: I0219 23:09:16.280412 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6c4f55b9f-ltwn7" Feb 19 23:09:16 crc kubenswrapper[4795]: I0219 23:09:16.357926 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9677b4c57-7nn9w"] Feb 19 23:09:16 crc kubenswrapper[4795]: I0219 23:09:16.358216 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-9677b4c57-7nn9w" podUID="b20710ae-8abe-4d80-8cdf-582fe785e2cc" containerName="dnsmasq-dns" containerID="cri-o://af38521e1f3eb2665f893506c1a46deb190e71a060f854d967c38b1e5c2dbb46" gracePeriod=10 Feb 19 23:09:16 crc kubenswrapper[4795]: I0219 23:09:16.782397 4795 generic.go:334] "Generic (PLEG): container finished" podID="b20710ae-8abe-4d80-8cdf-582fe785e2cc" containerID="af38521e1f3eb2665f893506c1a46deb190e71a060f854d967c38b1e5c2dbb46" exitCode=0 Feb 19 23:09:16 crc kubenswrapper[4795]: I0219 23:09:16.784226 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9677b4c57-7nn9w" event={"ID":"b20710ae-8abe-4d80-8cdf-582fe785e2cc","Type":"ContainerDied","Data":"af38521e1f3eb2665f893506c1a46deb190e71a060f854d967c38b1e5c2dbb46"} Feb 19 23:09:16 crc kubenswrapper[4795]: I0219 23:09:16.945598 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9677b4c57-7nn9w" Feb 19 23:09:16 crc kubenswrapper[4795]: I0219 23:09:16.996198 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b20710ae-8abe-4d80-8cdf-582fe785e2cc-ovsdbserver-sb\") pod \"b20710ae-8abe-4d80-8cdf-582fe785e2cc\" (UID: \"b20710ae-8abe-4d80-8cdf-582fe785e2cc\") " Feb 19 23:09:16 crc kubenswrapper[4795]: I0219 23:09:16.996265 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vc6jd\" (UniqueName: \"kubernetes.io/projected/b20710ae-8abe-4d80-8cdf-582fe785e2cc-kube-api-access-vc6jd\") pod \"b20710ae-8abe-4d80-8cdf-582fe785e2cc\" (UID: \"b20710ae-8abe-4d80-8cdf-582fe785e2cc\") " Feb 19 23:09:16 crc kubenswrapper[4795]: I0219 23:09:16.996471 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b20710ae-8abe-4d80-8cdf-582fe785e2cc-ovsdbserver-nb\") pod \"b20710ae-8abe-4d80-8cdf-582fe785e2cc\" (UID: \"b20710ae-8abe-4d80-8cdf-582fe785e2cc\") " Feb 19 23:09:16 crc kubenswrapper[4795]: I0219 23:09:16.996541 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b20710ae-8abe-4d80-8cdf-582fe785e2cc-config\") pod \"b20710ae-8abe-4d80-8cdf-582fe785e2cc\" (UID: \"b20710ae-8abe-4d80-8cdf-582fe785e2cc\") " Feb 19 23:09:16 crc kubenswrapper[4795]: I0219 23:09:16.996720 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b20710ae-8abe-4d80-8cdf-582fe785e2cc-dns-svc\") pod \"b20710ae-8abe-4d80-8cdf-582fe785e2cc\" (UID: \"b20710ae-8abe-4d80-8cdf-582fe785e2cc\") " Feb 19 23:09:17 crc kubenswrapper[4795]: I0219 23:09:17.003000 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/b20710ae-8abe-4d80-8cdf-582fe785e2cc-kube-api-access-vc6jd" (OuterVolumeSpecName: "kube-api-access-vc6jd") pod "b20710ae-8abe-4d80-8cdf-582fe785e2cc" (UID: "b20710ae-8abe-4d80-8cdf-582fe785e2cc"). InnerVolumeSpecName "kube-api-access-vc6jd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:09:17 crc kubenswrapper[4795]: I0219 23:09:17.068087 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b20710ae-8abe-4d80-8cdf-582fe785e2cc-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b20710ae-8abe-4d80-8cdf-582fe785e2cc" (UID: "b20710ae-8abe-4d80-8cdf-582fe785e2cc"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:09:17 crc kubenswrapper[4795]: I0219 23:09:17.080146 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b20710ae-8abe-4d80-8cdf-582fe785e2cc-config" (OuterVolumeSpecName: "config") pod "b20710ae-8abe-4d80-8cdf-582fe785e2cc" (UID: "b20710ae-8abe-4d80-8cdf-582fe785e2cc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:09:17 crc kubenswrapper[4795]: I0219 23:09:17.097391 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b20710ae-8abe-4d80-8cdf-582fe785e2cc-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b20710ae-8abe-4d80-8cdf-582fe785e2cc" (UID: "b20710ae-8abe-4d80-8cdf-582fe785e2cc"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:09:17 crc kubenswrapper[4795]: I0219 23:09:17.099183 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b20710ae-8abe-4d80-8cdf-582fe785e2cc-config\") on node \"crc\" DevicePath \"\"" Feb 19 23:09:17 crc kubenswrapper[4795]: I0219 23:09:17.099201 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b20710ae-8abe-4d80-8cdf-582fe785e2cc-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 23:09:17 crc kubenswrapper[4795]: I0219 23:09:17.099213 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vc6jd\" (UniqueName: \"kubernetes.io/projected/b20710ae-8abe-4d80-8cdf-582fe785e2cc-kube-api-access-vc6jd\") on node \"crc\" DevicePath \"\"" Feb 19 23:09:17 crc kubenswrapper[4795]: I0219 23:09:17.099223 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b20710ae-8abe-4d80-8cdf-582fe785e2cc-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 23:09:17 crc kubenswrapper[4795]: I0219 23:09:17.103088 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b20710ae-8abe-4d80-8cdf-582fe785e2cc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b20710ae-8abe-4d80-8cdf-582fe785e2cc" (UID: "b20710ae-8abe-4d80-8cdf-582fe785e2cc"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:09:17 crc kubenswrapper[4795]: I0219 23:09:17.201333 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b20710ae-8abe-4d80-8cdf-582fe785e2cc-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 23:09:17 crc kubenswrapper[4795]: I0219 23:09:17.793671 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9677b4c57-7nn9w" Feb 19 23:09:17 crc kubenswrapper[4795]: I0219 23:09:17.794283 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9677b4c57-7nn9w" event={"ID":"b20710ae-8abe-4d80-8cdf-582fe785e2cc","Type":"ContainerDied","Data":"676d66baf2862b20915b7f3f4da6df9613656d85d389c8b7a38b315539d7e5e8"} Feb 19 23:09:17 crc kubenswrapper[4795]: I0219 23:09:17.794323 4795 scope.go:117] "RemoveContainer" containerID="af38521e1f3eb2665f893506c1a46deb190e71a060f854d967c38b1e5c2dbb46" Feb 19 23:09:17 crc kubenswrapper[4795]: I0219 23:09:17.816694 4795 scope.go:117] "RemoveContainer" containerID="b799c6817a13193660264c0e1a6b6bbda173865957b6b9f053b5f17c7df00f19" Feb 19 23:09:17 crc kubenswrapper[4795]: I0219 23:09:17.823132 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9677b4c57-7nn9w"] Feb 19 23:09:17 crc kubenswrapper[4795]: I0219 23:09:17.864349 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-9677b4c57-7nn9w"] Feb 19 23:09:19 crc kubenswrapper[4795]: I0219 23:09:19.409452 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 23:09:19 crc kubenswrapper[4795]: I0219 23:09:19.410075 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d7d7882a-86ad-4b48-8280-af2ced7c6807" containerName="ceilometer-central-agent" containerID="cri-o://44a3b39a44dc6cf8dbe9434bdd68e38ca2c4f2db3b91cc4b6a6862525ce949fc" gracePeriod=30 Feb 19 23:09:19 crc kubenswrapper[4795]: I0219 23:09:19.410236 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d7d7882a-86ad-4b48-8280-af2ced7c6807" containerName="ceilometer-notification-agent" containerID="cri-o://ca8c993b6a51e98c0db74360e21684626b8c5fc48a0421fcdb15ff26685a9b39" gracePeriod=30 Feb 19 23:09:19 crc kubenswrapper[4795]: I0219 23:09:19.410222 4795 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d7d7882a-86ad-4b48-8280-af2ced7c6807" containerName="sg-core" containerID="cri-o://807c413b53a5b1af9be38437a438f28044e30b65d6e5738bee5968445dc73a94" gracePeriod=30 Feb 19 23:09:19 crc kubenswrapper[4795]: I0219 23:09:19.410419 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d7d7882a-86ad-4b48-8280-af2ced7c6807" containerName="proxy-httpd" containerID="cri-o://32823af9fc2a8e52e61f0605bdd2b17739e02e0d4106a20b34de910c13396fbb" gracePeriod=30 Feb 19 23:09:19 crc kubenswrapper[4795]: I0219 23:09:19.523132 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b20710ae-8abe-4d80-8cdf-582fe785e2cc" path="/var/lib/kubelet/pods/b20710ae-8abe-4d80-8cdf-582fe785e2cc/volumes" Feb 19 23:09:19 crc kubenswrapper[4795]: I0219 23:09:19.831681 4795 generic.go:334] "Generic (PLEG): container finished" podID="d7d7882a-86ad-4b48-8280-af2ced7c6807" containerID="32823af9fc2a8e52e61f0605bdd2b17739e02e0d4106a20b34de910c13396fbb" exitCode=0 Feb 19 23:09:19 crc kubenswrapper[4795]: I0219 23:09:19.832176 4795 generic.go:334] "Generic (PLEG): container finished" podID="d7d7882a-86ad-4b48-8280-af2ced7c6807" containerID="807c413b53a5b1af9be38437a438f28044e30b65d6e5738bee5968445dc73a94" exitCode=2 Feb 19 23:09:19 crc kubenswrapper[4795]: I0219 23:09:19.831795 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d7d7882a-86ad-4b48-8280-af2ced7c6807","Type":"ContainerDied","Data":"32823af9fc2a8e52e61f0605bdd2b17739e02e0d4106a20b34de910c13396fbb"} Feb 19 23:09:19 crc kubenswrapper[4795]: I0219 23:09:19.832223 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d7d7882a-86ad-4b48-8280-af2ced7c6807","Type":"ContainerDied","Data":"807c413b53a5b1af9be38437a438f28044e30b65d6e5738bee5968445dc73a94"} Feb 19 23:09:20 crc 
kubenswrapper[4795]: I0219 23:09:20.845020 4795 generic.go:334] "Generic (PLEG): container finished" podID="d7d7882a-86ad-4b48-8280-af2ced7c6807" containerID="44a3b39a44dc6cf8dbe9434bdd68e38ca2c4f2db3b91cc4b6a6862525ce949fc" exitCode=0 Feb 19 23:09:20 crc kubenswrapper[4795]: I0219 23:09:20.845080 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d7d7882a-86ad-4b48-8280-af2ced7c6807","Type":"ContainerDied","Data":"44a3b39a44dc6cf8dbe9434bdd68e38ca2c4f2db3b91cc4b6a6862525ce949fc"} Feb 19 23:09:21 crc kubenswrapper[4795]: I0219 23:09:21.862375 4795 generic.go:334] "Generic (PLEG): container finished" podID="d7d7882a-86ad-4b48-8280-af2ced7c6807" containerID="ca8c993b6a51e98c0db74360e21684626b8c5fc48a0421fcdb15ff26685a9b39" exitCode=0 Feb 19 23:09:21 crc kubenswrapper[4795]: I0219 23:09:21.863061 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d7d7882a-86ad-4b48-8280-af2ced7c6807","Type":"ContainerDied","Data":"ca8c993b6a51e98c0db74360e21684626b8c5fc48a0421fcdb15ff26685a9b39"} Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.093766 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.152081 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7d7882a-86ad-4b48-8280-af2ced7c6807-config-data\") pod \"d7d7882a-86ad-4b48-8280-af2ced7c6807\" (UID: \"d7d7882a-86ad-4b48-8280-af2ced7c6807\") " Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.152119 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7d7882a-86ad-4b48-8280-af2ced7c6807-combined-ca-bundle\") pod \"d7d7882a-86ad-4b48-8280-af2ced7c6807\" (UID: \"d7d7882a-86ad-4b48-8280-af2ced7c6807\") " Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.152200 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-stfh6\" (UniqueName: \"kubernetes.io/projected/d7d7882a-86ad-4b48-8280-af2ced7c6807-kube-api-access-stfh6\") pod \"d7d7882a-86ad-4b48-8280-af2ced7c6807\" (UID: \"d7d7882a-86ad-4b48-8280-af2ced7c6807\") " Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.152268 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d7d7882a-86ad-4b48-8280-af2ced7c6807-run-httpd\") pod \"d7d7882a-86ad-4b48-8280-af2ced7c6807\" (UID: \"d7d7882a-86ad-4b48-8280-af2ced7c6807\") " Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.152294 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d7d7882a-86ad-4b48-8280-af2ced7c6807-log-httpd\") pod \"d7d7882a-86ad-4b48-8280-af2ced7c6807\" (UID: \"d7d7882a-86ad-4b48-8280-af2ced7c6807\") " Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.152500 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/d7d7882a-86ad-4b48-8280-af2ced7c6807-scripts\") pod \"d7d7882a-86ad-4b48-8280-af2ced7c6807\" (UID: \"d7d7882a-86ad-4b48-8280-af2ced7c6807\") " Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.152545 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d7d7882a-86ad-4b48-8280-af2ced7c6807-sg-core-conf-yaml\") pod \"d7d7882a-86ad-4b48-8280-af2ced7c6807\" (UID: \"d7d7882a-86ad-4b48-8280-af2ced7c6807\") " Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.153765 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7d7882a-86ad-4b48-8280-af2ced7c6807-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d7d7882a-86ad-4b48-8280-af2ced7c6807" (UID: "d7d7882a-86ad-4b48-8280-af2ced7c6807"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.153832 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7d7882a-86ad-4b48-8280-af2ced7c6807-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d7d7882a-86ad-4b48-8280-af2ced7c6807" (UID: "d7d7882a-86ad-4b48-8280-af2ced7c6807"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.161430 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7d7882a-86ad-4b48-8280-af2ced7c6807-scripts" (OuterVolumeSpecName: "scripts") pod "d7d7882a-86ad-4b48-8280-af2ced7c6807" (UID: "d7d7882a-86ad-4b48-8280-af2ced7c6807"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.174434 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7d7882a-86ad-4b48-8280-af2ced7c6807-kube-api-access-stfh6" (OuterVolumeSpecName: "kube-api-access-stfh6") pod "d7d7882a-86ad-4b48-8280-af2ced7c6807" (UID: "d7d7882a-86ad-4b48-8280-af2ced7c6807"). InnerVolumeSpecName "kube-api-access-stfh6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.215463 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7d7882a-86ad-4b48-8280-af2ced7c6807-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d7d7882a-86ad-4b48-8280-af2ced7c6807" (UID: "d7d7882a-86ad-4b48-8280-af2ced7c6807"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.254615 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7d7882a-86ad-4b48-8280-af2ced7c6807-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.254651 4795 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d7d7882a-86ad-4b48-8280-af2ced7c6807-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.254662 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-stfh6\" (UniqueName: \"kubernetes.io/projected/d7d7882a-86ad-4b48-8280-af2ced7c6807-kube-api-access-stfh6\") on node \"crc\" DevicePath \"\"" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.254671 4795 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d7d7882a-86ad-4b48-8280-af2ced7c6807-run-httpd\") on node 
\"crc\" DevicePath \"\"" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.254679 4795 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d7d7882a-86ad-4b48-8280-af2ced7c6807-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.264297 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7d7882a-86ad-4b48-8280-af2ced7c6807-config-data" (OuterVolumeSpecName: "config-data") pod "d7d7882a-86ad-4b48-8280-af2ced7c6807" (UID: "d7d7882a-86ad-4b48-8280-af2ced7c6807"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.282320 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7d7882a-86ad-4b48-8280-af2ced7c6807-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d7d7882a-86ad-4b48-8280-af2ced7c6807" (UID: "d7d7882a-86ad-4b48-8280-af2ced7c6807"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.356329 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7d7882a-86ad-4b48-8280-af2ced7c6807-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.356378 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7d7882a-86ad-4b48-8280-af2ced7c6807-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.873552 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d7d7882a-86ad-4b48-8280-af2ced7c6807","Type":"ContainerDied","Data":"2b454f16774527928b9df78521f93b16c0055d0b08e28e664f0730245cc4e62b"} Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.873633 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.873923 4795 scope.go:117] "RemoveContainer" containerID="32823af9fc2a8e52e61f0605bdd2b17739e02e0d4106a20b34de910c13396fbb" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.893073 4795 scope.go:117] "RemoveContainer" containerID="807c413b53a5b1af9be38437a438f28044e30b65d6e5738bee5968445dc73a94" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.919087 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.922470 4795 scope.go:117] "RemoveContainer" containerID="ca8c993b6a51e98c0db74360e21684626b8c5fc48a0421fcdb15ff26685a9b39" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.932517 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.943874 4795 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/ceilometer-0"] Feb 19 23:09:22 crc kubenswrapper[4795]: E0219 23:09:22.944353 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b20710ae-8abe-4d80-8cdf-582fe785e2cc" containerName="dnsmasq-dns" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.944371 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="b20710ae-8abe-4d80-8cdf-582fe785e2cc" containerName="dnsmasq-dns" Feb 19 23:09:22 crc kubenswrapper[4795]: E0219 23:09:22.944395 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7d7882a-86ad-4b48-8280-af2ced7c6807" containerName="ceilometer-central-agent" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.944403 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7d7882a-86ad-4b48-8280-af2ced7c6807" containerName="ceilometer-central-agent" Feb 19 23:09:22 crc kubenswrapper[4795]: E0219 23:09:22.944418 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7d7882a-86ad-4b48-8280-af2ced7c6807" containerName="sg-core" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.944424 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7d7882a-86ad-4b48-8280-af2ced7c6807" containerName="sg-core" Feb 19 23:09:22 crc kubenswrapper[4795]: E0219 23:09:22.944437 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b20710ae-8abe-4d80-8cdf-582fe785e2cc" containerName="init" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.944442 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="b20710ae-8abe-4d80-8cdf-582fe785e2cc" containerName="init" Feb 19 23:09:22 crc kubenswrapper[4795]: E0219 23:09:22.944452 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7d7882a-86ad-4b48-8280-af2ced7c6807" containerName="proxy-httpd" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.944460 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7d7882a-86ad-4b48-8280-af2ced7c6807" containerName="proxy-httpd" Feb 
19 23:09:22 crc kubenswrapper[4795]: E0219 23:09:22.944472 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7d7882a-86ad-4b48-8280-af2ced7c6807" containerName="ceilometer-notification-agent" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.944478 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7d7882a-86ad-4b48-8280-af2ced7c6807" containerName="ceilometer-notification-agent" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.944661 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7d7882a-86ad-4b48-8280-af2ced7c6807" containerName="proxy-httpd" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.944675 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="b20710ae-8abe-4d80-8cdf-582fe785e2cc" containerName="dnsmasq-dns" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.944688 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7d7882a-86ad-4b48-8280-af2ced7c6807" containerName="ceilometer-central-agent" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.944700 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7d7882a-86ad-4b48-8280-af2ced7c6807" containerName="sg-core" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.944717 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7d7882a-86ad-4b48-8280-af2ced7c6807" containerName="ceilometer-notification-agent" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.946665 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.948877 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.950269 4795 scope.go:117] "RemoveContainer" containerID="44a3b39a44dc6cf8dbe9434bdd68e38ca2c4f2db3b91cc4b6a6862525ce949fc" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.950403 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.957932 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.974691 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87a2d0b8-5866-4d88-ab91-cd94c2136c6c-config-data\") pod \"ceilometer-0\" (UID: \"87a2d0b8-5866-4d88-ab91-cd94c2136c6c\") " pod="openstack/ceilometer-0" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.974785 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87a2d0b8-5866-4d88-ab91-cd94c2136c6c-run-httpd\") pod \"ceilometer-0\" (UID: \"87a2d0b8-5866-4d88-ab91-cd94c2136c6c\") " pod="openstack/ceilometer-0" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.974835 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87a2d0b8-5866-4d88-ab91-cd94c2136c6c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"87a2d0b8-5866-4d88-ab91-cd94c2136c6c\") " pod="openstack/ceilometer-0" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.974917 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-k9sx2\" (UniqueName: \"kubernetes.io/projected/87a2d0b8-5866-4d88-ab91-cd94c2136c6c-kube-api-access-k9sx2\") pod \"ceilometer-0\" (UID: \"87a2d0b8-5866-4d88-ab91-cd94c2136c6c\") " pod="openstack/ceilometer-0" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.974942 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87a2d0b8-5866-4d88-ab91-cd94c2136c6c-scripts\") pod \"ceilometer-0\" (UID: \"87a2d0b8-5866-4d88-ab91-cd94c2136c6c\") " pod="openstack/ceilometer-0" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.974989 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/87a2d0b8-5866-4d88-ab91-cd94c2136c6c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"87a2d0b8-5866-4d88-ab91-cd94c2136c6c\") " pod="openstack/ceilometer-0" Feb 19 23:09:22 crc kubenswrapper[4795]: I0219 23:09:22.975038 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87a2d0b8-5866-4d88-ab91-cd94c2136c6c-log-httpd\") pod \"ceilometer-0\" (UID: \"87a2d0b8-5866-4d88-ab91-cd94c2136c6c\") " pod="openstack/ceilometer-0" Feb 19 23:09:23 crc kubenswrapper[4795]: I0219 23:09:23.078422 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9sx2\" (UniqueName: \"kubernetes.io/projected/87a2d0b8-5866-4d88-ab91-cd94c2136c6c-kube-api-access-k9sx2\") pod \"ceilometer-0\" (UID: \"87a2d0b8-5866-4d88-ab91-cd94c2136c6c\") " pod="openstack/ceilometer-0" Feb 19 23:09:23 crc kubenswrapper[4795]: I0219 23:09:23.078495 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87a2d0b8-5866-4d88-ab91-cd94c2136c6c-scripts\") pod \"ceilometer-0\" (UID: 
\"87a2d0b8-5866-4d88-ab91-cd94c2136c6c\") " pod="openstack/ceilometer-0" Feb 19 23:09:23 crc kubenswrapper[4795]: I0219 23:09:23.078569 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/87a2d0b8-5866-4d88-ab91-cd94c2136c6c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"87a2d0b8-5866-4d88-ab91-cd94c2136c6c\") " pod="openstack/ceilometer-0" Feb 19 23:09:23 crc kubenswrapper[4795]: I0219 23:09:23.078645 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87a2d0b8-5866-4d88-ab91-cd94c2136c6c-log-httpd\") pod \"ceilometer-0\" (UID: \"87a2d0b8-5866-4d88-ab91-cd94c2136c6c\") " pod="openstack/ceilometer-0" Feb 19 23:09:23 crc kubenswrapper[4795]: I0219 23:09:23.078826 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87a2d0b8-5866-4d88-ab91-cd94c2136c6c-config-data\") pod \"ceilometer-0\" (UID: \"87a2d0b8-5866-4d88-ab91-cd94c2136c6c\") " pod="openstack/ceilometer-0" Feb 19 23:09:23 crc kubenswrapper[4795]: I0219 23:09:23.078878 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87a2d0b8-5866-4d88-ab91-cd94c2136c6c-run-httpd\") pod \"ceilometer-0\" (UID: \"87a2d0b8-5866-4d88-ab91-cd94c2136c6c\") " pod="openstack/ceilometer-0" Feb 19 23:09:23 crc kubenswrapper[4795]: I0219 23:09:23.078936 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87a2d0b8-5866-4d88-ab91-cd94c2136c6c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"87a2d0b8-5866-4d88-ab91-cd94c2136c6c\") " pod="openstack/ceilometer-0" Feb 19 23:09:23 crc kubenswrapper[4795]: I0219 23:09:23.085332 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/87a2d0b8-5866-4d88-ab91-cd94c2136c6c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"87a2d0b8-5866-4d88-ab91-cd94c2136c6c\") " pod="openstack/ceilometer-0" Feb 19 23:09:23 crc kubenswrapper[4795]: I0219 23:09:23.089542 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87a2d0b8-5866-4d88-ab91-cd94c2136c6c-scripts\") pod \"ceilometer-0\" (UID: \"87a2d0b8-5866-4d88-ab91-cd94c2136c6c\") " pod="openstack/ceilometer-0" Feb 19 23:09:23 crc kubenswrapper[4795]: I0219 23:09:23.093059 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/87a2d0b8-5866-4d88-ab91-cd94c2136c6c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"87a2d0b8-5866-4d88-ab91-cd94c2136c6c\") " pod="openstack/ceilometer-0" Feb 19 23:09:23 crc kubenswrapper[4795]: I0219 23:09:23.093511 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87a2d0b8-5866-4d88-ab91-cd94c2136c6c-log-httpd\") pod \"ceilometer-0\" (UID: \"87a2d0b8-5866-4d88-ab91-cd94c2136c6c\") " pod="openstack/ceilometer-0" Feb 19 23:09:23 crc kubenswrapper[4795]: I0219 23:09:23.098322 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87a2d0b8-5866-4d88-ab91-cd94c2136c6c-config-data\") pod \"ceilometer-0\" (UID: \"87a2d0b8-5866-4d88-ab91-cd94c2136c6c\") " pod="openstack/ceilometer-0" Feb 19 23:09:23 crc kubenswrapper[4795]: I0219 23:09:23.098727 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87a2d0b8-5866-4d88-ab91-cd94c2136c6c-run-httpd\") pod \"ceilometer-0\" (UID: \"87a2d0b8-5866-4d88-ab91-cd94c2136c6c\") " pod="openstack/ceilometer-0" Feb 19 23:09:23 crc kubenswrapper[4795]: I0219 23:09:23.107137 4795 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-k9sx2\" (UniqueName: \"kubernetes.io/projected/87a2d0b8-5866-4d88-ab91-cd94c2136c6c-kube-api-access-k9sx2\") pod \"ceilometer-0\" (UID: \"87a2d0b8-5866-4d88-ab91-cd94c2136c6c\") " pod="openstack/ceilometer-0" Feb 19 23:09:23 crc kubenswrapper[4795]: I0219 23:09:23.263840 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 23:09:23 crc kubenswrapper[4795]: I0219 23:09:23.527445 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7d7882a-86ad-4b48-8280-af2ced7c6807" path="/var/lib/kubelet/pods/d7d7882a-86ad-4b48-8280-af2ced7c6807/volumes" Feb 19 23:09:23 crc kubenswrapper[4795]: W0219 23:09:23.834568 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod87a2d0b8_5866_4d88_ab91_cd94c2136c6c.slice/crio-20a146dd837978cead831ac646b17a84a953d23ffc337a7ad35c8db3d22c778e WatchSource:0}: Error finding container 20a146dd837978cead831ac646b17a84a953d23ffc337a7ad35c8db3d22c778e: Status 404 returned error can't find the container with id 20a146dd837978cead831ac646b17a84a953d23ffc337a7ad35c8db3d22c778e Feb 19 23:09:23 crc kubenswrapper[4795]: I0219 23:09:23.837749 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 23:09:23 crc kubenswrapper[4795]: I0219 23:09:23.843391 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 23:09:23 crc kubenswrapper[4795]: I0219 23:09:23.884209 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87a2d0b8-5866-4d88-ab91-cd94c2136c6c","Type":"ContainerStarted","Data":"20a146dd837978cead831ac646b17a84a953d23ffc337a7ad35c8db3d22c778e"} Feb 19 23:09:24 crc kubenswrapper[4795]: I0219 23:09:24.894038 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"87a2d0b8-5866-4d88-ab91-cd94c2136c6c","Type":"ContainerStarted","Data":"d3b098895da80405758df0193d773c7f712d32d244e01634a1b1139445cb875e"} Feb 19 23:09:25 crc kubenswrapper[4795]: I0219 23:09:25.844509 4795 scope.go:117] "RemoveContainer" containerID="2096e22df9aade6dc70570abe24d3dbe208045474ec5189f56a5d1beac2fa739" Feb 19 23:09:25 crc kubenswrapper[4795]: I0219 23:09:25.871861 4795 scope.go:117] "RemoveContainer" containerID="eb9dfa4e109128ae704882fd6740207250d7781752ee79e05d86190dadc20b22" Feb 19 23:09:25 crc kubenswrapper[4795]: I0219 23:09:25.905023 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87a2d0b8-5866-4d88-ab91-cd94c2136c6c","Type":"ContainerStarted","Data":"a0e565fdba0c7abd230f8945d0858103ea10b5326777256cbafabf6bace4a840"} Feb 19 23:09:25 crc kubenswrapper[4795]: I0219 23:09:25.921420 4795 scope.go:117] "RemoveContainer" containerID="eae4ff82798e1d79080d2f3fe94f82eb86492da01dc7f84fe2ae4aff95927ca0" Feb 19 23:09:25 crc kubenswrapper[4795]: I0219 23:09:25.953152 4795 scope.go:117] "RemoveContainer" containerID="fd9abe25e7c7b1351c86593c296366b3dd62af78525a82ce72038120bd5feb1c" Feb 19 23:09:25 crc kubenswrapper[4795]: I0219 23:09:25.989839 4795 scope.go:117] "RemoveContainer" containerID="3e6c87188fd15e9809b882f54d39f3a4062c4abbd3e889c36c86f66a2c644f00" Feb 19 23:09:26 crc kubenswrapper[4795]: I0219 23:09:26.917419 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87a2d0b8-5866-4d88-ab91-cd94c2136c6c","Type":"ContainerStarted","Data":"92523c6404f017ec2be90d32dc4cf26336fb7adcf1830fd82adee31e7f856fa0"} Feb 19 23:09:27 crc kubenswrapper[4795]: I0219 23:09:27.793146 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Feb 19 23:09:27 crc kubenswrapper[4795]: I0219 23:09:27.799489 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Feb 19 
23:09:27 crc kubenswrapper[4795]: I0219 23:09:27.926214 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87a2d0b8-5866-4d88-ab91-cd94c2136c6c","Type":"ContainerStarted","Data":"a30654e19b3666fea6ec0cf0d65c9b3a5c7aaa72197820ff0b31b291411781d3"} Feb 19 23:09:27 crc kubenswrapper[4795]: I0219 23:09:27.927417 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 19 23:09:27 crc kubenswrapper[4795]: I0219 23:09:27.955820 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.742023213 podStartE2EDuration="5.955800256s" podCreationTimestamp="2026-02-19 23:09:22 +0000 UTC" firstStartedPulling="2026-02-19 23:09:23.837489144 +0000 UTC m=+6075.030007008" lastFinishedPulling="2026-02-19 23:09:27.051266187 +0000 UTC m=+6078.243784051" observedRunningTime="2026-02-19 23:09:27.942140972 +0000 UTC m=+6079.134658836" watchObservedRunningTime="2026-02-19 23:09:27.955800256 +0000 UTC m=+6079.148318130" Feb 19 23:09:28 crc kubenswrapper[4795]: I0219 23:09:28.207788 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/manila-api-0" Feb 19 23:09:28 crc kubenswrapper[4795]: I0219 23:09:28.427049 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 23:09:28 crc kubenswrapper[4795]: I0219 23:09:28.427493 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 23:09:42 crc 
kubenswrapper[4795]: I0219 23:09:42.053942 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-0a84-account-create-update-pl5gt"] Feb 19 23:09:42 crc kubenswrapper[4795]: I0219 23:09:42.068900 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-z6zgv"] Feb 19 23:09:42 crc kubenswrapper[4795]: I0219 23:09:42.084155 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-0a84-account-create-update-pl5gt"] Feb 19 23:09:42 crc kubenswrapper[4795]: I0219 23:09:42.094896 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-z6zgv"] Feb 19 23:09:43 crc kubenswrapper[4795]: I0219 23:09:43.540427 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6eae15d3-0be7-4510-9803-a7ad3f947148" path="/var/lib/kubelet/pods/6eae15d3-0be7-4510-9803-a7ad3f947148/volumes" Feb 19 23:09:43 crc kubenswrapper[4795]: I0219 23:09:43.541905 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de06c33e-b82b-46eb-964b-4bdd02c94166" path="/var/lib/kubelet/pods/de06c33e-b82b-46eb-964b-4bdd02c94166/volumes" Feb 19 23:09:50 crc kubenswrapper[4795]: I0219 23:09:50.037853 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-mn2kc"] Feb 19 23:09:50 crc kubenswrapper[4795]: I0219 23:09:50.055193 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-mn2kc"] Feb 19 23:09:51 crc kubenswrapper[4795]: I0219 23:09:51.527083 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51673183-2fe8-4a11-98f0-dec10081e7fc" path="/var/lib/kubelet/pods/51673183-2fe8-4a11-98f0-dec10081e7fc/volumes" Feb 19 23:09:53 crc kubenswrapper[4795]: I0219 23:09:53.268231 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 19 23:09:58 crc kubenswrapper[4795]: I0219 23:09:58.427349 4795 patch_prober.go:28] interesting 
pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 23:09:58 crc kubenswrapper[4795]: I0219 23:09:58.428093 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 23:10:12 crc kubenswrapper[4795]: I0219 23:10:12.673193 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-857b684d55-kkmvk"] Feb 19 23:10:12 crc kubenswrapper[4795]: I0219 23:10:12.675383 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-857b684d55-kkmvk" Feb 19 23:10:12 crc kubenswrapper[4795]: I0219 23:10:12.677191 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1" Feb 19 23:10:12 crc kubenswrapper[4795]: I0219 23:10:12.686135 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-857b684d55-kkmvk"] Feb 19 23:10:12 crc kubenswrapper[4795]: I0219 23:10:12.769784 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b3d4a3d4-7002-422c-af86-2500e4c15e0b-ovsdbserver-nb\") pod \"dnsmasq-dns-857b684d55-kkmvk\" (UID: \"b3d4a3d4-7002-422c-af86-2500e4c15e0b\") " pod="openstack/dnsmasq-dns-857b684d55-kkmvk" Feb 19 23:10:12 crc kubenswrapper[4795]: I0219 23:10:12.769957 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3d4a3d4-7002-422c-af86-2500e4c15e0b-dns-svc\") pod 
\"dnsmasq-dns-857b684d55-kkmvk\" (UID: \"b3d4a3d4-7002-422c-af86-2500e4c15e0b\") " pod="openstack/dnsmasq-dns-857b684d55-kkmvk" Feb 19 23:10:12 crc kubenswrapper[4795]: I0219 23:10:12.770045 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/b3d4a3d4-7002-422c-af86-2500e4c15e0b-openstack-cell1\") pod \"dnsmasq-dns-857b684d55-kkmvk\" (UID: \"b3d4a3d4-7002-422c-af86-2500e4c15e0b\") " pod="openstack/dnsmasq-dns-857b684d55-kkmvk" Feb 19 23:10:12 crc kubenswrapper[4795]: I0219 23:10:12.770070 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9t5jz\" (UniqueName: \"kubernetes.io/projected/b3d4a3d4-7002-422c-af86-2500e4c15e0b-kube-api-access-9t5jz\") pod \"dnsmasq-dns-857b684d55-kkmvk\" (UID: \"b3d4a3d4-7002-422c-af86-2500e4c15e0b\") " pod="openstack/dnsmasq-dns-857b684d55-kkmvk" Feb 19 23:10:12 crc kubenswrapper[4795]: I0219 23:10:12.770103 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3d4a3d4-7002-422c-af86-2500e4c15e0b-config\") pod \"dnsmasq-dns-857b684d55-kkmvk\" (UID: \"b3d4a3d4-7002-422c-af86-2500e4c15e0b\") " pod="openstack/dnsmasq-dns-857b684d55-kkmvk" Feb 19 23:10:12 crc kubenswrapper[4795]: I0219 23:10:12.770272 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b3d4a3d4-7002-422c-af86-2500e4c15e0b-ovsdbserver-sb\") pod \"dnsmasq-dns-857b684d55-kkmvk\" (UID: \"b3d4a3d4-7002-422c-af86-2500e4c15e0b\") " pod="openstack/dnsmasq-dns-857b684d55-kkmvk" Feb 19 23:10:12 crc kubenswrapper[4795]: I0219 23:10:12.872228 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: 
\"kubernetes.io/configmap/b3d4a3d4-7002-422c-af86-2500e4c15e0b-openstack-cell1\") pod \"dnsmasq-dns-857b684d55-kkmvk\" (UID: \"b3d4a3d4-7002-422c-af86-2500e4c15e0b\") " pod="openstack/dnsmasq-dns-857b684d55-kkmvk" Feb 19 23:10:12 crc kubenswrapper[4795]: I0219 23:10:12.872268 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9t5jz\" (UniqueName: \"kubernetes.io/projected/b3d4a3d4-7002-422c-af86-2500e4c15e0b-kube-api-access-9t5jz\") pod \"dnsmasq-dns-857b684d55-kkmvk\" (UID: \"b3d4a3d4-7002-422c-af86-2500e4c15e0b\") " pod="openstack/dnsmasq-dns-857b684d55-kkmvk" Feb 19 23:10:12 crc kubenswrapper[4795]: I0219 23:10:12.872295 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3d4a3d4-7002-422c-af86-2500e4c15e0b-config\") pod \"dnsmasq-dns-857b684d55-kkmvk\" (UID: \"b3d4a3d4-7002-422c-af86-2500e4c15e0b\") " pod="openstack/dnsmasq-dns-857b684d55-kkmvk" Feb 19 23:10:12 crc kubenswrapper[4795]: I0219 23:10:12.872358 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b3d4a3d4-7002-422c-af86-2500e4c15e0b-ovsdbserver-sb\") pod \"dnsmasq-dns-857b684d55-kkmvk\" (UID: \"b3d4a3d4-7002-422c-af86-2500e4c15e0b\") " pod="openstack/dnsmasq-dns-857b684d55-kkmvk" Feb 19 23:10:12 crc kubenswrapper[4795]: I0219 23:10:12.872393 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b3d4a3d4-7002-422c-af86-2500e4c15e0b-ovsdbserver-nb\") pod \"dnsmasq-dns-857b684d55-kkmvk\" (UID: \"b3d4a3d4-7002-422c-af86-2500e4c15e0b\") " pod="openstack/dnsmasq-dns-857b684d55-kkmvk" Feb 19 23:10:12 crc kubenswrapper[4795]: I0219 23:10:12.872470 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/b3d4a3d4-7002-422c-af86-2500e4c15e0b-dns-svc\") pod \"dnsmasq-dns-857b684d55-kkmvk\" (UID: \"b3d4a3d4-7002-422c-af86-2500e4c15e0b\") " pod="openstack/dnsmasq-dns-857b684d55-kkmvk" Feb 19 23:10:12 crc kubenswrapper[4795]: I0219 23:10:12.873320 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3d4a3d4-7002-422c-af86-2500e4c15e0b-dns-svc\") pod \"dnsmasq-dns-857b684d55-kkmvk\" (UID: \"b3d4a3d4-7002-422c-af86-2500e4c15e0b\") " pod="openstack/dnsmasq-dns-857b684d55-kkmvk" Feb 19 23:10:12 crc kubenswrapper[4795]: I0219 23:10:12.873530 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b3d4a3d4-7002-422c-af86-2500e4c15e0b-ovsdbserver-sb\") pod \"dnsmasq-dns-857b684d55-kkmvk\" (UID: \"b3d4a3d4-7002-422c-af86-2500e4c15e0b\") " pod="openstack/dnsmasq-dns-857b684d55-kkmvk" Feb 19 23:10:12 crc kubenswrapper[4795]: I0219 23:10:12.873627 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/b3d4a3d4-7002-422c-af86-2500e4c15e0b-openstack-cell1\") pod \"dnsmasq-dns-857b684d55-kkmvk\" (UID: \"b3d4a3d4-7002-422c-af86-2500e4c15e0b\") " pod="openstack/dnsmasq-dns-857b684d55-kkmvk" Feb 19 23:10:12 crc kubenswrapper[4795]: I0219 23:10:12.873803 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3d4a3d4-7002-422c-af86-2500e4c15e0b-config\") pod \"dnsmasq-dns-857b684d55-kkmvk\" (UID: \"b3d4a3d4-7002-422c-af86-2500e4c15e0b\") " pod="openstack/dnsmasq-dns-857b684d55-kkmvk" Feb 19 23:10:12 crc kubenswrapper[4795]: I0219 23:10:12.874079 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b3d4a3d4-7002-422c-af86-2500e4c15e0b-ovsdbserver-nb\") pod \"dnsmasq-dns-857b684d55-kkmvk\" (UID: 
\"b3d4a3d4-7002-422c-af86-2500e4c15e0b\") " pod="openstack/dnsmasq-dns-857b684d55-kkmvk" Feb 19 23:10:12 crc kubenswrapper[4795]: I0219 23:10:12.891979 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9t5jz\" (UniqueName: \"kubernetes.io/projected/b3d4a3d4-7002-422c-af86-2500e4c15e0b-kube-api-access-9t5jz\") pod \"dnsmasq-dns-857b684d55-kkmvk\" (UID: \"b3d4a3d4-7002-422c-af86-2500e4c15e0b\") " pod="openstack/dnsmasq-dns-857b684d55-kkmvk" Feb 19 23:10:12 crc kubenswrapper[4795]: I0219 23:10:12.992375 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-857b684d55-kkmvk" Feb 19 23:10:13 crc kubenswrapper[4795]: I0219 23:10:13.479957 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-857b684d55-kkmvk"] Feb 19 23:10:14 crc kubenswrapper[4795]: I0219 23:10:14.378197 4795 generic.go:334] "Generic (PLEG): container finished" podID="b3d4a3d4-7002-422c-af86-2500e4c15e0b" containerID="2e01aa79b584e456d7bb177df7e3c0e3d240599ed160c0a0253a370986182831" exitCode=0 Feb 19 23:10:14 crc kubenswrapper[4795]: I0219 23:10:14.378261 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-857b684d55-kkmvk" event={"ID":"b3d4a3d4-7002-422c-af86-2500e4c15e0b","Type":"ContainerDied","Data":"2e01aa79b584e456d7bb177df7e3c0e3d240599ed160c0a0253a370986182831"} Feb 19 23:10:14 crc kubenswrapper[4795]: I0219 23:10:14.378761 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-857b684d55-kkmvk" event={"ID":"b3d4a3d4-7002-422c-af86-2500e4c15e0b","Type":"ContainerStarted","Data":"330f81a91d8abda42411ed5c57208925e27d58bc7869c4940f8759238ea316af"} Feb 19 23:10:15 crc kubenswrapper[4795]: I0219 23:10:15.388115 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-857b684d55-kkmvk" 
event={"ID":"b3d4a3d4-7002-422c-af86-2500e4c15e0b","Type":"ContainerStarted","Data":"6a99b05d5f7a67497a684c2cb94a28dfa50917779aaccd71fbb28d10ab5f2cd7"} Feb 19 23:10:15 crc kubenswrapper[4795]: I0219 23:10:15.388572 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-857b684d55-kkmvk" Feb 19 23:10:15 crc kubenswrapper[4795]: I0219 23:10:15.411313 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-857b684d55-kkmvk" podStartSLOduration=3.411288436 podStartE2EDuration="3.411288436s" podCreationTimestamp="2026-02-19 23:10:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:10:15.406141999 +0000 UTC m=+6126.598659863" watchObservedRunningTime="2026-02-19 23:10:15.411288436 +0000 UTC m=+6126.603806300" Feb 19 23:10:22 crc kubenswrapper[4795]: I0219 23:10:22.994451 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-857b684d55-kkmvk" Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.057522 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c4f55b9f-ltwn7"] Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.057743 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6c4f55b9f-ltwn7" podUID="30effec6-7cdf-4ef1-b828-ff6327bb6bce" containerName="dnsmasq-dns" containerID="cri-o://971e5c0737888f0489bc10f6f20325ece6b9f13c0a3485472032845f358aba5b" gracePeriod=10 Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.209470 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67d97dc55-pvbjd"] Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.212720 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67d97dc55-pvbjd" Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.239518 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67d97dc55-pvbjd"] Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.309055 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0-ovsdbserver-nb\") pod \"dnsmasq-dns-67d97dc55-pvbjd\" (UID: \"f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0\") " pod="openstack/dnsmasq-dns-67d97dc55-pvbjd" Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.309105 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0-ovsdbserver-sb\") pod \"dnsmasq-dns-67d97dc55-pvbjd\" (UID: \"f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0\") " pod="openstack/dnsmasq-dns-67d97dc55-pvbjd" Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.309148 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0-config\") pod \"dnsmasq-dns-67d97dc55-pvbjd\" (UID: \"f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0\") " pod="openstack/dnsmasq-dns-67d97dc55-pvbjd" Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.309314 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glsg7\" (UniqueName: \"kubernetes.io/projected/f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0-kube-api-access-glsg7\") pod \"dnsmasq-dns-67d97dc55-pvbjd\" (UID: \"f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0\") " pod="openstack/dnsmasq-dns-67d97dc55-pvbjd" Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.309405 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0-openstack-cell1\") pod \"dnsmasq-dns-67d97dc55-pvbjd\" (UID: \"f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0\") " pod="openstack/dnsmasq-dns-67d97dc55-pvbjd" Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.309493 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0-dns-svc\") pod \"dnsmasq-dns-67d97dc55-pvbjd\" (UID: \"f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0\") " pod="openstack/dnsmasq-dns-67d97dc55-pvbjd" Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.411247 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0-ovsdbserver-nb\") pod \"dnsmasq-dns-67d97dc55-pvbjd\" (UID: \"f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0\") " pod="openstack/dnsmasq-dns-67d97dc55-pvbjd" Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.411659 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0-ovsdbserver-sb\") pod \"dnsmasq-dns-67d97dc55-pvbjd\" (UID: \"f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0\") " pod="openstack/dnsmasq-dns-67d97dc55-pvbjd" Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.411696 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0-config\") pod \"dnsmasq-dns-67d97dc55-pvbjd\" (UID: \"f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0\") " pod="openstack/dnsmasq-dns-67d97dc55-pvbjd" Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.411746 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glsg7\" (UniqueName: 
\"kubernetes.io/projected/f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0-kube-api-access-glsg7\") pod \"dnsmasq-dns-67d97dc55-pvbjd\" (UID: \"f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0\") " pod="openstack/dnsmasq-dns-67d97dc55-pvbjd" Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.411777 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0-openstack-cell1\") pod \"dnsmasq-dns-67d97dc55-pvbjd\" (UID: \"f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0\") " pod="openstack/dnsmasq-dns-67d97dc55-pvbjd" Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.411801 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0-dns-svc\") pod \"dnsmasq-dns-67d97dc55-pvbjd\" (UID: \"f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0\") " pod="openstack/dnsmasq-dns-67d97dc55-pvbjd" Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.412513 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0-ovsdbserver-nb\") pod \"dnsmasq-dns-67d97dc55-pvbjd\" (UID: \"f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0\") " pod="openstack/dnsmasq-dns-67d97dc55-pvbjd" Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.412738 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0-config\") pod \"dnsmasq-dns-67d97dc55-pvbjd\" (UID: \"f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0\") " pod="openstack/dnsmasq-dns-67d97dc55-pvbjd" Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.412874 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0-dns-svc\") pod \"dnsmasq-dns-67d97dc55-pvbjd\" (UID: 
\"f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0\") " pod="openstack/dnsmasq-dns-67d97dc55-pvbjd" Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.412878 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0-openstack-cell1\") pod \"dnsmasq-dns-67d97dc55-pvbjd\" (UID: \"f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0\") " pod="openstack/dnsmasq-dns-67d97dc55-pvbjd" Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.413220 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0-ovsdbserver-sb\") pod \"dnsmasq-dns-67d97dc55-pvbjd\" (UID: \"f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0\") " pod="openstack/dnsmasq-dns-67d97dc55-pvbjd" Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.442372 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glsg7\" (UniqueName: \"kubernetes.io/projected/f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0-kube-api-access-glsg7\") pod \"dnsmasq-dns-67d97dc55-pvbjd\" (UID: \"f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0\") " pod="openstack/dnsmasq-dns-67d97dc55-pvbjd" Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.541519 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67d97dc55-pvbjd" Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.642980 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c4f55b9f-ltwn7" Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.716549 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/30effec6-7cdf-4ef1-b828-ff6327bb6bce-dns-svc\") pod \"30effec6-7cdf-4ef1-b828-ff6327bb6bce\" (UID: \"30effec6-7cdf-4ef1-b828-ff6327bb6bce\") " Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.716654 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30effec6-7cdf-4ef1-b828-ff6327bb6bce-config\") pod \"30effec6-7cdf-4ef1-b828-ff6327bb6bce\" (UID: \"30effec6-7cdf-4ef1-b828-ff6327bb6bce\") " Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.716703 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/30effec6-7cdf-4ef1-b828-ff6327bb6bce-ovsdbserver-nb\") pod \"30effec6-7cdf-4ef1-b828-ff6327bb6bce\" (UID: \"30effec6-7cdf-4ef1-b828-ff6327bb6bce\") " Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.716725 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/30effec6-7cdf-4ef1-b828-ff6327bb6bce-ovsdbserver-sb\") pod \"30effec6-7cdf-4ef1-b828-ff6327bb6bce\" (UID: \"30effec6-7cdf-4ef1-b828-ff6327bb6bce\") " Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.716775 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6wdb8\" (UniqueName: \"kubernetes.io/projected/30effec6-7cdf-4ef1-b828-ff6327bb6bce-kube-api-access-6wdb8\") pod \"30effec6-7cdf-4ef1-b828-ff6327bb6bce\" (UID: \"30effec6-7cdf-4ef1-b828-ff6327bb6bce\") " Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.734453 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/30effec6-7cdf-4ef1-b828-ff6327bb6bce-kube-api-access-6wdb8" (OuterVolumeSpecName: "kube-api-access-6wdb8") pod "30effec6-7cdf-4ef1-b828-ff6327bb6bce" (UID: "30effec6-7cdf-4ef1-b828-ff6327bb6bce"). InnerVolumeSpecName "kube-api-access-6wdb8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.806927 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30effec6-7cdf-4ef1-b828-ff6327bb6bce-config" (OuterVolumeSpecName: "config") pod "30effec6-7cdf-4ef1-b828-ff6327bb6bce" (UID: "30effec6-7cdf-4ef1-b828-ff6327bb6bce"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.819106 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30effec6-7cdf-4ef1-b828-ff6327bb6bce-config\") on node \"crc\" DevicePath \"\"" Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.819140 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6wdb8\" (UniqueName: \"kubernetes.io/projected/30effec6-7cdf-4ef1-b828-ff6327bb6bce-kube-api-access-6wdb8\") on node \"crc\" DevicePath \"\"" Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.819642 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30effec6-7cdf-4ef1-b828-ff6327bb6bce-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "30effec6-7cdf-4ef1-b828-ff6327bb6bce" (UID: "30effec6-7cdf-4ef1-b828-ff6327bb6bce"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.825897 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30effec6-7cdf-4ef1-b828-ff6327bb6bce-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "30effec6-7cdf-4ef1-b828-ff6327bb6bce" (UID: "30effec6-7cdf-4ef1-b828-ff6327bb6bce"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.828853 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30effec6-7cdf-4ef1-b828-ff6327bb6bce-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "30effec6-7cdf-4ef1-b828-ff6327bb6bce" (UID: "30effec6-7cdf-4ef1-b828-ff6327bb6bce"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.893464 4795 generic.go:334] "Generic (PLEG): container finished" podID="30effec6-7cdf-4ef1-b828-ff6327bb6bce" containerID="971e5c0737888f0489bc10f6f20325ece6b9f13c0a3485472032845f358aba5b" exitCode=0 Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.893501 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c4f55b9f-ltwn7" event={"ID":"30effec6-7cdf-4ef1-b828-ff6327bb6bce","Type":"ContainerDied","Data":"971e5c0737888f0489bc10f6f20325ece6b9f13c0a3485472032845f358aba5b"} Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.893525 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c4f55b9f-ltwn7" event={"ID":"30effec6-7cdf-4ef1-b828-ff6327bb6bce","Type":"ContainerDied","Data":"78d47d6b1b605f27ba7148a48207891b6a8dfb70fcf1cbffd1a75f4272077818"} Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.893543 4795 scope.go:117] "RemoveContainer" containerID="971e5c0737888f0489bc10f6f20325ece6b9f13c0a3485472032845f358aba5b" 
Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.893654 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c4f55b9f-ltwn7" Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.917312 4795 scope.go:117] "RemoveContainer" containerID="1b8149ccac8d7737b0dd0a0f63b35ed22eb45afb237991b5cd443f37a9b00a52" Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.920457 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/30effec6-7cdf-4ef1-b828-ff6327bb6bce-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.920482 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/30effec6-7cdf-4ef1-b828-ff6327bb6bce-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.920492 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/30effec6-7cdf-4ef1-b828-ff6327bb6bce-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.935335 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c4f55b9f-ltwn7"] Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.946958 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6c4f55b9f-ltwn7"] Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.962062 4795 scope.go:117] "RemoveContainer" containerID="971e5c0737888f0489bc10f6f20325ece6b9f13c0a3485472032845f358aba5b" Feb 19 23:10:23 crc kubenswrapper[4795]: E0219 23:10:23.962676 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"971e5c0737888f0489bc10f6f20325ece6b9f13c0a3485472032845f358aba5b\": container with ID starting with 
971e5c0737888f0489bc10f6f20325ece6b9f13c0a3485472032845f358aba5b not found: ID does not exist" containerID="971e5c0737888f0489bc10f6f20325ece6b9f13c0a3485472032845f358aba5b" Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.962803 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"971e5c0737888f0489bc10f6f20325ece6b9f13c0a3485472032845f358aba5b"} err="failed to get container status \"971e5c0737888f0489bc10f6f20325ece6b9f13c0a3485472032845f358aba5b\": rpc error: code = NotFound desc = could not find container \"971e5c0737888f0489bc10f6f20325ece6b9f13c0a3485472032845f358aba5b\": container with ID starting with 971e5c0737888f0489bc10f6f20325ece6b9f13c0a3485472032845f358aba5b not found: ID does not exist" Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.962901 4795 scope.go:117] "RemoveContainer" containerID="1b8149ccac8d7737b0dd0a0f63b35ed22eb45afb237991b5cd443f37a9b00a52" Feb 19 23:10:23 crc kubenswrapper[4795]: E0219 23:10:23.963484 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b8149ccac8d7737b0dd0a0f63b35ed22eb45afb237991b5cd443f37a9b00a52\": container with ID starting with 1b8149ccac8d7737b0dd0a0f63b35ed22eb45afb237991b5cd443f37a9b00a52 not found: ID does not exist" containerID="1b8149ccac8d7737b0dd0a0f63b35ed22eb45afb237991b5cd443f37a9b00a52" Feb 19 23:10:23 crc kubenswrapper[4795]: I0219 23:10:23.963589 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b8149ccac8d7737b0dd0a0f63b35ed22eb45afb237991b5cd443f37a9b00a52"} err="failed to get container status \"1b8149ccac8d7737b0dd0a0f63b35ed22eb45afb237991b5cd443f37a9b00a52\": rpc error: code = NotFound desc = could not find container \"1b8149ccac8d7737b0dd0a0f63b35ed22eb45afb237991b5cd443f37a9b00a52\": container with ID starting with 1b8149ccac8d7737b0dd0a0f63b35ed22eb45afb237991b5cd443f37a9b00a52 not found: ID does not 
exist" Feb 19 23:10:24 crc kubenswrapper[4795]: I0219 23:10:24.003848 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67d97dc55-pvbjd"] Feb 19 23:10:24 crc kubenswrapper[4795]: I0219 23:10:24.907772 4795 generic.go:334] "Generic (PLEG): container finished" podID="f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0" containerID="257ca0891e832a13103416a629b50498355f02d46fc45d6dd4a1d4893c3d0e68" exitCode=0 Feb 19 23:10:24 crc kubenswrapper[4795]: I0219 23:10:24.907889 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67d97dc55-pvbjd" event={"ID":"f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0","Type":"ContainerDied","Data":"257ca0891e832a13103416a629b50498355f02d46fc45d6dd4a1d4893c3d0e68"} Feb 19 23:10:24 crc kubenswrapper[4795]: I0219 23:10:24.908141 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67d97dc55-pvbjd" event={"ID":"f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0","Type":"ContainerStarted","Data":"26c39c8d6d5c335161f6b2b6d3ebdc63222bb1f8be5c31d6aa4639cd5621bb72"} Feb 19 23:10:25 crc kubenswrapper[4795]: I0219 23:10:25.524819 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30effec6-7cdf-4ef1-b828-ff6327bb6bce" path="/var/lib/kubelet/pods/30effec6-7cdf-4ef1-b828-ff6327bb6bce/volumes" Feb 19 23:10:25 crc kubenswrapper[4795]: I0219 23:10:25.920610 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67d97dc55-pvbjd" event={"ID":"f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0","Type":"ContainerStarted","Data":"867b817d4ada73b53e336ceb69eb248dff3e7fbc04feedd0aa95fb5b38aa93ae"} Feb 19 23:10:25 crc kubenswrapper[4795]: I0219 23:10:25.920743 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-67d97dc55-pvbjd" Feb 19 23:10:25 crc kubenswrapper[4795]: I0219 23:10:25.947651 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-67d97dc55-pvbjd" 
podStartSLOduration=2.947618408 podStartE2EDuration="2.947618408s" podCreationTimestamp="2026-02-19 23:10:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:10:25.941224238 +0000 UTC m=+6137.133742122" watchObservedRunningTime="2026-02-19 23:10:25.947618408 +0000 UTC m=+6137.140136292" Feb 19 23:10:26 crc kubenswrapper[4795]: I0219 23:10:26.234344 4795 scope.go:117] "RemoveContainer" containerID="dc19c0a9eee505e65f65e0357256fb1d1ef9373c082944f9697154c21d026163" Feb 19 23:10:26 crc kubenswrapper[4795]: I0219 23:10:26.261334 4795 scope.go:117] "RemoveContainer" containerID="87dbef2ca275f0dcf2d6fcb445371c808cbb03bd9ce8927982987b0e668dfab9" Feb 19 23:10:26 crc kubenswrapper[4795]: I0219 23:10:26.354421 4795 scope.go:117] "RemoveContainer" containerID="ad3831db9a9f6e12a5ac1eb158cce87c770b2e137cadf64a7ed412c1aedd54c2" Feb 19 23:10:28 crc kubenswrapper[4795]: I0219 23:10:28.427667 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 23:10:28 crc kubenswrapper[4795]: I0219 23:10:28.429819 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 23:10:28 crc kubenswrapper[4795]: I0219 23:10:28.429972 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" Feb 19 23:10:28 crc kubenswrapper[4795]: I0219 23:10:28.431077 4795 kuberuntime_manager.go:1027] "Message for 
Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b0fe96e51c4e702a3b8fdcd9d997ef35d626772b563d6c998f84fa6863685a9d"} pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 23:10:28 crc kubenswrapper[4795]: I0219 23:10:28.431262 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" containerID="cri-o://b0fe96e51c4e702a3b8fdcd9d997ef35d626772b563d6c998f84fa6863685a9d" gracePeriod=600 Feb 19 23:10:28 crc kubenswrapper[4795]: I0219 23:10:28.988021 4795 generic.go:334] "Generic (PLEG): container finished" podID="7591bc58-96f5-486a-8653-0ad93938b019" containerID="b0fe96e51c4e702a3b8fdcd9d997ef35d626772b563d6c998f84fa6863685a9d" exitCode=0 Feb 19 23:10:28 crc kubenswrapper[4795]: I0219 23:10:28.988126 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerDied","Data":"b0fe96e51c4e702a3b8fdcd9d997ef35d626772b563d6c998f84fa6863685a9d"} Feb 19 23:10:28 crc kubenswrapper[4795]: I0219 23:10:28.988444 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerStarted","Data":"ee7132fbe9aafdbd5a750ec85268f079d3e663b545f9aa0997d8e4efe3faebb4"} Feb 19 23:10:28 crc kubenswrapper[4795]: I0219 23:10:28.988477 4795 scope.go:117] "RemoveContainer" containerID="18170a71e3fd2c593ee17e2961f48e8d3663518910bf7cf2c79258393902484c" Feb 19 23:10:33 crc kubenswrapper[4795]: I0219 23:10:33.567006 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/dnsmasq-dns-67d97dc55-pvbjd" Feb 19 23:10:33 crc kubenswrapper[4795]: I0219 23:10:33.648410 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-857b684d55-kkmvk"] Feb 19 23:10:33 crc kubenswrapper[4795]: I0219 23:10:33.649391 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-857b684d55-kkmvk" podUID="b3d4a3d4-7002-422c-af86-2500e4c15e0b" containerName="dnsmasq-dns" containerID="cri-o://6a99b05d5f7a67497a684c2cb94a28dfa50917779aaccd71fbb28d10ab5f2cd7" gracePeriod=10 Feb 19 23:10:34 crc kubenswrapper[4795]: I0219 23:10:34.050301 4795 generic.go:334] "Generic (PLEG): container finished" podID="b3d4a3d4-7002-422c-af86-2500e4c15e0b" containerID="6a99b05d5f7a67497a684c2cb94a28dfa50917779aaccd71fbb28d10ab5f2cd7" exitCode=0 Feb 19 23:10:34 crc kubenswrapper[4795]: I0219 23:10:34.050647 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-857b684d55-kkmvk" event={"ID":"b3d4a3d4-7002-422c-af86-2500e4c15e0b","Type":"ContainerDied","Data":"6a99b05d5f7a67497a684c2cb94a28dfa50917779aaccd71fbb28d10ab5f2cd7"} Feb 19 23:10:34 crc kubenswrapper[4795]: I0219 23:10:34.160829 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-857b684d55-kkmvk" Feb 19 23:10:34 crc kubenswrapper[4795]: I0219 23:10:34.301011 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b3d4a3d4-7002-422c-af86-2500e4c15e0b-ovsdbserver-sb\") pod \"b3d4a3d4-7002-422c-af86-2500e4c15e0b\" (UID: \"b3d4a3d4-7002-422c-af86-2500e4c15e0b\") " Feb 19 23:10:34 crc kubenswrapper[4795]: I0219 23:10:34.301137 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/b3d4a3d4-7002-422c-af86-2500e4c15e0b-openstack-cell1\") pod \"b3d4a3d4-7002-422c-af86-2500e4c15e0b\" (UID: \"b3d4a3d4-7002-422c-af86-2500e4c15e0b\") " Feb 19 23:10:34 crc kubenswrapper[4795]: I0219 23:10:34.301304 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3d4a3d4-7002-422c-af86-2500e4c15e0b-dns-svc\") pod \"b3d4a3d4-7002-422c-af86-2500e4c15e0b\" (UID: \"b3d4a3d4-7002-422c-af86-2500e4c15e0b\") " Feb 19 23:10:34 crc kubenswrapper[4795]: I0219 23:10:34.301378 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b3d4a3d4-7002-422c-af86-2500e4c15e0b-ovsdbserver-nb\") pod \"b3d4a3d4-7002-422c-af86-2500e4c15e0b\" (UID: \"b3d4a3d4-7002-422c-af86-2500e4c15e0b\") " Feb 19 23:10:34 crc kubenswrapper[4795]: I0219 23:10:34.301463 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3d4a3d4-7002-422c-af86-2500e4c15e0b-config\") pod \"b3d4a3d4-7002-422c-af86-2500e4c15e0b\" (UID: \"b3d4a3d4-7002-422c-af86-2500e4c15e0b\") " Feb 19 23:10:34 crc kubenswrapper[4795]: I0219 23:10:34.301513 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9t5jz\" (UniqueName: 
\"kubernetes.io/projected/b3d4a3d4-7002-422c-af86-2500e4c15e0b-kube-api-access-9t5jz\") pod \"b3d4a3d4-7002-422c-af86-2500e4c15e0b\" (UID: \"b3d4a3d4-7002-422c-af86-2500e4c15e0b\") " Feb 19 23:10:34 crc kubenswrapper[4795]: I0219 23:10:34.311476 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3d4a3d4-7002-422c-af86-2500e4c15e0b-kube-api-access-9t5jz" (OuterVolumeSpecName: "kube-api-access-9t5jz") pod "b3d4a3d4-7002-422c-af86-2500e4c15e0b" (UID: "b3d4a3d4-7002-422c-af86-2500e4c15e0b"). InnerVolumeSpecName "kube-api-access-9t5jz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:10:34 crc kubenswrapper[4795]: I0219 23:10:34.368691 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3d4a3d4-7002-422c-af86-2500e4c15e0b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b3d4a3d4-7002-422c-af86-2500e4c15e0b" (UID: "b3d4a3d4-7002-422c-af86-2500e4c15e0b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:10:34 crc kubenswrapper[4795]: I0219 23:10:34.373411 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3d4a3d4-7002-422c-af86-2500e4c15e0b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b3d4a3d4-7002-422c-af86-2500e4c15e0b" (UID: "b3d4a3d4-7002-422c-af86-2500e4c15e0b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:10:34 crc kubenswrapper[4795]: I0219 23:10:34.383566 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3d4a3d4-7002-422c-af86-2500e4c15e0b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b3d4a3d4-7002-422c-af86-2500e4c15e0b" (UID: "b3d4a3d4-7002-422c-af86-2500e4c15e0b"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:10:34 crc kubenswrapper[4795]: I0219 23:10:34.389712 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3d4a3d4-7002-422c-af86-2500e4c15e0b-openstack-cell1" (OuterVolumeSpecName: "openstack-cell1") pod "b3d4a3d4-7002-422c-af86-2500e4c15e0b" (UID: "b3d4a3d4-7002-422c-af86-2500e4c15e0b"). InnerVolumeSpecName "openstack-cell1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:10:34 crc kubenswrapper[4795]: I0219 23:10:34.398357 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3d4a3d4-7002-422c-af86-2500e4c15e0b-config" (OuterVolumeSpecName: "config") pod "b3d4a3d4-7002-422c-af86-2500e4c15e0b" (UID: "b3d4a3d4-7002-422c-af86-2500e4c15e0b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:10:34 crc kubenswrapper[4795]: I0219 23:10:34.404529 4795 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/b3d4a3d4-7002-422c-af86-2500e4c15e0b-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 19 23:10:34 crc kubenswrapper[4795]: I0219 23:10:34.404561 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3d4a3d4-7002-422c-af86-2500e4c15e0b-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 23:10:34 crc kubenswrapper[4795]: I0219 23:10:34.404572 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b3d4a3d4-7002-422c-af86-2500e4c15e0b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 23:10:34 crc kubenswrapper[4795]: I0219 23:10:34.404585 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3d4a3d4-7002-422c-af86-2500e4c15e0b-config\") on node \"crc\" DevicePath \"\"" Feb 19 23:10:34 crc 
kubenswrapper[4795]: I0219 23:10:34.404597 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9t5jz\" (UniqueName: \"kubernetes.io/projected/b3d4a3d4-7002-422c-af86-2500e4c15e0b-kube-api-access-9t5jz\") on node \"crc\" DevicePath \"\"" Feb 19 23:10:34 crc kubenswrapper[4795]: I0219 23:10:34.404610 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b3d4a3d4-7002-422c-af86-2500e4c15e0b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 23:10:35 crc kubenswrapper[4795]: I0219 23:10:35.063313 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-857b684d55-kkmvk" event={"ID":"b3d4a3d4-7002-422c-af86-2500e4c15e0b","Type":"ContainerDied","Data":"330f81a91d8abda42411ed5c57208925e27d58bc7869c4940f8759238ea316af"} Feb 19 23:10:35 crc kubenswrapper[4795]: I0219 23:10:35.064050 4795 scope.go:117] "RemoveContainer" containerID="6a99b05d5f7a67497a684c2cb94a28dfa50917779aaccd71fbb28d10ab5f2cd7" Feb 19 23:10:35 crc kubenswrapper[4795]: I0219 23:10:35.063683 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-857b684d55-kkmvk" Feb 19 23:10:35 crc kubenswrapper[4795]: I0219 23:10:35.095300 4795 scope.go:117] "RemoveContainer" containerID="2e01aa79b584e456d7bb177df7e3c0e3d240599ed160c0a0253a370986182831" Feb 19 23:10:35 crc kubenswrapper[4795]: I0219 23:10:35.122190 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-857b684d55-kkmvk"] Feb 19 23:10:35 crc kubenswrapper[4795]: I0219 23:10:35.138698 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-857b684d55-kkmvk"] Feb 19 23:10:35 crc kubenswrapper[4795]: I0219 23:10:35.522594 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3d4a3d4-7002-422c-af86-2500e4c15e0b" path="/var/lib/kubelet/pods/b3d4a3d4-7002-422c-af86-2500e4c15e0b/volumes" Feb 19 23:10:36 crc kubenswrapper[4795]: I0219 23:10:36.708472 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8dlb8"] Feb 19 23:10:36 crc kubenswrapper[4795]: E0219 23:10:36.709320 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3d4a3d4-7002-422c-af86-2500e4c15e0b" containerName="init" Feb 19 23:10:36 crc kubenswrapper[4795]: I0219 23:10:36.709335 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3d4a3d4-7002-422c-af86-2500e4c15e0b" containerName="init" Feb 19 23:10:36 crc kubenswrapper[4795]: E0219 23:10:36.709369 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30effec6-7cdf-4ef1-b828-ff6327bb6bce" containerName="init" Feb 19 23:10:36 crc kubenswrapper[4795]: I0219 23:10:36.709375 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="30effec6-7cdf-4ef1-b828-ff6327bb6bce" containerName="init" Feb 19 23:10:36 crc kubenswrapper[4795]: E0219 23:10:36.709387 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30effec6-7cdf-4ef1-b828-ff6327bb6bce" containerName="dnsmasq-dns" Feb 19 23:10:36 crc kubenswrapper[4795]: I0219 
23:10:36.709392 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="30effec6-7cdf-4ef1-b828-ff6327bb6bce" containerName="dnsmasq-dns" Feb 19 23:10:36 crc kubenswrapper[4795]: E0219 23:10:36.709411 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3d4a3d4-7002-422c-af86-2500e4c15e0b" containerName="dnsmasq-dns" Feb 19 23:10:36 crc kubenswrapper[4795]: I0219 23:10:36.709417 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3d4a3d4-7002-422c-af86-2500e4c15e0b" containerName="dnsmasq-dns" Feb 19 23:10:36 crc kubenswrapper[4795]: I0219 23:10:36.709602 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3d4a3d4-7002-422c-af86-2500e4c15e0b" containerName="dnsmasq-dns" Feb 19 23:10:36 crc kubenswrapper[4795]: I0219 23:10:36.709623 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="30effec6-7cdf-4ef1-b828-ff6327bb6bce" containerName="dnsmasq-dns" Feb 19 23:10:36 crc kubenswrapper[4795]: I0219 23:10:36.711102 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8dlb8" Feb 19 23:10:36 crc kubenswrapper[4795]: I0219 23:10:36.723341 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8dlb8"] Feb 19 23:10:36 crc kubenswrapper[4795]: I0219 23:10:36.762947 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bc48201-bd6d-4727-90c2-562889c16c68-catalog-content\") pod \"community-operators-8dlb8\" (UID: \"2bc48201-bd6d-4727-90c2-562889c16c68\") " pod="openshift-marketplace/community-operators-8dlb8" Feb 19 23:10:36 crc kubenswrapper[4795]: I0219 23:10:36.763261 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bc48201-bd6d-4727-90c2-562889c16c68-utilities\") pod \"community-operators-8dlb8\" (UID: \"2bc48201-bd6d-4727-90c2-562889c16c68\") " pod="openshift-marketplace/community-operators-8dlb8" Feb 19 23:10:36 crc kubenswrapper[4795]: I0219 23:10:36.763472 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvpl5\" (UniqueName: \"kubernetes.io/projected/2bc48201-bd6d-4727-90c2-562889c16c68-kube-api-access-qvpl5\") pod \"community-operators-8dlb8\" (UID: \"2bc48201-bd6d-4727-90c2-562889c16c68\") " pod="openshift-marketplace/community-operators-8dlb8" Feb 19 23:10:36 crc kubenswrapper[4795]: I0219 23:10:36.864851 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bc48201-bd6d-4727-90c2-562889c16c68-catalog-content\") pod \"community-operators-8dlb8\" (UID: \"2bc48201-bd6d-4727-90c2-562889c16c68\") " pod="openshift-marketplace/community-operators-8dlb8" Feb 19 23:10:36 crc kubenswrapper[4795]: I0219 23:10:36.865612 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bc48201-bd6d-4727-90c2-562889c16c68-utilities\") pod \"community-operators-8dlb8\" (UID: \"2bc48201-bd6d-4727-90c2-562889c16c68\") " pod="openshift-marketplace/community-operators-8dlb8" Feb 19 23:10:36 crc kubenswrapper[4795]: I0219 23:10:36.865317 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bc48201-bd6d-4727-90c2-562889c16c68-catalog-content\") pod \"community-operators-8dlb8\" (UID: \"2bc48201-bd6d-4727-90c2-562889c16c68\") " pod="openshift-marketplace/community-operators-8dlb8" Feb 19 23:10:36 crc kubenswrapper[4795]: I0219 23:10:36.865885 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bc48201-bd6d-4727-90c2-562889c16c68-utilities\") pod \"community-operators-8dlb8\" (UID: \"2bc48201-bd6d-4727-90c2-562889c16c68\") " pod="openshift-marketplace/community-operators-8dlb8" Feb 19 23:10:36 crc kubenswrapper[4795]: I0219 23:10:36.866262 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvpl5\" (UniqueName: \"kubernetes.io/projected/2bc48201-bd6d-4727-90c2-562889c16c68-kube-api-access-qvpl5\") pod \"community-operators-8dlb8\" (UID: \"2bc48201-bd6d-4727-90c2-562889c16c68\") " pod="openshift-marketplace/community-operators-8dlb8" Feb 19 23:10:36 crc kubenswrapper[4795]: I0219 23:10:36.887773 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvpl5\" (UniqueName: \"kubernetes.io/projected/2bc48201-bd6d-4727-90c2-562889c16c68-kube-api-access-qvpl5\") pod \"community-operators-8dlb8\" (UID: \"2bc48201-bd6d-4727-90c2-562889c16c68\") " pod="openshift-marketplace/community-operators-8dlb8" Feb 19 23:10:37 crc kubenswrapper[4795]: I0219 23:10:37.029866 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8dlb8" Feb 19 23:10:37 crc kubenswrapper[4795]: I0219 23:10:37.509861 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8dlb8"] Feb 19 23:10:37 crc kubenswrapper[4795]: W0219 23:10:37.517883 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2bc48201_bd6d_4727_90c2_562889c16c68.slice/crio-06a96660172a86537e02ded953308945bb0365e63bf489599714d5e3a40fa70e WatchSource:0}: Error finding container 06a96660172a86537e02ded953308945bb0365e63bf489599714d5e3a40fa70e: Status 404 returned error can't find the container with id 06a96660172a86537e02ded953308945bb0365e63bf489599714d5e3a40fa70e Feb 19 23:10:38 crc kubenswrapper[4795]: I0219 23:10:38.090958 4795 generic.go:334] "Generic (PLEG): container finished" podID="2bc48201-bd6d-4727-90c2-562889c16c68" containerID="6a05abe6f4069b29be6b5f5fcbf8ded158fa515f69f5981aa82331e623d393e5" exitCode=0 Feb 19 23:10:38 crc kubenswrapper[4795]: I0219 23:10:38.091014 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8dlb8" event={"ID":"2bc48201-bd6d-4727-90c2-562889c16c68","Type":"ContainerDied","Data":"6a05abe6f4069b29be6b5f5fcbf8ded158fa515f69f5981aa82331e623d393e5"} Feb 19 23:10:38 crc kubenswrapper[4795]: I0219 23:10:38.091374 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8dlb8" event={"ID":"2bc48201-bd6d-4727-90c2-562889c16c68","Type":"ContainerStarted","Data":"06a96660172a86537e02ded953308945bb0365e63bf489599714d5e3a40fa70e"} Feb 19 23:10:39 crc kubenswrapper[4795]: I0219 23:10:39.102123 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8dlb8" 
event={"ID":"2bc48201-bd6d-4727-90c2-562889c16c68","Type":"ContainerStarted","Data":"eb261c992c6e6e012c7b1df86458acbc0994a9998ef4e3b76a32dcc8a4e0e786"} Feb 19 23:10:41 crc kubenswrapper[4795]: I0219 23:10:41.123758 4795 generic.go:334] "Generic (PLEG): container finished" podID="2bc48201-bd6d-4727-90c2-562889c16c68" containerID="eb261c992c6e6e012c7b1df86458acbc0994a9998ef4e3b76a32dcc8a4e0e786" exitCode=0 Feb 19 23:10:41 crc kubenswrapper[4795]: I0219 23:10:41.123831 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8dlb8" event={"ID":"2bc48201-bd6d-4727-90c2-562889c16c68","Type":"ContainerDied","Data":"eb261c992c6e6e012c7b1df86458acbc0994a9998ef4e3b76a32dcc8a4e0e786"} Feb 19 23:10:42 crc kubenswrapper[4795]: I0219 23:10:42.136905 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8dlb8" event={"ID":"2bc48201-bd6d-4727-90c2-562889c16c68","Type":"ContainerStarted","Data":"3128b7f06c3d841b4f1d4173d5bffe26bb16b404b8ef664155c2167e6d730fa0"} Feb 19 23:10:42 crc kubenswrapper[4795]: I0219 23:10:42.167279 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8dlb8" podStartSLOduration=2.758856389 podStartE2EDuration="6.167261517s" podCreationTimestamp="2026-02-19 23:10:36 +0000 UTC" firstStartedPulling="2026-02-19 23:10:38.093256316 +0000 UTC m=+6149.285774180" lastFinishedPulling="2026-02-19 23:10:41.501661444 +0000 UTC m=+6152.694179308" observedRunningTime="2026-02-19 23:10:42.157887877 +0000 UTC m=+6153.350405741" watchObservedRunningTime="2026-02-19 23:10:42.167261517 +0000 UTC m=+6153.359779381" Feb 19 23:10:44 crc kubenswrapper[4795]: I0219 23:10:44.715082 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n"] Feb 19 23:10:44 crc kubenswrapper[4795]: I0219 23:10:44.717016 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n" Feb 19 23:10:44 crc kubenswrapper[4795]: I0219 23:10:44.719408 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 23:10:44 crc kubenswrapper[4795]: I0219 23:10:44.719531 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 19 23:10:44 crc kubenswrapper[4795]: I0219 23:10:44.719644 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 19 23:10:44 crc kubenswrapper[4795]: I0219 23:10:44.719693 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-brdnv" Feb 19 23:10:44 crc kubenswrapper[4795]: I0219 23:10:44.733395 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n"] Feb 19 23:10:44 crc kubenswrapper[4795]: I0219 23:10:44.823497 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7ba2e854-6881-4f7f-8068-7abf4df26229-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n\" (UID: \"7ba2e854-6881-4f7f-8068-7abf4df26229\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n" Feb 19 23:10:44 crc kubenswrapper[4795]: I0219 23:10:44.823538 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2s97d\" (UniqueName: \"kubernetes.io/projected/7ba2e854-6881-4f7f-8068-7abf4df26229-kube-api-access-2s97d\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n\" (UID: \"7ba2e854-6881-4f7f-8068-7abf4df26229\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n" Feb 19 23:10:44 crc kubenswrapper[4795]: I0219 
23:10:44.823607 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/7ba2e854-6881-4f7f-8068-7abf4df26229-ssh-key-openstack-cell1\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n\" (UID: \"7ba2e854-6881-4f7f-8068-7abf4df26229\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n" Feb 19 23:10:44 crc kubenswrapper[4795]: I0219 23:10:44.823709 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7ba2e854-6881-4f7f-8068-7abf4df26229-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n\" (UID: \"7ba2e854-6881-4f7f-8068-7abf4df26229\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n" Feb 19 23:10:44 crc kubenswrapper[4795]: I0219 23:10:44.823736 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ba2e854-6881-4f7f-8068-7abf4df26229-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n\" (UID: \"7ba2e854-6881-4f7f-8068-7abf4df26229\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n" Feb 19 23:10:44 crc kubenswrapper[4795]: I0219 23:10:44.925673 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7ba2e854-6881-4f7f-8068-7abf4df26229-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n\" (UID: \"7ba2e854-6881-4f7f-8068-7abf4df26229\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n" Feb 19 23:10:44 crc kubenswrapper[4795]: I0219 23:10:44.926008 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ba2e854-6881-4f7f-8068-7abf4df26229-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n\" (UID: \"7ba2e854-6881-4f7f-8068-7abf4df26229\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n" Feb 19 23:10:44 crc kubenswrapper[4795]: I0219 23:10:44.926637 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7ba2e854-6881-4f7f-8068-7abf4df26229-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n\" (UID: \"7ba2e854-6881-4f7f-8068-7abf4df26229\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n" Feb 19 23:10:44 crc kubenswrapper[4795]: I0219 23:10:44.926664 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2s97d\" (UniqueName: \"kubernetes.io/projected/7ba2e854-6881-4f7f-8068-7abf4df26229-kube-api-access-2s97d\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n\" (UID: \"7ba2e854-6881-4f7f-8068-7abf4df26229\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n" Feb 19 23:10:44 crc kubenswrapper[4795]: I0219 23:10:44.926719 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/7ba2e854-6881-4f7f-8068-7abf4df26229-ssh-key-openstack-cell1\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n\" (UID: \"7ba2e854-6881-4f7f-8068-7abf4df26229\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n" Feb 19 23:10:44 crc kubenswrapper[4795]: I0219 23:10:44.931935 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/7ba2e854-6881-4f7f-8068-7abf4df26229-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n\" (UID: \"7ba2e854-6881-4f7f-8068-7abf4df26229\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n" Feb 19 23:10:44 crc kubenswrapper[4795]: I0219 23:10:44.933979 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ba2e854-6881-4f7f-8068-7abf4df26229-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n\" (UID: \"7ba2e854-6881-4f7f-8068-7abf4df26229\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n" Feb 19 23:10:44 crc kubenswrapper[4795]: I0219 23:10:44.934464 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/7ba2e854-6881-4f7f-8068-7abf4df26229-ssh-key-openstack-cell1\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n\" (UID: \"7ba2e854-6881-4f7f-8068-7abf4df26229\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n" Feb 19 23:10:44 crc kubenswrapper[4795]: I0219 23:10:44.946446 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2s97d\" (UniqueName: \"kubernetes.io/projected/7ba2e854-6881-4f7f-8068-7abf4df26229-kube-api-access-2s97d\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n\" (UID: \"7ba2e854-6881-4f7f-8068-7abf4df26229\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n" Feb 19 23:10:44 crc kubenswrapper[4795]: I0219 23:10:44.949024 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7ba2e854-6881-4f7f-8068-7abf4df26229-ceph\") pod 
\"pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n\" (UID: \"7ba2e854-6881-4f7f-8068-7abf4df26229\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n" Feb 19 23:10:45 crc kubenswrapper[4795]: I0219 23:10:45.035530 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n" Feb 19 23:10:45 crc kubenswrapper[4795]: W0219 23:10:45.616688 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ba2e854_6881_4f7f_8068_7abf4df26229.slice/crio-c4bb08ee4225aef2472cbe0144f71bf546c4fee5554133f74513ed9dda24f9cf WatchSource:0}: Error finding container c4bb08ee4225aef2472cbe0144f71bf546c4fee5554133f74513ed9dda24f9cf: Status 404 returned error can't find the container with id c4bb08ee4225aef2472cbe0144f71bf546c4fee5554133f74513ed9dda24f9cf Feb 19 23:10:45 crc kubenswrapper[4795]: I0219 23:10:45.620887 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n"] Feb 19 23:10:46 crc kubenswrapper[4795]: I0219 23:10:46.176345 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n" event={"ID":"7ba2e854-6881-4f7f-8068-7abf4df26229","Type":"ContainerStarted","Data":"c4bb08ee4225aef2472cbe0144f71bf546c4fee5554133f74513ed9dda24f9cf"} Feb 19 23:10:47 crc kubenswrapper[4795]: I0219 23:10:47.030737 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8dlb8" Feb 19 23:10:47 crc kubenswrapper[4795]: I0219 23:10:47.031081 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8dlb8" Feb 19 23:10:47 crc kubenswrapper[4795]: I0219 23:10:47.100877 4795 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/community-operators-8dlb8" Feb 19 23:10:47 crc kubenswrapper[4795]: I0219 23:10:47.236464 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8dlb8" Feb 19 23:10:47 crc kubenswrapper[4795]: I0219 23:10:47.342066 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8dlb8"] Feb 19 23:10:49 crc kubenswrapper[4795]: I0219 23:10:49.203759 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8dlb8" podUID="2bc48201-bd6d-4727-90c2-562889c16c68" containerName="registry-server" containerID="cri-o://3128b7f06c3d841b4f1d4173d5bffe26bb16b404b8ef664155c2167e6d730fa0" gracePeriod=2 Feb 19 23:10:50 crc kubenswrapper[4795]: I0219 23:10:50.215846 4795 generic.go:334] "Generic (PLEG): container finished" podID="2bc48201-bd6d-4727-90c2-562889c16c68" containerID="3128b7f06c3d841b4f1d4173d5bffe26bb16b404b8ef664155c2167e6d730fa0" exitCode=0 Feb 19 23:10:50 crc kubenswrapper[4795]: I0219 23:10:50.215898 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8dlb8" event={"ID":"2bc48201-bd6d-4727-90c2-562889c16c68","Type":"ContainerDied","Data":"3128b7f06c3d841b4f1d4173d5bffe26bb16b404b8ef664155c2167e6d730fa0"} Feb 19 23:10:55 crc kubenswrapper[4795]: I0219 23:10:55.648697 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8dlb8" Feb 19 23:10:55 crc kubenswrapper[4795]: I0219 23:10:55.776722 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bc48201-bd6d-4727-90c2-562889c16c68-catalog-content\") pod \"2bc48201-bd6d-4727-90c2-562889c16c68\" (UID: \"2bc48201-bd6d-4727-90c2-562889c16c68\") " Feb 19 23:10:55 crc kubenswrapper[4795]: I0219 23:10:55.776916 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bc48201-bd6d-4727-90c2-562889c16c68-utilities\") pod \"2bc48201-bd6d-4727-90c2-562889c16c68\" (UID: \"2bc48201-bd6d-4727-90c2-562889c16c68\") " Feb 19 23:10:55 crc kubenswrapper[4795]: I0219 23:10:55.777058 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qvpl5\" (UniqueName: \"kubernetes.io/projected/2bc48201-bd6d-4727-90c2-562889c16c68-kube-api-access-qvpl5\") pod \"2bc48201-bd6d-4727-90c2-562889c16c68\" (UID: \"2bc48201-bd6d-4727-90c2-562889c16c68\") " Feb 19 23:10:55 crc kubenswrapper[4795]: I0219 23:10:55.779287 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2bc48201-bd6d-4727-90c2-562889c16c68-utilities" (OuterVolumeSpecName: "utilities") pod "2bc48201-bd6d-4727-90c2-562889c16c68" (UID: "2bc48201-bd6d-4727-90c2-562889c16c68"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:10:55 crc kubenswrapper[4795]: I0219 23:10:55.781260 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bc48201-bd6d-4727-90c2-562889c16c68-kube-api-access-qvpl5" (OuterVolumeSpecName: "kube-api-access-qvpl5") pod "2bc48201-bd6d-4727-90c2-562889c16c68" (UID: "2bc48201-bd6d-4727-90c2-562889c16c68"). InnerVolumeSpecName "kube-api-access-qvpl5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:10:55 crc kubenswrapper[4795]: I0219 23:10:55.823299 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2bc48201-bd6d-4727-90c2-562889c16c68-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2bc48201-bd6d-4727-90c2-562889c16c68" (UID: "2bc48201-bd6d-4727-90c2-562889c16c68"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:10:55 crc kubenswrapper[4795]: I0219 23:10:55.880962 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bc48201-bd6d-4727-90c2-562889c16c68-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 23:10:55 crc kubenswrapper[4795]: I0219 23:10:55.881238 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qvpl5\" (UniqueName: \"kubernetes.io/projected/2bc48201-bd6d-4727-90c2-562889c16c68-kube-api-access-qvpl5\") on node \"crc\" DevicePath \"\"" Feb 19 23:10:55 crc kubenswrapper[4795]: I0219 23:10:55.881322 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bc48201-bd6d-4727-90c2-562889c16c68-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 23:10:55 crc kubenswrapper[4795]: I0219 23:10:55.916474 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-m4x8w"] Feb 19 23:10:55 crc kubenswrapper[4795]: E0219 23:10:55.917463 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bc48201-bd6d-4727-90c2-562889c16c68" containerName="registry-server" Feb 19 23:10:55 crc kubenswrapper[4795]: I0219 23:10:55.917504 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bc48201-bd6d-4727-90c2-562889c16c68" containerName="registry-server" Feb 19 23:10:55 crc kubenswrapper[4795]: E0219 23:10:55.917522 4795 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="2bc48201-bd6d-4727-90c2-562889c16c68" containerName="extract-content" Feb 19 23:10:55 crc kubenswrapper[4795]: I0219 23:10:55.917529 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bc48201-bd6d-4727-90c2-562889c16c68" containerName="extract-content" Feb 19 23:10:55 crc kubenswrapper[4795]: E0219 23:10:55.917574 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bc48201-bd6d-4727-90c2-562889c16c68" containerName="extract-utilities" Feb 19 23:10:55 crc kubenswrapper[4795]: I0219 23:10:55.917582 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bc48201-bd6d-4727-90c2-562889c16c68" containerName="extract-utilities" Feb 19 23:10:55 crc kubenswrapper[4795]: I0219 23:10:55.918059 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bc48201-bd6d-4727-90c2-562889c16c68" containerName="registry-server" Feb 19 23:10:55 crc kubenswrapper[4795]: I0219 23:10:55.920727 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m4x8w" Feb 19 23:10:55 crc kubenswrapper[4795]: I0219 23:10:55.937497 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m4x8w"] Feb 19 23:10:56 crc kubenswrapper[4795]: I0219 23:10:56.084584 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frs6x\" (UniqueName: \"kubernetes.io/projected/f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15-kube-api-access-frs6x\") pod \"redhat-operators-m4x8w\" (UID: \"f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15\") " pod="openshift-marketplace/redhat-operators-m4x8w" Feb 19 23:10:56 crc kubenswrapper[4795]: I0219 23:10:56.084626 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15-utilities\") pod \"redhat-operators-m4x8w\" (UID: 
\"f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15\") " pod="openshift-marketplace/redhat-operators-m4x8w" Feb 19 23:10:56 crc kubenswrapper[4795]: I0219 23:10:56.084650 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15-catalog-content\") pod \"redhat-operators-m4x8w\" (UID: \"f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15\") " pod="openshift-marketplace/redhat-operators-m4x8w" Feb 19 23:10:56 crc kubenswrapper[4795]: I0219 23:10:56.187034 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frs6x\" (UniqueName: \"kubernetes.io/projected/f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15-kube-api-access-frs6x\") pod \"redhat-operators-m4x8w\" (UID: \"f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15\") " pod="openshift-marketplace/redhat-operators-m4x8w" Feb 19 23:10:56 crc kubenswrapper[4795]: I0219 23:10:56.187093 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15-utilities\") pod \"redhat-operators-m4x8w\" (UID: \"f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15\") " pod="openshift-marketplace/redhat-operators-m4x8w" Feb 19 23:10:56 crc kubenswrapper[4795]: I0219 23:10:56.187113 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15-catalog-content\") pod \"redhat-operators-m4x8w\" (UID: \"f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15\") " pod="openshift-marketplace/redhat-operators-m4x8w" Feb 19 23:10:56 crc kubenswrapper[4795]: I0219 23:10:56.187931 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15-catalog-content\") pod \"redhat-operators-m4x8w\" (UID: 
\"f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15\") " pod="openshift-marketplace/redhat-operators-m4x8w" Feb 19 23:10:56 crc kubenswrapper[4795]: I0219 23:10:56.188134 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15-utilities\") pod \"redhat-operators-m4x8w\" (UID: \"f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15\") " pod="openshift-marketplace/redhat-operators-m4x8w" Feb 19 23:10:56 crc kubenswrapper[4795]: I0219 23:10:56.202304 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frs6x\" (UniqueName: \"kubernetes.io/projected/f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15-kube-api-access-frs6x\") pod \"redhat-operators-m4x8w\" (UID: \"f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15\") " pod="openshift-marketplace/redhat-operators-m4x8w" Feb 19 23:10:56 crc kubenswrapper[4795]: I0219 23:10:56.244796 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m4x8w" Feb 19 23:10:56 crc kubenswrapper[4795]: I0219 23:10:56.284485 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n" event={"ID":"7ba2e854-6881-4f7f-8068-7abf4df26229","Type":"ContainerStarted","Data":"bcded33b25e3ae27e84f77a3f23c65d96a1486e81879f51358b255339adf4627"} Feb 19 23:10:56 crc kubenswrapper[4795]: I0219 23:10:56.290661 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8dlb8" event={"ID":"2bc48201-bd6d-4727-90c2-562889c16c68","Type":"ContainerDied","Data":"06a96660172a86537e02ded953308945bb0365e63bf489599714d5e3a40fa70e"} Feb 19 23:10:56 crc kubenswrapper[4795]: I0219 23:10:56.290705 4795 scope.go:117] "RemoveContainer" containerID="3128b7f06c3d841b4f1d4173d5bffe26bb16b404b8ef664155c2167e6d730fa0" Feb 19 23:10:56 crc kubenswrapper[4795]: I0219 23:10:56.290777 4795 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8dlb8" Feb 19 23:10:56 crc kubenswrapper[4795]: I0219 23:10:56.304796 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n" podStartSLOduration=2.5750680299999997 podStartE2EDuration="12.304776082s" podCreationTimestamp="2026-02-19 23:10:44 +0000 UTC" firstStartedPulling="2026-02-19 23:10:45.619545333 +0000 UTC m=+6156.812063207" lastFinishedPulling="2026-02-19 23:10:55.349253405 +0000 UTC m=+6166.541771259" observedRunningTime="2026-02-19 23:10:56.299372578 +0000 UTC m=+6167.491890442" watchObservedRunningTime="2026-02-19 23:10:56.304776082 +0000 UTC m=+6167.497293946" Feb 19 23:10:56 crc kubenswrapper[4795]: I0219 23:10:56.406942 4795 scope.go:117] "RemoveContainer" containerID="eb261c992c6e6e012c7b1df86458acbc0994a9998ef4e3b76a32dcc8a4e0e786" Feb 19 23:10:56 crc kubenswrapper[4795]: I0219 23:10:56.427490 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8dlb8"] Feb 19 23:10:56 crc kubenswrapper[4795]: I0219 23:10:56.441323 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8dlb8"] Feb 19 23:10:56 crc kubenswrapper[4795]: I0219 23:10:56.464844 4795 scope.go:117] "RemoveContainer" containerID="6a05abe6f4069b29be6b5f5fcbf8ded158fa515f69f5981aa82331e623d393e5" Feb 19 23:10:56 crc kubenswrapper[4795]: I0219 23:10:56.807021 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m4x8w"] Feb 19 23:10:57 crc kubenswrapper[4795]: I0219 23:10:57.301799 4795 generic.go:334] "Generic (PLEG): container finished" podID="f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15" containerID="2559741c12229f925dd1d5e142ff010ee82861e5a3b7c05df0b350fec30d058b" exitCode=0 Feb 19 23:10:57 crc kubenswrapper[4795]: I0219 23:10:57.302000 4795 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m4x8w" event={"ID":"f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15","Type":"ContainerDied","Data":"2559741c12229f925dd1d5e142ff010ee82861e5a3b7c05df0b350fec30d058b"} Feb 19 23:10:57 crc kubenswrapper[4795]: I0219 23:10:57.302145 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m4x8w" event={"ID":"f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15","Type":"ContainerStarted","Data":"69a009f19892e689d5d7710989b3ec9c3b063552425a36b7ffeb2388d0a9aa9a"} Feb 19 23:10:57 crc kubenswrapper[4795]: I0219 23:10:57.526953 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2bc48201-bd6d-4727-90c2-562889c16c68" path="/var/lib/kubelet/pods/2bc48201-bd6d-4727-90c2-562889c16c68/volumes" Feb 19 23:10:58 crc kubenswrapper[4795]: I0219 23:10:58.313339 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m4x8w" event={"ID":"f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15","Type":"ContainerStarted","Data":"cb888e3e269a72b86567197db47985a0e1a268957c3e6bff15a8167cb9638671"} Feb 19 23:11:02 crc kubenswrapper[4795]: I0219 23:11:02.355466 4795 generic.go:334] "Generic (PLEG): container finished" podID="f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15" containerID="cb888e3e269a72b86567197db47985a0e1a268957c3e6bff15a8167cb9638671" exitCode=0 Feb 19 23:11:02 crc kubenswrapper[4795]: I0219 23:11:02.356133 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m4x8w" event={"ID":"f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15","Type":"ContainerDied","Data":"cb888e3e269a72b86567197db47985a0e1a268957c3e6bff15a8167cb9638671"} Feb 19 23:11:03 crc kubenswrapper[4795]: I0219 23:11:03.367718 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m4x8w" 
event={"ID":"f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15","Type":"ContainerStarted","Data":"f4746fc282e0aeb2af21a6ee7597a1bbd1b787b9a31e5c2b02895def99620fe4"} Feb 19 23:11:03 crc kubenswrapper[4795]: I0219 23:11:03.387204 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-m4x8w" podStartSLOduration=2.908654414 podStartE2EDuration="8.387182364s" podCreationTimestamp="2026-02-19 23:10:55 +0000 UTC" firstStartedPulling="2026-02-19 23:10:57.304428956 +0000 UTC m=+6168.496946820" lastFinishedPulling="2026-02-19 23:11:02.782956906 +0000 UTC m=+6173.975474770" observedRunningTime="2026-02-19 23:11:03.382569231 +0000 UTC m=+6174.575087095" watchObservedRunningTime="2026-02-19 23:11:03.387182364 +0000 UTC m=+6174.579700228" Feb 19 23:11:06 crc kubenswrapper[4795]: I0219 23:11:06.245412 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-m4x8w" Feb 19 23:11:06 crc kubenswrapper[4795]: I0219 23:11:06.247033 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-m4x8w" Feb 19 23:11:07 crc kubenswrapper[4795]: I0219 23:11:07.300997 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-m4x8w" podUID="f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15" containerName="registry-server" probeResult="failure" output=< Feb 19 23:11:07 crc kubenswrapper[4795]: timeout: failed to connect service ":50051" within 1s Feb 19 23:11:07 crc kubenswrapper[4795]: > Feb 19 23:11:08 crc kubenswrapper[4795]: I0219 23:11:08.420778 4795 generic.go:334] "Generic (PLEG): container finished" podID="7ba2e854-6881-4f7f-8068-7abf4df26229" containerID="bcded33b25e3ae27e84f77a3f23c65d96a1486e81879f51358b255339adf4627" exitCode=0 Feb 19 23:11:08 crc kubenswrapper[4795]: I0219 23:11:08.420861 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n" event={"ID":"7ba2e854-6881-4f7f-8068-7abf4df26229","Type":"ContainerDied","Data":"bcded33b25e3ae27e84f77a3f23c65d96a1486e81879f51358b255339adf4627"} Feb 19 23:11:09 crc kubenswrapper[4795]: I0219 23:11:09.875617 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n" Feb 19 23:11:09 crc kubenswrapper[4795]: I0219 23:11:09.983008 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7ba2e854-6881-4f7f-8068-7abf4df26229-ceph\") pod \"7ba2e854-6881-4f7f-8068-7abf4df26229\" (UID: \"7ba2e854-6881-4f7f-8068-7abf4df26229\") " Feb 19 23:11:09 crc kubenswrapper[4795]: I0219 23:11:09.983408 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/7ba2e854-6881-4f7f-8068-7abf4df26229-ssh-key-openstack-cell1\") pod \"7ba2e854-6881-4f7f-8068-7abf4df26229\" (UID: \"7ba2e854-6881-4f7f-8068-7abf4df26229\") " Feb 19 23:11:09 crc kubenswrapper[4795]: I0219 23:11:09.983467 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ba2e854-6881-4f7f-8068-7abf4df26229-pre-adoption-validation-combined-ca-bundle\") pod \"7ba2e854-6881-4f7f-8068-7abf4df26229\" (UID: \"7ba2e854-6881-4f7f-8068-7abf4df26229\") " Feb 19 23:11:09 crc kubenswrapper[4795]: I0219 23:11:09.983559 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2s97d\" (UniqueName: \"kubernetes.io/projected/7ba2e854-6881-4f7f-8068-7abf4df26229-kube-api-access-2s97d\") pod \"7ba2e854-6881-4f7f-8068-7abf4df26229\" (UID: \"7ba2e854-6881-4f7f-8068-7abf4df26229\") " Feb 19 23:11:09 crc kubenswrapper[4795]: I0219 23:11:09.983673 4795 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7ba2e854-6881-4f7f-8068-7abf4df26229-inventory\") pod \"7ba2e854-6881-4f7f-8068-7abf4df26229\" (UID: \"7ba2e854-6881-4f7f-8068-7abf4df26229\") " Feb 19 23:11:09 crc kubenswrapper[4795]: I0219 23:11:09.991415 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ba2e854-6881-4f7f-8068-7abf4df26229-pre-adoption-validation-combined-ca-bundle" (OuterVolumeSpecName: "pre-adoption-validation-combined-ca-bundle") pod "7ba2e854-6881-4f7f-8068-7abf4df26229" (UID: "7ba2e854-6881-4f7f-8068-7abf4df26229"). InnerVolumeSpecName "pre-adoption-validation-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:11:10 crc kubenswrapper[4795]: I0219 23:11:10.003869 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ba2e854-6881-4f7f-8068-7abf4df26229-kube-api-access-2s97d" (OuterVolumeSpecName: "kube-api-access-2s97d") pod "7ba2e854-6881-4f7f-8068-7abf4df26229" (UID: "7ba2e854-6881-4f7f-8068-7abf4df26229"). InnerVolumeSpecName "kube-api-access-2s97d". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:11:10 crc kubenswrapper[4795]: I0219 23:11:10.004185 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ba2e854-6881-4f7f-8068-7abf4df26229-ceph" (OuterVolumeSpecName: "ceph") pod "7ba2e854-6881-4f7f-8068-7abf4df26229" (UID: "7ba2e854-6881-4f7f-8068-7abf4df26229"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:11:10 crc kubenswrapper[4795]: I0219 23:11:10.014349 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ba2e854-6881-4f7f-8068-7abf4df26229-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "7ba2e854-6881-4f7f-8068-7abf4df26229" (UID: "7ba2e854-6881-4f7f-8068-7abf4df26229"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:11:10 crc kubenswrapper[4795]: I0219 23:11:10.019326 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ba2e854-6881-4f7f-8068-7abf4df26229-inventory" (OuterVolumeSpecName: "inventory") pod "7ba2e854-6881-4f7f-8068-7abf4df26229" (UID: "7ba2e854-6881-4f7f-8068-7abf4df26229"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:11:10 crc kubenswrapper[4795]: I0219 23:11:10.087044 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7ba2e854-6881-4f7f-8068-7abf4df26229-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 23:11:10 crc kubenswrapper[4795]: I0219 23:11:10.087085 4795 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7ba2e854-6881-4f7f-8068-7abf4df26229-ceph\") on node \"crc\" DevicePath \"\"" Feb 19 23:11:10 crc kubenswrapper[4795]: I0219 23:11:10.087120 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/7ba2e854-6881-4f7f-8068-7abf4df26229-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 19 23:11:10 crc kubenswrapper[4795]: I0219 23:11:10.087137 4795 reconciler_common.go:293] "Volume detached for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7ba2e854-6881-4f7f-8068-7abf4df26229-pre-adoption-validation-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:11:10 crc kubenswrapper[4795]: I0219 23:11:10.087150 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2s97d\" (UniqueName: \"kubernetes.io/projected/7ba2e854-6881-4f7f-8068-7abf4df26229-kube-api-access-2s97d\") on node \"crc\" DevicePath \"\"" Feb 19 23:11:10 crc kubenswrapper[4795]: I0219 23:11:10.440354 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n" event={"ID":"7ba2e854-6881-4f7f-8068-7abf4df26229","Type":"ContainerDied","Data":"c4bb08ee4225aef2472cbe0144f71bf546c4fee5554133f74513ed9dda24f9cf"} Feb 19 23:11:10 crc kubenswrapper[4795]: I0219 23:11:10.440404 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4bb08ee4225aef2472cbe0144f71bf546c4fee5554133f74513ed9dda24f9cf" Feb 19 23:11:10 crc kubenswrapper[4795]: I0219 23:11:10.440436 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n" Feb 19 23:11:17 crc kubenswrapper[4795]: I0219 23:11:17.305862 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-m4x8w" podUID="f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15" containerName="registry-server" probeResult="failure" output=< Feb 19 23:11:17 crc kubenswrapper[4795]: timeout: failed to connect service ":50051" within 1s Feb 19 23:11:17 crc kubenswrapper[4795]: > Feb 19 23:11:18 crc kubenswrapper[4795]: I0219 23:11:18.269732 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg"] Feb 19 23:11:18 crc kubenswrapper[4795]: E0219 23:11:18.270272 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ba2e854-6881-4f7f-8068-7abf4df26229" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Feb 19 23:11:18 crc kubenswrapper[4795]: I0219 23:11:18.270297 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ba2e854-6881-4f7f-8068-7abf4df26229" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Feb 19 23:11:18 crc kubenswrapper[4795]: I0219 23:11:18.270618 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ba2e854-6881-4f7f-8068-7abf4df26229" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Feb 19 23:11:18 crc kubenswrapper[4795]: I0219 23:11:18.271562 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg" Feb 19 23:11:18 crc kubenswrapper[4795]: I0219 23:11:18.272918 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hh9wf\" (UniqueName: \"kubernetes.io/projected/c3cbdd11-d93f-4025-9c08-7530a68f6113-kube-api-access-hh9wf\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg\" (UID: \"c3cbdd11-d93f-4025-9c08-7530a68f6113\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg" Feb 19 23:11:18 crc kubenswrapper[4795]: I0219 23:11:18.273284 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c3cbdd11-d93f-4025-9c08-7530a68f6113-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg\" (UID: \"c3cbdd11-d93f-4025-9c08-7530a68f6113\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg" Feb 19 23:11:18 crc kubenswrapper[4795]: I0219 23:11:18.273394 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3cbdd11-d93f-4025-9c08-7530a68f6113-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg\" (UID: \"c3cbdd11-d93f-4025-9c08-7530a68f6113\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg" Feb 19 23:11:18 crc kubenswrapper[4795]: I0219 23:11:18.273450 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c3cbdd11-d93f-4025-9c08-7530a68f6113-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg\" (UID: \"c3cbdd11-d93f-4025-9c08-7530a68f6113\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg" Feb 19 23:11:18 crc kubenswrapper[4795]: I0219 
23:11:18.273475 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/c3cbdd11-d93f-4025-9c08-7530a68f6113-ssh-key-openstack-cell1\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg\" (UID: \"c3cbdd11-d93f-4025-9c08-7530a68f6113\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg" Feb 19 23:11:18 crc kubenswrapper[4795]: I0219 23:11:18.274056 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-brdnv" Feb 19 23:11:18 crc kubenswrapper[4795]: I0219 23:11:18.274501 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 23:11:18 crc kubenswrapper[4795]: I0219 23:11:18.276587 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 19 23:11:18 crc kubenswrapper[4795]: I0219 23:11:18.277974 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 19 23:11:18 crc kubenswrapper[4795]: I0219 23:11:18.279852 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg"] Feb 19 23:11:18 crc kubenswrapper[4795]: I0219 23:11:18.375795 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hh9wf\" (UniqueName: \"kubernetes.io/projected/c3cbdd11-d93f-4025-9c08-7530a68f6113-kube-api-access-hh9wf\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg\" (UID: \"c3cbdd11-d93f-4025-9c08-7530a68f6113\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg" Feb 19 23:11:18 crc kubenswrapper[4795]: I0219 23:11:18.375848 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c3cbdd11-d93f-4025-9c08-7530a68f6113-ceph\") 
pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg\" (UID: \"c3cbdd11-d93f-4025-9c08-7530a68f6113\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg" Feb 19 23:11:18 crc kubenswrapper[4795]: I0219 23:11:18.375897 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3cbdd11-d93f-4025-9c08-7530a68f6113-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg\" (UID: \"c3cbdd11-d93f-4025-9c08-7530a68f6113\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg" Feb 19 23:11:18 crc kubenswrapper[4795]: I0219 23:11:18.375923 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c3cbdd11-d93f-4025-9c08-7530a68f6113-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg\" (UID: \"c3cbdd11-d93f-4025-9c08-7530a68f6113\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg" Feb 19 23:11:18 crc kubenswrapper[4795]: I0219 23:11:18.375941 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/c3cbdd11-d93f-4025-9c08-7530a68f6113-ssh-key-openstack-cell1\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg\" (UID: \"c3cbdd11-d93f-4025-9c08-7530a68f6113\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg" Feb 19 23:11:18 crc kubenswrapper[4795]: I0219 23:11:18.381968 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/c3cbdd11-d93f-4025-9c08-7530a68f6113-ssh-key-openstack-cell1\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg\" (UID: \"c3cbdd11-d93f-4025-9c08-7530a68f6113\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg" Feb 19 23:11:18 
crc kubenswrapper[4795]: I0219 23:11:18.382980 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c3cbdd11-d93f-4025-9c08-7530a68f6113-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg\" (UID: \"c3cbdd11-d93f-4025-9c08-7530a68f6113\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg" Feb 19 23:11:18 crc kubenswrapper[4795]: I0219 23:11:18.382997 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c3cbdd11-d93f-4025-9c08-7530a68f6113-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg\" (UID: \"c3cbdd11-d93f-4025-9c08-7530a68f6113\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg" Feb 19 23:11:18 crc kubenswrapper[4795]: I0219 23:11:18.385985 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3cbdd11-d93f-4025-9c08-7530a68f6113-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg\" (UID: \"c3cbdd11-d93f-4025-9c08-7530a68f6113\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg" Feb 19 23:11:18 crc kubenswrapper[4795]: I0219 23:11:18.392347 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hh9wf\" (UniqueName: \"kubernetes.io/projected/c3cbdd11-d93f-4025-9c08-7530a68f6113-kube-api-access-hh9wf\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg\" (UID: \"c3cbdd11-d93f-4025-9c08-7530a68f6113\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg" Feb 19 23:11:18 crc kubenswrapper[4795]: I0219 23:11:18.600814 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg" Feb 19 23:11:19 crc kubenswrapper[4795]: I0219 23:11:19.147811 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg"] Feb 19 23:11:19 crc kubenswrapper[4795]: I0219 23:11:19.524983 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg" event={"ID":"c3cbdd11-d93f-4025-9c08-7530a68f6113","Type":"ContainerStarted","Data":"0c55dc0c067482da01178354ee4aefa7baf05a359c6d442c03755a2406d2ea95"} Feb 19 23:11:20 crc kubenswrapper[4795]: I0219 23:11:20.543296 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg" event={"ID":"c3cbdd11-d93f-4025-9c08-7530a68f6113","Type":"ContainerStarted","Data":"587260c6871d5ac2af0ecde5796363d9253fc80e5d5b060bd7a191b9752a30d4"} Feb 19 23:11:27 crc kubenswrapper[4795]: I0219 23:11:27.293315 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-m4x8w" podUID="f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15" containerName="registry-server" probeResult="failure" output=< Feb 19 23:11:27 crc kubenswrapper[4795]: timeout: failed to connect service ":50051" within 1s Feb 19 23:11:27 crc kubenswrapper[4795]: > Feb 19 23:11:36 crc kubenswrapper[4795]: I0219 23:11:36.293090 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-m4x8w" Feb 19 23:11:36 crc kubenswrapper[4795]: I0219 23:11:36.317605 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg" podStartSLOduration=17.850097687999998 podStartE2EDuration="18.317574423s" podCreationTimestamp="2026-02-19 23:11:18 +0000 UTC" firstStartedPulling="2026-02-19 23:11:19.159180975 +0000 UTC m=+6190.351698839" 
lastFinishedPulling="2026-02-19 23:11:19.62665771 +0000 UTC m=+6190.819175574" observedRunningTime="2026-02-19 23:11:20.569286294 +0000 UTC m=+6191.761804158" watchObservedRunningTime="2026-02-19 23:11:36.317574423 +0000 UTC m=+6207.510092297" Feb 19 23:11:36 crc kubenswrapper[4795]: I0219 23:11:36.356357 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-m4x8w" Feb 19 23:11:36 crc kubenswrapper[4795]: I0219 23:11:36.545827 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-m4x8w"] Feb 19 23:11:37 crc kubenswrapper[4795]: I0219 23:11:37.703365 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-m4x8w" podUID="f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15" containerName="registry-server" containerID="cri-o://f4746fc282e0aeb2af21a6ee7597a1bbd1b787b9a31e5c2b02895def99620fe4" gracePeriod=2 Feb 19 23:11:38 crc kubenswrapper[4795]: I0219 23:11:38.189085 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-m4x8w" Feb 19 23:11:38 crc kubenswrapper[4795]: I0219 23:11:38.233179 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15-catalog-content\") pod \"f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15\" (UID: \"f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15\") " Feb 19 23:11:38 crc kubenswrapper[4795]: I0219 23:11:38.233301 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15-utilities\") pod \"f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15\" (UID: \"f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15\") " Feb 19 23:11:38 crc kubenswrapper[4795]: I0219 23:11:38.233364 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frs6x\" (UniqueName: \"kubernetes.io/projected/f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15-kube-api-access-frs6x\") pod \"f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15\" (UID: \"f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15\") " Feb 19 23:11:38 crc kubenswrapper[4795]: I0219 23:11:38.234672 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15-utilities" (OuterVolumeSpecName: "utilities") pod "f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15" (UID: "f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:11:38 crc kubenswrapper[4795]: I0219 23:11:38.238893 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15-kube-api-access-frs6x" (OuterVolumeSpecName: "kube-api-access-frs6x") pod "f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15" (UID: "f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15"). InnerVolumeSpecName "kube-api-access-frs6x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:11:38 crc kubenswrapper[4795]: I0219 23:11:38.335811 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 23:11:38 crc kubenswrapper[4795]: I0219 23:11:38.335852 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frs6x\" (UniqueName: \"kubernetes.io/projected/f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15-kube-api-access-frs6x\") on node \"crc\" DevicePath \"\"" Feb 19 23:11:38 crc kubenswrapper[4795]: I0219 23:11:38.350374 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15" (UID: "f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:11:38 crc kubenswrapper[4795]: I0219 23:11:38.437493 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 23:11:38 crc kubenswrapper[4795]: I0219 23:11:38.712445 4795 generic.go:334] "Generic (PLEG): container finished" podID="f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15" containerID="f4746fc282e0aeb2af21a6ee7597a1bbd1b787b9a31e5c2b02895def99620fe4" exitCode=0 Feb 19 23:11:38 crc kubenswrapper[4795]: I0219 23:11:38.712486 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m4x8w" event={"ID":"f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15","Type":"ContainerDied","Data":"f4746fc282e0aeb2af21a6ee7597a1bbd1b787b9a31e5c2b02895def99620fe4"} Feb 19 23:11:38 crc kubenswrapper[4795]: I0219 23:11:38.712523 4795 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m4x8w" Feb 19 23:11:38 crc kubenswrapper[4795]: I0219 23:11:38.712538 4795 scope.go:117] "RemoveContainer" containerID="f4746fc282e0aeb2af21a6ee7597a1bbd1b787b9a31e5c2b02895def99620fe4" Feb 19 23:11:38 crc kubenswrapper[4795]: I0219 23:11:38.712527 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m4x8w" event={"ID":"f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15","Type":"ContainerDied","Data":"69a009f19892e689d5d7710989b3ec9c3b063552425a36b7ffeb2388d0a9aa9a"} Feb 19 23:11:38 crc kubenswrapper[4795]: I0219 23:11:38.737991 4795 scope.go:117] "RemoveContainer" containerID="cb888e3e269a72b86567197db47985a0e1a268957c3e6bff15a8167cb9638671" Feb 19 23:11:38 crc kubenswrapper[4795]: I0219 23:11:38.752594 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-m4x8w"] Feb 19 23:11:38 crc kubenswrapper[4795]: I0219 23:11:38.761455 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-m4x8w"] Feb 19 23:11:38 crc kubenswrapper[4795]: I0219 23:11:38.774056 4795 scope.go:117] "RemoveContainer" containerID="2559741c12229f925dd1d5e142ff010ee82861e5a3b7c05df0b350fec30d058b" Feb 19 23:11:38 crc kubenswrapper[4795]: I0219 23:11:38.812442 4795 scope.go:117] "RemoveContainer" containerID="f4746fc282e0aeb2af21a6ee7597a1bbd1b787b9a31e5c2b02895def99620fe4" Feb 19 23:11:38 crc kubenswrapper[4795]: E0219 23:11:38.812829 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4746fc282e0aeb2af21a6ee7597a1bbd1b787b9a31e5c2b02895def99620fe4\": container with ID starting with f4746fc282e0aeb2af21a6ee7597a1bbd1b787b9a31e5c2b02895def99620fe4 not found: ID does not exist" containerID="f4746fc282e0aeb2af21a6ee7597a1bbd1b787b9a31e5c2b02895def99620fe4" Feb 19 23:11:38 crc kubenswrapper[4795]: I0219 23:11:38.812872 4795 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4746fc282e0aeb2af21a6ee7597a1bbd1b787b9a31e5c2b02895def99620fe4"} err="failed to get container status \"f4746fc282e0aeb2af21a6ee7597a1bbd1b787b9a31e5c2b02895def99620fe4\": rpc error: code = NotFound desc = could not find container \"f4746fc282e0aeb2af21a6ee7597a1bbd1b787b9a31e5c2b02895def99620fe4\": container with ID starting with f4746fc282e0aeb2af21a6ee7597a1bbd1b787b9a31e5c2b02895def99620fe4 not found: ID does not exist" Feb 19 23:11:38 crc kubenswrapper[4795]: I0219 23:11:38.812899 4795 scope.go:117] "RemoveContainer" containerID="cb888e3e269a72b86567197db47985a0e1a268957c3e6bff15a8167cb9638671" Feb 19 23:11:38 crc kubenswrapper[4795]: E0219 23:11:38.813150 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb888e3e269a72b86567197db47985a0e1a268957c3e6bff15a8167cb9638671\": container with ID starting with cb888e3e269a72b86567197db47985a0e1a268957c3e6bff15a8167cb9638671 not found: ID does not exist" containerID="cb888e3e269a72b86567197db47985a0e1a268957c3e6bff15a8167cb9638671" Feb 19 23:11:38 crc kubenswrapper[4795]: I0219 23:11:38.813187 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb888e3e269a72b86567197db47985a0e1a268957c3e6bff15a8167cb9638671"} err="failed to get container status \"cb888e3e269a72b86567197db47985a0e1a268957c3e6bff15a8167cb9638671\": rpc error: code = NotFound desc = could not find container \"cb888e3e269a72b86567197db47985a0e1a268957c3e6bff15a8167cb9638671\": container with ID starting with cb888e3e269a72b86567197db47985a0e1a268957c3e6bff15a8167cb9638671 not found: ID does not exist" Feb 19 23:11:38 crc kubenswrapper[4795]: I0219 23:11:38.813202 4795 scope.go:117] "RemoveContainer" containerID="2559741c12229f925dd1d5e142ff010ee82861e5a3b7c05df0b350fec30d058b" Feb 19 23:11:38 crc kubenswrapper[4795]: E0219 
23:11:38.813592 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2559741c12229f925dd1d5e142ff010ee82861e5a3b7c05df0b350fec30d058b\": container with ID starting with 2559741c12229f925dd1d5e142ff010ee82861e5a3b7c05df0b350fec30d058b not found: ID does not exist" containerID="2559741c12229f925dd1d5e142ff010ee82861e5a3b7c05df0b350fec30d058b" Feb 19 23:11:38 crc kubenswrapper[4795]: I0219 23:11:38.813627 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2559741c12229f925dd1d5e142ff010ee82861e5a3b7c05df0b350fec30d058b"} err="failed to get container status \"2559741c12229f925dd1d5e142ff010ee82861e5a3b7c05df0b350fec30d058b\": rpc error: code = NotFound desc = could not find container \"2559741c12229f925dd1d5e142ff010ee82861e5a3b7c05df0b350fec30d058b\": container with ID starting with 2559741c12229f925dd1d5e142ff010ee82861e5a3b7c05df0b350fec30d058b not found: ID does not exist" Feb 19 23:11:39 crc kubenswrapper[4795]: I0219 23:11:39.523488 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15" path="/var/lib/kubelet/pods/f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15/volumes" Feb 19 23:11:54 crc kubenswrapper[4795]: I0219 23:11:54.936223 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mnftg"] Feb 19 23:11:54 crc kubenswrapper[4795]: E0219 23:11:54.937429 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15" containerName="extract-utilities" Feb 19 23:11:54 crc kubenswrapper[4795]: I0219 23:11:54.937457 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15" containerName="extract-utilities" Feb 19 23:11:54 crc kubenswrapper[4795]: E0219 23:11:54.937485 4795 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15" containerName="registry-server" Feb 19 23:11:54 crc kubenswrapper[4795]: I0219 23:11:54.937491 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15" containerName="registry-server" Feb 19 23:11:54 crc kubenswrapper[4795]: E0219 23:11:54.937505 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15" containerName="extract-content" Feb 19 23:11:54 crc kubenswrapper[4795]: I0219 23:11:54.937510 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15" containerName="extract-content" Feb 19 23:11:54 crc kubenswrapper[4795]: I0219 23:11:54.937716 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f63b09dd-aeb7-4cbe-a542-d3c14ebdfa15" containerName="registry-server" Feb 19 23:11:54 crc kubenswrapper[4795]: I0219 23:11:54.939313 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mnftg" Feb 19 23:11:54 crc kubenswrapper[4795]: I0219 23:11:54.951642 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mnftg"] Feb 19 23:11:54 crc kubenswrapper[4795]: I0219 23:11:54.985377 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10309826-f6d9-49b0-a98c-1c31aab8ca7b-catalog-content\") pod \"certified-operators-mnftg\" (UID: \"10309826-f6d9-49b0-a98c-1c31aab8ca7b\") " pod="openshift-marketplace/certified-operators-mnftg" Feb 19 23:11:54 crc kubenswrapper[4795]: I0219 23:11:54.985509 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chpnw\" (UniqueName: \"kubernetes.io/projected/10309826-f6d9-49b0-a98c-1c31aab8ca7b-kube-api-access-chpnw\") pod \"certified-operators-mnftg\" (UID: 
\"10309826-f6d9-49b0-a98c-1c31aab8ca7b\") " pod="openshift-marketplace/certified-operators-mnftg" Feb 19 23:11:54 crc kubenswrapper[4795]: I0219 23:11:54.985575 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10309826-f6d9-49b0-a98c-1c31aab8ca7b-utilities\") pod \"certified-operators-mnftg\" (UID: \"10309826-f6d9-49b0-a98c-1c31aab8ca7b\") " pod="openshift-marketplace/certified-operators-mnftg" Feb 19 23:11:55 crc kubenswrapper[4795]: I0219 23:11:55.086778 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chpnw\" (UniqueName: \"kubernetes.io/projected/10309826-f6d9-49b0-a98c-1c31aab8ca7b-kube-api-access-chpnw\") pod \"certified-operators-mnftg\" (UID: \"10309826-f6d9-49b0-a98c-1c31aab8ca7b\") " pod="openshift-marketplace/certified-operators-mnftg" Feb 19 23:11:55 crc kubenswrapper[4795]: I0219 23:11:55.086859 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10309826-f6d9-49b0-a98c-1c31aab8ca7b-utilities\") pod \"certified-operators-mnftg\" (UID: \"10309826-f6d9-49b0-a98c-1c31aab8ca7b\") " pod="openshift-marketplace/certified-operators-mnftg" Feb 19 23:11:55 crc kubenswrapper[4795]: I0219 23:11:55.086949 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10309826-f6d9-49b0-a98c-1c31aab8ca7b-catalog-content\") pod \"certified-operators-mnftg\" (UID: \"10309826-f6d9-49b0-a98c-1c31aab8ca7b\") " pod="openshift-marketplace/certified-operators-mnftg" Feb 19 23:11:55 crc kubenswrapper[4795]: I0219 23:11:55.087506 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10309826-f6d9-49b0-a98c-1c31aab8ca7b-catalog-content\") pod \"certified-operators-mnftg\" (UID: 
\"10309826-f6d9-49b0-a98c-1c31aab8ca7b\") " pod="openshift-marketplace/certified-operators-mnftg" Feb 19 23:11:55 crc kubenswrapper[4795]: I0219 23:11:55.087724 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10309826-f6d9-49b0-a98c-1c31aab8ca7b-utilities\") pod \"certified-operators-mnftg\" (UID: \"10309826-f6d9-49b0-a98c-1c31aab8ca7b\") " pod="openshift-marketplace/certified-operators-mnftg" Feb 19 23:11:55 crc kubenswrapper[4795]: I0219 23:11:55.105496 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chpnw\" (UniqueName: \"kubernetes.io/projected/10309826-f6d9-49b0-a98c-1c31aab8ca7b-kube-api-access-chpnw\") pod \"certified-operators-mnftg\" (UID: \"10309826-f6d9-49b0-a98c-1c31aab8ca7b\") " pod="openshift-marketplace/certified-operators-mnftg" Feb 19 23:11:55 crc kubenswrapper[4795]: I0219 23:11:55.261229 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mnftg" Feb 19 23:11:55 crc kubenswrapper[4795]: I0219 23:11:55.760470 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mnftg"] Feb 19 23:11:55 crc kubenswrapper[4795]: W0219 23:11:55.762839 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10309826_f6d9_49b0_a98c_1c31aab8ca7b.slice/crio-8c9631e7617f04acaf017c583dcbddd280734220035ef66a47670736c26e2125 WatchSource:0}: Error finding container 8c9631e7617f04acaf017c583dcbddd280734220035ef66a47670736c26e2125: Status 404 returned error can't find the container with id 8c9631e7617f04acaf017c583dcbddd280734220035ef66a47670736c26e2125 Feb 19 23:11:55 crc kubenswrapper[4795]: I0219 23:11:55.882626 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mnftg" 
event={"ID":"10309826-f6d9-49b0-a98c-1c31aab8ca7b","Type":"ContainerStarted","Data":"8c9631e7617f04acaf017c583dcbddd280734220035ef66a47670736c26e2125"} Feb 19 23:11:56 crc kubenswrapper[4795]: I0219 23:11:56.892699 4795 generic.go:334] "Generic (PLEG): container finished" podID="10309826-f6d9-49b0-a98c-1c31aab8ca7b" containerID="f39335613c4dde281877dd969604dd436c0e834307aa5a9f28116c9e90a7fe20" exitCode=0 Feb 19 23:11:56 crc kubenswrapper[4795]: I0219 23:11:56.892757 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mnftg" event={"ID":"10309826-f6d9-49b0-a98c-1c31aab8ca7b","Type":"ContainerDied","Data":"f39335613c4dde281877dd969604dd436c0e834307aa5a9f28116c9e90a7fe20"} Feb 19 23:11:57 crc kubenswrapper[4795]: I0219 23:11:57.902305 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mnftg" event={"ID":"10309826-f6d9-49b0-a98c-1c31aab8ca7b","Type":"ContainerStarted","Data":"d4d65f6e8327aa14c977b5af3451b60be1db9d89c91b612be202232a40b62f1b"} Feb 19 23:11:58 crc kubenswrapper[4795]: I0219 23:11:58.914866 4795 generic.go:334] "Generic (PLEG): container finished" podID="10309826-f6d9-49b0-a98c-1c31aab8ca7b" containerID="d4d65f6e8327aa14c977b5af3451b60be1db9d89c91b612be202232a40b62f1b" exitCode=0 Feb 19 23:11:58 crc kubenswrapper[4795]: I0219 23:11:58.915059 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mnftg" event={"ID":"10309826-f6d9-49b0-a98c-1c31aab8ca7b","Type":"ContainerDied","Data":"d4d65f6e8327aa14c977b5af3451b60be1db9d89c91b612be202232a40b62f1b"} Feb 19 23:11:59 crc kubenswrapper[4795]: I0219 23:11:59.935482 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mnftg" event={"ID":"10309826-f6d9-49b0-a98c-1c31aab8ca7b","Type":"ContainerStarted","Data":"56affc0ddd3dce1a56ac8545963c2e65c8ca749618a90d58aa77768c3fed3459"} Feb 19 23:12:05 crc kubenswrapper[4795]: 
I0219 23:12:05.262485 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mnftg" Feb 19 23:12:05 crc kubenswrapper[4795]: I0219 23:12:05.263154 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mnftg" Feb 19 23:12:05 crc kubenswrapper[4795]: I0219 23:12:05.341491 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mnftg" Feb 19 23:12:05 crc kubenswrapper[4795]: I0219 23:12:05.362923 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mnftg" podStartSLOduration=8.974244416 podStartE2EDuration="11.362906535s" podCreationTimestamp="2026-02-19 23:11:54 +0000 UTC" firstStartedPulling="2026-02-19 23:11:56.894576021 +0000 UTC m=+6228.087093885" lastFinishedPulling="2026-02-19 23:11:59.28323814 +0000 UTC m=+6230.475756004" observedRunningTime="2026-02-19 23:11:59.979647904 +0000 UTC m=+6231.172165788" watchObservedRunningTime="2026-02-19 23:12:05.362906535 +0000 UTC m=+6236.555424399" Feb 19 23:12:06 crc kubenswrapper[4795]: I0219 23:12:06.044783 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mnftg" Feb 19 23:12:06 crc kubenswrapper[4795]: I0219 23:12:06.096710 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mnftg"] Feb 19 23:12:08 crc kubenswrapper[4795]: I0219 23:12:08.011723 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mnftg" podUID="10309826-f6d9-49b0-a98c-1c31aab8ca7b" containerName="registry-server" containerID="cri-o://56affc0ddd3dce1a56ac8545963c2e65c8ca749618a90d58aa77768c3fed3459" gracePeriod=2 Feb 19 23:12:08 crc kubenswrapper[4795]: I0219 23:12:08.511273 4795 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mnftg" Feb 19 23:12:08 crc kubenswrapper[4795]: I0219 23:12:08.687894 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-chpnw\" (UniqueName: \"kubernetes.io/projected/10309826-f6d9-49b0-a98c-1c31aab8ca7b-kube-api-access-chpnw\") pod \"10309826-f6d9-49b0-a98c-1c31aab8ca7b\" (UID: \"10309826-f6d9-49b0-a98c-1c31aab8ca7b\") " Feb 19 23:12:08 crc kubenswrapper[4795]: I0219 23:12:08.688232 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10309826-f6d9-49b0-a98c-1c31aab8ca7b-catalog-content\") pod \"10309826-f6d9-49b0-a98c-1c31aab8ca7b\" (UID: \"10309826-f6d9-49b0-a98c-1c31aab8ca7b\") " Feb 19 23:12:08 crc kubenswrapper[4795]: I0219 23:12:08.688386 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10309826-f6d9-49b0-a98c-1c31aab8ca7b-utilities\") pod \"10309826-f6d9-49b0-a98c-1c31aab8ca7b\" (UID: \"10309826-f6d9-49b0-a98c-1c31aab8ca7b\") " Feb 19 23:12:08 crc kubenswrapper[4795]: I0219 23:12:08.689210 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10309826-f6d9-49b0-a98c-1c31aab8ca7b-utilities" (OuterVolumeSpecName: "utilities") pod "10309826-f6d9-49b0-a98c-1c31aab8ca7b" (UID: "10309826-f6d9-49b0-a98c-1c31aab8ca7b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:12:08 crc kubenswrapper[4795]: I0219 23:12:08.693648 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10309826-f6d9-49b0-a98c-1c31aab8ca7b-kube-api-access-chpnw" (OuterVolumeSpecName: "kube-api-access-chpnw") pod "10309826-f6d9-49b0-a98c-1c31aab8ca7b" (UID: "10309826-f6d9-49b0-a98c-1c31aab8ca7b"). 
InnerVolumeSpecName "kube-api-access-chpnw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:12:08 crc kubenswrapper[4795]: I0219 23:12:08.739281 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10309826-f6d9-49b0-a98c-1c31aab8ca7b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "10309826-f6d9-49b0-a98c-1c31aab8ca7b" (UID: "10309826-f6d9-49b0-a98c-1c31aab8ca7b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:12:08 crc kubenswrapper[4795]: I0219 23:12:08.789745 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10309826-f6d9-49b0-a98c-1c31aab8ca7b-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 23:12:08 crc kubenswrapper[4795]: I0219 23:12:08.789785 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-chpnw\" (UniqueName: \"kubernetes.io/projected/10309826-f6d9-49b0-a98c-1c31aab8ca7b-kube-api-access-chpnw\") on node \"crc\" DevicePath \"\"" Feb 19 23:12:08 crc kubenswrapper[4795]: I0219 23:12:08.789798 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10309826-f6d9-49b0-a98c-1c31aab8ca7b-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 23:12:09 crc kubenswrapper[4795]: I0219 23:12:09.021921 4795 generic.go:334] "Generic (PLEG): container finished" podID="10309826-f6d9-49b0-a98c-1c31aab8ca7b" containerID="56affc0ddd3dce1a56ac8545963c2e65c8ca749618a90d58aa77768c3fed3459" exitCode=0 Feb 19 23:12:09 crc kubenswrapper[4795]: I0219 23:12:09.021968 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mnftg" event={"ID":"10309826-f6d9-49b0-a98c-1c31aab8ca7b","Type":"ContainerDied","Data":"56affc0ddd3dce1a56ac8545963c2e65c8ca749618a90d58aa77768c3fed3459"} Feb 19 23:12:09 crc kubenswrapper[4795]: I0219 23:12:09.022015 
4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mnftg" event={"ID":"10309826-f6d9-49b0-a98c-1c31aab8ca7b","Type":"ContainerDied","Data":"8c9631e7617f04acaf017c583dcbddd280734220035ef66a47670736c26e2125"} Feb 19 23:12:09 crc kubenswrapper[4795]: I0219 23:12:09.022042 4795 scope.go:117] "RemoveContainer" containerID="56affc0ddd3dce1a56ac8545963c2e65c8ca749618a90d58aa77768c3fed3459" Feb 19 23:12:09 crc kubenswrapper[4795]: I0219 23:12:09.021972 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mnftg" Feb 19 23:12:09 crc kubenswrapper[4795]: I0219 23:12:09.045209 4795 scope.go:117] "RemoveContainer" containerID="d4d65f6e8327aa14c977b5af3451b60be1db9d89c91b612be202232a40b62f1b" Feb 19 23:12:09 crc kubenswrapper[4795]: I0219 23:12:09.069224 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mnftg"] Feb 19 23:12:09 crc kubenswrapper[4795]: I0219 23:12:09.078237 4795 scope.go:117] "RemoveContainer" containerID="f39335613c4dde281877dd969604dd436c0e834307aa5a9f28116c9e90a7fe20" Feb 19 23:12:09 crc kubenswrapper[4795]: I0219 23:12:09.079635 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mnftg"] Feb 19 23:12:09 crc kubenswrapper[4795]: I0219 23:12:09.136139 4795 scope.go:117] "RemoveContainer" containerID="56affc0ddd3dce1a56ac8545963c2e65c8ca749618a90d58aa77768c3fed3459" Feb 19 23:12:09 crc kubenswrapper[4795]: E0219 23:12:09.136623 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56affc0ddd3dce1a56ac8545963c2e65c8ca749618a90d58aa77768c3fed3459\": container with ID starting with 56affc0ddd3dce1a56ac8545963c2e65c8ca749618a90d58aa77768c3fed3459 not found: ID does not exist" containerID="56affc0ddd3dce1a56ac8545963c2e65c8ca749618a90d58aa77768c3fed3459" Feb 19 23:12:09 
crc kubenswrapper[4795]: I0219 23:12:09.136777 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56affc0ddd3dce1a56ac8545963c2e65c8ca749618a90d58aa77768c3fed3459"} err="failed to get container status \"56affc0ddd3dce1a56ac8545963c2e65c8ca749618a90d58aa77768c3fed3459\": rpc error: code = NotFound desc = could not find container \"56affc0ddd3dce1a56ac8545963c2e65c8ca749618a90d58aa77768c3fed3459\": container with ID starting with 56affc0ddd3dce1a56ac8545963c2e65c8ca749618a90d58aa77768c3fed3459 not found: ID does not exist" Feb 19 23:12:09 crc kubenswrapper[4795]: I0219 23:12:09.136889 4795 scope.go:117] "RemoveContainer" containerID="d4d65f6e8327aa14c977b5af3451b60be1db9d89c91b612be202232a40b62f1b" Feb 19 23:12:09 crc kubenswrapper[4795]: E0219 23:12:09.137407 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4d65f6e8327aa14c977b5af3451b60be1db9d89c91b612be202232a40b62f1b\": container with ID starting with d4d65f6e8327aa14c977b5af3451b60be1db9d89c91b612be202232a40b62f1b not found: ID does not exist" containerID="d4d65f6e8327aa14c977b5af3451b60be1db9d89c91b612be202232a40b62f1b" Feb 19 23:12:09 crc kubenswrapper[4795]: I0219 23:12:09.137428 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4d65f6e8327aa14c977b5af3451b60be1db9d89c91b612be202232a40b62f1b"} err="failed to get container status \"d4d65f6e8327aa14c977b5af3451b60be1db9d89c91b612be202232a40b62f1b\": rpc error: code = NotFound desc = could not find container \"d4d65f6e8327aa14c977b5af3451b60be1db9d89c91b612be202232a40b62f1b\": container with ID starting with d4d65f6e8327aa14c977b5af3451b60be1db9d89c91b612be202232a40b62f1b not found: ID does not exist" Feb 19 23:12:09 crc kubenswrapper[4795]: I0219 23:12:09.137440 4795 scope.go:117] "RemoveContainer" containerID="f39335613c4dde281877dd969604dd436c0e834307aa5a9f28116c9e90a7fe20" Feb 19 
23:12:09 crc kubenswrapper[4795]: E0219 23:12:09.137818 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f39335613c4dde281877dd969604dd436c0e834307aa5a9f28116c9e90a7fe20\": container with ID starting with f39335613c4dde281877dd969604dd436c0e834307aa5a9f28116c9e90a7fe20 not found: ID does not exist" containerID="f39335613c4dde281877dd969604dd436c0e834307aa5a9f28116c9e90a7fe20" Feb 19 23:12:09 crc kubenswrapper[4795]: I0219 23:12:09.138055 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f39335613c4dde281877dd969604dd436c0e834307aa5a9f28116c9e90a7fe20"} err="failed to get container status \"f39335613c4dde281877dd969604dd436c0e834307aa5a9f28116c9e90a7fe20\": rpc error: code = NotFound desc = could not find container \"f39335613c4dde281877dd969604dd436c0e834307aa5a9f28116c9e90a7fe20\": container with ID starting with f39335613c4dde281877dd969604dd436c0e834307aa5a9f28116c9e90a7fe20 not found: ID does not exist" Feb 19 23:12:09 crc kubenswrapper[4795]: I0219 23:12:09.526326 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10309826-f6d9-49b0-a98c-1c31aab8ca7b" path="/var/lib/kubelet/pods/10309826-f6d9-49b0-a98c-1c31aab8ca7b/volumes" Feb 19 23:12:28 crc kubenswrapper[4795]: I0219 23:12:28.427336 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 23:12:28 crc kubenswrapper[4795]: I0219 23:12:28.427955 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial 
tcp 127.0.0.1:8798: connect: connection refused" Feb 19 23:12:43 crc kubenswrapper[4795]: I0219 23:12:43.041713 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-db-create-rfntz"] Feb 19 23:12:43 crc kubenswrapper[4795]: I0219 23:12:43.056134 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-db-create-rfntz"] Feb 19 23:12:43 crc kubenswrapper[4795]: I0219 23:12:43.526432 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7389820e-b641-4068-b624-af539a234699" path="/var/lib/kubelet/pods/7389820e-b641-4068-b624-af539a234699/volumes" Feb 19 23:12:44 crc kubenswrapper[4795]: I0219 23:12:44.027720 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-c873-account-create-update-fggql"] Feb 19 23:12:44 crc kubenswrapper[4795]: I0219 23:12:44.037773 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-c873-account-create-update-fggql"] Feb 19 23:12:45 crc kubenswrapper[4795]: I0219 23:12:45.521837 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ca0a783-4d18-4d0a-81d8-7cc1970379a9" path="/var/lib/kubelet/pods/9ca0a783-4d18-4d0a-81d8-7cc1970379a9/volumes" Feb 19 23:12:49 crc kubenswrapper[4795]: I0219 23:12:49.029800 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-persistence-db-create-2vlg7"] Feb 19 23:12:49 crc kubenswrapper[4795]: I0219 23:12:49.038689 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-persistence-db-create-2vlg7"] Feb 19 23:12:49 crc kubenswrapper[4795]: I0219 23:12:49.524344 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2af5a019-2aa4-449d-a1a5-148cbf8a1ffa" path="/var/lib/kubelet/pods/2af5a019-2aa4-449d-a1a5-148cbf8a1ffa/volumes" Feb 19 23:12:50 crc kubenswrapper[4795]: I0219 23:12:50.029397 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-75e1-account-create-update-mq672"] Feb 19 
23:12:50 crc kubenswrapper[4795]: I0219 23:12:50.039992 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-75e1-account-create-update-mq672"] Feb 19 23:12:51 crc kubenswrapper[4795]: I0219 23:12:51.527560 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ea8f33a-4a23-4058-a6f1-ccd27d64f1f2" path="/var/lib/kubelet/pods/0ea8f33a-4a23-4058-a6f1-ccd27d64f1f2/volumes" Feb 19 23:12:58 crc kubenswrapper[4795]: I0219 23:12:58.427636 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 23:12:58 crc kubenswrapper[4795]: I0219 23:12:58.429305 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 23:13:16 crc kubenswrapper[4795]: I0219 23:13:16.037296 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-db-sync-z8cdz"] Feb 19 23:13:16 crc kubenswrapper[4795]: I0219 23:13:16.046388 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-db-sync-z8cdz"] Feb 19 23:13:17 crc kubenswrapper[4795]: I0219 23:13:17.529296 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7895d70-3c78-4913-9028-75797e6e1dbd" path="/var/lib/kubelet/pods/c7895d70-3c78-4913-9028-75797e6e1dbd/volumes" Feb 19 23:13:26 crc kubenswrapper[4795]: I0219 23:13:26.633521 4795 scope.go:117] "RemoveContainer" containerID="700b20fbdb16cdb721d65035d59d19827c546e867ba32ed9f351235ca4bc0246" Feb 19 23:13:26 crc kubenswrapper[4795]: I0219 23:13:26.670663 4795 scope.go:117] 
"RemoveContainer" containerID="922d8fbba822aa06530b36452ecdfe6cce93b9221523be2934bb7556f81619e3" Feb 19 23:13:26 crc kubenswrapper[4795]: I0219 23:13:26.716987 4795 scope.go:117] "RemoveContainer" containerID="8783e1276005ef506acf0881371a237a591f18c217cbbd901f218637b5c95d2c" Feb 19 23:13:26 crc kubenswrapper[4795]: I0219 23:13:26.766859 4795 scope.go:117] "RemoveContainer" containerID="d4c8e37b4453cd9ee00d52e7178d48381bffc05905e787ed7678728d6f9cc0ef" Feb 19 23:13:26 crc kubenswrapper[4795]: I0219 23:13:26.844532 4795 scope.go:117] "RemoveContainer" containerID="461ac9425c34a3821048eb55409a0a70365c0acacbfa307ea4409068b90afe68" Feb 19 23:13:26 crc kubenswrapper[4795]: I0219 23:13:26.875469 4795 scope.go:117] "RemoveContainer" containerID="fe6c050cae9125e8669040d524bc8951c45b9effbc5921d0ee07f69c88d514a2" Feb 19 23:13:28 crc kubenswrapper[4795]: I0219 23:13:28.427432 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 23:13:28 crc kubenswrapper[4795]: I0219 23:13:28.427797 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 23:13:28 crc kubenswrapper[4795]: I0219 23:13:28.427839 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" Feb 19 23:13:28 crc kubenswrapper[4795]: I0219 23:13:28.428640 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"ee7132fbe9aafdbd5a750ec85268f079d3e663b545f9aa0997d8e4efe3faebb4"} pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 23:13:28 crc kubenswrapper[4795]: I0219 23:13:28.428688 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" containerID="cri-o://ee7132fbe9aafdbd5a750ec85268f079d3e663b545f9aa0997d8e4efe3faebb4" gracePeriod=600 Feb 19 23:13:28 crc kubenswrapper[4795]: E0219 23:13:28.551297 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:13:28 crc kubenswrapper[4795]: I0219 23:13:28.785700 4795 generic.go:334] "Generic (PLEG): container finished" podID="7591bc58-96f5-486a-8653-0ad93938b019" containerID="ee7132fbe9aafdbd5a750ec85268f079d3e663b545f9aa0997d8e4efe3faebb4" exitCode=0 Feb 19 23:13:28 crc kubenswrapper[4795]: I0219 23:13:28.785745 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerDied","Data":"ee7132fbe9aafdbd5a750ec85268f079d3e663b545f9aa0997d8e4efe3faebb4"} Feb 19 23:13:28 crc kubenswrapper[4795]: I0219 23:13:28.785781 4795 scope.go:117] "RemoveContainer" containerID="b0fe96e51c4e702a3b8fdcd9d997ef35d626772b563d6c998f84fa6863685a9d" Feb 19 23:13:28 crc kubenswrapper[4795]: I0219 23:13:28.786570 4795 
scope.go:117] "RemoveContainer" containerID="ee7132fbe9aafdbd5a750ec85268f079d3e663b545f9aa0997d8e4efe3faebb4" Feb 19 23:13:28 crc kubenswrapper[4795]: E0219 23:13:28.787059 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:13:44 crc kubenswrapper[4795]: I0219 23:13:44.511317 4795 scope.go:117] "RemoveContainer" containerID="ee7132fbe9aafdbd5a750ec85268f079d3e663b545f9aa0997d8e4efe3faebb4" Feb 19 23:13:44 crc kubenswrapper[4795]: E0219 23:13:44.512101 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:13:58 crc kubenswrapper[4795]: I0219 23:13:58.511966 4795 scope.go:117] "RemoveContainer" containerID="ee7132fbe9aafdbd5a750ec85268f079d3e663b545f9aa0997d8e4efe3faebb4" Feb 19 23:13:58 crc kubenswrapper[4795]: E0219 23:13:58.514109 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:14:10 crc kubenswrapper[4795]: I0219 
23:14:10.511721 4795 scope.go:117] "RemoveContainer" containerID="ee7132fbe9aafdbd5a750ec85268f079d3e663b545f9aa0997d8e4efe3faebb4" Feb 19 23:14:10 crc kubenswrapper[4795]: E0219 23:14:10.512506 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:14:22 crc kubenswrapper[4795]: I0219 23:14:22.513017 4795 scope.go:117] "RemoveContainer" containerID="ee7132fbe9aafdbd5a750ec85268f079d3e663b545f9aa0997d8e4efe3faebb4" Feb 19 23:14:22 crc kubenswrapper[4795]: E0219 23:14:22.513807 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:14:29 crc kubenswrapper[4795]: I0219 23:14:29.047303 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jvh4m"] Feb 19 23:14:29 crc kubenswrapper[4795]: E0219 23:14:29.049323 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10309826-f6d9-49b0-a98c-1c31aab8ca7b" containerName="extract-content" Feb 19 23:14:29 crc kubenswrapper[4795]: I0219 23:14:29.049456 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="10309826-f6d9-49b0-a98c-1c31aab8ca7b" containerName="extract-content" Feb 19 23:14:29 crc kubenswrapper[4795]: E0219 23:14:29.049547 4795 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="10309826-f6d9-49b0-a98c-1c31aab8ca7b" containerName="registry-server" Feb 19 23:14:29 crc kubenswrapper[4795]: I0219 23:14:29.049638 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="10309826-f6d9-49b0-a98c-1c31aab8ca7b" containerName="registry-server" Feb 19 23:14:29 crc kubenswrapper[4795]: E0219 23:14:29.049765 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10309826-f6d9-49b0-a98c-1c31aab8ca7b" containerName="extract-utilities" Feb 19 23:14:29 crc kubenswrapper[4795]: I0219 23:14:29.049842 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="10309826-f6d9-49b0-a98c-1c31aab8ca7b" containerName="extract-utilities" Feb 19 23:14:29 crc kubenswrapper[4795]: I0219 23:14:29.050188 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="10309826-f6d9-49b0-a98c-1c31aab8ca7b" containerName="registry-server" Feb 19 23:14:29 crc kubenswrapper[4795]: I0219 23:14:29.060643 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jvh4m" Feb 19 23:14:29 crc kubenswrapper[4795]: I0219 23:14:29.086654 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jvh4m"] Feb 19 23:14:29 crc kubenswrapper[4795]: I0219 23:14:29.168375 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/453f74c8-d1b5-4e9f-b405-0341eead8a87-utilities\") pod \"redhat-marketplace-jvh4m\" (UID: \"453f74c8-d1b5-4e9f-b405-0341eead8a87\") " pod="openshift-marketplace/redhat-marketplace-jvh4m" Feb 19 23:14:29 crc kubenswrapper[4795]: I0219 23:14:29.168716 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/453f74c8-d1b5-4e9f-b405-0341eead8a87-catalog-content\") pod \"redhat-marketplace-jvh4m\" (UID: \"453f74c8-d1b5-4e9f-b405-0341eead8a87\") 
" pod="openshift-marketplace/redhat-marketplace-jvh4m" Feb 19 23:14:29 crc kubenswrapper[4795]: I0219 23:14:29.168879 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsgf9\" (UniqueName: \"kubernetes.io/projected/453f74c8-d1b5-4e9f-b405-0341eead8a87-kube-api-access-dsgf9\") pod \"redhat-marketplace-jvh4m\" (UID: \"453f74c8-d1b5-4e9f-b405-0341eead8a87\") " pod="openshift-marketplace/redhat-marketplace-jvh4m" Feb 19 23:14:29 crc kubenswrapper[4795]: I0219 23:14:29.272536 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/453f74c8-d1b5-4e9f-b405-0341eead8a87-utilities\") pod \"redhat-marketplace-jvh4m\" (UID: \"453f74c8-d1b5-4e9f-b405-0341eead8a87\") " pod="openshift-marketplace/redhat-marketplace-jvh4m" Feb 19 23:14:29 crc kubenswrapper[4795]: I0219 23:14:29.272581 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/453f74c8-d1b5-4e9f-b405-0341eead8a87-catalog-content\") pod \"redhat-marketplace-jvh4m\" (UID: \"453f74c8-d1b5-4e9f-b405-0341eead8a87\") " pod="openshift-marketplace/redhat-marketplace-jvh4m" Feb 19 23:14:29 crc kubenswrapper[4795]: I0219 23:14:29.272647 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsgf9\" (UniqueName: \"kubernetes.io/projected/453f74c8-d1b5-4e9f-b405-0341eead8a87-kube-api-access-dsgf9\") pod \"redhat-marketplace-jvh4m\" (UID: \"453f74c8-d1b5-4e9f-b405-0341eead8a87\") " pod="openshift-marketplace/redhat-marketplace-jvh4m" Feb 19 23:14:29 crc kubenswrapper[4795]: I0219 23:14:29.273045 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/453f74c8-d1b5-4e9f-b405-0341eead8a87-utilities\") pod \"redhat-marketplace-jvh4m\" (UID: \"453f74c8-d1b5-4e9f-b405-0341eead8a87\") " 
pod="openshift-marketplace/redhat-marketplace-jvh4m" Feb 19 23:14:29 crc kubenswrapper[4795]: I0219 23:14:29.273121 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/453f74c8-d1b5-4e9f-b405-0341eead8a87-catalog-content\") pod \"redhat-marketplace-jvh4m\" (UID: \"453f74c8-d1b5-4e9f-b405-0341eead8a87\") " pod="openshift-marketplace/redhat-marketplace-jvh4m" Feb 19 23:14:29 crc kubenswrapper[4795]: I0219 23:14:29.290070 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsgf9\" (UniqueName: \"kubernetes.io/projected/453f74c8-d1b5-4e9f-b405-0341eead8a87-kube-api-access-dsgf9\") pod \"redhat-marketplace-jvh4m\" (UID: \"453f74c8-d1b5-4e9f-b405-0341eead8a87\") " pod="openshift-marketplace/redhat-marketplace-jvh4m" Feb 19 23:14:29 crc kubenswrapper[4795]: I0219 23:14:29.388184 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jvh4m" Feb 19 23:14:29 crc kubenswrapper[4795]: I0219 23:14:29.862093 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jvh4m"] Feb 19 23:14:30 crc kubenswrapper[4795]: I0219 23:14:30.382663 4795 generic.go:334] "Generic (PLEG): container finished" podID="453f74c8-d1b5-4e9f-b405-0341eead8a87" containerID="62121896a47c05580cb59ecb2938a3458503ebc87eafd5ef41f948ed9909b535" exitCode=0 Feb 19 23:14:30 crc kubenswrapper[4795]: I0219 23:14:30.382731 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jvh4m" event={"ID":"453f74c8-d1b5-4e9f-b405-0341eead8a87","Type":"ContainerDied","Data":"62121896a47c05580cb59ecb2938a3458503ebc87eafd5ef41f948ed9909b535"} Feb 19 23:14:30 crc kubenswrapper[4795]: I0219 23:14:30.382979 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jvh4m" 
event={"ID":"453f74c8-d1b5-4e9f-b405-0341eead8a87","Type":"ContainerStarted","Data":"50d08332a8251ec3f48c5be46a354c07cc40d484277909a9ca772e3ee10ef0b2"} Feb 19 23:14:30 crc kubenswrapper[4795]: I0219 23:14:30.384869 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 23:14:31 crc kubenswrapper[4795]: I0219 23:14:31.393769 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jvh4m" event={"ID":"453f74c8-d1b5-4e9f-b405-0341eead8a87","Type":"ContainerStarted","Data":"4546fb2b0f692e0afbd8f9a2c0d0da64c1561e2b5959bb521a8e4212817a8c2c"} Feb 19 23:14:32 crc kubenswrapper[4795]: I0219 23:14:32.406363 4795 generic.go:334] "Generic (PLEG): container finished" podID="453f74c8-d1b5-4e9f-b405-0341eead8a87" containerID="4546fb2b0f692e0afbd8f9a2c0d0da64c1561e2b5959bb521a8e4212817a8c2c" exitCode=0 Feb 19 23:14:32 crc kubenswrapper[4795]: I0219 23:14:32.406442 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jvh4m" event={"ID":"453f74c8-d1b5-4e9f-b405-0341eead8a87","Type":"ContainerDied","Data":"4546fb2b0f692e0afbd8f9a2c0d0da64c1561e2b5959bb521a8e4212817a8c2c"} Feb 19 23:14:33 crc kubenswrapper[4795]: I0219 23:14:33.417897 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jvh4m" event={"ID":"453f74c8-d1b5-4e9f-b405-0341eead8a87","Type":"ContainerStarted","Data":"b62b2acf00fa6b1a617728bd14c7220c9a57c93c0808377409db315c3214c241"} Feb 19 23:14:33 crc kubenswrapper[4795]: I0219 23:14:33.443854 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jvh4m" podStartSLOduration=2.005029077 podStartE2EDuration="4.443836364s" podCreationTimestamp="2026-02-19 23:14:29 +0000 UTC" firstStartedPulling="2026-02-19 23:14:30.384677983 +0000 UTC m=+6381.577195847" lastFinishedPulling="2026-02-19 23:14:32.82348527 +0000 UTC 
m=+6384.016003134" observedRunningTime="2026-02-19 23:14:33.43551894 +0000 UTC m=+6384.628036824" watchObservedRunningTime="2026-02-19 23:14:33.443836364 +0000 UTC m=+6384.636354218" Feb 19 23:14:35 crc kubenswrapper[4795]: I0219 23:14:35.512264 4795 scope.go:117] "RemoveContainer" containerID="ee7132fbe9aafdbd5a750ec85268f079d3e663b545f9aa0997d8e4efe3faebb4" Feb 19 23:14:35 crc kubenswrapper[4795]: E0219 23:14:35.512947 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:14:39 crc kubenswrapper[4795]: I0219 23:14:39.389745 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jvh4m" Feb 19 23:14:39 crc kubenswrapper[4795]: I0219 23:14:39.390406 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jvh4m" Feb 19 23:14:39 crc kubenswrapper[4795]: I0219 23:14:39.463149 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jvh4m" Feb 19 23:14:39 crc kubenswrapper[4795]: I0219 23:14:39.555385 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jvh4m" Feb 19 23:14:39 crc kubenswrapper[4795]: I0219 23:14:39.706509 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jvh4m"] Feb 19 23:14:41 crc kubenswrapper[4795]: I0219 23:14:41.493948 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jvh4m" 
podUID="453f74c8-d1b5-4e9f-b405-0341eead8a87" containerName="registry-server" containerID="cri-o://b62b2acf00fa6b1a617728bd14c7220c9a57c93c0808377409db315c3214c241" gracePeriod=2 Feb 19 23:14:41 crc kubenswrapper[4795]: I0219 23:14:41.995623 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jvh4m" Feb 19 23:14:42 crc kubenswrapper[4795]: I0219 23:14:42.160975 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/453f74c8-d1b5-4e9f-b405-0341eead8a87-catalog-content\") pod \"453f74c8-d1b5-4e9f-b405-0341eead8a87\" (UID: \"453f74c8-d1b5-4e9f-b405-0341eead8a87\") " Feb 19 23:14:42 crc kubenswrapper[4795]: I0219 23:14:42.161050 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/453f74c8-d1b5-4e9f-b405-0341eead8a87-utilities\") pod \"453f74c8-d1b5-4e9f-b405-0341eead8a87\" (UID: \"453f74c8-d1b5-4e9f-b405-0341eead8a87\") " Feb 19 23:14:42 crc kubenswrapper[4795]: I0219 23:14:42.161222 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dsgf9\" (UniqueName: \"kubernetes.io/projected/453f74c8-d1b5-4e9f-b405-0341eead8a87-kube-api-access-dsgf9\") pod \"453f74c8-d1b5-4e9f-b405-0341eead8a87\" (UID: \"453f74c8-d1b5-4e9f-b405-0341eead8a87\") " Feb 19 23:14:42 crc kubenswrapper[4795]: I0219 23:14:42.162281 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/453f74c8-d1b5-4e9f-b405-0341eead8a87-utilities" (OuterVolumeSpecName: "utilities") pod "453f74c8-d1b5-4e9f-b405-0341eead8a87" (UID: "453f74c8-d1b5-4e9f-b405-0341eead8a87"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:14:42 crc kubenswrapper[4795]: I0219 23:14:42.168040 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/453f74c8-d1b5-4e9f-b405-0341eead8a87-kube-api-access-dsgf9" (OuterVolumeSpecName: "kube-api-access-dsgf9") pod "453f74c8-d1b5-4e9f-b405-0341eead8a87" (UID: "453f74c8-d1b5-4e9f-b405-0341eead8a87"). InnerVolumeSpecName "kube-api-access-dsgf9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:14:42 crc kubenswrapper[4795]: I0219 23:14:42.183501 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/453f74c8-d1b5-4e9f-b405-0341eead8a87-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "453f74c8-d1b5-4e9f-b405-0341eead8a87" (UID: "453f74c8-d1b5-4e9f-b405-0341eead8a87"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:14:42 crc kubenswrapper[4795]: I0219 23:14:42.264717 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/453f74c8-d1b5-4e9f-b405-0341eead8a87-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 23:14:42 crc kubenswrapper[4795]: I0219 23:14:42.264760 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/453f74c8-d1b5-4e9f-b405-0341eead8a87-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 23:14:42 crc kubenswrapper[4795]: I0219 23:14:42.264776 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dsgf9\" (UniqueName: \"kubernetes.io/projected/453f74c8-d1b5-4e9f-b405-0341eead8a87-kube-api-access-dsgf9\") on node \"crc\" DevicePath \"\"" Feb 19 23:14:42 crc kubenswrapper[4795]: I0219 23:14:42.508273 4795 generic.go:334] "Generic (PLEG): container finished" podID="453f74c8-d1b5-4e9f-b405-0341eead8a87" 
containerID="b62b2acf00fa6b1a617728bd14c7220c9a57c93c0808377409db315c3214c241" exitCode=0 Feb 19 23:14:42 crc kubenswrapper[4795]: I0219 23:14:42.508326 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jvh4m" event={"ID":"453f74c8-d1b5-4e9f-b405-0341eead8a87","Type":"ContainerDied","Data":"b62b2acf00fa6b1a617728bd14c7220c9a57c93c0808377409db315c3214c241"} Feb 19 23:14:42 crc kubenswrapper[4795]: I0219 23:14:42.508350 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jvh4m" Feb 19 23:14:42 crc kubenswrapper[4795]: I0219 23:14:42.508371 4795 scope.go:117] "RemoveContainer" containerID="b62b2acf00fa6b1a617728bd14c7220c9a57c93c0808377409db315c3214c241" Feb 19 23:14:42 crc kubenswrapper[4795]: I0219 23:14:42.508357 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jvh4m" event={"ID":"453f74c8-d1b5-4e9f-b405-0341eead8a87","Type":"ContainerDied","Data":"50d08332a8251ec3f48c5be46a354c07cc40d484277909a9ca772e3ee10ef0b2"} Feb 19 23:14:42 crc kubenswrapper[4795]: I0219 23:14:42.554712 4795 scope.go:117] "RemoveContainer" containerID="4546fb2b0f692e0afbd8f9a2c0d0da64c1561e2b5959bb521a8e4212817a8c2c" Feb 19 23:14:42 crc kubenswrapper[4795]: I0219 23:14:42.557476 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jvh4m"] Feb 19 23:14:42 crc kubenswrapper[4795]: I0219 23:14:42.571896 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jvh4m"] Feb 19 23:14:42 crc kubenswrapper[4795]: I0219 23:14:42.580457 4795 scope.go:117] "RemoveContainer" containerID="62121896a47c05580cb59ecb2938a3458503ebc87eafd5ef41f948ed9909b535" Feb 19 23:14:42 crc kubenswrapper[4795]: I0219 23:14:42.639909 4795 scope.go:117] "RemoveContainer" containerID="b62b2acf00fa6b1a617728bd14c7220c9a57c93c0808377409db315c3214c241" Feb 19 
23:14:42 crc kubenswrapper[4795]: E0219 23:14:42.640415 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b62b2acf00fa6b1a617728bd14c7220c9a57c93c0808377409db315c3214c241\": container with ID starting with b62b2acf00fa6b1a617728bd14c7220c9a57c93c0808377409db315c3214c241 not found: ID does not exist" containerID="b62b2acf00fa6b1a617728bd14c7220c9a57c93c0808377409db315c3214c241" Feb 19 23:14:42 crc kubenswrapper[4795]: I0219 23:14:42.640454 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b62b2acf00fa6b1a617728bd14c7220c9a57c93c0808377409db315c3214c241"} err="failed to get container status \"b62b2acf00fa6b1a617728bd14c7220c9a57c93c0808377409db315c3214c241\": rpc error: code = NotFound desc = could not find container \"b62b2acf00fa6b1a617728bd14c7220c9a57c93c0808377409db315c3214c241\": container with ID starting with b62b2acf00fa6b1a617728bd14c7220c9a57c93c0808377409db315c3214c241 not found: ID does not exist" Feb 19 23:14:42 crc kubenswrapper[4795]: I0219 23:14:42.640498 4795 scope.go:117] "RemoveContainer" containerID="4546fb2b0f692e0afbd8f9a2c0d0da64c1561e2b5959bb521a8e4212817a8c2c" Feb 19 23:14:42 crc kubenswrapper[4795]: E0219 23:14:42.640766 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4546fb2b0f692e0afbd8f9a2c0d0da64c1561e2b5959bb521a8e4212817a8c2c\": container with ID starting with 4546fb2b0f692e0afbd8f9a2c0d0da64c1561e2b5959bb521a8e4212817a8c2c not found: ID does not exist" containerID="4546fb2b0f692e0afbd8f9a2c0d0da64c1561e2b5959bb521a8e4212817a8c2c" Feb 19 23:14:42 crc kubenswrapper[4795]: I0219 23:14:42.640808 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4546fb2b0f692e0afbd8f9a2c0d0da64c1561e2b5959bb521a8e4212817a8c2c"} err="failed to get container status 
\"4546fb2b0f692e0afbd8f9a2c0d0da64c1561e2b5959bb521a8e4212817a8c2c\": rpc error: code = NotFound desc = could not find container \"4546fb2b0f692e0afbd8f9a2c0d0da64c1561e2b5959bb521a8e4212817a8c2c\": container with ID starting with 4546fb2b0f692e0afbd8f9a2c0d0da64c1561e2b5959bb521a8e4212817a8c2c not found: ID does not exist" Feb 19 23:14:42 crc kubenswrapper[4795]: I0219 23:14:42.640840 4795 scope.go:117] "RemoveContainer" containerID="62121896a47c05580cb59ecb2938a3458503ebc87eafd5ef41f948ed9909b535" Feb 19 23:14:42 crc kubenswrapper[4795]: E0219 23:14:42.641271 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62121896a47c05580cb59ecb2938a3458503ebc87eafd5ef41f948ed9909b535\": container with ID starting with 62121896a47c05580cb59ecb2938a3458503ebc87eafd5ef41f948ed9909b535 not found: ID does not exist" containerID="62121896a47c05580cb59ecb2938a3458503ebc87eafd5ef41f948ed9909b535" Feb 19 23:14:42 crc kubenswrapper[4795]: I0219 23:14:42.641318 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62121896a47c05580cb59ecb2938a3458503ebc87eafd5ef41f948ed9909b535"} err="failed to get container status \"62121896a47c05580cb59ecb2938a3458503ebc87eafd5ef41f948ed9909b535\": rpc error: code = NotFound desc = could not find container \"62121896a47c05580cb59ecb2938a3458503ebc87eafd5ef41f948ed9909b535\": container with ID starting with 62121896a47c05580cb59ecb2938a3458503ebc87eafd5ef41f948ed9909b535 not found: ID does not exist" Feb 19 23:14:43 crc kubenswrapper[4795]: I0219 23:14:43.526767 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="453f74c8-d1b5-4e9f-b405-0341eead8a87" path="/var/lib/kubelet/pods/453f74c8-d1b5-4e9f-b405-0341eead8a87/volumes" Feb 19 23:14:49 crc kubenswrapper[4795]: I0219 23:14:49.518202 4795 scope.go:117] "RemoveContainer" containerID="ee7132fbe9aafdbd5a750ec85268f079d3e663b545f9aa0997d8e4efe3faebb4" Feb 19 
23:14:49 crc kubenswrapper[4795]: E0219 23:14:49.518931 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:15:00 crc kubenswrapper[4795]: I0219 23:15:00.163623 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525715-mrk95"] Feb 19 23:15:00 crc kubenswrapper[4795]: E0219 23:15:00.164817 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="453f74c8-d1b5-4e9f-b405-0341eead8a87" containerName="registry-server" Feb 19 23:15:00 crc kubenswrapper[4795]: I0219 23:15:00.164837 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="453f74c8-d1b5-4e9f-b405-0341eead8a87" containerName="registry-server" Feb 19 23:15:00 crc kubenswrapper[4795]: E0219 23:15:00.164900 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="453f74c8-d1b5-4e9f-b405-0341eead8a87" containerName="extract-content" Feb 19 23:15:00 crc kubenswrapper[4795]: I0219 23:15:00.164910 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="453f74c8-d1b5-4e9f-b405-0341eead8a87" containerName="extract-content" Feb 19 23:15:00 crc kubenswrapper[4795]: E0219 23:15:00.164931 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="453f74c8-d1b5-4e9f-b405-0341eead8a87" containerName="extract-utilities" Feb 19 23:15:00 crc kubenswrapper[4795]: I0219 23:15:00.164941 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="453f74c8-d1b5-4e9f-b405-0341eead8a87" containerName="extract-utilities" Feb 19 23:15:00 crc kubenswrapper[4795]: I0219 23:15:00.165305 4795 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="453f74c8-d1b5-4e9f-b405-0341eead8a87" containerName="registry-server" Feb 19 23:15:00 crc kubenswrapper[4795]: I0219 23:15:00.166413 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525715-mrk95" Feb 19 23:15:00 crc kubenswrapper[4795]: I0219 23:15:00.175573 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 19 23:15:00 crc kubenswrapper[4795]: I0219 23:15:00.175842 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 19 23:15:00 crc kubenswrapper[4795]: I0219 23:15:00.179994 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525715-mrk95"] Feb 19 23:15:00 crc kubenswrapper[4795]: I0219 23:15:00.198284 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3eafa182-621e-48fe-a019-360c2f94c212-secret-volume\") pod \"collect-profiles-29525715-mrk95\" (UID: \"3eafa182-621e-48fe-a019-360c2f94c212\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525715-mrk95" Feb 19 23:15:00 crc kubenswrapper[4795]: I0219 23:15:00.198325 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3eafa182-621e-48fe-a019-360c2f94c212-config-volume\") pod \"collect-profiles-29525715-mrk95\" (UID: \"3eafa182-621e-48fe-a019-360c2f94c212\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525715-mrk95" Feb 19 23:15:00 crc kubenswrapper[4795]: I0219 23:15:00.198364 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsdb9\" (UniqueName: 
\"kubernetes.io/projected/3eafa182-621e-48fe-a019-360c2f94c212-kube-api-access-xsdb9\") pod \"collect-profiles-29525715-mrk95\" (UID: \"3eafa182-621e-48fe-a019-360c2f94c212\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525715-mrk95" Feb 19 23:15:00 crc kubenswrapper[4795]: I0219 23:15:00.299614 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3eafa182-621e-48fe-a019-360c2f94c212-secret-volume\") pod \"collect-profiles-29525715-mrk95\" (UID: \"3eafa182-621e-48fe-a019-360c2f94c212\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525715-mrk95" Feb 19 23:15:00 crc kubenswrapper[4795]: I0219 23:15:00.299676 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3eafa182-621e-48fe-a019-360c2f94c212-config-volume\") pod \"collect-profiles-29525715-mrk95\" (UID: \"3eafa182-621e-48fe-a019-360c2f94c212\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525715-mrk95" Feb 19 23:15:00 crc kubenswrapper[4795]: I0219 23:15:00.299726 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsdb9\" (UniqueName: \"kubernetes.io/projected/3eafa182-621e-48fe-a019-360c2f94c212-kube-api-access-xsdb9\") pod \"collect-profiles-29525715-mrk95\" (UID: \"3eafa182-621e-48fe-a019-360c2f94c212\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525715-mrk95" Feb 19 23:15:00 crc kubenswrapper[4795]: I0219 23:15:00.300665 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3eafa182-621e-48fe-a019-360c2f94c212-config-volume\") pod \"collect-profiles-29525715-mrk95\" (UID: \"3eafa182-621e-48fe-a019-360c2f94c212\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525715-mrk95" Feb 19 23:15:00 crc kubenswrapper[4795]: I0219 
23:15:00.308189 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3eafa182-621e-48fe-a019-360c2f94c212-secret-volume\") pod \"collect-profiles-29525715-mrk95\" (UID: \"3eafa182-621e-48fe-a019-360c2f94c212\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525715-mrk95" Feb 19 23:15:00 crc kubenswrapper[4795]: I0219 23:15:00.315466 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsdb9\" (UniqueName: \"kubernetes.io/projected/3eafa182-621e-48fe-a019-360c2f94c212-kube-api-access-xsdb9\") pod \"collect-profiles-29525715-mrk95\" (UID: \"3eafa182-621e-48fe-a019-360c2f94c212\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525715-mrk95" Feb 19 23:15:00 crc kubenswrapper[4795]: I0219 23:15:00.498820 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525715-mrk95" Feb 19 23:15:00 crc kubenswrapper[4795]: I0219 23:15:00.989397 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525715-mrk95"] Feb 19 23:15:00 crc kubenswrapper[4795]: W0219 23:15:00.989643 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3eafa182_621e_48fe_a019_360c2f94c212.slice/crio-6406f99a63ec383df0dd1e873de85230266833554c0a7f03d3954fe835bc1010 WatchSource:0}: Error finding container 6406f99a63ec383df0dd1e873de85230266833554c0a7f03d3954fe835bc1010: Status 404 returned error can't find the container with id 6406f99a63ec383df0dd1e873de85230266833554c0a7f03d3954fe835bc1010 Feb 19 23:15:01 crc kubenswrapper[4795]: I0219 23:15:01.512426 4795 scope.go:117] "RemoveContainer" containerID="ee7132fbe9aafdbd5a750ec85268f079d3e663b545f9aa0997d8e4efe3faebb4" Feb 19 23:15:01 crc kubenswrapper[4795]: E0219 23:15:01.513221 4795 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:15:01 crc kubenswrapper[4795]: I0219 23:15:01.724999 4795 generic.go:334] "Generic (PLEG): container finished" podID="3eafa182-621e-48fe-a019-360c2f94c212" containerID="506e26301ea1d8c97cb2f36560ab4d1582f8a10338a4a109885babb88591cfe0" exitCode=0 Feb 19 23:15:01 crc kubenswrapper[4795]: I0219 23:15:01.725047 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525715-mrk95" event={"ID":"3eafa182-621e-48fe-a019-360c2f94c212","Type":"ContainerDied","Data":"506e26301ea1d8c97cb2f36560ab4d1582f8a10338a4a109885babb88591cfe0"} Feb 19 23:15:01 crc kubenswrapper[4795]: I0219 23:15:01.725105 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525715-mrk95" event={"ID":"3eafa182-621e-48fe-a019-360c2f94c212","Type":"ContainerStarted","Data":"6406f99a63ec383df0dd1e873de85230266833554c0a7f03d3954fe835bc1010"} Feb 19 23:15:03 crc kubenswrapper[4795]: I0219 23:15:03.088765 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525715-mrk95" Feb 19 23:15:03 crc kubenswrapper[4795]: I0219 23:15:03.259796 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3eafa182-621e-48fe-a019-360c2f94c212-config-volume\") pod \"3eafa182-621e-48fe-a019-360c2f94c212\" (UID: \"3eafa182-621e-48fe-a019-360c2f94c212\") " Feb 19 23:15:03 crc kubenswrapper[4795]: I0219 23:15:03.260217 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3eafa182-621e-48fe-a019-360c2f94c212-secret-volume\") pod \"3eafa182-621e-48fe-a019-360c2f94c212\" (UID: \"3eafa182-621e-48fe-a019-360c2f94c212\") " Feb 19 23:15:03 crc kubenswrapper[4795]: I0219 23:15:03.260353 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xsdb9\" (UniqueName: \"kubernetes.io/projected/3eafa182-621e-48fe-a019-360c2f94c212-kube-api-access-xsdb9\") pod \"3eafa182-621e-48fe-a019-360c2f94c212\" (UID: \"3eafa182-621e-48fe-a019-360c2f94c212\") " Feb 19 23:15:03 crc kubenswrapper[4795]: I0219 23:15:03.261009 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3eafa182-621e-48fe-a019-360c2f94c212-config-volume" (OuterVolumeSpecName: "config-volume") pod "3eafa182-621e-48fe-a019-360c2f94c212" (UID: "3eafa182-621e-48fe-a019-360c2f94c212"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:15:03 crc kubenswrapper[4795]: I0219 23:15:03.261917 4795 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3eafa182-621e-48fe-a019-360c2f94c212-config-volume\") on node \"crc\" DevicePath \"\"" Feb 19 23:15:03 crc kubenswrapper[4795]: I0219 23:15:03.265818 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3eafa182-621e-48fe-a019-360c2f94c212-kube-api-access-xsdb9" (OuterVolumeSpecName: "kube-api-access-xsdb9") pod "3eafa182-621e-48fe-a019-360c2f94c212" (UID: "3eafa182-621e-48fe-a019-360c2f94c212"). InnerVolumeSpecName "kube-api-access-xsdb9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:15:03 crc kubenswrapper[4795]: I0219 23:15:03.265929 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3eafa182-621e-48fe-a019-360c2f94c212-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3eafa182-621e-48fe-a019-360c2f94c212" (UID: "3eafa182-621e-48fe-a019-360c2f94c212"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:15:03 crc kubenswrapper[4795]: I0219 23:15:03.364048 4795 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3eafa182-621e-48fe-a019-360c2f94c212-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 19 23:15:03 crc kubenswrapper[4795]: I0219 23:15:03.364535 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xsdb9\" (UniqueName: \"kubernetes.io/projected/3eafa182-621e-48fe-a019-360c2f94c212-kube-api-access-xsdb9\") on node \"crc\" DevicePath \"\"" Feb 19 23:15:03 crc kubenswrapper[4795]: I0219 23:15:03.745637 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525715-mrk95" event={"ID":"3eafa182-621e-48fe-a019-360c2f94c212","Type":"ContainerDied","Data":"6406f99a63ec383df0dd1e873de85230266833554c0a7f03d3954fe835bc1010"} Feb 19 23:15:03 crc kubenswrapper[4795]: I0219 23:15:03.745685 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6406f99a63ec383df0dd1e873de85230266833554c0a7f03d3954fe835bc1010" Feb 19 23:15:03 crc kubenswrapper[4795]: I0219 23:15:03.746045 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525715-mrk95" Feb 19 23:15:04 crc kubenswrapper[4795]: I0219 23:15:04.165147 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525670-bx5qr"] Feb 19 23:15:04 crc kubenswrapper[4795]: I0219 23:15:04.173358 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525670-bx5qr"] Feb 19 23:15:05 crc kubenswrapper[4795]: I0219 23:15:05.527935 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8d7fc5a-2c38-45d1-92d4-e30329082e49" path="/var/lib/kubelet/pods/e8d7fc5a-2c38-45d1-92d4-e30329082e49/volumes" Feb 19 23:15:15 crc kubenswrapper[4795]: I0219 23:15:15.516370 4795 scope.go:117] "RemoveContainer" containerID="ee7132fbe9aafdbd5a750ec85268f079d3e663b545f9aa0997d8e4efe3faebb4" Feb 19 23:15:15 crc kubenswrapper[4795]: E0219 23:15:15.517622 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:15:26 crc kubenswrapper[4795]: I0219 23:15:26.513579 4795 scope.go:117] "RemoveContainer" containerID="ee7132fbe9aafdbd5a750ec85268f079d3e663b545f9aa0997d8e4efe3faebb4" Feb 19 23:15:26 crc kubenswrapper[4795]: E0219 23:15:26.515019 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:15:27 crc kubenswrapper[4795]: I0219 23:15:27.046975 4795 scope.go:117] "RemoveContainer" containerID="2891e05af0080148a30c661d705a64987123339160913040e4d09a5170f489c1" Feb 19 23:15:39 crc kubenswrapper[4795]: I0219 23:15:39.525268 4795 scope.go:117] "RemoveContainer" containerID="ee7132fbe9aafdbd5a750ec85268f079d3e663b545f9aa0997d8e4efe3faebb4" Feb 19 23:15:39 crc kubenswrapper[4795]: E0219 23:15:39.526024 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:15:54 crc kubenswrapper[4795]: I0219 23:15:54.511503 4795 scope.go:117] "RemoveContainer" containerID="ee7132fbe9aafdbd5a750ec85268f079d3e663b545f9aa0997d8e4efe3faebb4" Feb 19 23:15:54 crc kubenswrapper[4795]: E0219 23:15:54.512399 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:16:07 crc kubenswrapper[4795]: I0219 23:16:07.512112 4795 scope.go:117] "RemoveContainer" containerID="ee7132fbe9aafdbd5a750ec85268f079d3e663b545f9aa0997d8e4efe3faebb4" Feb 19 23:16:07 crc kubenswrapper[4795]: E0219 23:16:07.512935 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:16:09 crc kubenswrapper[4795]: I0219 23:16:09.062353 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-pvsmv"] Feb 19 23:16:09 crc kubenswrapper[4795]: I0219 23:16:09.071993 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-4351-account-create-update-7cp54"] Feb 19 23:16:09 crc kubenswrapper[4795]: I0219 23:16:09.080654 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-pvsmv"] Feb 19 23:16:09 crc kubenswrapper[4795]: I0219 23:16:09.088692 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-4351-account-create-update-7cp54"] Feb 19 23:16:09 crc kubenswrapper[4795]: I0219 23:16:09.526821 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ce3f6fb-8688-4e53-8d30-e6c7edbf5636" path="/var/lib/kubelet/pods/3ce3f6fb-8688-4e53-8d30-e6c7edbf5636/volumes" Feb 19 23:16:09 crc kubenswrapper[4795]: I0219 23:16:09.527683 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7049350-2c57-49c2-aef7-b9f0bd28abfc" path="/var/lib/kubelet/pods/c7049350-2c57-49c2-aef7-b9f0bd28abfc/volumes" Feb 19 23:16:21 crc kubenswrapper[4795]: I0219 23:16:21.511903 4795 scope.go:117] "RemoveContainer" containerID="ee7132fbe9aafdbd5a750ec85268f079d3e663b545f9aa0997d8e4efe3faebb4" Feb 19 23:16:21 crc kubenswrapper[4795]: E0219 23:16:21.513473 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:16:22 crc kubenswrapper[4795]: I0219 23:16:22.041795 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-wlhqm"] Feb 19 23:16:22 crc kubenswrapper[4795]: I0219 23:16:22.051927 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-wlhqm"] Feb 19 23:16:23 crc kubenswrapper[4795]: I0219 23:16:23.525939 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="497c4c82-13ae-430c-83bd-1f1c4d4683e4" path="/var/lib/kubelet/pods/497c4c82-13ae-430c-83bd-1f1c4d4683e4/volumes" Feb 19 23:16:27 crc kubenswrapper[4795]: I0219 23:16:27.149411 4795 scope.go:117] "RemoveContainer" containerID="26cce21cbd7189a101a6a725aa7d2a769b17bb5f8957270015deb5068ba381a3" Feb 19 23:16:27 crc kubenswrapper[4795]: I0219 23:16:27.179782 4795 scope.go:117] "RemoveContainer" containerID="be3b7d10ee8dbba79631201a8d5da4057d99e4383e860152ba77db4992ff52fc" Feb 19 23:16:27 crc kubenswrapper[4795]: I0219 23:16:27.233796 4795 scope.go:117] "RemoveContainer" containerID="394778543b7d586f5d57eb0c386c1afcba774c0cbc8858b3c036d9d786189525" Feb 19 23:16:34 crc kubenswrapper[4795]: I0219 23:16:34.511286 4795 scope.go:117] "RemoveContainer" containerID="ee7132fbe9aafdbd5a750ec85268f079d3e663b545f9aa0997d8e4efe3faebb4" Feb 19 23:16:34 crc kubenswrapper[4795]: E0219 23:16:34.512035 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 
23:16:47 crc kubenswrapper[4795]: I0219 23:16:47.512814 4795 scope.go:117] "RemoveContainer" containerID="ee7132fbe9aafdbd5a750ec85268f079d3e663b545f9aa0997d8e4efe3faebb4" Feb 19 23:16:47 crc kubenswrapper[4795]: E0219 23:16:47.513933 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:17:02 crc kubenswrapper[4795]: I0219 23:17:02.512061 4795 scope.go:117] "RemoveContainer" containerID="ee7132fbe9aafdbd5a750ec85268f079d3e663b545f9aa0997d8e4efe3faebb4" Feb 19 23:17:02 crc kubenswrapper[4795]: E0219 23:17:02.513011 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:17:14 crc kubenswrapper[4795]: I0219 23:17:14.512615 4795 scope.go:117] "RemoveContainer" containerID="ee7132fbe9aafdbd5a750ec85268f079d3e663b545f9aa0997d8e4efe3faebb4" Feb 19 23:17:14 crc kubenswrapper[4795]: E0219 23:17:14.513732 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" 
podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:17:25 crc kubenswrapper[4795]: I0219 23:17:25.512055 4795 scope.go:117] "RemoveContainer" containerID="ee7132fbe9aafdbd5a750ec85268f079d3e663b545f9aa0997d8e4efe3faebb4" Feb 19 23:17:25 crc kubenswrapper[4795]: E0219 23:17:25.513834 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:17:36 crc kubenswrapper[4795]: I0219 23:17:36.511782 4795 scope.go:117] "RemoveContainer" containerID="ee7132fbe9aafdbd5a750ec85268f079d3e663b545f9aa0997d8e4efe3faebb4" Feb 19 23:17:36 crc kubenswrapper[4795]: E0219 23:17:36.512536 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:17:51 crc kubenswrapper[4795]: I0219 23:17:51.518557 4795 scope.go:117] "RemoveContainer" containerID="ee7132fbe9aafdbd5a750ec85268f079d3e663b545f9aa0997d8e4efe3faebb4" Feb 19 23:17:51 crc kubenswrapper[4795]: E0219 23:17:51.519880 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:18:06 crc kubenswrapper[4795]: I0219 23:18:06.512100 4795 scope.go:117] "RemoveContainer" containerID="ee7132fbe9aafdbd5a750ec85268f079d3e663b545f9aa0997d8e4efe3faebb4" Feb 19 23:18:06 crc kubenswrapper[4795]: E0219 23:18:06.513148 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:18:17 crc kubenswrapper[4795]: I0219 23:18:17.513307 4795 scope.go:117] "RemoveContainer" containerID="ee7132fbe9aafdbd5a750ec85268f079d3e663b545f9aa0997d8e4efe3faebb4" Feb 19 23:18:17 crc kubenswrapper[4795]: E0219 23:18:17.514049 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:18:25 crc kubenswrapper[4795]: I0219 23:18:25.038982 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-jll2l"] Feb 19 23:18:25 crc kubenswrapper[4795]: I0219 23:18:25.056930 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-aa86-account-create-update-n8xdq"] Feb 19 23:18:25 crc kubenswrapper[4795]: I0219 23:18:25.066538 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-jll2l"] Feb 19 23:18:25 crc kubenswrapper[4795]: I0219 
23:18:25.074968 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-aa86-account-create-update-n8xdq"] Feb 19 23:18:25 crc kubenswrapper[4795]: I0219 23:18:25.522448 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3086733-54e4-4041-9896-88f6df519492" path="/var/lib/kubelet/pods/d3086733-54e4-4041-9896-88f6df519492/volumes" Feb 19 23:18:25 crc kubenswrapper[4795]: I0219 23:18:25.523071 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3ad8082-2f7c-4c51-ac6d-f6121f30d0c8" path="/var/lib/kubelet/pods/d3ad8082-2f7c-4c51-ac6d-f6121f30d0c8/volumes" Feb 19 23:18:27 crc kubenswrapper[4795]: I0219 23:18:27.351301 4795 scope.go:117] "RemoveContainer" containerID="49e34c186890a85782e2cf1f05ffa268eb8918522484c1ad1e090793c43863fa" Feb 19 23:18:27 crc kubenswrapper[4795]: I0219 23:18:27.378955 4795 scope.go:117] "RemoveContainer" containerID="6653ee424e57053f6b0308afd986889a06d2949741f5136682c4b14068b5b724" Feb 19 23:18:30 crc kubenswrapper[4795]: I0219 23:18:30.512093 4795 scope.go:117] "RemoveContainer" containerID="ee7132fbe9aafdbd5a750ec85268f079d3e663b545f9aa0997d8e4efe3faebb4" Feb 19 23:18:30 crc kubenswrapper[4795]: I0219 23:18:30.873240 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerStarted","Data":"ad9ce3f3f0f6a8b1730c68641f9c3a9fd43cad22ef4d316d4a9d380b0b12e9e5"} Feb 19 23:18:35 crc kubenswrapper[4795]: I0219 23:18:35.027704 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-q2lkk"] Feb 19 23:18:35 crc kubenswrapper[4795]: I0219 23:18:35.035448 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-q2lkk"] Feb 19 23:18:35 crc kubenswrapper[4795]: I0219 23:18:35.522522 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1953fbb-b558-497f-b889-62b41f35e4b4" 
path="/var/lib/kubelet/pods/e1953fbb-b558-497f-b889-62b41f35e4b4/volumes" Feb 19 23:18:53 crc kubenswrapper[4795]: I0219 23:18:53.041203 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-4647-account-create-update-f7k78"] Feb 19 23:18:53 crc kubenswrapper[4795]: I0219 23:18:53.051494 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-create-657cv"] Feb 19 23:18:53 crc kubenswrapper[4795]: I0219 23:18:53.060679 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-create-657cv"] Feb 19 23:18:53 crc kubenswrapper[4795]: I0219 23:18:53.068103 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-4647-account-create-update-f7k78"] Feb 19 23:18:53 crc kubenswrapper[4795]: I0219 23:18:53.524558 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a6dde49-b59e-4a4a-ad84-6386aa1dc6ce" path="/var/lib/kubelet/pods/4a6dde49-b59e-4a4a-ad84-6386aa1dc6ce/volumes" Feb 19 23:18:53 crc kubenswrapper[4795]: I0219 23:18:53.526610 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eadcaa3c-623a-409a-b735-2a38854c8036" path="/var/lib/kubelet/pods/eadcaa3c-623a-409a-b735-2a38854c8036/volumes" Feb 19 23:19:05 crc kubenswrapper[4795]: I0219 23:19:05.055622 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-sync-8fzxs"] Feb 19 23:19:05 crc kubenswrapper[4795]: I0219 23:19:05.070610 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-sync-8fzxs"] Feb 19 23:19:05 crc kubenswrapper[4795]: I0219 23:19:05.546744 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd19113b-623e-4f3e-8392-09968a5d71f9" path="/var/lib/kubelet/pods/bd19113b-623e-4f3e-8392-09968a5d71f9/volumes" Feb 19 23:19:27 crc kubenswrapper[4795]: I0219 23:19:27.478070 4795 scope.go:117] "RemoveContainer" containerID="2537d15feb1a2c6f922f8d39bb6ed8ef442cb362fc714c8399a87d6d69b4784d" Feb 19 23:19:27 
crc kubenswrapper[4795]: I0219 23:19:27.512095 4795 scope.go:117] "RemoveContainer" containerID="47bda409c5d13cad14cf54811039c8544c7b3358bb1de45385c9a60e61a75704" Feb 19 23:19:27 crc kubenswrapper[4795]: I0219 23:19:27.581816 4795 scope.go:117] "RemoveContainer" containerID="74c8a86b466b15b56eddcfa8140418aa598bd95ce7f01643ab2976a1bd25cfe5" Feb 19 23:19:27 crc kubenswrapper[4795]: I0219 23:19:27.624383 4795 scope.go:117] "RemoveContainer" containerID="a15bc43318a5f56769ebc9469003d0ddb425e341f35a3faca3d202f2707e3b2d" Feb 19 23:20:39 crc kubenswrapper[4795]: I0219 23:20:39.020972 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qvzdn"] Feb 19 23:20:39 crc kubenswrapper[4795]: E0219 23:20:39.021909 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3eafa182-621e-48fe-a019-360c2f94c212" containerName="collect-profiles" Feb 19 23:20:39 crc kubenswrapper[4795]: I0219 23:20:39.021924 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="3eafa182-621e-48fe-a019-360c2f94c212" containerName="collect-profiles" Feb 19 23:20:39 crc kubenswrapper[4795]: I0219 23:20:39.022140 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="3eafa182-621e-48fe-a019-360c2f94c212" containerName="collect-profiles" Feb 19 23:20:39 crc kubenswrapper[4795]: I0219 23:20:39.023851 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qvzdn" Feb 19 23:20:39 crc kubenswrapper[4795]: I0219 23:20:39.045127 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qvzdn"] Feb 19 23:20:39 crc kubenswrapper[4795]: I0219 23:20:39.079073 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27d6fc33-0869-4db3-8e1d-ce352d33d9cb-catalog-content\") pod \"community-operators-qvzdn\" (UID: \"27d6fc33-0869-4db3-8e1d-ce352d33d9cb\") " pod="openshift-marketplace/community-operators-qvzdn" Feb 19 23:20:39 crc kubenswrapper[4795]: I0219 23:20:39.079367 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27d6fc33-0869-4db3-8e1d-ce352d33d9cb-utilities\") pod \"community-operators-qvzdn\" (UID: \"27d6fc33-0869-4db3-8e1d-ce352d33d9cb\") " pod="openshift-marketplace/community-operators-qvzdn" Feb 19 23:20:39 crc kubenswrapper[4795]: I0219 23:20:39.079503 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9hvq\" (UniqueName: \"kubernetes.io/projected/27d6fc33-0869-4db3-8e1d-ce352d33d9cb-kube-api-access-g9hvq\") pod \"community-operators-qvzdn\" (UID: \"27d6fc33-0869-4db3-8e1d-ce352d33d9cb\") " pod="openshift-marketplace/community-operators-qvzdn" Feb 19 23:20:39 crc kubenswrapper[4795]: I0219 23:20:39.182316 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27d6fc33-0869-4db3-8e1d-ce352d33d9cb-catalog-content\") pod \"community-operators-qvzdn\" (UID: \"27d6fc33-0869-4db3-8e1d-ce352d33d9cb\") " pod="openshift-marketplace/community-operators-qvzdn" Feb 19 23:20:39 crc kubenswrapper[4795]: I0219 23:20:39.182392 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27d6fc33-0869-4db3-8e1d-ce352d33d9cb-utilities\") pod \"community-operators-qvzdn\" (UID: \"27d6fc33-0869-4db3-8e1d-ce352d33d9cb\") " pod="openshift-marketplace/community-operators-qvzdn" Feb 19 23:20:39 crc kubenswrapper[4795]: I0219 23:20:39.182429 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9hvq\" (UniqueName: \"kubernetes.io/projected/27d6fc33-0869-4db3-8e1d-ce352d33d9cb-kube-api-access-g9hvq\") pod \"community-operators-qvzdn\" (UID: \"27d6fc33-0869-4db3-8e1d-ce352d33d9cb\") " pod="openshift-marketplace/community-operators-qvzdn" Feb 19 23:20:39 crc kubenswrapper[4795]: I0219 23:20:39.182921 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27d6fc33-0869-4db3-8e1d-ce352d33d9cb-utilities\") pod \"community-operators-qvzdn\" (UID: \"27d6fc33-0869-4db3-8e1d-ce352d33d9cb\") " pod="openshift-marketplace/community-operators-qvzdn" Feb 19 23:20:39 crc kubenswrapper[4795]: I0219 23:20:39.183032 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27d6fc33-0869-4db3-8e1d-ce352d33d9cb-catalog-content\") pod \"community-operators-qvzdn\" (UID: \"27d6fc33-0869-4db3-8e1d-ce352d33d9cb\") " pod="openshift-marketplace/community-operators-qvzdn" Feb 19 23:20:39 crc kubenswrapper[4795]: I0219 23:20:39.218888 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9hvq\" (UniqueName: \"kubernetes.io/projected/27d6fc33-0869-4db3-8e1d-ce352d33d9cb-kube-api-access-g9hvq\") pod \"community-operators-qvzdn\" (UID: \"27d6fc33-0869-4db3-8e1d-ce352d33d9cb\") " pod="openshift-marketplace/community-operators-qvzdn" Feb 19 23:20:39 crc kubenswrapper[4795]: I0219 23:20:39.354946 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qvzdn" Feb 19 23:20:39 crc kubenswrapper[4795]: I0219 23:20:39.922956 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qvzdn"] Feb 19 23:20:40 crc kubenswrapper[4795]: I0219 23:20:40.206291 4795 generic.go:334] "Generic (PLEG): container finished" podID="27d6fc33-0869-4db3-8e1d-ce352d33d9cb" containerID="90b7b74d7421043ee57f26dd4154d02ea41a8e86526c38c70cf79a9ef21e8930" exitCode=0 Feb 19 23:20:40 crc kubenswrapper[4795]: I0219 23:20:40.206344 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qvzdn" event={"ID":"27d6fc33-0869-4db3-8e1d-ce352d33d9cb","Type":"ContainerDied","Data":"90b7b74d7421043ee57f26dd4154d02ea41a8e86526c38c70cf79a9ef21e8930"} Feb 19 23:20:40 crc kubenswrapper[4795]: I0219 23:20:40.206591 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qvzdn" event={"ID":"27d6fc33-0869-4db3-8e1d-ce352d33d9cb","Type":"ContainerStarted","Data":"123360a20adc433e46230a3d86ab168ba8be10b64f023d54e7c7d020d514ec34"} Feb 19 23:20:40 crc kubenswrapper[4795]: I0219 23:20:40.209013 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 23:20:41 crc kubenswrapper[4795]: I0219 23:20:41.218834 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qvzdn" event={"ID":"27d6fc33-0869-4db3-8e1d-ce352d33d9cb","Type":"ContainerStarted","Data":"9502aeb2932f638771dcbfbf5feb77ab738d50770951d8f7ad44433899b319c6"} Feb 19 23:20:43 crc kubenswrapper[4795]: I0219 23:20:43.237719 4795 generic.go:334] "Generic (PLEG): container finished" podID="27d6fc33-0869-4db3-8e1d-ce352d33d9cb" containerID="9502aeb2932f638771dcbfbf5feb77ab738d50770951d8f7ad44433899b319c6" exitCode=0 Feb 19 23:20:43 crc kubenswrapper[4795]: I0219 23:20:43.237764 4795 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-qvzdn" event={"ID":"27d6fc33-0869-4db3-8e1d-ce352d33d9cb","Type":"ContainerDied","Data":"9502aeb2932f638771dcbfbf5feb77ab738d50770951d8f7ad44433899b319c6"} Feb 19 23:20:44 crc kubenswrapper[4795]: I0219 23:20:44.249910 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qvzdn" event={"ID":"27d6fc33-0869-4db3-8e1d-ce352d33d9cb","Type":"ContainerStarted","Data":"94823ab5e3f75d119d89e78533d723b0a182b5fc409cd7a722cff54a7a7b4c4e"} Feb 19 23:20:44 crc kubenswrapper[4795]: I0219 23:20:44.269230 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qvzdn" podStartSLOduration=2.820813672 podStartE2EDuration="6.269207535s" podCreationTimestamp="2026-02-19 23:20:38 +0000 UTC" firstStartedPulling="2026-02-19 23:20:40.208824513 +0000 UTC m=+6751.401342377" lastFinishedPulling="2026-02-19 23:20:43.657218386 +0000 UTC m=+6754.849736240" observedRunningTime="2026-02-19 23:20:44.265638635 +0000 UTC m=+6755.458156499" watchObservedRunningTime="2026-02-19 23:20:44.269207535 +0000 UTC m=+6755.461725399" Feb 19 23:20:49 crc kubenswrapper[4795]: I0219 23:20:49.355271 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qvzdn" Feb 19 23:20:49 crc kubenswrapper[4795]: I0219 23:20:49.355935 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qvzdn" Feb 19 23:20:49 crc kubenswrapper[4795]: I0219 23:20:49.423312 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qvzdn" Feb 19 23:20:50 crc kubenswrapper[4795]: I0219 23:20:50.364183 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qvzdn" Feb 19 23:20:50 crc kubenswrapper[4795]: I0219 
23:20:50.422886 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qvzdn"] Feb 19 23:20:52 crc kubenswrapper[4795]: I0219 23:20:52.329067 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qvzdn" podUID="27d6fc33-0869-4db3-8e1d-ce352d33d9cb" containerName="registry-server" containerID="cri-o://94823ab5e3f75d119d89e78533d723b0a182b5fc409cd7a722cff54a7a7b4c4e" gracePeriod=2 Feb 19 23:20:52 crc kubenswrapper[4795]: I0219 23:20:52.857070 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qvzdn" Feb 19 23:20:52 crc kubenswrapper[4795]: I0219 23:20:52.891718 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27d6fc33-0869-4db3-8e1d-ce352d33d9cb-utilities\") pod \"27d6fc33-0869-4db3-8e1d-ce352d33d9cb\" (UID: \"27d6fc33-0869-4db3-8e1d-ce352d33d9cb\") " Feb 19 23:20:52 crc kubenswrapper[4795]: I0219 23:20:52.891855 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27d6fc33-0869-4db3-8e1d-ce352d33d9cb-catalog-content\") pod \"27d6fc33-0869-4db3-8e1d-ce352d33d9cb\" (UID: \"27d6fc33-0869-4db3-8e1d-ce352d33d9cb\") " Feb 19 23:20:52 crc kubenswrapper[4795]: I0219 23:20:52.891900 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9hvq\" (UniqueName: \"kubernetes.io/projected/27d6fc33-0869-4db3-8e1d-ce352d33d9cb-kube-api-access-g9hvq\") pod \"27d6fc33-0869-4db3-8e1d-ce352d33d9cb\" (UID: \"27d6fc33-0869-4db3-8e1d-ce352d33d9cb\") " Feb 19 23:20:52 crc kubenswrapper[4795]: I0219 23:20:52.892481 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27d6fc33-0869-4db3-8e1d-ce352d33d9cb-utilities" (OuterVolumeSpecName: 
"utilities") pod "27d6fc33-0869-4db3-8e1d-ce352d33d9cb" (UID: "27d6fc33-0869-4db3-8e1d-ce352d33d9cb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:20:52 crc kubenswrapper[4795]: I0219 23:20:52.892860 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27d6fc33-0869-4db3-8e1d-ce352d33d9cb-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 23:20:52 crc kubenswrapper[4795]: I0219 23:20:52.898385 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27d6fc33-0869-4db3-8e1d-ce352d33d9cb-kube-api-access-g9hvq" (OuterVolumeSpecName: "kube-api-access-g9hvq") pod "27d6fc33-0869-4db3-8e1d-ce352d33d9cb" (UID: "27d6fc33-0869-4db3-8e1d-ce352d33d9cb"). InnerVolumeSpecName "kube-api-access-g9hvq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:20:52 crc kubenswrapper[4795]: I0219 23:20:52.946399 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27d6fc33-0869-4db3-8e1d-ce352d33d9cb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "27d6fc33-0869-4db3-8e1d-ce352d33d9cb" (UID: "27d6fc33-0869-4db3-8e1d-ce352d33d9cb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:20:52 crc kubenswrapper[4795]: I0219 23:20:52.994249 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27d6fc33-0869-4db3-8e1d-ce352d33d9cb-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 23:20:52 crc kubenswrapper[4795]: I0219 23:20:52.994278 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9hvq\" (UniqueName: \"kubernetes.io/projected/27d6fc33-0869-4db3-8e1d-ce352d33d9cb-kube-api-access-g9hvq\") on node \"crc\" DevicePath \"\"" Feb 19 23:20:53 crc kubenswrapper[4795]: I0219 23:20:53.340798 4795 generic.go:334] "Generic (PLEG): container finished" podID="27d6fc33-0869-4db3-8e1d-ce352d33d9cb" containerID="94823ab5e3f75d119d89e78533d723b0a182b5fc409cd7a722cff54a7a7b4c4e" exitCode=0 Feb 19 23:20:53 crc kubenswrapper[4795]: I0219 23:20:53.340872 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qvzdn" event={"ID":"27d6fc33-0869-4db3-8e1d-ce352d33d9cb","Type":"ContainerDied","Data":"94823ab5e3f75d119d89e78533d723b0a182b5fc409cd7a722cff54a7a7b4c4e"} Feb 19 23:20:53 crc kubenswrapper[4795]: I0219 23:20:53.342145 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qvzdn" event={"ID":"27d6fc33-0869-4db3-8e1d-ce352d33d9cb","Type":"ContainerDied","Data":"123360a20adc433e46230a3d86ab168ba8be10b64f023d54e7c7d020d514ec34"} Feb 19 23:20:53 crc kubenswrapper[4795]: I0219 23:20:53.340914 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qvzdn" Feb 19 23:20:53 crc kubenswrapper[4795]: I0219 23:20:53.342236 4795 scope.go:117] "RemoveContainer" containerID="94823ab5e3f75d119d89e78533d723b0a182b5fc409cd7a722cff54a7a7b4c4e" Feb 19 23:20:53 crc kubenswrapper[4795]: I0219 23:20:53.365827 4795 scope.go:117] "RemoveContainer" containerID="9502aeb2932f638771dcbfbf5feb77ab738d50770951d8f7ad44433899b319c6" Feb 19 23:20:53 crc kubenswrapper[4795]: I0219 23:20:53.387557 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qvzdn"] Feb 19 23:20:53 crc kubenswrapper[4795]: I0219 23:20:53.400716 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qvzdn"] Feb 19 23:20:53 crc kubenswrapper[4795]: I0219 23:20:53.415488 4795 scope.go:117] "RemoveContainer" containerID="90b7b74d7421043ee57f26dd4154d02ea41a8e86526c38c70cf79a9ef21e8930" Feb 19 23:20:53 crc kubenswrapper[4795]: I0219 23:20:53.451511 4795 scope.go:117] "RemoveContainer" containerID="94823ab5e3f75d119d89e78533d723b0a182b5fc409cd7a722cff54a7a7b4c4e" Feb 19 23:20:53 crc kubenswrapper[4795]: E0219 23:20:53.451989 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94823ab5e3f75d119d89e78533d723b0a182b5fc409cd7a722cff54a7a7b4c4e\": container with ID starting with 94823ab5e3f75d119d89e78533d723b0a182b5fc409cd7a722cff54a7a7b4c4e not found: ID does not exist" containerID="94823ab5e3f75d119d89e78533d723b0a182b5fc409cd7a722cff54a7a7b4c4e" Feb 19 23:20:53 crc kubenswrapper[4795]: I0219 23:20:53.452018 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94823ab5e3f75d119d89e78533d723b0a182b5fc409cd7a722cff54a7a7b4c4e"} err="failed to get container status \"94823ab5e3f75d119d89e78533d723b0a182b5fc409cd7a722cff54a7a7b4c4e\": rpc error: code = NotFound desc = could not find 
container \"94823ab5e3f75d119d89e78533d723b0a182b5fc409cd7a722cff54a7a7b4c4e\": container with ID starting with 94823ab5e3f75d119d89e78533d723b0a182b5fc409cd7a722cff54a7a7b4c4e not found: ID does not exist" Feb 19 23:20:53 crc kubenswrapper[4795]: I0219 23:20:53.452037 4795 scope.go:117] "RemoveContainer" containerID="9502aeb2932f638771dcbfbf5feb77ab738d50770951d8f7ad44433899b319c6" Feb 19 23:20:53 crc kubenswrapper[4795]: E0219 23:20:53.452557 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9502aeb2932f638771dcbfbf5feb77ab738d50770951d8f7ad44433899b319c6\": container with ID starting with 9502aeb2932f638771dcbfbf5feb77ab738d50770951d8f7ad44433899b319c6 not found: ID does not exist" containerID="9502aeb2932f638771dcbfbf5feb77ab738d50770951d8f7ad44433899b319c6" Feb 19 23:20:53 crc kubenswrapper[4795]: I0219 23:20:53.452597 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9502aeb2932f638771dcbfbf5feb77ab738d50770951d8f7ad44433899b319c6"} err="failed to get container status \"9502aeb2932f638771dcbfbf5feb77ab738d50770951d8f7ad44433899b319c6\": rpc error: code = NotFound desc = could not find container \"9502aeb2932f638771dcbfbf5feb77ab738d50770951d8f7ad44433899b319c6\": container with ID starting with 9502aeb2932f638771dcbfbf5feb77ab738d50770951d8f7ad44433899b319c6 not found: ID does not exist" Feb 19 23:20:53 crc kubenswrapper[4795]: I0219 23:20:53.452620 4795 scope.go:117] "RemoveContainer" containerID="90b7b74d7421043ee57f26dd4154d02ea41a8e86526c38c70cf79a9ef21e8930" Feb 19 23:20:53 crc kubenswrapper[4795]: E0219 23:20:53.452881 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90b7b74d7421043ee57f26dd4154d02ea41a8e86526c38c70cf79a9ef21e8930\": container with ID starting with 90b7b74d7421043ee57f26dd4154d02ea41a8e86526c38c70cf79a9ef21e8930 not found: ID does 
not exist" containerID="90b7b74d7421043ee57f26dd4154d02ea41a8e86526c38c70cf79a9ef21e8930" Feb 19 23:20:53 crc kubenswrapper[4795]: I0219 23:20:53.452909 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90b7b74d7421043ee57f26dd4154d02ea41a8e86526c38c70cf79a9ef21e8930"} err="failed to get container status \"90b7b74d7421043ee57f26dd4154d02ea41a8e86526c38c70cf79a9ef21e8930\": rpc error: code = NotFound desc = could not find container \"90b7b74d7421043ee57f26dd4154d02ea41a8e86526c38c70cf79a9ef21e8930\": container with ID starting with 90b7b74d7421043ee57f26dd4154d02ea41a8e86526c38c70cf79a9ef21e8930 not found: ID does not exist" Feb 19 23:20:53 crc kubenswrapper[4795]: I0219 23:20:53.553267 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27d6fc33-0869-4db3-8e1d-ce352d33d9cb" path="/var/lib/kubelet/pods/27d6fc33-0869-4db3-8e1d-ce352d33d9cb/volumes" Feb 19 23:20:58 crc kubenswrapper[4795]: I0219 23:20:58.427526 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 23:20:58 crc kubenswrapper[4795]: I0219 23:20:58.428056 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 23:21:28 crc kubenswrapper[4795]: I0219 23:21:28.427738 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Feb 19 23:21:28 crc kubenswrapper[4795]: I0219 23:21:28.428519 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 23:21:58 crc kubenswrapper[4795]: I0219 23:21:58.427466 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 23:21:58 crc kubenswrapper[4795]: I0219 23:21:58.428085 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 23:21:58 crc kubenswrapper[4795]: I0219 23:21:58.428134 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" Feb 19 23:21:58 crc kubenswrapper[4795]: I0219 23:21:58.429083 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ad9ce3f3f0f6a8b1730c68641f9c3a9fd43cad22ef4d316d4a9d380b0b12e9e5"} pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 23:21:58 crc kubenswrapper[4795]: I0219 23:21:58.429150 4795 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" containerID="cri-o://ad9ce3f3f0f6a8b1730c68641f9c3a9fd43cad22ef4d316d4a9d380b0b12e9e5" gracePeriod=600 Feb 19 23:21:58 crc kubenswrapper[4795]: I0219 23:21:58.979141 4795 generic.go:334] "Generic (PLEG): container finished" podID="7591bc58-96f5-486a-8653-0ad93938b019" containerID="ad9ce3f3f0f6a8b1730c68641f9c3a9fd43cad22ef4d316d4a9d380b0b12e9e5" exitCode=0 Feb 19 23:21:58 crc kubenswrapper[4795]: I0219 23:21:58.979305 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerDied","Data":"ad9ce3f3f0f6a8b1730c68641f9c3a9fd43cad22ef4d316d4a9d380b0b12e9e5"} Feb 19 23:21:58 crc kubenswrapper[4795]: I0219 23:21:58.979550 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerStarted","Data":"3901cb7ebf5348e8e5959a203f3e6450ab7f3f57a2848ec56227b7861716ef02"} Feb 19 23:21:58 crc kubenswrapper[4795]: I0219 23:21:58.979578 4795 scope.go:117] "RemoveContainer" containerID="ee7132fbe9aafdbd5a750ec85268f079d3e663b545f9aa0997d8e4efe3faebb4" Feb 19 23:22:04 crc kubenswrapper[4795]: I0219 23:22:04.048623 4795 generic.go:334] "Generic (PLEG): container finished" podID="c3cbdd11-d93f-4025-9c08-7530a68f6113" containerID="587260c6871d5ac2af0ecde5796363d9253fc80e5d5b060bd7a191b9752a30d4" exitCode=0 Feb 19 23:22:04 crc kubenswrapper[4795]: I0219 23:22:04.048713 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg" event={"ID":"c3cbdd11-d93f-4025-9c08-7530a68f6113","Type":"ContainerDied","Data":"587260c6871d5ac2af0ecde5796363d9253fc80e5d5b060bd7a191b9752a30d4"} Feb 19 23:22:05 crc kubenswrapper[4795]: 
I0219 23:22:05.575353 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg" Feb 19 23:22:05 crc kubenswrapper[4795]: I0219 23:22:05.708878 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3cbdd11-d93f-4025-9c08-7530a68f6113-tripleo-cleanup-combined-ca-bundle\") pod \"c3cbdd11-d93f-4025-9c08-7530a68f6113\" (UID: \"c3cbdd11-d93f-4025-9c08-7530a68f6113\") " Feb 19 23:22:05 crc kubenswrapper[4795]: I0219 23:22:05.708933 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hh9wf\" (UniqueName: \"kubernetes.io/projected/c3cbdd11-d93f-4025-9c08-7530a68f6113-kube-api-access-hh9wf\") pod \"c3cbdd11-d93f-4025-9c08-7530a68f6113\" (UID: \"c3cbdd11-d93f-4025-9c08-7530a68f6113\") " Feb 19 23:22:05 crc kubenswrapper[4795]: I0219 23:22:05.708963 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c3cbdd11-d93f-4025-9c08-7530a68f6113-ceph\") pod \"c3cbdd11-d93f-4025-9c08-7530a68f6113\" (UID: \"c3cbdd11-d93f-4025-9c08-7530a68f6113\") " Feb 19 23:22:05 crc kubenswrapper[4795]: I0219 23:22:05.709239 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c3cbdd11-d93f-4025-9c08-7530a68f6113-inventory\") pod \"c3cbdd11-d93f-4025-9c08-7530a68f6113\" (UID: \"c3cbdd11-d93f-4025-9c08-7530a68f6113\") " Feb 19 23:22:05 crc kubenswrapper[4795]: I0219 23:22:05.709296 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/c3cbdd11-d93f-4025-9c08-7530a68f6113-ssh-key-openstack-cell1\") pod \"c3cbdd11-d93f-4025-9c08-7530a68f6113\" (UID: \"c3cbdd11-d93f-4025-9c08-7530a68f6113\") " Feb 19 23:22:05 crc 
kubenswrapper[4795]: I0219 23:22:05.714644 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3cbdd11-d93f-4025-9c08-7530a68f6113-tripleo-cleanup-combined-ca-bundle" (OuterVolumeSpecName: "tripleo-cleanup-combined-ca-bundle") pod "c3cbdd11-d93f-4025-9c08-7530a68f6113" (UID: "c3cbdd11-d93f-4025-9c08-7530a68f6113"). InnerVolumeSpecName "tripleo-cleanup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:22:05 crc kubenswrapper[4795]: I0219 23:22:05.714758 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3cbdd11-d93f-4025-9c08-7530a68f6113-ceph" (OuterVolumeSpecName: "ceph") pod "c3cbdd11-d93f-4025-9c08-7530a68f6113" (UID: "c3cbdd11-d93f-4025-9c08-7530a68f6113"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:22:05 crc kubenswrapper[4795]: I0219 23:22:05.715218 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3cbdd11-d93f-4025-9c08-7530a68f6113-kube-api-access-hh9wf" (OuterVolumeSpecName: "kube-api-access-hh9wf") pod "c3cbdd11-d93f-4025-9c08-7530a68f6113" (UID: "c3cbdd11-d93f-4025-9c08-7530a68f6113"). InnerVolumeSpecName "kube-api-access-hh9wf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:22:05 crc kubenswrapper[4795]: I0219 23:22:05.736553 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3cbdd11-d93f-4025-9c08-7530a68f6113-inventory" (OuterVolumeSpecName: "inventory") pod "c3cbdd11-d93f-4025-9c08-7530a68f6113" (UID: "c3cbdd11-d93f-4025-9c08-7530a68f6113"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:22:05 crc kubenswrapper[4795]: I0219 23:22:05.738290 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3cbdd11-d93f-4025-9c08-7530a68f6113-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "c3cbdd11-d93f-4025-9c08-7530a68f6113" (UID: "c3cbdd11-d93f-4025-9c08-7530a68f6113"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:22:05 crc kubenswrapper[4795]: I0219 23:22:05.812148 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hh9wf\" (UniqueName: \"kubernetes.io/projected/c3cbdd11-d93f-4025-9c08-7530a68f6113-kube-api-access-hh9wf\") on node \"crc\" DevicePath \"\"" Feb 19 23:22:05 crc kubenswrapper[4795]: I0219 23:22:05.812203 4795 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c3cbdd11-d93f-4025-9c08-7530a68f6113-ceph\") on node \"crc\" DevicePath \"\"" Feb 19 23:22:05 crc kubenswrapper[4795]: I0219 23:22:05.812214 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c3cbdd11-d93f-4025-9c08-7530a68f6113-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 23:22:05 crc kubenswrapper[4795]: I0219 23:22:05.812223 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/c3cbdd11-d93f-4025-9c08-7530a68f6113-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 19 23:22:05 crc kubenswrapper[4795]: I0219 23:22:05.812234 4795 reconciler_common.go:293] "Volume detached for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3cbdd11-d93f-4025-9c08-7530a68f6113-tripleo-cleanup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:22:06 crc kubenswrapper[4795]: I0219 23:22:06.067013 4795 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg" event={"ID":"c3cbdd11-d93f-4025-9c08-7530a68f6113","Type":"ContainerDied","Data":"0c55dc0c067482da01178354ee4aefa7baf05a359c6d442c03755a2406d2ea95"} Feb 19 23:22:06 crc kubenswrapper[4795]: I0219 23:22:06.067056 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c55dc0c067482da01178354ee4aefa7baf05a359c6d442c03755a2406d2ea95" Feb 19 23:22:06 crc kubenswrapper[4795]: I0219 23:22:06.067088 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg" Feb 19 23:22:06 crc kubenswrapper[4795]: I0219 23:22:06.922841 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8dlgt"] Feb 19 23:22:06 crc kubenswrapper[4795]: E0219 23:22:06.923627 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27d6fc33-0869-4db3-8e1d-ce352d33d9cb" containerName="extract-utilities" Feb 19 23:22:06 crc kubenswrapper[4795]: I0219 23:22:06.923641 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="27d6fc33-0869-4db3-8e1d-ce352d33d9cb" containerName="extract-utilities" Feb 19 23:22:06 crc kubenswrapper[4795]: E0219 23:22:06.923673 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3cbdd11-d93f-4025-9c08-7530a68f6113" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Feb 19 23:22:06 crc kubenswrapper[4795]: I0219 23:22:06.923681 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3cbdd11-d93f-4025-9c08-7530a68f6113" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Feb 19 23:22:06 crc kubenswrapper[4795]: E0219 23:22:06.923698 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27d6fc33-0869-4db3-8e1d-ce352d33d9cb" containerName="registry-server" Feb 19 23:22:06 crc kubenswrapper[4795]: I0219 23:22:06.923704 4795 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="27d6fc33-0869-4db3-8e1d-ce352d33d9cb" containerName="registry-server" Feb 19 23:22:06 crc kubenswrapper[4795]: E0219 23:22:06.923731 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27d6fc33-0869-4db3-8e1d-ce352d33d9cb" containerName="extract-content" Feb 19 23:22:06 crc kubenswrapper[4795]: I0219 23:22:06.923736 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="27d6fc33-0869-4db3-8e1d-ce352d33d9cb" containerName="extract-content" Feb 19 23:22:06 crc kubenswrapper[4795]: I0219 23:22:06.923933 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3cbdd11-d93f-4025-9c08-7530a68f6113" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Feb 19 23:22:06 crc kubenswrapper[4795]: I0219 23:22:06.923953 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="27d6fc33-0869-4db3-8e1d-ce352d33d9cb" containerName="registry-server" Feb 19 23:22:06 crc kubenswrapper[4795]: I0219 23:22:06.925492 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8dlgt" Feb 19 23:22:06 crc kubenswrapper[4795]: I0219 23:22:06.938021 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8dlgt"] Feb 19 23:22:07 crc kubenswrapper[4795]: I0219 23:22:07.033823 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8467b6c-6941-4df2-b652-61b4c8eee22e-catalog-content\") pod \"certified-operators-8dlgt\" (UID: \"c8467b6c-6941-4df2-b652-61b4c8eee22e\") " pod="openshift-marketplace/certified-operators-8dlgt" Feb 19 23:22:07 crc kubenswrapper[4795]: I0219 23:22:07.033899 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-457r5\" (UniqueName: \"kubernetes.io/projected/c8467b6c-6941-4df2-b652-61b4c8eee22e-kube-api-access-457r5\") pod \"certified-operators-8dlgt\" (UID: \"c8467b6c-6941-4df2-b652-61b4c8eee22e\") " pod="openshift-marketplace/certified-operators-8dlgt" Feb 19 23:22:07 crc kubenswrapper[4795]: I0219 23:22:07.033926 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8467b6c-6941-4df2-b652-61b4c8eee22e-utilities\") pod \"certified-operators-8dlgt\" (UID: \"c8467b6c-6941-4df2-b652-61b4c8eee22e\") " pod="openshift-marketplace/certified-operators-8dlgt" Feb 19 23:22:07 crc kubenswrapper[4795]: I0219 23:22:07.136236 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8467b6c-6941-4df2-b652-61b4c8eee22e-catalog-content\") pod \"certified-operators-8dlgt\" (UID: \"c8467b6c-6941-4df2-b652-61b4c8eee22e\") " pod="openshift-marketplace/certified-operators-8dlgt" Feb 19 23:22:07 crc kubenswrapper[4795]: I0219 23:22:07.136300 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-457r5\" (UniqueName: \"kubernetes.io/projected/c8467b6c-6941-4df2-b652-61b4c8eee22e-kube-api-access-457r5\") pod \"certified-operators-8dlgt\" (UID: \"c8467b6c-6941-4df2-b652-61b4c8eee22e\") " pod="openshift-marketplace/certified-operators-8dlgt" Feb 19 23:22:07 crc kubenswrapper[4795]: I0219 23:22:07.136325 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8467b6c-6941-4df2-b652-61b4c8eee22e-utilities\") pod \"certified-operators-8dlgt\" (UID: \"c8467b6c-6941-4df2-b652-61b4c8eee22e\") " pod="openshift-marketplace/certified-operators-8dlgt" Feb 19 23:22:07 crc kubenswrapper[4795]: I0219 23:22:07.137291 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8467b6c-6941-4df2-b652-61b4c8eee22e-utilities\") pod \"certified-operators-8dlgt\" (UID: \"c8467b6c-6941-4df2-b652-61b4c8eee22e\") " pod="openshift-marketplace/certified-operators-8dlgt" Feb 19 23:22:07 crc kubenswrapper[4795]: I0219 23:22:07.137297 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8467b6c-6941-4df2-b652-61b4c8eee22e-catalog-content\") pod \"certified-operators-8dlgt\" (UID: \"c8467b6c-6941-4df2-b652-61b4c8eee22e\") " pod="openshift-marketplace/certified-operators-8dlgt" Feb 19 23:22:07 crc kubenswrapper[4795]: I0219 23:22:07.156885 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-457r5\" (UniqueName: \"kubernetes.io/projected/c8467b6c-6941-4df2-b652-61b4c8eee22e-kube-api-access-457r5\") pod \"certified-operators-8dlgt\" (UID: \"c8467b6c-6941-4df2-b652-61b4c8eee22e\") " pod="openshift-marketplace/certified-operators-8dlgt" Feb 19 23:22:07 crc kubenswrapper[4795]: I0219 23:22:07.263220 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8dlgt" Feb 19 23:22:07 crc kubenswrapper[4795]: I0219 23:22:07.841515 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8dlgt"] Feb 19 23:22:08 crc kubenswrapper[4795]: I0219 23:22:08.086485 4795 generic.go:334] "Generic (PLEG): container finished" podID="c8467b6c-6941-4df2-b652-61b4c8eee22e" containerID="7d4060c75eec61b2f3b0d6ea6e6f1779ef5db1e64d9724c3a5006f0393287cef" exitCode=0 Feb 19 23:22:08 crc kubenswrapper[4795]: I0219 23:22:08.086535 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8dlgt" event={"ID":"c8467b6c-6941-4df2-b652-61b4c8eee22e","Type":"ContainerDied","Data":"7d4060c75eec61b2f3b0d6ea6e6f1779ef5db1e64d9724c3a5006f0393287cef"} Feb 19 23:22:08 crc kubenswrapper[4795]: I0219 23:22:08.086782 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8dlgt" event={"ID":"c8467b6c-6941-4df2-b652-61b4c8eee22e","Type":"ContainerStarted","Data":"7f04f25cf122644f89f7e4477044089ac044f21f8955c90051f04532f6b7ae72"} Feb 19 23:22:09 crc kubenswrapper[4795]: I0219 23:22:09.320236 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xhm6c"] Feb 19 23:22:09 crc kubenswrapper[4795]: I0219 23:22:09.324291 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xhm6c" Feb 19 23:22:09 crc kubenswrapper[4795]: I0219 23:22:09.370978 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xhm6c"] Feb 19 23:22:09 crc kubenswrapper[4795]: I0219 23:22:09.493693 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42bqd\" (UniqueName: \"kubernetes.io/projected/33946ece-f847-4ce2-ab50-c4f2f61f9b4e-kube-api-access-42bqd\") pod \"redhat-operators-xhm6c\" (UID: \"33946ece-f847-4ce2-ab50-c4f2f61f9b4e\") " pod="openshift-marketplace/redhat-operators-xhm6c" Feb 19 23:22:09 crc kubenswrapper[4795]: I0219 23:22:09.493810 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33946ece-f847-4ce2-ab50-c4f2f61f9b4e-catalog-content\") pod \"redhat-operators-xhm6c\" (UID: \"33946ece-f847-4ce2-ab50-c4f2f61f9b4e\") " pod="openshift-marketplace/redhat-operators-xhm6c" Feb 19 23:22:09 crc kubenswrapper[4795]: I0219 23:22:09.493959 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33946ece-f847-4ce2-ab50-c4f2f61f9b4e-utilities\") pod \"redhat-operators-xhm6c\" (UID: \"33946ece-f847-4ce2-ab50-c4f2f61f9b4e\") " pod="openshift-marketplace/redhat-operators-xhm6c" Feb 19 23:22:09 crc kubenswrapper[4795]: I0219 23:22:09.595602 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33946ece-f847-4ce2-ab50-c4f2f61f9b4e-utilities\") pod \"redhat-operators-xhm6c\" (UID: \"33946ece-f847-4ce2-ab50-c4f2f61f9b4e\") " pod="openshift-marketplace/redhat-operators-xhm6c" Feb 19 23:22:09 crc kubenswrapper[4795]: I0219 23:22:09.595678 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-42bqd\" (UniqueName: \"kubernetes.io/projected/33946ece-f847-4ce2-ab50-c4f2f61f9b4e-kube-api-access-42bqd\") pod \"redhat-operators-xhm6c\" (UID: \"33946ece-f847-4ce2-ab50-c4f2f61f9b4e\") " pod="openshift-marketplace/redhat-operators-xhm6c" Feb 19 23:22:09 crc kubenswrapper[4795]: I0219 23:22:09.595740 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33946ece-f847-4ce2-ab50-c4f2f61f9b4e-catalog-content\") pod \"redhat-operators-xhm6c\" (UID: \"33946ece-f847-4ce2-ab50-c4f2f61f9b4e\") " pod="openshift-marketplace/redhat-operators-xhm6c" Feb 19 23:22:09 crc kubenswrapper[4795]: I0219 23:22:09.596319 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33946ece-f847-4ce2-ab50-c4f2f61f9b4e-catalog-content\") pod \"redhat-operators-xhm6c\" (UID: \"33946ece-f847-4ce2-ab50-c4f2f61f9b4e\") " pod="openshift-marketplace/redhat-operators-xhm6c" Feb 19 23:22:09 crc kubenswrapper[4795]: I0219 23:22:09.596541 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33946ece-f847-4ce2-ab50-c4f2f61f9b4e-utilities\") pod \"redhat-operators-xhm6c\" (UID: \"33946ece-f847-4ce2-ab50-c4f2f61f9b4e\") " pod="openshift-marketplace/redhat-operators-xhm6c" Feb 19 23:22:09 crc kubenswrapper[4795]: I0219 23:22:09.630312 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42bqd\" (UniqueName: \"kubernetes.io/projected/33946ece-f847-4ce2-ab50-c4f2f61f9b4e-kube-api-access-42bqd\") pod \"redhat-operators-xhm6c\" (UID: \"33946ece-f847-4ce2-ab50-c4f2f61f9b4e\") " pod="openshift-marketplace/redhat-operators-xhm6c" Feb 19 23:22:09 crc kubenswrapper[4795]: I0219 23:22:09.650642 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xhm6c" Feb 19 23:22:09 crc kubenswrapper[4795]: I0219 23:22:09.995811 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xhm6c"] Feb 19 23:22:10 crc kubenswrapper[4795]: W0219 23:22:10.011749 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33946ece_f847_4ce2_ab50_c4f2f61f9b4e.slice/crio-3f311eb836ae45dd9c28aa7638effd6c5fc9d4545f954e724bb6af8ab3992a1b WatchSource:0}: Error finding container 3f311eb836ae45dd9c28aa7638effd6c5fc9d4545f954e724bb6af8ab3992a1b: Status 404 returned error can't find the container with id 3f311eb836ae45dd9c28aa7638effd6c5fc9d4545f954e724bb6af8ab3992a1b Feb 19 23:22:10 crc kubenswrapper[4795]: I0219 23:22:10.106916 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xhm6c" event={"ID":"33946ece-f847-4ce2-ab50-c4f2f61f9b4e","Type":"ContainerStarted","Data":"3f311eb836ae45dd9c28aa7638effd6c5fc9d4545f954e724bb6af8ab3992a1b"} Feb 19 23:22:10 crc kubenswrapper[4795]: I0219 23:22:10.110797 4795 generic.go:334] "Generic (PLEG): container finished" podID="c8467b6c-6941-4df2-b652-61b4c8eee22e" containerID="3f35772a044ff24e0c28484b8544121dea1a949261e4bc5de787469ca663a5f1" exitCode=0 Feb 19 23:22:10 crc kubenswrapper[4795]: I0219 23:22:10.110837 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8dlgt" event={"ID":"c8467b6c-6941-4df2-b652-61b4c8eee22e","Type":"ContainerDied","Data":"3f35772a044ff24e0c28484b8544121dea1a949261e4bc5de787469ca663a5f1"} Feb 19 23:22:10 crc kubenswrapper[4795]: I0219 23:22:10.175557 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-rmhfb"] Feb 19 23:22:10 crc kubenswrapper[4795]: I0219 23:22:10.177596 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-rmhfb" Feb 19 23:22:10 crc kubenswrapper[4795]: I0219 23:22:10.180096 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 23:22:10 crc kubenswrapper[4795]: I0219 23:22:10.180396 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 19 23:22:10 crc kubenswrapper[4795]: I0219 23:22:10.182366 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-brdnv" Feb 19 23:22:10 crc kubenswrapper[4795]: I0219 23:22:10.187260 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-rmhfb"] Feb 19 23:22:10 crc kubenswrapper[4795]: I0219 23:22:10.190147 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 19 23:22:10 crc kubenswrapper[4795]: I0219 23:22:10.316827 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa-ceph\") pod \"bootstrap-openstack-openstack-cell1-rmhfb\" (UID: \"a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa\") " pod="openstack/bootstrap-openstack-openstack-cell1-rmhfb" Feb 19 23:22:10 crc kubenswrapper[4795]: I0219 23:22:10.317342 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa-inventory\") pod \"bootstrap-openstack-openstack-cell1-rmhfb\" (UID: \"a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa\") " pod="openstack/bootstrap-openstack-openstack-cell1-rmhfb" Feb 19 23:22:10 crc kubenswrapper[4795]: I0219 23:22:10.317418 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-rmhfb\" (UID: \"a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa\") " pod="openstack/bootstrap-openstack-openstack-cell1-rmhfb" Feb 19 23:22:10 crc kubenswrapper[4795]: I0219 23:22:10.317528 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa-ssh-key-openstack-cell1\") pod \"bootstrap-openstack-openstack-cell1-rmhfb\" (UID: \"a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa\") " pod="openstack/bootstrap-openstack-openstack-cell1-rmhfb" Feb 19 23:22:10 crc kubenswrapper[4795]: I0219 23:22:10.317774 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwdtc\" (UniqueName: \"kubernetes.io/projected/a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa-kube-api-access-kwdtc\") pod \"bootstrap-openstack-openstack-cell1-rmhfb\" (UID: \"a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa\") " pod="openstack/bootstrap-openstack-openstack-cell1-rmhfb" Feb 19 23:22:10 crc kubenswrapper[4795]: I0219 23:22:10.420395 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa-inventory\") pod \"bootstrap-openstack-openstack-cell1-rmhfb\" (UID: \"a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa\") " pod="openstack/bootstrap-openstack-openstack-cell1-rmhfb" Feb 19 23:22:10 crc kubenswrapper[4795]: I0219 23:22:10.420464 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-rmhfb\" (UID: \"a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa\") " pod="openstack/bootstrap-openstack-openstack-cell1-rmhfb" Feb 19 
23:22:10 crc kubenswrapper[4795]: I0219 23:22:10.420503 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa-ssh-key-openstack-cell1\") pod \"bootstrap-openstack-openstack-cell1-rmhfb\" (UID: \"a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa\") " pod="openstack/bootstrap-openstack-openstack-cell1-rmhfb" Feb 19 23:22:10 crc kubenswrapper[4795]: I0219 23:22:10.420580 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwdtc\" (UniqueName: \"kubernetes.io/projected/a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa-kube-api-access-kwdtc\") pod \"bootstrap-openstack-openstack-cell1-rmhfb\" (UID: \"a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa\") " pod="openstack/bootstrap-openstack-openstack-cell1-rmhfb" Feb 19 23:22:10 crc kubenswrapper[4795]: I0219 23:22:10.420690 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa-ceph\") pod \"bootstrap-openstack-openstack-cell1-rmhfb\" (UID: \"a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa\") " pod="openstack/bootstrap-openstack-openstack-cell1-rmhfb" Feb 19 23:22:10 crc kubenswrapper[4795]: I0219 23:22:10.427601 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa-inventory\") pod \"bootstrap-openstack-openstack-cell1-rmhfb\" (UID: \"a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa\") " pod="openstack/bootstrap-openstack-openstack-cell1-rmhfb" Feb 19 23:22:10 crc kubenswrapper[4795]: I0219 23:22:10.427936 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-rmhfb\" (UID: 
\"a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa\") " pod="openstack/bootstrap-openstack-openstack-cell1-rmhfb" Feb 19 23:22:10 crc kubenswrapper[4795]: I0219 23:22:10.428001 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa-ssh-key-openstack-cell1\") pod \"bootstrap-openstack-openstack-cell1-rmhfb\" (UID: \"a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa\") " pod="openstack/bootstrap-openstack-openstack-cell1-rmhfb" Feb 19 23:22:10 crc kubenswrapper[4795]: I0219 23:22:10.428042 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa-ceph\") pod \"bootstrap-openstack-openstack-cell1-rmhfb\" (UID: \"a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa\") " pod="openstack/bootstrap-openstack-openstack-cell1-rmhfb" Feb 19 23:22:10 crc kubenswrapper[4795]: I0219 23:22:10.443000 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwdtc\" (UniqueName: \"kubernetes.io/projected/a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa-kube-api-access-kwdtc\") pod \"bootstrap-openstack-openstack-cell1-rmhfb\" (UID: \"a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa\") " pod="openstack/bootstrap-openstack-openstack-cell1-rmhfb" Feb 19 23:22:10 crc kubenswrapper[4795]: I0219 23:22:10.503715 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-rmhfb" Feb 19 23:22:11 crc kubenswrapper[4795]: I0219 23:22:11.135715 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8dlgt" event={"ID":"c8467b6c-6941-4df2-b652-61b4c8eee22e","Type":"ContainerStarted","Data":"cd8f791e15dd3e7c0787cc203addbd2ace7aa3c8d53487ae15d5e2afca341071"} Feb 19 23:22:11 crc kubenswrapper[4795]: I0219 23:22:11.141748 4795 generic.go:334] "Generic (PLEG): container finished" podID="33946ece-f847-4ce2-ab50-c4f2f61f9b4e" containerID="487ce0a487777574543b5bb91827dc66d9dcd5cfb6feea2ce24ffe5619111c28" exitCode=0 Feb 19 23:22:11 crc kubenswrapper[4795]: I0219 23:22:11.141811 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xhm6c" event={"ID":"33946ece-f847-4ce2-ab50-c4f2f61f9b4e","Type":"ContainerDied","Data":"487ce0a487777574543b5bb91827dc66d9dcd5cfb6feea2ce24ffe5619111c28"} Feb 19 23:22:11 crc kubenswrapper[4795]: I0219 23:22:11.164463 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-rmhfb"] Feb 19 23:22:11 crc kubenswrapper[4795]: I0219 23:22:11.168219 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8dlgt" podStartSLOduration=2.716172142 podStartE2EDuration="5.16820106s" podCreationTimestamp="2026-02-19 23:22:06 +0000 UTC" firstStartedPulling="2026-02-19 23:22:08.088500301 +0000 UTC m=+6839.281018165" lastFinishedPulling="2026-02-19 23:22:10.540529219 +0000 UTC m=+6841.733047083" observedRunningTime="2026-02-19 23:22:11.15645554 +0000 UTC m=+6842.348973394" watchObservedRunningTime="2026-02-19 23:22:11.16820106 +0000 UTC m=+6842.360718924" Feb 19 23:22:12 crc kubenswrapper[4795]: I0219 23:22:12.152051 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-rmhfb" 
event={"ID":"a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa","Type":"ContainerStarted","Data":"94c7431ebacbd1806e6fe318625607d7ac4c8655988210ced64888f48d2153e5"} Feb 19 23:22:12 crc kubenswrapper[4795]: I0219 23:22:12.152409 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-rmhfb" event={"ID":"a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa","Type":"ContainerStarted","Data":"8334d40c98b3c6b685c906be16789e5a35ea56989f4f2c9374c39b35dba237a9"} Feb 19 23:22:12 crc kubenswrapper[4795]: I0219 23:22:12.156330 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xhm6c" event={"ID":"33946ece-f847-4ce2-ab50-c4f2f61f9b4e","Type":"ContainerStarted","Data":"b72ba919511616616308f9bb0b6c710d4ba3184033ebc6fa828ee43fa86036d4"} Feb 19 23:22:12 crc kubenswrapper[4795]: I0219 23:22:12.174125 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-openstack-openstack-cell1-rmhfb" podStartSLOduration=1.647355608 podStartE2EDuration="2.174106467s" podCreationTimestamp="2026-02-19 23:22:10 +0000 UTC" firstStartedPulling="2026-02-19 23:22:11.17630256 +0000 UTC m=+6842.368820414" lastFinishedPulling="2026-02-19 23:22:11.703053409 +0000 UTC m=+6842.895571273" observedRunningTime="2026-02-19 23:22:12.166194252 +0000 UTC m=+6843.358712116" watchObservedRunningTime="2026-02-19 23:22:12.174106467 +0000 UTC m=+6843.366624331" Feb 19 23:22:17 crc kubenswrapper[4795]: I0219 23:22:17.205597 4795 generic.go:334] "Generic (PLEG): container finished" podID="33946ece-f847-4ce2-ab50-c4f2f61f9b4e" containerID="b72ba919511616616308f9bb0b6c710d4ba3184033ebc6fa828ee43fa86036d4" exitCode=0 Feb 19 23:22:17 crc kubenswrapper[4795]: I0219 23:22:17.207325 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xhm6c" 
event={"ID":"33946ece-f847-4ce2-ab50-c4f2f61f9b4e","Type":"ContainerDied","Data":"b72ba919511616616308f9bb0b6c710d4ba3184033ebc6fa828ee43fa86036d4"} Feb 19 23:22:17 crc kubenswrapper[4795]: I0219 23:22:17.263371 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8dlgt" Feb 19 23:22:17 crc kubenswrapper[4795]: I0219 23:22:17.263425 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8dlgt" Feb 19 23:22:17 crc kubenswrapper[4795]: I0219 23:22:17.313125 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8dlgt" Feb 19 23:22:18 crc kubenswrapper[4795]: I0219 23:22:18.218385 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xhm6c" event={"ID":"33946ece-f847-4ce2-ab50-c4f2f61f9b4e","Type":"ContainerStarted","Data":"a9b8825ded8e0e381b677dccc80a6f9b71c79547183cb194af2bb32d4f2ef2fc"} Feb 19 23:22:18 crc kubenswrapper[4795]: I0219 23:22:18.250151 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xhm6c" podStartSLOduration=2.67220479 podStartE2EDuration="9.250127187s" podCreationTimestamp="2026-02-19 23:22:09 +0000 UTC" firstStartedPulling="2026-02-19 23:22:11.143891787 +0000 UTC m=+6842.336409651" lastFinishedPulling="2026-02-19 23:22:17.721814184 +0000 UTC m=+6848.914332048" observedRunningTime="2026-02-19 23:22:18.239048083 +0000 UTC m=+6849.431565957" watchObservedRunningTime="2026-02-19 23:22:18.250127187 +0000 UTC m=+6849.442645051" Feb 19 23:22:18 crc kubenswrapper[4795]: I0219 23:22:18.275475 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8dlgt" Feb 19 23:22:19 crc kubenswrapper[4795]: I0219 23:22:19.650852 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-operators-xhm6c" Feb 19 23:22:19 crc kubenswrapper[4795]: I0219 23:22:19.651160 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xhm6c" Feb 19 23:22:20 crc kubenswrapper[4795]: I0219 23:22:20.312217 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8dlgt"] Feb 19 23:22:20 crc kubenswrapper[4795]: I0219 23:22:20.312796 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8dlgt" podUID="c8467b6c-6941-4df2-b652-61b4c8eee22e" containerName="registry-server" containerID="cri-o://cd8f791e15dd3e7c0787cc203addbd2ace7aa3c8d53487ae15d5e2afca341071" gracePeriod=2 Feb 19 23:22:20 crc kubenswrapper[4795]: I0219 23:22:20.704702 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xhm6c" podUID="33946ece-f847-4ce2-ab50-c4f2f61f9b4e" containerName="registry-server" probeResult="failure" output=< Feb 19 23:22:20 crc kubenswrapper[4795]: timeout: failed to connect service ":50051" within 1s Feb 19 23:22:20 crc kubenswrapper[4795]: > Feb 19 23:22:20 crc kubenswrapper[4795]: I0219 23:22:20.837155 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8dlgt" Feb 19 23:22:20 crc kubenswrapper[4795]: I0219 23:22:20.931235 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-457r5\" (UniqueName: \"kubernetes.io/projected/c8467b6c-6941-4df2-b652-61b4c8eee22e-kube-api-access-457r5\") pod \"c8467b6c-6941-4df2-b652-61b4c8eee22e\" (UID: \"c8467b6c-6941-4df2-b652-61b4c8eee22e\") " Feb 19 23:22:20 crc kubenswrapper[4795]: I0219 23:22:20.931384 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8467b6c-6941-4df2-b652-61b4c8eee22e-catalog-content\") pod \"c8467b6c-6941-4df2-b652-61b4c8eee22e\" (UID: \"c8467b6c-6941-4df2-b652-61b4c8eee22e\") " Feb 19 23:22:20 crc kubenswrapper[4795]: I0219 23:22:20.931526 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8467b6c-6941-4df2-b652-61b4c8eee22e-utilities\") pod \"c8467b6c-6941-4df2-b652-61b4c8eee22e\" (UID: \"c8467b6c-6941-4df2-b652-61b4c8eee22e\") " Feb 19 23:22:20 crc kubenswrapper[4795]: I0219 23:22:20.932758 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8467b6c-6941-4df2-b652-61b4c8eee22e-utilities" (OuterVolumeSpecName: "utilities") pod "c8467b6c-6941-4df2-b652-61b4c8eee22e" (UID: "c8467b6c-6941-4df2-b652-61b4c8eee22e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:22:20 crc kubenswrapper[4795]: I0219 23:22:20.937128 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8467b6c-6941-4df2-b652-61b4c8eee22e-kube-api-access-457r5" (OuterVolumeSpecName: "kube-api-access-457r5") pod "c8467b6c-6941-4df2-b652-61b4c8eee22e" (UID: "c8467b6c-6941-4df2-b652-61b4c8eee22e"). InnerVolumeSpecName "kube-api-access-457r5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:22:20 crc kubenswrapper[4795]: I0219 23:22:20.976915 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8467b6c-6941-4df2-b652-61b4c8eee22e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c8467b6c-6941-4df2-b652-61b4c8eee22e" (UID: "c8467b6c-6941-4df2-b652-61b4c8eee22e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:22:21 crc kubenswrapper[4795]: I0219 23:22:21.034105 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8467b6c-6941-4df2-b652-61b4c8eee22e-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 23:22:21 crc kubenswrapper[4795]: I0219 23:22:21.034139 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-457r5\" (UniqueName: \"kubernetes.io/projected/c8467b6c-6941-4df2-b652-61b4c8eee22e-kube-api-access-457r5\") on node \"crc\" DevicePath \"\"" Feb 19 23:22:21 crc kubenswrapper[4795]: I0219 23:22:21.034149 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8467b6c-6941-4df2-b652-61b4c8eee22e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 23:22:21 crc kubenswrapper[4795]: I0219 23:22:21.272236 4795 generic.go:334] "Generic (PLEG): container finished" podID="c8467b6c-6941-4df2-b652-61b4c8eee22e" containerID="cd8f791e15dd3e7c0787cc203addbd2ace7aa3c8d53487ae15d5e2afca341071" exitCode=0 Feb 19 23:22:21 crc kubenswrapper[4795]: I0219 23:22:21.272278 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8dlgt" event={"ID":"c8467b6c-6941-4df2-b652-61b4c8eee22e","Type":"ContainerDied","Data":"cd8f791e15dd3e7c0787cc203addbd2ace7aa3c8d53487ae15d5e2afca341071"} Feb 19 23:22:21 crc kubenswrapper[4795]: I0219 23:22:21.272304 4795 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-8dlgt" event={"ID":"c8467b6c-6941-4df2-b652-61b4c8eee22e","Type":"ContainerDied","Data":"7f04f25cf122644f89f7e4477044089ac044f21f8955c90051f04532f6b7ae72"} Feb 19 23:22:21 crc kubenswrapper[4795]: I0219 23:22:21.272306 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8dlgt" Feb 19 23:22:21 crc kubenswrapper[4795]: I0219 23:22:21.272322 4795 scope.go:117] "RemoveContainer" containerID="cd8f791e15dd3e7c0787cc203addbd2ace7aa3c8d53487ae15d5e2afca341071" Feb 19 23:22:21 crc kubenswrapper[4795]: I0219 23:22:21.290899 4795 scope.go:117] "RemoveContainer" containerID="3f35772a044ff24e0c28484b8544121dea1a949261e4bc5de787469ca663a5f1" Feb 19 23:22:21 crc kubenswrapper[4795]: I0219 23:22:21.310527 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8dlgt"] Feb 19 23:22:21 crc kubenswrapper[4795]: I0219 23:22:21.313104 4795 scope.go:117] "RemoveContainer" containerID="7d4060c75eec61b2f3b0d6ea6e6f1779ef5db1e64d9724c3a5006f0393287cef" Feb 19 23:22:21 crc kubenswrapper[4795]: I0219 23:22:21.324364 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8dlgt"] Feb 19 23:22:21 crc kubenswrapper[4795]: I0219 23:22:21.364575 4795 scope.go:117] "RemoveContainer" containerID="cd8f791e15dd3e7c0787cc203addbd2ace7aa3c8d53487ae15d5e2afca341071" Feb 19 23:22:21 crc kubenswrapper[4795]: E0219 23:22:21.365106 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd8f791e15dd3e7c0787cc203addbd2ace7aa3c8d53487ae15d5e2afca341071\": container with ID starting with cd8f791e15dd3e7c0787cc203addbd2ace7aa3c8d53487ae15d5e2afca341071 not found: ID does not exist" containerID="cd8f791e15dd3e7c0787cc203addbd2ace7aa3c8d53487ae15d5e2afca341071" Feb 19 23:22:21 crc kubenswrapper[4795]: I0219 
23:22:21.365159 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd8f791e15dd3e7c0787cc203addbd2ace7aa3c8d53487ae15d5e2afca341071"} err="failed to get container status \"cd8f791e15dd3e7c0787cc203addbd2ace7aa3c8d53487ae15d5e2afca341071\": rpc error: code = NotFound desc = could not find container \"cd8f791e15dd3e7c0787cc203addbd2ace7aa3c8d53487ae15d5e2afca341071\": container with ID starting with cd8f791e15dd3e7c0787cc203addbd2ace7aa3c8d53487ae15d5e2afca341071 not found: ID does not exist" Feb 19 23:22:21 crc kubenswrapper[4795]: I0219 23:22:21.365208 4795 scope.go:117] "RemoveContainer" containerID="3f35772a044ff24e0c28484b8544121dea1a949261e4bc5de787469ca663a5f1" Feb 19 23:22:21 crc kubenswrapper[4795]: E0219 23:22:21.365589 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f35772a044ff24e0c28484b8544121dea1a949261e4bc5de787469ca663a5f1\": container with ID starting with 3f35772a044ff24e0c28484b8544121dea1a949261e4bc5de787469ca663a5f1 not found: ID does not exist" containerID="3f35772a044ff24e0c28484b8544121dea1a949261e4bc5de787469ca663a5f1" Feb 19 23:22:21 crc kubenswrapper[4795]: I0219 23:22:21.365639 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f35772a044ff24e0c28484b8544121dea1a949261e4bc5de787469ca663a5f1"} err="failed to get container status \"3f35772a044ff24e0c28484b8544121dea1a949261e4bc5de787469ca663a5f1\": rpc error: code = NotFound desc = could not find container \"3f35772a044ff24e0c28484b8544121dea1a949261e4bc5de787469ca663a5f1\": container with ID starting with 3f35772a044ff24e0c28484b8544121dea1a949261e4bc5de787469ca663a5f1 not found: ID does not exist" Feb 19 23:22:21 crc kubenswrapper[4795]: I0219 23:22:21.365671 4795 scope.go:117] "RemoveContainer" containerID="7d4060c75eec61b2f3b0d6ea6e6f1779ef5db1e64d9724c3a5006f0393287cef" Feb 19 23:22:21 crc 
kubenswrapper[4795]: E0219 23:22:21.365955 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d4060c75eec61b2f3b0d6ea6e6f1779ef5db1e64d9724c3a5006f0393287cef\": container with ID starting with 7d4060c75eec61b2f3b0d6ea6e6f1779ef5db1e64d9724c3a5006f0393287cef not found: ID does not exist" containerID="7d4060c75eec61b2f3b0d6ea6e6f1779ef5db1e64d9724c3a5006f0393287cef" Feb 19 23:22:21 crc kubenswrapper[4795]: I0219 23:22:21.365993 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d4060c75eec61b2f3b0d6ea6e6f1779ef5db1e64d9724c3a5006f0393287cef"} err="failed to get container status \"7d4060c75eec61b2f3b0d6ea6e6f1779ef5db1e64d9724c3a5006f0393287cef\": rpc error: code = NotFound desc = could not find container \"7d4060c75eec61b2f3b0d6ea6e6f1779ef5db1e64d9724c3a5006f0393287cef\": container with ID starting with 7d4060c75eec61b2f3b0d6ea6e6f1779ef5db1e64d9724c3a5006f0393287cef not found: ID does not exist" Feb 19 23:22:21 crc kubenswrapper[4795]: I0219 23:22:21.527095 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8467b6c-6941-4df2-b652-61b4c8eee22e" path="/var/lib/kubelet/pods/c8467b6c-6941-4df2-b652-61b4c8eee22e/volumes" Feb 19 23:22:29 crc kubenswrapper[4795]: I0219 23:22:29.710804 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xhm6c" Feb 19 23:22:29 crc kubenswrapper[4795]: I0219 23:22:29.759409 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xhm6c" Feb 19 23:22:30 crc kubenswrapper[4795]: I0219 23:22:30.830814 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xhm6c"] Feb 19 23:22:31 crc kubenswrapper[4795]: I0219 23:22:31.391503 4795 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-operators-xhm6c" podUID="33946ece-f847-4ce2-ab50-c4f2f61f9b4e" containerName="registry-server" containerID="cri-o://a9b8825ded8e0e381b677dccc80a6f9b71c79547183cb194af2bb32d4f2ef2fc" gracePeriod=2 Feb 19 23:22:31 crc kubenswrapper[4795]: I0219 23:22:31.823822 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xhm6c" Feb 19 23:22:31 crc kubenswrapper[4795]: I0219 23:22:31.900349 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33946ece-f847-4ce2-ab50-c4f2f61f9b4e-catalog-content\") pod \"33946ece-f847-4ce2-ab50-c4f2f61f9b4e\" (UID: \"33946ece-f847-4ce2-ab50-c4f2f61f9b4e\") " Feb 19 23:22:31 crc kubenswrapper[4795]: I0219 23:22:31.900725 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42bqd\" (UniqueName: \"kubernetes.io/projected/33946ece-f847-4ce2-ab50-c4f2f61f9b4e-kube-api-access-42bqd\") pod \"33946ece-f847-4ce2-ab50-c4f2f61f9b4e\" (UID: \"33946ece-f847-4ce2-ab50-c4f2f61f9b4e\") " Feb 19 23:22:31 crc kubenswrapper[4795]: I0219 23:22:31.901041 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33946ece-f847-4ce2-ab50-c4f2f61f9b4e-utilities\") pod \"33946ece-f847-4ce2-ab50-c4f2f61f9b4e\" (UID: \"33946ece-f847-4ce2-ab50-c4f2f61f9b4e\") " Feb 19 23:22:31 crc kubenswrapper[4795]: I0219 23:22:31.902210 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33946ece-f847-4ce2-ab50-c4f2f61f9b4e-utilities" (OuterVolumeSpecName: "utilities") pod "33946ece-f847-4ce2-ab50-c4f2f61f9b4e" (UID: "33946ece-f847-4ce2-ab50-c4f2f61f9b4e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:22:31 crc kubenswrapper[4795]: I0219 23:22:31.906390 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33946ece-f847-4ce2-ab50-c4f2f61f9b4e-kube-api-access-42bqd" (OuterVolumeSpecName: "kube-api-access-42bqd") pod "33946ece-f847-4ce2-ab50-c4f2f61f9b4e" (UID: "33946ece-f847-4ce2-ab50-c4f2f61f9b4e"). InnerVolumeSpecName "kube-api-access-42bqd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:22:32 crc kubenswrapper[4795]: I0219 23:22:32.003929 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42bqd\" (UniqueName: \"kubernetes.io/projected/33946ece-f847-4ce2-ab50-c4f2f61f9b4e-kube-api-access-42bqd\") on node \"crc\" DevicePath \"\"" Feb 19 23:22:32 crc kubenswrapper[4795]: I0219 23:22:32.003999 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33946ece-f847-4ce2-ab50-c4f2f61f9b4e-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 23:22:32 crc kubenswrapper[4795]: I0219 23:22:32.048045 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33946ece-f847-4ce2-ab50-c4f2f61f9b4e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "33946ece-f847-4ce2-ab50-c4f2f61f9b4e" (UID: "33946ece-f847-4ce2-ab50-c4f2f61f9b4e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:22:32 crc kubenswrapper[4795]: I0219 23:22:32.107277 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33946ece-f847-4ce2-ab50-c4f2f61f9b4e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 23:22:32 crc kubenswrapper[4795]: I0219 23:22:32.406011 4795 generic.go:334] "Generic (PLEG): container finished" podID="33946ece-f847-4ce2-ab50-c4f2f61f9b4e" containerID="a9b8825ded8e0e381b677dccc80a6f9b71c79547183cb194af2bb32d4f2ef2fc" exitCode=0 Feb 19 23:22:32 crc kubenswrapper[4795]: I0219 23:22:32.406065 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xhm6c" event={"ID":"33946ece-f847-4ce2-ab50-c4f2f61f9b4e","Type":"ContainerDied","Data":"a9b8825ded8e0e381b677dccc80a6f9b71c79547183cb194af2bb32d4f2ef2fc"} Feb 19 23:22:32 crc kubenswrapper[4795]: I0219 23:22:32.406097 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xhm6c" event={"ID":"33946ece-f847-4ce2-ab50-c4f2f61f9b4e","Type":"ContainerDied","Data":"3f311eb836ae45dd9c28aa7638effd6c5fc9d4545f954e724bb6af8ab3992a1b"} Feb 19 23:22:32 crc kubenswrapper[4795]: I0219 23:22:32.406110 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xhm6c" Feb 19 23:22:32 crc kubenswrapper[4795]: I0219 23:22:32.406119 4795 scope.go:117] "RemoveContainer" containerID="a9b8825ded8e0e381b677dccc80a6f9b71c79547183cb194af2bb32d4f2ef2fc" Feb 19 23:22:32 crc kubenswrapper[4795]: I0219 23:22:32.439091 4795 scope.go:117] "RemoveContainer" containerID="b72ba919511616616308f9bb0b6c710d4ba3184033ebc6fa828ee43fa86036d4" Feb 19 23:22:32 crc kubenswrapper[4795]: I0219 23:22:32.470301 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xhm6c"] Feb 19 23:22:32 crc kubenswrapper[4795]: I0219 23:22:32.480589 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xhm6c"] Feb 19 23:22:32 crc kubenswrapper[4795]: I0219 23:22:32.489421 4795 scope.go:117] "RemoveContainer" containerID="487ce0a487777574543b5bb91827dc66d9dcd5cfb6feea2ce24ffe5619111c28" Feb 19 23:22:32 crc kubenswrapper[4795]: I0219 23:22:32.536051 4795 scope.go:117] "RemoveContainer" containerID="a9b8825ded8e0e381b677dccc80a6f9b71c79547183cb194af2bb32d4f2ef2fc" Feb 19 23:22:32 crc kubenswrapper[4795]: E0219 23:22:32.536516 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9b8825ded8e0e381b677dccc80a6f9b71c79547183cb194af2bb32d4f2ef2fc\": container with ID starting with a9b8825ded8e0e381b677dccc80a6f9b71c79547183cb194af2bb32d4f2ef2fc not found: ID does not exist" containerID="a9b8825ded8e0e381b677dccc80a6f9b71c79547183cb194af2bb32d4f2ef2fc" Feb 19 23:22:32 crc kubenswrapper[4795]: I0219 23:22:32.536625 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9b8825ded8e0e381b677dccc80a6f9b71c79547183cb194af2bb32d4f2ef2fc"} err="failed to get container status \"a9b8825ded8e0e381b677dccc80a6f9b71c79547183cb194af2bb32d4f2ef2fc\": rpc error: code = NotFound desc = could not find container 
\"a9b8825ded8e0e381b677dccc80a6f9b71c79547183cb194af2bb32d4f2ef2fc\": container with ID starting with a9b8825ded8e0e381b677dccc80a6f9b71c79547183cb194af2bb32d4f2ef2fc not found: ID does not exist" Feb 19 23:22:32 crc kubenswrapper[4795]: I0219 23:22:32.536718 4795 scope.go:117] "RemoveContainer" containerID="b72ba919511616616308f9bb0b6c710d4ba3184033ebc6fa828ee43fa86036d4" Feb 19 23:22:32 crc kubenswrapper[4795]: E0219 23:22:32.537088 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b72ba919511616616308f9bb0b6c710d4ba3184033ebc6fa828ee43fa86036d4\": container with ID starting with b72ba919511616616308f9bb0b6c710d4ba3184033ebc6fa828ee43fa86036d4 not found: ID does not exist" containerID="b72ba919511616616308f9bb0b6c710d4ba3184033ebc6fa828ee43fa86036d4" Feb 19 23:22:32 crc kubenswrapper[4795]: I0219 23:22:32.537140 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b72ba919511616616308f9bb0b6c710d4ba3184033ebc6fa828ee43fa86036d4"} err="failed to get container status \"b72ba919511616616308f9bb0b6c710d4ba3184033ebc6fa828ee43fa86036d4\": rpc error: code = NotFound desc = could not find container \"b72ba919511616616308f9bb0b6c710d4ba3184033ebc6fa828ee43fa86036d4\": container with ID starting with b72ba919511616616308f9bb0b6c710d4ba3184033ebc6fa828ee43fa86036d4 not found: ID does not exist" Feb 19 23:22:32 crc kubenswrapper[4795]: I0219 23:22:32.537210 4795 scope.go:117] "RemoveContainer" containerID="487ce0a487777574543b5bb91827dc66d9dcd5cfb6feea2ce24ffe5619111c28" Feb 19 23:22:32 crc kubenswrapper[4795]: E0219 23:22:32.537556 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"487ce0a487777574543b5bb91827dc66d9dcd5cfb6feea2ce24ffe5619111c28\": container with ID starting with 487ce0a487777574543b5bb91827dc66d9dcd5cfb6feea2ce24ffe5619111c28 not found: ID does not exist" 
containerID="487ce0a487777574543b5bb91827dc66d9dcd5cfb6feea2ce24ffe5619111c28" Feb 19 23:22:32 crc kubenswrapper[4795]: I0219 23:22:32.537590 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"487ce0a487777574543b5bb91827dc66d9dcd5cfb6feea2ce24ffe5619111c28"} err="failed to get container status \"487ce0a487777574543b5bb91827dc66d9dcd5cfb6feea2ce24ffe5619111c28\": rpc error: code = NotFound desc = could not find container \"487ce0a487777574543b5bb91827dc66d9dcd5cfb6feea2ce24ffe5619111c28\": container with ID starting with 487ce0a487777574543b5bb91827dc66d9dcd5cfb6feea2ce24ffe5619111c28 not found: ID does not exist" Feb 19 23:22:33 crc kubenswrapper[4795]: I0219 23:22:33.529128 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33946ece-f847-4ce2-ab50-c4f2f61f9b4e" path="/var/lib/kubelet/pods/33946ece-f847-4ce2-ab50-c4f2f61f9b4e/volumes" Feb 19 23:23:58 crc kubenswrapper[4795]: I0219 23:23:58.427109 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 23:23:58 crc kubenswrapper[4795]: I0219 23:23:58.427832 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 23:24:28 crc kubenswrapper[4795]: I0219 23:24:28.427945 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" start-of-body= Feb 19 23:24:28 crc kubenswrapper[4795]: I0219 23:24:28.428580 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 23:24:58 crc kubenswrapper[4795]: I0219 23:24:58.427295 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 23:24:58 crc kubenswrapper[4795]: I0219 23:24:58.427733 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 23:24:58 crc kubenswrapper[4795]: I0219 23:24:58.427777 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" Feb 19 23:24:58 crc kubenswrapper[4795]: I0219 23:24:58.428553 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3901cb7ebf5348e8e5959a203f3e6450ab7f3f57a2848ec56227b7861716ef02"} pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 23:24:58 crc kubenswrapper[4795]: I0219 23:24:58.428605 4795 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" containerID="cri-o://3901cb7ebf5348e8e5959a203f3e6450ab7f3f57a2848ec56227b7861716ef02" gracePeriod=600 Feb 19 23:24:58 crc kubenswrapper[4795]: E0219 23:24:58.547524 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:24:58 crc kubenswrapper[4795]: I0219 23:24:58.787452 4795 generic.go:334] "Generic (PLEG): container finished" podID="7591bc58-96f5-486a-8653-0ad93938b019" containerID="3901cb7ebf5348e8e5959a203f3e6450ab7f3f57a2848ec56227b7861716ef02" exitCode=0 Feb 19 23:24:58 crc kubenswrapper[4795]: I0219 23:24:58.787563 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerDied","Data":"3901cb7ebf5348e8e5959a203f3e6450ab7f3f57a2848ec56227b7861716ef02"} Feb 19 23:24:58 crc kubenswrapper[4795]: I0219 23:24:58.787918 4795 scope.go:117] "RemoveContainer" containerID="ad9ce3f3f0f6a8b1730c68641f9c3a9fd43cad22ef4d316d4a9d380b0b12e9e5" Feb 19 23:24:58 crc kubenswrapper[4795]: I0219 23:24:58.788632 4795 scope.go:117] "RemoveContainer" containerID="3901cb7ebf5348e8e5959a203f3e6450ab7f3f57a2848ec56227b7861716ef02" Feb 19 23:24:58 crc kubenswrapper[4795]: E0219 23:24:58.789054 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:25:09 crc kubenswrapper[4795]: I0219 23:25:09.530460 4795 scope.go:117] "RemoveContainer" containerID="3901cb7ebf5348e8e5959a203f3e6450ab7f3f57a2848ec56227b7861716ef02" Feb 19 23:25:09 crc kubenswrapper[4795]: E0219 23:25:09.533541 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:25:14 crc kubenswrapper[4795]: I0219 23:25:14.959065 4795 generic.go:334] "Generic (PLEG): container finished" podID="a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa" containerID="94c7431ebacbd1806e6fe318625607d7ac4c8655988210ced64888f48d2153e5" exitCode=0 Feb 19 23:25:14 crc kubenswrapper[4795]: I0219 23:25:14.959237 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-rmhfb" event={"ID":"a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa","Type":"ContainerDied","Data":"94c7431ebacbd1806e6fe318625607d7ac4c8655988210ced64888f48d2153e5"} Feb 19 23:25:16 crc kubenswrapper[4795]: I0219 23:25:16.465736 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-rmhfb" Feb 19 23:25:16 crc kubenswrapper[4795]: I0219 23:25:16.504714 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa-bootstrap-combined-ca-bundle\") pod \"a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa\" (UID: \"a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa\") " Feb 19 23:25:16 crc kubenswrapper[4795]: I0219 23:25:16.504788 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa-ceph\") pod \"a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa\" (UID: \"a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa\") " Feb 19 23:25:16 crc kubenswrapper[4795]: I0219 23:25:16.504922 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa-ssh-key-openstack-cell1\") pod \"a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa\" (UID: \"a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa\") " Feb 19 23:25:16 crc kubenswrapper[4795]: I0219 23:25:16.504955 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwdtc\" (UniqueName: \"kubernetes.io/projected/a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa-kube-api-access-kwdtc\") pod \"a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa\" (UID: \"a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa\") " Feb 19 23:25:16 crc kubenswrapper[4795]: I0219 23:25:16.504975 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa-inventory\") pod \"a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa\" (UID: \"a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa\") " Feb 19 23:25:16 crc kubenswrapper[4795]: I0219 23:25:16.513049 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/secret/a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa" (UID: "a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:25:16 crc kubenswrapper[4795]: I0219 23:25:16.514392 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa-ceph" (OuterVolumeSpecName: "ceph") pod "a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa" (UID: "a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:25:16 crc kubenswrapper[4795]: I0219 23:25:16.516182 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa-kube-api-access-kwdtc" (OuterVolumeSpecName: "kube-api-access-kwdtc") pod "a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa" (UID: "a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa"). InnerVolumeSpecName "kube-api-access-kwdtc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:25:16 crc kubenswrapper[4795]: I0219 23:25:16.540464 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa" (UID: "a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:25:16 crc kubenswrapper[4795]: I0219 23:25:16.550324 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa-inventory" (OuterVolumeSpecName: "inventory") pod "a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa" (UID: "a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:25:16 crc kubenswrapper[4795]: I0219 23:25:16.607091 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 19 23:25:16 crc kubenswrapper[4795]: I0219 23:25:16.607132 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwdtc\" (UniqueName: \"kubernetes.io/projected/a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa-kube-api-access-kwdtc\") on node \"crc\" DevicePath \"\"" Feb 19 23:25:16 crc kubenswrapper[4795]: I0219 23:25:16.607144 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 23:25:16 crc kubenswrapper[4795]: I0219 23:25:16.607154 4795 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:25:16 crc kubenswrapper[4795]: I0219 23:25:16.607181 4795 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa-ceph\") on node \"crc\" DevicePath \"\"" Feb 19 23:25:16 crc kubenswrapper[4795]: I0219 23:25:16.983125 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/bootstrap-openstack-openstack-cell1-rmhfb" event={"ID":"a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa","Type":"ContainerDied","Data":"8334d40c98b3c6b685c906be16789e5a35ea56989f4f2c9374c39b35dba237a9"} Feb 19 23:25:16 crc kubenswrapper[4795]: I0219 23:25:16.983203 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8334d40c98b3c6b685c906be16789e5a35ea56989f4f2c9374c39b35dba237a9" Feb 19 23:25:16 crc kubenswrapper[4795]: I0219 23:25:16.983210 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-rmhfb" Feb 19 23:25:17 crc kubenswrapper[4795]: I0219 23:25:17.088923 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-wfcx4"] Feb 19 23:25:17 crc kubenswrapper[4795]: E0219 23:25:17.089455 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8467b6c-6941-4df2-b652-61b4c8eee22e" containerName="extract-content" Feb 19 23:25:17 crc kubenswrapper[4795]: I0219 23:25:17.089478 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8467b6c-6941-4df2-b652-61b4c8eee22e" containerName="extract-content" Feb 19 23:25:17 crc kubenswrapper[4795]: E0219 23:25:17.089507 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33946ece-f847-4ce2-ab50-c4f2f61f9b4e" containerName="extract-content" Feb 19 23:25:17 crc kubenswrapper[4795]: I0219 23:25:17.089517 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="33946ece-f847-4ce2-ab50-c4f2f61f9b4e" containerName="extract-content" Feb 19 23:25:17 crc kubenswrapper[4795]: E0219 23:25:17.089540 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa" containerName="bootstrap-openstack-openstack-cell1" Feb 19 23:25:17 crc kubenswrapper[4795]: I0219 23:25:17.089548 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa" 
containerName="bootstrap-openstack-openstack-cell1" Feb 19 23:25:17 crc kubenswrapper[4795]: E0219 23:25:17.089579 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33946ece-f847-4ce2-ab50-c4f2f61f9b4e" containerName="extract-utilities" Feb 19 23:25:17 crc kubenswrapper[4795]: I0219 23:25:17.089587 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="33946ece-f847-4ce2-ab50-c4f2f61f9b4e" containerName="extract-utilities" Feb 19 23:25:17 crc kubenswrapper[4795]: E0219 23:25:17.089600 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33946ece-f847-4ce2-ab50-c4f2f61f9b4e" containerName="registry-server" Feb 19 23:25:17 crc kubenswrapper[4795]: I0219 23:25:17.089608 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="33946ece-f847-4ce2-ab50-c4f2f61f9b4e" containerName="registry-server" Feb 19 23:25:17 crc kubenswrapper[4795]: E0219 23:25:17.089620 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8467b6c-6941-4df2-b652-61b4c8eee22e" containerName="extract-utilities" Feb 19 23:25:17 crc kubenswrapper[4795]: I0219 23:25:17.089628 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8467b6c-6941-4df2-b652-61b4c8eee22e" containerName="extract-utilities" Feb 19 23:25:17 crc kubenswrapper[4795]: E0219 23:25:17.089644 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8467b6c-6941-4df2-b652-61b4c8eee22e" containerName="registry-server" Feb 19 23:25:17 crc kubenswrapper[4795]: I0219 23:25:17.089653 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8467b6c-6941-4df2-b652-61b4c8eee22e" containerName="registry-server" Feb 19 23:25:17 crc kubenswrapper[4795]: I0219 23:25:17.089877 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa" containerName="bootstrap-openstack-openstack-cell1" Feb 19 23:25:17 crc kubenswrapper[4795]: I0219 23:25:17.089904 4795 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="33946ece-f847-4ce2-ab50-c4f2f61f9b4e" containerName="registry-server" Feb 19 23:25:17 crc kubenswrapper[4795]: I0219 23:25:17.089923 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8467b6c-6941-4df2-b652-61b4c8eee22e" containerName="registry-server" Feb 19 23:25:17 crc kubenswrapper[4795]: I0219 23:25:17.092295 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-wfcx4" Feb 19 23:25:17 crc kubenswrapper[4795]: I0219 23:25:17.099052 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 23:25:17 crc kubenswrapper[4795]: I0219 23:25:17.100709 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-brdnv" Feb 19 23:25:17 crc kubenswrapper[4795]: I0219 23:25:17.101004 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 19 23:25:17 crc kubenswrapper[4795]: I0219 23:25:17.101292 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 19 23:25:17 crc kubenswrapper[4795]: I0219 23:25:17.109461 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-wfcx4"] Feb 19 23:25:17 crc kubenswrapper[4795]: I0219 23:25:17.123929 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/69464400-c61c-41bd-aeeb-984f7f948a16-ceph\") pod \"download-cache-openstack-openstack-cell1-wfcx4\" (UID: \"69464400-c61c-41bd-aeeb-984f7f948a16\") " pod="openstack/download-cache-openstack-openstack-cell1-wfcx4" Feb 19 23:25:17 crc kubenswrapper[4795]: I0219 23:25:17.124072 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/69464400-c61c-41bd-aeeb-984f7f948a16-inventory\") pod \"download-cache-openstack-openstack-cell1-wfcx4\" (UID: \"69464400-c61c-41bd-aeeb-984f7f948a16\") " pod="openstack/download-cache-openstack-openstack-cell1-wfcx4" Feb 19 23:25:17 crc kubenswrapper[4795]: I0219 23:25:17.124434 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/69464400-c61c-41bd-aeeb-984f7f948a16-ssh-key-openstack-cell1\") pod \"download-cache-openstack-openstack-cell1-wfcx4\" (UID: \"69464400-c61c-41bd-aeeb-984f7f948a16\") " pod="openstack/download-cache-openstack-openstack-cell1-wfcx4" Feb 19 23:25:17 crc kubenswrapper[4795]: I0219 23:25:17.124561 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8z879\" (UniqueName: \"kubernetes.io/projected/69464400-c61c-41bd-aeeb-984f7f948a16-kube-api-access-8z879\") pod \"download-cache-openstack-openstack-cell1-wfcx4\" (UID: \"69464400-c61c-41bd-aeeb-984f7f948a16\") " pod="openstack/download-cache-openstack-openstack-cell1-wfcx4" Feb 19 23:25:17 crc kubenswrapper[4795]: I0219 23:25:17.225653 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/69464400-c61c-41bd-aeeb-984f7f948a16-ceph\") pod \"download-cache-openstack-openstack-cell1-wfcx4\" (UID: \"69464400-c61c-41bd-aeeb-984f7f948a16\") " pod="openstack/download-cache-openstack-openstack-cell1-wfcx4" Feb 19 23:25:17 crc kubenswrapper[4795]: I0219 23:25:17.225707 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/69464400-c61c-41bd-aeeb-984f7f948a16-inventory\") pod \"download-cache-openstack-openstack-cell1-wfcx4\" (UID: \"69464400-c61c-41bd-aeeb-984f7f948a16\") " pod="openstack/download-cache-openstack-openstack-cell1-wfcx4" Feb 19 23:25:17 crc 
kubenswrapper[4795]: I0219 23:25:17.225766 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/69464400-c61c-41bd-aeeb-984f7f948a16-ssh-key-openstack-cell1\") pod \"download-cache-openstack-openstack-cell1-wfcx4\" (UID: \"69464400-c61c-41bd-aeeb-984f7f948a16\") " pod="openstack/download-cache-openstack-openstack-cell1-wfcx4" Feb 19 23:25:17 crc kubenswrapper[4795]: I0219 23:25:17.225821 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8z879\" (UniqueName: \"kubernetes.io/projected/69464400-c61c-41bd-aeeb-984f7f948a16-kube-api-access-8z879\") pod \"download-cache-openstack-openstack-cell1-wfcx4\" (UID: \"69464400-c61c-41bd-aeeb-984f7f948a16\") " pod="openstack/download-cache-openstack-openstack-cell1-wfcx4" Feb 19 23:25:17 crc kubenswrapper[4795]: I0219 23:25:17.229358 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/69464400-c61c-41bd-aeeb-984f7f948a16-ssh-key-openstack-cell1\") pod \"download-cache-openstack-openstack-cell1-wfcx4\" (UID: \"69464400-c61c-41bd-aeeb-984f7f948a16\") " pod="openstack/download-cache-openstack-openstack-cell1-wfcx4" Feb 19 23:25:17 crc kubenswrapper[4795]: I0219 23:25:17.229358 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/69464400-c61c-41bd-aeeb-984f7f948a16-inventory\") pod \"download-cache-openstack-openstack-cell1-wfcx4\" (UID: \"69464400-c61c-41bd-aeeb-984f7f948a16\") " pod="openstack/download-cache-openstack-openstack-cell1-wfcx4" Feb 19 23:25:17 crc kubenswrapper[4795]: I0219 23:25:17.230969 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/69464400-c61c-41bd-aeeb-984f7f948a16-ceph\") pod \"download-cache-openstack-openstack-cell1-wfcx4\" (UID: 
\"69464400-c61c-41bd-aeeb-984f7f948a16\") " pod="openstack/download-cache-openstack-openstack-cell1-wfcx4" Feb 19 23:25:17 crc kubenswrapper[4795]: I0219 23:25:17.242577 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8z879\" (UniqueName: \"kubernetes.io/projected/69464400-c61c-41bd-aeeb-984f7f948a16-kube-api-access-8z879\") pod \"download-cache-openstack-openstack-cell1-wfcx4\" (UID: \"69464400-c61c-41bd-aeeb-984f7f948a16\") " pod="openstack/download-cache-openstack-openstack-cell1-wfcx4" Feb 19 23:25:17 crc kubenswrapper[4795]: I0219 23:25:17.426475 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-wfcx4" Feb 19 23:25:17 crc kubenswrapper[4795]: I0219 23:25:17.983398 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-wfcx4"] Feb 19 23:25:17 crc kubenswrapper[4795]: I0219 23:25:17.998230 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-wfcx4" event={"ID":"69464400-c61c-41bd-aeeb-984f7f948a16","Type":"ContainerStarted","Data":"5a15f244c3f5001d78c7272f8f15efa1a2a5ab014c4d21f43ab37cd73349c5a7"} Feb 19 23:25:20 crc kubenswrapper[4795]: I0219 23:25:20.027285 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-wfcx4" event={"ID":"69464400-c61c-41bd-aeeb-984f7f948a16","Type":"ContainerStarted","Data":"61f2e5abf146e1a51e958bfaea16801ce6ea6794f6232235430acd11508c38dc"} Feb 19 23:25:23 crc kubenswrapper[4795]: I0219 23:25:23.513451 4795 scope.go:117] "RemoveContainer" containerID="3901cb7ebf5348e8e5959a203f3e6450ab7f3f57a2848ec56227b7861716ef02" Feb 19 23:25:23 crc kubenswrapper[4795]: E0219 23:25:23.514348 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:25:34 crc kubenswrapper[4795]: I0219 23:25:34.512135 4795 scope.go:117] "RemoveContainer" containerID="3901cb7ebf5348e8e5959a203f3e6450ab7f3f57a2848ec56227b7861716ef02" Feb 19 23:25:34 crc kubenswrapper[4795]: E0219 23:25:34.512890 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:25:43 crc kubenswrapper[4795]: I0219 23:25:43.256800 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-openstack-openstack-cell1-wfcx4" podStartSLOduration=25.366609336 podStartE2EDuration="26.256777388s" podCreationTimestamp="2026-02-19 23:25:17 +0000 UTC" firstStartedPulling="2026-02-19 23:25:17.984945652 +0000 UTC m=+7029.177463516" lastFinishedPulling="2026-02-19 23:25:18.875113654 +0000 UTC m=+7030.067631568" observedRunningTime="2026-02-19 23:25:20.043845502 +0000 UTC m=+7031.236363396" watchObservedRunningTime="2026-02-19 23:25:43.256777388 +0000 UTC m=+7054.449295252" Feb 19 23:25:43 crc kubenswrapper[4795]: I0219 23:25:43.260756 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-j2f42"] Feb 19 23:25:43 crc kubenswrapper[4795]: I0219 23:25:43.263620 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j2f42" Feb 19 23:25:43 crc kubenswrapper[4795]: I0219 23:25:43.280397 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j2f42"] Feb 19 23:25:43 crc kubenswrapper[4795]: I0219 23:25:43.403038 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c5b2e93-bfd6-409c-955a-78f80b984a11-catalog-content\") pod \"redhat-marketplace-j2f42\" (UID: \"3c5b2e93-bfd6-409c-955a-78f80b984a11\") " pod="openshift-marketplace/redhat-marketplace-j2f42" Feb 19 23:25:43 crc kubenswrapper[4795]: I0219 23:25:43.404235 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c5b2e93-bfd6-409c-955a-78f80b984a11-utilities\") pod \"redhat-marketplace-j2f42\" (UID: \"3c5b2e93-bfd6-409c-955a-78f80b984a11\") " pod="openshift-marketplace/redhat-marketplace-j2f42" Feb 19 23:25:43 crc kubenswrapper[4795]: I0219 23:25:43.404346 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twq4h\" (UniqueName: \"kubernetes.io/projected/3c5b2e93-bfd6-409c-955a-78f80b984a11-kube-api-access-twq4h\") pod \"redhat-marketplace-j2f42\" (UID: \"3c5b2e93-bfd6-409c-955a-78f80b984a11\") " pod="openshift-marketplace/redhat-marketplace-j2f42" Feb 19 23:25:43 crc kubenswrapper[4795]: I0219 23:25:43.506192 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c5b2e93-bfd6-409c-955a-78f80b984a11-utilities\") pod \"redhat-marketplace-j2f42\" (UID: \"3c5b2e93-bfd6-409c-955a-78f80b984a11\") " pod="openshift-marketplace/redhat-marketplace-j2f42" Feb 19 23:25:43 crc kubenswrapper[4795]: I0219 23:25:43.506259 4795 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-twq4h\" (UniqueName: \"kubernetes.io/projected/3c5b2e93-bfd6-409c-955a-78f80b984a11-kube-api-access-twq4h\") pod \"redhat-marketplace-j2f42\" (UID: \"3c5b2e93-bfd6-409c-955a-78f80b984a11\") " pod="openshift-marketplace/redhat-marketplace-j2f42" Feb 19 23:25:43 crc kubenswrapper[4795]: I0219 23:25:43.506330 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c5b2e93-bfd6-409c-955a-78f80b984a11-catalog-content\") pod \"redhat-marketplace-j2f42\" (UID: \"3c5b2e93-bfd6-409c-955a-78f80b984a11\") " pod="openshift-marketplace/redhat-marketplace-j2f42" Feb 19 23:25:43 crc kubenswrapper[4795]: I0219 23:25:43.506703 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c5b2e93-bfd6-409c-955a-78f80b984a11-utilities\") pod \"redhat-marketplace-j2f42\" (UID: \"3c5b2e93-bfd6-409c-955a-78f80b984a11\") " pod="openshift-marketplace/redhat-marketplace-j2f42" Feb 19 23:25:43 crc kubenswrapper[4795]: I0219 23:25:43.506857 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c5b2e93-bfd6-409c-955a-78f80b984a11-catalog-content\") pod \"redhat-marketplace-j2f42\" (UID: \"3c5b2e93-bfd6-409c-955a-78f80b984a11\") " pod="openshift-marketplace/redhat-marketplace-j2f42" Feb 19 23:25:43 crc kubenswrapper[4795]: I0219 23:25:43.531135 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twq4h\" (UniqueName: \"kubernetes.io/projected/3c5b2e93-bfd6-409c-955a-78f80b984a11-kube-api-access-twq4h\") pod \"redhat-marketplace-j2f42\" (UID: \"3c5b2e93-bfd6-409c-955a-78f80b984a11\") " pod="openshift-marketplace/redhat-marketplace-j2f42" Feb 19 23:25:43 crc kubenswrapper[4795]: I0219 23:25:43.606474 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j2f42" Feb 19 23:25:44 crc kubenswrapper[4795]: I0219 23:25:44.094341 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j2f42"] Feb 19 23:25:44 crc kubenswrapper[4795]: I0219 23:25:44.264109 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j2f42" event={"ID":"3c5b2e93-bfd6-409c-955a-78f80b984a11","Type":"ContainerStarted","Data":"58a1348a548eed875beb7501c5f4e5a8e1b842ab2941eb12d2da73920b895673"} Feb 19 23:25:45 crc kubenswrapper[4795]: I0219 23:25:45.277441 4795 generic.go:334] "Generic (PLEG): container finished" podID="3c5b2e93-bfd6-409c-955a-78f80b984a11" containerID="acea590c649b76bec688ad5b2514c2f2abc3c5067f3ac85d69ba16fac86b259c" exitCode=0 Feb 19 23:25:45 crc kubenswrapper[4795]: I0219 23:25:45.277619 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j2f42" event={"ID":"3c5b2e93-bfd6-409c-955a-78f80b984a11","Type":"ContainerDied","Data":"acea590c649b76bec688ad5b2514c2f2abc3c5067f3ac85d69ba16fac86b259c"} Feb 19 23:25:45 crc kubenswrapper[4795]: I0219 23:25:45.280859 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 23:25:47 crc kubenswrapper[4795]: I0219 23:25:47.512724 4795 scope.go:117] "RemoveContainer" containerID="3901cb7ebf5348e8e5959a203f3e6450ab7f3f57a2848ec56227b7861716ef02" Feb 19 23:25:47 crc kubenswrapper[4795]: E0219 23:25:47.513518 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 
23:25:48 crc kubenswrapper[4795]: I0219 23:25:48.324178 4795 generic.go:334] "Generic (PLEG): container finished" podID="3c5b2e93-bfd6-409c-955a-78f80b984a11" containerID="57273199fd35f22e63681804ce55aacdbac84c17c7b8c77009db7d1d3c52905f" exitCode=0 Feb 19 23:25:48 crc kubenswrapper[4795]: I0219 23:25:48.324275 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j2f42" event={"ID":"3c5b2e93-bfd6-409c-955a-78f80b984a11","Type":"ContainerDied","Data":"57273199fd35f22e63681804ce55aacdbac84c17c7b8c77009db7d1d3c52905f"} Feb 19 23:25:49 crc kubenswrapper[4795]: I0219 23:25:49.334914 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j2f42" event={"ID":"3c5b2e93-bfd6-409c-955a-78f80b984a11","Type":"ContainerStarted","Data":"19d25ff8c015afdd2628f4228b1fdb66a0580f86620d6c600e7ba30f2dbdff64"} Feb 19 23:25:49 crc kubenswrapper[4795]: I0219 23:25:49.361270 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-j2f42" podStartSLOduration=2.92788686 podStartE2EDuration="6.361251095s" podCreationTimestamp="2026-02-19 23:25:43 +0000 UTC" firstStartedPulling="2026-02-19 23:25:45.280480588 +0000 UTC m=+7056.472998462" lastFinishedPulling="2026-02-19 23:25:48.713844833 +0000 UTC m=+7059.906362697" observedRunningTime="2026-02-19 23:25:49.35261218 +0000 UTC m=+7060.545130044" watchObservedRunningTime="2026-02-19 23:25:49.361251095 +0000 UTC m=+7060.553768959" Feb 19 23:25:53 crc kubenswrapper[4795]: I0219 23:25:53.607114 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-j2f42" Feb 19 23:25:53 crc kubenswrapper[4795]: I0219 23:25:53.607749 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-j2f42" Feb 19 23:25:53 crc kubenswrapper[4795]: I0219 23:25:53.650102 4795 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-j2f42" Feb 19 23:25:54 crc kubenswrapper[4795]: I0219 23:25:54.444026 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-j2f42" Feb 19 23:25:54 crc kubenswrapper[4795]: I0219 23:25:54.499040 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j2f42"] Feb 19 23:25:56 crc kubenswrapper[4795]: I0219 23:25:56.410002 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-j2f42" podUID="3c5b2e93-bfd6-409c-955a-78f80b984a11" containerName="registry-server" containerID="cri-o://19d25ff8c015afdd2628f4228b1fdb66a0580f86620d6c600e7ba30f2dbdff64" gracePeriod=2 Feb 19 23:25:56 crc kubenswrapper[4795]: I0219 23:25:56.912650 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j2f42" Feb 19 23:25:57 crc kubenswrapper[4795]: I0219 23:25:57.001479 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c5b2e93-bfd6-409c-955a-78f80b984a11-catalog-content\") pod \"3c5b2e93-bfd6-409c-955a-78f80b984a11\" (UID: \"3c5b2e93-bfd6-409c-955a-78f80b984a11\") " Feb 19 23:25:57 crc kubenswrapper[4795]: I0219 23:25:57.001592 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c5b2e93-bfd6-409c-955a-78f80b984a11-utilities\") pod \"3c5b2e93-bfd6-409c-955a-78f80b984a11\" (UID: \"3c5b2e93-bfd6-409c-955a-78f80b984a11\") " Feb 19 23:25:57 crc kubenswrapper[4795]: I0219 23:25:57.001713 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twq4h\" (UniqueName: \"kubernetes.io/projected/3c5b2e93-bfd6-409c-955a-78f80b984a11-kube-api-access-twq4h\") 
pod \"3c5b2e93-bfd6-409c-955a-78f80b984a11\" (UID: \"3c5b2e93-bfd6-409c-955a-78f80b984a11\") " Feb 19 23:25:57 crc kubenswrapper[4795]: I0219 23:25:57.002595 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c5b2e93-bfd6-409c-955a-78f80b984a11-utilities" (OuterVolumeSpecName: "utilities") pod "3c5b2e93-bfd6-409c-955a-78f80b984a11" (UID: "3c5b2e93-bfd6-409c-955a-78f80b984a11"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:25:57 crc kubenswrapper[4795]: I0219 23:25:57.007348 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c5b2e93-bfd6-409c-955a-78f80b984a11-kube-api-access-twq4h" (OuterVolumeSpecName: "kube-api-access-twq4h") pod "3c5b2e93-bfd6-409c-955a-78f80b984a11" (UID: "3c5b2e93-bfd6-409c-955a-78f80b984a11"). InnerVolumeSpecName "kube-api-access-twq4h". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:25:57 crc kubenswrapper[4795]: I0219 23:25:57.036840 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c5b2e93-bfd6-409c-955a-78f80b984a11-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3c5b2e93-bfd6-409c-955a-78f80b984a11" (UID: "3c5b2e93-bfd6-409c-955a-78f80b984a11"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:25:57 crc kubenswrapper[4795]: I0219 23:25:57.104264 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-twq4h\" (UniqueName: \"kubernetes.io/projected/3c5b2e93-bfd6-409c-955a-78f80b984a11-kube-api-access-twq4h\") on node \"crc\" DevicePath \"\"" Feb 19 23:25:57 crc kubenswrapper[4795]: I0219 23:25:57.104291 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c5b2e93-bfd6-409c-955a-78f80b984a11-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 23:25:57 crc kubenswrapper[4795]: I0219 23:25:57.104300 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c5b2e93-bfd6-409c-955a-78f80b984a11-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 23:25:57 crc kubenswrapper[4795]: I0219 23:25:57.427256 4795 generic.go:334] "Generic (PLEG): container finished" podID="3c5b2e93-bfd6-409c-955a-78f80b984a11" containerID="19d25ff8c015afdd2628f4228b1fdb66a0580f86620d6c600e7ba30f2dbdff64" exitCode=0 Feb 19 23:25:57 crc kubenswrapper[4795]: I0219 23:25:57.427361 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j2f42" event={"ID":"3c5b2e93-bfd6-409c-955a-78f80b984a11","Type":"ContainerDied","Data":"19d25ff8c015afdd2628f4228b1fdb66a0580f86620d6c600e7ba30f2dbdff64"} Feb 19 23:25:57 crc kubenswrapper[4795]: I0219 23:25:57.427431 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j2f42" event={"ID":"3c5b2e93-bfd6-409c-955a-78f80b984a11","Type":"ContainerDied","Data":"58a1348a548eed875beb7501c5f4e5a8e1b842ab2941eb12d2da73920b895673"} Feb 19 23:25:57 crc kubenswrapper[4795]: I0219 23:25:57.427471 4795 scope.go:117] "RemoveContainer" containerID="19d25ff8c015afdd2628f4228b1fdb66a0580f86620d6c600e7ba30f2dbdff64" Feb 19 23:25:57 crc kubenswrapper[4795]: I0219 
23:25:57.428367 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j2f42" Feb 19 23:25:57 crc kubenswrapper[4795]: I0219 23:25:57.474623 4795 scope.go:117] "RemoveContainer" containerID="57273199fd35f22e63681804ce55aacdbac84c17c7b8c77009db7d1d3c52905f" Feb 19 23:25:57 crc kubenswrapper[4795]: I0219 23:25:57.478082 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j2f42"] Feb 19 23:25:57 crc kubenswrapper[4795]: I0219 23:25:57.492557 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-j2f42"] Feb 19 23:25:57 crc kubenswrapper[4795]: I0219 23:25:57.499652 4795 scope.go:117] "RemoveContainer" containerID="acea590c649b76bec688ad5b2514c2f2abc3c5067f3ac85d69ba16fac86b259c" Feb 19 23:25:57 crc kubenswrapper[4795]: I0219 23:25:57.529660 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c5b2e93-bfd6-409c-955a-78f80b984a11" path="/var/lib/kubelet/pods/3c5b2e93-bfd6-409c-955a-78f80b984a11/volumes" Feb 19 23:25:57 crc kubenswrapper[4795]: I0219 23:25:57.573839 4795 scope.go:117] "RemoveContainer" containerID="19d25ff8c015afdd2628f4228b1fdb66a0580f86620d6c600e7ba30f2dbdff64" Feb 19 23:25:57 crc kubenswrapper[4795]: E0219 23:25:57.574402 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19d25ff8c015afdd2628f4228b1fdb66a0580f86620d6c600e7ba30f2dbdff64\": container with ID starting with 19d25ff8c015afdd2628f4228b1fdb66a0580f86620d6c600e7ba30f2dbdff64 not found: ID does not exist" containerID="19d25ff8c015afdd2628f4228b1fdb66a0580f86620d6c600e7ba30f2dbdff64" Feb 19 23:25:57 crc kubenswrapper[4795]: I0219 23:25:57.574439 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19d25ff8c015afdd2628f4228b1fdb66a0580f86620d6c600e7ba30f2dbdff64"} err="failed to get 
container status \"19d25ff8c015afdd2628f4228b1fdb66a0580f86620d6c600e7ba30f2dbdff64\": rpc error: code = NotFound desc = could not find container \"19d25ff8c015afdd2628f4228b1fdb66a0580f86620d6c600e7ba30f2dbdff64\": container with ID starting with 19d25ff8c015afdd2628f4228b1fdb66a0580f86620d6c600e7ba30f2dbdff64 not found: ID does not exist" Feb 19 23:25:57 crc kubenswrapper[4795]: I0219 23:25:57.574463 4795 scope.go:117] "RemoveContainer" containerID="57273199fd35f22e63681804ce55aacdbac84c17c7b8c77009db7d1d3c52905f" Feb 19 23:25:57 crc kubenswrapper[4795]: E0219 23:25:57.575103 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57273199fd35f22e63681804ce55aacdbac84c17c7b8c77009db7d1d3c52905f\": container with ID starting with 57273199fd35f22e63681804ce55aacdbac84c17c7b8c77009db7d1d3c52905f not found: ID does not exist" containerID="57273199fd35f22e63681804ce55aacdbac84c17c7b8c77009db7d1d3c52905f" Feb 19 23:25:57 crc kubenswrapper[4795]: I0219 23:25:57.575156 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57273199fd35f22e63681804ce55aacdbac84c17c7b8c77009db7d1d3c52905f"} err="failed to get container status \"57273199fd35f22e63681804ce55aacdbac84c17c7b8c77009db7d1d3c52905f\": rpc error: code = NotFound desc = could not find container \"57273199fd35f22e63681804ce55aacdbac84c17c7b8c77009db7d1d3c52905f\": container with ID starting with 57273199fd35f22e63681804ce55aacdbac84c17c7b8c77009db7d1d3c52905f not found: ID does not exist" Feb 19 23:25:57 crc kubenswrapper[4795]: I0219 23:25:57.575224 4795 scope.go:117] "RemoveContainer" containerID="acea590c649b76bec688ad5b2514c2f2abc3c5067f3ac85d69ba16fac86b259c" Feb 19 23:25:57 crc kubenswrapper[4795]: E0219 23:25:57.575705 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"acea590c649b76bec688ad5b2514c2f2abc3c5067f3ac85d69ba16fac86b259c\": container with ID starting with acea590c649b76bec688ad5b2514c2f2abc3c5067f3ac85d69ba16fac86b259c not found: ID does not exist" containerID="acea590c649b76bec688ad5b2514c2f2abc3c5067f3ac85d69ba16fac86b259c" Feb 19 23:25:57 crc kubenswrapper[4795]: I0219 23:25:57.575743 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acea590c649b76bec688ad5b2514c2f2abc3c5067f3ac85d69ba16fac86b259c"} err="failed to get container status \"acea590c649b76bec688ad5b2514c2f2abc3c5067f3ac85d69ba16fac86b259c\": rpc error: code = NotFound desc = could not find container \"acea590c649b76bec688ad5b2514c2f2abc3c5067f3ac85d69ba16fac86b259c\": container with ID starting with acea590c649b76bec688ad5b2514c2f2abc3c5067f3ac85d69ba16fac86b259c not found: ID does not exist" Feb 19 23:26:00 crc kubenswrapper[4795]: I0219 23:26:00.512565 4795 scope.go:117] "RemoveContainer" containerID="3901cb7ebf5348e8e5959a203f3e6450ab7f3f57a2848ec56227b7861716ef02" Feb 19 23:26:00 crc kubenswrapper[4795]: E0219 23:26:00.513588 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:26:15 crc kubenswrapper[4795]: I0219 23:26:15.512114 4795 scope.go:117] "RemoveContainer" containerID="3901cb7ebf5348e8e5959a203f3e6450ab7f3f57a2848ec56227b7861716ef02" Feb 19 23:26:15 crc kubenswrapper[4795]: E0219 23:26:15.513030 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:26:26 crc kubenswrapper[4795]: I0219 23:26:26.512490 4795 scope.go:117] "RemoveContainer" containerID="3901cb7ebf5348e8e5959a203f3e6450ab7f3f57a2848ec56227b7861716ef02" Feb 19 23:26:26 crc kubenswrapper[4795]: E0219 23:26:26.513238 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:26:41 crc kubenswrapper[4795]: I0219 23:26:41.512373 4795 scope.go:117] "RemoveContainer" containerID="3901cb7ebf5348e8e5959a203f3e6450ab7f3f57a2848ec56227b7861716ef02" Feb 19 23:26:41 crc kubenswrapper[4795]: E0219 23:26:41.513782 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:26:55 crc kubenswrapper[4795]: I0219 23:26:55.512438 4795 scope.go:117] "RemoveContainer" containerID="3901cb7ebf5348e8e5959a203f3e6450ab7f3f57a2848ec56227b7861716ef02" Feb 19 23:26:55 crc kubenswrapper[4795]: E0219 23:26:55.513416 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:27:07 crc kubenswrapper[4795]: I0219 23:27:07.512481 4795 scope.go:117] "RemoveContainer" containerID="3901cb7ebf5348e8e5959a203f3e6450ab7f3f57a2848ec56227b7861716ef02" Feb 19 23:27:07 crc kubenswrapper[4795]: E0219 23:27:07.513302 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:27:09 crc kubenswrapper[4795]: I0219 23:27:09.172915 4795 generic.go:334] "Generic (PLEG): container finished" podID="69464400-c61c-41bd-aeeb-984f7f948a16" containerID="61f2e5abf146e1a51e958bfaea16801ce6ea6794f6232235430acd11508c38dc" exitCode=0 Feb 19 23:27:09 crc kubenswrapper[4795]: I0219 23:27:09.172994 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-wfcx4" event={"ID":"69464400-c61c-41bd-aeeb-984f7f948a16","Type":"ContainerDied","Data":"61f2e5abf146e1a51e958bfaea16801ce6ea6794f6232235430acd11508c38dc"} Feb 19 23:27:10 crc kubenswrapper[4795]: I0219 23:27:10.707557 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-wfcx4" Feb 19 23:27:10 crc kubenswrapper[4795]: I0219 23:27:10.755616 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8z879\" (UniqueName: \"kubernetes.io/projected/69464400-c61c-41bd-aeeb-984f7f948a16-kube-api-access-8z879\") pod \"69464400-c61c-41bd-aeeb-984f7f948a16\" (UID: \"69464400-c61c-41bd-aeeb-984f7f948a16\") " Feb 19 23:27:10 crc kubenswrapper[4795]: I0219 23:27:10.755788 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/69464400-c61c-41bd-aeeb-984f7f948a16-ssh-key-openstack-cell1\") pod \"69464400-c61c-41bd-aeeb-984f7f948a16\" (UID: \"69464400-c61c-41bd-aeeb-984f7f948a16\") " Feb 19 23:27:10 crc kubenswrapper[4795]: I0219 23:27:10.755825 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/69464400-c61c-41bd-aeeb-984f7f948a16-inventory\") pod \"69464400-c61c-41bd-aeeb-984f7f948a16\" (UID: \"69464400-c61c-41bd-aeeb-984f7f948a16\") " Feb 19 23:27:10 crc kubenswrapper[4795]: I0219 23:27:10.755969 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/69464400-c61c-41bd-aeeb-984f7f948a16-ceph\") pod \"69464400-c61c-41bd-aeeb-984f7f948a16\" (UID: \"69464400-c61c-41bd-aeeb-984f7f948a16\") " Feb 19 23:27:10 crc kubenswrapper[4795]: I0219 23:27:10.763372 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69464400-c61c-41bd-aeeb-984f7f948a16-ceph" (OuterVolumeSpecName: "ceph") pod "69464400-c61c-41bd-aeeb-984f7f948a16" (UID: "69464400-c61c-41bd-aeeb-984f7f948a16"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:27:10 crc kubenswrapper[4795]: I0219 23:27:10.768368 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69464400-c61c-41bd-aeeb-984f7f948a16-kube-api-access-8z879" (OuterVolumeSpecName: "kube-api-access-8z879") pod "69464400-c61c-41bd-aeeb-984f7f948a16" (UID: "69464400-c61c-41bd-aeeb-984f7f948a16"). InnerVolumeSpecName "kube-api-access-8z879". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:27:10 crc kubenswrapper[4795]: I0219 23:27:10.787637 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69464400-c61c-41bd-aeeb-984f7f948a16-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "69464400-c61c-41bd-aeeb-984f7f948a16" (UID: "69464400-c61c-41bd-aeeb-984f7f948a16"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:27:10 crc kubenswrapper[4795]: I0219 23:27:10.804372 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69464400-c61c-41bd-aeeb-984f7f948a16-inventory" (OuterVolumeSpecName: "inventory") pod "69464400-c61c-41bd-aeeb-984f7f948a16" (UID: "69464400-c61c-41bd-aeeb-984f7f948a16"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:27:10 crc kubenswrapper[4795]: I0219 23:27:10.859587 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/69464400-c61c-41bd-aeeb-984f7f948a16-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 19 23:27:10 crc kubenswrapper[4795]: I0219 23:27:10.859626 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/69464400-c61c-41bd-aeeb-984f7f948a16-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 23:27:10 crc kubenswrapper[4795]: I0219 23:27:10.859641 4795 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/69464400-c61c-41bd-aeeb-984f7f948a16-ceph\") on node \"crc\" DevicePath \"\"" Feb 19 23:27:10 crc kubenswrapper[4795]: I0219 23:27:10.859651 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8z879\" (UniqueName: \"kubernetes.io/projected/69464400-c61c-41bd-aeeb-984f7f948a16-kube-api-access-8z879\") on node \"crc\" DevicePath \"\"" Feb 19 23:27:11 crc kubenswrapper[4795]: I0219 23:27:11.192763 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-wfcx4" event={"ID":"69464400-c61c-41bd-aeeb-984f7f948a16","Type":"ContainerDied","Data":"5a15f244c3f5001d78c7272f8f15efa1a2a5ab014c4d21f43ab37cd73349c5a7"} Feb 19 23:27:11 crc kubenswrapper[4795]: I0219 23:27:11.192813 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a15f244c3f5001d78c7272f8f15efa1a2a5ab014c4d21f43ab37cd73349c5a7" Feb 19 23:27:11 crc kubenswrapper[4795]: I0219 23:27:11.192826 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-wfcx4" Feb 19 23:27:11 crc kubenswrapper[4795]: I0219 23:27:11.279444 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-2twf8"] Feb 19 23:27:11 crc kubenswrapper[4795]: E0219 23:27:11.280150 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c5b2e93-bfd6-409c-955a-78f80b984a11" containerName="extract-content" Feb 19 23:27:11 crc kubenswrapper[4795]: I0219 23:27:11.280187 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c5b2e93-bfd6-409c-955a-78f80b984a11" containerName="extract-content" Feb 19 23:27:11 crc kubenswrapper[4795]: E0219 23:27:11.280212 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c5b2e93-bfd6-409c-955a-78f80b984a11" containerName="registry-server" Feb 19 23:27:11 crc kubenswrapper[4795]: I0219 23:27:11.280220 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c5b2e93-bfd6-409c-955a-78f80b984a11" containerName="registry-server" Feb 19 23:27:11 crc kubenswrapper[4795]: E0219 23:27:11.280266 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c5b2e93-bfd6-409c-955a-78f80b984a11" containerName="extract-utilities" Feb 19 23:27:11 crc kubenswrapper[4795]: I0219 23:27:11.280276 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c5b2e93-bfd6-409c-955a-78f80b984a11" containerName="extract-utilities" Feb 19 23:27:11 crc kubenswrapper[4795]: E0219 23:27:11.280301 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69464400-c61c-41bd-aeeb-984f7f948a16" containerName="download-cache-openstack-openstack-cell1" Feb 19 23:27:11 crc kubenswrapper[4795]: I0219 23:27:11.280308 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="69464400-c61c-41bd-aeeb-984f7f948a16" containerName="download-cache-openstack-openstack-cell1" Feb 19 23:27:11 crc kubenswrapper[4795]: I0219 23:27:11.280530 4795 
memory_manager.go:354] "RemoveStaleState removing state" podUID="69464400-c61c-41bd-aeeb-984f7f948a16" containerName="download-cache-openstack-openstack-cell1" Feb 19 23:27:11 crc kubenswrapper[4795]: I0219 23:27:11.280547 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c5b2e93-bfd6-409c-955a-78f80b984a11" containerName="registry-server" Feb 19 23:27:11 crc kubenswrapper[4795]: I0219 23:27:11.281683 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-2twf8" Feb 19 23:27:11 crc kubenswrapper[4795]: I0219 23:27:11.284040 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 19 23:27:11 crc kubenswrapper[4795]: I0219 23:27:11.284315 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-brdnv" Feb 19 23:27:11 crc kubenswrapper[4795]: I0219 23:27:11.287222 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 23:27:11 crc kubenswrapper[4795]: I0219 23:27:11.288130 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 19 23:27:11 crc kubenswrapper[4795]: I0219 23:27:11.291284 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-2twf8"] Feb 19 23:27:11 crc kubenswrapper[4795]: I0219 23:27:11.369049 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b628n\" (UniqueName: \"kubernetes.io/projected/5a293bce-3326-47c0-a9b5-b5af13dc46c8-kube-api-access-b628n\") pod \"configure-network-openstack-openstack-cell1-2twf8\" (UID: \"5a293bce-3326-47c0-a9b5-b5af13dc46c8\") " pod="openstack/configure-network-openstack-openstack-cell1-2twf8" Feb 19 23:27:11 crc kubenswrapper[4795]: I0219 23:27:11.369257 4795 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5a293bce-3326-47c0-a9b5-b5af13dc46c8-ceph\") pod \"configure-network-openstack-openstack-cell1-2twf8\" (UID: \"5a293bce-3326-47c0-a9b5-b5af13dc46c8\") " pod="openstack/configure-network-openstack-openstack-cell1-2twf8" Feb 19 23:27:11 crc kubenswrapper[4795]: I0219 23:27:11.369402 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5a293bce-3326-47c0-a9b5-b5af13dc46c8-ssh-key-openstack-cell1\") pod \"configure-network-openstack-openstack-cell1-2twf8\" (UID: \"5a293bce-3326-47c0-a9b5-b5af13dc46c8\") " pod="openstack/configure-network-openstack-openstack-cell1-2twf8" Feb 19 23:27:11 crc kubenswrapper[4795]: I0219 23:27:11.369641 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5a293bce-3326-47c0-a9b5-b5af13dc46c8-inventory\") pod \"configure-network-openstack-openstack-cell1-2twf8\" (UID: \"5a293bce-3326-47c0-a9b5-b5af13dc46c8\") " pod="openstack/configure-network-openstack-openstack-cell1-2twf8" Feb 19 23:27:11 crc kubenswrapper[4795]: I0219 23:27:11.473247 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b628n\" (UniqueName: \"kubernetes.io/projected/5a293bce-3326-47c0-a9b5-b5af13dc46c8-kube-api-access-b628n\") pod \"configure-network-openstack-openstack-cell1-2twf8\" (UID: \"5a293bce-3326-47c0-a9b5-b5af13dc46c8\") " pod="openstack/configure-network-openstack-openstack-cell1-2twf8" Feb 19 23:27:11 crc kubenswrapper[4795]: I0219 23:27:11.473387 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5a293bce-3326-47c0-a9b5-b5af13dc46c8-ceph\") pod \"configure-network-openstack-openstack-cell1-2twf8\" 
(UID: \"5a293bce-3326-47c0-a9b5-b5af13dc46c8\") " pod="openstack/configure-network-openstack-openstack-cell1-2twf8" Feb 19 23:27:11 crc kubenswrapper[4795]: I0219 23:27:11.473453 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5a293bce-3326-47c0-a9b5-b5af13dc46c8-ssh-key-openstack-cell1\") pod \"configure-network-openstack-openstack-cell1-2twf8\" (UID: \"5a293bce-3326-47c0-a9b5-b5af13dc46c8\") " pod="openstack/configure-network-openstack-openstack-cell1-2twf8" Feb 19 23:27:11 crc kubenswrapper[4795]: I0219 23:27:11.473551 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5a293bce-3326-47c0-a9b5-b5af13dc46c8-inventory\") pod \"configure-network-openstack-openstack-cell1-2twf8\" (UID: \"5a293bce-3326-47c0-a9b5-b5af13dc46c8\") " pod="openstack/configure-network-openstack-openstack-cell1-2twf8" Feb 19 23:27:11 crc kubenswrapper[4795]: I0219 23:27:11.478570 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5a293bce-3326-47c0-a9b5-b5af13dc46c8-ceph\") pod \"configure-network-openstack-openstack-cell1-2twf8\" (UID: \"5a293bce-3326-47c0-a9b5-b5af13dc46c8\") " pod="openstack/configure-network-openstack-openstack-cell1-2twf8" Feb 19 23:27:11 crc kubenswrapper[4795]: I0219 23:27:11.479345 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5a293bce-3326-47c0-a9b5-b5af13dc46c8-ssh-key-openstack-cell1\") pod \"configure-network-openstack-openstack-cell1-2twf8\" (UID: \"5a293bce-3326-47c0-a9b5-b5af13dc46c8\") " pod="openstack/configure-network-openstack-openstack-cell1-2twf8" Feb 19 23:27:11 crc kubenswrapper[4795]: I0219 23:27:11.480281 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/5a293bce-3326-47c0-a9b5-b5af13dc46c8-inventory\") pod \"configure-network-openstack-openstack-cell1-2twf8\" (UID: \"5a293bce-3326-47c0-a9b5-b5af13dc46c8\") " pod="openstack/configure-network-openstack-openstack-cell1-2twf8" Feb 19 23:27:11 crc kubenswrapper[4795]: I0219 23:27:11.490209 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b628n\" (UniqueName: \"kubernetes.io/projected/5a293bce-3326-47c0-a9b5-b5af13dc46c8-kube-api-access-b628n\") pod \"configure-network-openstack-openstack-cell1-2twf8\" (UID: \"5a293bce-3326-47c0-a9b5-b5af13dc46c8\") " pod="openstack/configure-network-openstack-openstack-cell1-2twf8" Feb 19 23:27:11 crc kubenswrapper[4795]: I0219 23:27:11.598889 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-2twf8" Feb 19 23:27:12 crc kubenswrapper[4795]: I0219 23:27:12.184997 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-2twf8"] Feb 19 23:27:12 crc kubenswrapper[4795]: I0219 23:27:12.205707 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-2twf8" event={"ID":"5a293bce-3326-47c0-a9b5-b5af13dc46c8","Type":"ContainerStarted","Data":"7e52b4553606d0655b60660a80810958f26b5efc3f788860961635ae7f9bd444"} Feb 19 23:27:13 crc kubenswrapper[4795]: I0219 23:27:13.216897 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-2twf8" event={"ID":"5a293bce-3326-47c0-a9b5-b5af13dc46c8","Type":"ContainerStarted","Data":"359a513d338aa3470a639a702c3ec34b9225f00bb77c08a2396abde50063710a"} Feb 19 23:27:13 crc kubenswrapper[4795]: I0219 23:27:13.246264 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-openstack-openstack-cell1-2twf8" podStartSLOduration=1.701007835 
podStartE2EDuration="2.246218848s" podCreationTimestamp="2026-02-19 23:27:11 +0000 UTC" firstStartedPulling="2026-02-19 23:27:12.194785259 +0000 UTC m=+7143.387303123" lastFinishedPulling="2026-02-19 23:27:12.739996262 +0000 UTC m=+7143.932514136" observedRunningTime="2026-02-19 23:27:13.237144591 +0000 UTC m=+7144.429662505" watchObservedRunningTime="2026-02-19 23:27:13.246218848 +0000 UTC m=+7144.438736722" Feb 19 23:27:22 crc kubenswrapper[4795]: I0219 23:27:22.512716 4795 scope.go:117] "RemoveContainer" containerID="3901cb7ebf5348e8e5959a203f3e6450ab7f3f57a2848ec56227b7861716ef02" Feb 19 23:27:22 crc kubenswrapper[4795]: E0219 23:27:22.513900 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:27:34 crc kubenswrapper[4795]: I0219 23:27:34.525797 4795 scope.go:117] "RemoveContainer" containerID="3901cb7ebf5348e8e5959a203f3e6450ab7f3f57a2848ec56227b7861716ef02" Feb 19 23:27:34 crc kubenswrapper[4795]: E0219 23:27:34.526801 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:27:45 crc kubenswrapper[4795]: I0219 23:27:45.512705 4795 scope.go:117] "RemoveContainer" containerID="3901cb7ebf5348e8e5959a203f3e6450ab7f3f57a2848ec56227b7861716ef02" Feb 19 23:27:45 crc kubenswrapper[4795]: E0219 23:27:45.513652 
4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:27:59 crc kubenswrapper[4795]: I0219 23:27:59.520182 4795 scope.go:117] "RemoveContainer" containerID="3901cb7ebf5348e8e5959a203f3e6450ab7f3f57a2848ec56227b7861716ef02" Feb 19 23:27:59 crc kubenswrapper[4795]: E0219 23:27:59.520959 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:28:14 crc kubenswrapper[4795]: I0219 23:28:14.512541 4795 scope.go:117] "RemoveContainer" containerID="3901cb7ebf5348e8e5959a203f3e6450ab7f3f57a2848ec56227b7861716ef02" Feb 19 23:28:14 crc kubenswrapper[4795]: E0219 23:28:14.513626 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:28:29 crc kubenswrapper[4795]: I0219 23:28:29.517752 4795 scope.go:117] "RemoveContainer" containerID="3901cb7ebf5348e8e5959a203f3e6450ab7f3f57a2848ec56227b7861716ef02" Feb 19 23:28:29 crc kubenswrapper[4795]: E0219 
23:28:29.518555 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:28:40 crc kubenswrapper[4795]: I0219 23:28:40.117948 4795 generic.go:334] "Generic (PLEG): container finished" podID="5a293bce-3326-47c0-a9b5-b5af13dc46c8" containerID="359a513d338aa3470a639a702c3ec34b9225f00bb77c08a2396abde50063710a" exitCode=0 Feb 19 23:28:40 crc kubenswrapper[4795]: I0219 23:28:40.118078 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-2twf8" event={"ID":"5a293bce-3326-47c0-a9b5-b5af13dc46c8","Type":"ContainerDied","Data":"359a513d338aa3470a639a702c3ec34b9225f00bb77c08a2396abde50063710a"} Feb 19 23:28:41 crc kubenswrapper[4795]: I0219 23:28:41.589322 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-2twf8" Feb 19 23:28:41 crc kubenswrapper[4795]: I0219 23:28:41.661700 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5a293bce-3326-47c0-a9b5-b5af13dc46c8-inventory\") pod \"5a293bce-3326-47c0-a9b5-b5af13dc46c8\" (UID: \"5a293bce-3326-47c0-a9b5-b5af13dc46c8\") " Feb 19 23:28:41 crc kubenswrapper[4795]: I0219 23:28:41.662225 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b628n\" (UniqueName: \"kubernetes.io/projected/5a293bce-3326-47c0-a9b5-b5af13dc46c8-kube-api-access-b628n\") pod \"5a293bce-3326-47c0-a9b5-b5af13dc46c8\" (UID: \"5a293bce-3326-47c0-a9b5-b5af13dc46c8\") " Feb 19 23:28:41 crc kubenswrapper[4795]: I0219 23:28:41.662353 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5a293bce-3326-47c0-a9b5-b5af13dc46c8-ceph\") pod \"5a293bce-3326-47c0-a9b5-b5af13dc46c8\" (UID: \"5a293bce-3326-47c0-a9b5-b5af13dc46c8\") " Feb 19 23:28:41 crc kubenswrapper[4795]: I0219 23:28:41.662405 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5a293bce-3326-47c0-a9b5-b5af13dc46c8-ssh-key-openstack-cell1\") pod \"5a293bce-3326-47c0-a9b5-b5af13dc46c8\" (UID: \"5a293bce-3326-47c0-a9b5-b5af13dc46c8\") " Feb 19 23:28:41 crc kubenswrapper[4795]: I0219 23:28:41.668808 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a293bce-3326-47c0-a9b5-b5af13dc46c8-kube-api-access-b628n" (OuterVolumeSpecName: "kube-api-access-b628n") pod "5a293bce-3326-47c0-a9b5-b5af13dc46c8" (UID: "5a293bce-3326-47c0-a9b5-b5af13dc46c8"). InnerVolumeSpecName "kube-api-access-b628n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:28:41 crc kubenswrapper[4795]: I0219 23:28:41.668910 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a293bce-3326-47c0-a9b5-b5af13dc46c8-ceph" (OuterVolumeSpecName: "ceph") pod "5a293bce-3326-47c0-a9b5-b5af13dc46c8" (UID: "5a293bce-3326-47c0-a9b5-b5af13dc46c8"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:28:41 crc kubenswrapper[4795]: I0219 23:28:41.691432 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a293bce-3326-47c0-a9b5-b5af13dc46c8-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "5a293bce-3326-47c0-a9b5-b5af13dc46c8" (UID: "5a293bce-3326-47c0-a9b5-b5af13dc46c8"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:28:41 crc kubenswrapper[4795]: I0219 23:28:41.696637 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a293bce-3326-47c0-a9b5-b5af13dc46c8-inventory" (OuterVolumeSpecName: "inventory") pod "5a293bce-3326-47c0-a9b5-b5af13dc46c8" (UID: "5a293bce-3326-47c0-a9b5-b5af13dc46c8"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:28:41 crc kubenswrapper[4795]: I0219 23:28:41.765907 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5a293bce-3326-47c0-a9b5-b5af13dc46c8-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 23:28:41 crc kubenswrapper[4795]: I0219 23:28:41.765957 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b628n\" (UniqueName: \"kubernetes.io/projected/5a293bce-3326-47c0-a9b5-b5af13dc46c8-kube-api-access-b628n\") on node \"crc\" DevicePath \"\"" Feb 19 23:28:41 crc kubenswrapper[4795]: I0219 23:28:41.765977 4795 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5a293bce-3326-47c0-a9b5-b5af13dc46c8-ceph\") on node \"crc\" DevicePath \"\"" Feb 19 23:28:41 crc kubenswrapper[4795]: I0219 23:28:41.765994 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5a293bce-3326-47c0-a9b5-b5af13dc46c8-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 19 23:28:42 crc kubenswrapper[4795]: I0219 23:28:42.138743 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-2twf8" event={"ID":"5a293bce-3326-47c0-a9b5-b5af13dc46c8","Type":"ContainerDied","Data":"7e52b4553606d0655b60660a80810958f26b5efc3f788860961635ae7f9bd444"} Feb 19 23:28:42 crc kubenswrapper[4795]: I0219 23:28:42.138783 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e52b4553606d0655b60660a80810958f26b5efc3f788860961635ae7f9bd444" Feb 19 23:28:42 crc kubenswrapper[4795]: I0219 23:28:42.139567 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-2twf8" Feb 19 23:28:42 crc kubenswrapper[4795]: I0219 23:28:42.230393 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-44t2w"] Feb 19 23:28:42 crc kubenswrapper[4795]: E0219 23:28:42.230962 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a293bce-3326-47c0-a9b5-b5af13dc46c8" containerName="configure-network-openstack-openstack-cell1" Feb 19 23:28:42 crc kubenswrapper[4795]: I0219 23:28:42.230991 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a293bce-3326-47c0-a9b5-b5af13dc46c8" containerName="configure-network-openstack-openstack-cell1" Feb 19 23:28:42 crc kubenswrapper[4795]: I0219 23:28:42.231283 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a293bce-3326-47c0-a9b5-b5af13dc46c8" containerName="configure-network-openstack-openstack-cell1" Feb 19 23:28:42 crc kubenswrapper[4795]: I0219 23:28:42.232002 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-44t2w" Feb 19 23:28:42 crc kubenswrapper[4795]: I0219 23:28:42.234536 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-brdnv" Feb 19 23:28:42 crc kubenswrapper[4795]: I0219 23:28:42.238340 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 19 23:28:42 crc kubenswrapper[4795]: I0219 23:28:42.238646 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 19 23:28:42 crc kubenswrapper[4795]: I0219 23:28:42.242667 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 23:28:42 crc kubenswrapper[4795]: I0219 23:28:42.255731 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-44t2w"] Feb 19 23:28:42 crc kubenswrapper[4795]: I0219 23:28:42.279857 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/94dbf6be-911e-46d9-a950-fa19fa137490-inventory\") pod \"validate-network-openstack-openstack-cell1-44t2w\" (UID: \"94dbf6be-911e-46d9-a950-fa19fa137490\") " pod="openstack/validate-network-openstack-openstack-cell1-44t2w" Feb 19 23:28:42 crc kubenswrapper[4795]: I0219 23:28:42.279930 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/94dbf6be-911e-46d9-a950-fa19fa137490-ceph\") pod \"validate-network-openstack-openstack-cell1-44t2w\" (UID: \"94dbf6be-911e-46d9-a950-fa19fa137490\") " pod="openstack/validate-network-openstack-openstack-cell1-44t2w" Feb 19 23:28:42 crc kubenswrapper[4795]: I0219 23:28:42.280084 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-tpgbv\" (UniqueName: \"kubernetes.io/projected/94dbf6be-911e-46d9-a950-fa19fa137490-kube-api-access-tpgbv\") pod \"validate-network-openstack-openstack-cell1-44t2w\" (UID: \"94dbf6be-911e-46d9-a950-fa19fa137490\") " pod="openstack/validate-network-openstack-openstack-cell1-44t2w" Feb 19 23:28:42 crc kubenswrapper[4795]: I0219 23:28:42.280117 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/94dbf6be-911e-46d9-a950-fa19fa137490-ssh-key-openstack-cell1\") pod \"validate-network-openstack-openstack-cell1-44t2w\" (UID: \"94dbf6be-911e-46d9-a950-fa19fa137490\") " pod="openstack/validate-network-openstack-openstack-cell1-44t2w" Feb 19 23:28:42 crc kubenswrapper[4795]: I0219 23:28:42.382457 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/94dbf6be-911e-46d9-a950-fa19fa137490-inventory\") pod \"validate-network-openstack-openstack-cell1-44t2w\" (UID: \"94dbf6be-911e-46d9-a950-fa19fa137490\") " pod="openstack/validate-network-openstack-openstack-cell1-44t2w" Feb 19 23:28:42 crc kubenswrapper[4795]: I0219 23:28:42.382506 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/94dbf6be-911e-46d9-a950-fa19fa137490-ceph\") pod \"validate-network-openstack-openstack-cell1-44t2w\" (UID: \"94dbf6be-911e-46d9-a950-fa19fa137490\") " pod="openstack/validate-network-openstack-openstack-cell1-44t2w" Feb 19 23:28:42 crc kubenswrapper[4795]: I0219 23:28:42.382597 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpgbv\" (UniqueName: \"kubernetes.io/projected/94dbf6be-911e-46d9-a950-fa19fa137490-kube-api-access-tpgbv\") pod \"validate-network-openstack-openstack-cell1-44t2w\" (UID: \"94dbf6be-911e-46d9-a950-fa19fa137490\") " 
pod="openstack/validate-network-openstack-openstack-cell1-44t2w" Feb 19 23:28:42 crc kubenswrapper[4795]: I0219 23:28:42.382622 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/94dbf6be-911e-46d9-a950-fa19fa137490-ssh-key-openstack-cell1\") pod \"validate-network-openstack-openstack-cell1-44t2w\" (UID: \"94dbf6be-911e-46d9-a950-fa19fa137490\") " pod="openstack/validate-network-openstack-openstack-cell1-44t2w" Feb 19 23:28:42 crc kubenswrapper[4795]: I0219 23:28:42.386907 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/94dbf6be-911e-46d9-a950-fa19fa137490-ceph\") pod \"validate-network-openstack-openstack-cell1-44t2w\" (UID: \"94dbf6be-911e-46d9-a950-fa19fa137490\") " pod="openstack/validate-network-openstack-openstack-cell1-44t2w" Feb 19 23:28:42 crc kubenswrapper[4795]: I0219 23:28:42.387498 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/94dbf6be-911e-46d9-a950-fa19fa137490-ssh-key-openstack-cell1\") pod \"validate-network-openstack-openstack-cell1-44t2w\" (UID: \"94dbf6be-911e-46d9-a950-fa19fa137490\") " pod="openstack/validate-network-openstack-openstack-cell1-44t2w" Feb 19 23:28:42 crc kubenswrapper[4795]: I0219 23:28:42.393834 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/94dbf6be-911e-46d9-a950-fa19fa137490-inventory\") pod \"validate-network-openstack-openstack-cell1-44t2w\" (UID: \"94dbf6be-911e-46d9-a950-fa19fa137490\") " pod="openstack/validate-network-openstack-openstack-cell1-44t2w" Feb 19 23:28:42 crc kubenswrapper[4795]: I0219 23:28:42.398285 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpgbv\" (UniqueName: 
\"kubernetes.io/projected/94dbf6be-911e-46d9-a950-fa19fa137490-kube-api-access-tpgbv\") pod \"validate-network-openstack-openstack-cell1-44t2w\" (UID: \"94dbf6be-911e-46d9-a950-fa19fa137490\") " pod="openstack/validate-network-openstack-openstack-cell1-44t2w" Feb 19 23:28:42 crc kubenswrapper[4795]: I0219 23:28:42.513701 4795 scope.go:117] "RemoveContainer" containerID="3901cb7ebf5348e8e5959a203f3e6450ab7f3f57a2848ec56227b7861716ef02" Feb 19 23:28:42 crc kubenswrapper[4795]: E0219 23:28:42.513999 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:28:42 crc kubenswrapper[4795]: I0219 23:28:42.566536 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-44t2w" Feb 19 23:28:43 crc kubenswrapper[4795]: I0219 23:28:43.126180 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-44t2w"] Feb 19 23:28:43 crc kubenswrapper[4795]: W0219 23:28:43.136593 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod94dbf6be_911e_46d9_a950_fa19fa137490.slice/crio-637d19a2b3792f44bf090f08ca44ccb5090e52c43db85942869e3627dba6ac80 WatchSource:0}: Error finding container 637d19a2b3792f44bf090f08ca44ccb5090e52c43db85942869e3627dba6ac80: Status 404 returned error can't find the container with id 637d19a2b3792f44bf090f08ca44ccb5090e52c43db85942869e3627dba6ac80 Feb 19 23:28:44 crc kubenswrapper[4795]: I0219 23:28:44.164070 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-44t2w" event={"ID":"94dbf6be-911e-46d9-a950-fa19fa137490","Type":"ContainerStarted","Data":"52eb3235fcf364c95d5fa52e43247c7f4457ed2cf97218b6e6ddac7c429f7e1b"} Feb 19 23:28:44 crc kubenswrapper[4795]: I0219 23:28:44.164615 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-44t2w" event={"ID":"94dbf6be-911e-46d9-a950-fa19fa137490","Type":"ContainerStarted","Data":"637d19a2b3792f44bf090f08ca44ccb5090e52c43db85942869e3627dba6ac80"} Feb 19 23:28:44 crc kubenswrapper[4795]: I0219 23:28:44.192921 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-openstack-openstack-cell1-44t2w" podStartSLOduration=1.709258087 podStartE2EDuration="2.192902013s" podCreationTimestamp="2026-02-19 23:28:42 +0000 UTC" firstStartedPulling="2026-02-19 23:28:43.145141349 +0000 UTC m=+7234.337659213" lastFinishedPulling="2026-02-19 23:28:43.628785275 +0000 UTC m=+7234.821303139" observedRunningTime="2026-02-19 
23:28:44.191543005 +0000 UTC m=+7235.384060869" watchObservedRunningTime="2026-02-19 23:28:44.192902013 +0000 UTC m=+7235.385419867" Feb 19 23:28:49 crc kubenswrapper[4795]: I0219 23:28:49.231477 4795 generic.go:334] "Generic (PLEG): container finished" podID="94dbf6be-911e-46d9-a950-fa19fa137490" containerID="52eb3235fcf364c95d5fa52e43247c7f4457ed2cf97218b6e6ddac7c429f7e1b" exitCode=0 Feb 19 23:28:49 crc kubenswrapper[4795]: I0219 23:28:49.231595 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-44t2w" event={"ID":"94dbf6be-911e-46d9-a950-fa19fa137490","Type":"ContainerDied","Data":"52eb3235fcf364c95d5fa52e43247c7f4457ed2cf97218b6e6ddac7c429f7e1b"} Feb 19 23:28:50 crc kubenswrapper[4795]: I0219 23:28:50.887999 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-44t2w" Feb 19 23:28:50 crc kubenswrapper[4795]: I0219 23:28:50.970495 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/94dbf6be-911e-46d9-a950-fa19fa137490-inventory\") pod \"94dbf6be-911e-46d9-a950-fa19fa137490\" (UID: \"94dbf6be-911e-46d9-a950-fa19fa137490\") " Feb 19 23:28:50 crc kubenswrapper[4795]: I0219 23:28:50.970677 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/94dbf6be-911e-46d9-a950-fa19fa137490-ceph\") pod \"94dbf6be-911e-46d9-a950-fa19fa137490\" (UID: \"94dbf6be-911e-46d9-a950-fa19fa137490\") " Feb 19 23:28:50 crc kubenswrapper[4795]: I0219 23:28:50.970721 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/94dbf6be-911e-46d9-a950-fa19fa137490-ssh-key-openstack-cell1\") pod \"94dbf6be-911e-46d9-a950-fa19fa137490\" (UID: \"94dbf6be-911e-46d9-a950-fa19fa137490\") " Feb 19 23:28:50 crc 
kubenswrapper[4795]: I0219 23:28:50.970903 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpgbv\" (UniqueName: \"kubernetes.io/projected/94dbf6be-911e-46d9-a950-fa19fa137490-kube-api-access-tpgbv\") pod \"94dbf6be-911e-46d9-a950-fa19fa137490\" (UID: \"94dbf6be-911e-46d9-a950-fa19fa137490\") " Feb 19 23:28:50 crc kubenswrapper[4795]: I0219 23:28:50.975906 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94dbf6be-911e-46d9-a950-fa19fa137490-ceph" (OuterVolumeSpecName: "ceph") pod "94dbf6be-911e-46d9-a950-fa19fa137490" (UID: "94dbf6be-911e-46d9-a950-fa19fa137490"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:28:50 crc kubenswrapper[4795]: I0219 23:28:50.976454 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94dbf6be-911e-46d9-a950-fa19fa137490-kube-api-access-tpgbv" (OuterVolumeSpecName: "kube-api-access-tpgbv") pod "94dbf6be-911e-46d9-a950-fa19fa137490" (UID: "94dbf6be-911e-46d9-a950-fa19fa137490"). InnerVolumeSpecName "kube-api-access-tpgbv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:28:51 crc kubenswrapper[4795]: I0219 23:28:51.001837 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94dbf6be-911e-46d9-a950-fa19fa137490-inventory" (OuterVolumeSpecName: "inventory") pod "94dbf6be-911e-46d9-a950-fa19fa137490" (UID: "94dbf6be-911e-46d9-a950-fa19fa137490"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:28:51 crc kubenswrapper[4795]: I0219 23:28:51.003749 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94dbf6be-911e-46d9-a950-fa19fa137490-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "94dbf6be-911e-46d9-a950-fa19fa137490" (UID: "94dbf6be-911e-46d9-a950-fa19fa137490"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:28:51 crc kubenswrapper[4795]: I0219 23:28:51.072899 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tpgbv\" (UniqueName: \"kubernetes.io/projected/94dbf6be-911e-46d9-a950-fa19fa137490-kube-api-access-tpgbv\") on node \"crc\" DevicePath \"\"" Feb 19 23:28:51 crc kubenswrapper[4795]: I0219 23:28:51.072933 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/94dbf6be-911e-46d9-a950-fa19fa137490-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 23:28:51 crc kubenswrapper[4795]: I0219 23:28:51.072943 4795 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/94dbf6be-911e-46d9-a950-fa19fa137490-ceph\") on node \"crc\" DevicePath \"\"" Feb 19 23:28:51 crc kubenswrapper[4795]: I0219 23:28:51.072952 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/94dbf6be-911e-46d9-a950-fa19fa137490-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 19 23:28:51 crc kubenswrapper[4795]: I0219 23:28:51.254149 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-44t2w" event={"ID":"94dbf6be-911e-46d9-a950-fa19fa137490","Type":"ContainerDied","Data":"637d19a2b3792f44bf090f08ca44ccb5090e52c43db85942869e3627dba6ac80"} Feb 19 23:28:51 crc kubenswrapper[4795]: I0219 23:28:51.254198 4795 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="637d19a2b3792f44bf090f08ca44ccb5090e52c43db85942869e3627dba6ac80" Feb 19 23:28:51 crc kubenswrapper[4795]: I0219 23:28:51.254248 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-44t2w" Feb 19 23:28:51 crc kubenswrapper[4795]: I0219 23:28:51.347010 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-openstack-openstack-cell1-6ctfc"] Feb 19 23:28:51 crc kubenswrapper[4795]: E0219 23:28:51.347548 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94dbf6be-911e-46d9-a950-fa19fa137490" containerName="validate-network-openstack-openstack-cell1" Feb 19 23:28:51 crc kubenswrapper[4795]: I0219 23:28:51.347566 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="94dbf6be-911e-46d9-a950-fa19fa137490" containerName="validate-network-openstack-openstack-cell1" Feb 19 23:28:51 crc kubenswrapper[4795]: I0219 23:28:51.347832 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="94dbf6be-911e-46d9-a950-fa19fa137490" containerName="validate-network-openstack-openstack-cell1" Feb 19 23:28:51 crc kubenswrapper[4795]: I0219 23:28:51.348915 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-6ctfc" Feb 19 23:28:51 crc kubenswrapper[4795]: I0219 23:28:51.353759 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-brdnv" Feb 19 23:28:51 crc kubenswrapper[4795]: I0219 23:28:51.354126 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 19 23:28:51 crc kubenswrapper[4795]: I0219 23:28:51.354463 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 19 23:28:51 crc kubenswrapper[4795]: I0219 23:28:51.354680 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 23:28:51 crc kubenswrapper[4795]: I0219 23:28:51.369488 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-6ctfc"] Feb 19 23:28:51 crc kubenswrapper[4795]: I0219 23:28:51.384981 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/41df3556-7d70-47f5-bd79-bec74fbd269c-ssh-key-openstack-cell1\") pod \"install-os-openstack-openstack-cell1-6ctfc\" (UID: \"41df3556-7d70-47f5-bd79-bec74fbd269c\") " pod="openstack/install-os-openstack-openstack-cell1-6ctfc" Feb 19 23:28:51 crc kubenswrapper[4795]: I0219 23:28:51.385441 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74wxd\" (UniqueName: \"kubernetes.io/projected/41df3556-7d70-47f5-bd79-bec74fbd269c-kube-api-access-74wxd\") pod \"install-os-openstack-openstack-cell1-6ctfc\" (UID: \"41df3556-7d70-47f5-bd79-bec74fbd269c\") " pod="openstack/install-os-openstack-openstack-cell1-6ctfc" Feb 19 23:28:51 crc kubenswrapper[4795]: I0219 23:28:51.385494 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/41df3556-7d70-47f5-bd79-bec74fbd269c-inventory\") pod \"install-os-openstack-openstack-cell1-6ctfc\" (UID: \"41df3556-7d70-47f5-bd79-bec74fbd269c\") " pod="openstack/install-os-openstack-openstack-cell1-6ctfc" Feb 19 23:28:51 crc kubenswrapper[4795]: I0219 23:28:51.385568 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/41df3556-7d70-47f5-bd79-bec74fbd269c-ceph\") pod \"install-os-openstack-openstack-cell1-6ctfc\" (UID: \"41df3556-7d70-47f5-bd79-bec74fbd269c\") " pod="openstack/install-os-openstack-openstack-cell1-6ctfc" Feb 19 23:28:51 crc kubenswrapper[4795]: I0219 23:28:51.487559 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74wxd\" (UniqueName: \"kubernetes.io/projected/41df3556-7d70-47f5-bd79-bec74fbd269c-kube-api-access-74wxd\") pod \"install-os-openstack-openstack-cell1-6ctfc\" (UID: \"41df3556-7d70-47f5-bd79-bec74fbd269c\") " pod="openstack/install-os-openstack-openstack-cell1-6ctfc" Feb 19 23:28:51 crc kubenswrapper[4795]: I0219 23:28:51.487614 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/41df3556-7d70-47f5-bd79-bec74fbd269c-inventory\") pod \"install-os-openstack-openstack-cell1-6ctfc\" (UID: \"41df3556-7d70-47f5-bd79-bec74fbd269c\") " pod="openstack/install-os-openstack-openstack-cell1-6ctfc" Feb 19 23:28:51 crc kubenswrapper[4795]: I0219 23:28:51.487654 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/41df3556-7d70-47f5-bd79-bec74fbd269c-ceph\") pod \"install-os-openstack-openstack-cell1-6ctfc\" (UID: \"41df3556-7d70-47f5-bd79-bec74fbd269c\") " pod="openstack/install-os-openstack-openstack-cell1-6ctfc" Feb 19 23:28:51 crc 
kubenswrapper[4795]: I0219 23:28:51.487733 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/41df3556-7d70-47f5-bd79-bec74fbd269c-ssh-key-openstack-cell1\") pod \"install-os-openstack-openstack-cell1-6ctfc\" (UID: \"41df3556-7d70-47f5-bd79-bec74fbd269c\") " pod="openstack/install-os-openstack-openstack-cell1-6ctfc" Feb 19 23:28:51 crc kubenswrapper[4795]: I0219 23:28:51.491474 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/41df3556-7d70-47f5-bd79-bec74fbd269c-ssh-key-openstack-cell1\") pod \"install-os-openstack-openstack-cell1-6ctfc\" (UID: \"41df3556-7d70-47f5-bd79-bec74fbd269c\") " pod="openstack/install-os-openstack-openstack-cell1-6ctfc" Feb 19 23:28:51 crc kubenswrapper[4795]: I0219 23:28:51.492130 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/41df3556-7d70-47f5-bd79-bec74fbd269c-inventory\") pod \"install-os-openstack-openstack-cell1-6ctfc\" (UID: \"41df3556-7d70-47f5-bd79-bec74fbd269c\") " pod="openstack/install-os-openstack-openstack-cell1-6ctfc" Feb 19 23:28:51 crc kubenswrapper[4795]: I0219 23:28:51.492508 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/41df3556-7d70-47f5-bd79-bec74fbd269c-ceph\") pod \"install-os-openstack-openstack-cell1-6ctfc\" (UID: \"41df3556-7d70-47f5-bd79-bec74fbd269c\") " pod="openstack/install-os-openstack-openstack-cell1-6ctfc" Feb 19 23:28:51 crc kubenswrapper[4795]: I0219 23:28:51.507850 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74wxd\" (UniqueName: \"kubernetes.io/projected/41df3556-7d70-47f5-bd79-bec74fbd269c-kube-api-access-74wxd\") pod \"install-os-openstack-openstack-cell1-6ctfc\" (UID: \"41df3556-7d70-47f5-bd79-bec74fbd269c\") " 
pod="openstack/install-os-openstack-openstack-cell1-6ctfc" Feb 19 23:28:51 crc kubenswrapper[4795]: I0219 23:28:51.676380 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-6ctfc" Feb 19 23:28:52 crc kubenswrapper[4795]: I0219 23:28:52.230955 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-6ctfc"] Feb 19 23:28:52 crc kubenswrapper[4795]: I0219 23:28:52.267895 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-6ctfc" event={"ID":"41df3556-7d70-47f5-bd79-bec74fbd269c","Type":"ContainerStarted","Data":"24420da62070a51e6a7078c855f70d8ac227fa3cb1925b98155c54146d41d372"} Feb 19 23:28:53 crc kubenswrapper[4795]: I0219 23:28:53.277036 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-6ctfc" event={"ID":"41df3556-7d70-47f5-bd79-bec74fbd269c","Type":"ContainerStarted","Data":"32f851236caf6b2bbe224e448255cea34c1cbbfdd587e7b66125b4e5133ed565"} Feb 19 23:28:53 crc kubenswrapper[4795]: I0219 23:28:53.296545 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-openstack-openstack-cell1-6ctfc" podStartSLOduration=1.898797467 podStartE2EDuration="2.296526945s" podCreationTimestamp="2026-02-19 23:28:51 +0000 UTC" firstStartedPulling="2026-02-19 23:28:52.25618306 +0000 UTC m=+7243.448700924" lastFinishedPulling="2026-02-19 23:28:52.653912538 +0000 UTC m=+7243.846430402" observedRunningTime="2026-02-19 23:28:53.293332214 +0000 UTC m=+7244.485850098" watchObservedRunningTime="2026-02-19 23:28:53.296526945 +0000 UTC m=+7244.489044809" Feb 19 23:28:56 crc kubenswrapper[4795]: I0219 23:28:56.511842 4795 scope.go:117] "RemoveContainer" containerID="3901cb7ebf5348e8e5959a203f3e6450ab7f3f57a2848ec56227b7861716ef02" Feb 19 23:28:56 crc kubenswrapper[4795]: E0219 23:28:56.512915 4795 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:29:07 crc kubenswrapper[4795]: I0219 23:29:07.512604 4795 scope.go:117] "RemoveContainer" containerID="3901cb7ebf5348e8e5959a203f3e6450ab7f3f57a2848ec56227b7861716ef02" Feb 19 23:29:07 crc kubenswrapper[4795]: E0219 23:29:07.513768 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:29:21 crc kubenswrapper[4795]: I0219 23:29:21.512467 4795 scope.go:117] "RemoveContainer" containerID="3901cb7ebf5348e8e5959a203f3e6450ab7f3f57a2848ec56227b7861716ef02" Feb 19 23:29:21 crc kubenswrapper[4795]: E0219 23:29:21.518087 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:29:34 crc kubenswrapper[4795]: I0219 23:29:34.512236 4795 scope.go:117] "RemoveContainer" containerID="3901cb7ebf5348e8e5959a203f3e6450ab7f3f57a2848ec56227b7861716ef02" Feb 19 23:29:34 crc kubenswrapper[4795]: E0219 23:29:34.512983 4795 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:29:36 crc kubenswrapper[4795]: I0219 23:29:36.707613 4795 generic.go:334] "Generic (PLEG): container finished" podID="41df3556-7d70-47f5-bd79-bec74fbd269c" containerID="32f851236caf6b2bbe224e448255cea34c1cbbfdd587e7b66125b4e5133ed565" exitCode=0 Feb 19 23:29:36 crc kubenswrapper[4795]: I0219 23:29:36.707688 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-6ctfc" event={"ID":"41df3556-7d70-47f5-bd79-bec74fbd269c","Type":"ContainerDied","Data":"32f851236caf6b2bbe224e448255cea34c1cbbfdd587e7b66125b4e5133ed565"} Feb 19 23:29:38 crc kubenswrapper[4795]: I0219 23:29:38.222891 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-6ctfc" Feb 19 23:29:38 crc kubenswrapper[4795]: I0219 23:29:38.290840 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/41df3556-7d70-47f5-bd79-bec74fbd269c-ssh-key-openstack-cell1\") pod \"41df3556-7d70-47f5-bd79-bec74fbd269c\" (UID: \"41df3556-7d70-47f5-bd79-bec74fbd269c\") " Feb 19 23:29:38 crc kubenswrapper[4795]: I0219 23:29:38.290888 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/41df3556-7d70-47f5-bd79-bec74fbd269c-inventory\") pod \"41df3556-7d70-47f5-bd79-bec74fbd269c\" (UID: \"41df3556-7d70-47f5-bd79-bec74fbd269c\") " Feb 19 23:29:38 crc kubenswrapper[4795]: I0219 23:29:38.290996 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/41df3556-7d70-47f5-bd79-bec74fbd269c-ceph\") pod \"41df3556-7d70-47f5-bd79-bec74fbd269c\" (UID: \"41df3556-7d70-47f5-bd79-bec74fbd269c\") " Feb 19 23:29:38 crc kubenswrapper[4795]: I0219 23:29:38.291131 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74wxd\" (UniqueName: \"kubernetes.io/projected/41df3556-7d70-47f5-bd79-bec74fbd269c-kube-api-access-74wxd\") pod \"41df3556-7d70-47f5-bd79-bec74fbd269c\" (UID: \"41df3556-7d70-47f5-bd79-bec74fbd269c\") " Feb 19 23:29:38 crc kubenswrapper[4795]: I0219 23:29:38.296080 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41df3556-7d70-47f5-bd79-bec74fbd269c-kube-api-access-74wxd" (OuterVolumeSpecName: "kube-api-access-74wxd") pod "41df3556-7d70-47f5-bd79-bec74fbd269c" (UID: "41df3556-7d70-47f5-bd79-bec74fbd269c"). InnerVolumeSpecName "kube-api-access-74wxd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:29:38 crc kubenswrapper[4795]: I0219 23:29:38.298302 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41df3556-7d70-47f5-bd79-bec74fbd269c-ceph" (OuterVolumeSpecName: "ceph") pod "41df3556-7d70-47f5-bd79-bec74fbd269c" (UID: "41df3556-7d70-47f5-bd79-bec74fbd269c"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:29:38 crc kubenswrapper[4795]: I0219 23:29:38.320468 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41df3556-7d70-47f5-bd79-bec74fbd269c-inventory" (OuterVolumeSpecName: "inventory") pod "41df3556-7d70-47f5-bd79-bec74fbd269c" (UID: "41df3556-7d70-47f5-bd79-bec74fbd269c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:29:38 crc kubenswrapper[4795]: I0219 23:29:38.320590 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41df3556-7d70-47f5-bd79-bec74fbd269c-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "41df3556-7d70-47f5-bd79-bec74fbd269c" (UID: "41df3556-7d70-47f5-bd79-bec74fbd269c"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:29:38 crc kubenswrapper[4795]: I0219 23:29:38.393376 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/41df3556-7d70-47f5-bd79-bec74fbd269c-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 19 23:29:38 crc kubenswrapper[4795]: I0219 23:29:38.393581 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/41df3556-7d70-47f5-bd79-bec74fbd269c-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 23:29:38 crc kubenswrapper[4795]: I0219 23:29:38.393661 4795 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/41df3556-7d70-47f5-bd79-bec74fbd269c-ceph\") on node \"crc\" DevicePath \"\"" Feb 19 23:29:38 crc kubenswrapper[4795]: I0219 23:29:38.393728 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74wxd\" (UniqueName: \"kubernetes.io/projected/41df3556-7d70-47f5-bd79-bec74fbd269c-kube-api-access-74wxd\") on node \"crc\" DevicePath \"\"" Feb 19 23:29:38 crc kubenswrapper[4795]: I0219 23:29:38.723889 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-6ctfc" event={"ID":"41df3556-7d70-47f5-bd79-bec74fbd269c","Type":"ContainerDied","Data":"24420da62070a51e6a7078c855f70d8ac227fa3cb1925b98155c54146d41d372"} Feb 19 23:29:38 crc kubenswrapper[4795]: I0219 23:29:38.723925 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="24420da62070a51e6a7078c855f70d8ac227fa3cb1925b98155c54146d41d372" Feb 19 23:29:38 crc kubenswrapper[4795]: I0219 23:29:38.723928 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-6ctfc" Feb 19 23:29:38 crc kubenswrapper[4795]: I0219 23:29:38.819289 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-lbssf"] Feb 19 23:29:38 crc kubenswrapper[4795]: E0219 23:29:38.819691 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41df3556-7d70-47f5-bd79-bec74fbd269c" containerName="install-os-openstack-openstack-cell1" Feb 19 23:29:38 crc kubenswrapper[4795]: I0219 23:29:38.819709 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="41df3556-7d70-47f5-bd79-bec74fbd269c" containerName="install-os-openstack-openstack-cell1" Feb 19 23:29:38 crc kubenswrapper[4795]: I0219 23:29:38.819932 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="41df3556-7d70-47f5-bd79-bec74fbd269c" containerName="install-os-openstack-openstack-cell1" Feb 19 23:29:38 crc kubenswrapper[4795]: I0219 23:29:38.820861 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-lbssf" Feb 19 23:29:38 crc kubenswrapper[4795]: I0219 23:29:38.823239 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 19 23:29:38 crc kubenswrapper[4795]: I0219 23:29:38.824189 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-brdnv" Feb 19 23:29:38 crc kubenswrapper[4795]: I0219 23:29:38.824772 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 19 23:29:38 crc kubenswrapper[4795]: I0219 23:29:38.825701 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 23:29:38 crc kubenswrapper[4795]: I0219 23:29:38.830213 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-lbssf"] Feb 19 23:29:38 crc kubenswrapper[4795]: I0219 23:29:38.904528 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqhfz\" (UniqueName: \"kubernetes.io/projected/0d850dd7-a1bb-42db-893b-b96eebee4c9c-kube-api-access-wqhfz\") pod \"configure-os-openstack-openstack-cell1-lbssf\" (UID: \"0d850dd7-a1bb-42db-893b-b96eebee4c9c\") " pod="openstack/configure-os-openstack-openstack-cell1-lbssf" Feb 19 23:29:38 crc kubenswrapper[4795]: I0219 23:29:38.904903 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0d850dd7-a1bb-42db-893b-b96eebee4c9c-ceph\") pod \"configure-os-openstack-openstack-cell1-lbssf\" (UID: \"0d850dd7-a1bb-42db-893b-b96eebee4c9c\") " pod="openstack/configure-os-openstack-openstack-cell1-lbssf" Feb 19 23:29:38 crc kubenswrapper[4795]: I0219 23:29:38.904937 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0d850dd7-a1bb-42db-893b-b96eebee4c9c-ssh-key-openstack-cell1\") pod \"configure-os-openstack-openstack-cell1-lbssf\" (UID: \"0d850dd7-a1bb-42db-893b-b96eebee4c9c\") " pod="openstack/configure-os-openstack-openstack-cell1-lbssf" Feb 19 23:29:38 crc kubenswrapper[4795]: I0219 23:29:38.904964 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0d850dd7-a1bb-42db-893b-b96eebee4c9c-inventory\") pod \"configure-os-openstack-openstack-cell1-lbssf\" (UID: \"0d850dd7-a1bb-42db-893b-b96eebee4c9c\") " pod="openstack/configure-os-openstack-openstack-cell1-lbssf" Feb 19 23:29:39 crc kubenswrapper[4795]: I0219 23:29:39.006386 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0d850dd7-a1bb-42db-893b-b96eebee4c9c-ssh-key-openstack-cell1\") pod \"configure-os-openstack-openstack-cell1-lbssf\" (UID: \"0d850dd7-a1bb-42db-893b-b96eebee4c9c\") " pod="openstack/configure-os-openstack-openstack-cell1-lbssf" Feb 19 23:29:39 crc kubenswrapper[4795]: I0219 23:29:39.006447 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0d850dd7-a1bb-42db-893b-b96eebee4c9c-inventory\") pod \"configure-os-openstack-openstack-cell1-lbssf\" (UID: \"0d850dd7-a1bb-42db-893b-b96eebee4c9c\") " pod="openstack/configure-os-openstack-openstack-cell1-lbssf" Feb 19 23:29:39 crc kubenswrapper[4795]: I0219 23:29:39.006553 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqhfz\" (UniqueName: \"kubernetes.io/projected/0d850dd7-a1bb-42db-893b-b96eebee4c9c-kube-api-access-wqhfz\") pod \"configure-os-openstack-openstack-cell1-lbssf\" (UID: \"0d850dd7-a1bb-42db-893b-b96eebee4c9c\") " pod="openstack/configure-os-openstack-openstack-cell1-lbssf" 
Feb 19 23:29:39 crc kubenswrapper[4795]: I0219 23:29:39.006666 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0d850dd7-a1bb-42db-893b-b96eebee4c9c-ceph\") pod \"configure-os-openstack-openstack-cell1-lbssf\" (UID: \"0d850dd7-a1bb-42db-893b-b96eebee4c9c\") " pod="openstack/configure-os-openstack-openstack-cell1-lbssf" Feb 19 23:29:39 crc kubenswrapper[4795]: I0219 23:29:39.013865 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0d850dd7-a1bb-42db-893b-b96eebee4c9c-inventory\") pod \"configure-os-openstack-openstack-cell1-lbssf\" (UID: \"0d850dd7-a1bb-42db-893b-b96eebee4c9c\") " pod="openstack/configure-os-openstack-openstack-cell1-lbssf" Feb 19 23:29:39 crc kubenswrapper[4795]: I0219 23:29:39.014179 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0d850dd7-a1bb-42db-893b-b96eebee4c9c-ssh-key-openstack-cell1\") pod \"configure-os-openstack-openstack-cell1-lbssf\" (UID: \"0d850dd7-a1bb-42db-893b-b96eebee4c9c\") " pod="openstack/configure-os-openstack-openstack-cell1-lbssf" Feb 19 23:29:39 crc kubenswrapper[4795]: I0219 23:29:39.021347 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0d850dd7-a1bb-42db-893b-b96eebee4c9c-ceph\") pod \"configure-os-openstack-openstack-cell1-lbssf\" (UID: \"0d850dd7-a1bb-42db-893b-b96eebee4c9c\") " pod="openstack/configure-os-openstack-openstack-cell1-lbssf" Feb 19 23:29:39 crc kubenswrapper[4795]: I0219 23:29:39.027791 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqhfz\" (UniqueName: \"kubernetes.io/projected/0d850dd7-a1bb-42db-893b-b96eebee4c9c-kube-api-access-wqhfz\") pod \"configure-os-openstack-openstack-cell1-lbssf\" (UID: \"0d850dd7-a1bb-42db-893b-b96eebee4c9c\") " 
pod="openstack/configure-os-openstack-openstack-cell1-lbssf" Feb 19 23:29:39 crc kubenswrapper[4795]: I0219 23:29:39.139585 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-lbssf" Feb 19 23:29:39 crc kubenswrapper[4795]: W0219 23:29:39.756476 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d850dd7_a1bb_42db_893b_b96eebee4c9c.slice/crio-1db73399971c5dcbeab910f2434559109ff2b9046cd72f0acc08f2f83a413b30 WatchSource:0}: Error finding container 1db73399971c5dcbeab910f2434559109ff2b9046cd72f0acc08f2f83a413b30: Status 404 returned error can't find the container with id 1db73399971c5dcbeab910f2434559109ff2b9046cd72f0acc08f2f83a413b30 Feb 19 23:29:39 crc kubenswrapper[4795]: I0219 23:29:39.758306 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-lbssf"] Feb 19 23:29:40 crc kubenswrapper[4795]: I0219 23:29:40.741354 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-lbssf" event={"ID":"0d850dd7-a1bb-42db-893b-b96eebee4c9c","Type":"ContainerStarted","Data":"c9f6543c31eab5a72ff02a75a7c4ec250b61feb6ce31b2ae3d2ad84bde1fb437"} Feb 19 23:29:40 crc kubenswrapper[4795]: I0219 23:29:40.741712 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-lbssf" event={"ID":"0d850dd7-a1bb-42db-893b-b96eebee4c9c","Type":"ContainerStarted","Data":"1db73399971c5dcbeab910f2434559109ff2b9046cd72f0acc08f2f83a413b30"} Feb 19 23:29:40 crc kubenswrapper[4795]: I0219 23:29:40.782030 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-openstack-openstack-cell1-lbssf" podStartSLOduration=2.358657315 podStartE2EDuration="2.78201064s" podCreationTimestamp="2026-02-19 23:29:38 +0000 UTC" firstStartedPulling="2026-02-19 
23:29:39.759883952 +0000 UTC m=+7290.952401826" lastFinishedPulling="2026-02-19 23:29:40.183237287 +0000 UTC m=+7291.375755151" observedRunningTime="2026-02-19 23:29:40.754502429 +0000 UTC m=+7291.947020293" watchObservedRunningTime="2026-02-19 23:29:40.78201064 +0000 UTC m=+7291.974528504" Feb 19 23:29:48 crc kubenswrapper[4795]: I0219 23:29:48.511959 4795 scope.go:117] "RemoveContainer" containerID="3901cb7ebf5348e8e5959a203f3e6450ab7f3f57a2848ec56227b7861716ef02" Feb 19 23:29:48 crc kubenswrapper[4795]: E0219 23:29:48.512670 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:30:00 crc kubenswrapper[4795]: I0219 23:30:00.153893 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525730-mzs4n"] Feb 19 23:30:00 crc kubenswrapper[4795]: I0219 23:30:00.156735 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525730-mzs4n" Feb 19 23:30:00 crc kubenswrapper[4795]: I0219 23:30:00.158840 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 19 23:30:00 crc kubenswrapper[4795]: I0219 23:30:00.159188 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 19 23:30:00 crc kubenswrapper[4795]: I0219 23:30:00.162652 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525730-mzs4n"] Feb 19 23:30:00 crc kubenswrapper[4795]: I0219 23:30:00.228934 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4ac63769-08cc-4f7e-9015-59c2df94d39f-secret-volume\") pod \"collect-profiles-29525730-mzs4n\" (UID: \"4ac63769-08cc-4f7e-9015-59c2df94d39f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525730-mzs4n" Feb 19 23:30:00 crc kubenswrapper[4795]: I0219 23:30:00.229117 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4ac63769-08cc-4f7e-9015-59c2df94d39f-config-volume\") pod \"collect-profiles-29525730-mzs4n\" (UID: \"4ac63769-08cc-4f7e-9015-59c2df94d39f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525730-mzs4n" Feb 19 23:30:00 crc kubenswrapper[4795]: I0219 23:30:00.229266 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4nst\" (UniqueName: \"kubernetes.io/projected/4ac63769-08cc-4f7e-9015-59c2df94d39f-kube-api-access-m4nst\") pod \"collect-profiles-29525730-mzs4n\" (UID: \"4ac63769-08cc-4f7e-9015-59c2df94d39f\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29525730-mzs4n" Feb 19 23:30:00 crc kubenswrapper[4795]: I0219 23:30:00.330882 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4ac63769-08cc-4f7e-9015-59c2df94d39f-config-volume\") pod \"collect-profiles-29525730-mzs4n\" (UID: \"4ac63769-08cc-4f7e-9015-59c2df94d39f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525730-mzs4n" Feb 19 23:30:00 crc kubenswrapper[4795]: I0219 23:30:00.330988 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4nst\" (UniqueName: \"kubernetes.io/projected/4ac63769-08cc-4f7e-9015-59c2df94d39f-kube-api-access-m4nst\") pod \"collect-profiles-29525730-mzs4n\" (UID: \"4ac63769-08cc-4f7e-9015-59c2df94d39f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525730-mzs4n" Feb 19 23:30:00 crc kubenswrapper[4795]: I0219 23:30:00.331043 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4ac63769-08cc-4f7e-9015-59c2df94d39f-secret-volume\") pod \"collect-profiles-29525730-mzs4n\" (UID: \"4ac63769-08cc-4f7e-9015-59c2df94d39f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525730-mzs4n" Feb 19 23:30:00 crc kubenswrapper[4795]: I0219 23:30:00.332040 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4ac63769-08cc-4f7e-9015-59c2df94d39f-config-volume\") pod \"collect-profiles-29525730-mzs4n\" (UID: \"4ac63769-08cc-4f7e-9015-59c2df94d39f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525730-mzs4n" Feb 19 23:30:00 crc kubenswrapper[4795]: I0219 23:30:00.338238 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/4ac63769-08cc-4f7e-9015-59c2df94d39f-secret-volume\") pod \"collect-profiles-29525730-mzs4n\" (UID: \"4ac63769-08cc-4f7e-9015-59c2df94d39f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525730-mzs4n" Feb 19 23:30:00 crc kubenswrapper[4795]: I0219 23:30:00.348791 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4nst\" (UniqueName: \"kubernetes.io/projected/4ac63769-08cc-4f7e-9015-59c2df94d39f-kube-api-access-m4nst\") pod \"collect-profiles-29525730-mzs4n\" (UID: \"4ac63769-08cc-4f7e-9015-59c2df94d39f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525730-mzs4n" Feb 19 23:30:00 crc kubenswrapper[4795]: I0219 23:30:00.530346 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525730-mzs4n" Feb 19 23:30:00 crc kubenswrapper[4795]: I0219 23:30:00.988095 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525730-mzs4n"] Feb 19 23:30:01 crc kubenswrapper[4795]: I0219 23:30:01.942502 4795 generic.go:334] "Generic (PLEG): container finished" podID="4ac63769-08cc-4f7e-9015-59c2df94d39f" containerID="977244ee3537e5139e2881adcfaafbf007340a13c0343e3c88ee23e7e14c9ddf" exitCode=0 Feb 19 23:30:01 crc kubenswrapper[4795]: I0219 23:30:01.942656 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525730-mzs4n" event={"ID":"4ac63769-08cc-4f7e-9015-59c2df94d39f","Type":"ContainerDied","Data":"977244ee3537e5139e2881adcfaafbf007340a13c0343e3c88ee23e7e14c9ddf"} Feb 19 23:30:01 crc kubenswrapper[4795]: I0219 23:30:01.943324 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525730-mzs4n" 
event={"ID":"4ac63769-08cc-4f7e-9015-59c2df94d39f","Type":"ContainerStarted","Data":"28272af00a87c52e2b8764b7a7c16cbbc76ba88c3db82f0f8bfc9356174e7b6b"} Feb 19 23:30:03 crc kubenswrapper[4795]: I0219 23:30:03.313059 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525730-mzs4n" Feb 19 23:30:03 crc kubenswrapper[4795]: I0219 23:30:03.395792 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4nst\" (UniqueName: \"kubernetes.io/projected/4ac63769-08cc-4f7e-9015-59c2df94d39f-kube-api-access-m4nst\") pod \"4ac63769-08cc-4f7e-9015-59c2df94d39f\" (UID: \"4ac63769-08cc-4f7e-9015-59c2df94d39f\") " Feb 19 23:30:03 crc kubenswrapper[4795]: I0219 23:30:03.395895 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4ac63769-08cc-4f7e-9015-59c2df94d39f-secret-volume\") pod \"4ac63769-08cc-4f7e-9015-59c2df94d39f\" (UID: \"4ac63769-08cc-4f7e-9015-59c2df94d39f\") " Feb 19 23:30:03 crc kubenswrapper[4795]: I0219 23:30:03.395946 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4ac63769-08cc-4f7e-9015-59c2df94d39f-config-volume\") pod \"4ac63769-08cc-4f7e-9015-59c2df94d39f\" (UID: \"4ac63769-08cc-4f7e-9015-59c2df94d39f\") " Feb 19 23:30:03 crc kubenswrapper[4795]: I0219 23:30:03.397132 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ac63769-08cc-4f7e-9015-59c2df94d39f-config-volume" (OuterVolumeSpecName: "config-volume") pod "4ac63769-08cc-4f7e-9015-59c2df94d39f" (UID: "4ac63769-08cc-4f7e-9015-59c2df94d39f"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:30:03 crc kubenswrapper[4795]: I0219 23:30:03.401231 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ac63769-08cc-4f7e-9015-59c2df94d39f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "4ac63769-08cc-4f7e-9015-59c2df94d39f" (UID: "4ac63769-08cc-4f7e-9015-59c2df94d39f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:30:03 crc kubenswrapper[4795]: I0219 23:30:03.419754 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ac63769-08cc-4f7e-9015-59c2df94d39f-kube-api-access-m4nst" (OuterVolumeSpecName: "kube-api-access-m4nst") pod "4ac63769-08cc-4f7e-9015-59c2df94d39f" (UID: "4ac63769-08cc-4f7e-9015-59c2df94d39f"). InnerVolumeSpecName "kube-api-access-m4nst". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:30:03 crc kubenswrapper[4795]: I0219 23:30:03.499675 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4nst\" (UniqueName: \"kubernetes.io/projected/4ac63769-08cc-4f7e-9015-59c2df94d39f-kube-api-access-m4nst\") on node \"crc\" DevicePath \"\"" Feb 19 23:30:03 crc kubenswrapper[4795]: I0219 23:30:03.499705 4795 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4ac63769-08cc-4f7e-9015-59c2df94d39f-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 19 23:30:03 crc kubenswrapper[4795]: I0219 23:30:03.499717 4795 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4ac63769-08cc-4f7e-9015-59c2df94d39f-config-volume\") on node \"crc\" DevicePath \"\"" Feb 19 23:30:03 crc kubenswrapper[4795]: I0219 23:30:03.511977 4795 scope.go:117] "RemoveContainer" containerID="3901cb7ebf5348e8e5959a203f3e6450ab7f3f57a2848ec56227b7861716ef02" Feb 19 23:30:03 crc kubenswrapper[4795]: I0219 
23:30:03.969598 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerStarted","Data":"53ab375a2fc3fa1ef244b68bab3723d6db871a1c5fc51b2d8dec9ed6289bc190"} Feb 19 23:30:03 crc kubenswrapper[4795]: I0219 23:30:03.972472 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525730-mzs4n" event={"ID":"4ac63769-08cc-4f7e-9015-59c2df94d39f","Type":"ContainerDied","Data":"28272af00a87c52e2b8764b7a7c16cbbc76ba88c3db82f0f8bfc9356174e7b6b"} Feb 19 23:30:03 crc kubenswrapper[4795]: I0219 23:30:03.972523 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="28272af00a87c52e2b8764b7a7c16cbbc76ba88c3db82f0f8bfc9356174e7b6b" Feb 19 23:30:03 crc kubenswrapper[4795]: I0219 23:30:03.972550 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525730-mzs4n" Feb 19 23:30:04 crc kubenswrapper[4795]: I0219 23:30:04.388251 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525685-qp6fd"] Feb 19 23:30:04 crc kubenswrapper[4795]: I0219 23:30:04.396961 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525685-qp6fd"] Feb 19 23:30:05 crc kubenswrapper[4795]: I0219 23:30:05.532137 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54bacd9c-6bce-433c-972c-3990566baa40" path="/var/lib/kubelet/pods/54bacd9c-6bce-433c-972c-3990566baa40/volumes" Feb 19 23:30:24 crc kubenswrapper[4795]: I0219 23:30:24.145422 4795 generic.go:334] "Generic (PLEG): container finished" podID="0d850dd7-a1bb-42db-893b-b96eebee4c9c" containerID="c9f6543c31eab5a72ff02a75a7c4ec250b61feb6ce31b2ae3d2ad84bde1fb437" exitCode=0 Feb 19 23:30:24 crc 
kubenswrapper[4795]: I0219 23:30:24.145523 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-lbssf" event={"ID":"0d850dd7-a1bb-42db-893b-b96eebee4c9c","Type":"ContainerDied","Data":"c9f6543c31eab5a72ff02a75a7c4ec250b61feb6ce31b2ae3d2ad84bde1fb437"} Feb 19 23:30:25 crc kubenswrapper[4795]: I0219 23:30:25.702984 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-lbssf" Feb 19 23:30:25 crc kubenswrapper[4795]: I0219 23:30:25.802498 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0d850dd7-a1bb-42db-893b-b96eebee4c9c-ssh-key-openstack-cell1\") pod \"0d850dd7-a1bb-42db-893b-b96eebee4c9c\" (UID: \"0d850dd7-a1bb-42db-893b-b96eebee4c9c\") " Feb 19 23:30:25 crc kubenswrapper[4795]: I0219 23:30:25.802589 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0d850dd7-a1bb-42db-893b-b96eebee4c9c-inventory\") pod \"0d850dd7-a1bb-42db-893b-b96eebee4c9c\" (UID: \"0d850dd7-a1bb-42db-893b-b96eebee4c9c\") " Feb 19 23:30:25 crc kubenswrapper[4795]: I0219 23:30:25.802704 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0d850dd7-a1bb-42db-893b-b96eebee4c9c-ceph\") pod \"0d850dd7-a1bb-42db-893b-b96eebee4c9c\" (UID: \"0d850dd7-a1bb-42db-893b-b96eebee4c9c\") " Feb 19 23:30:25 crc kubenswrapper[4795]: I0219 23:30:25.802738 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqhfz\" (UniqueName: \"kubernetes.io/projected/0d850dd7-a1bb-42db-893b-b96eebee4c9c-kube-api-access-wqhfz\") pod \"0d850dd7-a1bb-42db-893b-b96eebee4c9c\" (UID: \"0d850dd7-a1bb-42db-893b-b96eebee4c9c\") " Feb 19 23:30:25 crc kubenswrapper[4795]: I0219 23:30:25.807763 4795 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d850dd7-a1bb-42db-893b-b96eebee4c9c-ceph" (OuterVolumeSpecName: "ceph") pod "0d850dd7-a1bb-42db-893b-b96eebee4c9c" (UID: "0d850dd7-a1bb-42db-893b-b96eebee4c9c"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:30:25 crc kubenswrapper[4795]: I0219 23:30:25.810310 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d850dd7-a1bb-42db-893b-b96eebee4c9c-kube-api-access-wqhfz" (OuterVolumeSpecName: "kube-api-access-wqhfz") pod "0d850dd7-a1bb-42db-893b-b96eebee4c9c" (UID: "0d850dd7-a1bb-42db-893b-b96eebee4c9c"). InnerVolumeSpecName "kube-api-access-wqhfz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:30:25 crc kubenswrapper[4795]: I0219 23:30:25.836426 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d850dd7-a1bb-42db-893b-b96eebee4c9c-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "0d850dd7-a1bb-42db-893b-b96eebee4c9c" (UID: "0d850dd7-a1bb-42db-893b-b96eebee4c9c"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:30:25 crc kubenswrapper[4795]: I0219 23:30:25.837686 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d850dd7-a1bb-42db-893b-b96eebee4c9c-inventory" (OuterVolumeSpecName: "inventory") pod "0d850dd7-a1bb-42db-893b-b96eebee4c9c" (UID: "0d850dd7-a1bb-42db-893b-b96eebee4c9c"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:30:25 crc kubenswrapper[4795]: I0219 23:30:25.905114 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0d850dd7-a1bb-42db-893b-b96eebee4c9c-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 19 23:30:25 crc kubenswrapper[4795]: I0219 23:30:25.905146 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0d850dd7-a1bb-42db-893b-b96eebee4c9c-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 23:30:25 crc kubenswrapper[4795]: I0219 23:30:25.905155 4795 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0d850dd7-a1bb-42db-893b-b96eebee4c9c-ceph\") on node \"crc\" DevicePath \"\"" Feb 19 23:30:25 crc kubenswrapper[4795]: I0219 23:30:25.905179 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wqhfz\" (UniqueName: \"kubernetes.io/projected/0d850dd7-a1bb-42db-893b-b96eebee4c9c-kube-api-access-wqhfz\") on node \"crc\" DevicePath \"\"" Feb 19 23:30:26 crc kubenswrapper[4795]: I0219 23:30:26.174746 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-lbssf" event={"ID":"0d850dd7-a1bb-42db-893b-b96eebee4c9c","Type":"ContainerDied","Data":"1db73399971c5dcbeab910f2434559109ff2b9046cd72f0acc08f2f83a413b30"} Feb 19 23:30:26 crc kubenswrapper[4795]: I0219 23:30:26.174793 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1db73399971c5dcbeab910f2434559109ff2b9046cd72f0acc08f2f83a413b30" Feb 19 23:30:26 crc kubenswrapper[4795]: I0219 23:30:26.174824 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-lbssf" Feb 19 23:30:26 crc kubenswrapper[4795]: I0219 23:30:26.251395 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-openstack-qzhll"] Feb 19 23:30:26 crc kubenswrapper[4795]: E0219 23:30:26.251804 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d850dd7-a1bb-42db-893b-b96eebee4c9c" containerName="configure-os-openstack-openstack-cell1" Feb 19 23:30:26 crc kubenswrapper[4795]: I0219 23:30:26.251819 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d850dd7-a1bb-42db-893b-b96eebee4c9c" containerName="configure-os-openstack-openstack-cell1" Feb 19 23:30:26 crc kubenswrapper[4795]: E0219 23:30:26.251849 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ac63769-08cc-4f7e-9015-59c2df94d39f" containerName="collect-profiles" Feb 19 23:30:26 crc kubenswrapper[4795]: I0219 23:30:26.251855 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ac63769-08cc-4f7e-9015-59c2df94d39f" containerName="collect-profiles" Feb 19 23:30:26 crc kubenswrapper[4795]: I0219 23:30:26.252084 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d850dd7-a1bb-42db-893b-b96eebee4c9c" containerName="configure-os-openstack-openstack-cell1" Feb 19 23:30:26 crc kubenswrapper[4795]: I0219 23:30:26.252107 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ac63769-08cc-4f7e-9015-59c2df94d39f" containerName="collect-profiles" Feb 19 23:30:26 crc kubenswrapper[4795]: I0219 23:30:26.252861 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-qzhll" Feb 19 23:30:26 crc kubenswrapper[4795]: I0219 23:30:26.254743 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 19 23:30:26 crc kubenswrapper[4795]: I0219 23:30:26.254881 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-brdnv" Feb 19 23:30:26 crc kubenswrapper[4795]: I0219 23:30:26.255629 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 23:30:26 crc kubenswrapper[4795]: I0219 23:30:26.255922 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 19 23:30:26 crc kubenswrapper[4795]: I0219 23:30:26.264586 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-qzhll"] Feb 19 23:30:26 crc kubenswrapper[4795]: I0219 23:30:26.415218 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5l6mg\" (UniqueName: \"kubernetes.io/projected/adb280f6-14e8-45d3-91a1-1bf325d84aef-kube-api-access-5l6mg\") pod \"ssh-known-hosts-openstack-qzhll\" (UID: \"adb280f6-14e8-45d3-91a1-1bf325d84aef\") " pod="openstack/ssh-known-hosts-openstack-qzhll" Feb 19 23:30:26 crc kubenswrapper[4795]: I0219 23:30:26.415298 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/adb280f6-14e8-45d3-91a1-1bf325d84aef-inventory-0\") pod \"ssh-known-hosts-openstack-qzhll\" (UID: \"adb280f6-14e8-45d3-91a1-1bf325d84aef\") " pod="openstack/ssh-known-hosts-openstack-qzhll" Feb 19 23:30:26 crc kubenswrapper[4795]: I0219 23:30:26.415358 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/adb280f6-14e8-45d3-91a1-1bf325d84aef-ceph\") pod \"ssh-known-hosts-openstack-qzhll\" (UID: \"adb280f6-14e8-45d3-91a1-1bf325d84aef\") " pod="openstack/ssh-known-hosts-openstack-qzhll" Feb 19 23:30:26 crc kubenswrapper[4795]: I0219 23:30:26.415645 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/adb280f6-14e8-45d3-91a1-1bf325d84aef-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-qzhll\" (UID: \"adb280f6-14e8-45d3-91a1-1bf325d84aef\") " pod="openstack/ssh-known-hosts-openstack-qzhll" Feb 19 23:30:26 crc kubenswrapper[4795]: I0219 23:30:26.518119 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5l6mg\" (UniqueName: \"kubernetes.io/projected/adb280f6-14e8-45d3-91a1-1bf325d84aef-kube-api-access-5l6mg\") pod \"ssh-known-hosts-openstack-qzhll\" (UID: \"adb280f6-14e8-45d3-91a1-1bf325d84aef\") " pod="openstack/ssh-known-hosts-openstack-qzhll" Feb 19 23:30:26 crc kubenswrapper[4795]: I0219 23:30:26.518202 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/adb280f6-14e8-45d3-91a1-1bf325d84aef-inventory-0\") pod \"ssh-known-hosts-openstack-qzhll\" (UID: \"adb280f6-14e8-45d3-91a1-1bf325d84aef\") " pod="openstack/ssh-known-hosts-openstack-qzhll" Feb 19 23:30:26 crc kubenswrapper[4795]: I0219 23:30:26.518276 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/adb280f6-14e8-45d3-91a1-1bf325d84aef-ceph\") pod \"ssh-known-hosts-openstack-qzhll\" (UID: \"adb280f6-14e8-45d3-91a1-1bf325d84aef\") " pod="openstack/ssh-known-hosts-openstack-qzhll" Feb 19 23:30:26 crc kubenswrapper[4795]: I0219 23:30:26.518392 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: 
\"kubernetes.io/secret/adb280f6-14e8-45d3-91a1-1bf325d84aef-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-qzhll\" (UID: \"adb280f6-14e8-45d3-91a1-1bf325d84aef\") " pod="openstack/ssh-known-hosts-openstack-qzhll" Feb 19 23:30:26 crc kubenswrapper[4795]: I0219 23:30:26.532899 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/adb280f6-14e8-45d3-91a1-1bf325d84aef-inventory-0\") pod \"ssh-known-hosts-openstack-qzhll\" (UID: \"adb280f6-14e8-45d3-91a1-1bf325d84aef\") " pod="openstack/ssh-known-hosts-openstack-qzhll" Feb 19 23:30:26 crc kubenswrapper[4795]: I0219 23:30:26.532899 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/adb280f6-14e8-45d3-91a1-1bf325d84aef-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-qzhll\" (UID: \"adb280f6-14e8-45d3-91a1-1bf325d84aef\") " pod="openstack/ssh-known-hosts-openstack-qzhll" Feb 19 23:30:26 crc kubenswrapper[4795]: I0219 23:30:26.533280 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/adb280f6-14e8-45d3-91a1-1bf325d84aef-ceph\") pod \"ssh-known-hosts-openstack-qzhll\" (UID: \"adb280f6-14e8-45d3-91a1-1bf325d84aef\") " pod="openstack/ssh-known-hosts-openstack-qzhll" Feb 19 23:30:26 crc kubenswrapper[4795]: I0219 23:30:26.552545 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5l6mg\" (UniqueName: \"kubernetes.io/projected/adb280f6-14e8-45d3-91a1-1bf325d84aef-kube-api-access-5l6mg\") pod \"ssh-known-hosts-openstack-qzhll\" (UID: \"adb280f6-14e8-45d3-91a1-1bf325d84aef\") " pod="openstack/ssh-known-hosts-openstack-qzhll" Feb 19 23:30:26 crc kubenswrapper[4795]: I0219 23:30:26.573703 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-qzhll" Feb 19 23:30:27 crc kubenswrapper[4795]: I0219 23:30:27.142668 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-qzhll"] Feb 19 23:30:27 crc kubenswrapper[4795]: I0219 23:30:27.189135 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-qzhll" event={"ID":"adb280f6-14e8-45d3-91a1-1bf325d84aef","Type":"ContainerStarted","Data":"b55a3be4c08a80f7a0cdc33ce789154e9bfb621b45fd7b98bc130e28d94b7534"} Feb 19 23:30:28 crc kubenswrapper[4795]: I0219 23:30:28.118015 4795 scope.go:117] "RemoveContainer" containerID="eca52f37003a7a5168093ed1a2726d31a52843bd3e99b81128785c0ad70b60e9" Feb 19 23:30:28 crc kubenswrapper[4795]: I0219 23:30:28.206679 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-qzhll" event={"ID":"adb280f6-14e8-45d3-91a1-1bf325d84aef","Type":"ContainerStarted","Data":"4d38e981123d5faed9cee0f1f24700eb07147e276a988944bb7acaad976c3c58"} Feb 19 23:30:28 crc kubenswrapper[4795]: I0219 23:30:28.237458 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-openstack-qzhll" podStartSLOduration=1.779926218 podStartE2EDuration="2.237434072s" podCreationTimestamp="2026-02-19 23:30:26 +0000 UTC" firstStartedPulling="2026-02-19 23:30:27.146572245 +0000 UTC m=+7338.339090109" lastFinishedPulling="2026-02-19 23:30:27.604080099 +0000 UTC m=+7338.796597963" observedRunningTime="2026-02-19 23:30:28.227312115 +0000 UTC m=+7339.419829979" watchObservedRunningTime="2026-02-19 23:30:28.237434072 +0000 UTC m=+7339.429951946" Feb 19 23:30:36 crc kubenswrapper[4795]: I0219 23:30:36.313991 4795 generic.go:334] "Generic (PLEG): container finished" podID="adb280f6-14e8-45d3-91a1-1bf325d84aef" containerID="4d38e981123d5faed9cee0f1f24700eb07147e276a988944bb7acaad976c3c58" exitCode=0 Feb 19 23:30:36 crc kubenswrapper[4795]: I0219 
23:30:36.314547 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-qzhll" event={"ID":"adb280f6-14e8-45d3-91a1-1bf325d84aef","Type":"ContainerDied","Data":"4d38e981123d5faed9cee0f1f24700eb07147e276a988944bb7acaad976c3c58"} Feb 19 23:30:37 crc kubenswrapper[4795]: I0219 23:30:37.766467 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-qzhll" Feb 19 23:30:37 crc kubenswrapper[4795]: I0219 23:30:37.900076 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5l6mg\" (UniqueName: \"kubernetes.io/projected/adb280f6-14e8-45d3-91a1-1bf325d84aef-kube-api-access-5l6mg\") pod \"adb280f6-14e8-45d3-91a1-1bf325d84aef\" (UID: \"adb280f6-14e8-45d3-91a1-1bf325d84aef\") " Feb 19 23:30:37 crc kubenswrapper[4795]: I0219 23:30:37.900480 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/adb280f6-14e8-45d3-91a1-1bf325d84aef-ssh-key-openstack-cell1\") pod \"adb280f6-14e8-45d3-91a1-1bf325d84aef\" (UID: \"adb280f6-14e8-45d3-91a1-1bf325d84aef\") " Feb 19 23:30:37 crc kubenswrapper[4795]: I0219 23:30:37.900513 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/adb280f6-14e8-45d3-91a1-1bf325d84aef-inventory-0\") pod \"adb280f6-14e8-45d3-91a1-1bf325d84aef\" (UID: \"adb280f6-14e8-45d3-91a1-1bf325d84aef\") " Feb 19 23:30:37 crc kubenswrapper[4795]: I0219 23:30:37.900588 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/adb280f6-14e8-45d3-91a1-1bf325d84aef-ceph\") pod \"adb280f6-14e8-45d3-91a1-1bf325d84aef\" (UID: \"adb280f6-14e8-45d3-91a1-1bf325d84aef\") " Feb 19 23:30:37 crc kubenswrapper[4795]: I0219 23:30:37.905198 4795 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/secret/adb280f6-14e8-45d3-91a1-1bf325d84aef-ceph" (OuterVolumeSpecName: "ceph") pod "adb280f6-14e8-45d3-91a1-1bf325d84aef" (UID: "adb280f6-14e8-45d3-91a1-1bf325d84aef"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:30:37 crc kubenswrapper[4795]: I0219 23:30:37.905383 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/adb280f6-14e8-45d3-91a1-1bf325d84aef-kube-api-access-5l6mg" (OuterVolumeSpecName: "kube-api-access-5l6mg") pod "adb280f6-14e8-45d3-91a1-1bf325d84aef" (UID: "adb280f6-14e8-45d3-91a1-1bf325d84aef"). InnerVolumeSpecName "kube-api-access-5l6mg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:30:37 crc kubenswrapper[4795]: I0219 23:30:37.934973 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adb280f6-14e8-45d3-91a1-1bf325d84aef-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "adb280f6-14e8-45d3-91a1-1bf325d84aef" (UID: "adb280f6-14e8-45d3-91a1-1bf325d84aef"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:30:37 crc kubenswrapper[4795]: I0219 23:30:37.936336 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adb280f6-14e8-45d3-91a1-1bf325d84aef-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "adb280f6-14e8-45d3-91a1-1bf325d84aef" (UID: "adb280f6-14e8-45d3-91a1-1bf325d84aef"). InnerVolumeSpecName "inventory-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:30:38 crc kubenswrapper[4795]: I0219 23:30:38.006382 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5l6mg\" (UniqueName: \"kubernetes.io/projected/adb280f6-14e8-45d3-91a1-1bf325d84aef-kube-api-access-5l6mg\") on node \"crc\" DevicePath \"\"" Feb 19 23:30:38 crc kubenswrapper[4795]: I0219 23:30:38.006420 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/adb280f6-14e8-45d3-91a1-1bf325d84aef-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 19 23:30:38 crc kubenswrapper[4795]: I0219 23:30:38.006435 4795 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/adb280f6-14e8-45d3-91a1-1bf325d84aef-inventory-0\") on node \"crc\" DevicePath \"\"" Feb 19 23:30:38 crc kubenswrapper[4795]: I0219 23:30:38.006449 4795 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/adb280f6-14e8-45d3-91a1-1bf325d84aef-ceph\") on node \"crc\" DevicePath \"\"" Feb 19 23:30:38 crc kubenswrapper[4795]: I0219 23:30:38.333973 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-qzhll" event={"ID":"adb280f6-14e8-45d3-91a1-1bf325d84aef","Type":"ContainerDied","Data":"b55a3be4c08a80f7a0cdc33ce789154e9bfb621b45fd7b98bc130e28d94b7534"} Feb 19 23:30:38 crc kubenswrapper[4795]: I0219 23:30:38.334234 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b55a3be4c08a80f7a0cdc33ce789154e9bfb621b45fd7b98bc130e28d94b7534" Feb 19 23:30:38 crc kubenswrapper[4795]: I0219 23:30:38.334283 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-qzhll" Feb 19 23:30:38 crc kubenswrapper[4795]: I0219 23:30:38.408188 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-openstack-openstack-cell1-7vb2g"] Feb 19 23:30:38 crc kubenswrapper[4795]: E0219 23:30:38.408630 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adb280f6-14e8-45d3-91a1-1bf325d84aef" containerName="ssh-known-hosts-openstack" Feb 19 23:30:38 crc kubenswrapper[4795]: I0219 23:30:38.408646 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="adb280f6-14e8-45d3-91a1-1bf325d84aef" containerName="ssh-known-hosts-openstack" Feb 19 23:30:38 crc kubenswrapper[4795]: I0219 23:30:38.408842 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="adb280f6-14e8-45d3-91a1-1bf325d84aef" containerName="ssh-known-hosts-openstack" Feb 19 23:30:38 crc kubenswrapper[4795]: I0219 23:30:38.409638 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-7vb2g" Feb 19 23:30:38 crc kubenswrapper[4795]: I0219 23:30:38.414572 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 23:30:38 crc kubenswrapper[4795]: I0219 23:30:38.414864 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 19 23:30:38 crc kubenswrapper[4795]: I0219 23:30:38.414990 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-brdnv" Feb 19 23:30:38 crc kubenswrapper[4795]: I0219 23:30:38.415126 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 19 23:30:38 crc kubenswrapper[4795]: I0219 23:30:38.418943 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-7vb2g"] Feb 19 23:30:38 crc kubenswrapper[4795]: I0219 
23:30:38.517348 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d82522ab-bf1a-47f9-902b-c82105b5d09b-ssh-key-openstack-cell1\") pod \"run-os-openstack-openstack-cell1-7vb2g\" (UID: \"d82522ab-bf1a-47f9-902b-c82105b5d09b\") " pod="openstack/run-os-openstack-openstack-cell1-7vb2g" Feb 19 23:30:38 crc kubenswrapper[4795]: I0219 23:30:38.517765 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qdsw\" (UniqueName: \"kubernetes.io/projected/d82522ab-bf1a-47f9-902b-c82105b5d09b-kube-api-access-8qdsw\") pod \"run-os-openstack-openstack-cell1-7vb2g\" (UID: \"d82522ab-bf1a-47f9-902b-c82105b5d09b\") " pod="openstack/run-os-openstack-openstack-cell1-7vb2g" Feb 19 23:30:38 crc kubenswrapper[4795]: I0219 23:30:38.517892 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d82522ab-bf1a-47f9-902b-c82105b5d09b-ceph\") pod \"run-os-openstack-openstack-cell1-7vb2g\" (UID: \"d82522ab-bf1a-47f9-902b-c82105b5d09b\") " pod="openstack/run-os-openstack-openstack-cell1-7vb2g" Feb 19 23:30:38 crc kubenswrapper[4795]: I0219 23:30:38.518062 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d82522ab-bf1a-47f9-902b-c82105b5d09b-inventory\") pod \"run-os-openstack-openstack-cell1-7vb2g\" (UID: \"d82522ab-bf1a-47f9-902b-c82105b5d09b\") " pod="openstack/run-os-openstack-openstack-cell1-7vb2g" Feb 19 23:30:38 crc kubenswrapper[4795]: I0219 23:30:38.620348 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d82522ab-bf1a-47f9-902b-c82105b5d09b-ssh-key-openstack-cell1\") pod \"run-os-openstack-openstack-cell1-7vb2g\" (UID: 
\"d82522ab-bf1a-47f9-902b-c82105b5d09b\") " pod="openstack/run-os-openstack-openstack-cell1-7vb2g" Feb 19 23:30:38 crc kubenswrapper[4795]: I0219 23:30:38.620548 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qdsw\" (UniqueName: \"kubernetes.io/projected/d82522ab-bf1a-47f9-902b-c82105b5d09b-kube-api-access-8qdsw\") pod \"run-os-openstack-openstack-cell1-7vb2g\" (UID: \"d82522ab-bf1a-47f9-902b-c82105b5d09b\") " pod="openstack/run-os-openstack-openstack-cell1-7vb2g" Feb 19 23:30:38 crc kubenswrapper[4795]: I0219 23:30:38.620596 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d82522ab-bf1a-47f9-902b-c82105b5d09b-ceph\") pod \"run-os-openstack-openstack-cell1-7vb2g\" (UID: \"d82522ab-bf1a-47f9-902b-c82105b5d09b\") " pod="openstack/run-os-openstack-openstack-cell1-7vb2g" Feb 19 23:30:38 crc kubenswrapper[4795]: I0219 23:30:38.620719 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d82522ab-bf1a-47f9-902b-c82105b5d09b-inventory\") pod \"run-os-openstack-openstack-cell1-7vb2g\" (UID: \"d82522ab-bf1a-47f9-902b-c82105b5d09b\") " pod="openstack/run-os-openstack-openstack-cell1-7vb2g" Feb 19 23:30:38 crc kubenswrapper[4795]: I0219 23:30:38.625783 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d82522ab-bf1a-47f9-902b-c82105b5d09b-inventory\") pod \"run-os-openstack-openstack-cell1-7vb2g\" (UID: \"d82522ab-bf1a-47f9-902b-c82105b5d09b\") " pod="openstack/run-os-openstack-openstack-cell1-7vb2g" Feb 19 23:30:38 crc kubenswrapper[4795]: I0219 23:30:38.627405 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d82522ab-bf1a-47f9-902b-c82105b5d09b-ceph\") pod \"run-os-openstack-openstack-cell1-7vb2g\" (UID: 
\"d82522ab-bf1a-47f9-902b-c82105b5d09b\") " pod="openstack/run-os-openstack-openstack-cell1-7vb2g" Feb 19 23:30:38 crc kubenswrapper[4795]: I0219 23:30:38.629638 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d82522ab-bf1a-47f9-902b-c82105b5d09b-ssh-key-openstack-cell1\") pod \"run-os-openstack-openstack-cell1-7vb2g\" (UID: \"d82522ab-bf1a-47f9-902b-c82105b5d09b\") " pod="openstack/run-os-openstack-openstack-cell1-7vb2g" Feb 19 23:30:38 crc kubenswrapper[4795]: I0219 23:30:38.635057 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qdsw\" (UniqueName: \"kubernetes.io/projected/d82522ab-bf1a-47f9-902b-c82105b5d09b-kube-api-access-8qdsw\") pod \"run-os-openstack-openstack-cell1-7vb2g\" (UID: \"d82522ab-bf1a-47f9-902b-c82105b5d09b\") " pod="openstack/run-os-openstack-openstack-cell1-7vb2g" Feb 19 23:30:38 crc kubenswrapper[4795]: I0219 23:30:38.728185 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-7vb2g" Feb 19 23:30:39 crc kubenswrapper[4795]: I0219 23:30:39.245542 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-7vb2g"] Feb 19 23:30:39 crc kubenswrapper[4795]: I0219 23:30:39.344849 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-7vb2g" event={"ID":"d82522ab-bf1a-47f9-902b-c82105b5d09b","Type":"ContainerStarted","Data":"59fe3a0debe2d123dc2daa1bb01aef0814263fc901e65de780bd139ee8c3917e"} Feb 19 23:30:40 crc kubenswrapper[4795]: I0219 23:30:40.355868 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-7vb2g" event={"ID":"d82522ab-bf1a-47f9-902b-c82105b5d09b","Type":"ContainerStarted","Data":"793f0b98692e634ad7a64e2338c47b0d5c616ca2233e547ea2a72b28316fe605"} Feb 19 23:30:40 crc kubenswrapper[4795]: I0219 23:30:40.378306 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-openstack-openstack-cell1-7vb2g" podStartSLOduration=1.941187782 podStartE2EDuration="2.378289206s" podCreationTimestamp="2026-02-19 23:30:38 +0000 UTC" firstStartedPulling="2026-02-19 23:30:39.252033364 +0000 UTC m=+7350.444551228" lastFinishedPulling="2026-02-19 23:30:39.689134788 +0000 UTC m=+7350.881652652" observedRunningTime="2026-02-19 23:30:40.369827566 +0000 UTC m=+7351.562345430" watchObservedRunningTime="2026-02-19 23:30:40.378289206 +0000 UTC m=+7351.570807060" Feb 19 23:30:49 crc kubenswrapper[4795]: I0219 23:30:49.457601 4795 generic.go:334] "Generic (PLEG): container finished" podID="d82522ab-bf1a-47f9-902b-c82105b5d09b" containerID="793f0b98692e634ad7a64e2338c47b0d5c616ca2233e547ea2a72b28316fe605" exitCode=0 Feb 19 23:30:49 crc kubenswrapper[4795]: I0219 23:30:49.457902 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-7vb2g" 
event={"ID":"d82522ab-bf1a-47f9-902b-c82105b5d09b","Type":"ContainerDied","Data":"793f0b98692e634ad7a64e2338c47b0d5c616ca2233e547ea2a72b28316fe605"} Feb 19 23:30:50 crc kubenswrapper[4795]: I0219 23:30:50.928945 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-7vb2g" Feb 19 23:30:51 crc kubenswrapper[4795]: I0219 23:30:51.020894 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d82522ab-bf1a-47f9-902b-c82105b5d09b-ceph\") pod \"d82522ab-bf1a-47f9-902b-c82105b5d09b\" (UID: \"d82522ab-bf1a-47f9-902b-c82105b5d09b\") " Feb 19 23:30:51 crc kubenswrapper[4795]: I0219 23:30:51.021368 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qdsw\" (UniqueName: \"kubernetes.io/projected/d82522ab-bf1a-47f9-902b-c82105b5d09b-kube-api-access-8qdsw\") pod \"d82522ab-bf1a-47f9-902b-c82105b5d09b\" (UID: \"d82522ab-bf1a-47f9-902b-c82105b5d09b\") " Feb 19 23:30:51 crc kubenswrapper[4795]: I0219 23:30:51.021477 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d82522ab-bf1a-47f9-902b-c82105b5d09b-inventory\") pod \"d82522ab-bf1a-47f9-902b-c82105b5d09b\" (UID: \"d82522ab-bf1a-47f9-902b-c82105b5d09b\") " Feb 19 23:30:51 crc kubenswrapper[4795]: I0219 23:30:51.021566 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d82522ab-bf1a-47f9-902b-c82105b5d09b-ssh-key-openstack-cell1\") pod \"d82522ab-bf1a-47f9-902b-c82105b5d09b\" (UID: \"d82522ab-bf1a-47f9-902b-c82105b5d09b\") " Feb 19 23:30:51 crc kubenswrapper[4795]: I0219 23:30:51.027274 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d82522ab-bf1a-47f9-902b-c82105b5d09b-kube-api-access-8qdsw" 
(OuterVolumeSpecName: "kube-api-access-8qdsw") pod "d82522ab-bf1a-47f9-902b-c82105b5d09b" (UID: "d82522ab-bf1a-47f9-902b-c82105b5d09b"). InnerVolumeSpecName "kube-api-access-8qdsw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:30:51 crc kubenswrapper[4795]: I0219 23:30:51.027389 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d82522ab-bf1a-47f9-902b-c82105b5d09b-ceph" (OuterVolumeSpecName: "ceph") pod "d82522ab-bf1a-47f9-902b-c82105b5d09b" (UID: "d82522ab-bf1a-47f9-902b-c82105b5d09b"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:30:51 crc kubenswrapper[4795]: I0219 23:30:51.054027 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d82522ab-bf1a-47f9-902b-c82105b5d09b-inventory" (OuterVolumeSpecName: "inventory") pod "d82522ab-bf1a-47f9-902b-c82105b5d09b" (UID: "d82522ab-bf1a-47f9-902b-c82105b5d09b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:30:51 crc kubenswrapper[4795]: I0219 23:30:51.054546 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d82522ab-bf1a-47f9-902b-c82105b5d09b-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "d82522ab-bf1a-47f9-902b-c82105b5d09b" (UID: "d82522ab-bf1a-47f9-902b-c82105b5d09b"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:30:51 crc kubenswrapper[4795]: I0219 23:30:51.125499 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qdsw\" (UniqueName: \"kubernetes.io/projected/d82522ab-bf1a-47f9-902b-c82105b5d09b-kube-api-access-8qdsw\") on node \"crc\" DevicePath \"\"" Feb 19 23:30:51 crc kubenswrapper[4795]: I0219 23:30:51.125720 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d82522ab-bf1a-47f9-902b-c82105b5d09b-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 23:30:51 crc kubenswrapper[4795]: I0219 23:30:51.125855 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d82522ab-bf1a-47f9-902b-c82105b5d09b-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 19 23:30:51 crc kubenswrapper[4795]: I0219 23:30:51.125979 4795 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d82522ab-bf1a-47f9-902b-c82105b5d09b-ceph\") on node \"crc\" DevicePath \"\"" Feb 19 23:30:51 crc kubenswrapper[4795]: I0219 23:30:51.485061 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-7vb2g" event={"ID":"d82522ab-bf1a-47f9-902b-c82105b5d09b","Type":"ContainerDied","Data":"59fe3a0debe2d123dc2daa1bb01aef0814263fc901e65de780bd139ee8c3917e"} Feb 19 23:30:51 crc kubenswrapper[4795]: I0219 23:30:51.485646 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="59fe3a0debe2d123dc2daa1bb01aef0814263fc901e65de780bd139ee8c3917e" Feb 19 23:30:51 crc kubenswrapper[4795]: I0219 23:30:51.485099 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-7vb2g" Feb 19 23:30:51 crc kubenswrapper[4795]: I0219 23:30:51.578880 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-bdhcf"] Feb 19 23:30:51 crc kubenswrapper[4795]: E0219 23:30:51.579877 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d82522ab-bf1a-47f9-902b-c82105b5d09b" containerName="run-os-openstack-openstack-cell1" Feb 19 23:30:51 crc kubenswrapper[4795]: I0219 23:30:51.579905 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d82522ab-bf1a-47f9-902b-c82105b5d09b" containerName="run-os-openstack-openstack-cell1" Feb 19 23:30:51 crc kubenswrapper[4795]: I0219 23:30:51.580249 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d82522ab-bf1a-47f9-902b-c82105b5d09b" containerName="run-os-openstack-openstack-cell1" Feb 19 23:30:51 crc kubenswrapper[4795]: I0219 23:30:51.581444 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-bdhcf" Feb 19 23:30:51 crc kubenswrapper[4795]: I0219 23:30:51.625418 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 19 23:30:51 crc kubenswrapper[4795]: I0219 23:30:51.625603 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-brdnv" Feb 19 23:30:51 crc kubenswrapper[4795]: I0219 23:30:51.625624 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 23:30:51 crc kubenswrapper[4795]: I0219 23:30:51.625753 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 19 23:30:51 crc kubenswrapper[4795]: I0219 23:30:51.645496 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-bdhcf"] Feb 19 23:30:51 crc kubenswrapper[4795]: E0219 23:30:51.707009 4795 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd82522ab_bf1a_47f9_902b_c82105b5d09b.slice/crio-59fe3a0debe2d123dc2daa1bb01aef0814263fc901e65de780bd139ee8c3917e\": RecentStats: unable to find data in memory cache]" Feb 19 23:30:51 crc kubenswrapper[4795]: I0219 23:30:51.756519 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c59dx\" (UniqueName: \"kubernetes.io/projected/99fb1ef3-d414-4a7e-9db8-54edf1aad197-kube-api-access-c59dx\") pod \"reboot-os-openstack-openstack-cell1-bdhcf\" (UID: \"99fb1ef3-d414-4a7e-9db8-54edf1aad197\") " pod="openstack/reboot-os-openstack-openstack-cell1-bdhcf" Feb 19 23:30:51 crc kubenswrapper[4795]: I0219 23:30:51.756615 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/99fb1ef3-d414-4a7e-9db8-54edf1aad197-ceph\") pod \"reboot-os-openstack-openstack-cell1-bdhcf\" (UID: \"99fb1ef3-d414-4a7e-9db8-54edf1aad197\") " pod="openstack/reboot-os-openstack-openstack-cell1-bdhcf" Feb 19 23:30:51 crc kubenswrapper[4795]: I0219 23:30:51.756643 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/99fb1ef3-d414-4a7e-9db8-54edf1aad197-ssh-key-openstack-cell1\") pod \"reboot-os-openstack-openstack-cell1-bdhcf\" (UID: \"99fb1ef3-d414-4a7e-9db8-54edf1aad197\") " pod="openstack/reboot-os-openstack-openstack-cell1-bdhcf" Feb 19 23:30:51 crc kubenswrapper[4795]: I0219 23:30:51.756662 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/99fb1ef3-d414-4a7e-9db8-54edf1aad197-inventory\") pod \"reboot-os-openstack-openstack-cell1-bdhcf\" (UID: \"99fb1ef3-d414-4a7e-9db8-54edf1aad197\") " pod="openstack/reboot-os-openstack-openstack-cell1-bdhcf" Feb 19 23:30:51 crc kubenswrapper[4795]: I0219 23:30:51.858405 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c59dx\" (UniqueName: \"kubernetes.io/projected/99fb1ef3-d414-4a7e-9db8-54edf1aad197-kube-api-access-c59dx\") pod \"reboot-os-openstack-openstack-cell1-bdhcf\" (UID: \"99fb1ef3-d414-4a7e-9db8-54edf1aad197\") " pod="openstack/reboot-os-openstack-openstack-cell1-bdhcf" Feb 19 23:30:51 crc kubenswrapper[4795]: I0219 23:30:51.858507 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/99fb1ef3-d414-4a7e-9db8-54edf1aad197-ceph\") pod \"reboot-os-openstack-openstack-cell1-bdhcf\" (UID: \"99fb1ef3-d414-4a7e-9db8-54edf1aad197\") " pod="openstack/reboot-os-openstack-openstack-cell1-bdhcf" Feb 19 23:30:51 crc kubenswrapper[4795]: I0219 23:30:51.858529 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/99fb1ef3-d414-4a7e-9db8-54edf1aad197-ssh-key-openstack-cell1\") pod \"reboot-os-openstack-openstack-cell1-bdhcf\" (UID: \"99fb1ef3-d414-4a7e-9db8-54edf1aad197\") " pod="openstack/reboot-os-openstack-openstack-cell1-bdhcf" Feb 19 23:30:51 crc kubenswrapper[4795]: I0219 23:30:51.858552 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/99fb1ef3-d414-4a7e-9db8-54edf1aad197-inventory\") pod \"reboot-os-openstack-openstack-cell1-bdhcf\" (UID: \"99fb1ef3-d414-4a7e-9db8-54edf1aad197\") " pod="openstack/reboot-os-openstack-openstack-cell1-bdhcf" Feb 19 23:30:51 crc kubenswrapper[4795]: I0219 23:30:51.862601 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/99fb1ef3-d414-4a7e-9db8-54edf1aad197-inventory\") pod \"reboot-os-openstack-openstack-cell1-bdhcf\" (UID: \"99fb1ef3-d414-4a7e-9db8-54edf1aad197\") " pod="openstack/reboot-os-openstack-openstack-cell1-bdhcf" Feb 19 23:30:51 crc kubenswrapper[4795]: I0219 23:30:51.863048 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/99fb1ef3-d414-4a7e-9db8-54edf1aad197-ceph\") pod \"reboot-os-openstack-openstack-cell1-bdhcf\" (UID: \"99fb1ef3-d414-4a7e-9db8-54edf1aad197\") " pod="openstack/reboot-os-openstack-openstack-cell1-bdhcf" Feb 19 23:30:51 crc kubenswrapper[4795]: I0219 23:30:51.867036 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/99fb1ef3-d414-4a7e-9db8-54edf1aad197-ssh-key-openstack-cell1\") pod \"reboot-os-openstack-openstack-cell1-bdhcf\" (UID: \"99fb1ef3-d414-4a7e-9db8-54edf1aad197\") " pod="openstack/reboot-os-openstack-openstack-cell1-bdhcf" Feb 19 23:30:51 crc 
kubenswrapper[4795]: I0219 23:30:51.873280 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c59dx\" (UniqueName: \"kubernetes.io/projected/99fb1ef3-d414-4a7e-9db8-54edf1aad197-kube-api-access-c59dx\") pod \"reboot-os-openstack-openstack-cell1-bdhcf\" (UID: \"99fb1ef3-d414-4a7e-9db8-54edf1aad197\") " pod="openstack/reboot-os-openstack-openstack-cell1-bdhcf" Feb 19 23:30:51 crc kubenswrapper[4795]: I0219 23:30:51.956792 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-bdhcf" Feb 19 23:30:52 crc kubenswrapper[4795]: I0219 23:30:52.512520 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-bdhcf"] Feb 19 23:30:52 crc kubenswrapper[4795]: I0219 23:30:52.523875 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 23:30:53 crc kubenswrapper[4795]: I0219 23:30:53.502341 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-bdhcf" event={"ID":"99fb1ef3-d414-4a7e-9db8-54edf1aad197","Type":"ContainerStarted","Data":"0790acf7e75e4eda1a7132acb1d243e79b8c530ec6b23a04ac489a100963054d"} Feb 19 23:30:53 crc kubenswrapper[4795]: I0219 23:30:53.502635 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-bdhcf" event={"ID":"99fb1ef3-d414-4a7e-9db8-54edf1aad197","Type":"ContainerStarted","Data":"25ca7e89d2c644a930ea04fc731c49c656a8f3e6c490c2adb3ae60c32b542bfd"} Feb 19 23:30:53 crc kubenswrapper[4795]: I0219 23:30:53.531015 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-openstack-openstack-cell1-bdhcf" podStartSLOduration=2.076738852 podStartE2EDuration="2.53100168s" podCreationTimestamp="2026-02-19 23:30:51 +0000 UTC" firstStartedPulling="2026-02-19 23:30:52.523616667 +0000 UTC m=+7363.716134551" 
lastFinishedPulling="2026-02-19 23:30:52.977879515 +0000 UTC m=+7364.170397379" observedRunningTime="2026-02-19 23:30:53.527903243 +0000 UTC m=+7364.720421107" watchObservedRunningTime="2026-02-19 23:30:53.53100168 +0000 UTC m=+7364.723519544" Feb 19 23:31:05 crc kubenswrapper[4795]: I0219 23:31:05.749193 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4jw6z"] Feb 19 23:31:05 crc kubenswrapper[4795]: I0219 23:31:05.754850 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4jw6z" Feb 19 23:31:05 crc kubenswrapper[4795]: I0219 23:31:05.761473 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4jw6z"] Feb 19 23:31:05 crc kubenswrapper[4795]: I0219 23:31:05.919684 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30103546-d69d-4d13-a174-02fa1187e597-catalog-content\") pod \"community-operators-4jw6z\" (UID: \"30103546-d69d-4d13-a174-02fa1187e597\") " pod="openshift-marketplace/community-operators-4jw6z" Feb 19 23:31:05 crc kubenswrapper[4795]: I0219 23:31:05.920110 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmnd8\" (UniqueName: \"kubernetes.io/projected/30103546-d69d-4d13-a174-02fa1187e597-kube-api-access-xmnd8\") pod \"community-operators-4jw6z\" (UID: \"30103546-d69d-4d13-a174-02fa1187e597\") " pod="openshift-marketplace/community-operators-4jw6z" Feb 19 23:31:05 crc kubenswrapper[4795]: I0219 23:31:05.920213 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30103546-d69d-4d13-a174-02fa1187e597-utilities\") pod \"community-operators-4jw6z\" (UID: \"30103546-d69d-4d13-a174-02fa1187e597\") " 
pod="openshift-marketplace/community-operators-4jw6z" Feb 19 23:31:06 crc kubenswrapper[4795]: I0219 23:31:06.022635 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmnd8\" (UniqueName: \"kubernetes.io/projected/30103546-d69d-4d13-a174-02fa1187e597-kube-api-access-xmnd8\") pod \"community-operators-4jw6z\" (UID: \"30103546-d69d-4d13-a174-02fa1187e597\") " pod="openshift-marketplace/community-operators-4jw6z" Feb 19 23:31:06 crc kubenswrapper[4795]: I0219 23:31:06.022766 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30103546-d69d-4d13-a174-02fa1187e597-utilities\") pod \"community-operators-4jw6z\" (UID: \"30103546-d69d-4d13-a174-02fa1187e597\") " pod="openshift-marketplace/community-operators-4jw6z" Feb 19 23:31:06 crc kubenswrapper[4795]: I0219 23:31:06.022854 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30103546-d69d-4d13-a174-02fa1187e597-catalog-content\") pod \"community-operators-4jw6z\" (UID: \"30103546-d69d-4d13-a174-02fa1187e597\") " pod="openshift-marketplace/community-operators-4jw6z" Feb 19 23:31:06 crc kubenswrapper[4795]: I0219 23:31:06.023300 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30103546-d69d-4d13-a174-02fa1187e597-catalog-content\") pod \"community-operators-4jw6z\" (UID: \"30103546-d69d-4d13-a174-02fa1187e597\") " pod="openshift-marketplace/community-operators-4jw6z" Feb 19 23:31:06 crc kubenswrapper[4795]: I0219 23:31:06.023485 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30103546-d69d-4d13-a174-02fa1187e597-utilities\") pod \"community-operators-4jw6z\" (UID: \"30103546-d69d-4d13-a174-02fa1187e597\") " 
pod="openshift-marketplace/community-operators-4jw6z" Feb 19 23:31:06 crc kubenswrapper[4795]: I0219 23:31:06.051792 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmnd8\" (UniqueName: \"kubernetes.io/projected/30103546-d69d-4d13-a174-02fa1187e597-kube-api-access-xmnd8\") pod \"community-operators-4jw6z\" (UID: \"30103546-d69d-4d13-a174-02fa1187e597\") " pod="openshift-marketplace/community-operators-4jw6z" Feb 19 23:31:06 crc kubenswrapper[4795]: I0219 23:31:06.110458 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4jw6z" Feb 19 23:31:06 crc kubenswrapper[4795]: I0219 23:31:06.663419 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4jw6z"] Feb 19 23:31:06 crc kubenswrapper[4795]: I0219 23:31:06.683731 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4jw6z" event={"ID":"30103546-d69d-4d13-a174-02fa1187e597","Type":"ContainerStarted","Data":"28b22d09111ffcca0542200937ddfd1229cba19345a04a67d12fa0c5bb260243"} Feb 19 23:31:07 crc kubenswrapper[4795]: I0219 23:31:07.711118 4795 generic.go:334] "Generic (PLEG): container finished" podID="30103546-d69d-4d13-a174-02fa1187e597" containerID="6320cb7221b284cfb2c72f7044097ab1e39d846ff23a6245278f7572a7eca7f7" exitCode=0 Feb 19 23:31:07 crc kubenswrapper[4795]: I0219 23:31:07.711251 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4jw6z" event={"ID":"30103546-d69d-4d13-a174-02fa1187e597","Type":"ContainerDied","Data":"6320cb7221b284cfb2c72f7044097ab1e39d846ff23a6245278f7572a7eca7f7"} Feb 19 23:31:08 crc kubenswrapper[4795]: I0219 23:31:08.722820 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4jw6z" 
event={"ID":"30103546-d69d-4d13-a174-02fa1187e597","Type":"ContainerStarted","Data":"0a748c20bed635afacbf89f615ab99c17dc3036e08b3b76bdd24e0b202af8217"} Feb 19 23:31:08 crc kubenswrapper[4795]: I0219 23:31:08.724340 4795 generic.go:334] "Generic (PLEG): container finished" podID="99fb1ef3-d414-4a7e-9db8-54edf1aad197" containerID="0790acf7e75e4eda1a7132acb1d243e79b8c530ec6b23a04ac489a100963054d" exitCode=0 Feb 19 23:31:08 crc kubenswrapper[4795]: I0219 23:31:08.724385 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-bdhcf" event={"ID":"99fb1ef3-d414-4a7e-9db8-54edf1aad197","Type":"ContainerDied","Data":"0790acf7e75e4eda1a7132acb1d243e79b8c530ec6b23a04ac489a100963054d"} Feb 19 23:31:09 crc kubenswrapper[4795]: I0219 23:31:09.738889 4795 generic.go:334] "Generic (PLEG): container finished" podID="30103546-d69d-4d13-a174-02fa1187e597" containerID="0a748c20bed635afacbf89f615ab99c17dc3036e08b3b76bdd24e0b202af8217" exitCode=0 Feb 19 23:31:09 crc kubenswrapper[4795]: I0219 23:31:09.738938 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4jw6z" event={"ID":"30103546-d69d-4d13-a174-02fa1187e597","Type":"ContainerDied","Data":"0a748c20bed635afacbf89f615ab99c17dc3036e08b3b76bdd24e0b202af8217"} Feb 19 23:31:10 crc kubenswrapper[4795]: I0219 23:31:10.239656 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-bdhcf" Feb 19 23:31:10 crc kubenswrapper[4795]: I0219 23:31:10.411791 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/99fb1ef3-d414-4a7e-9db8-54edf1aad197-inventory\") pod \"99fb1ef3-d414-4a7e-9db8-54edf1aad197\" (UID: \"99fb1ef3-d414-4a7e-9db8-54edf1aad197\") " Feb 19 23:31:10 crc kubenswrapper[4795]: I0219 23:31:10.412604 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/99fb1ef3-d414-4a7e-9db8-54edf1aad197-ssh-key-openstack-cell1\") pod \"99fb1ef3-d414-4a7e-9db8-54edf1aad197\" (UID: \"99fb1ef3-d414-4a7e-9db8-54edf1aad197\") " Feb 19 23:31:10 crc kubenswrapper[4795]: I0219 23:31:10.412845 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/99fb1ef3-d414-4a7e-9db8-54edf1aad197-ceph\") pod \"99fb1ef3-d414-4a7e-9db8-54edf1aad197\" (UID: \"99fb1ef3-d414-4a7e-9db8-54edf1aad197\") " Feb 19 23:31:10 crc kubenswrapper[4795]: I0219 23:31:10.413004 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c59dx\" (UniqueName: \"kubernetes.io/projected/99fb1ef3-d414-4a7e-9db8-54edf1aad197-kube-api-access-c59dx\") pod \"99fb1ef3-d414-4a7e-9db8-54edf1aad197\" (UID: \"99fb1ef3-d414-4a7e-9db8-54edf1aad197\") " Feb 19 23:31:10 crc kubenswrapper[4795]: I0219 23:31:10.417092 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99fb1ef3-d414-4a7e-9db8-54edf1aad197-ceph" (OuterVolumeSpecName: "ceph") pod "99fb1ef3-d414-4a7e-9db8-54edf1aad197" (UID: "99fb1ef3-d414-4a7e-9db8-54edf1aad197"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:31:10 crc kubenswrapper[4795]: I0219 23:31:10.417708 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99fb1ef3-d414-4a7e-9db8-54edf1aad197-kube-api-access-c59dx" (OuterVolumeSpecName: "kube-api-access-c59dx") pod "99fb1ef3-d414-4a7e-9db8-54edf1aad197" (UID: "99fb1ef3-d414-4a7e-9db8-54edf1aad197"). InnerVolumeSpecName "kube-api-access-c59dx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:31:10 crc kubenswrapper[4795]: I0219 23:31:10.438927 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99fb1ef3-d414-4a7e-9db8-54edf1aad197-inventory" (OuterVolumeSpecName: "inventory") pod "99fb1ef3-d414-4a7e-9db8-54edf1aad197" (UID: "99fb1ef3-d414-4a7e-9db8-54edf1aad197"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:31:10 crc kubenswrapper[4795]: I0219 23:31:10.453447 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99fb1ef3-d414-4a7e-9db8-54edf1aad197-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "99fb1ef3-d414-4a7e-9db8-54edf1aad197" (UID: "99fb1ef3-d414-4a7e-9db8-54edf1aad197"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:31:10 crc kubenswrapper[4795]: I0219 23:31:10.516157 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/99fb1ef3-d414-4a7e-9db8-54edf1aad197-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 19 23:31:10 crc kubenswrapper[4795]: I0219 23:31:10.516205 4795 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/99fb1ef3-d414-4a7e-9db8-54edf1aad197-ceph\") on node \"crc\" DevicePath \"\"" Feb 19 23:31:10 crc kubenswrapper[4795]: I0219 23:31:10.516218 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c59dx\" (UniqueName: \"kubernetes.io/projected/99fb1ef3-d414-4a7e-9db8-54edf1aad197-kube-api-access-c59dx\") on node \"crc\" DevicePath \"\"" Feb 19 23:31:10 crc kubenswrapper[4795]: I0219 23:31:10.516228 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/99fb1ef3-d414-4a7e-9db8-54edf1aad197-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 23:31:10 crc kubenswrapper[4795]: I0219 23:31:10.748796 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4jw6z" event={"ID":"30103546-d69d-4d13-a174-02fa1187e597","Type":"ContainerStarted","Data":"70aa24356d7429627ef35aa91587e561fc9eef65e06cd4e84ebe1f25fdd8a61e"} Feb 19 23:31:10 crc kubenswrapper[4795]: I0219 23:31:10.751478 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-bdhcf" event={"ID":"99fb1ef3-d414-4a7e-9db8-54edf1aad197","Type":"ContainerDied","Data":"25ca7e89d2c644a930ea04fc731c49c656a8f3e6c490c2adb3ae60c32b542bfd"} Feb 19 23:31:10 crc kubenswrapper[4795]: I0219 23:31:10.751520 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-bdhcf" Feb 19 23:31:10 crc kubenswrapper[4795]: I0219 23:31:10.751528 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="25ca7e89d2c644a930ea04fc731c49c656a8f3e6c490c2adb3ae60c32b542bfd" Feb 19 23:31:10 crc kubenswrapper[4795]: I0219 23:31:10.776147 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4jw6z" podStartSLOduration=3.249008946 podStartE2EDuration="5.776130591s" podCreationTimestamp="2026-02-19 23:31:05 +0000 UTC" firstStartedPulling="2026-02-19 23:31:07.716938264 +0000 UTC m=+7378.909456138" lastFinishedPulling="2026-02-19 23:31:10.244059919 +0000 UTC m=+7381.436577783" observedRunningTime="2026-02-19 23:31:10.765392138 +0000 UTC m=+7381.957910012" watchObservedRunningTime="2026-02-19 23:31:10.776130591 +0000 UTC m=+7381.968648455" Feb 19 23:31:10 crc kubenswrapper[4795]: I0219 23:31:10.853385 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-tjvng"] Feb 19 23:31:10 crc kubenswrapper[4795]: E0219 23:31:10.853898 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99fb1ef3-d414-4a7e-9db8-54edf1aad197" containerName="reboot-os-openstack-openstack-cell1" Feb 19 23:31:10 crc kubenswrapper[4795]: I0219 23:31:10.853921 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="99fb1ef3-d414-4a7e-9db8-54edf1aad197" containerName="reboot-os-openstack-openstack-cell1" Feb 19 23:31:10 crc kubenswrapper[4795]: I0219 23:31:10.854179 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="99fb1ef3-d414-4a7e-9db8-54edf1aad197" containerName="reboot-os-openstack-openstack-cell1" Feb 19 23:31:10 crc kubenswrapper[4795]: I0219 23:31:10.855068 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-tjvng" Feb 19 23:31:10 crc kubenswrapper[4795]: I0219 23:31:10.857062 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 23:31:10 crc kubenswrapper[4795]: I0219 23:31:10.857231 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 19 23:31:10 crc kubenswrapper[4795]: I0219 23:31:10.857548 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-brdnv" Feb 19 23:31:10 crc kubenswrapper[4795]: I0219 23:31:10.857707 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 19 23:31:10 crc kubenswrapper[4795]: I0219 23:31:10.888198 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-tjvng"] Feb 19 23:31:11 crc kubenswrapper[4795]: I0219 23:31:11.027229 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-tjvng\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " pod="openstack/install-certs-openstack-openstack-cell1-tjvng" Feb 19 23:31:11 crc kubenswrapper[4795]: I0219 23:31:11.027317 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-inventory\") pod \"install-certs-openstack-openstack-cell1-tjvng\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " pod="openstack/install-certs-openstack-openstack-cell1-tjvng" Feb 19 23:31:11 crc kubenswrapper[4795]: I0219 23:31:11.027390 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-tjvng\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " pod="openstack/install-certs-openstack-openstack-cell1-tjvng" Feb 19 23:31:11 crc kubenswrapper[4795]: I0219 23:31:11.027413 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqddm\" (UniqueName: \"kubernetes.io/projected/dc08b8d0-e577-4674-9ca5-b1a02818725c-kube-api-access-mqddm\") pod \"install-certs-openstack-openstack-cell1-tjvng\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " pod="openstack/install-certs-openstack-openstack-cell1-tjvng" Feb 19 23:31:11 crc kubenswrapper[4795]: I0219 23:31:11.027440 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-tjvng\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " pod="openstack/install-certs-openstack-openstack-cell1-tjvng" Feb 19 23:31:11 crc kubenswrapper[4795]: I0219 23:31:11.027483 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-tjvng\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " pod="openstack/install-certs-openstack-openstack-cell1-tjvng" Feb 19 23:31:11 crc kubenswrapper[4795]: I0219 23:31:11.027539 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: 
\"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-ssh-key-openstack-cell1\") pod \"install-certs-openstack-openstack-cell1-tjvng\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " pod="openstack/install-certs-openstack-openstack-cell1-tjvng" Feb 19 23:31:11 crc kubenswrapper[4795]: I0219 23:31:11.027569 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-tjvng\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " pod="openstack/install-certs-openstack-openstack-cell1-tjvng" Feb 19 23:31:11 crc kubenswrapper[4795]: I0219 23:31:11.027584 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-tjvng\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " pod="openstack/install-certs-openstack-openstack-cell1-tjvng" Feb 19 23:31:11 crc kubenswrapper[4795]: I0219 23:31:11.027604 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-tjvng\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " pod="openstack/install-certs-openstack-openstack-cell1-tjvng" Feb 19 23:31:11 crc kubenswrapper[4795]: I0219 23:31:11.027627 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-tjvng\" 
(UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " pod="openstack/install-certs-openstack-openstack-cell1-tjvng" Feb 19 23:31:11 crc kubenswrapper[4795]: I0219 23:31:11.027689 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-ceph\") pod \"install-certs-openstack-openstack-cell1-tjvng\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " pod="openstack/install-certs-openstack-openstack-cell1-tjvng" Feb 19 23:31:11 crc kubenswrapper[4795]: I0219 23:31:11.129610 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-ssh-key-openstack-cell1\") pod \"install-certs-openstack-openstack-cell1-tjvng\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " pod="openstack/install-certs-openstack-openstack-cell1-tjvng" Feb 19 23:31:11 crc kubenswrapper[4795]: I0219 23:31:11.129710 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-tjvng\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " pod="openstack/install-certs-openstack-openstack-cell1-tjvng" Feb 19 23:31:11 crc kubenswrapper[4795]: I0219 23:31:11.129732 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-tjvng\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " pod="openstack/install-certs-openstack-openstack-cell1-tjvng" Feb 19 23:31:11 crc kubenswrapper[4795]: I0219 23:31:11.129759 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-tjvng\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " pod="openstack/install-certs-openstack-openstack-cell1-tjvng" Feb 19 23:31:11 crc kubenswrapper[4795]: I0219 23:31:11.129802 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-tjvng\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " pod="openstack/install-certs-openstack-openstack-cell1-tjvng" Feb 19 23:31:11 crc kubenswrapper[4795]: I0219 23:31:11.129884 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-ceph\") pod \"install-certs-openstack-openstack-cell1-tjvng\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " pod="openstack/install-certs-openstack-openstack-cell1-tjvng" Feb 19 23:31:11 crc kubenswrapper[4795]: I0219 23:31:11.129915 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-tjvng\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " pod="openstack/install-certs-openstack-openstack-cell1-tjvng" Feb 19 23:31:11 crc kubenswrapper[4795]: I0219 23:31:11.129958 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-inventory\") pod \"install-certs-openstack-openstack-cell1-tjvng\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " 
pod="openstack/install-certs-openstack-openstack-cell1-tjvng" Feb 19 23:31:11 crc kubenswrapper[4795]: I0219 23:31:11.129979 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqddm\" (UniqueName: \"kubernetes.io/projected/dc08b8d0-e577-4674-9ca5-b1a02818725c-kube-api-access-mqddm\") pod \"install-certs-openstack-openstack-cell1-tjvng\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " pod="openstack/install-certs-openstack-openstack-cell1-tjvng" Feb 19 23:31:11 crc kubenswrapper[4795]: I0219 23:31:11.129999 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-tjvng\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " pod="openstack/install-certs-openstack-openstack-cell1-tjvng" Feb 19 23:31:11 crc kubenswrapper[4795]: I0219 23:31:11.130027 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-tjvng\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " pod="openstack/install-certs-openstack-openstack-cell1-tjvng" Feb 19 23:31:11 crc kubenswrapper[4795]: I0219 23:31:11.130091 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-tjvng\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " pod="openstack/install-certs-openstack-openstack-cell1-tjvng" Feb 19 23:31:11 crc kubenswrapper[4795]: I0219 23:31:11.138333 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" 
(UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-ceph\") pod \"install-certs-openstack-openstack-cell1-tjvng\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " pod="openstack/install-certs-openstack-openstack-cell1-tjvng" Feb 19 23:31:11 crc kubenswrapper[4795]: I0219 23:31:11.138590 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-tjvng\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " pod="openstack/install-certs-openstack-openstack-cell1-tjvng" Feb 19 23:31:11 crc kubenswrapper[4795]: I0219 23:31:11.139004 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-tjvng\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " pod="openstack/install-certs-openstack-openstack-cell1-tjvng" Feb 19 23:31:11 crc kubenswrapper[4795]: I0219 23:31:11.141080 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-ssh-key-openstack-cell1\") pod \"install-certs-openstack-openstack-cell1-tjvng\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " pod="openstack/install-certs-openstack-openstack-cell1-tjvng" Feb 19 23:31:11 crc kubenswrapper[4795]: I0219 23:31:11.144098 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-tjvng\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " pod="openstack/install-certs-openstack-openstack-cell1-tjvng" Feb 19 23:31:11 crc 
kubenswrapper[4795]: I0219 23:31:11.152384 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-inventory\") pod \"install-certs-openstack-openstack-cell1-tjvng\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " pod="openstack/install-certs-openstack-openstack-cell1-tjvng" Feb 19 23:31:11 crc kubenswrapper[4795]: I0219 23:31:11.152731 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-tjvng\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " pod="openstack/install-certs-openstack-openstack-cell1-tjvng" Feb 19 23:31:11 crc kubenswrapper[4795]: I0219 23:31:11.153427 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-tjvng\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " pod="openstack/install-certs-openstack-openstack-cell1-tjvng" Feb 19 23:31:11 crc kubenswrapper[4795]: I0219 23:31:11.157156 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-tjvng\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " pod="openstack/install-certs-openstack-openstack-cell1-tjvng" Feb 19 23:31:11 crc kubenswrapper[4795]: I0219 23:31:11.162048 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqddm\" (UniqueName: \"kubernetes.io/projected/dc08b8d0-e577-4674-9ca5-b1a02818725c-kube-api-access-mqddm\") pod 
\"install-certs-openstack-openstack-cell1-tjvng\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " pod="openstack/install-certs-openstack-openstack-cell1-tjvng" Feb 19 23:31:11 crc kubenswrapper[4795]: I0219 23:31:11.162313 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-tjvng\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " pod="openstack/install-certs-openstack-openstack-cell1-tjvng" Feb 19 23:31:11 crc kubenswrapper[4795]: I0219 23:31:11.163097 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-tjvng\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " pod="openstack/install-certs-openstack-openstack-cell1-tjvng" Feb 19 23:31:11 crc kubenswrapper[4795]: I0219 23:31:11.191119 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-tjvng" Feb 19 23:31:11 crc kubenswrapper[4795]: I0219 23:31:11.716343 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-tjvng"] Feb 19 23:31:11 crc kubenswrapper[4795]: W0219 23:31:11.732546 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc08b8d0_e577_4674_9ca5_b1a02818725c.slice/crio-3f7f392e235143b77368665dd884931fc858bf9b92d56ef3f16ad3d13b1da6e9 WatchSource:0}: Error finding container 3f7f392e235143b77368665dd884931fc858bf9b92d56ef3f16ad3d13b1da6e9: Status 404 returned error can't find the container with id 3f7f392e235143b77368665dd884931fc858bf9b92d56ef3f16ad3d13b1da6e9 Feb 19 23:31:11 crc kubenswrapper[4795]: I0219 23:31:11.760834 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-tjvng" event={"ID":"dc08b8d0-e577-4674-9ca5-b1a02818725c","Type":"ContainerStarted","Data":"3f7f392e235143b77368665dd884931fc858bf9b92d56ef3f16ad3d13b1da6e9"} Feb 19 23:31:12 crc kubenswrapper[4795]: I0219 23:31:12.773593 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-tjvng" event={"ID":"dc08b8d0-e577-4674-9ca5-b1a02818725c","Type":"ContainerStarted","Data":"99751b63fc1cb5589f18af4e12bb716b9914210a385140ae6ce01e4b1797e3f7"} Feb 19 23:31:12 crc kubenswrapper[4795]: I0219 23:31:12.798715 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-openstack-openstack-cell1-tjvng" podStartSLOduration=2.341933919 podStartE2EDuration="2.798695768s" podCreationTimestamp="2026-02-19 23:31:10 +0000 UTC" firstStartedPulling="2026-02-19 23:31:11.736492348 +0000 UTC m=+7382.929010212" lastFinishedPulling="2026-02-19 23:31:12.193254177 +0000 UTC m=+7383.385772061" observedRunningTime="2026-02-19 
23:31:12.792067062 +0000 UTC m=+7383.984584936" watchObservedRunningTime="2026-02-19 23:31:12.798695768 +0000 UTC m=+7383.991213632" Feb 19 23:31:16 crc kubenswrapper[4795]: I0219 23:31:16.111461 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4jw6z" Feb 19 23:31:16 crc kubenswrapper[4795]: I0219 23:31:16.112055 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4jw6z" Feb 19 23:31:16 crc kubenswrapper[4795]: I0219 23:31:16.159779 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4jw6z" Feb 19 23:31:16 crc kubenswrapper[4795]: I0219 23:31:16.856641 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4jw6z" Feb 19 23:31:16 crc kubenswrapper[4795]: I0219 23:31:16.898900 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4jw6z"] Feb 19 23:31:18 crc kubenswrapper[4795]: I0219 23:31:18.827805 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4jw6z" podUID="30103546-d69d-4d13-a174-02fa1187e597" containerName="registry-server" containerID="cri-o://70aa24356d7429627ef35aa91587e561fc9eef65e06cd4e84ebe1f25fdd8a61e" gracePeriod=2 Feb 19 23:31:19 crc kubenswrapper[4795]: I0219 23:31:19.353545 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4jw6z" Feb 19 23:31:19 crc kubenswrapper[4795]: I0219 23:31:19.418119 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xmnd8\" (UniqueName: \"kubernetes.io/projected/30103546-d69d-4d13-a174-02fa1187e597-kube-api-access-xmnd8\") pod \"30103546-d69d-4d13-a174-02fa1187e597\" (UID: \"30103546-d69d-4d13-a174-02fa1187e597\") " Feb 19 23:31:19 crc kubenswrapper[4795]: I0219 23:31:19.418459 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30103546-d69d-4d13-a174-02fa1187e597-utilities\") pod \"30103546-d69d-4d13-a174-02fa1187e597\" (UID: \"30103546-d69d-4d13-a174-02fa1187e597\") " Feb 19 23:31:19 crc kubenswrapper[4795]: I0219 23:31:19.418553 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30103546-d69d-4d13-a174-02fa1187e597-catalog-content\") pod \"30103546-d69d-4d13-a174-02fa1187e597\" (UID: \"30103546-d69d-4d13-a174-02fa1187e597\") " Feb 19 23:31:19 crc kubenswrapper[4795]: I0219 23:31:19.427027 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30103546-d69d-4d13-a174-02fa1187e597-kube-api-access-xmnd8" (OuterVolumeSpecName: "kube-api-access-xmnd8") pod "30103546-d69d-4d13-a174-02fa1187e597" (UID: "30103546-d69d-4d13-a174-02fa1187e597"). InnerVolumeSpecName "kube-api-access-xmnd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:31:19 crc kubenswrapper[4795]: I0219 23:31:19.431302 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30103546-d69d-4d13-a174-02fa1187e597-utilities" (OuterVolumeSpecName: "utilities") pod "30103546-d69d-4d13-a174-02fa1187e597" (UID: "30103546-d69d-4d13-a174-02fa1187e597"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:31:19 crc kubenswrapper[4795]: I0219 23:31:19.490408 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30103546-d69d-4d13-a174-02fa1187e597-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "30103546-d69d-4d13-a174-02fa1187e597" (UID: "30103546-d69d-4d13-a174-02fa1187e597"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:31:19 crc kubenswrapper[4795]: I0219 23:31:19.524693 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30103546-d69d-4d13-a174-02fa1187e597-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 23:31:19 crc kubenswrapper[4795]: I0219 23:31:19.525076 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30103546-d69d-4d13-a174-02fa1187e597-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 23:31:19 crc kubenswrapper[4795]: I0219 23:31:19.525093 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xmnd8\" (UniqueName: \"kubernetes.io/projected/30103546-d69d-4d13-a174-02fa1187e597-kube-api-access-xmnd8\") on node \"crc\" DevicePath \"\"" Feb 19 23:31:19 crc kubenswrapper[4795]: I0219 23:31:19.840676 4795 generic.go:334] "Generic (PLEG): container finished" podID="30103546-d69d-4d13-a174-02fa1187e597" containerID="70aa24356d7429627ef35aa91587e561fc9eef65e06cd4e84ebe1f25fdd8a61e" exitCode=0 Feb 19 23:31:19 crc kubenswrapper[4795]: I0219 23:31:19.840727 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4jw6z" event={"ID":"30103546-d69d-4d13-a174-02fa1187e597","Type":"ContainerDied","Data":"70aa24356d7429627ef35aa91587e561fc9eef65e06cd4e84ebe1f25fdd8a61e"} Feb 19 23:31:19 crc kubenswrapper[4795]: I0219 23:31:19.840756 4795 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-4jw6z" event={"ID":"30103546-d69d-4d13-a174-02fa1187e597","Type":"ContainerDied","Data":"28b22d09111ffcca0542200937ddfd1229cba19345a04a67d12fa0c5bb260243"} Feb 19 23:31:19 crc kubenswrapper[4795]: I0219 23:31:19.840773 4795 scope.go:117] "RemoveContainer" containerID="70aa24356d7429627ef35aa91587e561fc9eef65e06cd4e84ebe1f25fdd8a61e" Feb 19 23:31:19 crc kubenswrapper[4795]: I0219 23:31:19.840810 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4jw6z" Feb 19 23:31:19 crc kubenswrapper[4795]: I0219 23:31:19.876625 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4jw6z"] Feb 19 23:31:19 crc kubenswrapper[4795]: I0219 23:31:19.886915 4795 scope.go:117] "RemoveContainer" containerID="0a748c20bed635afacbf89f615ab99c17dc3036e08b3b76bdd24e0b202af8217" Feb 19 23:31:19 crc kubenswrapper[4795]: I0219 23:31:19.897873 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4jw6z"] Feb 19 23:31:19 crc kubenswrapper[4795]: I0219 23:31:19.921199 4795 scope.go:117] "RemoveContainer" containerID="6320cb7221b284cfb2c72f7044097ab1e39d846ff23a6245278f7572a7eca7f7" Feb 19 23:31:19 crc kubenswrapper[4795]: I0219 23:31:19.955828 4795 scope.go:117] "RemoveContainer" containerID="70aa24356d7429627ef35aa91587e561fc9eef65e06cd4e84ebe1f25fdd8a61e" Feb 19 23:31:19 crc kubenswrapper[4795]: E0219 23:31:19.956334 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70aa24356d7429627ef35aa91587e561fc9eef65e06cd4e84ebe1f25fdd8a61e\": container with ID starting with 70aa24356d7429627ef35aa91587e561fc9eef65e06cd4e84ebe1f25fdd8a61e not found: ID does not exist" containerID="70aa24356d7429627ef35aa91587e561fc9eef65e06cd4e84ebe1f25fdd8a61e" Feb 19 23:31:19 crc kubenswrapper[4795]: I0219 
23:31:19.956455 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70aa24356d7429627ef35aa91587e561fc9eef65e06cd4e84ebe1f25fdd8a61e"} err="failed to get container status \"70aa24356d7429627ef35aa91587e561fc9eef65e06cd4e84ebe1f25fdd8a61e\": rpc error: code = NotFound desc = could not find container \"70aa24356d7429627ef35aa91587e561fc9eef65e06cd4e84ebe1f25fdd8a61e\": container with ID starting with 70aa24356d7429627ef35aa91587e561fc9eef65e06cd4e84ebe1f25fdd8a61e not found: ID does not exist" Feb 19 23:31:19 crc kubenswrapper[4795]: I0219 23:31:19.956482 4795 scope.go:117] "RemoveContainer" containerID="0a748c20bed635afacbf89f615ab99c17dc3036e08b3b76bdd24e0b202af8217" Feb 19 23:31:19 crc kubenswrapper[4795]: E0219 23:31:19.957382 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a748c20bed635afacbf89f615ab99c17dc3036e08b3b76bdd24e0b202af8217\": container with ID starting with 0a748c20bed635afacbf89f615ab99c17dc3036e08b3b76bdd24e0b202af8217 not found: ID does not exist" containerID="0a748c20bed635afacbf89f615ab99c17dc3036e08b3b76bdd24e0b202af8217" Feb 19 23:31:19 crc kubenswrapper[4795]: I0219 23:31:19.957436 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a748c20bed635afacbf89f615ab99c17dc3036e08b3b76bdd24e0b202af8217"} err="failed to get container status \"0a748c20bed635afacbf89f615ab99c17dc3036e08b3b76bdd24e0b202af8217\": rpc error: code = NotFound desc = could not find container \"0a748c20bed635afacbf89f615ab99c17dc3036e08b3b76bdd24e0b202af8217\": container with ID starting with 0a748c20bed635afacbf89f615ab99c17dc3036e08b3b76bdd24e0b202af8217 not found: ID does not exist" Feb 19 23:31:19 crc kubenswrapper[4795]: I0219 23:31:19.957456 4795 scope.go:117] "RemoveContainer" containerID="6320cb7221b284cfb2c72f7044097ab1e39d846ff23a6245278f7572a7eca7f7" Feb 19 23:31:19 crc 
kubenswrapper[4795]: E0219 23:31:19.957768 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6320cb7221b284cfb2c72f7044097ab1e39d846ff23a6245278f7572a7eca7f7\": container with ID starting with 6320cb7221b284cfb2c72f7044097ab1e39d846ff23a6245278f7572a7eca7f7 not found: ID does not exist" containerID="6320cb7221b284cfb2c72f7044097ab1e39d846ff23a6245278f7572a7eca7f7" Feb 19 23:31:19 crc kubenswrapper[4795]: I0219 23:31:19.957807 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6320cb7221b284cfb2c72f7044097ab1e39d846ff23a6245278f7572a7eca7f7"} err="failed to get container status \"6320cb7221b284cfb2c72f7044097ab1e39d846ff23a6245278f7572a7eca7f7\": rpc error: code = NotFound desc = could not find container \"6320cb7221b284cfb2c72f7044097ab1e39d846ff23a6245278f7572a7eca7f7\": container with ID starting with 6320cb7221b284cfb2c72f7044097ab1e39d846ff23a6245278f7572a7eca7f7 not found: ID does not exist" Feb 19 23:31:21 crc kubenswrapper[4795]: I0219 23:31:21.525440 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30103546-d69d-4d13-a174-02fa1187e597" path="/var/lib/kubelet/pods/30103546-d69d-4d13-a174-02fa1187e597/volumes" Feb 19 23:31:31 crc kubenswrapper[4795]: I0219 23:31:31.979283 4795 generic.go:334] "Generic (PLEG): container finished" podID="dc08b8d0-e577-4674-9ca5-b1a02818725c" containerID="99751b63fc1cb5589f18af4e12bb716b9914210a385140ae6ce01e4b1797e3f7" exitCode=0 Feb 19 23:31:31 crc kubenswrapper[4795]: I0219 23:31:31.979390 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-tjvng" event={"ID":"dc08b8d0-e577-4674-9ca5-b1a02818725c","Type":"ContainerDied","Data":"99751b63fc1cb5589f18af4e12bb716b9914210a385140ae6ce01e4b1797e3f7"} Feb 19 23:31:33 crc kubenswrapper[4795]: I0219 23:31:33.473466 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-tjvng" Feb 19 23:31:33 crc kubenswrapper[4795]: I0219 23:31:33.614809 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-inventory\") pod \"dc08b8d0-e577-4674-9ca5-b1a02818725c\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " Feb 19 23:31:33 crc kubenswrapper[4795]: I0219 23:31:33.615083 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-neutron-metadata-combined-ca-bundle\") pod \"dc08b8d0-e577-4674-9ca5-b1a02818725c\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " Feb 19 23:31:33 crc kubenswrapper[4795]: I0219 23:31:33.615185 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-ovn-combined-ca-bundle\") pod \"dc08b8d0-e577-4674-9ca5-b1a02818725c\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " Feb 19 23:31:33 crc kubenswrapper[4795]: I0219 23:31:33.615262 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-neutron-sriov-combined-ca-bundle\") pod \"dc08b8d0-e577-4674-9ca5-b1a02818725c\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " Feb 19 23:31:33 crc kubenswrapper[4795]: I0219 23:31:33.615298 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqddm\" (UniqueName: \"kubernetes.io/projected/dc08b8d0-e577-4674-9ca5-b1a02818725c-kube-api-access-mqddm\") pod \"dc08b8d0-e577-4674-9ca5-b1a02818725c\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " Feb 19 23:31:33 crc kubenswrapper[4795]: I0219 
23:31:33.615347 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-bootstrap-combined-ca-bundle\") pod \"dc08b8d0-e577-4674-9ca5-b1a02818725c\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " Feb 19 23:31:33 crc kubenswrapper[4795]: I0219 23:31:33.615364 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-nova-combined-ca-bundle\") pod \"dc08b8d0-e577-4674-9ca5-b1a02818725c\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " Feb 19 23:31:33 crc kubenswrapper[4795]: I0219 23:31:33.615465 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-libvirt-combined-ca-bundle\") pod \"dc08b8d0-e577-4674-9ca5-b1a02818725c\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " Feb 19 23:31:33 crc kubenswrapper[4795]: I0219 23:31:33.615547 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-telemetry-combined-ca-bundle\") pod \"dc08b8d0-e577-4674-9ca5-b1a02818725c\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " Feb 19 23:31:33 crc kubenswrapper[4795]: I0219 23:31:33.615582 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-ceph\") pod \"dc08b8d0-e577-4674-9ca5-b1a02818725c\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " Feb 19 23:31:33 crc kubenswrapper[4795]: I0219 23:31:33.615619 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: 
\"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-ssh-key-openstack-cell1\") pod \"dc08b8d0-e577-4674-9ca5-b1a02818725c\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " Feb 19 23:31:33 crc kubenswrapper[4795]: I0219 23:31:33.615651 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-neutron-dhcp-combined-ca-bundle\") pod \"dc08b8d0-e577-4674-9ca5-b1a02818725c\" (UID: \"dc08b8d0-e577-4674-9ca5-b1a02818725c\") " Feb 19 23:31:33 crc kubenswrapper[4795]: I0219 23:31:33.622414 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "dc08b8d0-e577-4674-9ca5-b1a02818725c" (UID: "dc08b8d0-e577-4674-9ca5-b1a02818725c"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:31:33 crc kubenswrapper[4795]: I0219 23:31:33.623070 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "dc08b8d0-e577-4674-9ca5-b1a02818725c" (UID: "dc08b8d0-e577-4674-9ca5-b1a02818725c"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:31:33 crc kubenswrapper[4795]: I0219 23:31:33.623388 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "dc08b8d0-e577-4674-9ca5-b1a02818725c" (UID: "dc08b8d0-e577-4674-9ca5-b1a02818725c"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:31:33 crc kubenswrapper[4795]: I0219 23:31:33.623958 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc08b8d0-e577-4674-9ca5-b1a02818725c-kube-api-access-mqddm" (OuterVolumeSpecName: "kube-api-access-mqddm") pod "dc08b8d0-e577-4674-9ca5-b1a02818725c" (UID: "dc08b8d0-e577-4674-9ca5-b1a02818725c"). InnerVolumeSpecName "kube-api-access-mqddm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:31:33 crc kubenswrapper[4795]: I0219 23:31:33.624691 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-ceph" (OuterVolumeSpecName: "ceph") pod "dc08b8d0-e577-4674-9ca5-b1a02818725c" (UID: "dc08b8d0-e577-4674-9ca5-b1a02818725c"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:31:33 crc kubenswrapper[4795]: I0219 23:31:33.624724 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "dc08b8d0-e577-4674-9ca5-b1a02818725c" (UID: "dc08b8d0-e577-4674-9ca5-b1a02818725c"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:31:33 crc kubenswrapper[4795]: I0219 23:31:33.625072 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "dc08b8d0-e577-4674-9ca5-b1a02818725c" (UID: "dc08b8d0-e577-4674-9ca5-b1a02818725c"). InnerVolumeSpecName "libvirt-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:31:33 crc kubenswrapper[4795]: I0219 23:31:33.626779 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "dc08b8d0-e577-4674-9ca5-b1a02818725c" (UID: "dc08b8d0-e577-4674-9ca5-b1a02818725c"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:31:33 crc kubenswrapper[4795]: I0219 23:31:33.626909 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "dc08b8d0-e577-4674-9ca5-b1a02818725c" (UID: "dc08b8d0-e577-4674-9ca5-b1a02818725c"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:31:33 crc kubenswrapper[4795]: I0219 23:31:33.627531 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "dc08b8d0-e577-4674-9ca5-b1a02818725c" (UID: "dc08b8d0-e577-4674-9ca5-b1a02818725c"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:31:33 crc kubenswrapper[4795]: I0219 23:31:33.647110 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-inventory" (OuterVolumeSpecName: "inventory") pod "dc08b8d0-e577-4674-9ca5-b1a02818725c" (UID: "dc08b8d0-e577-4674-9ca5-b1a02818725c"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:31:33 crc kubenswrapper[4795]: I0219 23:31:33.660460 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "dc08b8d0-e577-4674-9ca5-b1a02818725c" (UID: "dc08b8d0-e577-4674-9ca5-b1a02818725c"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:31:33 crc kubenswrapper[4795]: I0219 23:31:33.718904 4795 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:31:33 crc kubenswrapper[4795]: I0219 23:31:33.718941 4795 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-ceph\") on node \"crc\" DevicePath \"\"" Feb 19 23:31:33 crc kubenswrapper[4795]: I0219 23:31:33.718952 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 19 23:31:33 crc kubenswrapper[4795]: I0219 23:31:33.718963 4795 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:31:33 crc kubenswrapper[4795]: I0219 23:31:33.718974 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 23:31:33 crc kubenswrapper[4795]: I0219 23:31:33.718984 4795 reconciler_common.go:293] "Volume 
detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:31:33 crc kubenswrapper[4795]: I0219 23:31:33.718994 4795 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:31:33 crc kubenswrapper[4795]: I0219 23:31:33.719002 4795 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:31:33 crc kubenswrapper[4795]: I0219 23:31:33.719011 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqddm\" (UniqueName: \"kubernetes.io/projected/dc08b8d0-e577-4674-9ca5-b1a02818725c-kube-api-access-mqddm\") on node \"crc\" DevicePath \"\"" Feb 19 23:31:33 crc kubenswrapper[4795]: I0219 23:31:33.719020 4795 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:31:33 crc kubenswrapper[4795]: I0219 23:31:33.719027 4795 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:31:33 crc kubenswrapper[4795]: I0219 23:31:33.719035 4795 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc08b8d0-e577-4674-9ca5-b1a02818725c-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:31:34 crc kubenswrapper[4795]: I0219 23:31:34.000803 
4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-tjvng" event={"ID":"dc08b8d0-e577-4674-9ca5-b1a02818725c","Type":"ContainerDied","Data":"3f7f392e235143b77368665dd884931fc858bf9b92d56ef3f16ad3d13b1da6e9"} Feb 19 23:31:34 crc kubenswrapper[4795]: I0219 23:31:34.000874 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f7f392e235143b77368665dd884931fc858bf9b92d56ef3f16ad3d13b1da6e9" Feb 19 23:31:34 crc kubenswrapper[4795]: I0219 23:31:34.000906 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-tjvng" Feb 19 23:31:34 crc kubenswrapper[4795]: I0219 23:31:34.086875 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-zjjhg"] Feb 19 23:31:34 crc kubenswrapper[4795]: E0219 23:31:34.087309 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30103546-d69d-4d13-a174-02fa1187e597" containerName="registry-server" Feb 19 23:31:34 crc kubenswrapper[4795]: I0219 23:31:34.087324 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="30103546-d69d-4d13-a174-02fa1187e597" containerName="registry-server" Feb 19 23:31:34 crc kubenswrapper[4795]: E0219 23:31:34.087334 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc08b8d0-e577-4674-9ca5-b1a02818725c" containerName="install-certs-openstack-openstack-cell1" Feb 19 23:31:34 crc kubenswrapper[4795]: I0219 23:31:34.087341 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc08b8d0-e577-4674-9ca5-b1a02818725c" containerName="install-certs-openstack-openstack-cell1" Feb 19 23:31:34 crc kubenswrapper[4795]: E0219 23:31:34.087361 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30103546-d69d-4d13-a174-02fa1187e597" containerName="extract-utilities" Feb 19 23:31:34 crc kubenswrapper[4795]: I0219 23:31:34.087367 4795 
state_mem.go:107] "Deleted CPUSet assignment" podUID="30103546-d69d-4d13-a174-02fa1187e597" containerName="extract-utilities" Feb 19 23:31:34 crc kubenswrapper[4795]: E0219 23:31:34.087381 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30103546-d69d-4d13-a174-02fa1187e597" containerName="extract-content" Feb 19 23:31:34 crc kubenswrapper[4795]: I0219 23:31:34.087386 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="30103546-d69d-4d13-a174-02fa1187e597" containerName="extract-content" Feb 19 23:31:34 crc kubenswrapper[4795]: I0219 23:31:34.087578 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="30103546-d69d-4d13-a174-02fa1187e597" containerName="registry-server" Feb 19 23:31:34 crc kubenswrapper[4795]: I0219 23:31:34.087603 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc08b8d0-e577-4674-9ca5-b1a02818725c" containerName="install-certs-openstack-openstack-cell1" Feb 19 23:31:34 crc kubenswrapper[4795]: I0219 23:31:34.088498 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-zjjhg" Feb 19 23:31:34 crc kubenswrapper[4795]: I0219 23:31:34.093024 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 19 23:31:34 crc kubenswrapper[4795]: I0219 23:31:34.093240 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-brdnv" Feb 19 23:31:34 crc kubenswrapper[4795]: I0219 23:31:34.093311 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 23:31:34 crc kubenswrapper[4795]: I0219 23:31:34.093350 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 19 23:31:34 crc kubenswrapper[4795]: I0219 23:31:34.107127 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-zjjhg"] Feb 19 23:31:34 crc kubenswrapper[4795]: I0219 23:31:34.228113 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/532484aa-8294-4c2d-b257-082b09bafb14-ssh-key-openstack-cell1\") pod \"ceph-client-openstack-openstack-cell1-zjjhg\" (UID: \"532484aa-8294-4c2d-b257-082b09bafb14\") " pod="openstack/ceph-client-openstack-openstack-cell1-zjjhg" Feb 19 23:31:34 crc kubenswrapper[4795]: I0219 23:31:34.228221 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6zjw\" (UniqueName: \"kubernetes.io/projected/532484aa-8294-4c2d-b257-082b09bafb14-kube-api-access-j6zjw\") pod \"ceph-client-openstack-openstack-cell1-zjjhg\" (UID: \"532484aa-8294-4c2d-b257-082b09bafb14\") " pod="openstack/ceph-client-openstack-openstack-cell1-zjjhg" Feb 19 23:31:34 crc kubenswrapper[4795]: I0219 23:31:34.228320 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/532484aa-8294-4c2d-b257-082b09bafb14-ceph\") pod \"ceph-client-openstack-openstack-cell1-zjjhg\" (UID: \"532484aa-8294-4c2d-b257-082b09bafb14\") " pod="openstack/ceph-client-openstack-openstack-cell1-zjjhg" Feb 19 23:31:34 crc kubenswrapper[4795]: I0219 23:31:34.228379 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/532484aa-8294-4c2d-b257-082b09bafb14-inventory\") pod \"ceph-client-openstack-openstack-cell1-zjjhg\" (UID: \"532484aa-8294-4c2d-b257-082b09bafb14\") " pod="openstack/ceph-client-openstack-openstack-cell1-zjjhg" Feb 19 23:31:34 crc kubenswrapper[4795]: I0219 23:31:34.330730 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/532484aa-8294-4c2d-b257-082b09bafb14-inventory\") pod \"ceph-client-openstack-openstack-cell1-zjjhg\" (UID: \"532484aa-8294-4c2d-b257-082b09bafb14\") " pod="openstack/ceph-client-openstack-openstack-cell1-zjjhg" Feb 19 23:31:34 crc kubenswrapper[4795]: I0219 23:31:34.330829 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/532484aa-8294-4c2d-b257-082b09bafb14-ssh-key-openstack-cell1\") pod \"ceph-client-openstack-openstack-cell1-zjjhg\" (UID: \"532484aa-8294-4c2d-b257-082b09bafb14\") " pod="openstack/ceph-client-openstack-openstack-cell1-zjjhg" Feb 19 23:31:34 crc kubenswrapper[4795]: I0219 23:31:34.330902 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6zjw\" (UniqueName: \"kubernetes.io/projected/532484aa-8294-4c2d-b257-082b09bafb14-kube-api-access-j6zjw\") pod \"ceph-client-openstack-openstack-cell1-zjjhg\" (UID: \"532484aa-8294-4c2d-b257-082b09bafb14\") " 
pod="openstack/ceph-client-openstack-openstack-cell1-zjjhg" Feb 19 23:31:34 crc kubenswrapper[4795]: I0219 23:31:34.331027 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/532484aa-8294-4c2d-b257-082b09bafb14-ceph\") pod \"ceph-client-openstack-openstack-cell1-zjjhg\" (UID: \"532484aa-8294-4c2d-b257-082b09bafb14\") " pod="openstack/ceph-client-openstack-openstack-cell1-zjjhg" Feb 19 23:31:34 crc kubenswrapper[4795]: I0219 23:31:34.334292 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/532484aa-8294-4c2d-b257-082b09bafb14-inventory\") pod \"ceph-client-openstack-openstack-cell1-zjjhg\" (UID: \"532484aa-8294-4c2d-b257-082b09bafb14\") " pod="openstack/ceph-client-openstack-openstack-cell1-zjjhg" Feb 19 23:31:34 crc kubenswrapper[4795]: I0219 23:31:34.334817 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/532484aa-8294-4c2d-b257-082b09bafb14-ssh-key-openstack-cell1\") pod \"ceph-client-openstack-openstack-cell1-zjjhg\" (UID: \"532484aa-8294-4c2d-b257-082b09bafb14\") " pod="openstack/ceph-client-openstack-openstack-cell1-zjjhg" Feb 19 23:31:34 crc kubenswrapper[4795]: I0219 23:31:34.336014 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/532484aa-8294-4c2d-b257-082b09bafb14-ceph\") pod \"ceph-client-openstack-openstack-cell1-zjjhg\" (UID: \"532484aa-8294-4c2d-b257-082b09bafb14\") " pod="openstack/ceph-client-openstack-openstack-cell1-zjjhg" Feb 19 23:31:34 crc kubenswrapper[4795]: I0219 23:31:34.347019 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6zjw\" (UniqueName: \"kubernetes.io/projected/532484aa-8294-4c2d-b257-082b09bafb14-kube-api-access-j6zjw\") pod \"ceph-client-openstack-openstack-cell1-zjjhg\" (UID: 
\"532484aa-8294-4c2d-b257-082b09bafb14\") " pod="openstack/ceph-client-openstack-openstack-cell1-zjjhg" Feb 19 23:31:34 crc kubenswrapper[4795]: I0219 23:31:34.406970 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-zjjhg" Feb 19 23:31:35 crc kubenswrapper[4795]: I0219 23:31:34.965145 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-zjjhg"] Feb 19 23:31:35 crc kubenswrapper[4795]: I0219 23:31:35.013109 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-zjjhg" event={"ID":"532484aa-8294-4c2d-b257-082b09bafb14","Type":"ContainerStarted","Data":"d2a5200be3bac39a3f54995b67133a5b52bf7a3d5949fc27caceb27198cda798"} Feb 19 23:31:36 crc kubenswrapper[4795]: I0219 23:31:36.024419 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-zjjhg" event={"ID":"532484aa-8294-4c2d-b257-082b09bafb14","Type":"ContainerStarted","Data":"33dcf59ba0f29e3e31575d7b58b985e030ba3d84d5bfa34eba4562c7de2dda67"} Feb 19 23:31:36 crc kubenswrapper[4795]: I0219 23:31:36.059902 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-client-openstack-openstack-cell1-zjjhg" podStartSLOduration=1.625393275 podStartE2EDuration="2.059881007s" podCreationTimestamp="2026-02-19 23:31:34 +0000 UTC" firstStartedPulling="2026-02-19 23:31:34.973857246 +0000 UTC m=+7406.166375120" lastFinishedPulling="2026-02-19 23:31:35.408344988 +0000 UTC m=+7406.600862852" observedRunningTime="2026-02-19 23:31:36.051342236 +0000 UTC m=+7407.243860100" watchObservedRunningTime="2026-02-19 23:31:36.059881007 +0000 UTC m=+7407.252398861" Feb 19 23:31:41 crc kubenswrapper[4795]: I0219 23:31:41.081139 4795 generic.go:334] "Generic (PLEG): container finished" podID="532484aa-8294-4c2d-b257-082b09bafb14" 
containerID="33dcf59ba0f29e3e31575d7b58b985e030ba3d84d5bfa34eba4562c7de2dda67" exitCode=0 Feb 19 23:31:41 crc kubenswrapper[4795]: I0219 23:31:41.081202 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-zjjhg" event={"ID":"532484aa-8294-4c2d-b257-082b09bafb14","Type":"ContainerDied","Data":"33dcf59ba0f29e3e31575d7b58b985e030ba3d84d5bfa34eba4562c7de2dda67"} Feb 19 23:31:42 crc kubenswrapper[4795]: I0219 23:31:42.613898 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-zjjhg" Feb 19 23:31:42 crc kubenswrapper[4795]: I0219 23:31:42.717608 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6zjw\" (UniqueName: \"kubernetes.io/projected/532484aa-8294-4c2d-b257-082b09bafb14-kube-api-access-j6zjw\") pod \"532484aa-8294-4c2d-b257-082b09bafb14\" (UID: \"532484aa-8294-4c2d-b257-082b09bafb14\") " Feb 19 23:31:42 crc kubenswrapper[4795]: I0219 23:31:42.718095 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/532484aa-8294-4c2d-b257-082b09bafb14-ceph\") pod \"532484aa-8294-4c2d-b257-082b09bafb14\" (UID: \"532484aa-8294-4c2d-b257-082b09bafb14\") " Feb 19 23:31:42 crc kubenswrapper[4795]: I0219 23:31:42.718216 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/532484aa-8294-4c2d-b257-082b09bafb14-ssh-key-openstack-cell1\") pod \"532484aa-8294-4c2d-b257-082b09bafb14\" (UID: \"532484aa-8294-4c2d-b257-082b09bafb14\") " Feb 19 23:31:42 crc kubenswrapper[4795]: I0219 23:31:42.718246 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/532484aa-8294-4c2d-b257-082b09bafb14-inventory\") pod \"532484aa-8294-4c2d-b257-082b09bafb14\" (UID: 
\"532484aa-8294-4c2d-b257-082b09bafb14\") " Feb 19 23:31:42 crc kubenswrapper[4795]: I0219 23:31:42.723837 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/532484aa-8294-4c2d-b257-082b09bafb14-ceph" (OuterVolumeSpecName: "ceph") pod "532484aa-8294-4c2d-b257-082b09bafb14" (UID: "532484aa-8294-4c2d-b257-082b09bafb14"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:31:42 crc kubenswrapper[4795]: I0219 23:31:42.725873 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/532484aa-8294-4c2d-b257-082b09bafb14-kube-api-access-j6zjw" (OuterVolumeSpecName: "kube-api-access-j6zjw") pod "532484aa-8294-4c2d-b257-082b09bafb14" (UID: "532484aa-8294-4c2d-b257-082b09bafb14"). InnerVolumeSpecName "kube-api-access-j6zjw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:31:42 crc kubenswrapper[4795]: I0219 23:31:42.751835 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/532484aa-8294-4c2d-b257-082b09bafb14-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "532484aa-8294-4c2d-b257-082b09bafb14" (UID: "532484aa-8294-4c2d-b257-082b09bafb14"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:31:42 crc kubenswrapper[4795]: I0219 23:31:42.780801 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/532484aa-8294-4c2d-b257-082b09bafb14-inventory" (OuterVolumeSpecName: "inventory") pod "532484aa-8294-4c2d-b257-082b09bafb14" (UID: "532484aa-8294-4c2d-b257-082b09bafb14"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:31:42 crc kubenswrapper[4795]: I0219 23:31:42.820371 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6zjw\" (UniqueName: \"kubernetes.io/projected/532484aa-8294-4c2d-b257-082b09bafb14-kube-api-access-j6zjw\") on node \"crc\" DevicePath \"\"" Feb 19 23:31:42 crc kubenswrapper[4795]: I0219 23:31:42.820400 4795 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/532484aa-8294-4c2d-b257-082b09bafb14-ceph\") on node \"crc\" DevicePath \"\"" Feb 19 23:31:42 crc kubenswrapper[4795]: I0219 23:31:42.820410 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/532484aa-8294-4c2d-b257-082b09bafb14-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 19 23:31:42 crc kubenswrapper[4795]: I0219 23:31:42.820420 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/532484aa-8294-4c2d-b257-082b09bafb14-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 23:31:43 crc kubenswrapper[4795]: I0219 23:31:43.101707 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-zjjhg" event={"ID":"532484aa-8294-4c2d-b257-082b09bafb14","Type":"ContainerDied","Data":"d2a5200be3bac39a3f54995b67133a5b52bf7a3d5949fc27caceb27198cda798"} Feb 19 23:31:43 crc kubenswrapper[4795]: I0219 23:31:43.101749 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d2a5200be3bac39a3f54995b67133a5b52bf7a3d5949fc27caceb27198cda798" Feb 19 23:31:43 crc kubenswrapper[4795]: I0219 23:31:43.101752 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-zjjhg" Feb 19 23:31:43 crc kubenswrapper[4795]: I0219 23:31:43.223292 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-openstack-openstack-cell1-b9rq5"] Feb 19 23:31:43 crc kubenswrapper[4795]: E0219 23:31:43.224950 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="532484aa-8294-4c2d-b257-082b09bafb14" containerName="ceph-client-openstack-openstack-cell1" Feb 19 23:31:43 crc kubenswrapper[4795]: I0219 23:31:43.224987 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="532484aa-8294-4c2d-b257-082b09bafb14" containerName="ceph-client-openstack-openstack-cell1" Feb 19 23:31:43 crc kubenswrapper[4795]: I0219 23:31:43.226439 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="532484aa-8294-4c2d-b257-082b09bafb14" containerName="ceph-client-openstack-openstack-cell1" Feb 19 23:31:43 crc kubenswrapper[4795]: I0219 23:31:43.228078 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-b9rq5" Feb 19 23:31:43 crc kubenswrapper[4795]: I0219 23:31:43.231078 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 23:31:43 crc kubenswrapper[4795]: I0219 23:31:43.231110 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 19 23:31:43 crc kubenswrapper[4795]: I0219 23:31:43.231598 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Feb 19 23:31:43 crc kubenswrapper[4795]: I0219 23:31:43.231608 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-brdnv" Feb 19 23:31:43 crc kubenswrapper[4795]: I0219 23:31:43.233387 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 19 23:31:43 crc kubenswrapper[4795]: I0219 23:31:43.245588 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-b9rq5"] Feb 19 23:31:43 crc kubenswrapper[4795]: I0219 23:31:43.332963 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/d02efd94-2196-48fe-85d5-e2c65d186d6e-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-b9rq5\" (UID: \"d02efd94-2196-48fe-85d5-e2c65d186d6e\") " pod="openstack/ovn-openstack-openstack-cell1-b9rq5" Feb 19 23:31:43 crc kubenswrapper[4795]: I0219 23:31:43.333076 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d02efd94-2196-48fe-85d5-e2c65d186d6e-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-b9rq5\" (UID: \"d02efd94-2196-48fe-85d5-e2c65d186d6e\") " pod="openstack/ovn-openstack-openstack-cell1-b9rq5" Feb 19 
23:31:43 crc kubenswrapper[4795]: I0219 23:31:43.333116 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bvsj\" (UniqueName: \"kubernetes.io/projected/d02efd94-2196-48fe-85d5-e2c65d186d6e-kube-api-access-8bvsj\") pod \"ovn-openstack-openstack-cell1-b9rq5\" (UID: \"d02efd94-2196-48fe-85d5-e2c65d186d6e\") " pod="openstack/ovn-openstack-openstack-cell1-b9rq5" Feb 19 23:31:43 crc kubenswrapper[4795]: I0219 23:31:43.333199 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d02efd94-2196-48fe-85d5-e2c65d186d6e-ceph\") pod \"ovn-openstack-openstack-cell1-b9rq5\" (UID: \"d02efd94-2196-48fe-85d5-e2c65d186d6e\") " pod="openstack/ovn-openstack-openstack-cell1-b9rq5" Feb 19 23:31:43 crc kubenswrapper[4795]: I0219 23:31:43.333455 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d02efd94-2196-48fe-85d5-e2c65d186d6e-inventory\") pod \"ovn-openstack-openstack-cell1-b9rq5\" (UID: \"d02efd94-2196-48fe-85d5-e2c65d186d6e\") " pod="openstack/ovn-openstack-openstack-cell1-b9rq5" Feb 19 23:31:43 crc kubenswrapper[4795]: I0219 23:31:43.333580 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d02efd94-2196-48fe-85d5-e2c65d186d6e-ssh-key-openstack-cell1\") pod \"ovn-openstack-openstack-cell1-b9rq5\" (UID: \"d02efd94-2196-48fe-85d5-e2c65d186d6e\") " pod="openstack/ovn-openstack-openstack-cell1-b9rq5" Feb 19 23:31:43 crc kubenswrapper[4795]: I0219 23:31:43.435654 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d02efd94-2196-48fe-85d5-e2c65d186d6e-inventory\") pod \"ovn-openstack-openstack-cell1-b9rq5\" (UID: 
\"d02efd94-2196-48fe-85d5-e2c65d186d6e\") " pod="openstack/ovn-openstack-openstack-cell1-b9rq5" Feb 19 23:31:43 crc kubenswrapper[4795]: I0219 23:31:43.435710 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d02efd94-2196-48fe-85d5-e2c65d186d6e-ssh-key-openstack-cell1\") pod \"ovn-openstack-openstack-cell1-b9rq5\" (UID: \"d02efd94-2196-48fe-85d5-e2c65d186d6e\") " pod="openstack/ovn-openstack-openstack-cell1-b9rq5" Feb 19 23:31:43 crc kubenswrapper[4795]: I0219 23:31:43.435803 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/d02efd94-2196-48fe-85d5-e2c65d186d6e-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-b9rq5\" (UID: \"d02efd94-2196-48fe-85d5-e2c65d186d6e\") " pod="openstack/ovn-openstack-openstack-cell1-b9rq5" Feb 19 23:31:43 crc kubenswrapper[4795]: I0219 23:31:43.435837 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d02efd94-2196-48fe-85d5-e2c65d186d6e-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-b9rq5\" (UID: \"d02efd94-2196-48fe-85d5-e2c65d186d6e\") " pod="openstack/ovn-openstack-openstack-cell1-b9rq5" Feb 19 23:31:43 crc kubenswrapper[4795]: I0219 23:31:43.435855 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bvsj\" (UniqueName: \"kubernetes.io/projected/d02efd94-2196-48fe-85d5-e2c65d186d6e-kube-api-access-8bvsj\") pod \"ovn-openstack-openstack-cell1-b9rq5\" (UID: \"d02efd94-2196-48fe-85d5-e2c65d186d6e\") " pod="openstack/ovn-openstack-openstack-cell1-b9rq5" Feb 19 23:31:43 crc kubenswrapper[4795]: I0219 23:31:43.435894 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d02efd94-2196-48fe-85d5-e2c65d186d6e-ceph\") 
pod \"ovn-openstack-openstack-cell1-b9rq5\" (UID: \"d02efd94-2196-48fe-85d5-e2c65d186d6e\") " pod="openstack/ovn-openstack-openstack-cell1-b9rq5" Feb 19 23:31:43 crc kubenswrapper[4795]: I0219 23:31:43.437079 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/d02efd94-2196-48fe-85d5-e2c65d186d6e-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-b9rq5\" (UID: \"d02efd94-2196-48fe-85d5-e2c65d186d6e\") " pod="openstack/ovn-openstack-openstack-cell1-b9rq5" Feb 19 23:31:43 crc kubenswrapper[4795]: I0219 23:31:43.439804 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d02efd94-2196-48fe-85d5-e2c65d186d6e-inventory\") pod \"ovn-openstack-openstack-cell1-b9rq5\" (UID: \"d02efd94-2196-48fe-85d5-e2c65d186d6e\") " pod="openstack/ovn-openstack-openstack-cell1-b9rq5" Feb 19 23:31:43 crc kubenswrapper[4795]: I0219 23:31:43.440276 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d02efd94-2196-48fe-85d5-e2c65d186d6e-ceph\") pod \"ovn-openstack-openstack-cell1-b9rq5\" (UID: \"d02efd94-2196-48fe-85d5-e2c65d186d6e\") " pod="openstack/ovn-openstack-openstack-cell1-b9rq5" Feb 19 23:31:43 crc kubenswrapper[4795]: I0219 23:31:43.442116 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d02efd94-2196-48fe-85d5-e2c65d186d6e-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-b9rq5\" (UID: \"d02efd94-2196-48fe-85d5-e2c65d186d6e\") " pod="openstack/ovn-openstack-openstack-cell1-b9rq5" Feb 19 23:31:43 crc kubenswrapper[4795]: I0219 23:31:43.444734 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d02efd94-2196-48fe-85d5-e2c65d186d6e-ssh-key-openstack-cell1\") pod 
\"ovn-openstack-openstack-cell1-b9rq5\" (UID: \"d02efd94-2196-48fe-85d5-e2c65d186d6e\") " pod="openstack/ovn-openstack-openstack-cell1-b9rq5" Feb 19 23:31:43 crc kubenswrapper[4795]: I0219 23:31:43.452940 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bvsj\" (UniqueName: \"kubernetes.io/projected/d02efd94-2196-48fe-85d5-e2c65d186d6e-kube-api-access-8bvsj\") pod \"ovn-openstack-openstack-cell1-b9rq5\" (UID: \"d02efd94-2196-48fe-85d5-e2c65d186d6e\") " pod="openstack/ovn-openstack-openstack-cell1-b9rq5" Feb 19 23:31:43 crc kubenswrapper[4795]: I0219 23:31:43.551852 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-b9rq5" Feb 19 23:31:44 crc kubenswrapper[4795]: I0219 23:31:44.136892 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-b9rq5"] Feb 19 23:31:45 crc kubenswrapper[4795]: I0219 23:31:45.123515 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-b9rq5" event={"ID":"d02efd94-2196-48fe-85d5-e2c65d186d6e","Type":"ContainerStarted","Data":"e4c88993e055ef4e5ec56d3915d302612d8dc348bfd21bf85166259a8eaaba9e"} Feb 19 23:31:45 crc kubenswrapper[4795]: I0219 23:31:45.123880 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-b9rq5" event={"ID":"d02efd94-2196-48fe-85d5-e2c65d186d6e","Type":"ContainerStarted","Data":"6ea93587dfbb9217fefa8b1e6c8ed4ec98f16d4ee8f50c309e9ae240e35f3aa9"} Feb 19 23:31:45 crc kubenswrapper[4795]: I0219 23:31:45.148464 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-openstack-openstack-cell1-b9rq5" podStartSLOduration=1.6820629459999998 podStartE2EDuration="2.148446117s" podCreationTimestamp="2026-02-19 23:31:43 +0000 UTC" firstStartedPulling="2026-02-19 23:31:44.14700533 +0000 UTC m=+7415.339523194" lastFinishedPulling="2026-02-19 
23:31:44.613388501 +0000 UTC m=+7415.805906365" observedRunningTime="2026-02-19 23:31:45.145469443 +0000 UTC m=+7416.337987307" watchObservedRunningTime="2026-02-19 23:31:45.148446117 +0000 UTC m=+7416.340963981" Feb 19 23:32:28 crc kubenswrapper[4795]: I0219 23:32:28.427672 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 23:32:28 crc kubenswrapper[4795]: I0219 23:32:28.428427 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 23:32:48 crc kubenswrapper[4795]: I0219 23:32:48.780833 4795 generic.go:334] "Generic (PLEG): container finished" podID="d02efd94-2196-48fe-85d5-e2c65d186d6e" containerID="e4c88993e055ef4e5ec56d3915d302612d8dc348bfd21bf85166259a8eaaba9e" exitCode=0 Feb 19 23:32:48 crc kubenswrapper[4795]: I0219 23:32:48.780925 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-b9rq5" event={"ID":"d02efd94-2196-48fe-85d5-e2c65d186d6e","Type":"ContainerDied","Data":"e4c88993e055ef4e5ec56d3915d302612d8dc348bfd21bf85166259a8eaaba9e"} Feb 19 23:32:50 crc kubenswrapper[4795]: I0219 23:32:50.279049 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-b9rq5" Feb 19 23:32:50 crc kubenswrapper[4795]: I0219 23:32:50.403304 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bvsj\" (UniqueName: \"kubernetes.io/projected/d02efd94-2196-48fe-85d5-e2c65d186d6e-kube-api-access-8bvsj\") pod \"d02efd94-2196-48fe-85d5-e2c65d186d6e\" (UID: \"d02efd94-2196-48fe-85d5-e2c65d186d6e\") " Feb 19 23:32:50 crc kubenswrapper[4795]: I0219 23:32:50.403482 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d02efd94-2196-48fe-85d5-e2c65d186d6e-ssh-key-openstack-cell1\") pod \"d02efd94-2196-48fe-85d5-e2c65d186d6e\" (UID: \"d02efd94-2196-48fe-85d5-e2c65d186d6e\") " Feb 19 23:32:50 crc kubenswrapper[4795]: I0219 23:32:50.403519 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/d02efd94-2196-48fe-85d5-e2c65d186d6e-ovncontroller-config-0\") pod \"d02efd94-2196-48fe-85d5-e2c65d186d6e\" (UID: \"d02efd94-2196-48fe-85d5-e2c65d186d6e\") " Feb 19 23:32:50 crc kubenswrapper[4795]: I0219 23:32:50.403562 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d02efd94-2196-48fe-85d5-e2c65d186d6e-ceph\") pod \"d02efd94-2196-48fe-85d5-e2c65d186d6e\" (UID: \"d02efd94-2196-48fe-85d5-e2c65d186d6e\") " Feb 19 23:32:50 crc kubenswrapper[4795]: I0219 23:32:50.403613 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d02efd94-2196-48fe-85d5-e2c65d186d6e-ovn-combined-ca-bundle\") pod \"d02efd94-2196-48fe-85d5-e2c65d186d6e\" (UID: \"d02efd94-2196-48fe-85d5-e2c65d186d6e\") " Feb 19 23:32:50 crc kubenswrapper[4795]: I0219 23:32:50.403784 4795 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d02efd94-2196-48fe-85d5-e2c65d186d6e-inventory\") pod \"d02efd94-2196-48fe-85d5-e2c65d186d6e\" (UID: \"d02efd94-2196-48fe-85d5-e2c65d186d6e\") " Feb 19 23:32:50 crc kubenswrapper[4795]: I0219 23:32:50.409263 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d02efd94-2196-48fe-85d5-e2c65d186d6e-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "d02efd94-2196-48fe-85d5-e2c65d186d6e" (UID: "d02efd94-2196-48fe-85d5-e2c65d186d6e"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:32:50 crc kubenswrapper[4795]: I0219 23:32:50.413519 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d02efd94-2196-48fe-85d5-e2c65d186d6e-kube-api-access-8bvsj" (OuterVolumeSpecName: "kube-api-access-8bvsj") pod "d02efd94-2196-48fe-85d5-e2c65d186d6e" (UID: "d02efd94-2196-48fe-85d5-e2c65d186d6e"). InnerVolumeSpecName "kube-api-access-8bvsj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:32:50 crc kubenswrapper[4795]: I0219 23:32:50.413515 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d02efd94-2196-48fe-85d5-e2c65d186d6e-ceph" (OuterVolumeSpecName: "ceph") pod "d02efd94-2196-48fe-85d5-e2c65d186d6e" (UID: "d02efd94-2196-48fe-85d5-e2c65d186d6e"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:32:50 crc kubenswrapper[4795]: I0219 23:32:50.430682 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d02efd94-2196-48fe-85d5-e2c65d186d6e-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "d02efd94-2196-48fe-85d5-e2c65d186d6e" (UID: "d02efd94-2196-48fe-85d5-e2c65d186d6e"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:32:50 crc kubenswrapper[4795]: I0219 23:32:50.438661 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d02efd94-2196-48fe-85d5-e2c65d186d6e-inventory" (OuterVolumeSpecName: "inventory") pod "d02efd94-2196-48fe-85d5-e2c65d186d6e" (UID: "d02efd94-2196-48fe-85d5-e2c65d186d6e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:32:50 crc kubenswrapper[4795]: I0219 23:32:50.439193 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d02efd94-2196-48fe-85d5-e2c65d186d6e-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "d02efd94-2196-48fe-85d5-e2c65d186d6e" (UID: "d02efd94-2196-48fe-85d5-e2c65d186d6e"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:32:50 crc kubenswrapper[4795]: I0219 23:32:50.506535 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d02efd94-2196-48fe-85d5-e2c65d186d6e-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 19 23:32:50 crc kubenswrapper[4795]: I0219 23:32:50.506573 4795 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/d02efd94-2196-48fe-85d5-e2c65d186d6e-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Feb 19 23:32:50 crc kubenswrapper[4795]: I0219 23:32:50.506583 4795 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d02efd94-2196-48fe-85d5-e2c65d186d6e-ceph\") on node \"crc\" DevicePath \"\"" Feb 19 23:32:50 crc kubenswrapper[4795]: I0219 23:32:50.506593 4795 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d02efd94-2196-48fe-85d5-e2c65d186d6e-ovn-combined-ca-bundle\") 
on node \"crc\" DevicePath \"\"" Feb 19 23:32:50 crc kubenswrapper[4795]: I0219 23:32:50.506602 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d02efd94-2196-48fe-85d5-e2c65d186d6e-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 23:32:50 crc kubenswrapper[4795]: I0219 23:32:50.506610 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bvsj\" (UniqueName: \"kubernetes.io/projected/d02efd94-2196-48fe-85d5-e2c65d186d6e-kube-api-access-8bvsj\") on node \"crc\" DevicePath \"\"" Feb 19 23:32:50 crc kubenswrapper[4795]: I0219 23:32:50.807604 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-b9rq5" event={"ID":"d02efd94-2196-48fe-85d5-e2c65d186d6e","Type":"ContainerDied","Data":"6ea93587dfbb9217fefa8b1e6c8ed4ec98f16d4ee8f50c309e9ae240e35f3aa9"} Feb 19 23:32:50 crc kubenswrapper[4795]: I0219 23:32:50.807684 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ea93587dfbb9217fefa8b1e6c8ed4ec98f16d4ee8f50c309e9ae240e35f3aa9" Feb 19 23:32:50 crc kubenswrapper[4795]: I0219 23:32:50.807687 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-b9rq5" Feb 19 23:32:50 crc kubenswrapper[4795]: I0219 23:32:50.902501 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-5855g"] Feb 19 23:32:50 crc kubenswrapper[4795]: E0219 23:32:50.902992 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d02efd94-2196-48fe-85d5-e2c65d186d6e" containerName="ovn-openstack-openstack-cell1" Feb 19 23:32:50 crc kubenswrapper[4795]: I0219 23:32:50.903006 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d02efd94-2196-48fe-85d5-e2c65d186d6e" containerName="ovn-openstack-openstack-cell1" Feb 19 23:32:50 crc kubenswrapper[4795]: I0219 23:32:50.903251 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d02efd94-2196-48fe-85d5-e2c65d186d6e" containerName="ovn-openstack-openstack-cell1" Feb 19 23:32:50 crc kubenswrapper[4795]: I0219 23:32:50.904091 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-5855g" Feb 19 23:32:50 crc kubenswrapper[4795]: I0219 23:32:50.906083 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Feb 19 23:32:50 crc kubenswrapper[4795]: I0219 23:32:50.906129 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 23:32:50 crc kubenswrapper[4795]: I0219 23:32:50.906156 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Feb 19 23:32:50 crc kubenswrapper[4795]: I0219 23:32:50.906995 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 19 23:32:50 crc kubenswrapper[4795]: I0219 23:32:50.908867 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-brdnv" Feb 19 23:32:50 crc kubenswrapper[4795]: I0219 23:32:50.909570 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 19 23:32:50 crc kubenswrapper[4795]: I0219 23:32:50.917505 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-5855g"] Feb 19 23:32:51 crc kubenswrapper[4795]: I0219 23:32:51.016453 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a23f1a80-1645-454d-b9cf-e039928b84cb-ssh-key-openstack-cell1\") pod \"neutron-metadata-openstack-openstack-cell1-5855g\" (UID: \"a23f1a80-1645-454d-b9cf-e039928b84cb\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-5855g" Feb 19 23:32:51 crc kubenswrapper[4795]: I0219 23:32:51.017363 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/a23f1a80-1645-454d-b9cf-e039928b84cb-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-5855g\" (UID: \"a23f1a80-1645-454d-b9cf-e039928b84cb\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-5855g" Feb 19 23:32:51 crc kubenswrapper[4795]: I0219 23:32:51.017483 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a23f1a80-1645-454d-b9cf-e039928b84cb-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-5855g\" (UID: \"a23f1a80-1645-454d-b9cf-e039928b84cb\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-5855g" Feb 19 23:32:51 crc kubenswrapper[4795]: I0219 23:32:51.017666 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a23f1a80-1645-454d-b9cf-e039928b84cb-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-5855g\" (UID: \"a23f1a80-1645-454d-b9cf-e039928b84cb\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-5855g" Feb 19 23:32:51 crc kubenswrapper[4795]: I0219 23:32:51.017813 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rphwm\" (UniqueName: \"kubernetes.io/projected/a23f1a80-1645-454d-b9cf-e039928b84cb-kube-api-access-rphwm\") pod \"neutron-metadata-openstack-openstack-cell1-5855g\" (UID: \"a23f1a80-1645-454d-b9cf-e039928b84cb\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-5855g" Feb 19 23:32:51 crc kubenswrapper[4795]: I0219 23:32:51.017871 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a23f1a80-1645-454d-b9cf-e039928b84cb-ceph\") pod 
\"neutron-metadata-openstack-openstack-cell1-5855g\" (UID: \"a23f1a80-1645-454d-b9cf-e039928b84cb\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-5855g" Feb 19 23:32:51 crc kubenswrapper[4795]: I0219 23:32:51.017901 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a23f1a80-1645-454d-b9cf-e039928b84cb-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-5855g\" (UID: \"a23f1a80-1645-454d-b9cf-e039928b84cb\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-5855g" Feb 19 23:32:51 crc kubenswrapper[4795]: I0219 23:32:51.120342 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a23f1a80-1645-454d-b9cf-e039928b84cb-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-5855g\" (UID: \"a23f1a80-1645-454d-b9cf-e039928b84cb\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-5855g" Feb 19 23:32:51 crc kubenswrapper[4795]: I0219 23:32:51.120700 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rphwm\" (UniqueName: \"kubernetes.io/projected/a23f1a80-1645-454d-b9cf-e039928b84cb-kube-api-access-rphwm\") pod \"neutron-metadata-openstack-openstack-cell1-5855g\" (UID: \"a23f1a80-1645-454d-b9cf-e039928b84cb\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-5855g" Feb 19 23:32:51 crc kubenswrapper[4795]: I0219 23:32:51.120764 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a23f1a80-1645-454d-b9cf-e039928b84cb-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-5855g\" (UID: \"a23f1a80-1645-454d-b9cf-e039928b84cb\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-5855g" Feb 19 23:32:51 crc 
kubenswrapper[4795]: I0219 23:32:51.120794 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a23f1a80-1645-454d-b9cf-e039928b84cb-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-5855g\" (UID: \"a23f1a80-1645-454d-b9cf-e039928b84cb\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-5855g" Feb 19 23:32:51 crc kubenswrapper[4795]: I0219 23:32:51.120841 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a23f1a80-1645-454d-b9cf-e039928b84cb-ssh-key-openstack-cell1\") pod \"neutron-metadata-openstack-openstack-cell1-5855g\" (UID: \"a23f1a80-1645-454d-b9cf-e039928b84cb\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-5855g" Feb 19 23:32:51 crc kubenswrapper[4795]: I0219 23:32:51.120954 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a23f1a80-1645-454d-b9cf-e039928b84cb-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-5855g\" (UID: \"a23f1a80-1645-454d-b9cf-e039928b84cb\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-5855g" Feb 19 23:32:51 crc kubenswrapper[4795]: I0219 23:32:51.120989 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a23f1a80-1645-454d-b9cf-e039928b84cb-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-5855g\" (UID: \"a23f1a80-1645-454d-b9cf-e039928b84cb\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-5855g" Feb 19 23:32:51 crc kubenswrapper[4795]: I0219 23:32:51.126073 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/a23f1a80-1645-454d-b9cf-e039928b84cb-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-5855g\" (UID: \"a23f1a80-1645-454d-b9cf-e039928b84cb\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-5855g" Feb 19 23:32:51 crc kubenswrapper[4795]: I0219 23:32:51.126272 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a23f1a80-1645-454d-b9cf-e039928b84cb-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-5855g\" (UID: \"a23f1a80-1645-454d-b9cf-e039928b84cb\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-5855g" Feb 19 23:32:51 crc kubenswrapper[4795]: I0219 23:32:51.126281 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a23f1a80-1645-454d-b9cf-e039928b84cb-ssh-key-openstack-cell1\") pod \"neutron-metadata-openstack-openstack-cell1-5855g\" (UID: \"a23f1a80-1645-454d-b9cf-e039928b84cb\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-5855g" Feb 19 23:32:51 crc kubenswrapper[4795]: I0219 23:32:51.127809 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a23f1a80-1645-454d-b9cf-e039928b84cb-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-5855g\" (UID: \"a23f1a80-1645-454d-b9cf-e039928b84cb\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-5855g" Feb 19 23:32:51 crc kubenswrapper[4795]: I0219 23:32:51.128902 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a23f1a80-1645-454d-b9cf-e039928b84cb-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-5855g\" (UID: \"a23f1a80-1645-454d-b9cf-e039928b84cb\") " 
pod="openstack/neutron-metadata-openstack-openstack-cell1-5855g" Feb 19 23:32:51 crc kubenswrapper[4795]: I0219 23:32:51.131512 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a23f1a80-1645-454d-b9cf-e039928b84cb-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-5855g\" (UID: \"a23f1a80-1645-454d-b9cf-e039928b84cb\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-5855g" Feb 19 23:32:51 crc kubenswrapper[4795]: I0219 23:32:51.140052 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rphwm\" (UniqueName: \"kubernetes.io/projected/a23f1a80-1645-454d-b9cf-e039928b84cb-kube-api-access-rphwm\") pod \"neutron-metadata-openstack-openstack-cell1-5855g\" (UID: \"a23f1a80-1645-454d-b9cf-e039928b84cb\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-5855g" Feb 19 23:32:51 crc kubenswrapper[4795]: I0219 23:32:51.235222 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-5855g" Feb 19 23:32:51 crc kubenswrapper[4795]: I0219 23:32:51.780084 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-5855g"] Feb 19 23:32:51 crc kubenswrapper[4795]: I0219 23:32:51.818661 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-5855g" event={"ID":"a23f1a80-1645-454d-b9cf-e039928b84cb","Type":"ContainerStarted","Data":"ac532540ed1c9ed8ee4bab62b2b0f16733f15b6c586c9842153ae0b7941ac4df"} Feb 19 23:32:52 crc kubenswrapper[4795]: I0219 23:32:52.843844 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-5855g" event={"ID":"a23f1a80-1645-454d-b9cf-e039928b84cb","Type":"ContainerStarted","Data":"2e3e3e59d5c650a24ac4dc17cdcdbd825d499979952a0af456bf13fbe15d8292"} Feb 19 23:32:52 crc kubenswrapper[4795]: I0219 23:32:52.870227 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-openstack-openstack-cell1-5855g" podStartSLOduration=2.442282574 podStartE2EDuration="2.87021085s" podCreationTimestamp="2026-02-19 23:32:50 +0000 UTC" firstStartedPulling="2026-02-19 23:32:51.788950332 +0000 UTC m=+7482.981468196" lastFinishedPulling="2026-02-19 23:32:52.216878608 +0000 UTC m=+7483.409396472" observedRunningTime="2026-02-19 23:32:52.869766637 +0000 UTC m=+7484.062284511" watchObservedRunningTime="2026-02-19 23:32:52.87021085 +0000 UTC m=+7484.062728714" Feb 19 23:32:58 crc kubenswrapper[4795]: I0219 23:32:58.427922 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 23:32:58 crc kubenswrapper[4795]: I0219 
23:32:58.428906 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 23:32:58 crc kubenswrapper[4795]: I0219 23:32:58.580533 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nqrh8"] Feb 19 23:32:58 crc kubenswrapper[4795]: I0219 23:32:58.584217 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nqrh8" Feb 19 23:32:58 crc kubenswrapper[4795]: I0219 23:32:58.603469 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nqrh8"] Feb 19 23:32:58 crc kubenswrapper[4795]: I0219 23:32:58.698726 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b009444-b438-433d-8e2c-abc763e6f9ee-utilities\") pod \"certified-operators-nqrh8\" (UID: \"1b009444-b438-433d-8e2c-abc763e6f9ee\") " pod="openshift-marketplace/certified-operators-nqrh8" Feb 19 23:32:58 crc kubenswrapper[4795]: I0219 23:32:58.698816 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9w6h\" (UniqueName: \"kubernetes.io/projected/1b009444-b438-433d-8e2c-abc763e6f9ee-kube-api-access-g9w6h\") pod \"certified-operators-nqrh8\" (UID: \"1b009444-b438-433d-8e2c-abc763e6f9ee\") " pod="openshift-marketplace/certified-operators-nqrh8" Feb 19 23:32:58 crc kubenswrapper[4795]: I0219 23:32:58.698930 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b009444-b438-433d-8e2c-abc763e6f9ee-catalog-content\") pod 
\"certified-operators-nqrh8\" (UID: \"1b009444-b438-433d-8e2c-abc763e6f9ee\") " pod="openshift-marketplace/certified-operators-nqrh8" Feb 19 23:32:58 crc kubenswrapper[4795]: I0219 23:32:58.801011 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b009444-b438-433d-8e2c-abc763e6f9ee-catalog-content\") pod \"certified-operators-nqrh8\" (UID: \"1b009444-b438-433d-8e2c-abc763e6f9ee\") " pod="openshift-marketplace/certified-operators-nqrh8" Feb 19 23:32:58 crc kubenswrapper[4795]: I0219 23:32:58.801488 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b009444-b438-433d-8e2c-abc763e6f9ee-utilities\") pod \"certified-operators-nqrh8\" (UID: \"1b009444-b438-433d-8e2c-abc763e6f9ee\") " pod="openshift-marketplace/certified-operators-nqrh8" Feb 19 23:32:58 crc kubenswrapper[4795]: I0219 23:32:58.801566 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9w6h\" (UniqueName: \"kubernetes.io/projected/1b009444-b438-433d-8e2c-abc763e6f9ee-kube-api-access-g9w6h\") pod \"certified-operators-nqrh8\" (UID: \"1b009444-b438-433d-8e2c-abc763e6f9ee\") " pod="openshift-marketplace/certified-operators-nqrh8" Feb 19 23:32:58 crc kubenswrapper[4795]: I0219 23:32:58.801669 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b009444-b438-433d-8e2c-abc763e6f9ee-catalog-content\") pod \"certified-operators-nqrh8\" (UID: \"1b009444-b438-433d-8e2c-abc763e6f9ee\") " pod="openshift-marketplace/certified-operators-nqrh8" Feb 19 23:32:58 crc kubenswrapper[4795]: I0219 23:32:58.802017 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b009444-b438-433d-8e2c-abc763e6f9ee-utilities\") pod \"certified-operators-nqrh8\" (UID: 
\"1b009444-b438-433d-8e2c-abc763e6f9ee\") " pod="openshift-marketplace/certified-operators-nqrh8" Feb 19 23:32:58 crc kubenswrapper[4795]: I0219 23:32:58.823032 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9w6h\" (UniqueName: \"kubernetes.io/projected/1b009444-b438-433d-8e2c-abc763e6f9ee-kube-api-access-g9w6h\") pod \"certified-operators-nqrh8\" (UID: \"1b009444-b438-433d-8e2c-abc763e6f9ee\") " pod="openshift-marketplace/certified-operators-nqrh8" Feb 19 23:32:58 crc kubenswrapper[4795]: I0219 23:32:58.917427 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nqrh8" Feb 19 23:32:59 crc kubenswrapper[4795]: I0219 23:32:59.488494 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nqrh8"] Feb 19 23:32:59 crc kubenswrapper[4795]: I0219 23:32:59.914622 4795 generic.go:334] "Generic (PLEG): container finished" podID="1b009444-b438-433d-8e2c-abc763e6f9ee" containerID="40ff688b57b898b0b5419dec2eb7dae2f53df75e9918f4af50555fc1c62cce91" exitCode=0 Feb 19 23:32:59 crc kubenswrapper[4795]: I0219 23:32:59.914731 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nqrh8" event={"ID":"1b009444-b438-433d-8e2c-abc763e6f9ee","Type":"ContainerDied","Data":"40ff688b57b898b0b5419dec2eb7dae2f53df75e9918f4af50555fc1c62cce91"} Feb 19 23:32:59 crc kubenswrapper[4795]: I0219 23:32:59.914951 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nqrh8" event={"ID":"1b009444-b438-433d-8e2c-abc763e6f9ee","Type":"ContainerStarted","Data":"90136202358e702fa7897a42d741cd8a62535cb68b9b9944a6dabad76cb7864e"} Feb 19 23:33:00 crc kubenswrapper[4795]: I0219 23:33:00.934785 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nqrh8" 
event={"ID":"1b009444-b438-433d-8e2c-abc763e6f9ee","Type":"ContainerStarted","Data":"456e4cdd0fe6f92610c5ca2c4002e956bc698f4747fe3e78727e6f5a34ddfe93"} Feb 19 23:33:01 crc kubenswrapper[4795]: I0219 23:33:01.946965 4795 generic.go:334] "Generic (PLEG): container finished" podID="1b009444-b438-433d-8e2c-abc763e6f9ee" containerID="456e4cdd0fe6f92610c5ca2c4002e956bc698f4747fe3e78727e6f5a34ddfe93" exitCode=0 Feb 19 23:33:01 crc kubenswrapper[4795]: I0219 23:33:01.947068 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nqrh8" event={"ID":"1b009444-b438-433d-8e2c-abc763e6f9ee","Type":"ContainerDied","Data":"456e4cdd0fe6f92610c5ca2c4002e956bc698f4747fe3e78727e6f5a34ddfe93"} Feb 19 23:33:02 crc kubenswrapper[4795]: I0219 23:33:02.957898 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nqrh8" event={"ID":"1b009444-b438-433d-8e2c-abc763e6f9ee","Type":"ContainerStarted","Data":"b01a941fb86e6ae179a993966f808b1259b5941731dc4bfeff98b21fbe400935"} Feb 19 23:33:02 crc kubenswrapper[4795]: I0219 23:33:02.984624 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nqrh8" podStartSLOduration=2.537221318 podStartE2EDuration="4.984606414s" podCreationTimestamp="2026-02-19 23:32:58 +0000 UTC" firstStartedPulling="2026-02-19 23:32:59.917032981 +0000 UTC m=+7491.109550845" lastFinishedPulling="2026-02-19 23:33:02.364418077 +0000 UTC m=+7493.556935941" observedRunningTime="2026-02-19 23:33:02.976951588 +0000 UTC m=+7494.169469462" watchObservedRunningTime="2026-02-19 23:33:02.984606414 +0000 UTC m=+7494.177124278" Feb 19 23:33:08 crc kubenswrapper[4795]: I0219 23:33:08.918870 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-nqrh8" Feb 19 23:33:08 crc kubenswrapper[4795]: I0219 23:33:08.919682 4795 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/certified-operators-nqrh8" Feb 19 23:33:08 crc kubenswrapper[4795]: I0219 23:33:08.987400 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nqrh8" Feb 19 23:33:09 crc kubenswrapper[4795]: I0219 23:33:09.073416 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-nqrh8" Feb 19 23:33:09 crc kubenswrapper[4795]: I0219 23:33:09.225219 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nqrh8"] Feb 19 23:33:11 crc kubenswrapper[4795]: I0219 23:33:11.042496 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-nqrh8" podUID="1b009444-b438-433d-8e2c-abc763e6f9ee" containerName="registry-server" containerID="cri-o://b01a941fb86e6ae179a993966f808b1259b5941731dc4bfeff98b21fbe400935" gracePeriod=2 Feb 19 23:33:11 crc kubenswrapper[4795]: I0219 23:33:11.590182 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nqrh8" Feb 19 23:33:11 crc kubenswrapper[4795]: I0219 23:33:11.695448 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b009444-b438-433d-8e2c-abc763e6f9ee-utilities\") pod \"1b009444-b438-433d-8e2c-abc763e6f9ee\" (UID: \"1b009444-b438-433d-8e2c-abc763e6f9ee\") " Feb 19 23:33:11 crc kubenswrapper[4795]: I0219 23:33:11.695515 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9w6h\" (UniqueName: \"kubernetes.io/projected/1b009444-b438-433d-8e2c-abc763e6f9ee-kube-api-access-g9w6h\") pod \"1b009444-b438-433d-8e2c-abc763e6f9ee\" (UID: \"1b009444-b438-433d-8e2c-abc763e6f9ee\") " Feb 19 23:33:11 crc kubenswrapper[4795]: I0219 23:33:11.695551 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b009444-b438-433d-8e2c-abc763e6f9ee-catalog-content\") pod \"1b009444-b438-433d-8e2c-abc763e6f9ee\" (UID: \"1b009444-b438-433d-8e2c-abc763e6f9ee\") " Feb 19 23:33:11 crc kubenswrapper[4795]: I0219 23:33:11.697304 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b009444-b438-433d-8e2c-abc763e6f9ee-utilities" (OuterVolumeSpecName: "utilities") pod "1b009444-b438-433d-8e2c-abc763e6f9ee" (UID: "1b009444-b438-433d-8e2c-abc763e6f9ee"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:33:11 crc kubenswrapper[4795]: I0219 23:33:11.703181 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b009444-b438-433d-8e2c-abc763e6f9ee-kube-api-access-g9w6h" (OuterVolumeSpecName: "kube-api-access-g9w6h") pod "1b009444-b438-433d-8e2c-abc763e6f9ee" (UID: "1b009444-b438-433d-8e2c-abc763e6f9ee"). InnerVolumeSpecName "kube-api-access-g9w6h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:33:11 crc kubenswrapper[4795]: I0219 23:33:11.745139 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b009444-b438-433d-8e2c-abc763e6f9ee-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1b009444-b438-433d-8e2c-abc763e6f9ee" (UID: "1b009444-b438-433d-8e2c-abc763e6f9ee"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:33:11 crc kubenswrapper[4795]: I0219 23:33:11.798663 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b009444-b438-433d-8e2c-abc763e6f9ee-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 23:33:11 crc kubenswrapper[4795]: I0219 23:33:11.798702 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9w6h\" (UniqueName: \"kubernetes.io/projected/1b009444-b438-433d-8e2c-abc763e6f9ee-kube-api-access-g9w6h\") on node \"crc\" DevicePath \"\"" Feb 19 23:33:11 crc kubenswrapper[4795]: I0219 23:33:11.798719 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b009444-b438-433d-8e2c-abc763e6f9ee-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 23:33:12 crc kubenswrapper[4795]: I0219 23:33:12.054839 4795 generic.go:334] "Generic (PLEG): container finished" podID="1b009444-b438-433d-8e2c-abc763e6f9ee" containerID="b01a941fb86e6ae179a993966f808b1259b5941731dc4bfeff98b21fbe400935" exitCode=0 Feb 19 23:33:12 crc kubenswrapper[4795]: I0219 23:33:12.054890 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nqrh8" Feb 19 23:33:12 crc kubenswrapper[4795]: I0219 23:33:12.054886 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nqrh8" event={"ID":"1b009444-b438-433d-8e2c-abc763e6f9ee","Type":"ContainerDied","Data":"b01a941fb86e6ae179a993966f808b1259b5941731dc4bfeff98b21fbe400935"} Feb 19 23:33:12 crc kubenswrapper[4795]: I0219 23:33:12.055314 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nqrh8" event={"ID":"1b009444-b438-433d-8e2c-abc763e6f9ee","Type":"ContainerDied","Data":"90136202358e702fa7897a42d741cd8a62535cb68b9b9944a6dabad76cb7864e"} Feb 19 23:33:12 crc kubenswrapper[4795]: I0219 23:33:12.055339 4795 scope.go:117] "RemoveContainer" containerID="b01a941fb86e6ae179a993966f808b1259b5941731dc4bfeff98b21fbe400935" Feb 19 23:33:12 crc kubenswrapper[4795]: I0219 23:33:12.076286 4795 scope.go:117] "RemoveContainer" containerID="456e4cdd0fe6f92610c5ca2c4002e956bc698f4747fe3e78727e6f5a34ddfe93" Feb 19 23:33:12 crc kubenswrapper[4795]: I0219 23:33:12.096891 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nqrh8"] Feb 19 23:33:12 crc kubenswrapper[4795]: I0219 23:33:12.105625 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-nqrh8"] Feb 19 23:33:12 crc kubenswrapper[4795]: I0219 23:33:12.126629 4795 scope.go:117] "RemoveContainer" containerID="40ff688b57b898b0b5419dec2eb7dae2f53df75e9918f4af50555fc1c62cce91" Feb 19 23:33:12 crc kubenswrapper[4795]: I0219 23:33:12.153359 4795 scope.go:117] "RemoveContainer" containerID="b01a941fb86e6ae179a993966f808b1259b5941731dc4bfeff98b21fbe400935" Feb 19 23:33:12 crc kubenswrapper[4795]: E0219 23:33:12.153806 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"b01a941fb86e6ae179a993966f808b1259b5941731dc4bfeff98b21fbe400935\": container with ID starting with b01a941fb86e6ae179a993966f808b1259b5941731dc4bfeff98b21fbe400935 not found: ID does not exist" containerID="b01a941fb86e6ae179a993966f808b1259b5941731dc4bfeff98b21fbe400935" Feb 19 23:33:12 crc kubenswrapper[4795]: I0219 23:33:12.153864 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b01a941fb86e6ae179a993966f808b1259b5941731dc4bfeff98b21fbe400935"} err="failed to get container status \"b01a941fb86e6ae179a993966f808b1259b5941731dc4bfeff98b21fbe400935\": rpc error: code = NotFound desc = could not find container \"b01a941fb86e6ae179a993966f808b1259b5941731dc4bfeff98b21fbe400935\": container with ID starting with b01a941fb86e6ae179a993966f808b1259b5941731dc4bfeff98b21fbe400935 not found: ID does not exist" Feb 19 23:33:12 crc kubenswrapper[4795]: I0219 23:33:12.153886 4795 scope.go:117] "RemoveContainer" containerID="456e4cdd0fe6f92610c5ca2c4002e956bc698f4747fe3e78727e6f5a34ddfe93" Feb 19 23:33:12 crc kubenswrapper[4795]: E0219 23:33:12.154204 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"456e4cdd0fe6f92610c5ca2c4002e956bc698f4747fe3e78727e6f5a34ddfe93\": container with ID starting with 456e4cdd0fe6f92610c5ca2c4002e956bc698f4747fe3e78727e6f5a34ddfe93 not found: ID does not exist" containerID="456e4cdd0fe6f92610c5ca2c4002e956bc698f4747fe3e78727e6f5a34ddfe93" Feb 19 23:33:12 crc kubenswrapper[4795]: I0219 23:33:12.154222 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"456e4cdd0fe6f92610c5ca2c4002e956bc698f4747fe3e78727e6f5a34ddfe93"} err="failed to get container status \"456e4cdd0fe6f92610c5ca2c4002e956bc698f4747fe3e78727e6f5a34ddfe93\": rpc error: code = NotFound desc = could not find container \"456e4cdd0fe6f92610c5ca2c4002e956bc698f4747fe3e78727e6f5a34ddfe93\": container with ID 
starting with 456e4cdd0fe6f92610c5ca2c4002e956bc698f4747fe3e78727e6f5a34ddfe93 not found: ID does not exist" Feb 19 23:33:12 crc kubenswrapper[4795]: I0219 23:33:12.154236 4795 scope.go:117] "RemoveContainer" containerID="40ff688b57b898b0b5419dec2eb7dae2f53df75e9918f4af50555fc1c62cce91" Feb 19 23:33:12 crc kubenswrapper[4795]: E0219 23:33:12.154645 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40ff688b57b898b0b5419dec2eb7dae2f53df75e9918f4af50555fc1c62cce91\": container with ID starting with 40ff688b57b898b0b5419dec2eb7dae2f53df75e9918f4af50555fc1c62cce91 not found: ID does not exist" containerID="40ff688b57b898b0b5419dec2eb7dae2f53df75e9918f4af50555fc1c62cce91" Feb 19 23:33:12 crc kubenswrapper[4795]: I0219 23:33:12.154669 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40ff688b57b898b0b5419dec2eb7dae2f53df75e9918f4af50555fc1c62cce91"} err="failed to get container status \"40ff688b57b898b0b5419dec2eb7dae2f53df75e9918f4af50555fc1c62cce91\": rpc error: code = NotFound desc = could not find container \"40ff688b57b898b0b5419dec2eb7dae2f53df75e9918f4af50555fc1c62cce91\": container with ID starting with 40ff688b57b898b0b5419dec2eb7dae2f53df75e9918f4af50555fc1c62cce91 not found: ID does not exist" Feb 19 23:33:13 crc kubenswrapper[4795]: I0219 23:33:13.525727 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b009444-b438-433d-8e2c-abc763e6f9ee" path="/var/lib/kubelet/pods/1b009444-b438-433d-8e2c-abc763e6f9ee/volumes" Feb 19 23:33:16 crc kubenswrapper[4795]: I0219 23:33:16.032614 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-f54zd"] Feb 19 23:33:16 crc kubenswrapper[4795]: E0219 23:33:16.033903 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b009444-b438-433d-8e2c-abc763e6f9ee" containerName="extract-utilities" Feb 19 23:33:16 crc 
kubenswrapper[4795]: I0219 23:33:16.033918 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b009444-b438-433d-8e2c-abc763e6f9ee" containerName="extract-utilities" Feb 19 23:33:16 crc kubenswrapper[4795]: E0219 23:33:16.033945 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b009444-b438-433d-8e2c-abc763e6f9ee" containerName="registry-server" Feb 19 23:33:16 crc kubenswrapper[4795]: I0219 23:33:16.033951 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b009444-b438-433d-8e2c-abc763e6f9ee" containerName="registry-server" Feb 19 23:33:16 crc kubenswrapper[4795]: E0219 23:33:16.033971 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b009444-b438-433d-8e2c-abc763e6f9ee" containerName="extract-content" Feb 19 23:33:16 crc kubenswrapper[4795]: I0219 23:33:16.033977 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b009444-b438-433d-8e2c-abc763e6f9ee" containerName="extract-content" Feb 19 23:33:16 crc kubenswrapper[4795]: I0219 23:33:16.034238 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b009444-b438-433d-8e2c-abc763e6f9ee" containerName="registry-server" Feb 19 23:33:16 crc kubenswrapper[4795]: I0219 23:33:16.035707 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-f54zd" Feb 19 23:33:16 crc kubenswrapper[4795]: I0219 23:33:16.053421 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-f54zd"] Feb 19 23:33:16 crc kubenswrapper[4795]: I0219 23:33:16.111116 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32a0d5fa-fc7f-4107-807b-633943866132-utilities\") pod \"redhat-operators-f54zd\" (UID: \"32a0d5fa-fc7f-4107-807b-633943866132\") " pod="openshift-marketplace/redhat-operators-f54zd" Feb 19 23:33:16 crc kubenswrapper[4795]: I0219 23:33:16.111334 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32a0d5fa-fc7f-4107-807b-633943866132-catalog-content\") pod \"redhat-operators-f54zd\" (UID: \"32a0d5fa-fc7f-4107-807b-633943866132\") " pod="openshift-marketplace/redhat-operators-f54zd" Feb 19 23:33:16 crc kubenswrapper[4795]: I0219 23:33:16.111422 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2wff\" (UniqueName: \"kubernetes.io/projected/32a0d5fa-fc7f-4107-807b-633943866132-kube-api-access-l2wff\") pod \"redhat-operators-f54zd\" (UID: \"32a0d5fa-fc7f-4107-807b-633943866132\") " pod="openshift-marketplace/redhat-operators-f54zd" Feb 19 23:33:16 crc kubenswrapper[4795]: I0219 23:33:16.213722 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2wff\" (UniqueName: \"kubernetes.io/projected/32a0d5fa-fc7f-4107-807b-633943866132-kube-api-access-l2wff\") pod \"redhat-operators-f54zd\" (UID: \"32a0d5fa-fc7f-4107-807b-633943866132\") " pod="openshift-marketplace/redhat-operators-f54zd" Feb 19 23:33:16 crc kubenswrapper[4795]: I0219 23:33:16.213887 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32a0d5fa-fc7f-4107-807b-633943866132-utilities\") pod \"redhat-operators-f54zd\" (UID: \"32a0d5fa-fc7f-4107-807b-633943866132\") " pod="openshift-marketplace/redhat-operators-f54zd" Feb 19 23:33:16 crc kubenswrapper[4795]: I0219 23:33:16.213985 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32a0d5fa-fc7f-4107-807b-633943866132-catalog-content\") pod \"redhat-operators-f54zd\" (UID: \"32a0d5fa-fc7f-4107-807b-633943866132\") " pod="openshift-marketplace/redhat-operators-f54zd" Feb 19 23:33:16 crc kubenswrapper[4795]: I0219 23:33:16.214527 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32a0d5fa-fc7f-4107-807b-633943866132-catalog-content\") pod \"redhat-operators-f54zd\" (UID: \"32a0d5fa-fc7f-4107-807b-633943866132\") " pod="openshift-marketplace/redhat-operators-f54zd" Feb 19 23:33:16 crc kubenswrapper[4795]: I0219 23:33:16.214865 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32a0d5fa-fc7f-4107-807b-633943866132-utilities\") pod \"redhat-operators-f54zd\" (UID: \"32a0d5fa-fc7f-4107-807b-633943866132\") " pod="openshift-marketplace/redhat-operators-f54zd" Feb 19 23:33:16 crc kubenswrapper[4795]: I0219 23:33:16.243561 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2wff\" (UniqueName: \"kubernetes.io/projected/32a0d5fa-fc7f-4107-807b-633943866132-kube-api-access-l2wff\") pod \"redhat-operators-f54zd\" (UID: \"32a0d5fa-fc7f-4107-807b-633943866132\") " pod="openshift-marketplace/redhat-operators-f54zd" Feb 19 23:33:16 crc kubenswrapper[4795]: I0219 23:33:16.369917 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-f54zd" Feb 19 23:33:16 crc kubenswrapper[4795]: I0219 23:33:16.890972 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-f54zd"] Feb 19 23:33:17 crc kubenswrapper[4795]: I0219 23:33:17.108273 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f54zd" event={"ID":"32a0d5fa-fc7f-4107-807b-633943866132","Type":"ContainerStarted","Data":"cbde4ac654107bdb2c52eb88bfbc789c7cd281346e25fb6499981fa8657af579"} Feb 19 23:33:17 crc kubenswrapper[4795]: I0219 23:33:17.108331 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f54zd" event={"ID":"32a0d5fa-fc7f-4107-807b-633943866132","Type":"ContainerStarted","Data":"c64e95032c64e4397cad5beadc6695dedca6a054e73613870cbcf23737ede0cd"} Feb 19 23:33:18 crc kubenswrapper[4795]: I0219 23:33:18.118307 4795 generic.go:334] "Generic (PLEG): container finished" podID="32a0d5fa-fc7f-4107-807b-633943866132" containerID="cbde4ac654107bdb2c52eb88bfbc789c7cd281346e25fb6499981fa8657af579" exitCode=0 Feb 19 23:33:18 crc kubenswrapper[4795]: I0219 23:33:18.118375 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f54zd" event={"ID":"32a0d5fa-fc7f-4107-807b-633943866132","Type":"ContainerDied","Data":"cbde4ac654107bdb2c52eb88bfbc789c7cd281346e25fb6499981fa8657af579"} Feb 19 23:33:20 crc kubenswrapper[4795]: I0219 23:33:20.142319 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f54zd" event={"ID":"32a0d5fa-fc7f-4107-807b-633943866132","Type":"ContainerStarted","Data":"8af5e4ebc1fb66d082450eb8a8352c069d289c2c0be9e5e62fcdbbd6f386e671"} Feb 19 23:33:24 crc kubenswrapper[4795]: I0219 23:33:24.186065 4795 generic.go:334] "Generic (PLEG): container finished" podID="32a0d5fa-fc7f-4107-807b-633943866132" 
containerID="8af5e4ebc1fb66d082450eb8a8352c069d289c2c0be9e5e62fcdbbd6f386e671" exitCode=0 Feb 19 23:33:24 crc kubenswrapper[4795]: I0219 23:33:24.186139 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f54zd" event={"ID":"32a0d5fa-fc7f-4107-807b-633943866132","Type":"ContainerDied","Data":"8af5e4ebc1fb66d082450eb8a8352c069d289c2c0be9e5e62fcdbbd6f386e671"} Feb 19 23:33:25 crc kubenswrapper[4795]: I0219 23:33:25.198995 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f54zd" event={"ID":"32a0d5fa-fc7f-4107-807b-633943866132","Type":"ContainerStarted","Data":"1c94231c1ab58f24440993ae6d81e8caccc25d13e1fe573e94326ba8e1dab56b"} Feb 19 23:33:25 crc kubenswrapper[4795]: I0219 23:33:25.238024 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-f54zd" podStartSLOduration=2.7452607479999998 podStartE2EDuration="9.238004116s" podCreationTimestamp="2026-02-19 23:33:16 +0000 UTC" firstStartedPulling="2026-02-19 23:33:18.120817122 +0000 UTC m=+7509.313334976" lastFinishedPulling="2026-02-19 23:33:24.61356044 +0000 UTC m=+7515.806078344" observedRunningTime="2026-02-19 23:33:25.222687054 +0000 UTC m=+7516.415204948" watchObservedRunningTime="2026-02-19 23:33:25.238004116 +0000 UTC m=+7516.430522000" Feb 19 23:33:26 crc kubenswrapper[4795]: I0219 23:33:26.370034 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-f54zd" Feb 19 23:33:26 crc kubenswrapper[4795]: I0219 23:33:26.370795 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-f54zd" Feb 19 23:33:27 crc kubenswrapper[4795]: I0219 23:33:27.423542 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-f54zd" podUID="32a0d5fa-fc7f-4107-807b-633943866132" containerName="registry-server" 
probeResult="failure" output=< Feb 19 23:33:27 crc kubenswrapper[4795]: timeout: failed to connect service ":50051" within 1s Feb 19 23:33:27 crc kubenswrapper[4795]: > Feb 19 23:33:28 crc kubenswrapper[4795]: I0219 23:33:28.427658 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 23:33:28 crc kubenswrapper[4795]: I0219 23:33:28.427739 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 23:33:28 crc kubenswrapper[4795]: I0219 23:33:28.427803 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" Feb 19 23:33:28 crc kubenswrapper[4795]: I0219 23:33:28.428721 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"53ab375a2fc3fa1ef244b68bab3723d6db871a1c5fc51b2d8dec9ed6289bc190"} pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 23:33:28 crc kubenswrapper[4795]: I0219 23:33:28.428827 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" containerID="cri-o://53ab375a2fc3fa1ef244b68bab3723d6db871a1c5fc51b2d8dec9ed6289bc190" gracePeriod=600 Feb 19 23:33:29 crc kubenswrapper[4795]: I0219 
23:33:29.242988 4795 generic.go:334] "Generic (PLEG): container finished" podID="7591bc58-96f5-486a-8653-0ad93938b019" containerID="53ab375a2fc3fa1ef244b68bab3723d6db871a1c5fc51b2d8dec9ed6289bc190" exitCode=0 Feb 19 23:33:29 crc kubenswrapper[4795]: I0219 23:33:29.243058 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerDied","Data":"53ab375a2fc3fa1ef244b68bab3723d6db871a1c5fc51b2d8dec9ed6289bc190"} Feb 19 23:33:29 crc kubenswrapper[4795]: I0219 23:33:29.243635 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerStarted","Data":"75db33e6881035ea4c66ab0cccb431be773bc68e63dc9e6c4480e308b3f4695f"} Feb 19 23:33:29 crc kubenswrapper[4795]: I0219 23:33:29.243654 4795 scope.go:117] "RemoveContainer" containerID="3901cb7ebf5348e8e5959a203f3e6450ab7f3f57a2848ec56227b7861716ef02" Feb 19 23:33:36 crc kubenswrapper[4795]: I0219 23:33:36.449391 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-f54zd" Feb 19 23:33:36 crc kubenswrapper[4795]: I0219 23:33:36.502665 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-f54zd" Feb 19 23:33:36 crc kubenswrapper[4795]: I0219 23:33:36.808281 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-f54zd"] Feb 19 23:33:38 crc kubenswrapper[4795]: I0219 23:33:38.342033 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-f54zd" podUID="32a0d5fa-fc7f-4107-807b-633943866132" containerName="registry-server" containerID="cri-o://1c94231c1ab58f24440993ae6d81e8caccc25d13e1fe573e94326ba8e1dab56b" gracePeriod=2 Feb 19 23:33:38 crc 
kubenswrapper[4795]: I0219 23:33:38.906389 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-f54zd" Feb 19 23:33:39 crc kubenswrapper[4795]: I0219 23:33:39.070712 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32a0d5fa-fc7f-4107-807b-633943866132-utilities\") pod \"32a0d5fa-fc7f-4107-807b-633943866132\" (UID: \"32a0d5fa-fc7f-4107-807b-633943866132\") " Feb 19 23:33:39 crc kubenswrapper[4795]: I0219 23:33:39.070771 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2wff\" (UniqueName: \"kubernetes.io/projected/32a0d5fa-fc7f-4107-807b-633943866132-kube-api-access-l2wff\") pod \"32a0d5fa-fc7f-4107-807b-633943866132\" (UID: \"32a0d5fa-fc7f-4107-807b-633943866132\") " Feb 19 23:33:39 crc kubenswrapper[4795]: I0219 23:33:39.070863 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32a0d5fa-fc7f-4107-807b-633943866132-catalog-content\") pod \"32a0d5fa-fc7f-4107-807b-633943866132\" (UID: \"32a0d5fa-fc7f-4107-807b-633943866132\") " Feb 19 23:33:39 crc kubenswrapper[4795]: I0219 23:33:39.071626 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32a0d5fa-fc7f-4107-807b-633943866132-utilities" (OuterVolumeSpecName: "utilities") pod "32a0d5fa-fc7f-4107-807b-633943866132" (UID: "32a0d5fa-fc7f-4107-807b-633943866132"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:33:39 crc kubenswrapper[4795]: I0219 23:33:39.077063 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32a0d5fa-fc7f-4107-807b-633943866132-kube-api-access-l2wff" (OuterVolumeSpecName: "kube-api-access-l2wff") pod "32a0d5fa-fc7f-4107-807b-633943866132" (UID: "32a0d5fa-fc7f-4107-807b-633943866132"). InnerVolumeSpecName "kube-api-access-l2wff". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:33:39 crc kubenswrapper[4795]: I0219 23:33:39.173529 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32a0d5fa-fc7f-4107-807b-633943866132-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 23:33:39 crc kubenswrapper[4795]: I0219 23:33:39.173565 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2wff\" (UniqueName: \"kubernetes.io/projected/32a0d5fa-fc7f-4107-807b-633943866132-kube-api-access-l2wff\") on node \"crc\" DevicePath \"\"" Feb 19 23:33:39 crc kubenswrapper[4795]: I0219 23:33:39.217632 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32a0d5fa-fc7f-4107-807b-633943866132-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "32a0d5fa-fc7f-4107-807b-633943866132" (UID: "32a0d5fa-fc7f-4107-807b-633943866132"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:33:39 crc kubenswrapper[4795]: I0219 23:33:39.275374 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32a0d5fa-fc7f-4107-807b-633943866132-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 23:33:39 crc kubenswrapper[4795]: I0219 23:33:39.355387 4795 generic.go:334] "Generic (PLEG): container finished" podID="32a0d5fa-fc7f-4107-807b-633943866132" containerID="1c94231c1ab58f24440993ae6d81e8caccc25d13e1fe573e94326ba8e1dab56b" exitCode=0 Feb 19 23:33:39 crc kubenswrapper[4795]: I0219 23:33:39.355433 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f54zd" event={"ID":"32a0d5fa-fc7f-4107-807b-633943866132","Type":"ContainerDied","Data":"1c94231c1ab58f24440993ae6d81e8caccc25d13e1fe573e94326ba8e1dab56b"} Feb 19 23:33:39 crc kubenswrapper[4795]: I0219 23:33:39.355439 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-f54zd" Feb 19 23:33:39 crc kubenswrapper[4795]: I0219 23:33:39.355468 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f54zd" event={"ID":"32a0d5fa-fc7f-4107-807b-633943866132","Type":"ContainerDied","Data":"c64e95032c64e4397cad5beadc6695dedca6a054e73613870cbcf23737ede0cd"} Feb 19 23:33:39 crc kubenswrapper[4795]: I0219 23:33:39.355491 4795 scope.go:117] "RemoveContainer" containerID="1c94231c1ab58f24440993ae6d81e8caccc25d13e1fe573e94326ba8e1dab56b" Feb 19 23:33:39 crc kubenswrapper[4795]: I0219 23:33:39.379320 4795 scope.go:117] "RemoveContainer" containerID="8af5e4ebc1fb66d082450eb8a8352c069d289c2c0be9e5e62fcdbbd6f386e671" Feb 19 23:33:39 crc kubenswrapper[4795]: I0219 23:33:39.399893 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-f54zd"] Feb 19 23:33:39 crc kubenswrapper[4795]: I0219 23:33:39.408376 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-f54zd"] Feb 19 23:33:39 crc kubenswrapper[4795]: I0219 23:33:39.430149 4795 scope.go:117] "RemoveContainer" containerID="cbde4ac654107bdb2c52eb88bfbc789c7cd281346e25fb6499981fa8657af579" Feb 19 23:33:39 crc kubenswrapper[4795]: I0219 23:33:39.472153 4795 scope.go:117] "RemoveContainer" containerID="1c94231c1ab58f24440993ae6d81e8caccc25d13e1fe573e94326ba8e1dab56b" Feb 19 23:33:39 crc kubenswrapper[4795]: E0219 23:33:39.473021 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c94231c1ab58f24440993ae6d81e8caccc25d13e1fe573e94326ba8e1dab56b\": container with ID starting with 1c94231c1ab58f24440993ae6d81e8caccc25d13e1fe573e94326ba8e1dab56b not found: ID does not exist" containerID="1c94231c1ab58f24440993ae6d81e8caccc25d13e1fe573e94326ba8e1dab56b" Feb 19 23:33:39 crc kubenswrapper[4795]: I0219 23:33:39.473081 4795 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c94231c1ab58f24440993ae6d81e8caccc25d13e1fe573e94326ba8e1dab56b"} err="failed to get container status \"1c94231c1ab58f24440993ae6d81e8caccc25d13e1fe573e94326ba8e1dab56b\": rpc error: code = NotFound desc = could not find container \"1c94231c1ab58f24440993ae6d81e8caccc25d13e1fe573e94326ba8e1dab56b\": container with ID starting with 1c94231c1ab58f24440993ae6d81e8caccc25d13e1fe573e94326ba8e1dab56b not found: ID does not exist" Feb 19 23:33:39 crc kubenswrapper[4795]: I0219 23:33:39.473116 4795 scope.go:117] "RemoveContainer" containerID="8af5e4ebc1fb66d082450eb8a8352c069d289c2c0be9e5e62fcdbbd6f386e671" Feb 19 23:33:39 crc kubenswrapper[4795]: E0219 23:33:39.473441 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8af5e4ebc1fb66d082450eb8a8352c069d289c2c0be9e5e62fcdbbd6f386e671\": container with ID starting with 8af5e4ebc1fb66d082450eb8a8352c069d289c2c0be9e5e62fcdbbd6f386e671 not found: ID does not exist" containerID="8af5e4ebc1fb66d082450eb8a8352c069d289c2c0be9e5e62fcdbbd6f386e671" Feb 19 23:33:39 crc kubenswrapper[4795]: I0219 23:33:39.473474 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8af5e4ebc1fb66d082450eb8a8352c069d289c2c0be9e5e62fcdbbd6f386e671"} err="failed to get container status \"8af5e4ebc1fb66d082450eb8a8352c069d289c2c0be9e5e62fcdbbd6f386e671\": rpc error: code = NotFound desc = could not find container \"8af5e4ebc1fb66d082450eb8a8352c069d289c2c0be9e5e62fcdbbd6f386e671\": container with ID starting with 8af5e4ebc1fb66d082450eb8a8352c069d289c2c0be9e5e62fcdbbd6f386e671 not found: ID does not exist" Feb 19 23:33:39 crc kubenswrapper[4795]: I0219 23:33:39.473494 4795 scope.go:117] "RemoveContainer" containerID="cbde4ac654107bdb2c52eb88bfbc789c7cd281346e25fb6499981fa8657af579" Feb 19 23:33:39 crc kubenswrapper[4795]: E0219 
23:33:39.473725 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbde4ac654107bdb2c52eb88bfbc789c7cd281346e25fb6499981fa8657af579\": container with ID starting with cbde4ac654107bdb2c52eb88bfbc789c7cd281346e25fb6499981fa8657af579 not found: ID does not exist" containerID="cbde4ac654107bdb2c52eb88bfbc789c7cd281346e25fb6499981fa8657af579" Feb 19 23:33:39 crc kubenswrapper[4795]: I0219 23:33:39.473766 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbde4ac654107bdb2c52eb88bfbc789c7cd281346e25fb6499981fa8657af579"} err="failed to get container status \"cbde4ac654107bdb2c52eb88bfbc789c7cd281346e25fb6499981fa8657af579\": rpc error: code = NotFound desc = could not find container \"cbde4ac654107bdb2c52eb88bfbc789c7cd281346e25fb6499981fa8657af579\": container with ID starting with cbde4ac654107bdb2c52eb88bfbc789c7cd281346e25fb6499981fa8657af579 not found: ID does not exist" Feb 19 23:33:39 crc kubenswrapper[4795]: I0219 23:33:39.527824 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32a0d5fa-fc7f-4107-807b-633943866132" path="/var/lib/kubelet/pods/32a0d5fa-fc7f-4107-807b-633943866132/volumes" Feb 19 23:33:43 crc kubenswrapper[4795]: I0219 23:33:43.398472 4795 generic.go:334] "Generic (PLEG): container finished" podID="a23f1a80-1645-454d-b9cf-e039928b84cb" containerID="2e3e3e59d5c650a24ac4dc17cdcdbd825d499979952a0af456bf13fbe15d8292" exitCode=0 Feb 19 23:33:43 crc kubenswrapper[4795]: I0219 23:33:43.398554 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-5855g" event={"ID":"a23f1a80-1645-454d-b9cf-e039928b84cb","Type":"ContainerDied","Data":"2e3e3e59d5c650a24ac4dc17cdcdbd825d499979952a0af456bf13fbe15d8292"} Feb 19 23:33:44 crc kubenswrapper[4795]: I0219 23:33:44.884653 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-5855g" Feb 19 23:33:44 crc kubenswrapper[4795]: I0219 23:33:44.995501 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a23f1a80-1645-454d-b9cf-e039928b84cb-neutron-metadata-combined-ca-bundle\") pod \"a23f1a80-1645-454d-b9cf-e039928b84cb\" (UID: \"a23f1a80-1645-454d-b9cf-e039928b84cb\") " Feb 19 23:33:44 crc kubenswrapper[4795]: I0219 23:33:44.995573 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a23f1a80-1645-454d-b9cf-e039928b84cb-inventory\") pod \"a23f1a80-1645-454d-b9cf-e039928b84cb\" (UID: \"a23f1a80-1645-454d-b9cf-e039928b84cb\") " Feb 19 23:33:44 crc kubenswrapper[4795]: I0219 23:33:44.995594 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a23f1a80-1645-454d-b9cf-e039928b84cb-ceph\") pod \"a23f1a80-1645-454d-b9cf-e039928b84cb\" (UID: \"a23f1a80-1645-454d-b9cf-e039928b84cb\") " Feb 19 23:33:44 crc kubenswrapper[4795]: I0219 23:33:44.995625 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a23f1a80-1645-454d-b9cf-e039928b84cb-ssh-key-openstack-cell1\") pod \"a23f1a80-1645-454d-b9cf-e039928b84cb\" (UID: \"a23f1a80-1645-454d-b9cf-e039928b84cb\") " Feb 19 23:33:44 crc kubenswrapper[4795]: I0219 23:33:44.995702 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a23f1a80-1645-454d-b9cf-e039928b84cb-neutron-ovn-metadata-agent-neutron-config-0\") pod \"a23f1a80-1645-454d-b9cf-e039928b84cb\" (UID: \"a23f1a80-1645-454d-b9cf-e039928b84cb\") " Feb 19 23:33:44 crc kubenswrapper[4795]: I0219 23:33:44.995719 
4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a23f1a80-1645-454d-b9cf-e039928b84cb-nova-metadata-neutron-config-0\") pod \"a23f1a80-1645-454d-b9cf-e039928b84cb\" (UID: \"a23f1a80-1645-454d-b9cf-e039928b84cb\") " Feb 19 23:33:44 crc kubenswrapper[4795]: I0219 23:33:44.995893 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rphwm\" (UniqueName: \"kubernetes.io/projected/a23f1a80-1645-454d-b9cf-e039928b84cb-kube-api-access-rphwm\") pod \"a23f1a80-1645-454d-b9cf-e039928b84cb\" (UID: \"a23f1a80-1645-454d-b9cf-e039928b84cb\") " Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.001913 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a23f1a80-1645-454d-b9cf-e039928b84cb-kube-api-access-rphwm" (OuterVolumeSpecName: "kube-api-access-rphwm") pod "a23f1a80-1645-454d-b9cf-e039928b84cb" (UID: "a23f1a80-1645-454d-b9cf-e039928b84cb"). InnerVolumeSpecName "kube-api-access-rphwm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.003452 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a23f1a80-1645-454d-b9cf-e039928b84cb-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "a23f1a80-1645-454d-b9cf-e039928b84cb" (UID: "a23f1a80-1645-454d-b9cf-e039928b84cb"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.020429 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a23f1a80-1645-454d-b9cf-e039928b84cb-ceph" (OuterVolumeSpecName: "ceph") pod "a23f1a80-1645-454d-b9cf-e039928b84cb" (UID: "a23f1a80-1645-454d-b9cf-e039928b84cb"). 
InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.029144 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a23f1a80-1645-454d-b9cf-e039928b84cb-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "a23f1a80-1645-454d-b9cf-e039928b84cb" (UID: "a23f1a80-1645-454d-b9cf-e039928b84cb"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.032823 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a23f1a80-1645-454d-b9cf-e039928b84cb-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "a23f1a80-1645-454d-b9cf-e039928b84cb" (UID: "a23f1a80-1645-454d-b9cf-e039928b84cb"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.035229 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a23f1a80-1645-454d-b9cf-e039928b84cb-inventory" (OuterVolumeSpecName: "inventory") pod "a23f1a80-1645-454d-b9cf-e039928b84cb" (UID: "a23f1a80-1645-454d-b9cf-e039928b84cb"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.050990 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a23f1a80-1645-454d-b9cf-e039928b84cb-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "a23f1a80-1645-454d-b9cf-e039928b84cb" (UID: "a23f1a80-1645-454d-b9cf-e039928b84cb"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.098842 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a23f1a80-1645-454d-b9cf-e039928b84cb-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.098878 4795 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a23f1a80-1645-454d-b9cf-e039928b84cb-ceph\") on node \"crc\" DevicePath \"\"" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.098889 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a23f1a80-1645-454d-b9cf-e039928b84cb-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.098899 4795 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a23f1a80-1645-454d-b9cf-e039928b84cb-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.098909 4795 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a23f1a80-1645-454d-b9cf-e039928b84cb-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.098920 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rphwm\" (UniqueName: \"kubernetes.io/projected/a23f1a80-1645-454d-b9cf-e039928b84cb-kube-api-access-rphwm\") on node \"crc\" DevicePath \"\"" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.098929 4795 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a23f1a80-1645-454d-b9cf-e039928b84cb-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.419031 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-5855g" event={"ID":"a23f1a80-1645-454d-b9cf-e039928b84cb","Type":"ContainerDied","Data":"ac532540ed1c9ed8ee4bab62b2b0f16733f15b6c586c9842153ae0b7941ac4df"} Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.419085 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac532540ed1c9ed8ee4bab62b2b0f16733f15b6c586c9842153ae0b7941ac4df" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.419099 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-5855g" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.540604 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-drll7"] Feb 19 23:33:45 crc kubenswrapper[4795]: E0219 23:33:45.541626 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32a0d5fa-fc7f-4107-807b-633943866132" containerName="registry-server" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.541660 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="32a0d5fa-fc7f-4107-807b-633943866132" containerName="registry-server" Feb 19 23:33:45 crc kubenswrapper[4795]: E0219 23:33:45.541700 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32a0d5fa-fc7f-4107-807b-633943866132" containerName="extract-utilities" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.541719 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="32a0d5fa-fc7f-4107-807b-633943866132" containerName="extract-utilities" Feb 19 23:33:45 crc kubenswrapper[4795]: E0219 23:33:45.541802 4795 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="32a0d5fa-fc7f-4107-807b-633943866132" containerName="extract-content" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.541821 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="32a0d5fa-fc7f-4107-807b-633943866132" containerName="extract-content" Feb 19 23:33:45 crc kubenswrapper[4795]: E0219 23:33:45.541860 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a23f1a80-1645-454d-b9cf-e039928b84cb" containerName="neutron-metadata-openstack-openstack-cell1" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.541881 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="a23f1a80-1645-454d-b9cf-e039928b84cb" containerName="neutron-metadata-openstack-openstack-cell1" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.545784 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="32a0d5fa-fc7f-4107-807b-633943866132" containerName="registry-server" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.545890 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="a23f1a80-1645-454d-b9cf-e039928b84cb" containerName="neutron-metadata-openstack-openstack-cell1" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.547617 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-drll7"] Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.547760 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-drll7" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.558850 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.559012 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-brdnv" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.559148 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.559252 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.559393 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.716266 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rd8nf\" (UniqueName: \"kubernetes.io/projected/b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d-kube-api-access-rd8nf\") pod \"libvirt-openstack-openstack-cell1-drll7\" (UID: \"b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d\") " pod="openstack/libvirt-openstack-openstack-cell1-drll7" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.716321 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d-ceph\") pod \"libvirt-openstack-openstack-cell1-drll7\" (UID: \"b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d\") " pod="openstack/libvirt-openstack-openstack-cell1-drll7" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.716347 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: 
\"kubernetes.io/secret/b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d-ssh-key-openstack-cell1\") pod \"libvirt-openstack-openstack-cell1-drll7\" (UID: \"b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d\") " pod="openstack/libvirt-openstack-openstack-cell1-drll7" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.716416 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-drll7\" (UID: \"b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d\") " pod="openstack/libvirt-openstack-openstack-cell1-drll7" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.716455 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-drll7\" (UID: \"b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d\") " pod="openstack/libvirt-openstack-openstack-cell1-drll7" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.717099 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d-inventory\") pod \"libvirt-openstack-openstack-cell1-drll7\" (UID: \"b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d\") " pod="openstack/libvirt-openstack-openstack-cell1-drll7" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.819427 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d-inventory\") pod \"libvirt-openstack-openstack-cell1-drll7\" (UID: \"b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d\") " pod="openstack/libvirt-openstack-openstack-cell1-drll7" Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 
23:33:45.819541 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rd8nf\" (UniqueName: \"kubernetes.io/projected/b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d-kube-api-access-rd8nf\") pod \"libvirt-openstack-openstack-cell1-drll7\" (UID: \"b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d\") " pod="openstack/libvirt-openstack-openstack-cell1-drll7"
Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.819589 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d-ceph\") pod \"libvirt-openstack-openstack-cell1-drll7\" (UID: \"b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d\") " pod="openstack/libvirt-openstack-openstack-cell1-drll7"
Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.819626 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d-ssh-key-openstack-cell1\") pod \"libvirt-openstack-openstack-cell1-drll7\" (UID: \"b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d\") " pod="openstack/libvirt-openstack-openstack-cell1-drll7"
Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.819664 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-drll7\" (UID: \"b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d\") " pod="openstack/libvirt-openstack-openstack-cell1-drll7"
Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.819688 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-drll7\" (UID: \"b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d\") " pod="openstack/libvirt-openstack-openstack-cell1-drll7"
Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.825713 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d-inventory\") pod \"libvirt-openstack-openstack-cell1-drll7\" (UID: \"b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d\") " pod="openstack/libvirt-openstack-openstack-cell1-drll7"
Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.825824 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-drll7\" (UID: \"b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d\") " pod="openstack/libvirt-openstack-openstack-cell1-drll7"
Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.827134 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d-ssh-key-openstack-cell1\") pod \"libvirt-openstack-openstack-cell1-drll7\" (UID: \"b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d\") " pod="openstack/libvirt-openstack-openstack-cell1-drll7"
Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.828448 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d-ceph\") pod \"libvirt-openstack-openstack-cell1-drll7\" (UID: \"b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d\") " pod="openstack/libvirt-openstack-openstack-cell1-drll7"
Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.830016 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-drll7\" (UID: \"b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d\") " pod="openstack/libvirt-openstack-openstack-cell1-drll7"
Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.843847 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rd8nf\" (UniqueName: \"kubernetes.io/projected/b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d-kube-api-access-rd8nf\") pod \"libvirt-openstack-openstack-cell1-drll7\" (UID: \"b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d\") " pod="openstack/libvirt-openstack-openstack-cell1-drll7"
Feb 19 23:33:45 crc kubenswrapper[4795]: I0219 23:33:45.875823 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-drll7"
Feb 19 23:33:46 crc kubenswrapper[4795]: I0219 23:33:46.470454 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-drll7"]
Feb 19 23:33:47 crc kubenswrapper[4795]: I0219 23:33:47.445060 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-drll7" event={"ID":"b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d","Type":"ContainerStarted","Data":"3fb8f68f6db9466d9e5013e9df140faafbadc024d0aebac89bdac3d89056c34b"}
Feb 19 23:33:47 crc kubenswrapper[4795]: I0219 23:33:47.445876 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-drll7" event={"ID":"b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d","Type":"ContainerStarted","Data":"340270680cada990d6856771e8b48e114d5f75d179a6b65f804e0367442e7b18"}
Feb 19 23:33:47 crc kubenswrapper[4795]: I0219 23:33:47.467729 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-openstack-openstack-cell1-drll7" podStartSLOduration=2.031218623 podStartE2EDuration="2.467708361s" podCreationTimestamp="2026-02-19 23:33:45 +0000 UTC" firstStartedPulling="2026-02-19 23:33:46.481866994 +0000 UTC m=+7537.674384858" lastFinishedPulling="2026-02-19 23:33:46.918356732 +0000 UTC m=+7538.110874596" observedRunningTime="2026-02-19 23:33:47.465338784 +0000 UTC m=+7538.657856688" watchObservedRunningTime="2026-02-19 23:33:47.467708361 +0000 UTC m=+7538.660226225"
Feb 19 23:35:28 crc kubenswrapper[4795]: I0219 23:35:28.427648 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 23:35:28 crc kubenswrapper[4795]: I0219 23:35:28.428134 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 23:35:53 crc kubenswrapper[4795]: I0219 23:35:53.086099 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hcqms"]
Feb 19 23:35:53 crc kubenswrapper[4795]: I0219 23:35:53.091700 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hcqms"
Feb 19 23:35:53 crc kubenswrapper[4795]: I0219 23:35:53.120250 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hcqms"]
Feb 19 23:35:53 crc kubenswrapper[4795]: I0219 23:35:53.262040 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92699807-f31f-4ef1-80a3-c85b5ae52267-utilities\") pod \"redhat-marketplace-hcqms\" (UID: \"92699807-f31f-4ef1-80a3-c85b5ae52267\") " pod="openshift-marketplace/redhat-marketplace-hcqms"
Feb 19 23:35:53 crc kubenswrapper[4795]: I0219 23:35:53.262257 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s76d4\" (UniqueName: \"kubernetes.io/projected/92699807-f31f-4ef1-80a3-c85b5ae52267-kube-api-access-s76d4\") pod \"redhat-marketplace-hcqms\" (UID: \"92699807-f31f-4ef1-80a3-c85b5ae52267\") " pod="openshift-marketplace/redhat-marketplace-hcqms"
Feb 19 23:35:53 crc kubenswrapper[4795]: I0219 23:35:53.262372 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92699807-f31f-4ef1-80a3-c85b5ae52267-catalog-content\") pod \"redhat-marketplace-hcqms\" (UID: \"92699807-f31f-4ef1-80a3-c85b5ae52267\") " pod="openshift-marketplace/redhat-marketplace-hcqms"
Feb 19 23:35:53 crc kubenswrapper[4795]: I0219 23:35:53.364776 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92699807-f31f-4ef1-80a3-c85b5ae52267-utilities\") pod \"redhat-marketplace-hcqms\" (UID: \"92699807-f31f-4ef1-80a3-c85b5ae52267\") " pod="openshift-marketplace/redhat-marketplace-hcqms"
Feb 19 23:35:53 crc kubenswrapper[4795]: I0219 23:35:53.364832 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s76d4\" (UniqueName: \"kubernetes.io/projected/92699807-f31f-4ef1-80a3-c85b5ae52267-kube-api-access-s76d4\") pod \"redhat-marketplace-hcqms\" (UID: \"92699807-f31f-4ef1-80a3-c85b5ae52267\") " pod="openshift-marketplace/redhat-marketplace-hcqms"
Feb 19 23:35:53 crc kubenswrapper[4795]: I0219 23:35:53.364891 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92699807-f31f-4ef1-80a3-c85b5ae52267-catalog-content\") pod \"redhat-marketplace-hcqms\" (UID: \"92699807-f31f-4ef1-80a3-c85b5ae52267\") " pod="openshift-marketplace/redhat-marketplace-hcqms"
Feb 19 23:35:53 crc kubenswrapper[4795]: I0219 23:35:53.365514 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92699807-f31f-4ef1-80a3-c85b5ae52267-catalog-content\") pod \"redhat-marketplace-hcqms\" (UID: \"92699807-f31f-4ef1-80a3-c85b5ae52267\") " pod="openshift-marketplace/redhat-marketplace-hcqms"
Feb 19 23:35:53 crc kubenswrapper[4795]: I0219 23:35:53.365770 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92699807-f31f-4ef1-80a3-c85b5ae52267-utilities\") pod \"redhat-marketplace-hcqms\" (UID: \"92699807-f31f-4ef1-80a3-c85b5ae52267\") " pod="openshift-marketplace/redhat-marketplace-hcqms"
Feb 19 23:35:53 crc kubenswrapper[4795]: I0219 23:35:53.395735 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s76d4\" (UniqueName: \"kubernetes.io/projected/92699807-f31f-4ef1-80a3-c85b5ae52267-kube-api-access-s76d4\") pod \"redhat-marketplace-hcqms\" (UID: \"92699807-f31f-4ef1-80a3-c85b5ae52267\") " pod="openshift-marketplace/redhat-marketplace-hcqms"
Feb 19 23:35:53 crc kubenswrapper[4795]: I0219 23:35:53.422869 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hcqms"
Feb 19 23:35:53 crc kubenswrapper[4795]: I0219 23:35:53.902302 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hcqms"]
Feb 19 23:35:54 crc kubenswrapper[4795]: I0219 23:35:54.777498 4795 generic.go:334] "Generic (PLEG): container finished" podID="92699807-f31f-4ef1-80a3-c85b5ae52267" containerID="075ff9636f71ea6b36cce9312e631fc0c64c7440971ccbcdb446ae2d249ac68d" exitCode=0
Feb 19 23:35:54 crc kubenswrapper[4795]: I0219 23:35:54.777584 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hcqms" event={"ID":"92699807-f31f-4ef1-80a3-c85b5ae52267","Type":"ContainerDied","Data":"075ff9636f71ea6b36cce9312e631fc0c64c7440971ccbcdb446ae2d249ac68d"}
Feb 19 23:35:54 crc kubenswrapper[4795]: I0219 23:35:54.777835 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hcqms" event={"ID":"92699807-f31f-4ef1-80a3-c85b5ae52267","Type":"ContainerStarted","Data":"c8819ed89cfef249f5f7927f5e7d5d89c50e025a9ea87be413c8b7593d29b7f2"}
Feb 19 23:35:54 crc kubenswrapper[4795]: I0219 23:35:54.780185 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 19 23:35:55 crc kubenswrapper[4795]: I0219 23:35:55.788948 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hcqms" event={"ID":"92699807-f31f-4ef1-80a3-c85b5ae52267","Type":"ContainerStarted","Data":"bc4e8940e2128aa71b442ef79e89ec98fa8c8ba519f41bdd1d4f14b52325a51b"}
Feb 19 23:35:56 crc kubenswrapper[4795]: I0219 23:35:56.806337 4795 generic.go:334] "Generic (PLEG): container finished" podID="92699807-f31f-4ef1-80a3-c85b5ae52267" containerID="bc4e8940e2128aa71b442ef79e89ec98fa8c8ba519f41bdd1d4f14b52325a51b" exitCode=0
Feb 19 23:35:56 crc kubenswrapper[4795]: I0219 23:35:56.806396 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hcqms" event={"ID":"92699807-f31f-4ef1-80a3-c85b5ae52267","Type":"ContainerDied","Data":"bc4e8940e2128aa71b442ef79e89ec98fa8c8ba519f41bdd1d4f14b52325a51b"}
Feb 19 23:35:57 crc kubenswrapper[4795]: I0219 23:35:57.820399 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hcqms" event={"ID":"92699807-f31f-4ef1-80a3-c85b5ae52267","Type":"ContainerStarted","Data":"f30a680eb80ce693ac8d0b06f622aff4dfe172c9fef0f9ac28f9a5b3bf27d44a"}
Feb 19 23:35:57 crc kubenswrapper[4795]: I0219 23:35:57.843827 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hcqms" podStartSLOduration=2.411270578 podStartE2EDuration="4.843807916s" podCreationTimestamp="2026-02-19 23:35:53 +0000 UTC" firstStartedPulling="2026-02-19 23:35:54.779891316 +0000 UTC m=+7665.972409180" lastFinishedPulling="2026-02-19 23:35:57.212428644 +0000 UTC m=+7668.404946518" observedRunningTime="2026-02-19 23:35:57.836563601 +0000 UTC m=+7669.029081515" watchObservedRunningTime="2026-02-19 23:35:57.843807916 +0000 UTC m=+7669.036325770"
Feb 19 23:35:58 crc kubenswrapper[4795]: I0219 23:35:58.427298 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 23:35:58 crc kubenswrapper[4795]: I0219 23:35:58.427352 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 23:36:03 crc kubenswrapper[4795]: I0219 23:36:03.424018 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hcqms"
Feb 19 23:36:03 crc kubenswrapper[4795]: I0219 23:36:03.424611 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hcqms"
Feb 19 23:36:03 crc kubenswrapper[4795]: I0219 23:36:03.482059 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hcqms"
Feb 19 23:36:03 crc kubenswrapper[4795]: I0219 23:36:03.919157 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hcqms"
Feb 19 23:36:03 crc kubenswrapper[4795]: I0219 23:36:03.988448 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hcqms"]
Feb 19 23:36:05 crc kubenswrapper[4795]: I0219 23:36:05.892876 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hcqms" podUID="92699807-f31f-4ef1-80a3-c85b5ae52267" containerName="registry-server" containerID="cri-o://f30a680eb80ce693ac8d0b06f622aff4dfe172c9fef0f9ac28f9a5b3bf27d44a" gracePeriod=2
Feb 19 23:36:06 crc kubenswrapper[4795]: I0219 23:36:06.908606 4795 generic.go:334] "Generic (PLEG): container finished" podID="92699807-f31f-4ef1-80a3-c85b5ae52267" containerID="f30a680eb80ce693ac8d0b06f622aff4dfe172c9fef0f9ac28f9a5b3bf27d44a" exitCode=0
Feb 19 23:36:06 crc kubenswrapper[4795]: I0219 23:36:06.908750 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hcqms" event={"ID":"92699807-f31f-4ef1-80a3-c85b5ae52267","Type":"ContainerDied","Data":"f30a680eb80ce693ac8d0b06f622aff4dfe172c9fef0f9ac28f9a5b3bf27d44a"}
Feb 19 23:36:06 crc kubenswrapper[4795]: I0219 23:36:06.909140 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hcqms" event={"ID":"92699807-f31f-4ef1-80a3-c85b5ae52267","Type":"ContainerDied","Data":"c8819ed89cfef249f5f7927f5e7d5d89c50e025a9ea87be413c8b7593d29b7f2"}
Feb 19 23:36:06 crc kubenswrapper[4795]: I0219 23:36:06.909157 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8819ed89cfef249f5f7927f5e7d5d89c50e025a9ea87be413c8b7593d29b7f2"
Feb 19 23:36:06 crc kubenswrapper[4795]: I0219 23:36:06.926748 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hcqms"
Feb 19 23:36:07 crc kubenswrapper[4795]: I0219 23:36:07.064311 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92699807-f31f-4ef1-80a3-c85b5ae52267-catalog-content\") pod \"92699807-f31f-4ef1-80a3-c85b5ae52267\" (UID: \"92699807-f31f-4ef1-80a3-c85b5ae52267\") "
Feb 19 23:36:07 crc kubenswrapper[4795]: I0219 23:36:07.064423 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s76d4\" (UniqueName: \"kubernetes.io/projected/92699807-f31f-4ef1-80a3-c85b5ae52267-kube-api-access-s76d4\") pod \"92699807-f31f-4ef1-80a3-c85b5ae52267\" (UID: \"92699807-f31f-4ef1-80a3-c85b5ae52267\") "
Feb 19 23:36:07 crc kubenswrapper[4795]: I0219 23:36:07.064726 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92699807-f31f-4ef1-80a3-c85b5ae52267-utilities\") pod \"92699807-f31f-4ef1-80a3-c85b5ae52267\" (UID: \"92699807-f31f-4ef1-80a3-c85b5ae52267\") "
Feb 19 23:36:07 crc kubenswrapper[4795]: I0219 23:36:07.066462 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92699807-f31f-4ef1-80a3-c85b5ae52267-utilities" (OuterVolumeSpecName: "utilities") pod "92699807-f31f-4ef1-80a3-c85b5ae52267" (UID: "92699807-f31f-4ef1-80a3-c85b5ae52267"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 23:36:07 crc kubenswrapper[4795]: I0219 23:36:07.076517 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92699807-f31f-4ef1-80a3-c85b5ae52267-kube-api-access-s76d4" (OuterVolumeSpecName: "kube-api-access-s76d4") pod "92699807-f31f-4ef1-80a3-c85b5ae52267" (UID: "92699807-f31f-4ef1-80a3-c85b5ae52267"). InnerVolumeSpecName "kube-api-access-s76d4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 23:36:07 crc kubenswrapper[4795]: I0219 23:36:07.086938 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92699807-f31f-4ef1-80a3-c85b5ae52267-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "92699807-f31f-4ef1-80a3-c85b5ae52267" (UID: "92699807-f31f-4ef1-80a3-c85b5ae52267"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 23:36:07 crc kubenswrapper[4795]: I0219 23:36:07.167457 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92699807-f31f-4ef1-80a3-c85b5ae52267-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 23:36:07 crc kubenswrapper[4795]: I0219 23:36:07.167485 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s76d4\" (UniqueName: \"kubernetes.io/projected/92699807-f31f-4ef1-80a3-c85b5ae52267-kube-api-access-s76d4\") on node \"crc\" DevicePath \"\""
Feb 19 23:36:07 crc kubenswrapper[4795]: I0219 23:36:07.167495 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92699807-f31f-4ef1-80a3-c85b5ae52267-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 23:36:07 crc kubenswrapper[4795]: I0219 23:36:07.923212 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hcqms"
Feb 19 23:36:07 crc kubenswrapper[4795]: I0219 23:36:07.962131 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hcqms"]
Feb 19 23:36:07 crc kubenswrapper[4795]: I0219 23:36:07.980428 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hcqms"]
Feb 19 23:36:09 crc kubenswrapper[4795]: I0219 23:36:09.537388 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92699807-f31f-4ef1-80a3-c85b5ae52267" path="/var/lib/kubelet/pods/92699807-f31f-4ef1-80a3-c85b5ae52267/volumes"
Feb 19 23:36:28 crc kubenswrapper[4795]: I0219 23:36:28.427028 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 23:36:28 crc kubenswrapper[4795]: I0219 23:36:28.427598 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 23:36:28 crc kubenswrapper[4795]: I0219 23:36:28.427643 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d"
Feb 19 23:36:28 crc kubenswrapper[4795]: I0219 23:36:28.428500 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"75db33e6881035ea4c66ab0cccb431be773bc68e63dc9e6c4480e308b3f4695f"} pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 19 23:36:28 crc kubenswrapper[4795]: I0219 23:36:28.428564 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" containerID="cri-o://75db33e6881035ea4c66ab0cccb431be773bc68e63dc9e6c4480e308b3f4695f" gracePeriod=600
Feb 19 23:36:28 crc kubenswrapper[4795]: E0219 23:36:28.552588 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 23:36:28 crc kubenswrapper[4795]: I0219 23:36:28.736676 4795 generic.go:334] "Generic (PLEG): container finished" podID="7591bc58-96f5-486a-8653-0ad93938b019" containerID="75db33e6881035ea4c66ab0cccb431be773bc68e63dc9e6c4480e308b3f4695f" exitCode=0
Feb 19 23:36:28 crc kubenswrapper[4795]: I0219 23:36:28.736734 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerDied","Data":"75db33e6881035ea4c66ab0cccb431be773bc68e63dc9e6c4480e308b3f4695f"}
Feb 19 23:36:28 crc kubenswrapper[4795]: I0219 23:36:28.738131 4795 scope.go:117] "RemoveContainer" containerID="53ab375a2fc3fa1ef244b68bab3723d6db871a1c5fc51b2d8dec9ed6289bc190"
Feb 19 23:36:28 crc kubenswrapper[4795]: I0219 23:36:28.738885 4795 scope.go:117] "RemoveContainer" containerID="75db33e6881035ea4c66ab0cccb431be773bc68e63dc9e6c4480e308b3f4695f"
Feb 19 23:36:28 crc kubenswrapper[4795]: E0219 23:36:28.739201 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 23:36:41 crc kubenswrapper[4795]: I0219 23:36:41.511553 4795 scope.go:117] "RemoveContainer" containerID="75db33e6881035ea4c66ab0cccb431be773bc68e63dc9e6c4480e308b3f4695f"
Feb 19 23:36:41 crc kubenswrapper[4795]: E0219 23:36:41.512658 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 23:36:56 crc kubenswrapper[4795]: I0219 23:36:56.512393 4795 scope.go:117] "RemoveContainer" containerID="75db33e6881035ea4c66ab0cccb431be773bc68e63dc9e6c4480e308b3f4695f"
Feb 19 23:36:56 crc kubenswrapper[4795]: E0219 23:36:56.514507 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 23:37:09 crc kubenswrapper[4795]: I0219 23:37:09.520733 4795 scope.go:117] "RemoveContainer" containerID="75db33e6881035ea4c66ab0cccb431be773bc68e63dc9e6c4480e308b3f4695f"
Feb 19 23:37:09 crc kubenswrapper[4795]: E0219 23:37:09.521956 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 23:37:20 crc kubenswrapper[4795]: I0219 23:37:20.512473 4795 scope.go:117] "RemoveContainer" containerID="75db33e6881035ea4c66ab0cccb431be773bc68e63dc9e6c4480e308b3f4695f"
Feb 19 23:37:20 crc kubenswrapper[4795]: E0219 23:37:20.513400 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 23:37:32 crc kubenswrapper[4795]: I0219 23:37:32.513463 4795 scope.go:117] "RemoveContainer" containerID="75db33e6881035ea4c66ab0cccb431be773bc68e63dc9e6c4480e308b3f4695f"
Feb 19 23:37:32 crc kubenswrapper[4795]: E0219 23:37:32.514258 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 23:37:43 crc kubenswrapper[4795]: I0219 23:37:43.512389 4795 scope.go:117] "RemoveContainer" containerID="75db33e6881035ea4c66ab0cccb431be773bc68e63dc9e6c4480e308b3f4695f"
Feb 19 23:37:43 crc kubenswrapper[4795]: E0219 23:37:43.513543 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 23:37:56 crc kubenswrapper[4795]: I0219 23:37:56.512492 4795 scope.go:117] "RemoveContainer" containerID="75db33e6881035ea4c66ab0cccb431be773bc68e63dc9e6c4480e308b3f4695f"
Feb 19 23:37:56 crc kubenswrapper[4795]: E0219 23:37:56.513781 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 23:38:10 crc kubenswrapper[4795]: I0219 23:38:10.512594 4795 scope.go:117] "RemoveContainer" containerID="75db33e6881035ea4c66ab0cccb431be773bc68e63dc9e6c4480e308b3f4695f"
Feb 19 23:38:10 crc kubenswrapper[4795]: E0219 23:38:10.513570 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 19 23:38:13 crc kubenswrapper[4795]: I0219 23:38:13.815435 4795 generic.go:334] "Generic (PLEG): container finished" podID="b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d" containerID="3fb8f68f6db9466d9e5013e9df140faafbadc024d0aebac89bdac3d89056c34b" exitCode=0
Feb 19 23:38:13 crc kubenswrapper[4795]: I0219 23:38:13.815595 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-drll7" event={"ID":"b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d","Type":"ContainerDied","Data":"3fb8f68f6db9466d9e5013e9df140faafbadc024d0aebac89bdac3d89056c34b"}
Feb 19 23:38:15 crc kubenswrapper[4795]: I0219 23:38:15.369901 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-drll7"
Feb 19 23:38:15 crc kubenswrapper[4795]: I0219 23:38:15.534960 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d-ceph\") pod \"b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d\" (UID: \"b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d\") "
Feb 19 23:38:15 crc kubenswrapper[4795]: I0219 23:38:15.535020 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d-inventory\") pod \"b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d\" (UID: \"b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d\") "
Feb 19 23:38:15 crc kubenswrapper[4795]: I0219 23:38:15.535066 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d-libvirt-combined-ca-bundle\") pod \"b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d\" (UID: \"b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d\") "
Feb 19 23:38:15 crc kubenswrapper[4795]: I0219 23:38:15.535102 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d-libvirt-secret-0\") pod \"b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d\" (UID: \"b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d\") "
Feb 19 23:38:15 crc kubenswrapper[4795]: I0219 23:38:15.535230 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rd8nf\" (UniqueName: \"kubernetes.io/projected/b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d-kube-api-access-rd8nf\") pod \"b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d\" (UID: \"b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d\") "
Feb 19 23:38:15 crc kubenswrapper[4795]: I0219 23:38:15.535295 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d-ssh-key-openstack-cell1\") pod \"b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d\" (UID: \"b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d\") "
Feb 19 23:38:15 crc kubenswrapper[4795]: I0219 23:38:15.541380 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d" (UID: "b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:38:15 crc kubenswrapper[4795]: I0219 23:38:15.541474 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d-kube-api-access-rd8nf" (OuterVolumeSpecName: "kube-api-access-rd8nf") pod "b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d" (UID: "b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d"). InnerVolumeSpecName "kube-api-access-rd8nf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 23:38:15 crc kubenswrapper[4795]: I0219 23:38:15.548472 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d-ceph" (OuterVolumeSpecName: "ceph") pod "b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d" (UID: "b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:38:15 crc kubenswrapper[4795]: I0219 23:38:15.565425 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d" (UID: "b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:38:15 crc kubenswrapper[4795]: I0219 23:38:15.575498 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d-inventory" (OuterVolumeSpecName: "inventory") pod "b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d" (UID: "b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:38:15 crc kubenswrapper[4795]: I0219 23:38:15.578302 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d" (UID: "b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:38:15 crc kubenswrapper[4795]: I0219 23:38:15.640420 4795 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d-ceph\") on node \"crc\" DevicePath \"\""
Feb 19 23:38:15 crc kubenswrapper[4795]: I0219 23:38:15.640872 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d-inventory\") on node \"crc\" DevicePath \"\""
Feb 19 23:38:15 crc kubenswrapper[4795]: I0219 23:38:15.641130 4795 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 23:38:15 crc kubenswrapper[4795]: I0219 23:38:15.641355 4795 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d-libvirt-secret-0\") on node \"crc\" DevicePath \"\""
Feb 19 23:38:15 crc kubenswrapper[4795]: I0219 23:38:15.641762 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rd8nf\" (UniqueName: \"kubernetes.io/projected/b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d-kube-api-access-rd8nf\") on node \"crc\" DevicePath \"\""
Feb 19 23:38:15 crc kubenswrapper[4795]: I0219 23:38:15.641900 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\""
Feb 19 23:38:15 crc kubenswrapper[4795]: I0219 23:38:15.840327 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-drll7" event={"ID":"b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d","Type":"ContainerDied","Data":"340270680cada990d6856771e8b48e114d5f75d179a6b65f804e0367442e7b18"}
Feb 19 23:38:15 crc kubenswrapper[4795]: I0219 23:38:15.840406 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="340270680cada990d6856771e8b48e114d5f75d179a6b65f804e0367442e7b18" Feb 19 23:38:15 crc kubenswrapper[4795]: I0219 23:38:15.840471 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-drll7" Feb 19 23:38:15 crc kubenswrapper[4795]: I0219 23:38:15.948333 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-q26dr"] Feb 19 23:38:15 crc kubenswrapper[4795]: E0219 23:38:15.949105 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92699807-f31f-4ef1-80a3-c85b5ae52267" containerName="extract-content" Feb 19 23:38:15 crc kubenswrapper[4795]: I0219 23:38:15.949126 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="92699807-f31f-4ef1-80a3-c85b5ae52267" containerName="extract-content" Feb 19 23:38:15 crc kubenswrapper[4795]: E0219 23:38:15.949135 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92699807-f31f-4ef1-80a3-c85b5ae52267" containerName="registry-server" Feb 19 23:38:15 crc kubenswrapper[4795]: I0219 23:38:15.949214 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="92699807-f31f-4ef1-80a3-c85b5ae52267" containerName="registry-server" Feb 19 23:38:15 crc kubenswrapper[4795]: E0219 23:38:15.949252 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d" containerName="libvirt-openstack-openstack-cell1" Feb 19 23:38:15 crc kubenswrapper[4795]: I0219 23:38:15.949261 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d" containerName="libvirt-openstack-openstack-cell1" Feb 19 23:38:15 crc kubenswrapper[4795]: E0219 23:38:15.949279 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92699807-f31f-4ef1-80a3-c85b5ae52267" 
containerName="extract-utilities" Feb 19 23:38:15 crc kubenswrapper[4795]: I0219 23:38:15.949285 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="92699807-f31f-4ef1-80a3-c85b5ae52267" containerName="extract-utilities" Feb 19 23:38:15 crc kubenswrapper[4795]: I0219 23:38:15.949474 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="92699807-f31f-4ef1-80a3-c85b5ae52267" containerName="registry-server" Feb 19 23:38:15 crc kubenswrapper[4795]: I0219 23:38:15.949488 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d" containerName="libvirt-openstack-openstack-cell1" Feb 19 23:38:15 crc kubenswrapper[4795]: I0219 23:38:15.950245 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" Feb 19 23:38:15 crc kubenswrapper[4795]: I0219 23:38:15.956521 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Feb 19 23:38:15 crc kubenswrapper[4795]: I0219 23:38:15.956735 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-cells-global-config" Feb 19 23:38:15 crc kubenswrapper[4795]: I0219 23:38:15.956859 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 23:38:15 crc kubenswrapper[4795]: I0219 23:38:15.957011 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Feb 19 23:38:15 crc kubenswrapper[4795]: I0219 23:38:15.957596 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-brdnv" Feb 19 23:38:15 crc kubenswrapper[4795]: I0219 23:38:15.957866 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 19 23:38:15 crc kubenswrapper[4795]: I0219 23:38:15.958411 4795 reflector.go:368] Caches populated for 
*v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 19 23:38:15 crc kubenswrapper[4795]: I0219 23:38:15.976202 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-q26dr"] Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.051141 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-openstack-cell1-q26dr\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.051285 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-q26dr\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.051382 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-q26dr\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.051468 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-q26dr\" (UID: 
\"df818d88-cec5-4daf-8b17-cc4bb298b498\") " pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.051526 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-inventory\") pod \"nova-cell1-openstack-openstack-cell1-q26dr\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.051576 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-q26dr\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.051668 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-ceph\") pod \"nova-cell1-openstack-openstack-cell1-q26dr\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.051705 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-q26dr\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.051757 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-q26dr\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.051827 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-q26dr\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.051898 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2sgz\" (UniqueName: \"kubernetes.io/projected/df818d88-cec5-4daf-8b17-cc4bb298b498-kube-api-access-t2sgz\") pod \"nova-cell1-openstack-openstack-cell1-q26dr\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.052020 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-openstack-cell1-q26dr\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.052208 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-ssh-key-openstack-cell1\") pod 
\"nova-cell1-openstack-openstack-cell1-q26dr\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.154246 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-openstack-cell1-q26dr\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.154352 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-openstack-cell1-q26dr\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.154402 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-q26dr\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.154446 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-q26dr\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.154481 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-q26dr\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.154517 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-inventory\") pod \"nova-cell1-openstack-openstack-cell1-q26dr\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.154540 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-q26dr\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.154579 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-ceph\") pod \"nova-cell1-openstack-openstack-cell1-q26dr\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.154619 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-q26dr\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " 
pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.154640 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-q26dr\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.154739 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-q26dr\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.154794 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2sgz\" (UniqueName: \"kubernetes.io/projected/df818d88-cec5-4daf-8b17-cc4bb298b498-kube-api-access-t2sgz\") pod \"nova-cell1-openstack-openstack-cell1-q26dr\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.154860 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-openstack-cell1-q26dr\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.156682 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: 
\"kubernetes.io/configmap/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-q26dr\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.157003 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-q26dr\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.161029 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-inventory\") pod \"nova-cell1-openstack-openstack-cell1-q26dr\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.161033 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-ceph\") pod \"nova-cell1-openstack-openstack-cell1-q26dr\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.161498 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-openstack-cell1-q26dr\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.161816 4795 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-openstack-cell1-q26dr\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.161844 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-openstack-cell1-q26dr\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.162893 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-q26dr\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.163484 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-q26dr\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.166416 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-q26dr\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") 
" pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.166824 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-q26dr\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.167035 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-q26dr\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.179610 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2sgz\" (UniqueName: \"kubernetes.io/projected/df818d88-cec5-4daf-8b17-cc4bb298b498-kube-api-access-t2sgz\") pod \"nova-cell1-openstack-openstack-cell1-q26dr\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.283826 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" Feb 19 23:38:16 crc kubenswrapper[4795]: I0219 23:38:16.873229 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-q26dr"] Feb 19 23:38:17 crc kubenswrapper[4795]: I0219 23:38:17.862325 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" event={"ID":"df818d88-cec5-4daf-8b17-cc4bb298b498","Type":"ContainerStarted","Data":"c612bf4c4e008dbfc987ab33b3be7f695dac905bf903aefb2bc487385869b975"} Feb 19 23:38:17 crc kubenswrapper[4795]: I0219 23:38:17.862772 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" event={"ID":"df818d88-cec5-4daf-8b17-cc4bb298b498","Type":"ContainerStarted","Data":"3ddf7886211510da97446e708903944220215664f3eae31a24ed524c038decc3"} Feb 19 23:38:17 crc kubenswrapper[4795]: I0219 23:38:17.881411 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" podStartSLOduration=2.376485352 podStartE2EDuration="2.881383398s" podCreationTimestamp="2026-02-19 23:38:15 +0000 UTC" firstStartedPulling="2026-02-19 23:38:16.882508804 +0000 UTC m=+7808.075026708" lastFinishedPulling="2026-02-19 23:38:17.38740689 +0000 UTC m=+7808.579924754" observedRunningTime="2026-02-19 23:38:17.879356581 +0000 UTC m=+7809.071874445" watchObservedRunningTime="2026-02-19 23:38:17.881383398 +0000 UTC m=+7809.073901282" Feb 19 23:38:21 crc kubenswrapper[4795]: I0219 23:38:21.512136 4795 scope.go:117] "RemoveContainer" containerID="75db33e6881035ea4c66ab0cccb431be773bc68e63dc9e6c4480e308b3f4695f" Feb 19 23:38:21 crc kubenswrapper[4795]: E0219 23:38:21.513207 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:38:36 crc kubenswrapper[4795]: I0219 23:38:36.511894 4795 scope.go:117] "RemoveContainer" containerID="75db33e6881035ea4c66ab0cccb431be773bc68e63dc9e6c4480e308b3f4695f" Feb 19 23:38:36 crc kubenswrapper[4795]: E0219 23:38:36.513250 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:38:47 crc kubenswrapper[4795]: I0219 23:38:47.512215 4795 scope.go:117] "RemoveContainer" containerID="75db33e6881035ea4c66ab0cccb431be773bc68e63dc9e6c4480e308b3f4695f" Feb 19 23:38:47 crc kubenswrapper[4795]: E0219 23:38:47.514506 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:39:00 crc kubenswrapper[4795]: I0219 23:39:00.511912 4795 scope.go:117] "RemoveContainer" containerID="75db33e6881035ea4c66ab0cccb431be773bc68e63dc9e6c4480e308b3f4695f" Feb 19 23:39:00 crc kubenswrapper[4795]: E0219 23:39:00.513746 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:39:11 crc kubenswrapper[4795]: I0219 23:39:11.514702 4795 scope.go:117] "RemoveContainer" containerID="75db33e6881035ea4c66ab0cccb431be773bc68e63dc9e6c4480e308b3f4695f" Feb 19 23:39:11 crc kubenswrapper[4795]: E0219 23:39:11.515567 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:39:24 crc kubenswrapper[4795]: I0219 23:39:24.512692 4795 scope.go:117] "RemoveContainer" containerID="75db33e6881035ea4c66ab0cccb431be773bc68e63dc9e6c4480e308b3f4695f" Feb 19 23:39:24 crc kubenswrapper[4795]: E0219 23:39:24.513785 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:39:39 crc kubenswrapper[4795]: I0219 23:39:39.520606 4795 scope.go:117] "RemoveContainer" containerID="75db33e6881035ea4c66ab0cccb431be773bc68e63dc9e6c4480e308b3f4695f" Feb 19 23:39:39 crc kubenswrapper[4795]: E0219 23:39:39.521849 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:39:52 crc kubenswrapper[4795]: I0219 23:39:52.512202 4795 scope.go:117] "RemoveContainer" containerID="75db33e6881035ea4c66ab0cccb431be773bc68e63dc9e6c4480e308b3f4695f" Feb 19 23:39:52 crc kubenswrapper[4795]: E0219 23:39:52.513383 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:40:04 crc kubenswrapper[4795]: I0219 23:40:04.512360 4795 scope.go:117] "RemoveContainer" containerID="75db33e6881035ea4c66ab0cccb431be773bc68e63dc9e6c4480e308b3f4695f" Feb 19 23:40:04 crc kubenswrapper[4795]: E0219 23:40:04.513192 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:40:19 crc kubenswrapper[4795]: I0219 23:40:19.519003 4795 scope.go:117] "RemoveContainer" containerID="75db33e6881035ea4c66ab0cccb431be773bc68e63dc9e6c4480e308b3f4695f" Feb 19 23:40:19 crc kubenswrapper[4795]: E0219 23:40:19.519921 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:40:30 crc kubenswrapper[4795]: I0219 23:40:30.512668 4795 scope.go:117] "RemoveContainer" containerID="75db33e6881035ea4c66ab0cccb431be773bc68e63dc9e6c4480e308b3f4695f" Feb 19 23:40:30 crc kubenswrapper[4795]: E0219 23:40:30.513480 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:40:42 crc kubenswrapper[4795]: I0219 23:40:42.512079 4795 scope.go:117] "RemoveContainer" containerID="75db33e6881035ea4c66ab0cccb431be773bc68e63dc9e6c4480e308b3f4695f" Feb 19 23:40:42 crc kubenswrapper[4795]: E0219 23:40:42.512958 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:40:56 crc kubenswrapper[4795]: I0219 23:40:56.512338 4795 scope.go:117] "RemoveContainer" containerID="75db33e6881035ea4c66ab0cccb431be773bc68e63dc9e6c4480e308b3f4695f" Feb 19 23:40:56 crc kubenswrapper[4795]: E0219 23:40:56.513267 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:40:59 crc kubenswrapper[4795]: I0219 23:40:59.533879 4795 generic.go:334] "Generic (PLEG): container finished" podID="df818d88-cec5-4daf-8b17-cc4bb298b498" containerID="c612bf4c4e008dbfc987ab33b3be7f695dac905bf903aefb2bc487385869b975" exitCode=0 Feb 19 23:40:59 crc kubenswrapper[4795]: I0219 23:40:59.533923 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" event={"ID":"df818d88-cec5-4daf-8b17-cc4bb298b498","Type":"ContainerDied","Data":"c612bf4c4e008dbfc987ab33b3be7f695dac905bf903aefb2bc487385869b975"} Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.020045 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.098250 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-migration-ssh-key-0\") pod \"df818d88-cec5-4daf-8b17-cc4bb298b498\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.098308 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-inventory\") pod \"df818d88-cec5-4daf-8b17-cc4bb298b498\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.098347 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cell1-compute-config-1\") pod \"df818d88-cec5-4daf-8b17-cc4bb298b498\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.098398 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cell1-compute-config-0\") pod \"df818d88-cec5-4daf-8b17-cc4bb298b498\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.098432 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-ssh-key-openstack-cell1\") pod \"df818d88-cec5-4daf-8b17-cc4bb298b498\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.098482 4795 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-ceph\") pod \"df818d88-cec5-4daf-8b17-cc4bb298b498\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.098540 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cell1-compute-config-3\") pod \"df818d88-cec5-4daf-8b17-cc4bb298b498\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.098581 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2sgz\" (UniqueName: \"kubernetes.io/projected/df818d88-cec5-4daf-8b17-cc4bb298b498-kube-api-access-t2sgz\") pod \"df818d88-cec5-4daf-8b17-cc4bb298b498\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.098604 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-migration-ssh-key-1\") pod \"df818d88-cec5-4daf-8b17-cc4bb298b498\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.098631 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cell1-combined-ca-bundle\") pod \"df818d88-cec5-4daf-8b17-cc4bb298b498\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.098653 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: 
\"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cell1-compute-config-2\") pod \"df818d88-cec5-4daf-8b17-cc4bb298b498\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.098702 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cells-global-config-0\") pod \"df818d88-cec5-4daf-8b17-cc4bb298b498\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.098743 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cells-global-config-1\") pod \"df818d88-cec5-4daf-8b17-cc4bb298b498\" (UID: \"df818d88-cec5-4daf-8b17-cc4bb298b498\") " Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.117245 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df818d88-cec5-4daf-8b17-cc4bb298b498-kube-api-access-t2sgz" (OuterVolumeSpecName: "kube-api-access-t2sgz") pod "df818d88-cec5-4daf-8b17-cc4bb298b498" (UID: "df818d88-cec5-4daf-8b17-cc4bb298b498"). InnerVolumeSpecName "kube-api-access-t2sgz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.121398 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-ceph" (OuterVolumeSpecName: "ceph") pod "df818d88-cec5-4daf-8b17-cc4bb298b498" (UID: "df818d88-cec5-4daf-8b17-cc4bb298b498"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.126308 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "df818d88-cec5-4daf-8b17-cc4bb298b498" (UID: "df818d88-cec5-4daf-8b17-cc4bb298b498"). InnerVolumeSpecName "nova-cell1-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.127625 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "df818d88-cec5-4daf-8b17-cc4bb298b498" (UID: "df818d88-cec5-4daf-8b17-cc4bb298b498"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.129375 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cells-global-config-1" (OuterVolumeSpecName: "nova-cells-global-config-1") pod "df818d88-cec5-4daf-8b17-cc4bb298b498" (UID: "df818d88-cec5-4daf-8b17-cc4bb298b498"). InnerVolumeSpecName "nova-cells-global-config-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.132580 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "df818d88-cec5-4daf-8b17-cc4bb298b498" (UID: "df818d88-cec5-4daf-8b17-cc4bb298b498"). InnerVolumeSpecName "nova-cell1-compute-config-3". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.133988 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "df818d88-cec5-4daf-8b17-cc4bb298b498" (UID: "df818d88-cec5-4daf-8b17-cc4bb298b498"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.139379 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "df818d88-cec5-4daf-8b17-cc4bb298b498" (UID: "df818d88-cec5-4daf-8b17-cc4bb298b498"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.139699 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-inventory" (OuterVolumeSpecName: "inventory") pod "df818d88-cec5-4daf-8b17-cc4bb298b498" (UID: "df818d88-cec5-4daf-8b17-cc4bb298b498"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.141303 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "df818d88-cec5-4daf-8b17-cc4bb298b498" (UID: "df818d88-cec5-4daf-8b17-cc4bb298b498"). InnerVolumeSpecName "nova-migration-ssh-key-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.143018 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "df818d88-cec5-4daf-8b17-cc4bb298b498" (UID: "df818d88-cec5-4daf-8b17-cc4bb298b498"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.150414 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "df818d88-cec5-4daf-8b17-cc4bb298b498" (UID: "df818d88-cec5-4daf-8b17-cc4bb298b498"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.152354 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "df818d88-cec5-4daf-8b17-cc4bb298b498" (UID: "df818d88-cec5-4daf-8b17-cc4bb298b498"). InnerVolumeSpecName "nova-cells-global-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.201726 4795 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\"" Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.201760 4795 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cells-global-config-1\") on node \"crc\" DevicePath \"\"" Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.201785 4795 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.201796 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.201805 4795 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.201815 4795 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.201851 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: 
\"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.201860 4795 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-ceph\") on node \"crc\" DevicePath \"\"" Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.201869 4795 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.201879 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2sgz\" (UniqueName: \"kubernetes.io/projected/df818d88-cec5-4daf-8b17-cc4bb298b498-kube-api-access-t2sgz\") on node \"crc\" DevicePath \"\"" Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.201888 4795 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.201896 4795 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.201926 4795 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/df818d88-cec5-4daf-8b17-cc4bb298b498-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.554497 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" 
event={"ID":"df818d88-cec5-4daf-8b17-cc4bb298b498","Type":"ContainerDied","Data":"3ddf7886211510da97446e708903944220215664f3eae31a24ed524c038decc3"} Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.554821 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ddf7886211510da97446e708903944220215664f3eae31a24ed524c038decc3" Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.554559 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-q26dr" Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.650596 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-45fwk"] Feb 19 23:41:01 crc kubenswrapper[4795]: E0219 23:41:01.650988 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df818d88-cec5-4daf-8b17-cc4bb298b498" containerName="nova-cell1-openstack-openstack-cell1" Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.651005 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="df818d88-cec5-4daf-8b17-cc4bb298b498" containerName="nova-cell1-openstack-openstack-cell1" Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.651239 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="df818d88-cec5-4daf-8b17-cc4bb298b498" containerName="nova-cell1-openstack-openstack-cell1" Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.651894 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-45fwk" Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.654664 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.654950 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.655014 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.654954 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.655181 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-brdnv" Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.664013 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-45fwk"] Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.710868 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-45fwk\" (UID: \"8272a408-0416-4077-9e85-b2962992b3f4\") " pod="openstack/telemetry-openstack-openstack-cell1-45fwk" Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.710916 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-45fwk\" (UID: \"8272a408-0416-4077-9e85-b2962992b3f4\") " 
pod="openstack/telemetry-openstack-openstack-cell1-45fwk" Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.711024 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-45fwk\" (UID: \"8272a408-0416-4077-9e85-b2962992b3f4\") " pod="openstack/telemetry-openstack-openstack-cell1-45fwk" Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.711054 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-inventory\") pod \"telemetry-openstack-openstack-cell1-45fwk\" (UID: \"8272a408-0416-4077-9e85-b2962992b3f4\") " pod="openstack/telemetry-openstack-openstack-cell1-45fwk" Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.711231 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d52zf\" (UniqueName: \"kubernetes.io/projected/8272a408-0416-4077-9e85-b2962992b3f4-kube-api-access-d52zf\") pod \"telemetry-openstack-openstack-cell1-45fwk\" (UID: \"8272a408-0416-4077-9e85-b2962992b3f4\") " pod="openstack/telemetry-openstack-openstack-cell1-45fwk" Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.711261 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-45fwk\" (UID: \"8272a408-0416-4077-9e85-b2962992b3f4\") " pod="openstack/telemetry-openstack-openstack-cell1-45fwk" Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.711286 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-ssh-key-openstack-cell1\") pod \"telemetry-openstack-openstack-cell1-45fwk\" (UID: \"8272a408-0416-4077-9e85-b2962992b3f4\") " pod="openstack/telemetry-openstack-openstack-cell1-45fwk" Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.711340 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-ceph\") pod \"telemetry-openstack-openstack-cell1-45fwk\" (UID: \"8272a408-0416-4077-9e85-b2962992b3f4\") " pod="openstack/telemetry-openstack-openstack-cell1-45fwk" Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.813817 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-45fwk\" (UID: \"8272a408-0416-4077-9e85-b2962992b3f4\") " pod="openstack/telemetry-openstack-openstack-cell1-45fwk" Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.813910 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-inventory\") pod \"telemetry-openstack-openstack-cell1-45fwk\" (UID: \"8272a408-0416-4077-9e85-b2962992b3f4\") " pod="openstack/telemetry-openstack-openstack-cell1-45fwk" Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.813988 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d52zf\" (UniqueName: \"kubernetes.io/projected/8272a408-0416-4077-9e85-b2962992b3f4-kube-api-access-d52zf\") pod \"telemetry-openstack-openstack-cell1-45fwk\" (UID: \"8272a408-0416-4077-9e85-b2962992b3f4\") " 
pod="openstack/telemetry-openstack-openstack-cell1-45fwk" Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.814013 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-45fwk\" (UID: \"8272a408-0416-4077-9e85-b2962992b3f4\") " pod="openstack/telemetry-openstack-openstack-cell1-45fwk" Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.814041 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-ssh-key-openstack-cell1\") pod \"telemetry-openstack-openstack-cell1-45fwk\" (UID: \"8272a408-0416-4077-9e85-b2962992b3f4\") " pod="openstack/telemetry-openstack-openstack-cell1-45fwk" Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.814062 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-ceph\") pod \"telemetry-openstack-openstack-cell1-45fwk\" (UID: \"8272a408-0416-4077-9e85-b2962992b3f4\") " pod="openstack/telemetry-openstack-openstack-cell1-45fwk" Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.814135 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-45fwk\" (UID: \"8272a408-0416-4077-9e85-b2962992b3f4\") " pod="openstack/telemetry-openstack-openstack-cell1-45fwk" Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.814161 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-45fwk\" (UID: \"8272a408-0416-4077-9e85-b2962992b3f4\") " pod="openstack/telemetry-openstack-openstack-cell1-45fwk" Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.817477 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-45fwk\" (UID: \"8272a408-0416-4077-9e85-b2962992b3f4\") " pod="openstack/telemetry-openstack-openstack-cell1-45fwk" Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.817539 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-ssh-key-openstack-cell1\") pod \"telemetry-openstack-openstack-cell1-45fwk\" (UID: \"8272a408-0416-4077-9e85-b2962992b3f4\") " pod="openstack/telemetry-openstack-openstack-cell1-45fwk" Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.818729 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-inventory\") pod \"telemetry-openstack-openstack-cell1-45fwk\" (UID: \"8272a408-0416-4077-9e85-b2962992b3f4\") " pod="openstack/telemetry-openstack-openstack-cell1-45fwk" Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.818805 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-45fwk\" (UID: \"8272a408-0416-4077-9e85-b2962992b3f4\") " pod="openstack/telemetry-openstack-openstack-cell1-45fwk" Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 
23:41:01.818850 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-45fwk\" (UID: \"8272a408-0416-4077-9e85-b2962992b3f4\") " pod="openstack/telemetry-openstack-openstack-cell1-45fwk" Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.819514 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-45fwk\" (UID: \"8272a408-0416-4077-9e85-b2962992b3f4\") " pod="openstack/telemetry-openstack-openstack-cell1-45fwk" Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.822695 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-ceph\") pod \"telemetry-openstack-openstack-cell1-45fwk\" (UID: \"8272a408-0416-4077-9e85-b2962992b3f4\") " pod="openstack/telemetry-openstack-openstack-cell1-45fwk" Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.830768 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d52zf\" (UniqueName: \"kubernetes.io/projected/8272a408-0416-4077-9e85-b2962992b3f4-kube-api-access-d52zf\") pod \"telemetry-openstack-openstack-cell1-45fwk\" (UID: \"8272a408-0416-4077-9e85-b2962992b3f4\") " pod="openstack/telemetry-openstack-openstack-cell1-45fwk" Feb 19 23:41:01 crc kubenswrapper[4795]: I0219 23:41:01.979948 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-45fwk" Feb 19 23:41:02 crc kubenswrapper[4795]: I0219 23:41:02.615817 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 23:41:02 crc kubenswrapper[4795]: I0219 23:41:02.616130 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-45fwk"] Feb 19 23:41:03 crc kubenswrapper[4795]: I0219 23:41:03.582476 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-45fwk" event={"ID":"8272a408-0416-4077-9e85-b2962992b3f4","Type":"ContainerStarted","Data":"0b42e21c1af1d461acf5578be3104a023047049f42895449a7c537e578dfd456"} Feb 19 23:41:03 crc kubenswrapper[4795]: I0219 23:41:03.582982 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-45fwk" event={"ID":"8272a408-0416-4077-9e85-b2962992b3f4","Type":"ContainerStarted","Data":"4bd117515658cc36bb2328744f0a39f6b5d75a26befda093e2b0ac74514155d0"} Feb 19 23:41:03 crc kubenswrapper[4795]: I0219 23:41:03.611047 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-openstack-openstack-cell1-45fwk" podStartSLOduration=2.151522709 podStartE2EDuration="2.611027004s" podCreationTimestamp="2026-02-19 23:41:01 +0000 UTC" firstStartedPulling="2026-02-19 23:41:02.615533476 +0000 UTC m=+7973.808051360" lastFinishedPulling="2026-02-19 23:41:03.075037791 +0000 UTC m=+7974.267555655" observedRunningTime="2026-02-19 23:41:03.597381196 +0000 UTC m=+7974.789899060" watchObservedRunningTime="2026-02-19 23:41:03.611027004 +0000 UTC m=+7974.803544868" Feb 19 23:41:10 crc kubenswrapper[4795]: I0219 23:41:10.512211 4795 scope.go:117] "RemoveContainer" containerID="75db33e6881035ea4c66ab0cccb431be773bc68e63dc9e6c4480e308b3f4695f" Feb 19 23:41:10 crc kubenswrapper[4795]: E0219 23:41:10.513760 4795 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:41:23 crc kubenswrapper[4795]: I0219 23:41:23.516729 4795 scope.go:117] "RemoveContainer" containerID="75db33e6881035ea4c66ab0cccb431be773bc68e63dc9e6c4480e308b3f4695f" Feb 19 23:41:23 crc kubenswrapper[4795]: E0219 23:41:23.517503 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:41:36 crc kubenswrapper[4795]: I0219 23:41:36.513418 4795 scope.go:117] "RemoveContainer" containerID="75db33e6881035ea4c66ab0cccb431be773bc68e63dc9e6c4480e308b3f4695f" Feb 19 23:41:36 crc kubenswrapper[4795]: I0219 23:41:36.923329 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerStarted","Data":"9e31f1c0e7be53ce872e54fe0d6436f7bde017185ed0c455c7619a4196124fcf"} Feb 19 23:41:50 crc kubenswrapper[4795]: I0219 23:41:50.607363 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hzj6g"] Feb 19 23:41:50 crc kubenswrapper[4795]: I0219 23:41:50.612286 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hzj6g" Feb 19 23:41:50 crc kubenswrapper[4795]: I0219 23:41:50.618705 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hzj6g"] Feb 19 23:41:50 crc kubenswrapper[4795]: I0219 23:41:50.704363 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d66e1e76-05a3-4f86-a4c9-e8cc579ebc54-utilities\") pod \"community-operators-hzj6g\" (UID: \"d66e1e76-05a3-4f86-a4c9-e8cc579ebc54\") " pod="openshift-marketplace/community-operators-hzj6g" Feb 19 23:41:50 crc kubenswrapper[4795]: I0219 23:41:50.704420 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mxz9\" (UniqueName: \"kubernetes.io/projected/d66e1e76-05a3-4f86-a4c9-e8cc579ebc54-kube-api-access-2mxz9\") pod \"community-operators-hzj6g\" (UID: \"d66e1e76-05a3-4f86-a4c9-e8cc579ebc54\") " pod="openshift-marketplace/community-operators-hzj6g" Feb 19 23:41:50 crc kubenswrapper[4795]: I0219 23:41:50.704441 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d66e1e76-05a3-4f86-a4c9-e8cc579ebc54-catalog-content\") pod \"community-operators-hzj6g\" (UID: \"d66e1e76-05a3-4f86-a4c9-e8cc579ebc54\") " pod="openshift-marketplace/community-operators-hzj6g" Feb 19 23:41:50 crc kubenswrapper[4795]: I0219 23:41:50.806059 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d66e1e76-05a3-4f86-a4c9-e8cc579ebc54-utilities\") pod \"community-operators-hzj6g\" (UID: \"d66e1e76-05a3-4f86-a4c9-e8cc579ebc54\") " pod="openshift-marketplace/community-operators-hzj6g" Feb 19 23:41:50 crc kubenswrapper[4795]: I0219 23:41:50.806128 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-2mxz9\" (UniqueName: \"kubernetes.io/projected/d66e1e76-05a3-4f86-a4c9-e8cc579ebc54-kube-api-access-2mxz9\") pod \"community-operators-hzj6g\" (UID: \"d66e1e76-05a3-4f86-a4c9-e8cc579ebc54\") " pod="openshift-marketplace/community-operators-hzj6g" Feb 19 23:41:50 crc kubenswrapper[4795]: I0219 23:41:50.806156 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d66e1e76-05a3-4f86-a4c9-e8cc579ebc54-catalog-content\") pod \"community-operators-hzj6g\" (UID: \"d66e1e76-05a3-4f86-a4c9-e8cc579ebc54\") " pod="openshift-marketplace/community-operators-hzj6g" Feb 19 23:41:50 crc kubenswrapper[4795]: I0219 23:41:50.806763 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d66e1e76-05a3-4f86-a4c9-e8cc579ebc54-catalog-content\") pod \"community-operators-hzj6g\" (UID: \"d66e1e76-05a3-4f86-a4c9-e8cc579ebc54\") " pod="openshift-marketplace/community-operators-hzj6g" Feb 19 23:41:50 crc kubenswrapper[4795]: I0219 23:41:50.806977 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d66e1e76-05a3-4f86-a4c9-e8cc579ebc54-utilities\") pod \"community-operators-hzj6g\" (UID: \"d66e1e76-05a3-4f86-a4c9-e8cc579ebc54\") " pod="openshift-marketplace/community-operators-hzj6g" Feb 19 23:41:50 crc kubenswrapper[4795]: I0219 23:41:50.841897 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mxz9\" (UniqueName: \"kubernetes.io/projected/d66e1e76-05a3-4f86-a4c9-e8cc579ebc54-kube-api-access-2mxz9\") pod \"community-operators-hzj6g\" (UID: \"d66e1e76-05a3-4f86-a4c9-e8cc579ebc54\") " pod="openshift-marketplace/community-operators-hzj6g" Feb 19 23:41:50 crc kubenswrapper[4795]: I0219 23:41:50.935398 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hzj6g" Feb 19 23:41:51 crc kubenswrapper[4795]: I0219 23:41:51.536453 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hzj6g"] Feb 19 23:41:52 crc kubenswrapper[4795]: I0219 23:41:52.117366 4795 generic.go:334] "Generic (PLEG): container finished" podID="d66e1e76-05a3-4f86-a4c9-e8cc579ebc54" containerID="418c6b5a7ac011e8c420e27f2a710b6f00839117ed0a59e39915cee13e64928c" exitCode=0 Feb 19 23:41:52 crc kubenswrapper[4795]: I0219 23:41:52.117465 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hzj6g" event={"ID":"d66e1e76-05a3-4f86-a4c9-e8cc579ebc54","Type":"ContainerDied","Data":"418c6b5a7ac011e8c420e27f2a710b6f00839117ed0a59e39915cee13e64928c"} Feb 19 23:41:52 crc kubenswrapper[4795]: I0219 23:41:52.117548 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hzj6g" event={"ID":"d66e1e76-05a3-4f86-a4c9-e8cc579ebc54","Type":"ContainerStarted","Data":"10e8ad5bc250f45fd68c45c575016b8ca91733714ac148efd5db92b878d83133"} Feb 19 23:41:53 crc kubenswrapper[4795]: I0219 23:41:53.131072 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hzj6g" event={"ID":"d66e1e76-05a3-4f86-a4c9-e8cc579ebc54","Type":"ContainerStarted","Data":"4844b56bccf75eec16c9b232d64d9d730884b6de51940b6c23cdd9f083e74dcd"} Feb 19 23:41:54 crc kubenswrapper[4795]: I0219 23:41:54.146931 4795 generic.go:334] "Generic (PLEG): container finished" podID="d66e1e76-05a3-4f86-a4c9-e8cc579ebc54" containerID="4844b56bccf75eec16c9b232d64d9d730884b6de51940b6c23cdd9f083e74dcd" exitCode=0 Feb 19 23:41:54 crc kubenswrapper[4795]: I0219 23:41:54.147161 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hzj6g" 
event={"ID":"d66e1e76-05a3-4f86-a4c9-e8cc579ebc54","Type":"ContainerDied","Data":"4844b56bccf75eec16c9b232d64d9d730884b6de51940b6c23cdd9f083e74dcd"} Feb 19 23:41:55 crc kubenswrapper[4795]: I0219 23:41:55.157029 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hzj6g" event={"ID":"d66e1e76-05a3-4f86-a4c9-e8cc579ebc54","Type":"ContainerStarted","Data":"ea8b5477879331dfa22db3892cbb33ece54e60c2618079a5daca69b7b42fee11"} Feb 19 23:41:55 crc kubenswrapper[4795]: I0219 23:41:55.181443 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hzj6g" podStartSLOduration=2.7317581459999998 podStartE2EDuration="5.181425216s" podCreationTimestamp="2026-02-19 23:41:50 +0000 UTC" firstStartedPulling="2026-02-19 23:41:52.121456977 +0000 UTC m=+8023.313974841" lastFinishedPulling="2026-02-19 23:41:54.571124057 +0000 UTC m=+8025.763641911" observedRunningTime="2026-02-19 23:41:55.173809675 +0000 UTC m=+8026.366327539" watchObservedRunningTime="2026-02-19 23:41:55.181425216 +0000 UTC m=+8026.373943080" Feb 19 23:42:00 crc kubenswrapper[4795]: I0219 23:42:00.936529 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hzj6g" Feb 19 23:42:00 crc kubenswrapper[4795]: I0219 23:42:00.937434 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hzj6g" Feb 19 23:42:00 crc kubenswrapper[4795]: I0219 23:42:00.991473 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hzj6g" Feb 19 23:42:01 crc kubenswrapper[4795]: I0219 23:42:01.306734 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hzj6g" Feb 19 23:42:01 crc kubenswrapper[4795]: I0219 23:42:01.368268 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-hzj6g"] Feb 19 23:42:03 crc kubenswrapper[4795]: I0219 23:42:03.254084 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hzj6g" podUID="d66e1e76-05a3-4f86-a4c9-e8cc579ebc54" containerName="registry-server" containerID="cri-o://ea8b5477879331dfa22db3892cbb33ece54e60c2618079a5daca69b7b42fee11" gracePeriod=2 Feb 19 23:42:03 crc kubenswrapper[4795]: I0219 23:42:03.777750 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hzj6g" Feb 19 23:42:03 crc kubenswrapper[4795]: I0219 23:42:03.895450 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mxz9\" (UniqueName: \"kubernetes.io/projected/d66e1e76-05a3-4f86-a4c9-e8cc579ebc54-kube-api-access-2mxz9\") pod \"d66e1e76-05a3-4f86-a4c9-e8cc579ebc54\" (UID: \"d66e1e76-05a3-4f86-a4c9-e8cc579ebc54\") " Feb 19 23:42:03 crc kubenswrapper[4795]: I0219 23:42:03.895524 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d66e1e76-05a3-4f86-a4c9-e8cc579ebc54-catalog-content\") pod \"d66e1e76-05a3-4f86-a4c9-e8cc579ebc54\" (UID: \"d66e1e76-05a3-4f86-a4c9-e8cc579ebc54\") " Feb 19 23:42:03 crc kubenswrapper[4795]: I0219 23:42:03.895567 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d66e1e76-05a3-4f86-a4c9-e8cc579ebc54-utilities\") pod \"d66e1e76-05a3-4f86-a4c9-e8cc579ebc54\" (UID: \"d66e1e76-05a3-4f86-a4c9-e8cc579ebc54\") " Feb 19 23:42:03 crc kubenswrapper[4795]: I0219 23:42:03.897020 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d66e1e76-05a3-4f86-a4c9-e8cc579ebc54-utilities" (OuterVolumeSpecName: "utilities") pod "d66e1e76-05a3-4f86-a4c9-e8cc579ebc54" (UID: 
"d66e1e76-05a3-4f86-a4c9-e8cc579ebc54"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:42:03 crc kubenswrapper[4795]: I0219 23:42:03.904141 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d66e1e76-05a3-4f86-a4c9-e8cc579ebc54-kube-api-access-2mxz9" (OuterVolumeSpecName: "kube-api-access-2mxz9") pod "d66e1e76-05a3-4f86-a4c9-e8cc579ebc54" (UID: "d66e1e76-05a3-4f86-a4c9-e8cc579ebc54"). InnerVolumeSpecName "kube-api-access-2mxz9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:42:03 crc kubenswrapper[4795]: I0219 23:42:03.946477 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d66e1e76-05a3-4f86-a4c9-e8cc579ebc54-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d66e1e76-05a3-4f86-a4c9-e8cc579ebc54" (UID: "d66e1e76-05a3-4f86-a4c9-e8cc579ebc54"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:42:03 crc kubenswrapper[4795]: I0219 23:42:03.998268 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mxz9\" (UniqueName: \"kubernetes.io/projected/d66e1e76-05a3-4f86-a4c9-e8cc579ebc54-kube-api-access-2mxz9\") on node \"crc\" DevicePath \"\"" Feb 19 23:42:03 crc kubenswrapper[4795]: I0219 23:42:03.998301 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d66e1e76-05a3-4f86-a4c9-e8cc579ebc54-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 23:42:03 crc kubenswrapper[4795]: I0219 23:42:03.998310 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d66e1e76-05a3-4f86-a4c9-e8cc579ebc54-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 23:42:04 crc kubenswrapper[4795]: I0219 23:42:04.271778 4795 generic.go:334] "Generic (PLEG): container finished" 
podID="d66e1e76-05a3-4f86-a4c9-e8cc579ebc54" containerID="ea8b5477879331dfa22db3892cbb33ece54e60c2618079a5daca69b7b42fee11" exitCode=0 Feb 19 23:42:04 crc kubenswrapper[4795]: I0219 23:42:04.271837 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hzj6g" event={"ID":"d66e1e76-05a3-4f86-a4c9-e8cc579ebc54","Type":"ContainerDied","Data":"ea8b5477879331dfa22db3892cbb33ece54e60c2618079a5daca69b7b42fee11"} Feb 19 23:42:04 crc kubenswrapper[4795]: I0219 23:42:04.271877 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hzj6g" Feb 19 23:42:04 crc kubenswrapper[4795]: I0219 23:42:04.272133 4795 scope.go:117] "RemoveContainer" containerID="ea8b5477879331dfa22db3892cbb33ece54e60c2618079a5daca69b7b42fee11" Feb 19 23:42:04 crc kubenswrapper[4795]: I0219 23:42:04.272119 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hzj6g" event={"ID":"d66e1e76-05a3-4f86-a4c9-e8cc579ebc54","Type":"ContainerDied","Data":"10e8ad5bc250f45fd68c45c575016b8ca91733714ac148efd5db92b878d83133"} Feb 19 23:42:04 crc kubenswrapper[4795]: I0219 23:42:04.295660 4795 scope.go:117] "RemoveContainer" containerID="4844b56bccf75eec16c9b232d64d9d730884b6de51940b6c23cdd9f083e74dcd" Feb 19 23:42:04 crc kubenswrapper[4795]: I0219 23:42:04.350507 4795 scope.go:117] "RemoveContainer" containerID="418c6b5a7ac011e8c420e27f2a710b6f00839117ed0a59e39915cee13e64928c" Feb 19 23:42:04 crc kubenswrapper[4795]: I0219 23:42:04.356571 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hzj6g"] Feb 19 23:42:04 crc kubenswrapper[4795]: I0219 23:42:04.369110 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hzj6g"] Feb 19 23:42:04 crc kubenswrapper[4795]: I0219 23:42:04.416096 4795 scope.go:117] "RemoveContainer" 
containerID="ea8b5477879331dfa22db3892cbb33ece54e60c2618079a5daca69b7b42fee11" Feb 19 23:42:04 crc kubenswrapper[4795]: E0219 23:42:04.417398 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea8b5477879331dfa22db3892cbb33ece54e60c2618079a5daca69b7b42fee11\": container with ID starting with ea8b5477879331dfa22db3892cbb33ece54e60c2618079a5daca69b7b42fee11 not found: ID does not exist" containerID="ea8b5477879331dfa22db3892cbb33ece54e60c2618079a5daca69b7b42fee11" Feb 19 23:42:04 crc kubenswrapper[4795]: I0219 23:42:04.417610 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea8b5477879331dfa22db3892cbb33ece54e60c2618079a5daca69b7b42fee11"} err="failed to get container status \"ea8b5477879331dfa22db3892cbb33ece54e60c2618079a5daca69b7b42fee11\": rpc error: code = NotFound desc = could not find container \"ea8b5477879331dfa22db3892cbb33ece54e60c2618079a5daca69b7b42fee11\": container with ID starting with ea8b5477879331dfa22db3892cbb33ece54e60c2618079a5daca69b7b42fee11 not found: ID does not exist" Feb 19 23:42:04 crc kubenswrapper[4795]: I0219 23:42:04.417778 4795 scope.go:117] "RemoveContainer" containerID="4844b56bccf75eec16c9b232d64d9d730884b6de51940b6c23cdd9f083e74dcd" Feb 19 23:42:04 crc kubenswrapper[4795]: E0219 23:42:04.418320 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4844b56bccf75eec16c9b232d64d9d730884b6de51940b6c23cdd9f083e74dcd\": container with ID starting with 4844b56bccf75eec16c9b232d64d9d730884b6de51940b6c23cdd9f083e74dcd not found: ID does not exist" containerID="4844b56bccf75eec16c9b232d64d9d730884b6de51940b6c23cdd9f083e74dcd" Feb 19 23:42:04 crc kubenswrapper[4795]: I0219 23:42:04.418350 4795 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4844b56bccf75eec16c9b232d64d9d730884b6de51940b6c23cdd9f083e74dcd"} err="failed to get container status \"4844b56bccf75eec16c9b232d64d9d730884b6de51940b6c23cdd9f083e74dcd\": rpc error: code = NotFound desc = could not find container \"4844b56bccf75eec16c9b232d64d9d730884b6de51940b6c23cdd9f083e74dcd\": container with ID starting with 4844b56bccf75eec16c9b232d64d9d730884b6de51940b6c23cdd9f083e74dcd not found: ID does not exist" Feb 19 23:42:04 crc kubenswrapper[4795]: I0219 23:42:04.418372 4795 scope.go:117] "RemoveContainer" containerID="418c6b5a7ac011e8c420e27f2a710b6f00839117ed0a59e39915cee13e64928c" Feb 19 23:42:04 crc kubenswrapper[4795]: E0219 23:42:04.418754 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"418c6b5a7ac011e8c420e27f2a710b6f00839117ed0a59e39915cee13e64928c\": container with ID starting with 418c6b5a7ac011e8c420e27f2a710b6f00839117ed0a59e39915cee13e64928c not found: ID does not exist" containerID="418c6b5a7ac011e8c420e27f2a710b6f00839117ed0a59e39915cee13e64928c" Feb 19 23:42:04 crc kubenswrapper[4795]: I0219 23:42:04.418878 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"418c6b5a7ac011e8c420e27f2a710b6f00839117ed0a59e39915cee13e64928c"} err="failed to get container status \"418c6b5a7ac011e8c420e27f2a710b6f00839117ed0a59e39915cee13e64928c\": rpc error: code = NotFound desc = could not find container \"418c6b5a7ac011e8c420e27f2a710b6f00839117ed0a59e39915cee13e64928c\": container with ID starting with 418c6b5a7ac011e8c420e27f2a710b6f00839117ed0a59e39915cee13e64928c not found: ID does not exist" Feb 19 23:42:05 crc kubenswrapper[4795]: I0219 23:42:05.526501 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d66e1e76-05a3-4f86-a4c9-e8cc579ebc54" path="/var/lib/kubelet/pods/d66e1e76-05a3-4f86-a4c9-e8cc579ebc54/volumes" Feb 19 23:42:28 crc kubenswrapper[4795]: I0219 
23:42:28.462018 4795 scope.go:117] "RemoveContainer" containerID="f30a680eb80ce693ac8d0b06f622aff4dfe172c9fef0f9ac28f9a5b3bf27d44a" Feb 19 23:42:28 crc kubenswrapper[4795]: I0219 23:42:28.495022 4795 scope.go:117] "RemoveContainer" containerID="bc4e8940e2128aa71b442ef79e89ec98fa8c8ba519f41bdd1d4f14b52325a51b" Feb 19 23:42:28 crc kubenswrapper[4795]: I0219 23:42:28.519370 4795 scope.go:117] "RemoveContainer" containerID="075ff9636f71ea6b36cce9312e631fc0c64c7440971ccbcdb446ae2d249ac68d" Feb 19 23:43:23 crc kubenswrapper[4795]: I0219 23:43:23.713916 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7c7cc"] Feb 19 23:43:23 crc kubenswrapper[4795]: E0219 23:43:23.715268 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d66e1e76-05a3-4f86-a4c9-e8cc579ebc54" containerName="extract-utilities" Feb 19 23:43:23 crc kubenswrapper[4795]: I0219 23:43:23.715286 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d66e1e76-05a3-4f86-a4c9-e8cc579ebc54" containerName="extract-utilities" Feb 19 23:43:23 crc kubenswrapper[4795]: E0219 23:43:23.715319 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d66e1e76-05a3-4f86-a4c9-e8cc579ebc54" containerName="extract-content" Feb 19 23:43:23 crc kubenswrapper[4795]: I0219 23:43:23.715327 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d66e1e76-05a3-4f86-a4c9-e8cc579ebc54" containerName="extract-content" Feb 19 23:43:23 crc kubenswrapper[4795]: E0219 23:43:23.715360 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d66e1e76-05a3-4f86-a4c9-e8cc579ebc54" containerName="registry-server" Feb 19 23:43:23 crc kubenswrapper[4795]: I0219 23:43:23.715371 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d66e1e76-05a3-4f86-a4c9-e8cc579ebc54" containerName="registry-server" Feb 19 23:43:23 crc kubenswrapper[4795]: I0219 23:43:23.715627 4795 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="d66e1e76-05a3-4f86-a4c9-e8cc579ebc54" containerName="registry-server" Feb 19 23:43:23 crc kubenswrapper[4795]: I0219 23:43:23.717884 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7c7cc" Feb 19 23:43:23 crc kubenswrapper[4795]: I0219 23:43:23.735794 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7c7cc"] Feb 19 23:43:23 crc kubenswrapper[4795]: I0219 23:43:23.864358 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acc1dbdf-a61c-4e68-a49d-33bc1e4396eb-catalog-content\") pod \"certified-operators-7c7cc\" (UID: \"acc1dbdf-a61c-4e68-a49d-33bc1e4396eb\") " pod="openshift-marketplace/certified-operators-7c7cc" Feb 19 23:43:23 crc kubenswrapper[4795]: I0219 23:43:23.864523 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acc1dbdf-a61c-4e68-a49d-33bc1e4396eb-utilities\") pod \"certified-operators-7c7cc\" (UID: \"acc1dbdf-a61c-4e68-a49d-33bc1e4396eb\") " pod="openshift-marketplace/certified-operators-7c7cc" Feb 19 23:43:23 crc kubenswrapper[4795]: I0219 23:43:23.864563 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2x46l\" (UniqueName: \"kubernetes.io/projected/acc1dbdf-a61c-4e68-a49d-33bc1e4396eb-kube-api-access-2x46l\") pod \"certified-operators-7c7cc\" (UID: \"acc1dbdf-a61c-4e68-a49d-33bc1e4396eb\") " pod="openshift-marketplace/certified-operators-7c7cc" Feb 19 23:43:23 crc kubenswrapper[4795]: I0219 23:43:23.967235 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acc1dbdf-a61c-4e68-a49d-33bc1e4396eb-catalog-content\") pod \"certified-operators-7c7cc\" (UID: 
\"acc1dbdf-a61c-4e68-a49d-33bc1e4396eb\") " pod="openshift-marketplace/certified-operators-7c7cc" Feb 19 23:43:23 crc kubenswrapper[4795]: I0219 23:43:23.967367 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acc1dbdf-a61c-4e68-a49d-33bc1e4396eb-utilities\") pod \"certified-operators-7c7cc\" (UID: \"acc1dbdf-a61c-4e68-a49d-33bc1e4396eb\") " pod="openshift-marketplace/certified-operators-7c7cc" Feb 19 23:43:23 crc kubenswrapper[4795]: I0219 23:43:23.967404 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2x46l\" (UniqueName: \"kubernetes.io/projected/acc1dbdf-a61c-4e68-a49d-33bc1e4396eb-kube-api-access-2x46l\") pod \"certified-operators-7c7cc\" (UID: \"acc1dbdf-a61c-4e68-a49d-33bc1e4396eb\") " pod="openshift-marketplace/certified-operators-7c7cc" Feb 19 23:43:23 crc kubenswrapper[4795]: I0219 23:43:23.967918 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acc1dbdf-a61c-4e68-a49d-33bc1e4396eb-catalog-content\") pod \"certified-operators-7c7cc\" (UID: \"acc1dbdf-a61c-4e68-a49d-33bc1e4396eb\") " pod="openshift-marketplace/certified-operators-7c7cc" Feb 19 23:43:23 crc kubenswrapper[4795]: I0219 23:43:23.968258 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acc1dbdf-a61c-4e68-a49d-33bc1e4396eb-utilities\") pod \"certified-operators-7c7cc\" (UID: \"acc1dbdf-a61c-4e68-a49d-33bc1e4396eb\") " pod="openshift-marketplace/certified-operators-7c7cc" Feb 19 23:43:23 crc kubenswrapper[4795]: I0219 23:43:23.993201 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2x46l\" (UniqueName: \"kubernetes.io/projected/acc1dbdf-a61c-4e68-a49d-33bc1e4396eb-kube-api-access-2x46l\") pod \"certified-operators-7c7cc\" (UID: 
\"acc1dbdf-a61c-4e68-a49d-33bc1e4396eb\") " pod="openshift-marketplace/certified-operators-7c7cc" Feb 19 23:43:24 crc kubenswrapper[4795]: I0219 23:43:24.048553 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7c7cc" Feb 19 23:43:24 crc kubenswrapper[4795]: I0219 23:43:24.571202 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7c7cc"] Feb 19 23:43:24 crc kubenswrapper[4795]: I0219 23:43:24.603690 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7c7cc" event={"ID":"acc1dbdf-a61c-4e68-a49d-33bc1e4396eb","Type":"ContainerStarted","Data":"debfe03b9665db2cc109523ff1423a76bfce7cdda7d589e75926cc61bee4c9ef"} Feb 19 23:43:25 crc kubenswrapper[4795]: I0219 23:43:25.614074 4795 generic.go:334] "Generic (PLEG): container finished" podID="acc1dbdf-a61c-4e68-a49d-33bc1e4396eb" containerID="21423fc563687e24411e1faae3b78c466da53cae2f1a4436cd02f4f6b90825c1" exitCode=0 Feb 19 23:43:25 crc kubenswrapper[4795]: I0219 23:43:25.614126 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7c7cc" event={"ID":"acc1dbdf-a61c-4e68-a49d-33bc1e4396eb","Type":"ContainerDied","Data":"21423fc563687e24411e1faae3b78c466da53cae2f1a4436cd02f4f6b90825c1"} Feb 19 23:43:26 crc kubenswrapper[4795]: I0219 23:43:26.625635 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7c7cc" event={"ID":"acc1dbdf-a61c-4e68-a49d-33bc1e4396eb","Type":"ContainerStarted","Data":"eaa249cd0f1657d583a8712882677e627e7f81412a3b6890b15387bd2353b574"} Feb 19 23:43:27 crc kubenswrapper[4795]: I0219 23:43:27.639895 4795 generic.go:334] "Generic (PLEG): container finished" podID="acc1dbdf-a61c-4e68-a49d-33bc1e4396eb" containerID="eaa249cd0f1657d583a8712882677e627e7f81412a3b6890b15387bd2353b574" exitCode=0 Feb 19 23:43:27 crc kubenswrapper[4795]: I0219 
23:43:27.640057 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7c7cc" event={"ID":"acc1dbdf-a61c-4e68-a49d-33bc1e4396eb","Type":"ContainerDied","Data":"eaa249cd0f1657d583a8712882677e627e7f81412a3b6890b15387bd2353b574"} Feb 19 23:43:28 crc kubenswrapper[4795]: I0219 23:43:28.651115 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7c7cc" event={"ID":"acc1dbdf-a61c-4e68-a49d-33bc1e4396eb","Type":"ContainerStarted","Data":"f89e5d52a6cbe3e6e5d621bbf7f239549572d0c852602bede32534077cfe914a"} Feb 19 23:43:28 crc kubenswrapper[4795]: I0219 23:43:28.673456 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7c7cc" podStartSLOduration=3.205609124 podStartE2EDuration="5.673433796s" podCreationTimestamp="2026-02-19 23:43:23 +0000 UTC" firstStartedPulling="2026-02-19 23:43:25.616611616 +0000 UTC m=+8116.809129480" lastFinishedPulling="2026-02-19 23:43:28.084436288 +0000 UTC m=+8119.276954152" observedRunningTime="2026-02-19 23:43:28.665675001 +0000 UTC m=+8119.858192875" watchObservedRunningTime="2026-02-19 23:43:28.673433796 +0000 UTC m=+8119.865951660" Feb 19 23:43:34 crc kubenswrapper[4795]: I0219 23:43:34.049730 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7c7cc" Feb 19 23:43:34 crc kubenswrapper[4795]: I0219 23:43:34.050482 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7c7cc" Feb 19 23:43:34 crc kubenswrapper[4795]: I0219 23:43:34.092815 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7c7cc" Feb 19 23:43:34 crc kubenswrapper[4795]: I0219 23:43:34.777508 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7c7cc" Feb 
19 23:43:34 crc kubenswrapper[4795]: I0219 23:43:34.828177 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7c7cc"] Feb 19 23:43:36 crc kubenswrapper[4795]: I0219 23:43:36.719295 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7c7cc" podUID="acc1dbdf-a61c-4e68-a49d-33bc1e4396eb" containerName="registry-server" containerID="cri-o://f89e5d52a6cbe3e6e5d621bbf7f239549572d0c852602bede32534077cfe914a" gracePeriod=2 Feb 19 23:43:37 crc kubenswrapper[4795]: I0219 23:43:37.249660 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7c7cc" Feb 19 23:43:37 crc kubenswrapper[4795]: I0219 23:43:37.355017 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2x46l\" (UniqueName: \"kubernetes.io/projected/acc1dbdf-a61c-4e68-a49d-33bc1e4396eb-kube-api-access-2x46l\") pod \"acc1dbdf-a61c-4e68-a49d-33bc1e4396eb\" (UID: \"acc1dbdf-a61c-4e68-a49d-33bc1e4396eb\") " Feb 19 23:43:37 crc kubenswrapper[4795]: I0219 23:43:37.355106 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acc1dbdf-a61c-4e68-a49d-33bc1e4396eb-utilities\") pod \"acc1dbdf-a61c-4e68-a49d-33bc1e4396eb\" (UID: \"acc1dbdf-a61c-4e68-a49d-33bc1e4396eb\") " Feb 19 23:43:37 crc kubenswrapper[4795]: I0219 23:43:37.355393 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acc1dbdf-a61c-4e68-a49d-33bc1e4396eb-catalog-content\") pod \"acc1dbdf-a61c-4e68-a49d-33bc1e4396eb\" (UID: \"acc1dbdf-a61c-4e68-a49d-33bc1e4396eb\") " Feb 19 23:43:37 crc kubenswrapper[4795]: I0219 23:43:37.355963 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/acc1dbdf-a61c-4e68-a49d-33bc1e4396eb-utilities" (OuterVolumeSpecName: "utilities") pod "acc1dbdf-a61c-4e68-a49d-33bc1e4396eb" (UID: "acc1dbdf-a61c-4e68-a49d-33bc1e4396eb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:43:37 crc kubenswrapper[4795]: I0219 23:43:37.360771 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acc1dbdf-a61c-4e68-a49d-33bc1e4396eb-kube-api-access-2x46l" (OuterVolumeSpecName: "kube-api-access-2x46l") pod "acc1dbdf-a61c-4e68-a49d-33bc1e4396eb" (UID: "acc1dbdf-a61c-4e68-a49d-33bc1e4396eb"). InnerVolumeSpecName "kube-api-access-2x46l". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:43:37 crc kubenswrapper[4795]: I0219 23:43:37.457833 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2x46l\" (UniqueName: \"kubernetes.io/projected/acc1dbdf-a61c-4e68-a49d-33bc1e4396eb-kube-api-access-2x46l\") on node \"crc\" DevicePath \"\"" Feb 19 23:43:37 crc kubenswrapper[4795]: I0219 23:43:37.457884 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acc1dbdf-a61c-4e68-a49d-33bc1e4396eb-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 23:43:37 crc kubenswrapper[4795]: I0219 23:43:37.543274 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/acc1dbdf-a61c-4e68-a49d-33bc1e4396eb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "acc1dbdf-a61c-4e68-a49d-33bc1e4396eb" (UID: "acc1dbdf-a61c-4e68-a49d-33bc1e4396eb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:43:37 crc kubenswrapper[4795]: I0219 23:43:37.559488 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acc1dbdf-a61c-4e68-a49d-33bc1e4396eb-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 23:43:37 crc kubenswrapper[4795]: I0219 23:43:37.730784 4795 generic.go:334] "Generic (PLEG): container finished" podID="acc1dbdf-a61c-4e68-a49d-33bc1e4396eb" containerID="f89e5d52a6cbe3e6e5d621bbf7f239549572d0c852602bede32534077cfe914a" exitCode=0 Feb 19 23:43:37 crc kubenswrapper[4795]: I0219 23:43:37.730831 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7c7cc" event={"ID":"acc1dbdf-a61c-4e68-a49d-33bc1e4396eb","Type":"ContainerDied","Data":"f89e5d52a6cbe3e6e5d621bbf7f239549572d0c852602bede32534077cfe914a"} Feb 19 23:43:37 crc kubenswrapper[4795]: I0219 23:43:37.731873 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7c7cc" event={"ID":"acc1dbdf-a61c-4e68-a49d-33bc1e4396eb","Type":"ContainerDied","Data":"debfe03b9665db2cc109523ff1423a76bfce7cdda7d589e75926cc61bee4c9ef"} Feb 19 23:43:37 crc kubenswrapper[4795]: I0219 23:43:37.730909 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7c7cc" Feb 19 23:43:37 crc kubenswrapper[4795]: I0219 23:43:37.731902 4795 scope.go:117] "RemoveContainer" containerID="f89e5d52a6cbe3e6e5d621bbf7f239549572d0c852602bede32534077cfe914a" Feb 19 23:43:37 crc kubenswrapper[4795]: I0219 23:43:37.769595 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7c7cc"] Feb 19 23:43:37 crc kubenswrapper[4795]: I0219 23:43:37.770835 4795 scope.go:117] "RemoveContainer" containerID="eaa249cd0f1657d583a8712882677e627e7f81412a3b6890b15387bd2353b574" Feb 19 23:43:37 crc kubenswrapper[4795]: I0219 23:43:37.779402 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7c7cc"] Feb 19 23:43:37 crc kubenswrapper[4795]: I0219 23:43:37.794500 4795 scope.go:117] "RemoveContainer" containerID="21423fc563687e24411e1faae3b78c466da53cae2f1a4436cd02f4f6b90825c1" Feb 19 23:43:37 crc kubenswrapper[4795]: I0219 23:43:37.832662 4795 scope.go:117] "RemoveContainer" containerID="f89e5d52a6cbe3e6e5d621bbf7f239549572d0c852602bede32534077cfe914a" Feb 19 23:43:37 crc kubenswrapper[4795]: E0219 23:43:37.833319 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f89e5d52a6cbe3e6e5d621bbf7f239549572d0c852602bede32534077cfe914a\": container with ID starting with f89e5d52a6cbe3e6e5d621bbf7f239549572d0c852602bede32534077cfe914a not found: ID does not exist" containerID="f89e5d52a6cbe3e6e5d621bbf7f239549572d0c852602bede32534077cfe914a" Feb 19 23:43:37 crc kubenswrapper[4795]: I0219 23:43:37.833380 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f89e5d52a6cbe3e6e5d621bbf7f239549572d0c852602bede32534077cfe914a"} err="failed to get container status \"f89e5d52a6cbe3e6e5d621bbf7f239549572d0c852602bede32534077cfe914a\": rpc error: code = NotFound desc = could not find 
container \"f89e5d52a6cbe3e6e5d621bbf7f239549572d0c852602bede32534077cfe914a\": container with ID starting with f89e5d52a6cbe3e6e5d621bbf7f239549572d0c852602bede32534077cfe914a not found: ID does not exist" Feb 19 23:43:37 crc kubenswrapper[4795]: I0219 23:43:37.833419 4795 scope.go:117] "RemoveContainer" containerID="eaa249cd0f1657d583a8712882677e627e7f81412a3b6890b15387bd2353b574" Feb 19 23:43:37 crc kubenswrapper[4795]: E0219 23:43:37.833860 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eaa249cd0f1657d583a8712882677e627e7f81412a3b6890b15387bd2353b574\": container with ID starting with eaa249cd0f1657d583a8712882677e627e7f81412a3b6890b15387bd2353b574 not found: ID does not exist" containerID="eaa249cd0f1657d583a8712882677e627e7f81412a3b6890b15387bd2353b574" Feb 19 23:43:37 crc kubenswrapper[4795]: I0219 23:43:37.833897 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eaa249cd0f1657d583a8712882677e627e7f81412a3b6890b15387bd2353b574"} err="failed to get container status \"eaa249cd0f1657d583a8712882677e627e7f81412a3b6890b15387bd2353b574\": rpc error: code = NotFound desc = could not find container \"eaa249cd0f1657d583a8712882677e627e7f81412a3b6890b15387bd2353b574\": container with ID starting with eaa249cd0f1657d583a8712882677e627e7f81412a3b6890b15387bd2353b574 not found: ID does not exist" Feb 19 23:43:37 crc kubenswrapper[4795]: I0219 23:43:37.833922 4795 scope.go:117] "RemoveContainer" containerID="21423fc563687e24411e1faae3b78c466da53cae2f1a4436cd02f4f6b90825c1" Feb 19 23:43:37 crc kubenswrapper[4795]: E0219 23:43:37.834220 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21423fc563687e24411e1faae3b78c466da53cae2f1a4436cd02f4f6b90825c1\": container with ID starting with 21423fc563687e24411e1faae3b78c466da53cae2f1a4436cd02f4f6b90825c1 not found: ID does 
not exist" containerID="21423fc563687e24411e1faae3b78c466da53cae2f1a4436cd02f4f6b90825c1" Feb 19 23:43:37 crc kubenswrapper[4795]: I0219 23:43:37.834268 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21423fc563687e24411e1faae3b78c466da53cae2f1a4436cd02f4f6b90825c1"} err="failed to get container status \"21423fc563687e24411e1faae3b78c466da53cae2f1a4436cd02f4f6b90825c1\": rpc error: code = NotFound desc = could not find container \"21423fc563687e24411e1faae3b78c466da53cae2f1a4436cd02f4f6b90825c1\": container with ID starting with 21423fc563687e24411e1faae3b78c466da53cae2f1a4436cd02f4f6b90825c1 not found: ID does not exist" Feb 19 23:43:39 crc kubenswrapper[4795]: I0219 23:43:39.529001 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="acc1dbdf-a61c-4e68-a49d-33bc1e4396eb" path="/var/lib/kubelet/pods/acc1dbdf-a61c-4e68-a49d-33bc1e4396eb/volumes" Feb 19 23:43:58 crc kubenswrapper[4795]: I0219 23:43:58.427485 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 23:43:58 crc kubenswrapper[4795]: I0219 23:43:58.428224 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 23:44:18 crc kubenswrapper[4795]: I0219 23:44:18.143122 4795 generic.go:334] "Generic (PLEG): container finished" podID="8272a408-0416-4077-9e85-b2962992b3f4" containerID="0b42e21c1af1d461acf5578be3104a023047049f42895449a7c537e578dfd456" exitCode=0 Feb 19 23:44:18 crc kubenswrapper[4795]: I0219 23:44:18.143275 
4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-45fwk" event={"ID":"8272a408-0416-4077-9e85-b2962992b3f4","Type":"ContainerDied","Data":"0b42e21c1af1d461acf5578be3104a023047049f42895449a7c537e578dfd456"} Feb 19 23:44:18 crc kubenswrapper[4795]: I0219 23:44:18.828966 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lc9vb"] Feb 19 23:44:18 crc kubenswrapper[4795]: E0219 23:44:18.829654 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acc1dbdf-a61c-4e68-a49d-33bc1e4396eb" containerName="registry-server" Feb 19 23:44:18 crc kubenswrapper[4795]: I0219 23:44:18.829676 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="acc1dbdf-a61c-4e68-a49d-33bc1e4396eb" containerName="registry-server" Feb 19 23:44:18 crc kubenswrapper[4795]: E0219 23:44:18.829700 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acc1dbdf-a61c-4e68-a49d-33bc1e4396eb" containerName="extract-content" Feb 19 23:44:18 crc kubenswrapper[4795]: I0219 23:44:18.829712 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="acc1dbdf-a61c-4e68-a49d-33bc1e4396eb" containerName="extract-content" Feb 19 23:44:18 crc kubenswrapper[4795]: E0219 23:44:18.829737 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acc1dbdf-a61c-4e68-a49d-33bc1e4396eb" containerName="extract-utilities" Feb 19 23:44:18 crc kubenswrapper[4795]: I0219 23:44:18.829748 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="acc1dbdf-a61c-4e68-a49d-33bc1e4396eb" containerName="extract-utilities" Feb 19 23:44:18 crc kubenswrapper[4795]: I0219 23:44:18.830127 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="acc1dbdf-a61c-4e68-a49d-33bc1e4396eb" containerName="registry-server" Feb 19 23:44:18 crc kubenswrapper[4795]: I0219 23:44:18.832300 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lc9vb" Feb 19 23:44:18 crc kubenswrapper[4795]: I0219 23:44:18.850517 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lc9vb"] Feb 19 23:44:19 crc kubenswrapper[4795]: I0219 23:44:19.009164 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cb7kx\" (UniqueName: \"kubernetes.io/projected/e6a307d7-05c1-401d-af88-11c5da428876-kube-api-access-cb7kx\") pod \"redhat-operators-lc9vb\" (UID: \"e6a307d7-05c1-401d-af88-11c5da428876\") " pod="openshift-marketplace/redhat-operators-lc9vb" Feb 19 23:44:19 crc kubenswrapper[4795]: I0219 23:44:19.009481 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6a307d7-05c1-401d-af88-11c5da428876-catalog-content\") pod \"redhat-operators-lc9vb\" (UID: \"e6a307d7-05c1-401d-af88-11c5da428876\") " pod="openshift-marketplace/redhat-operators-lc9vb" Feb 19 23:44:19 crc kubenswrapper[4795]: I0219 23:44:19.009624 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6a307d7-05c1-401d-af88-11c5da428876-utilities\") pod \"redhat-operators-lc9vb\" (UID: \"e6a307d7-05c1-401d-af88-11c5da428876\") " pod="openshift-marketplace/redhat-operators-lc9vb" Feb 19 23:44:19 crc kubenswrapper[4795]: I0219 23:44:19.110887 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6a307d7-05c1-401d-af88-11c5da428876-utilities\") pod \"redhat-operators-lc9vb\" (UID: \"e6a307d7-05c1-401d-af88-11c5da428876\") " pod="openshift-marketplace/redhat-operators-lc9vb" Feb 19 23:44:19 crc kubenswrapper[4795]: I0219 23:44:19.110983 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-cb7kx\" (UniqueName: \"kubernetes.io/projected/e6a307d7-05c1-401d-af88-11c5da428876-kube-api-access-cb7kx\") pod \"redhat-operators-lc9vb\" (UID: \"e6a307d7-05c1-401d-af88-11c5da428876\") " pod="openshift-marketplace/redhat-operators-lc9vb" Feb 19 23:44:19 crc kubenswrapper[4795]: I0219 23:44:19.111005 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6a307d7-05c1-401d-af88-11c5da428876-catalog-content\") pod \"redhat-operators-lc9vb\" (UID: \"e6a307d7-05c1-401d-af88-11c5da428876\") " pod="openshift-marketplace/redhat-operators-lc9vb" Feb 19 23:44:19 crc kubenswrapper[4795]: I0219 23:44:19.111652 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6a307d7-05c1-401d-af88-11c5da428876-catalog-content\") pod \"redhat-operators-lc9vb\" (UID: \"e6a307d7-05c1-401d-af88-11c5da428876\") " pod="openshift-marketplace/redhat-operators-lc9vb" Feb 19 23:44:19 crc kubenswrapper[4795]: I0219 23:44:19.111890 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6a307d7-05c1-401d-af88-11c5da428876-utilities\") pod \"redhat-operators-lc9vb\" (UID: \"e6a307d7-05c1-401d-af88-11c5da428876\") " pod="openshift-marketplace/redhat-operators-lc9vb" Feb 19 23:44:19 crc kubenswrapper[4795]: I0219 23:44:19.132508 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cb7kx\" (UniqueName: \"kubernetes.io/projected/e6a307d7-05c1-401d-af88-11c5da428876-kube-api-access-cb7kx\") pod \"redhat-operators-lc9vb\" (UID: \"e6a307d7-05c1-401d-af88-11c5da428876\") " pod="openshift-marketplace/redhat-operators-lc9vb" Feb 19 23:44:19 crc kubenswrapper[4795]: I0219 23:44:19.162436 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lc9vb" Feb 19 23:44:19 crc kubenswrapper[4795]: I0219 23:44:19.701480 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-45fwk" Feb 19 23:44:19 crc kubenswrapper[4795]: I0219 23:44:19.833992 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-ceilometer-compute-config-data-1\") pod \"8272a408-0416-4077-9e85-b2962992b3f4\" (UID: \"8272a408-0416-4077-9e85-b2962992b3f4\") " Feb 19 23:44:19 crc kubenswrapper[4795]: I0219 23:44:19.834075 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-ssh-key-openstack-cell1\") pod \"8272a408-0416-4077-9e85-b2962992b3f4\" (UID: \"8272a408-0416-4077-9e85-b2962992b3f4\") " Feb 19 23:44:19 crc kubenswrapper[4795]: I0219 23:44:19.834132 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-telemetry-combined-ca-bundle\") pod \"8272a408-0416-4077-9e85-b2962992b3f4\" (UID: \"8272a408-0416-4077-9e85-b2962992b3f4\") " Feb 19 23:44:19 crc kubenswrapper[4795]: I0219 23:44:19.834151 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-ceilometer-compute-config-data-2\") pod \"8272a408-0416-4077-9e85-b2962992b3f4\" (UID: \"8272a408-0416-4077-9e85-b2962992b3f4\") " Feb 19 23:44:19 crc kubenswrapper[4795]: I0219 23:44:19.834293 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: 
\"kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-ceilometer-compute-config-data-0\") pod \"8272a408-0416-4077-9e85-b2962992b3f4\" (UID: \"8272a408-0416-4077-9e85-b2962992b3f4\") " Feb 19 23:44:19 crc kubenswrapper[4795]: I0219 23:44:19.834328 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-inventory\") pod \"8272a408-0416-4077-9e85-b2962992b3f4\" (UID: \"8272a408-0416-4077-9e85-b2962992b3f4\") " Feb 19 23:44:19 crc kubenswrapper[4795]: I0219 23:44:19.834402 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-ceph\") pod \"8272a408-0416-4077-9e85-b2962992b3f4\" (UID: \"8272a408-0416-4077-9e85-b2962992b3f4\") " Feb 19 23:44:19 crc kubenswrapper[4795]: I0219 23:44:19.834465 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d52zf\" (UniqueName: \"kubernetes.io/projected/8272a408-0416-4077-9e85-b2962992b3f4-kube-api-access-d52zf\") pod \"8272a408-0416-4077-9e85-b2962992b3f4\" (UID: \"8272a408-0416-4077-9e85-b2962992b3f4\") " Feb 19 23:44:19 crc kubenswrapper[4795]: I0219 23:44:19.860429 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8272a408-0416-4077-9e85-b2962992b3f4-kube-api-access-d52zf" (OuterVolumeSpecName: "kube-api-access-d52zf") pod "8272a408-0416-4077-9e85-b2962992b3f4" (UID: "8272a408-0416-4077-9e85-b2962992b3f4"). InnerVolumeSpecName "kube-api-access-d52zf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:44:19 crc kubenswrapper[4795]: I0219 23:44:19.860902 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "8272a408-0416-4077-9e85-b2962992b3f4" (UID: "8272a408-0416-4077-9e85-b2962992b3f4"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:44:19 crc kubenswrapper[4795]: I0219 23:44:19.862399 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-ceph" (OuterVolumeSpecName: "ceph") pod "8272a408-0416-4077-9e85-b2962992b3f4" (UID: "8272a408-0416-4077-9e85-b2962992b3f4"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:44:19 crc kubenswrapper[4795]: I0219 23:44:19.870033 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lc9vb"] Feb 19 23:44:19 crc kubenswrapper[4795]: I0219 23:44:19.895399 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "8272a408-0416-4077-9e85-b2962992b3f4" (UID: "8272a408-0416-4077-9e85-b2962992b3f4"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:44:19 crc kubenswrapper[4795]: I0219 23:44:19.896807 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-inventory" (OuterVolumeSpecName: "inventory") pod "8272a408-0416-4077-9e85-b2962992b3f4" (UID: "8272a408-0416-4077-9e85-b2962992b3f4"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:44:19 crc kubenswrapper[4795]: I0219 23:44:19.898732 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "8272a408-0416-4077-9e85-b2962992b3f4" (UID: "8272a408-0416-4077-9e85-b2962992b3f4"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:44:19 crc kubenswrapper[4795]: I0219 23:44:19.926302 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "8272a408-0416-4077-9e85-b2962992b3f4" (UID: "8272a408-0416-4077-9e85-b2962992b3f4"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:44:19 crc kubenswrapper[4795]: I0219 23:44:19.926319 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "8272a408-0416-4077-9e85-b2962992b3f4" (UID: "8272a408-0416-4077-9e85-b2962992b3f4"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:44:19 crc kubenswrapper[4795]: I0219 23:44:19.937857 4795 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Feb 19 23:44:19 crc kubenswrapper[4795]: I0219 23:44:19.937902 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 23:44:19 crc kubenswrapper[4795]: I0219 23:44:19.937913 4795 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-ceph\") on node \"crc\" DevicePath \"\"" Feb 19 23:44:19 crc kubenswrapper[4795]: I0219 23:44:19.937926 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d52zf\" (UniqueName: \"kubernetes.io/projected/8272a408-0416-4077-9e85-b2962992b3f4-kube-api-access-d52zf\") on node \"crc\" DevicePath \"\"" Feb 19 23:44:19 crc kubenswrapper[4795]: I0219 23:44:19.937938 4795 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Feb 19 23:44:19 crc kubenswrapper[4795]: I0219 23:44:19.937950 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 19 23:44:19 crc kubenswrapper[4795]: I0219 23:44:19.937963 4795 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-telemetry-combined-ca-bundle\") on node \"crc\" 
DevicePath \"\"" Feb 19 23:44:19 crc kubenswrapper[4795]: I0219 23:44:19.937974 4795 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/8272a408-0416-4077-9e85-b2962992b3f4-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Feb 19 23:44:20 crc kubenswrapper[4795]: I0219 23:44:20.161801 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-45fwk" event={"ID":"8272a408-0416-4077-9e85-b2962992b3f4","Type":"ContainerDied","Data":"4bd117515658cc36bb2328744f0a39f6b5d75a26befda093e2b0ac74514155d0"} Feb 19 23:44:20 crc kubenswrapper[4795]: I0219 23:44:20.161846 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4bd117515658cc36bb2328744f0a39f6b5d75a26befda093e2b0ac74514155d0" Feb 19 23:44:20 crc kubenswrapper[4795]: I0219 23:44:20.161862 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-45fwk" Feb 19 23:44:20 crc kubenswrapper[4795]: I0219 23:44:20.163420 4795 generic.go:334] "Generic (PLEG): container finished" podID="e6a307d7-05c1-401d-af88-11c5da428876" containerID="99964ff6c5bf047856b3662224a9bfd59e713b1e0cc12a1e9a2713e420b021cf" exitCode=0 Feb 19 23:44:20 crc kubenswrapper[4795]: I0219 23:44:20.163470 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lc9vb" event={"ID":"e6a307d7-05c1-401d-af88-11c5da428876","Type":"ContainerDied","Data":"99964ff6c5bf047856b3662224a9bfd59e713b1e0cc12a1e9a2713e420b021cf"} Feb 19 23:44:20 crc kubenswrapper[4795]: I0219 23:44:20.163503 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lc9vb" event={"ID":"e6a307d7-05c1-401d-af88-11c5da428876","Type":"ContainerStarted","Data":"2ed696cd125461c0e2485dab7fecdb64757ed0c48c355f72d39b69f2db14c4c7"} Feb 19 23:44:20 crc kubenswrapper[4795]: 
I0219 23:44:20.276245 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-56tm4"] Feb 19 23:44:20 crc kubenswrapper[4795]: E0219 23:44:20.276710 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8272a408-0416-4077-9e85-b2962992b3f4" containerName="telemetry-openstack-openstack-cell1" Feb 19 23:44:20 crc kubenswrapper[4795]: I0219 23:44:20.276729 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="8272a408-0416-4077-9e85-b2962992b3f4" containerName="telemetry-openstack-openstack-cell1" Feb 19 23:44:20 crc kubenswrapper[4795]: I0219 23:44:20.276913 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="8272a408-0416-4077-9e85-b2962992b3f4" containerName="telemetry-openstack-openstack-cell1" Feb 19 23:44:20 crc kubenswrapper[4795]: I0219 23:44:20.277690 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-56tm4" Feb 19 23:44:20 crc kubenswrapper[4795]: I0219 23:44:20.281411 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 19 23:44:20 crc kubenswrapper[4795]: I0219 23:44:20.281613 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 19 23:44:20 crc kubenswrapper[4795]: I0219 23:44:20.281659 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-brdnv" Feb 19 23:44:20 crc kubenswrapper[4795]: I0219 23:44:20.284607 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 23:44:20 crc kubenswrapper[4795]: I0219 23:44:20.284834 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-sriov-agent-neutron-config" Feb 19 23:44:20 crc kubenswrapper[4795]: I0219 23:44:20.285415 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/neutron-sriov-openstack-openstack-cell1-56tm4"] Feb 19 23:44:20 crc kubenswrapper[4795]: I0219 23:44:20.446745 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-np22w\" (UniqueName: \"kubernetes.io/projected/a29cf217-b932-4515-a8e6-4bb762611d24-kube-api-access-np22w\") pod \"neutron-sriov-openstack-openstack-cell1-56tm4\" (UID: \"a29cf217-b932-4515-a8e6-4bb762611d24\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-56tm4" Feb 19 23:44:20 crc kubenswrapper[4795]: I0219 23:44:20.447245 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a29cf217-b932-4515-a8e6-4bb762611d24-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-56tm4\" (UID: \"a29cf217-b932-4515-a8e6-4bb762611d24\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-56tm4" Feb 19 23:44:20 crc kubenswrapper[4795]: I0219 23:44:20.447272 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a29cf217-b932-4515-a8e6-4bb762611d24-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-56tm4\" (UID: \"a29cf217-b932-4515-a8e6-4bb762611d24\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-56tm4" Feb 19 23:44:20 crc kubenswrapper[4795]: I0219 23:44:20.447307 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a29cf217-b932-4515-a8e6-4bb762611d24-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-56tm4\" (UID: \"a29cf217-b932-4515-a8e6-4bb762611d24\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-56tm4" Feb 19 23:44:20 crc kubenswrapper[4795]: I0219 23:44:20.447587 4795 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a29cf217-b932-4515-a8e6-4bb762611d24-ssh-key-openstack-cell1\") pod \"neutron-sriov-openstack-openstack-cell1-56tm4\" (UID: \"a29cf217-b932-4515-a8e6-4bb762611d24\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-56tm4" Feb 19 23:44:20 crc kubenswrapper[4795]: I0219 23:44:20.447695 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a29cf217-b932-4515-a8e6-4bb762611d24-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-56tm4\" (UID: \"a29cf217-b932-4515-a8e6-4bb762611d24\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-56tm4" Feb 19 23:44:20 crc kubenswrapper[4795]: I0219 23:44:20.549997 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a29cf217-b932-4515-a8e6-4bb762611d24-ssh-key-openstack-cell1\") pod \"neutron-sriov-openstack-openstack-cell1-56tm4\" (UID: \"a29cf217-b932-4515-a8e6-4bb762611d24\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-56tm4" Feb 19 23:44:20 crc kubenswrapper[4795]: I0219 23:44:20.550053 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a29cf217-b932-4515-a8e6-4bb762611d24-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-56tm4\" (UID: \"a29cf217-b932-4515-a8e6-4bb762611d24\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-56tm4" Feb 19 23:44:20 crc kubenswrapper[4795]: I0219 23:44:20.550100 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-np22w\" (UniqueName: \"kubernetes.io/projected/a29cf217-b932-4515-a8e6-4bb762611d24-kube-api-access-np22w\") pod \"neutron-sriov-openstack-openstack-cell1-56tm4\" (UID: 
\"a29cf217-b932-4515-a8e6-4bb762611d24\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-56tm4"
Feb 19 23:44:20 crc kubenswrapper[4795]: I0219 23:44:20.550231 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a29cf217-b932-4515-a8e6-4bb762611d24-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-56tm4\" (UID: \"a29cf217-b932-4515-a8e6-4bb762611d24\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-56tm4"
Feb 19 23:44:20 crc kubenswrapper[4795]: I0219 23:44:20.550255 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a29cf217-b932-4515-a8e6-4bb762611d24-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-56tm4\" (UID: \"a29cf217-b932-4515-a8e6-4bb762611d24\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-56tm4"
Feb 19 23:44:20 crc kubenswrapper[4795]: I0219 23:44:20.550283 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a29cf217-b932-4515-a8e6-4bb762611d24-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-56tm4\" (UID: \"a29cf217-b932-4515-a8e6-4bb762611d24\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-56tm4"
Feb 19 23:44:20 crc kubenswrapper[4795]: I0219 23:44:20.556418 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a29cf217-b932-4515-a8e6-4bb762611d24-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-56tm4\" (UID: \"a29cf217-b932-4515-a8e6-4bb762611d24\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-56tm4"
Feb 19 23:44:20 crc kubenswrapper[4795]: I0219 23:44:20.556430 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a29cf217-b932-4515-a8e6-4bb762611d24-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-56tm4\" (UID: \"a29cf217-b932-4515-a8e6-4bb762611d24\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-56tm4"
Feb 19 23:44:20 crc kubenswrapper[4795]: I0219 23:44:20.556717 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a29cf217-b932-4515-a8e6-4bb762611d24-ssh-key-openstack-cell1\") pod \"neutron-sriov-openstack-openstack-cell1-56tm4\" (UID: \"a29cf217-b932-4515-a8e6-4bb762611d24\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-56tm4"
Feb 19 23:44:20 crc kubenswrapper[4795]: I0219 23:44:20.557291 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a29cf217-b932-4515-a8e6-4bb762611d24-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-56tm4\" (UID: \"a29cf217-b932-4515-a8e6-4bb762611d24\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-56tm4"
Feb 19 23:44:20 crc kubenswrapper[4795]: I0219 23:44:20.558191 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a29cf217-b932-4515-a8e6-4bb762611d24-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-56tm4\" (UID: \"a29cf217-b932-4515-a8e6-4bb762611d24\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-56tm4"
Feb 19 23:44:20 crc kubenswrapper[4795]: I0219 23:44:20.566905 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-np22w\" (UniqueName: \"kubernetes.io/projected/a29cf217-b932-4515-a8e6-4bb762611d24-kube-api-access-np22w\") pod \"neutron-sriov-openstack-openstack-cell1-56tm4\" (UID: \"a29cf217-b932-4515-a8e6-4bb762611d24\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-56tm4"
Feb 19 23:44:20 crc kubenswrapper[4795]: I0219 23:44:20.614428 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-56tm4"
Feb 19 23:44:21 crc kubenswrapper[4795]: I0219 23:44:21.176612 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lc9vb" event={"ID":"e6a307d7-05c1-401d-af88-11c5da428876","Type":"ContainerStarted","Data":"fb078423dccc9a88288f14fb047d5ee2ff99fc2cebd5f6d24cb44c6f8cac1247"}
Feb 19 23:44:21 crc kubenswrapper[4795]: I0219 23:44:21.192029 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-56tm4"]
Feb 19 23:44:21 crc kubenswrapper[4795]: W0219 23:44:21.198064 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda29cf217_b932_4515_a8e6_4bb762611d24.slice/crio-5aa89fc378f6fea93340176af76e33a4d70c8767a42e44d4fe641addacdbb883 WatchSource:0}: Error finding container 5aa89fc378f6fea93340176af76e33a4d70c8767a42e44d4fe641addacdbb883: Status 404 returned error can't find the container with id 5aa89fc378f6fea93340176af76e33a4d70c8767a42e44d4fe641addacdbb883
Feb 19 23:44:22 crc kubenswrapper[4795]: I0219 23:44:22.189528 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-56tm4" event={"ID":"a29cf217-b932-4515-a8e6-4bb762611d24","Type":"ContainerStarted","Data":"5aa89fc378f6fea93340176af76e33a4d70c8767a42e44d4fe641addacdbb883"}
Feb 19 23:44:23 crc kubenswrapper[4795]: I0219 23:44:23.205517 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-56tm4" event={"ID":"a29cf217-b932-4515-a8e6-4bb762611d24","Type":"ContainerStarted","Data":"1a18107b1e847d549a8cb6704fc642c32d982fc02150fa86a18638e1eb9c0ebc"}
Feb 19 23:44:23 crc kubenswrapper[4795]: I0219 23:44:23.236039 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-sriov-openstack-openstack-cell1-56tm4" podStartSLOduration=2.570472733 podStartE2EDuration="3.23601117s" podCreationTimestamp="2026-02-19 23:44:20 +0000 UTC" firstStartedPulling="2026-02-19 23:44:21.203753503 +0000 UTC m=+8172.396271367" lastFinishedPulling="2026-02-19 23:44:21.86929193 +0000 UTC m=+8173.061809804" observedRunningTime="2026-02-19 23:44:23.222680152 +0000 UTC m=+8174.415198036" watchObservedRunningTime="2026-02-19 23:44:23.23601117 +0000 UTC m=+8174.428529044"
Feb 19 23:44:24 crc kubenswrapper[4795]: I0219 23:44:24.216873 4795 generic.go:334] "Generic (PLEG): container finished" podID="e6a307d7-05c1-401d-af88-11c5da428876" containerID="fb078423dccc9a88288f14fb047d5ee2ff99fc2cebd5f6d24cb44c6f8cac1247" exitCode=0
Feb 19 23:44:24 crc kubenswrapper[4795]: I0219 23:44:24.216967 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lc9vb" event={"ID":"e6a307d7-05c1-401d-af88-11c5da428876","Type":"ContainerDied","Data":"fb078423dccc9a88288f14fb047d5ee2ff99fc2cebd5f6d24cb44c6f8cac1247"}
Feb 19 23:44:25 crc kubenswrapper[4795]: I0219 23:44:25.234227 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lc9vb" event={"ID":"e6a307d7-05c1-401d-af88-11c5da428876","Type":"ContainerStarted","Data":"d625d0002e7a8e1719ea0b0a1f420de6c0a47f0cdc083568ab96fbaff0aa6925"}
Feb 19 23:44:25 crc kubenswrapper[4795]: I0219 23:44:25.265932 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lc9vb" podStartSLOduration=2.847881875 podStartE2EDuration="7.265913904s" podCreationTimestamp="2026-02-19 23:44:18 +0000 UTC" firstStartedPulling="2026-02-19 23:44:20.165963184 +0000 UTC m=+8171.358481048" lastFinishedPulling="2026-02-19 23:44:24.583995203 +0000 UTC m=+8175.776513077" observedRunningTime="2026-02-19 23:44:25.258366865 +0000 UTC m=+8176.450884749" watchObservedRunningTime="2026-02-19 23:44:25.265913904 +0000 UTC m=+8176.458431768"
Feb 19 23:44:28 crc kubenswrapper[4795]: I0219 23:44:28.427961 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 23:44:28 crc kubenswrapper[4795]: I0219 23:44:28.428618 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 23:44:29 crc kubenswrapper[4795]: I0219 23:44:29.163996 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lc9vb"
Feb 19 23:44:29 crc kubenswrapper[4795]: I0219 23:44:29.164065 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lc9vb"
Feb 19 23:44:30 crc kubenswrapper[4795]: I0219 23:44:30.216766 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lc9vb" podUID="e6a307d7-05c1-401d-af88-11c5da428876" containerName="registry-server" probeResult="failure" output=<
Feb 19 23:44:30 crc kubenswrapper[4795]: timeout: failed to connect service ":50051" within 1s
Feb 19 23:44:30 crc kubenswrapper[4795]: >
Feb 19 23:44:39 crc kubenswrapper[4795]: I0219 23:44:39.218129 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lc9vb"
Feb 19 23:44:39 crc kubenswrapper[4795]: I0219 23:44:39.269140 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lc9vb"
Feb 19 23:44:39 crc kubenswrapper[4795]: I0219 23:44:39.456114 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lc9vb"]
Feb 19 23:44:40 crc kubenswrapper[4795]: I0219 23:44:40.383368 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-lc9vb" podUID="e6a307d7-05c1-401d-af88-11c5da428876" containerName="registry-server" containerID="cri-o://d625d0002e7a8e1719ea0b0a1f420de6c0a47f0cdc083568ab96fbaff0aa6925" gracePeriod=2
Feb 19 23:44:41 crc kubenswrapper[4795]: I0219 23:44:41.032916 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lc9vb"
Feb 19 23:44:41 crc kubenswrapper[4795]: I0219 23:44:41.201153 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6a307d7-05c1-401d-af88-11c5da428876-catalog-content\") pod \"e6a307d7-05c1-401d-af88-11c5da428876\" (UID: \"e6a307d7-05c1-401d-af88-11c5da428876\") "
Feb 19 23:44:41 crc kubenswrapper[4795]: I0219 23:44:41.201270 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cb7kx\" (UniqueName: \"kubernetes.io/projected/e6a307d7-05c1-401d-af88-11c5da428876-kube-api-access-cb7kx\") pod \"e6a307d7-05c1-401d-af88-11c5da428876\" (UID: \"e6a307d7-05c1-401d-af88-11c5da428876\") "
Feb 19 23:44:41 crc kubenswrapper[4795]: I0219 23:44:41.201487 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6a307d7-05c1-401d-af88-11c5da428876-utilities\") pod \"e6a307d7-05c1-401d-af88-11c5da428876\" (UID: \"e6a307d7-05c1-401d-af88-11c5da428876\") "
Feb 19 23:44:41 crc kubenswrapper[4795]: I0219 23:44:41.202261 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6a307d7-05c1-401d-af88-11c5da428876-utilities" (OuterVolumeSpecName: "utilities") pod "e6a307d7-05c1-401d-af88-11c5da428876" (UID: "e6a307d7-05c1-401d-af88-11c5da428876"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 23:44:41 crc kubenswrapper[4795]: I0219 23:44:41.209421 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6a307d7-05c1-401d-af88-11c5da428876-kube-api-access-cb7kx" (OuterVolumeSpecName: "kube-api-access-cb7kx") pod "e6a307d7-05c1-401d-af88-11c5da428876" (UID: "e6a307d7-05c1-401d-af88-11c5da428876"). InnerVolumeSpecName "kube-api-access-cb7kx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 23:44:41 crc kubenswrapper[4795]: I0219 23:44:41.303885 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cb7kx\" (UniqueName: \"kubernetes.io/projected/e6a307d7-05c1-401d-af88-11c5da428876-kube-api-access-cb7kx\") on node \"crc\" DevicePath \"\""
Feb 19 23:44:41 crc kubenswrapper[4795]: I0219 23:44:41.303920 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6a307d7-05c1-401d-af88-11c5da428876-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 23:44:41 crc kubenswrapper[4795]: I0219 23:44:41.322781 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6a307d7-05c1-401d-af88-11c5da428876-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e6a307d7-05c1-401d-af88-11c5da428876" (UID: "e6a307d7-05c1-401d-af88-11c5da428876"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 23:44:41 crc kubenswrapper[4795]: I0219 23:44:41.395796 4795 generic.go:334] "Generic (PLEG): container finished" podID="e6a307d7-05c1-401d-af88-11c5da428876" containerID="d625d0002e7a8e1719ea0b0a1f420de6c0a47f0cdc083568ab96fbaff0aa6925" exitCode=0
Feb 19 23:44:41 crc kubenswrapper[4795]: I0219 23:44:41.395897 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lc9vb"
Feb 19 23:44:41 crc kubenswrapper[4795]: I0219 23:44:41.396683 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lc9vb" event={"ID":"e6a307d7-05c1-401d-af88-11c5da428876","Type":"ContainerDied","Data":"d625d0002e7a8e1719ea0b0a1f420de6c0a47f0cdc083568ab96fbaff0aa6925"}
Feb 19 23:44:41 crc kubenswrapper[4795]: I0219 23:44:41.396806 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lc9vb" event={"ID":"e6a307d7-05c1-401d-af88-11c5da428876","Type":"ContainerDied","Data":"2ed696cd125461c0e2485dab7fecdb64757ed0c48c355f72d39b69f2db14c4c7"}
Feb 19 23:44:41 crc kubenswrapper[4795]: I0219 23:44:41.396894 4795 scope.go:117] "RemoveContainer" containerID="d625d0002e7a8e1719ea0b0a1f420de6c0a47f0cdc083568ab96fbaff0aa6925"
Feb 19 23:44:41 crc kubenswrapper[4795]: I0219 23:44:41.406216 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6a307d7-05c1-401d-af88-11c5da428876-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 23:44:41 crc kubenswrapper[4795]: I0219 23:44:41.435840 4795 scope.go:117] "RemoveContainer" containerID="fb078423dccc9a88288f14fb047d5ee2ff99fc2cebd5f6d24cb44c6f8cac1247"
Feb 19 23:44:41 crc kubenswrapper[4795]: I0219 23:44:41.441904 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lc9vb"]
Feb 19 23:44:41 crc kubenswrapper[4795]: I0219 23:44:41.452406 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-lc9vb"]
Feb 19 23:44:41 crc kubenswrapper[4795]: I0219 23:44:41.464551 4795 scope.go:117] "RemoveContainer" containerID="99964ff6c5bf047856b3662224a9bfd59e713b1e0cc12a1e9a2713e420b021cf"
Feb 19 23:44:41 crc kubenswrapper[4795]: I0219 23:44:41.509868 4795 scope.go:117] "RemoveContainer" containerID="d625d0002e7a8e1719ea0b0a1f420de6c0a47f0cdc083568ab96fbaff0aa6925"
Feb 19 23:44:41 crc kubenswrapper[4795]: E0219 23:44:41.510362 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d625d0002e7a8e1719ea0b0a1f420de6c0a47f0cdc083568ab96fbaff0aa6925\": container with ID starting with d625d0002e7a8e1719ea0b0a1f420de6c0a47f0cdc083568ab96fbaff0aa6925 not found: ID does not exist" containerID="d625d0002e7a8e1719ea0b0a1f420de6c0a47f0cdc083568ab96fbaff0aa6925"
Feb 19 23:44:41 crc kubenswrapper[4795]: I0219 23:44:41.510391 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d625d0002e7a8e1719ea0b0a1f420de6c0a47f0cdc083568ab96fbaff0aa6925"} err="failed to get container status \"d625d0002e7a8e1719ea0b0a1f420de6c0a47f0cdc083568ab96fbaff0aa6925\": rpc error: code = NotFound desc = could not find container \"d625d0002e7a8e1719ea0b0a1f420de6c0a47f0cdc083568ab96fbaff0aa6925\": container with ID starting with d625d0002e7a8e1719ea0b0a1f420de6c0a47f0cdc083568ab96fbaff0aa6925 not found: ID does not exist"
Feb 19 23:44:41 crc kubenswrapper[4795]: I0219 23:44:41.510410 4795 scope.go:117] "RemoveContainer" containerID="fb078423dccc9a88288f14fb047d5ee2ff99fc2cebd5f6d24cb44c6f8cac1247"
Feb 19 23:44:41 crc kubenswrapper[4795]: E0219 23:44:41.510848 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb078423dccc9a88288f14fb047d5ee2ff99fc2cebd5f6d24cb44c6f8cac1247\": container with ID starting with fb078423dccc9a88288f14fb047d5ee2ff99fc2cebd5f6d24cb44c6f8cac1247 not found: ID does not exist" containerID="fb078423dccc9a88288f14fb047d5ee2ff99fc2cebd5f6d24cb44c6f8cac1247"
Feb 19 23:44:41 crc kubenswrapper[4795]: I0219 23:44:41.510894 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb078423dccc9a88288f14fb047d5ee2ff99fc2cebd5f6d24cb44c6f8cac1247"} err="failed to get container status \"fb078423dccc9a88288f14fb047d5ee2ff99fc2cebd5f6d24cb44c6f8cac1247\": rpc error: code = NotFound desc = could not find container \"fb078423dccc9a88288f14fb047d5ee2ff99fc2cebd5f6d24cb44c6f8cac1247\": container with ID starting with fb078423dccc9a88288f14fb047d5ee2ff99fc2cebd5f6d24cb44c6f8cac1247 not found: ID does not exist"
Feb 19 23:44:41 crc kubenswrapper[4795]: I0219 23:44:41.510921 4795 scope.go:117] "RemoveContainer" containerID="99964ff6c5bf047856b3662224a9bfd59e713b1e0cc12a1e9a2713e420b021cf"
Feb 19 23:44:41 crc kubenswrapper[4795]: E0219 23:44:41.511223 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99964ff6c5bf047856b3662224a9bfd59e713b1e0cc12a1e9a2713e420b021cf\": container with ID starting with 99964ff6c5bf047856b3662224a9bfd59e713b1e0cc12a1e9a2713e420b021cf not found: ID does not exist" containerID="99964ff6c5bf047856b3662224a9bfd59e713b1e0cc12a1e9a2713e420b021cf"
Feb 19 23:44:41 crc kubenswrapper[4795]: I0219 23:44:41.511259 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99964ff6c5bf047856b3662224a9bfd59e713b1e0cc12a1e9a2713e420b021cf"} err="failed to get container status \"99964ff6c5bf047856b3662224a9bfd59e713b1e0cc12a1e9a2713e420b021cf\": rpc error: code = NotFound desc = could not find container \"99964ff6c5bf047856b3662224a9bfd59e713b1e0cc12a1e9a2713e420b021cf\": container with ID starting with 99964ff6c5bf047856b3662224a9bfd59e713b1e0cc12a1e9a2713e420b021cf not found: ID does not exist"
Feb 19 23:44:41 crc kubenswrapper[4795]: I0219 23:44:41.524691 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6a307d7-05c1-401d-af88-11c5da428876" path="/var/lib/kubelet/pods/e6a307d7-05c1-401d-af88-11c5da428876/volumes"
Feb 19 23:44:58 crc kubenswrapper[4795]: I0219 23:44:58.427064 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 23:44:58 crc kubenswrapper[4795]: I0219 23:44:58.427611 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 23:44:58 crc kubenswrapper[4795]: I0219 23:44:58.427652 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d"
Feb 19 23:44:58 crc kubenswrapper[4795]: I0219 23:44:58.428441 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9e31f1c0e7be53ce872e54fe0d6436f7bde017185ed0c455c7619a4196124fcf"} pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 19 23:44:58 crc kubenswrapper[4795]: I0219 23:44:58.428485 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" containerID="cri-o://9e31f1c0e7be53ce872e54fe0d6436f7bde017185ed0c455c7619a4196124fcf" gracePeriod=600
Feb 19 23:44:58 crc kubenswrapper[4795]: I0219 23:44:58.589508 4795 generic.go:334] "Generic (PLEG): container finished" podID="7591bc58-96f5-486a-8653-0ad93938b019" containerID="9e31f1c0e7be53ce872e54fe0d6436f7bde017185ed0c455c7619a4196124fcf" exitCode=0
Feb 19 23:44:58 crc kubenswrapper[4795]: I0219 23:44:58.589548 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerDied","Data":"9e31f1c0e7be53ce872e54fe0d6436f7bde017185ed0c455c7619a4196124fcf"}
Feb 19 23:44:58 crc kubenswrapper[4795]: I0219 23:44:58.589585 4795 scope.go:117] "RemoveContainer" containerID="75db33e6881035ea4c66ab0cccb431be773bc68e63dc9e6c4480e308b3f4695f"
Feb 19 23:44:59 crc kubenswrapper[4795]: I0219 23:44:59.599828 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerStarted","Data":"8095d75d862e2b693e66115bd0de83928e8a1abe68615b91e580952d99b88f5a"}
Feb 19 23:45:00 crc kubenswrapper[4795]: I0219 23:45:00.163805 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525745-x445j"]
Feb 19 23:45:00 crc kubenswrapper[4795]: E0219 23:45:00.164745 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6a307d7-05c1-401d-af88-11c5da428876" containerName="extract-utilities"
Feb 19 23:45:00 crc kubenswrapper[4795]: I0219 23:45:00.164770 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6a307d7-05c1-401d-af88-11c5da428876" containerName="extract-utilities"
Feb 19 23:45:00 crc kubenswrapper[4795]: E0219 23:45:00.164796 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6a307d7-05c1-401d-af88-11c5da428876" containerName="registry-server"
Feb 19 23:45:00 crc kubenswrapper[4795]: I0219 23:45:00.164805 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6a307d7-05c1-401d-af88-11c5da428876" containerName="registry-server"
Feb 19 23:45:00 crc kubenswrapper[4795]: E0219 23:45:00.164844 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6a307d7-05c1-401d-af88-11c5da428876" containerName="extract-content"
Feb 19 23:45:00 crc kubenswrapper[4795]: I0219 23:45:00.164855 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6a307d7-05c1-401d-af88-11c5da428876" containerName="extract-content"
Feb 19 23:45:00 crc kubenswrapper[4795]: I0219 23:45:00.165302 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6a307d7-05c1-401d-af88-11c5da428876" containerName="registry-server"
Feb 19 23:45:00 crc kubenswrapper[4795]: I0219 23:45:00.166477 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525745-x445j"
Feb 19 23:45:00 crc kubenswrapper[4795]: I0219 23:45:00.168785 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 19 23:45:00 crc kubenswrapper[4795]: I0219 23:45:00.168963 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 19 23:45:00 crc kubenswrapper[4795]: I0219 23:45:00.178426 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525745-x445j"]
Feb 19 23:45:00 crc kubenswrapper[4795]: I0219 23:45:00.342434 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khs4c\" (UniqueName: \"kubernetes.io/projected/7b73c4c3-06f1-4f35-b850-7ad4bb65c99d-kube-api-access-khs4c\") pod \"collect-profiles-29525745-x445j\" (UID: \"7b73c4c3-06f1-4f35-b850-7ad4bb65c99d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525745-x445j"
Feb 19 23:45:00 crc kubenswrapper[4795]: I0219 23:45:00.342551 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7b73c4c3-06f1-4f35-b850-7ad4bb65c99d-config-volume\") pod \"collect-profiles-29525745-x445j\" (UID: \"7b73c4c3-06f1-4f35-b850-7ad4bb65c99d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525745-x445j"
Feb 19 23:45:00 crc kubenswrapper[4795]: I0219 23:45:00.342622 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7b73c4c3-06f1-4f35-b850-7ad4bb65c99d-secret-volume\") pod \"collect-profiles-29525745-x445j\" (UID: \"7b73c4c3-06f1-4f35-b850-7ad4bb65c99d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525745-x445j"
Feb 19 23:45:00 crc kubenswrapper[4795]: I0219 23:45:00.444273 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7b73c4c3-06f1-4f35-b850-7ad4bb65c99d-secret-volume\") pod \"collect-profiles-29525745-x445j\" (UID: \"7b73c4c3-06f1-4f35-b850-7ad4bb65c99d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525745-x445j"
Feb 19 23:45:00 crc kubenswrapper[4795]: I0219 23:45:00.444405 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khs4c\" (UniqueName: \"kubernetes.io/projected/7b73c4c3-06f1-4f35-b850-7ad4bb65c99d-kube-api-access-khs4c\") pod \"collect-profiles-29525745-x445j\" (UID: \"7b73c4c3-06f1-4f35-b850-7ad4bb65c99d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525745-x445j"
Feb 19 23:45:00 crc kubenswrapper[4795]: I0219 23:45:00.444513 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7b73c4c3-06f1-4f35-b850-7ad4bb65c99d-config-volume\") pod \"collect-profiles-29525745-x445j\" (UID: \"7b73c4c3-06f1-4f35-b850-7ad4bb65c99d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525745-x445j"
Feb 19 23:45:00 crc kubenswrapper[4795]: I0219 23:45:00.446376 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7b73c4c3-06f1-4f35-b850-7ad4bb65c99d-config-volume\") pod \"collect-profiles-29525745-x445j\" (UID: \"7b73c4c3-06f1-4f35-b850-7ad4bb65c99d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525745-x445j"
Feb 19 23:45:00 crc kubenswrapper[4795]: I0219 23:45:00.450866 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7b73c4c3-06f1-4f35-b850-7ad4bb65c99d-secret-volume\") pod \"collect-profiles-29525745-x445j\" (UID: \"7b73c4c3-06f1-4f35-b850-7ad4bb65c99d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525745-x445j"
Feb 19 23:45:00 crc kubenswrapper[4795]: I0219 23:45:00.460508 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khs4c\" (UniqueName: \"kubernetes.io/projected/7b73c4c3-06f1-4f35-b850-7ad4bb65c99d-kube-api-access-khs4c\") pod \"collect-profiles-29525745-x445j\" (UID: \"7b73c4c3-06f1-4f35-b850-7ad4bb65c99d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525745-x445j"
Feb 19 23:45:00 crc kubenswrapper[4795]: I0219 23:45:00.504622 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525745-x445j"
Feb 19 23:45:01 crc kubenswrapper[4795]: I0219 23:45:01.001490 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525745-x445j"]
Feb 19 23:45:01 crc kubenswrapper[4795]: I0219 23:45:01.625026 4795 generic.go:334] "Generic (PLEG): container finished" podID="7b73c4c3-06f1-4f35-b850-7ad4bb65c99d" containerID="0c1d03404adfbd4c18c1ebcc173828750273a5e9125c4f9041a81557f10583df" exitCode=0
Feb 19 23:45:01 crc kubenswrapper[4795]: I0219 23:45:01.625105 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525745-x445j" event={"ID":"7b73c4c3-06f1-4f35-b850-7ad4bb65c99d","Type":"ContainerDied","Data":"0c1d03404adfbd4c18c1ebcc173828750273a5e9125c4f9041a81557f10583df"}
Feb 19 23:45:01 crc kubenswrapper[4795]: I0219 23:45:01.625330 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525745-x445j" event={"ID":"7b73c4c3-06f1-4f35-b850-7ad4bb65c99d","Type":"ContainerStarted","Data":"723918bc990f581f7fd525906c5a8a3950e23b609e1c7a7f8e318a49812fd600"}
Feb 19 23:45:02 crc kubenswrapper[4795]: I0219 23:45:02.997834 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525745-x445j"
Feb 19 23:45:03 crc kubenswrapper[4795]: I0219 23:45:03.010351 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khs4c\" (UniqueName: \"kubernetes.io/projected/7b73c4c3-06f1-4f35-b850-7ad4bb65c99d-kube-api-access-khs4c\") pod \"7b73c4c3-06f1-4f35-b850-7ad4bb65c99d\" (UID: \"7b73c4c3-06f1-4f35-b850-7ad4bb65c99d\") "
Feb 19 23:45:03 crc kubenswrapper[4795]: I0219 23:45:03.010397 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7b73c4c3-06f1-4f35-b850-7ad4bb65c99d-config-volume\") pod \"7b73c4c3-06f1-4f35-b850-7ad4bb65c99d\" (UID: \"7b73c4c3-06f1-4f35-b850-7ad4bb65c99d\") "
Feb 19 23:45:03 crc kubenswrapper[4795]: I0219 23:45:03.010728 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7b73c4c3-06f1-4f35-b850-7ad4bb65c99d-secret-volume\") pod \"7b73c4c3-06f1-4f35-b850-7ad4bb65c99d\" (UID: \"7b73c4c3-06f1-4f35-b850-7ad4bb65c99d\") "
Feb 19 23:45:03 crc kubenswrapper[4795]: I0219 23:45:03.011228 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b73c4c3-06f1-4f35-b850-7ad4bb65c99d-config-volume" (OuterVolumeSpecName: "config-volume") pod "7b73c4c3-06f1-4f35-b850-7ad4bb65c99d" (UID: "7b73c4c3-06f1-4f35-b850-7ad4bb65c99d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 23:45:03 crc kubenswrapper[4795]: I0219 23:45:03.016349 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b73c4c3-06f1-4f35-b850-7ad4bb65c99d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7b73c4c3-06f1-4f35-b850-7ad4bb65c99d" (UID: "7b73c4c3-06f1-4f35-b850-7ad4bb65c99d"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:45:03 crc kubenswrapper[4795]: I0219 23:45:03.017505 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b73c4c3-06f1-4f35-b850-7ad4bb65c99d-kube-api-access-khs4c" (OuterVolumeSpecName: "kube-api-access-khs4c") pod "7b73c4c3-06f1-4f35-b850-7ad4bb65c99d" (UID: "7b73c4c3-06f1-4f35-b850-7ad4bb65c99d"). InnerVolumeSpecName "kube-api-access-khs4c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 23:45:03 crc kubenswrapper[4795]: I0219 23:45:03.112228 4795 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7b73c4c3-06f1-4f35-b850-7ad4bb65c99d-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 19 23:45:03 crc kubenswrapper[4795]: I0219 23:45:03.112264 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khs4c\" (UniqueName: \"kubernetes.io/projected/7b73c4c3-06f1-4f35-b850-7ad4bb65c99d-kube-api-access-khs4c\") on node \"crc\" DevicePath \"\""
Feb 19 23:45:03 crc kubenswrapper[4795]: I0219 23:45:03.112275 4795 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7b73c4c3-06f1-4f35-b850-7ad4bb65c99d-config-volume\") on node \"crc\" DevicePath \"\""
Feb 19 23:45:03 crc kubenswrapper[4795]: I0219 23:45:03.645978 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525745-x445j" event={"ID":"7b73c4c3-06f1-4f35-b850-7ad4bb65c99d","Type":"ContainerDied","Data":"723918bc990f581f7fd525906c5a8a3950e23b609e1c7a7f8e318a49812fd600"}
Feb 19 23:45:03 crc kubenswrapper[4795]: I0219 23:45:03.646492 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="723918bc990f581f7fd525906c5a8a3950e23b609e1c7a7f8e318a49812fd600"
Feb 19 23:45:03 crc kubenswrapper[4795]: I0219 23:45:03.646063 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525745-x445j"
Feb 19 23:45:04 crc kubenswrapper[4795]: I0219 23:45:04.082282 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525700-psw68"]
Feb 19 23:45:04 crc kubenswrapper[4795]: I0219 23:45:04.091213 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525700-psw68"]
Feb 19 23:45:05 crc kubenswrapper[4795]: I0219 23:45:05.527144 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40381732-f007-4395-b8d1-02b3fc37b091" path="/var/lib/kubelet/pods/40381732-f007-4395-b8d1-02b3fc37b091/volumes"
Feb 19 23:45:26 crc kubenswrapper[4795]: I0219 23:45:26.883129 4795 generic.go:334] "Generic (PLEG): container finished" podID="a29cf217-b932-4515-a8e6-4bb762611d24" containerID="1a18107b1e847d549a8cb6704fc642c32d982fc02150fa86a18638e1eb9c0ebc" exitCode=0
Feb 19 23:45:26 crc kubenswrapper[4795]: I0219 23:45:26.884586 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-56tm4" event={"ID":"a29cf217-b932-4515-a8e6-4bb762611d24","Type":"ContainerDied","Data":"1a18107b1e847d549a8cb6704fc642c32d982fc02150fa86a18638e1eb9c0ebc"}
Feb 19 23:45:28 crc kubenswrapper[4795]: I0219 23:45:28.429977 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-56tm4"
Feb 19 23:45:28 crc kubenswrapper[4795]: I0219 23:45:28.515405 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a29cf217-b932-4515-a8e6-4bb762611d24-neutron-sriov-agent-neutron-config-0\") pod \"a29cf217-b932-4515-a8e6-4bb762611d24\" (UID: \"a29cf217-b932-4515-a8e6-4bb762611d24\") "
Feb 19 23:45:28 crc kubenswrapper[4795]: I0219 23:45:28.515509 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-np22w\" (UniqueName: \"kubernetes.io/projected/a29cf217-b932-4515-a8e6-4bb762611d24-kube-api-access-np22w\") pod \"a29cf217-b932-4515-a8e6-4bb762611d24\" (UID: \"a29cf217-b932-4515-a8e6-4bb762611d24\") "
Feb 19 23:45:28 crc kubenswrapper[4795]: I0219 23:45:28.515626 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a29cf217-b932-4515-a8e6-4bb762611d24-neutron-sriov-combined-ca-bundle\") pod \"a29cf217-b932-4515-a8e6-4bb762611d24\" (UID: \"a29cf217-b932-4515-a8e6-4bb762611d24\") "
Feb 19 23:45:28 crc kubenswrapper[4795]: I0219 23:45:28.515721 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a29cf217-b932-4515-a8e6-4bb762611d24-ceph\") pod \"a29cf217-b932-4515-a8e6-4bb762611d24\" (UID: \"a29cf217-b932-4515-a8e6-4bb762611d24\") "
Feb 19 23:45:28 crc kubenswrapper[4795]: I0219 23:45:28.515765 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a29cf217-b932-4515-a8e6-4bb762611d24-inventory\") pod \"a29cf217-b932-4515-a8e6-4bb762611d24\" (UID: \"a29cf217-b932-4515-a8e6-4bb762611d24\") "
Feb 19 23:45:28 crc kubenswrapper[4795]: I0219 23:45:28.515806 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a29cf217-b932-4515-a8e6-4bb762611d24-ssh-key-openstack-cell1\") pod \"a29cf217-b932-4515-a8e6-4bb762611d24\" (UID: \"a29cf217-b932-4515-a8e6-4bb762611d24\") "
Feb 19 23:45:28 crc kubenswrapper[4795]: I0219 23:45:28.533496 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a29cf217-b932-4515-a8e6-4bb762611d24-ceph" (OuterVolumeSpecName: "ceph") pod "a29cf217-b932-4515-a8e6-4bb762611d24" (UID: "a29cf217-b932-4515-a8e6-4bb762611d24"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:45:28 crc kubenswrapper[4795]: I0219 23:45:28.533534 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a29cf217-b932-4515-a8e6-4bb762611d24-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "a29cf217-b932-4515-a8e6-4bb762611d24" (UID: "a29cf217-b932-4515-a8e6-4bb762611d24"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:45:28 crc kubenswrapper[4795]: I0219 23:45:28.533779 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a29cf217-b932-4515-a8e6-4bb762611d24-kube-api-access-np22w" (OuterVolumeSpecName: "kube-api-access-np22w") pod "a29cf217-b932-4515-a8e6-4bb762611d24" (UID: "a29cf217-b932-4515-a8e6-4bb762611d24"). InnerVolumeSpecName "kube-api-access-np22w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 23:45:28 crc kubenswrapper[4795]: I0219 23:45:28.557462 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a29cf217-b932-4515-a8e6-4bb762611d24-neutron-sriov-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-sriov-agent-neutron-config-0") pod "a29cf217-b932-4515-a8e6-4bb762611d24" (UID: "a29cf217-b932-4515-a8e6-4bb762611d24"). InnerVolumeSpecName "neutron-sriov-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:45:28 crc kubenswrapper[4795]: I0219 23:45:28.567362 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a29cf217-b932-4515-a8e6-4bb762611d24-inventory" (OuterVolumeSpecName: "inventory") pod "a29cf217-b932-4515-a8e6-4bb762611d24" (UID: "a29cf217-b932-4515-a8e6-4bb762611d24"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 23:45:28 crc kubenswrapper[4795]: I0219 23:45:28.569211 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a29cf217-b932-4515-a8e6-4bb762611d24-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "a29cf217-b932-4515-a8e6-4bb762611d24" (UID: "a29cf217-b932-4515-a8e6-4bb762611d24"). InnerVolumeSpecName "ssh-key-openstack-cell1".
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:45:28 crc kubenswrapper[4795]: I0219 23:45:28.618077 4795 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a29cf217-b932-4515-a8e6-4bb762611d24-ceph\") on node \"crc\" DevicePath \"\"" Feb 19 23:45:28 crc kubenswrapper[4795]: I0219 23:45:28.618128 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a29cf217-b932-4515-a8e6-4bb762611d24-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 23:45:28 crc kubenswrapper[4795]: I0219 23:45:28.618144 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a29cf217-b932-4515-a8e6-4bb762611d24-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 19 23:45:28 crc kubenswrapper[4795]: I0219 23:45:28.618161 4795 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a29cf217-b932-4515-a8e6-4bb762611d24-neutron-sriov-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 19 23:45:28 crc kubenswrapper[4795]: I0219 23:45:28.618198 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-np22w\" (UniqueName: \"kubernetes.io/projected/a29cf217-b932-4515-a8e6-4bb762611d24-kube-api-access-np22w\") on node \"crc\" DevicePath \"\"" Feb 19 23:45:28 crc kubenswrapper[4795]: I0219 23:45:28.618216 4795 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a29cf217-b932-4515-a8e6-4bb762611d24-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:45:28 crc kubenswrapper[4795]: I0219 23:45:28.672933 4795 scope.go:117] "RemoveContainer" containerID="f0a7532029d7c277b9d534f1fee42695eb4906e52f264e18f9db0ba49bbbdeb7" Feb 19 23:45:28 crc kubenswrapper[4795]: I0219 23:45:28.927914 4795 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-56tm4" event={"ID":"a29cf217-b932-4515-a8e6-4bb762611d24","Type":"ContainerDied","Data":"5aa89fc378f6fea93340176af76e33a4d70c8767a42e44d4fe641addacdbb883"} Feb 19 23:45:28 crc kubenswrapper[4795]: I0219 23:45:28.927954 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5aa89fc378f6fea93340176af76e33a4d70c8767a42e44d4fe641addacdbb883" Feb 19 23:45:28 crc kubenswrapper[4795]: I0219 23:45:28.927968 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-56tm4" Feb 19 23:45:29 crc kubenswrapper[4795]: I0219 23:45:29.043255 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-khvh4"] Feb 19 23:45:29 crc kubenswrapper[4795]: E0219 23:45:29.043675 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b73c4c3-06f1-4f35-b850-7ad4bb65c99d" containerName="collect-profiles" Feb 19 23:45:29 crc kubenswrapper[4795]: I0219 23:45:29.043691 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b73c4c3-06f1-4f35-b850-7ad4bb65c99d" containerName="collect-profiles" Feb 19 23:45:29 crc kubenswrapper[4795]: E0219 23:45:29.043734 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a29cf217-b932-4515-a8e6-4bb762611d24" containerName="neutron-sriov-openstack-openstack-cell1" Feb 19 23:45:29 crc kubenswrapper[4795]: I0219 23:45:29.043743 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="a29cf217-b932-4515-a8e6-4bb762611d24" containerName="neutron-sriov-openstack-openstack-cell1" Feb 19 23:45:29 crc kubenswrapper[4795]: I0219 23:45:29.043937 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="a29cf217-b932-4515-a8e6-4bb762611d24" containerName="neutron-sriov-openstack-openstack-cell1" Feb 19 23:45:29 crc kubenswrapper[4795]: I0219 23:45:29.043952 4795 
memory_manager.go:354] "RemoveStaleState removing state" podUID="7b73c4c3-06f1-4f35-b850-7ad4bb65c99d" containerName="collect-profiles" Feb 19 23:45:29 crc kubenswrapper[4795]: I0219 23:45:29.044644 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-khvh4" Feb 19 23:45:29 crc kubenswrapper[4795]: I0219 23:45:29.066838 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 23:45:29 crc kubenswrapper[4795]: I0219 23:45:29.067121 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-dhcp-agent-neutron-config" Feb 19 23:45:29 crc kubenswrapper[4795]: I0219 23:45:29.067259 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 19 23:45:29 crc kubenswrapper[4795]: I0219 23:45:29.067367 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 19 23:45:29 crc kubenswrapper[4795]: I0219 23:45:29.067473 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-brdnv" Feb 19 23:45:29 crc kubenswrapper[4795]: I0219 23:45:29.074080 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-khvh4"] Feb 19 23:45:29 crc kubenswrapper[4795]: I0219 23:45:29.130538 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff3df901-a0ae-456e-8103-60aaa6439785-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-khvh4\" (UID: \"ff3df901-a0ae-456e-8103-60aaa6439785\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-khvh4" Feb 19 23:45:29 crc kubenswrapper[4795]: I0219 23:45:29.131008 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spsv6\" (UniqueName: \"kubernetes.io/projected/ff3df901-a0ae-456e-8103-60aaa6439785-kube-api-access-spsv6\") pod \"neutron-dhcp-openstack-openstack-cell1-khvh4\" (UID: \"ff3df901-a0ae-456e-8103-60aaa6439785\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-khvh4" Feb 19 23:45:29 crc kubenswrapper[4795]: I0219 23:45:29.131099 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ff3df901-a0ae-456e-8103-60aaa6439785-ssh-key-openstack-cell1\") pod \"neutron-dhcp-openstack-openstack-cell1-khvh4\" (UID: \"ff3df901-a0ae-456e-8103-60aaa6439785\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-khvh4" Feb 19 23:45:29 crc kubenswrapper[4795]: I0219 23:45:29.131131 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ff3df901-a0ae-456e-8103-60aaa6439785-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-khvh4\" (UID: \"ff3df901-a0ae-456e-8103-60aaa6439785\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-khvh4" Feb 19 23:45:29 crc kubenswrapper[4795]: I0219 23:45:29.131285 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ff3df901-a0ae-456e-8103-60aaa6439785-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-khvh4\" (UID: \"ff3df901-a0ae-456e-8103-60aaa6439785\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-khvh4" Feb 19 23:45:29 crc kubenswrapper[4795]: I0219 23:45:29.131380 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ff3df901-a0ae-456e-8103-60aaa6439785-inventory\") pod 
\"neutron-dhcp-openstack-openstack-cell1-khvh4\" (UID: \"ff3df901-a0ae-456e-8103-60aaa6439785\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-khvh4" Feb 19 23:45:29 crc kubenswrapper[4795]: I0219 23:45:29.232697 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spsv6\" (UniqueName: \"kubernetes.io/projected/ff3df901-a0ae-456e-8103-60aaa6439785-kube-api-access-spsv6\") pod \"neutron-dhcp-openstack-openstack-cell1-khvh4\" (UID: \"ff3df901-a0ae-456e-8103-60aaa6439785\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-khvh4" Feb 19 23:45:29 crc kubenswrapper[4795]: I0219 23:45:29.232828 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ff3df901-a0ae-456e-8103-60aaa6439785-ssh-key-openstack-cell1\") pod \"neutron-dhcp-openstack-openstack-cell1-khvh4\" (UID: \"ff3df901-a0ae-456e-8103-60aaa6439785\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-khvh4" Feb 19 23:45:29 crc kubenswrapper[4795]: I0219 23:45:29.232862 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ff3df901-a0ae-456e-8103-60aaa6439785-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-khvh4\" (UID: \"ff3df901-a0ae-456e-8103-60aaa6439785\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-khvh4" Feb 19 23:45:29 crc kubenswrapper[4795]: I0219 23:45:29.232928 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ff3df901-a0ae-456e-8103-60aaa6439785-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-khvh4\" (UID: \"ff3df901-a0ae-456e-8103-60aaa6439785\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-khvh4" Feb 19 23:45:29 crc kubenswrapper[4795]: I0219 23:45:29.232999 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ff3df901-a0ae-456e-8103-60aaa6439785-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-khvh4\" (UID: \"ff3df901-a0ae-456e-8103-60aaa6439785\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-khvh4" Feb 19 23:45:29 crc kubenswrapper[4795]: I0219 23:45:29.233068 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff3df901-a0ae-456e-8103-60aaa6439785-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-khvh4\" (UID: \"ff3df901-a0ae-456e-8103-60aaa6439785\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-khvh4" Feb 19 23:45:29 crc kubenswrapper[4795]: I0219 23:45:29.237384 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ff3df901-a0ae-456e-8103-60aaa6439785-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-khvh4\" (UID: \"ff3df901-a0ae-456e-8103-60aaa6439785\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-khvh4" Feb 19 23:45:29 crc kubenswrapper[4795]: I0219 23:45:29.237419 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ff3df901-a0ae-456e-8103-60aaa6439785-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-khvh4\" (UID: \"ff3df901-a0ae-456e-8103-60aaa6439785\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-khvh4" Feb 19 23:45:29 crc kubenswrapper[4795]: I0219 23:45:29.237519 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ff3df901-a0ae-456e-8103-60aaa6439785-ssh-key-openstack-cell1\") pod \"neutron-dhcp-openstack-openstack-cell1-khvh4\" (UID: \"ff3df901-a0ae-456e-8103-60aaa6439785\") " 
pod="openstack/neutron-dhcp-openstack-openstack-cell1-khvh4" Feb 19 23:45:29 crc kubenswrapper[4795]: I0219 23:45:29.237826 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff3df901-a0ae-456e-8103-60aaa6439785-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-khvh4\" (UID: \"ff3df901-a0ae-456e-8103-60aaa6439785\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-khvh4" Feb 19 23:45:29 crc kubenswrapper[4795]: I0219 23:45:29.238253 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ff3df901-a0ae-456e-8103-60aaa6439785-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-khvh4\" (UID: \"ff3df901-a0ae-456e-8103-60aaa6439785\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-khvh4" Feb 19 23:45:29 crc kubenswrapper[4795]: I0219 23:45:29.249624 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spsv6\" (UniqueName: \"kubernetes.io/projected/ff3df901-a0ae-456e-8103-60aaa6439785-kube-api-access-spsv6\") pod \"neutron-dhcp-openstack-openstack-cell1-khvh4\" (UID: \"ff3df901-a0ae-456e-8103-60aaa6439785\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-khvh4" Feb 19 23:45:29 crc kubenswrapper[4795]: I0219 23:45:29.395768 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-khvh4" Feb 19 23:45:29 crc kubenswrapper[4795]: I0219 23:45:29.947561 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-khvh4"] Feb 19 23:45:30 crc kubenswrapper[4795]: I0219 23:45:30.957387 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-khvh4" event={"ID":"ff3df901-a0ae-456e-8103-60aaa6439785","Type":"ContainerStarted","Data":"c2f202e4a47f5834d5d4ba7b5ccaec0183d8ee7a0ac71915df5f8ba0ec70bb7a"} Feb 19 23:45:30 crc kubenswrapper[4795]: I0219 23:45:30.958573 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-khvh4" event={"ID":"ff3df901-a0ae-456e-8103-60aaa6439785","Type":"ContainerStarted","Data":"0676394dc314b80aee4627ac3a27539eb9572cc4cc7886c949a8f92d0416d8a3"} Feb 19 23:45:30 crc kubenswrapper[4795]: I0219 23:45:30.979260 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-dhcp-openstack-openstack-cell1-khvh4" podStartSLOduration=1.5180314940000001 podStartE2EDuration="1.979241577s" podCreationTimestamp="2026-02-19 23:45:29 +0000 UTC" firstStartedPulling="2026-02-19 23:45:29.961617696 +0000 UTC m=+8241.154135590" lastFinishedPulling="2026-02-19 23:45:30.422827809 +0000 UTC m=+8241.615345673" observedRunningTime="2026-02-19 23:45:30.974044943 +0000 UTC m=+8242.166562817" watchObservedRunningTime="2026-02-19 23:45:30.979241577 +0000 UTC m=+8242.171759451" Feb 19 23:46:25 crc kubenswrapper[4795]: I0219 23:46:25.482758 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sbmn2"] Feb 19 23:46:25 crc kubenswrapper[4795]: I0219 23:46:25.486569 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sbmn2" Feb 19 23:46:25 crc kubenswrapper[4795]: I0219 23:46:25.500835 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sbmn2"] Feb 19 23:46:25 crc kubenswrapper[4795]: I0219 23:46:25.556893 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b84443bf-9eee-4582-975b-6eb1a02b856b-utilities\") pod \"redhat-marketplace-sbmn2\" (UID: \"b84443bf-9eee-4582-975b-6eb1a02b856b\") " pod="openshift-marketplace/redhat-marketplace-sbmn2" Feb 19 23:46:25 crc kubenswrapper[4795]: I0219 23:46:25.557034 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b84443bf-9eee-4582-975b-6eb1a02b856b-catalog-content\") pod \"redhat-marketplace-sbmn2\" (UID: \"b84443bf-9eee-4582-975b-6eb1a02b856b\") " pod="openshift-marketplace/redhat-marketplace-sbmn2" Feb 19 23:46:25 crc kubenswrapper[4795]: I0219 23:46:25.557080 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmrfb\" (UniqueName: \"kubernetes.io/projected/b84443bf-9eee-4582-975b-6eb1a02b856b-kube-api-access-dmrfb\") pod \"redhat-marketplace-sbmn2\" (UID: \"b84443bf-9eee-4582-975b-6eb1a02b856b\") " pod="openshift-marketplace/redhat-marketplace-sbmn2" Feb 19 23:46:25 crc kubenswrapper[4795]: I0219 23:46:25.659088 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b84443bf-9eee-4582-975b-6eb1a02b856b-utilities\") pod \"redhat-marketplace-sbmn2\" (UID: \"b84443bf-9eee-4582-975b-6eb1a02b856b\") " pod="openshift-marketplace/redhat-marketplace-sbmn2" Feb 19 23:46:25 crc kubenswrapper[4795]: I0219 23:46:25.659302 4795 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b84443bf-9eee-4582-975b-6eb1a02b856b-catalog-content\") pod \"redhat-marketplace-sbmn2\" (UID: \"b84443bf-9eee-4582-975b-6eb1a02b856b\") " pod="openshift-marketplace/redhat-marketplace-sbmn2" Feb 19 23:46:25 crc kubenswrapper[4795]: I0219 23:46:25.659377 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmrfb\" (UniqueName: \"kubernetes.io/projected/b84443bf-9eee-4582-975b-6eb1a02b856b-kube-api-access-dmrfb\") pod \"redhat-marketplace-sbmn2\" (UID: \"b84443bf-9eee-4582-975b-6eb1a02b856b\") " pod="openshift-marketplace/redhat-marketplace-sbmn2" Feb 19 23:46:25 crc kubenswrapper[4795]: I0219 23:46:25.660898 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b84443bf-9eee-4582-975b-6eb1a02b856b-catalog-content\") pod \"redhat-marketplace-sbmn2\" (UID: \"b84443bf-9eee-4582-975b-6eb1a02b856b\") " pod="openshift-marketplace/redhat-marketplace-sbmn2" Feb 19 23:46:25 crc kubenswrapper[4795]: I0219 23:46:25.661157 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b84443bf-9eee-4582-975b-6eb1a02b856b-utilities\") pod \"redhat-marketplace-sbmn2\" (UID: \"b84443bf-9eee-4582-975b-6eb1a02b856b\") " pod="openshift-marketplace/redhat-marketplace-sbmn2" Feb 19 23:46:25 crc kubenswrapper[4795]: I0219 23:46:25.702199 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmrfb\" (UniqueName: \"kubernetes.io/projected/b84443bf-9eee-4582-975b-6eb1a02b856b-kube-api-access-dmrfb\") pod \"redhat-marketplace-sbmn2\" (UID: \"b84443bf-9eee-4582-975b-6eb1a02b856b\") " pod="openshift-marketplace/redhat-marketplace-sbmn2" Feb 19 23:46:25 crc kubenswrapper[4795]: I0219 23:46:25.823436 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sbmn2" Feb 19 23:46:26 crc kubenswrapper[4795]: I0219 23:46:26.288892 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sbmn2"] Feb 19 23:46:26 crc kubenswrapper[4795]: I0219 23:46:26.606107 4795 generic.go:334] "Generic (PLEG): container finished" podID="b84443bf-9eee-4582-975b-6eb1a02b856b" containerID="15e2caf1d36a29c75635195d8c25d9abf28cb859ec7996d4c3c0d8dd5e734adc" exitCode=0 Feb 19 23:46:26 crc kubenswrapper[4795]: I0219 23:46:26.606335 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbmn2" event={"ID":"b84443bf-9eee-4582-975b-6eb1a02b856b","Type":"ContainerDied","Data":"15e2caf1d36a29c75635195d8c25d9abf28cb859ec7996d4c3c0d8dd5e734adc"} Feb 19 23:46:26 crc kubenswrapper[4795]: I0219 23:46:26.606453 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbmn2" event={"ID":"b84443bf-9eee-4582-975b-6eb1a02b856b","Type":"ContainerStarted","Data":"82b949cb4644af7dac55f5e52cb7a9cd5efb4d958c1a6c32fcc97200612266db"} Feb 19 23:46:26 crc kubenswrapper[4795]: I0219 23:46:26.609359 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 23:46:27 crc kubenswrapper[4795]: I0219 23:46:27.617893 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbmn2" event={"ID":"b84443bf-9eee-4582-975b-6eb1a02b856b","Type":"ContainerStarted","Data":"9c5d1dddf86e209c0b9a3a26fcbdc4092f27328a5bf6a0d47b8e78aa6b9732b1"} Feb 19 23:46:28 crc kubenswrapper[4795]: I0219 23:46:28.629570 4795 generic.go:334] "Generic (PLEG): container finished" podID="b84443bf-9eee-4582-975b-6eb1a02b856b" containerID="9c5d1dddf86e209c0b9a3a26fcbdc4092f27328a5bf6a0d47b8e78aa6b9732b1" exitCode=0 Feb 19 23:46:28 crc kubenswrapper[4795]: I0219 23:46:28.629648 4795 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-sbmn2" event={"ID":"b84443bf-9eee-4582-975b-6eb1a02b856b","Type":"ContainerDied","Data":"9c5d1dddf86e209c0b9a3a26fcbdc4092f27328a5bf6a0d47b8e78aa6b9732b1"} Feb 19 23:46:29 crc kubenswrapper[4795]: I0219 23:46:29.641926 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbmn2" event={"ID":"b84443bf-9eee-4582-975b-6eb1a02b856b","Type":"ContainerStarted","Data":"73313296109df7e2e49c16e07eb57c12eccff3c34b12446456875205b34255f4"} Feb 19 23:46:29 crc kubenswrapper[4795]: I0219 23:46:29.666628 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-sbmn2" podStartSLOduration=2.230168811 podStartE2EDuration="4.666611764s" podCreationTimestamp="2026-02-19 23:46:25 +0000 UTC" firstStartedPulling="2026-02-19 23:46:26.609067853 +0000 UTC m=+8297.801585727" lastFinishedPulling="2026-02-19 23:46:29.045510816 +0000 UTC m=+8300.238028680" observedRunningTime="2026-02-19 23:46:29.666385928 +0000 UTC m=+8300.858903792" watchObservedRunningTime="2026-02-19 23:46:29.666611764 +0000 UTC m=+8300.859129628" Feb 19 23:46:35 crc kubenswrapper[4795]: I0219 23:46:35.824205 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sbmn2" Feb 19 23:46:35 crc kubenswrapper[4795]: I0219 23:46:35.824806 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-sbmn2" Feb 19 23:46:35 crc kubenswrapper[4795]: I0219 23:46:35.877849 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sbmn2" Feb 19 23:46:36 crc kubenswrapper[4795]: I0219 23:46:36.765808 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-sbmn2" Feb 19 23:46:36 crc kubenswrapper[4795]: I0219 23:46:36.813091 4795 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sbmn2"] Feb 19 23:46:38 crc kubenswrapper[4795]: I0219 23:46:38.733710 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-sbmn2" podUID="b84443bf-9eee-4582-975b-6eb1a02b856b" containerName="registry-server" containerID="cri-o://73313296109df7e2e49c16e07eb57c12eccff3c34b12446456875205b34255f4" gracePeriod=2 Feb 19 23:46:39 crc kubenswrapper[4795]: I0219 23:46:39.199088 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sbmn2" Feb 19 23:46:39 crc kubenswrapper[4795]: I0219 23:46:39.294598 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dmrfb\" (UniqueName: \"kubernetes.io/projected/b84443bf-9eee-4582-975b-6eb1a02b856b-kube-api-access-dmrfb\") pod \"b84443bf-9eee-4582-975b-6eb1a02b856b\" (UID: \"b84443bf-9eee-4582-975b-6eb1a02b856b\") " Feb 19 23:46:39 crc kubenswrapper[4795]: I0219 23:46:39.294923 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b84443bf-9eee-4582-975b-6eb1a02b856b-utilities\") pod \"b84443bf-9eee-4582-975b-6eb1a02b856b\" (UID: \"b84443bf-9eee-4582-975b-6eb1a02b856b\") " Feb 19 23:46:39 crc kubenswrapper[4795]: I0219 23:46:39.295110 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b84443bf-9eee-4582-975b-6eb1a02b856b-catalog-content\") pod \"b84443bf-9eee-4582-975b-6eb1a02b856b\" (UID: \"b84443bf-9eee-4582-975b-6eb1a02b856b\") " Feb 19 23:46:39 crc kubenswrapper[4795]: I0219 23:46:39.295791 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b84443bf-9eee-4582-975b-6eb1a02b856b-utilities" (OuterVolumeSpecName: "utilities") pod 
"b84443bf-9eee-4582-975b-6eb1a02b856b" (UID: "b84443bf-9eee-4582-975b-6eb1a02b856b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:46:39 crc kubenswrapper[4795]: I0219 23:46:39.306858 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b84443bf-9eee-4582-975b-6eb1a02b856b-kube-api-access-dmrfb" (OuterVolumeSpecName: "kube-api-access-dmrfb") pod "b84443bf-9eee-4582-975b-6eb1a02b856b" (UID: "b84443bf-9eee-4582-975b-6eb1a02b856b"). InnerVolumeSpecName "kube-api-access-dmrfb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:46:39 crc kubenswrapper[4795]: I0219 23:46:39.316355 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b84443bf-9eee-4582-975b-6eb1a02b856b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b84443bf-9eee-4582-975b-6eb1a02b856b" (UID: "b84443bf-9eee-4582-975b-6eb1a02b856b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:46:39 crc kubenswrapper[4795]: I0219 23:46:39.397644 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b84443bf-9eee-4582-975b-6eb1a02b856b-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 23:46:39 crc kubenswrapper[4795]: I0219 23:46:39.397686 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dmrfb\" (UniqueName: \"kubernetes.io/projected/b84443bf-9eee-4582-975b-6eb1a02b856b-kube-api-access-dmrfb\") on node \"crc\" DevicePath \"\"" Feb 19 23:46:39 crc kubenswrapper[4795]: I0219 23:46:39.397696 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b84443bf-9eee-4582-975b-6eb1a02b856b-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 23:46:39 crc kubenswrapper[4795]: I0219 23:46:39.743330 4795 generic.go:334] "Generic (PLEG): container finished" podID="b84443bf-9eee-4582-975b-6eb1a02b856b" containerID="73313296109df7e2e49c16e07eb57c12eccff3c34b12446456875205b34255f4" exitCode=0 Feb 19 23:46:39 crc kubenswrapper[4795]: I0219 23:46:39.743379 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbmn2" event={"ID":"b84443bf-9eee-4582-975b-6eb1a02b856b","Type":"ContainerDied","Data":"73313296109df7e2e49c16e07eb57c12eccff3c34b12446456875205b34255f4"} Feb 19 23:46:39 crc kubenswrapper[4795]: I0219 23:46:39.743411 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbmn2" event={"ID":"b84443bf-9eee-4582-975b-6eb1a02b856b","Type":"ContainerDied","Data":"82b949cb4644af7dac55f5e52cb7a9cd5efb4d958c1a6c32fcc97200612266db"} Feb 19 23:46:39 crc kubenswrapper[4795]: I0219 23:46:39.743433 4795 scope.go:117] "RemoveContainer" containerID="73313296109df7e2e49c16e07eb57c12eccff3c34b12446456875205b34255f4" Feb 19 23:46:39 crc kubenswrapper[4795]: I0219 
23:46:39.743468 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sbmn2" Feb 19 23:46:39 crc kubenswrapper[4795]: I0219 23:46:39.770047 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sbmn2"] Feb 19 23:46:39 crc kubenswrapper[4795]: I0219 23:46:39.770158 4795 scope.go:117] "RemoveContainer" containerID="9c5d1dddf86e209c0b9a3a26fcbdc4092f27328a5bf6a0d47b8e78aa6b9732b1" Feb 19 23:46:39 crc kubenswrapper[4795]: I0219 23:46:39.784676 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-sbmn2"] Feb 19 23:46:39 crc kubenswrapper[4795]: I0219 23:46:39.789514 4795 scope.go:117] "RemoveContainer" containerID="15e2caf1d36a29c75635195d8c25d9abf28cb859ec7996d4c3c0d8dd5e734adc" Feb 19 23:46:39 crc kubenswrapper[4795]: I0219 23:46:39.837813 4795 scope.go:117] "RemoveContainer" containerID="73313296109df7e2e49c16e07eb57c12eccff3c34b12446456875205b34255f4" Feb 19 23:46:39 crc kubenswrapper[4795]: E0219 23:46:39.843800 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73313296109df7e2e49c16e07eb57c12eccff3c34b12446456875205b34255f4\": container with ID starting with 73313296109df7e2e49c16e07eb57c12eccff3c34b12446456875205b34255f4 not found: ID does not exist" containerID="73313296109df7e2e49c16e07eb57c12eccff3c34b12446456875205b34255f4" Feb 19 23:46:39 crc kubenswrapper[4795]: I0219 23:46:39.843838 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73313296109df7e2e49c16e07eb57c12eccff3c34b12446456875205b34255f4"} err="failed to get container status \"73313296109df7e2e49c16e07eb57c12eccff3c34b12446456875205b34255f4\": rpc error: code = NotFound desc = could not find container \"73313296109df7e2e49c16e07eb57c12eccff3c34b12446456875205b34255f4\": container with ID starting with 
73313296109df7e2e49c16e07eb57c12eccff3c34b12446456875205b34255f4 not found: ID does not exist" Feb 19 23:46:39 crc kubenswrapper[4795]: I0219 23:46:39.843862 4795 scope.go:117] "RemoveContainer" containerID="9c5d1dddf86e209c0b9a3a26fcbdc4092f27328a5bf6a0d47b8e78aa6b9732b1" Feb 19 23:46:39 crc kubenswrapper[4795]: E0219 23:46:39.844259 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c5d1dddf86e209c0b9a3a26fcbdc4092f27328a5bf6a0d47b8e78aa6b9732b1\": container with ID starting with 9c5d1dddf86e209c0b9a3a26fcbdc4092f27328a5bf6a0d47b8e78aa6b9732b1 not found: ID does not exist" containerID="9c5d1dddf86e209c0b9a3a26fcbdc4092f27328a5bf6a0d47b8e78aa6b9732b1" Feb 19 23:46:39 crc kubenswrapper[4795]: I0219 23:46:39.844285 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c5d1dddf86e209c0b9a3a26fcbdc4092f27328a5bf6a0d47b8e78aa6b9732b1"} err="failed to get container status \"9c5d1dddf86e209c0b9a3a26fcbdc4092f27328a5bf6a0d47b8e78aa6b9732b1\": rpc error: code = NotFound desc = could not find container \"9c5d1dddf86e209c0b9a3a26fcbdc4092f27328a5bf6a0d47b8e78aa6b9732b1\": container with ID starting with 9c5d1dddf86e209c0b9a3a26fcbdc4092f27328a5bf6a0d47b8e78aa6b9732b1 not found: ID does not exist" Feb 19 23:46:39 crc kubenswrapper[4795]: I0219 23:46:39.844301 4795 scope.go:117] "RemoveContainer" containerID="15e2caf1d36a29c75635195d8c25d9abf28cb859ec7996d4c3c0d8dd5e734adc" Feb 19 23:46:39 crc kubenswrapper[4795]: E0219 23:46:39.844698 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15e2caf1d36a29c75635195d8c25d9abf28cb859ec7996d4c3c0d8dd5e734adc\": container with ID starting with 15e2caf1d36a29c75635195d8c25d9abf28cb859ec7996d4c3c0d8dd5e734adc not found: ID does not exist" containerID="15e2caf1d36a29c75635195d8c25d9abf28cb859ec7996d4c3c0d8dd5e734adc" Feb 19 23:46:39 crc 
kubenswrapper[4795]: I0219 23:46:39.844720 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15e2caf1d36a29c75635195d8c25d9abf28cb859ec7996d4c3c0d8dd5e734adc"} err="failed to get container status \"15e2caf1d36a29c75635195d8c25d9abf28cb859ec7996d4c3c0d8dd5e734adc\": rpc error: code = NotFound desc = could not find container \"15e2caf1d36a29c75635195d8c25d9abf28cb859ec7996d4c3c0d8dd5e734adc\": container with ID starting with 15e2caf1d36a29c75635195d8c25d9abf28cb859ec7996d4c3c0d8dd5e734adc not found: ID does not exist" Feb 19 23:46:41 crc kubenswrapper[4795]: I0219 23:46:41.542979 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b84443bf-9eee-4582-975b-6eb1a02b856b" path="/var/lib/kubelet/pods/b84443bf-9eee-4582-975b-6eb1a02b856b/volumes" Feb 19 23:46:46 crc kubenswrapper[4795]: I0219 23:46:46.824959 4795 generic.go:334] "Generic (PLEG): container finished" podID="ff3df901-a0ae-456e-8103-60aaa6439785" containerID="c2f202e4a47f5834d5d4ba7b5ccaec0183d8ee7a0ac71915df5f8ba0ec70bb7a" exitCode=0 Feb 19 23:46:46 crc kubenswrapper[4795]: I0219 23:46:46.825099 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-khvh4" event={"ID":"ff3df901-a0ae-456e-8103-60aaa6439785","Type":"ContainerDied","Data":"c2f202e4a47f5834d5d4ba7b5ccaec0183d8ee7a0ac71915df5f8ba0ec70bb7a"} Feb 19 23:46:48 crc kubenswrapper[4795]: I0219 23:46:48.248279 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-khvh4" Feb 19 23:46:48 crc kubenswrapper[4795]: I0219 23:46:48.301099 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ff3df901-a0ae-456e-8103-60aaa6439785-inventory\") pod \"ff3df901-a0ae-456e-8103-60aaa6439785\" (UID: \"ff3df901-a0ae-456e-8103-60aaa6439785\") " Feb 19 23:46:48 crc kubenswrapper[4795]: I0219 23:46:48.301192 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff3df901-a0ae-456e-8103-60aaa6439785-neutron-dhcp-combined-ca-bundle\") pod \"ff3df901-a0ae-456e-8103-60aaa6439785\" (UID: \"ff3df901-a0ae-456e-8103-60aaa6439785\") " Feb 19 23:46:48 crc kubenswrapper[4795]: I0219 23:46:48.301355 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ff3df901-a0ae-456e-8103-60aaa6439785-neutron-dhcp-agent-neutron-config-0\") pod \"ff3df901-a0ae-456e-8103-60aaa6439785\" (UID: \"ff3df901-a0ae-456e-8103-60aaa6439785\") " Feb 19 23:46:48 crc kubenswrapper[4795]: I0219 23:46:48.301425 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ff3df901-a0ae-456e-8103-60aaa6439785-ceph\") pod \"ff3df901-a0ae-456e-8103-60aaa6439785\" (UID: \"ff3df901-a0ae-456e-8103-60aaa6439785\") " Feb 19 23:46:48 crc kubenswrapper[4795]: I0219 23:46:48.301563 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-spsv6\" (UniqueName: \"kubernetes.io/projected/ff3df901-a0ae-456e-8103-60aaa6439785-kube-api-access-spsv6\") pod \"ff3df901-a0ae-456e-8103-60aaa6439785\" (UID: \"ff3df901-a0ae-456e-8103-60aaa6439785\") " Feb 19 23:46:48 crc kubenswrapper[4795]: I0219 23:46:48.301821 4795 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ff3df901-a0ae-456e-8103-60aaa6439785-ssh-key-openstack-cell1\") pod \"ff3df901-a0ae-456e-8103-60aaa6439785\" (UID: \"ff3df901-a0ae-456e-8103-60aaa6439785\") " Feb 19 23:46:48 crc kubenswrapper[4795]: I0219 23:46:48.309379 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff3df901-a0ae-456e-8103-60aaa6439785-ceph" (OuterVolumeSpecName: "ceph") pod "ff3df901-a0ae-456e-8103-60aaa6439785" (UID: "ff3df901-a0ae-456e-8103-60aaa6439785"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:46:48 crc kubenswrapper[4795]: I0219 23:46:48.320477 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff3df901-a0ae-456e-8103-60aaa6439785-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "ff3df901-a0ae-456e-8103-60aaa6439785" (UID: "ff3df901-a0ae-456e-8103-60aaa6439785"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:46:48 crc kubenswrapper[4795]: I0219 23:46:48.334080 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff3df901-a0ae-456e-8103-60aaa6439785-kube-api-access-spsv6" (OuterVolumeSpecName: "kube-api-access-spsv6") pod "ff3df901-a0ae-456e-8103-60aaa6439785" (UID: "ff3df901-a0ae-456e-8103-60aaa6439785"). InnerVolumeSpecName "kube-api-access-spsv6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:46:48 crc kubenswrapper[4795]: I0219 23:46:48.339322 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff3df901-a0ae-456e-8103-60aaa6439785-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "ff3df901-a0ae-456e-8103-60aaa6439785" (UID: "ff3df901-a0ae-456e-8103-60aaa6439785"). 
InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:46:48 crc kubenswrapper[4795]: I0219 23:46:48.339423 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff3df901-a0ae-456e-8103-60aaa6439785-neutron-dhcp-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-dhcp-agent-neutron-config-0") pod "ff3df901-a0ae-456e-8103-60aaa6439785" (UID: "ff3df901-a0ae-456e-8103-60aaa6439785"). InnerVolumeSpecName "neutron-dhcp-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:46:48 crc kubenswrapper[4795]: I0219 23:46:48.355242 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff3df901-a0ae-456e-8103-60aaa6439785-inventory" (OuterVolumeSpecName: "inventory") pod "ff3df901-a0ae-456e-8103-60aaa6439785" (UID: "ff3df901-a0ae-456e-8103-60aaa6439785"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:46:48 crc kubenswrapper[4795]: I0219 23:46:48.405348 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ff3df901-a0ae-456e-8103-60aaa6439785-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 23:46:48 crc kubenswrapper[4795]: I0219 23:46:48.405386 4795 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff3df901-a0ae-456e-8103-60aaa6439785-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:46:48 crc kubenswrapper[4795]: I0219 23:46:48.405404 4795 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ff3df901-a0ae-456e-8103-60aaa6439785-neutron-dhcp-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 19 23:46:48 crc kubenswrapper[4795]: I0219 23:46:48.405420 4795 reconciler_common.go:293] "Volume detached for 
volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ff3df901-a0ae-456e-8103-60aaa6439785-ceph\") on node \"crc\" DevicePath \"\"" Feb 19 23:46:48 crc kubenswrapper[4795]: I0219 23:46:48.405437 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-spsv6\" (UniqueName: \"kubernetes.io/projected/ff3df901-a0ae-456e-8103-60aaa6439785-kube-api-access-spsv6\") on node \"crc\" DevicePath \"\"" Feb 19 23:46:48 crc kubenswrapper[4795]: I0219 23:46:48.405449 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ff3df901-a0ae-456e-8103-60aaa6439785-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 19 23:46:48 crc kubenswrapper[4795]: I0219 23:46:48.843132 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-khvh4" Feb 19 23:46:48 crc kubenswrapper[4795]: I0219 23:46:48.843018 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-khvh4" event={"ID":"ff3df901-a0ae-456e-8103-60aaa6439785","Type":"ContainerDied","Data":"0676394dc314b80aee4627ac3a27539eb9572cc4cc7886c949a8f92d0416d8a3"} Feb 19 23:46:48 crc kubenswrapper[4795]: I0219 23:46:48.843569 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0676394dc314b80aee4627ac3a27539eb9572cc4cc7886c949a8f92d0416d8a3" Feb 19 23:46:58 crc kubenswrapper[4795]: I0219 23:46:58.427316 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 23:46:58 crc kubenswrapper[4795]: I0219 23:46:58.427810 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" 
podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 23:47:00 crc kubenswrapper[4795]: I0219 23:47:00.247765 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 19 23:47:00 crc kubenswrapper[4795]: I0219 23:47:00.248324 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="761a7217-33fa-4d78-8a05-492cbb33f48d" containerName="nova-cell0-conductor-conductor" containerID="cri-o://2e039fa6d9d7121c3b280b416aad376a4ff38189c9a9b31473131235502fb9b5" gracePeriod=30 Feb 19 23:47:00 crc kubenswrapper[4795]: I0219 23:47:00.717747 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 19 23:47:00 crc kubenswrapper[4795]: I0219 23:47:00.718243 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="f1a9135a-42ef-42ca-880a-f4f5ffd78a13" containerName="nova-cell1-conductor-conductor" containerID="cri-o://47428fb66fcdd0ecf9a1523a5b7b854d3890395cb0c5cd90338ec63eed539ed1" gracePeriod=30 Feb 19 23:47:00 crc kubenswrapper[4795]: I0219 23:47:00.878768 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 19 23:47:00 crc kubenswrapper[4795]: I0219 23:47:00.879260 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8af13c78-4805-4828-980c-45e1defd94c3" containerName="nova-api-log" containerID="cri-o://ad930281e03b5e1a65091cba120348431f474125eb5bc690c29d8ca4f3247c21" gracePeriod=30 Feb 19 23:47:00 crc kubenswrapper[4795]: I0219 23:47:00.879330 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8af13c78-4805-4828-980c-45e1defd94c3" containerName="nova-api-api" 
containerID="cri-o://d925bab645dff9712b2a171f2f4482d18d0a0ee4391b990a9a7a705d61ad3def" gracePeriod=30 Feb 19 23:47:00 crc kubenswrapper[4795]: I0219 23:47:00.896049 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 23:47:00 crc kubenswrapper[4795]: I0219 23:47:00.896469 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="2eb28a2e-eb12-4867-9c26-3416349cc1cc" containerName="nova-scheduler-scheduler" containerID="cri-o://1232c55ba259e6d6fbb743a58698f5bf867025931147d0f23b6c47a0784bdaa2" gracePeriod=30 Feb 19 23:47:00 crc kubenswrapper[4795]: I0219 23:47:00.933670 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 23:47:00 crc kubenswrapper[4795]: I0219 23:47:00.934206 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9" containerName="nova-metadata-log" containerID="cri-o://aa2cfffccfdef39a961925828c69395a846784ec188eaf37747eb10bd8158d2b" gracePeriod=30 Feb 19 23:47:00 crc kubenswrapper[4795]: I0219 23:47:00.934625 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9" containerName="nova-metadata-metadata" containerID="cri-o://fd1e78c709cd03e12813d0773547c4465f0e66a7a0cdec8f42bc3426e77cff3f" gracePeriod=30 Feb 19 23:47:01 crc kubenswrapper[4795]: E0219 23:47:01.322124 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="47428fb66fcdd0ecf9a1523a5b7b854d3890395cb0c5cd90338ec63eed539ed1" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 19 23:47:01 crc kubenswrapper[4795]: E0219 23:47:01.324220 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc 
error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="47428fb66fcdd0ecf9a1523a5b7b854d3890395cb0c5cd90338ec63eed539ed1" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 19 23:47:01 crc kubenswrapper[4795]: E0219 23:47:01.325705 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="47428fb66fcdd0ecf9a1523a5b7b854d3890395cb0c5cd90338ec63eed539ed1" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 19 23:47:01 crc kubenswrapper[4795]: E0219 23:47:01.325746 4795 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="f1a9135a-42ef-42ca-880a-f4f5ffd78a13" containerName="nova-cell1-conductor-conductor" Feb 19 23:47:01 crc kubenswrapper[4795]: I0219 23:47:01.837913 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 19 23:47:01 crc kubenswrapper[4795]: I0219 23:47:01.937719 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1a9135a-42ef-42ca-880a-f4f5ffd78a13-config-data\") pod \"f1a9135a-42ef-42ca-880a-f4f5ffd78a13\" (UID: \"f1a9135a-42ef-42ca-880a-f4f5ffd78a13\") " Feb 19 23:47:01 crc kubenswrapper[4795]: I0219 23:47:01.937833 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qnl2g\" (UniqueName: \"kubernetes.io/projected/f1a9135a-42ef-42ca-880a-f4f5ffd78a13-kube-api-access-qnl2g\") pod \"f1a9135a-42ef-42ca-880a-f4f5ffd78a13\" (UID: \"f1a9135a-42ef-42ca-880a-f4f5ffd78a13\") " Feb 19 23:47:01 crc kubenswrapper[4795]: I0219 23:47:01.937893 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1a9135a-42ef-42ca-880a-f4f5ffd78a13-combined-ca-bundle\") pod \"f1a9135a-42ef-42ca-880a-f4f5ffd78a13\" (UID: \"f1a9135a-42ef-42ca-880a-f4f5ffd78a13\") " Feb 19 23:47:01 crc kubenswrapper[4795]: I0219 23:47:01.943601 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1a9135a-42ef-42ca-880a-f4f5ffd78a13-kube-api-access-qnl2g" (OuterVolumeSpecName: "kube-api-access-qnl2g") pod "f1a9135a-42ef-42ca-880a-f4f5ffd78a13" (UID: "f1a9135a-42ef-42ca-880a-f4f5ffd78a13"). InnerVolumeSpecName "kube-api-access-qnl2g". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:47:01 crc kubenswrapper[4795]: I0219 23:47:01.967828 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1a9135a-42ef-42ca-880a-f4f5ffd78a13-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f1a9135a-42ef-42ca-880a-f4f5ffd78a13" (UID: "f1a9135a-42ef-42ca-880a-f4f5ffd78a13"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:47:01 crc kubenswrapper[4795]: I0219 23:47:01.970539 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1a9135a-42ef-42ca-880a-f4f5ffd78a13-config-data" (OuterVolumeSpecName: "config-data") pod "f1a9135a-42ef-42ca-880a-f4f5ffd78a13" (UID: "f1a9135a-42ef-42ca-880a-f4f5ffd78a13"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.006732 4795 generic.go:334] "Generic (PLEG): container finished" podID="f1a9135a-42ef-42ca-880a-f4f5ffd78a13" containerID="47428fb66fcdd0ecf9a1523a5b7b854d3890395cb0c5cd90338ec63eed539ed1" exitCode=0 Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.006791 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"f1a9135a-42ef-42ca-880a-f4f5ffd78a13","Type":"ContainerDied","Data":"47428fb66fcdd0ecf9a1523a5b7b854d3890395cb0c5cd90338ec63eed539ed1"} Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.006817 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"f1a9135a-42ef-42ca-880a-f4f5ffd78a13","Type":"ContainerDied","Data":"2adc08c6dd489704d7eddd8052ac3149a11c14f7992162c2dccd22cfce6e5fe5"} Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.006835 4795 scope.go:117] "RemoveContainer" containerID="47428fb66fcdd0ecf9a1523a5b7b854d3890395cb0c5cd90338ec63eed539ed1" Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.006979 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.014131 4795 generic.go:334] "Generic (PLEG): container finished" podID="88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9" containerID="aa2cfffccfdef39a961925828c69395a846784ec188eaf37747eb10bd8158d2b" exitCode=143 Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.014202 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9","Type":"ContainerDied","Data":"aa2cfffccfdef39a961925828c69395a846784ec188eaf37747eb10bd8158d2b"} Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.016588 4795 generic.go:334] "Generic (PLEG): container finished" podID="8af13c78-4805-4828-980c-45e1defd94c3" containerID="ad930281e03b5e1a65091cba120348431f474125eb5bc690c29d8ca4f3247c21" exitCode=143 Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.016621 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8af13c78-4805-4828-980c-45e1defd94c3","Type":"ContainerDied","Data":"ad930281e03b5e1a65091cba120348431f474125eb5bc690c29d8ca4f3247c21"} Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.040692 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1a9135a-42ef-42ca-880a-f4f5ffd78a13-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.040727 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qnl2g\" (UniqueName: \"kubernetes.io/projected/f1a9135a-42ef-42ca-880a-f4f5ffd78a13-kube-api-access-qnl2g\") on node \"crc\" DevicePath \"\"" Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.040766 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1a9135a-42ef-42ca-880a-f4f5ffd78a13-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 
23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.061563 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.063238 4795 scope.go:117] "RemoveContainer" containerID="47428fb66fcdd0ecf9a1523a5b7b854d3890395cb0c5cd90338ec63eed539ed1" Feb 19 23:47:02 crc kubenswrapper[4795]: E0219 23:47:02.063604 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47428fb66fcdd0ecf9a1523a5b7b854d3890395cb0c5cd90338ec63eed539ed1\": container with ID starting with 47428fb66fcdd0ecf9a1523a5b7b854d3890395cb0c5cd90338ec63eed539ed1 not found: ID does not exist" containerID="47428fb66fcdd0ecf9a1523a5b7b854d3890395cb0c5cd90338ec63eed539ed1" Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.063636 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47428fb66fcdd0ecf9a1523a5b7b854d3890395cb0c5cd90338ec63eed539ed1"} err="failed to get container status \"47428fb66fcdd0ecf9a1523a5b7b854d3890395cb0c5cd90338ec63eed539ed1\": rpc error: code = NotFound desc = could not find container \"47428fb66fcdd0ecf9a1523a5b7b854d3890395cb0c5cd90338ec63eed539ed1\": container with ID starting with 47428fb66fcdd0ecf9a1523a5b7b854d3890395cb0c5cd90338ec63eed539ed1 not found: ID does not exist" Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.078971 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.094380 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 19 23:47:02 crc kubenswrapper[4795]: E0219 23:47:02.094874 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b84443bf-9eee-4582-975b-6eb1a02b856b" containerName="extract-utilities" Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.094892 4795 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="b84443bf-9eee-4582-975b-6eb1a02b856b" containerName="extract-utilities" Feb 19 23:47:02 crc kubenswrapper[4795]: E0219 23:47:02.094909 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff3df901-a0ae-456e-8103-60aaa6439785" containerName="neutron-dhcp-openstack-openstack-cell1" Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.094916 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff3df901-a0ae-456e-8103-60aaa6439785" containerName="neutron-dhcp-openstack-openstack-cell1" Feb 19 23:47:02 crc kubenswrapper[4795]: E0219 23:47:02.094929 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1a9135a-42ef-42ca-880a-f4f5ffd78a13" containerName="nova-cell1-conductor-conductor" Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.094936 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1a9135a-42ef-42ca-880a-f4f5ffd78a13" containerName="nova-cell1-conductor-conductor" Feb 19 23:47:02 crc kubenswrapper[4795]: E0219 23:47:02.094963 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b84443bf-9eee-4582-975b-6eb1a02b856b" containerName="registry-server" Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.094969 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="b84443bf-9eee-4582-975b-6eb1a02b856b" containerName="registry-server" Feb 19 23:47:02 crc kubenswrapper[4795]: E0219 23:47:02.094983 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b84443bf-9eee-4582-975b-6eb1a02b856b" containerName="extract-content" Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.094989 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="b84443bf-9eee-4582-975b-6eb1a02b856b" containerName="extract-content" Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.095212 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1a9135a-42ef-42ca-880a-f4f5ffd78a13" containerName="nova-cell1-conductor-conductor" Feb 19 
23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.095225 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="b84443bf-9eee-4582-975b-6eb1a02b856b" containerName="registry-server" Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.095234 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff3df901-a0ae-456e-8103-60aaa6439785" containerName="neutron-dhcp-openstack-openstack-cell1" Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.095957 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.100850 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.105745 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.142601 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fa49294-8a0c-4d98-a388-067bdce0ac1b-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"0fa49294-8a0c-4d98-a388-067bdce0ac1b\") " pod="openstack/nova-cell1-conductor-0" Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.142685 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drtxw\" (UniqueName: \"kubernetes.io/projected/0fa49294-8a0c-4d98-a388-067bdce0ac1b-kube-api-access-drtxw\") pod \"nova-cell1-conductor-0\" (UID: \"0fa49294-8a0c-4d98-a388-067bdce0ac1b\") " pod="openstack/nova-cell1-conductor-0" Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.142747 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0fa49294-8a0c-4d98-a388-067bdce0ac1b-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"0fa49294-8a0c-4d98-a388-067bdce0ac1b\") " pod="openstack/nova-cell1-conductor-0" Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.244355 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fa49294-8a0c-4d98-a388-067bdce0ac1b-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"0fa49294-8a0c-4d98-a388-067bdce0ac1b\") " pod="openstack/nova-cell1-conductor-0" Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.244736 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drtxw\" (UniqueName: \"kubernetes.io/projected/0fa49294-8a0c-4d98-a388-067bdce0ac1b-kube-api-access-drtxw\") pod \"nova-cell1-conductor-0\" (UID: \"0fa49294-8a0c-4d98-a388-067bdce0ac1b\") " pod="openstack/nova-cell1-conductor-0" Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.244783 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fa49294-8a0c-4d98-a388-067bdce0ac1b-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"0fa49294-8a0c-4d98-a388-067bdce0ac1b\") " pod="openstack/nova-cell1-conductor-0" Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.248495 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fa49294-8a0c-4d98-a388-067bdce0ac1b-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"0fa49294-8a0c-4d98-a388-067bdce0ac1b\") " pod="openstack/nova-cell1-conductor-0" Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.256288 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fa49294-8a0c-4d98-a388-067bdce0ac1b-config-data\") pod \"nova-cell1-conductor-0\" (UID: 
\"0fa49294-8a0c-4d98-a388-067bdce0ac1b\") " pod="openstack/nova-cell1-conductor-0" Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.260902 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drtxw\" (UniqueName: \"kubernetes.io/projected/0fa49294-8a0c-4d98-a388-067bdce0ac1b-kube-api-access-drtxw\") pod \"nova-cell1-conductor-0\" (UID: \"0fa49294-8a0c-4d98-a388-067bdce0ac1b\") " pod="openstack/nova-cell1-conductor-0" Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.468398 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.472986 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.550270 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/761a7217-33fa-4d78-8a05-492cbb33f48d-combined-ca-bundle\") pod \"761a7217-33fa-4d78-8a05-492cbb33f48d\" (UID: \"761a7217-33fa-4d78-8a05-492cbb33f48d\") " Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.550351 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/761a7217-33fa-4d78-8a05-492cbb33f48d-config-data\") pod \"761a7217-33fa-4d78-8a05-492cbb33f48d\" (UID: \"761a7217-33fa-4d78-8a05-492cbb33f48d\") " Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.550673 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mzqk\" (UniqueName: \"kubernetes.io/projected/761a7217-33fa-4d78-8a05-492cbb33f48d-kube-api-access-7mzqk\") pod \"761a7217-33fa-4d78-8a05-492cbb33f48d\" (UID: \"761a7217-33fa-4d78-8a05-492cbb33f48d\") " Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.586141 4795 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/761a7217-33fa-4d78-8a05-492cbb33f48d-kube-api-access-7mzqk" (OuterVolumeSpecName: "kube-api-access-7mzqk") pod "761a7217-33fa-4d78-8a05-492cbb33f48d" (UID: "761a7217-33fa-4d78-8a05-492cbb33f48d"). InnerVolumeSpecName "kube-api-access-7mzqk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.623298 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/761a7217-33fa-4d78-8a05-492cbb33f48d-config-data" (OuterVolumeSpecName: "config-data") pod "761a7217-33fa-4d78-8a05-492cbb33f48d" (UID: "761a7217-33fa-4d78-8a05-492cbb33f48d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.638759 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/761a7217-33fa-4d78-8a05-492cbb33f48d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "761a7217-33fa-4d78-8a05-492cbb33f48d" (UID: "761a7217-33fa-4d78-8a05-492cbb33f48d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.656634 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mzqk\" (UniqueName: \"kubernetes.io/projected/761a7217-33fa-4d78-8a05-492cbb33f48d-kube-api-access-7mzqk\") on node \"crc\" DevicePath \"\"" Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.656674 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/761a7217-33fa-4d78-8a05-492cbb33f48d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:47:02 crc kubenswrapper[4795]: I0219 23:47:02.656683 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/761a7217-33fa-4d78-8a05-492cbb33f48d-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 23:47:03 crc kubenswrapper[4795]: I0219 23:47:03.026769 4795 generic.go:334] "Generic (PLEG): container finished" podID="761a7217-33fa-4d78-8a05-492cbb33f48d" containerID="2e039fa6d9d7121c3b280b416aad376a4ff38189c9a9b31473131235502fb9b5" exitCode=0 Feb 19 23:47:03 crc kubenswrapper[4795]: I0219 23:47:03.026816 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"761a7217-33fa-4d78-8a05-492cbb33f48d","Type":"ContainerDied","Data":"2e039fa6d9d7121c3b280b416aad376a4ff38189c9a9b31473131235502fb9b5"} Feb 19 23:47:03 crc kubenswrapper[4795]: I0219 23:47:03.026862 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"761a7217-33fa-4d78-8a05-492cbb33f48d","Type":"ContainerDied","Data":"a920c7c53728d52d3ab518fdecf0b6800cb795ab36fabc65145920815940fa68"} Feb 19 23:47:03 crc kubenswrapper[4795]: I0219 23:47:03.026880 4795 scope.go:117] "RemoveContainer" containerID="2e039fa6d9d7121c3b280b416aad376a4ff38189c9a9b31473131235502fb9b5" Feb 19 23:47:03 crc kubenswrapper[4795]: I0219 23:47:03.026825 4795 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 19 23:47:03 crc kubenswrapper[4795]: I0219 23:47:03.058839 4795 scope.go:117] "RemoveContainer" containerID="2e039fa6d9d7121c3b280b416aad376a4ff38189c9a9b31473131235502fb9b5" Feb 19 23:47:03 crc kubenswrapper[4795]: E0219 23:47:03.059200 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e039fa6d9d7121c3b280b416aad376a4ff38189c9a9b31473131235502fb9b5\": container with ID starting with 2e039fa6d9d7121c3b280b416aad376a4ff38189c9a9b31473131235502fb9b5 not found: ID does not exist" containerID="2e039fa6d9d7121c3b280b416aad376a4ff38189c9a9b31473131235502fb9b5" Feb 19 23:47:03 crc kubenswrapper[4795]: I0219 23:47:03.059243 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e039fa6d9d7121c3b280b416aad376a4ff38189c9a9b31473131235502fb9b5"} err="failed to get container status \"2e039fa6d9d7121c3b280b416aad376a4ff38189c9a9b31473131235502fb9b5\": rpc error: code = NotFound desc = could not find container \"2e039fa6d9d7121c3b280b416aad376a4ff38189c9a9b31473131235502fb9b5\": container with ID starting with 2e039fa6d9d7121c3b280b416aad376a4ff38189c9a9b31473131235502fb9b5 not found: ID does not exist" Feb 19 23:47:03 crc kubenswrapper[4795]: I0219 23:47:03.063477 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 19 23:47:03 crc kubenswrapper[4795]: I0219 23:47:03.078716 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 19 23:47:03 crc kubenswrapper[4795]: I0219 23:47:03.094572 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 19 23:47:03 crc kubenswrapper[4795]: I0219 23:47:03.105270 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 19 23:47:03 crc 
kubenswrapper[4795]: E0219 23:47:03.105752 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="761a7217-33fa-4d78-8a05-492cbb33f48d" containerName="nova-cell0-conductor-conductor" Feb 19 23:47:03 crc kubenswrapper[4795]: I0219 23:47:03.105777 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="761a7217-33fa-4d78-8a05-492cbb33f48d" containerName="nova-cell0-conductor-conductor" Feb 19 23:47:03 crc kubenswrapper[4795]: I0219 23:47:03.106036 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="761a7217-33fa-4d78-8a05-492cbb33f48d" containerName="nova-cell0-conductor-conductor" Feb 19 23:47:03 crc kubenswrapper[4795]: I0219 23:47:03.106905 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 19 23:47:03 crc kubenswrapper[4795]: I0219 23:47:03.109002 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 19 23:47:03 crc kubenswrapper[4795]: I0219 23:47:03.118402 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 19 23:47:03 crc kubenswrapper[4795]: I0219 23:47:03.167009 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d27d8041-4940-4cd2-bf9e-02b7aa924067-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"d27d8041-4940-4cd2-bf9e-02b7aa924067\") " pod="openstack/nova-cell0-conductor-0" Feb 19 23:47:03 crc kubenswrapper[4795]: I0219 23:47:03.167693 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d27d8041-4940-4cd2-bf9e-02b7aa924067-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"d27d8041-4940-4cd2-bf9e-02b7aa924067\") " pod="openstack/nova-cell0-conductor-0" Feb 19 23:47:03 crc kubenswrapper[4795]: I0219 23:47:03.167910 4795 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xpls\" (UniqueName: \"kubernetes.io/projected/d27d8041-4940-4cd2-bf9e-02b7aa924067-kube-api-access-6xpls\") pod \"nova-cell0-conductor-0\" (UID: \"d27d8041-4940-4cd2-bf9e-02b7aa924067\") " pod="openstack/nova-cell0-conductor-0" Feb 19 23:47:03 crc kubenswrapper[4795]: I0219 23:47:03.269200 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d27d8041-4940-4cd2-bf9e-02b7aa924067-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"d27d8041-4940-4cd2-bf9e-02b7aa924067\") " pod="openstack/nova-cell0-conductor-0" Feb 19 23:47:03 crc kubenswrapper[4795]: I0219 23:47:03.269374 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d27d8041-4940-4cd2-bf9e-02b7aa924067-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"d27d8041-4940-4cd2-bf9e-02b7aa924067\") " pod="openstack/nova-cell0-conductor-0" Feb 19 23:47:03 crc kubenswrapper[4795]: I0219 23:47:03.269430 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xpls\" (UniqueName: \"kubernetes.io/projected/d27d8041-4940-4cd2-bf9e-02b7aa924067-kube-api-access-6xpls\") pod \"nova-cell0-conductor-0\" (UID: \"d27d8041-4940-4cd2-bf9e-02b7aa924067\") " pod="openstack/nova-cell0-conductor-0" Feb 19 23:47:03 crc kubenswrapper[4795]: I0219 23:47:03.274630 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d27d8041-4940-4cd2-bf9e-02b7aa924067-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"d27d8041-4940-4cd2-bf9e-02b7aa924067\") " pod="openstack/nova-cell0-conductor-0" Feb 19 23:47:03 crc kubenswrapper[4795]: I0219 23:47:03.276155 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d27d8041-4940-4cd2-bf9e-02b7aa924067-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"d27d8041-4940-4cd2-bf9e-02b7aa924067\") " pod="openstack/nova-cell0-conductor-0" Feb 19 23:47:03 crc kubenswrapper[4795]: I0219 23:47:03.286574 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xpls\" (UniqueName: \"kubernetes.io/projected/d27d8041-4940-4cd2-bf9e-02b7aa924067-kube-api-access-6xpls\") pod \"nova-cell0-conductor-0\" (UID: \"d27d8041-4940-4cd2-bf9e-02b7aa924067\") " pod="openstack/nova-cell0-conductor-0" Feb 19 23:47:03 crc kubenswrapper[4795]: I0219 23:47:03.536046 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="761a7217-33fa-4d78-8a05-492cbb33f48d" path="/var/lib/kubelet/pods/761a7217-33fa-4d78-8a05-492cbb33f48d/volumes" Feb 19 23:47:03 crc kubenswrapper[4795]: I0219 23:47:03.536991 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1a9135a-42ef-42ca-880a-f4f5ffd78a13" path="/var/lib/kubelet/pods/f1a9135a-42ef-42ca-880a-f4f5ffd78a13/volumes" Feb 19 23:47:03 crc kubenswrapper[4795]: I0219 23:47:03.548029 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 19 23:47:04 crc kubenswrapper[4795]: I0219 23:47:04.021523 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 19 23:47:04 crc kubenswrapper[4795]: W0219 23:47:04.030804 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd27d8041_4940_4cd2_bf9e_02b7aa924067.slice/crio-01cdf5eb54fb6ab079bfe1c87f56fed5c2e6903145b84ea12364255cd0de0dd9 WatchSource:0}: Error finding container 01cdf5eb54fb6ab079bfe1c87f56fed5c2e6903145b84ea12364255cd0de0dd9: Status 404 returned error can't find the container with id 01cdf5eb54fb6ab079bfe1c87f56fed5c2e6903145b84ea12364255cd0de0dd9 Feb 19 23:47:04 crc kubenswrapper[4795]: I0219 23:47:04.040400 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"0fa49294-8a0c-4d98-a388-067bdce0ac1b","Type":"ContainerStarted","Data":"5f5d0f57bbef303932012df35b0d662bd9583e35a0c826f07fa25fce61b48e82"} Feb 19 23:47:04 crc kubenswrapper[4795]: I0219 23:47:04.040633 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"0fa49294-8a0c-4d98-a388-067bdce0ac1b","Type":"ContainerStarted","Data":"a3b717517b1ba0d636e16f433517f2ea2922b0dd075ee9edc9b8d78b4178b2c6"} Feb 19 23:47:04 crc kubenswrapper[4795]: I0219 23:47:04.041872 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Feb 19 23:47:04 crc kubenswrapper[4795]: I0219 23:47:04.061009 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.060989618 podStartE2EDuration="2.060989618s" podCreationTimestamp="2026-02-19 23:47:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 
23:47:04.059820496 +0000 UTC m=+8335.252338360" watchObservedRunningTime="2026-02-19 23:47:04.060989618 +0000 UTC m=+8335.253507482" Feb 19 23:47:04 crc kubenswrapper[4795]: I0219 23:47:04.089764 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.83:8775/\": read tcp 10.217.0.2:43098->10.217.1.83:8775: read: connection reset by peer" Feb 19 23:47:04 crc kubenswrapper[4795]: I0219 23:47:04.090150 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.83:8775/\": read tcp 10.217.0.2:43096->10.217.1.83:8775: read: connection reset by peer" Feb 19 23:47:04 crc kubenswrapper[4795]: I0219 23:47:04.493850 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 19 23:47:04 crc kubenswrapper[4795]: I0219 23:47:04.596048 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8af13c78-4805-4828-980c-45e1defd94c3-combined-ca-bundle\") pod \"8af13c78-4805-4828-980c-45e1defd94c3\" (UID: \"8af13c78-4805-4828-980c-45e1defd94c3\") " Feb 19 23:47:04 crc kubenswrapper[4795]: I0219 23:47:04.596188 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8af13c78-4805-4828-980c-45e1defd94c3-logs\") pod \"8af13c78-4805-4828-980c-45e1defd94c3\" (UID: \"8af13c78-4805-4828-980c-45e1defd94c3\") " Feb 19 23:47:04 crc kubenswrapper[4795]: I0219 23:47:04.596245 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2xch\" (UniqueName: \"kubernetes.io/projected/8af13c78-4805-4828-980c-45e1defd94c3-kube-api-access-r2xch\") pod \"8af13c78-4805-4828-980c-45e1defd94c3\" (UID: \"8af13c78-4805-4828-980c-45e1defd94c3\") " Feb 19 23:47:04 crc kubenswrapper[4795]: I0219 23:47:04.596279 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8af13c78-4805-4828-980c-45e1defd94c3-config-data\") pod \"8af13c78-4805-4828-980c-45e1defd94c3\" (UID: \"8af13c78-4805-4828-980c-45e1defd94c3\") " Feb 19 23:47:04 crc kubenswrapper[4795]: I0219 23:47:04.598680 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8af13c78-4805-4828-980c-45e1defd94c3-logs" (OuterVolumeSpecName: "logs") pod "8af13c78-4805-4828-980c-45e1defd94c3" (UID: "8af13c78-4805-4828-980c-45e1defd94c3"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:47:04 crc kubenswrapper[4795]: I0219 23:47:04.603670 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8af13c78-4805-4828-980c-45e1defd94c3-kube-api-access-r2xch" (OuterVolumeSpecName: "kube-api-access-r2xch") pod "8af13c78-4805-4828-980c-45e1defd94c3" (UID: "8af13c78-4805-4828-980c-45e1defd94c3"). InnerVolumeSpecName "kube-api-access-r2xch". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:47:04 crc kubenswrapper[4795]: I0219 23:47:04.633826 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8af13c78-4805-4828-980c-45e1defd94c3-config-data" (OuterVolumeSpecName: "config-data") pod "8af13c78-4805-4828-980c-45e1defd94c3" (UID: "8af13c78-4805-4828-980c-45e1defd94c3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:47:04 crc kubenswrapper[4795]: I0219 23:47:04.644552 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8af13c78-4805-4828-980c-45e1defd94c3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8af13c78-4805-4828-980c-45e1defd94c3" (UID: "8af13c78-4805-4828-980c-45e1defd94c3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:47:04 crc kubenswrapper[4795]: I0219 23:47:04.645103 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 23:47:04 crc kubenswrapper[4795]: I0219 23:47:04.701726 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dq6w\" (UniqueName: \"kubernetes.io/projected/88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9-kube-api-access-2dq6w\") pod \"88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9\" (UID: \"88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9\") " Feb 19 23:47:04 crc kubenswrapper[4795]: I0219 23:47:04.701822 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9-logs\") pod \"88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9\" (UID: \"88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9\") " Feb 19 23:47:04 crc kubenswrapper[4795]: I0219 23:47:04.701924 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9-combined-ca-bundle\") pod \"88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9\" (UID: \"88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9\") " Feb 19 23:47:04 crc kubenswrapper[4795]: I0219 23:47:04.702111 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9-config-data\") pod \"88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9\" (UID: \"88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9\") " Feb 19 23:47:04 crc kubenswrapper[4795]: I0219 23:47:04.702609 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9-logs" (OuterVolumeSpecName: "logs") pod "88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9" (UID: "88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:47:04 crc kubenswrapper[4795]: I0219 23:47:04.702641 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8af13c78-4805-4828-980c-45e1defd94c3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:47:04 crc kubenswrapper[4795]: I0219 23:47:04.702655 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8af13c78-4805-4828-980c-45e1defd94c3-logs\") on node \"crc\" DevicePath \"\"" Feb 19 23:47:04 crc kubenswrapper[4795]: I0219 23:47:04.702666 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r2xch\" (UniqueName: \"kubernetes.io/projected/8af13c78-4805-4828-980c-45e1defd94c3-kube-api-access-r2xch\") on node \"crc\" DevicePath \"\"" Feb 19 23:47:04 crc kubenswrapper[4795]: I0219 23:47:04.702675 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8af13c78-4805-4828-980c-45e1defd94c3-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 23:47:04 crc kubenswrapper[4795]: I0219 23:47:04.712851 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9-kube-api-access-2dq6w" (OuterVolumeSpecName: "kube-api-access-2dq6w") pod "88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9" (UID: "88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9"). InnerVolumeSpecName "kube-api-access-2dq6w". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:47:04 crc kubenswrapper[4795]: I0219 23:47:04.747397 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9" (UID: "88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:47:04 crc kubenswrapper[4795]: I0219 23:47:04.765520 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9-config-data" (OuterVolumeSpecName: "config-data") pod "88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9" (UID: "88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:47:04 crc kubenswrapper[4795]: I0219 23:47:04.804020 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 23:47:04 crc kubenswrapper[4795]: I0219 23:47:04.804049 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dq6w\" (UniqueName: \"kubernetes.io/projected/88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9-kube-api-access-2dq6w\") on node \"crc\" DevicePath \"\"" Feb 19 23:47:04 crc kubenswrapper[4795]: I0219 23:47:04.804058 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9-logs\") on node \"crc\" DevicePath \"\"" Feb 19 23:47:04 crc kubenswrapper[4795]: I0219 23:47:04.804066 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.054423 4795 generic.go:334] "Generic (PLEG): container finished" podID="88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9" containerID="fd1e78c709cd03e12813d0773547c4465f0e66a7a0cdec8f42bc3426e77cff3f" exitCode=0 Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.054492 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.054534 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9","Type":"ContainerDied","Data":"fd1e78c709cd03e12813d0773547c4465f0e66a7a0cdec8f42bc3426e77cff3f"} Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.058305 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9","Type":"ContainerDied","Data":"df7150bd3c379f19d4f935bb8e348093119703bff34dd1ad6781416721057a60"} Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.058337 4795 scope.go:117] "RemoveContainer" containerID="fd1e78c709cd03e12813d0773547c4465f0e66a7a0cdec8f42bc3426e77cff3f" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.060760 4795 generic.go:334] "Generic (PLEG): container finished" podID="8af13c78-4805-4828-980c-45e1defd94c3" containerID="d925bab645dff9712b2a171f2f4482d18d0a0ee4391b990a9a7a705d61ad3def" exitCode=0 Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.060808 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.060847 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8af13c78-4805-4828-980c-45e1defd94c3","Type":"ContainerDied","Data":"d925bab645dff9712b2a171f2f4482d18d0a0ee4391b990a9a7a705d61ad3def"} Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.060880 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8af13c78-4805-4828-980c-45e1defd94c3","Type":"ContainerDied","Data":"7ee07f55b63d65c8ea13f8ca8377dd262c2422aff92a6a2abfe47d6fef72c015"} Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.064255 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"d27d8041-4940-4cd2-bf9e-02b7aa924067","Type":"ContainerStarted","Data":"fe60ce54dc10f3166342ae8d794a6371165923563cbaf10fe016273bac89a75e"} Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.064297 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.064308 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"d27d8041-4940-4cd2-bf9e-02b7aa924067","Type":"ContainerStarted","Data":"01cdf5eb54fb6ab079bfe1c87f56fed5c2e6903145b84ea12364255cd0de0dd9"} Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.086464 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.086446035 podStartE2EDuration="2.086446035s" podCreationTimestamp="2026-02-19 23:47:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:47:05.077462717 +0000 UTC m=+8336.269980581" watchObservedRunningTime="2026-02-19 23:47:05.086446035 +0000 UTC m=+8336.278963899" Feb 19 
23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.205634 4795 scope.go:117] "RemoveContainer" containerID="aa2cfffccfdef39a961925828c69395a846784ec188eaf37747eb10bd8158d2b" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.236637 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.249802 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.266076 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 19 23:47:05 crc kubenswrapper[4795]: E0219 23:47:05.266587 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9" containerName="nova-metadata-log" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.266612 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9" containerName="nova-metadata-log" Feb 19 23:47:05 crc kubenswrapper[4795]: E0219 23:47:05.266632 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8af13c78-4805-4828-980c-45e1defd94c3" containerName="nova-api-log" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.266641 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="8af13c78-4805-4828-980c-45e1defd94c3" containerName="nova-api-log" Feb 19 23:47:05 crc kubenswrapper[4795]: E0219 23:47:05.266661 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8af13c78-4805-4828-980c-45e1defd94c3" containerName="nova-api-api" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.266667 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="8af13c78-4805-4828-980c-45e1defd94c3" containerName="nova-api-api" Feb 19 23:47:05 crc kubenswrapper[4795]: E0219 23:47:05.266678 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9" 
containerName="nova-metadata-metadata" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.266684 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9" containerName="nova-metadata-metadata" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.266911 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="8af13c78-4805-4828-980c-45e1defd94c3" containerName="nova-api-api" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.266934 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="8af13c78-4805-4828-980c-45e1defd94c3" containerName="nova-api-log" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.266957 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9" containerName="nova-metadata-log" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.266966 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9" containerName="nova-metadata-metadata" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.269234 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.282059 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.292036 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.315672 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/644df2e5-37fd-468b-9e52-316d44e65f69-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"644df2e5-37fd-468b-9e52-316d44e65f69\") " pod="openstack/nova-api-0" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.316125 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5s2s\" (UniqueName: \"kubernetes.io/projected/644df2e5-37fd-468b-9e52-316d44e65f69-kube-api-access-q5s2s\") pod \"nova-api-0\" (UID: \"644df2e5-37fd-468b-9e52-316d44e65f69\") " pod="openstack/nova-api-0" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.316191 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/644df2e5-37fd-468b-9e52-316d44e65f69-logs\") pod \"nova-api-0\" (UID: \"644df2e5-37fd-468b-9e52-316d44e65f69\") " pod="openstack/nova-api-0" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.316231 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/644df2e5-37fd-468b-9e52-316d44e65f69-config-data\") pod \"nova-api-0\" (UID: \"644df2e5-37fd-468b-9e52-316d44e65f69\") " pod="openstack/nova-api-0" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.319824 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] 
Feb 19 23:47:05 crc kubenswrapper[4795]: E0219 23:47:05.345537 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1232c55ba259e6d6fbb743a58698f5bf867025931147d0f23b6c47a0784bdaa2 is running failed: container process not found" containerID="1232c55ba259e6d6fbb743a58698f5bf867025931147d0f23b6c47a0784bdaa2" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 19 23:47:05 crc kubenswrapper[4795]: E0219 23:47:05.347306 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1232c55ba259e6d6fbb743a58698f5bf867025931147d0f23b6c47a0784bdaa2 is running failed: container process not found" containerID="1232c55ba259e6d6fbb743a58698f5bf867025931147d0f23b6c47a0784bdaa2" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 19 23:47:05 crc kubenswrapper[4795]: E0219 23:47:05.360313 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1232c55ba259e6d6fbb743a58698f5bf867025931147d0f23b6c47a0784bdaa2 is running failed: container process not found" containerID="1232c55ba259e6d6fbb743a58698f5bf867025931147d0f23b6c47a0784bdaa2" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 19 23:47:05 crc kubenswrapper[4795]: E0219 23:47:05.360388 4795 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1232c55ba259e6d6fbb743a58698f5bf867025931147d0f23b6c47a0784bdaa2 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="2eb28a2e-eb12-4867-9c26-3416349cc1cc" containerName="nova-scheduler-scheduler" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.360684 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 19 23:47:05 crc 
kubenswrapper[4795]: I0219 23:47:05.377490 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.385632 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.387371 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.388345 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.401148 4795 scope.go:117] "RemoveContainer" containerID="fd1e78c709cd03e12813d0773547c4465f0e66a7a0cdec8f42bc3426e77cff3f" Feb 19 23:47:05 crc kubenswrapper[4795]: E0219 23:47:05.402284 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd1e78c709cd03e12813d0773547c4465f0e66a7a0cdec8f42bc3426e77cff3f\": container with ID starting with fd1e78c709cd03e12813d0773547c4465f0e66a7a0cdec8f42bc3426e77cff3f not found: ID does not exist" containerID="fd1e78c709cd03e12813d0773547c4465f0e66a7a0cdec8f42bc3426e77cff3f" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.402311 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd1e78c709cd03e12813d0773547c4465f0e66a7a0cdec8f42bc3426e77cff3f"} err="failed to get container status \"fd1e78c709cd03e12813d0773547c4465f0e66a7a0cdec8f42bc3426e77cff3f\": rpc error: code = NotFound desc = could not find container \"fd1e78c709cd03e12813d0773547c4465f0e66a7a0cdec8f42bc3426e77cff3f\": container with ID starting with fd1e78c709cd03e12813d0773547c4465f0e66a7a0cdec8f42bc3426e77cff3f not found: ID does not exist" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.402332 4795 scope.go:117] "RemoveContainer" 
containerID="aa2cfffccfdef39a961925828c69395a846784ec188eaf37747eb10bd8158d2b" Feb 19 23:47:05 crc kubenswrapper[4795]: E0219 23:47:05.405791 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa2cfffccfdef39a961925828c69395a846784ec188eaf37747eb10bd8158d2b\": container with ID starting with aa2cfffccfdef39a961925828c69395a846784ec188eaf37747eb10bd8158d2b not found: ID does not exist" containerID="aa2cfffccfdef39a961925828c69395a846784ec188eaf37747eb10bd8158d2b" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.405826 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa2cfffccfdef39a961925828c69395a846784ec188eaf37747eb10bd8158d2b"} err="failed to get container status \"aa2cfffccfdef39a961925828c69395a846784ec188eaf37747eb10bd8158d2b\": rpc error: code = NotFound desc = could not find container \"aa2cfffccfdef39a961925828c69395a846784ec188eaf37747eb10bd8158d2b\": container with ID starting with aa2cfffccfdef39a961925828c69395a846784ec188eaf37747eb10bd8158d2b not found: ID does not exist" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.405850 4795 scope.go:117] "RemoveContainer" containerID="d925bab645dff9712b2a171f2f4482d18d0a0ee4391b990a9a7a705d61ad3def" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.417990 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/644df2e5-37fd-468b-9e52-316d44e65f69-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"644df2e5-37fd-468b-9e52-316d44e65f69\") " pod="openstack/nova-api-0" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.418048 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5s2s\" (UniqueName: \"kubernetes.io/projected/644df2e5-37fd-468b-9e52-316d44e65f69-kube-api-access-q5s2s\") pod \"nova-api-0\" (UID: 
\"644df2e5-37fd-468b-9e52-316d44e65f69\") " pod="openstack/nova-api-0" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.418093 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/644df2e5-37fd-468b-9e52-316d44e65f69-logs\") pod \"nova-api-0\" (UID: \"644df2e5-37fd-468b-9e52-316d44e65f69\") " pod="openstack/nova-api-0" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.418131 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/644df2e5-37fd-468b-9e52-316d44e65f69-config-data\") pod \"nova-api-0\" (UID: \"644df2e5-37fd-468b-9e52-316d44e65f69\") " pod="openstack/nova-api-0" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.418785 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/644df2e5-37fd-468b-9e52-316d44e65f69-logs\") pod \"nova-api-0\" (UID: \"644df2e5-37fd-468b-9e52-316d44e65f69\") " pod="openstack/nova-api-0" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.434780 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/644df2e5-37fd-468b-9e52-316d44e65f69-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"644df2e5-37fd-468b-9e52-316d44e65f69\") " pod="openstack/nova-api-0" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.435222 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/644df2e5-37fd-468b-9e52-316d44e65f69-config-data\") pod \"nova-api-0\" (UID: \"644df2e5-37fd-468b-9e52-316d44e65f69\") " pod="openstack/nova-api-0" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.438861 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5s2s\" (UniqueName: 
\"kubernetes.io/projected/644df2e5-37fd-468b-9e52-316d44e65f69-kube-api-access-q5s2s\") pod \"nova-api-0\" (UID: \"644df2e5-37fd-468b-9e52-316d44e65f69\") " pod="openstack/nova-api-0" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.449891 4795 scope.go:117] "RemoveContainer" containerID="ad930281e03b5e1a65091cba120348431f474125eb5bc690c29d8ca4f3247c21" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.519940 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d15a66bd-d8e7-4ad0-a8bc-7575a218f50c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d15a66bd-d8e7-4ad0-a8bc-7575a218f50c\") " pod="openstack/nova-metadata-0" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.520065 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d15a66bd-d8e7-4ad0-a8bc-7575a218f50c-config-data\") pod \"nova-metadata-0\" (UID: \"d15a66bd-d8e7-4ad0-a8bc-7575a218f50c\") " pod="openstack/nova-metadata-0" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.520103 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d15a66bd-d8e7-4ad0-a8bc-7575a218f50c-logs\") pod \"nova-metadata-0\" (UID: \"d15a66bd-d8e7-4ad0-a8bc-7575a218f50c\") " pod="openstack/nova-metadata-0" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.520130 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmpsv\" (UniqueName: \"kubernetes.io/projected/d15a66bd-d8e7-4ad0-a8bc-7575a218f50c-kube-api-access-tmpsv\") pod \"nova-metadata-0\" (UID: \"d15a66bd-d8e7-4ad0-a8bc-7575a218f50c\") " pod="openstack/nova-metadata-0" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.533492 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9" path="/var/lib/kubelet/pods/88b2e1dc-e56a-4d7d-9e9e-d57d4b0d1af9/volumes" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.537513 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8af13c78-4805-4828-980c-45e1defd94c3" path="/var/lib/kubelet/pods/8af13c78-4805-4828-980c-45e1defd94c3/volumes" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.583989 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.590900 4795 scope.go:117] "RemoveContainer" containerID="d925bab645dff9712b2a171f2f4482d18d0a0ee4391b990a9a7a705d61ad3def" Feb 19 23:47:05 crc kubenswrapper[4795]: E0219 23:47:05.591435 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d925bab645dff9712b2a171f2f4482d18d0a0ee4391b990a9a7a705d61ad3def\": container with ID starting with d925bab645dff9712b2a171f2f4482d18d0a0ee4391b990a9a7a705d61ad3def not found: ID does not exist" containerID="d925bab645dff9712b2a171f2f4482d18d0a0ee4391b990a9a7a705d61ad3def" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.591486 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d925bab645dff9712b2a171f2f4482d18d0a0ee4391b990a9a7a705d61ad3def"} err="failed to get container status \"d925bab645dff9712b2a171f2f4482d18d0a0ee4391b990a9a7a705d61ad3def\": rpc error: code = NotFound desc = could not find container \"d925bab645dff9712b2a171f2f4482d18d0a0ee4391b990a9a7a705d61ad3def\": container with ID starting with d925bab645dff9712b2a171f2f4482d18d0a0ee4391b990a9a7a705d61ad3def not found: ID does not exist" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.591513 4795 scope.go:117] "RemoveContainer" containerID="ad930281e03b5e1a65091cba120348431f474125eb5bc690c29d8ca4f3247c21" Feb 19 23:47:05 crc 
kubenswrapper[4795]: E0219 23:47:05.591948 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad930281e03b5e1a65091cba120348431f474125eb5bc690c29d8ca4f3247c21\": container with ID starting with ad930281e03b5e1a65091cba120348431f474125eb5bc690c29d8ca4f3247c21 not found: ID does not exist" containerID="ad930281e03b5e1a65091cba120348431f474125eb5bc690c29d8ca4f3247c21" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.591973 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad930281e03b5e1a65091cba120348431f474125eb5bc690c29d8ca4f3247c21"} err="failed to get container status \"ad930281e03b5e1a65091cba120348431f474125eb5bc690c29d8ca4f3247c21\": rpc error: code = NotFound desc = could not find container \"ad930281e03b5e1a65091cba120348431f474125eb5bc690c29d8ca4f3247c21\": container with ID starting with ad930281e03b5e1a65091cba120348431f474125eb5bc690c29d8ca4f3247c21 not found: ID does not exist" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.594943 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h"] Feb 19 23:47:05 crc kubenswrapper[4795]: E0219 23:47:05.595597 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2eb28a2e-eb12-4867-9c26-3416349cc1cc" containerName="nova-scheduler-scheduler" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.595669 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2eb28a2e-eb12-4867-9c26-3416349cc1cc" containerName="nova-scheduler-scheduler" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.595912 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="2eb28a2e-eb12-4867-9c26-3416349cc1cc" containerName="nova-scheduler-scheduler" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.597506 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.602658 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.603563 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.604357 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.604542 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.604779 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-brdnv" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.609609 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-cells-global-config" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.609761 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.617642 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.639711 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2eb28a2e-eb12-4867-9c26-3416349cc1cc-combined-ca-bundle\") pod \"2eb28a2e-eb12-4867-9c26-3416349cc1cc\" (UID: \"2eb28a2e-eb12-4867-9c26-3416349cc1cc\") " Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.641444 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dpxs\" (UniqueName: \"kubernetes.io/projected/2eb28a2e-eb12-4867-9c26-3416349cc1cc-kube-api-access-4dpxs\") pod \"2eb28a2e-eb12-4867-9c26-3416349cc1cc\" (UID: \"2eb28a2e-eb12-4867-9c26-3416349cc1cc\") " Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.641563 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2eb28a2e-eb12-4867-9c26-3416349cc1cc-config-data\") pod \"2eb28a2e-eb12-4867-9c26-3416349cc1cc\" (UID: \"2eb28a2e-eb12-4867-9c26-3416349cc1cc\") " Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.642273 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d15a66bd-d8e7-4ad0-a8bc-7575a218f50c-config-data\") pod \"nova-metadata-0\" (UID: \"d15a66bd-d8e7-4ad0-a8bc-7575a218f50c\") " pod="openstack/nova-metadata-0" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.645150 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d15a66bd-d8e7-4ad0-a8bc-7575a218f50c-logs\") pod \"nova-metadata-0\" (UID: \"d15a66bd-d8e7-4ad0-a8bc-7575a218f50c\") " pod="openstack/nova-metadata-0" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.642986 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h"] Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.646941 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d15a66bd-d8e7-4ad0-a8bc-7575a218f50c-logs\") pod \"nova-metadata-0\" (UID: \"d15a66bd-d8e7-4ad0-a8bc-7575a218f50c\") " pod="openstack/nova-metadata-0" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.648299 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmpsv\" (UniqueName: \"kubernetes.io/projected/d15a66bd-d8e7-4ad0-a8bc-7575a218f50c-kube-api-access-tmpsv\") pod \"nova-metadata-0\" (UID: \"d15a66bd-d8e7-4ad0-a8bc-7575a218f50c\") " pod="openstack/nova-metadata-0" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.648708 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2eb28a2e-eb12-4867-9c26-3416349cc1cc-kube-api-access-4dpxs" (OuterVolumeSpecName: "kube-api-access-4dpxs") pod "2eb28a2e-eb12-4867-9c26-3416349cc1cc" (UID: "2eb28a2e-eb12-4867-9c26-3416349cc1cc"). InnerVolumeSpecName "kube-api-access-4dpxs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.649236 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d15a66bd-d8e7-4ad0-a8bc-7575a218f50c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d15a66bd-d8e7-4ad0-a8bc-7575a218f50c\") " pod="openstack/nova-metadata-0" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.649710 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4dpxs\" (UniqueName: \"kubernetes.io/projected/2eb28a2e-eb12-4867-9c26-3416349cc1cc-kube-api-access-4dpxs\") on node \"crc\" DevicePath \"\"" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.653346 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d15a66bd-d8e7-4ad0-a8bc-7575a218f50c-config-data\") pod \"nova-metadata-0\" (UID: \"d15a66bd-d8e7-4ad0-a8bc-7575a218f50c\") " pod="openstack/nova-metadata-0" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.669183 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmpsv\" (UniqueName: \"kubernetes.io/projected/d15a66bd-d8e7-4ad0-a8bc-7575a218f50c-kube-api-access-tmpsv\") pod \"nova-metadata-0\" (UID: \"d15a66bd-d8e7-4ad0-a8bc-7575a218f50c\") " pod="openstack/nova-metadata-0" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.669324 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d15a66bd-d8e7-4ad0-a8bc-7575a218f50c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d15a66bd-d8e7-4ad0-a8bc-7575a218f50c\") " pod="openstack/nova-metadata-0" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.693812 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2eb28a2e-eb12-4867-9c26-3416349cc1cc-combined-ca-bundle" 
(OuterVolumeSpecName: "combined-ca-bundle") pod "2eb28a2e-eb12-4867-9c26-3416349cc1cc" (UID: "2eb28a2e-eb12-4867-9c26-3416349cc1cc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.706827 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2eb28a2e-eb12-4867-9c26-3416349cc1cc-config-data" (OuterVolumeSpecName: "config-data") pod "2eb28a2e-eb12-4867-9c26-3416349cc1cc" (UID: "2eb28a2e-eb12-4867-9c26-3416349cc1cc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.730828 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.784702 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.785041 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.785090 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"inventory\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.785131 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.785351 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.785400 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.785466 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-ceph\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.785533 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.785636 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.785754 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.785788 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.785859 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.785897 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwhdv\" (UniqueName: \"kubernetes.io/projected/59981ca7-620e-4025-b165-4f54f920e8f2-kube-api-access-jwhdv\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.785984 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2eb28a2e-eb12-4867-9c26-3416349cc1cc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.786000 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2eb28a2e-eb12-4867-9c26-3416349cc1cc-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.889702 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" 
(UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.889767 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.889896 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.889913 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwhdv\" (UniqueName: \"kubernetes.io/projected/59981ca7-620e-4025-b165-4f54f920e8f2-kube-api-access-jwhdv\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.890041 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: 
\"kubernetes.io/configmap/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.890083 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.890118 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.890153 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.890206 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-migration-ssh-key-0\") pod 
\"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.890233 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.890278 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-ceph\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.890336 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.890379 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " 
pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.891103 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.892086 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.897157 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.899850 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:05 
crc kubenswrapper[4795]: I0219 23:47:05.902742 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.903705 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.903785 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.903884 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.904225 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.906826 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.907566 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-ceph\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.909485 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.915004 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwhdv\" (UniqueName: \"kubernetes.io/projected/59981ca7-620e-4025-b165-4f54f920e8f2-kube-api-access-jwhdv\") pod 
\"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:05 crc kubenswrapper[4795]: I0219 23:47:05.926805 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:47:06 crc kubenswrapper[4795]: I0219 23:47:06.101915 4795 generic.go:334] "Generic (PLEG): container finished" podID="2eb28a2e-eb12-4867-9c26-3416349cc1cc" containerID="1232c55ba259e6d6fbb743a58698f5bf867025931147d0f23b6c47a0784bdaa2" exitCode=0 Feb 19 23:47:06 crc kubenswrapper[4795]: I0219 23:47:06.102217 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2eb28a2e-eb12-4867-9c26-3416349cc1cc","Type":"ContainerDied","Data":"1232c55ba259e6d6fbb743a58698f5bf867025931147d0f23b6c47a0784bdaa2"} Feb 19 23:47:06 crc kubenswrapper[4795]: I0219 23:47:06.102245 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2eb28a2e-eb12-4867-9c26-3416349cc1cc","Type":"ContainerDied","Data":"93528f93304bf625c4781fad623cd7f9e1b6953a05d729d20b96627d846cf536"} Feb 19 23:47:06 crc kubenswrapper[4795]: I0219 23:47:06.102260 4795 scope.go:117] "RemoveContainer" containerID="1232c55ba259e6d6fbb743a58698f5bf867025931147d0f23b6c47a0784bdaa2" Feb 19 23:47:06 crc kubenswrapper[4795]: I0219 23:47:06.102401 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 23:47:06 crc kubenswrapper[4795]: I0219 23:47:06.148151 4795 scope.go:117] "RemoveContainer" containerID="1232c55ba259e6d6fbb743a58698f5bf867025931147d0f23b6c47a0784bdaa2" Feb 19 23:47:06 crc kubenswrapper[4795]: E0219 23:47:06.151635 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1232c55ba259e6d6fbb743a58698f5bf867025931147d0f23b6c47a0784bdaa2\": container with ID starting with 1232c55ba259e6d6fbb743a58698f5bf867025931147d0f23b6c47a0784bdaa2 not found: ID does not exist" containerID="1232c55ba259e6d6fbb743a58698f5bf867025931147d0f23b6c47a0784bdaa2" Feb 19 23:47:06 crc kubenswrapper[4795]: I0219 23:47:06.151682 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1232c55ba259e6d6fbb743a58698f5bf867025931147d0f23b6c47a0784bdaa2"} err="failed to get container status \"1232c55ba259e6d6fbb743a58698f5bf867025931147d0f23b6c47a0784bdaa2\": rpc error: code = NotFound desc = could not find container \"1232c55ba259e6d6fbb743a58698f5bf867025931147d0f23b6c47a0784bdaa2\": container with ID starting with 1232c55ba259e6d6fbb743a58698f5bf867025931147d0f23b6c47a0784bdaa2 not found: ID does not exist" Feb 19 23:47:06 crc kubenswrapper[4795]: I0219 23:47:06.184397 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 23:47:06 crc kubenswrapper[4795]: I0219 23:47:06.208779 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 23:47:06 crc kubenswrapper[4795]: I0219 23:47:06.232260 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 23:47:06 crc kubenswrapper[4795]: I0219 23:47:06.233771 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 23:47:06 crc kubenswrapper[4795]: I0219 23:47:06.236377 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 19 23:47:06 crc kubenswrapper[4795]: I0219 23:47:06.262331 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 23:47:06 crc kubenswrapper[4795]: I0219 23:47:06.276442 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 19 23:47:06 crc kubenswrapper[4795]: I0219 23:47:06.316039 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 23:47:06 crc kubenswrapper[4795]: W0219 23:47:06.329861 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd15a66bd_d8e7_4ad0_a8bc_7575a218f50c.slice/crio-269ea3a3cc7c782717513c519defdf6e40bbb4f24b5f8b9cc525bbe3b1585b80 WatchSource:0}: Error finding container 269ea3a3cc7c782717513c519defdf6e40bbb4f24b5f8b9cc525bbe3b1585b80: Status 404 returned error can't find the container with id 269ea3a3cc7c782717513c519defdf6e40bbb4f24b5f8b9cc525bbe3b1585b80 Feb 19 23:47:06 crc kubenswrapper[4795]: I0219 23:47:06.407930 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d16cd452-43cb-42e4-b4af-6de3271d7194-config-data\") pod \"nova-scheduler-0\" (UID: \"d16cd452-43cb-42e4-b4af-6de3271d7194\") " pod="openstack/nova-scheduler-0" Feb 19 23:47:06 crc kubenswrapper[4795]: I0219 23:47:06.408150 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggvmf\" (UniqueName: \"kubernetes.io/projected/d16cd452-43cb-42e4-b4af-6de3271d7194-kube-api-access-ggvmf\") pod \"nova-scheduler-0\" (UID: \"d16cd452-43cb-42e4-b4af-6de3271d7194\") " pod="openstack/nova-scheduler-0" Feb 19 
23:47:06 crc kubenswrapper[4795]: I0219 23:47:06.408297 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d16cd452-43cb-42e4-b4af-6de3271d7194-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d16cd452-43cb-42e4-b4af-6de3271d7194\") " pod="openstack/nova-scheduler-0" Feb 19 23:47:06 crc kubenswrapper[4795]: I0219 23:47:06.509687 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggvmf\" (UniqueName: \"kubernetes.io/projected/d16cd452-43cb-42e4-b4af-6de3271d7194-kube-api-access-ggvmf\") pod \"nova-scheduler-0\" (UID: \"d16cd452-43cb-42e4-b4af-6de3271d7194\") " pod="openstack/nova-scheduler-0" Feb 19 23:47:06 crc kubenswrapper[4795]: I0219 23:47:06.509768 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d16cd452-43cb-42e4-b4af-6de3271d7194-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d16cd452-43cb-42e4-b4af-6de3271d7194\") " pod="openstack/nova-scheduler-0" Feb 19 23:47:06 crc kubenswrapper[4795]: I0219 23:47:06.509838 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d16cd452-43cb-42e4-b4af-6de3271d7194-config-data\") pod \"nova-scheduler-0\" (UID: \"d16cd452-43cb-42e4-b4af-6de3271d7194\") " pod="openstack/nova-scheduler-0" Feb 19 23:47:06 crc kubenswrapper[4795]: I0219 23:47:06.519961 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d16cd452-43cb-42e4-b4af-6de3271d7194-config-data\") pod \"nova-scheduler-0\" (UID: \"d16cd452-43cb-42e4-b4af-6de3271d7194\") " pod="openstack/nova-scheduler-0" Feb 19 23:47:06 crc kubenswrapper[4795]: I0219 23:47:06.520122 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/d16cd452-43cb-42e4-b4af-6de3271d7194-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d16cd452-43cb-42e4-b4af-6de3271d7194\") " pod="openstack/nova-scheduler-0" Feb 19 23:47:06 crc kubenswrapper[4795]: I0219 23:47:06.527880 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggvmf\" (UniqueName: \"kubernetes.io/projected/d16cd452-43cb-42e4-b4af-6de3271d7194-kube-api-access-ggvmf\") pod \"nova-scheduler-0\" (UID: \"d16cd452-43cb-42e4-b4af-6de3271d7194\") " pod="openstack/nova-scheduler-0" Feb 19 23:47:06 crc kubenswrapper[4795]: I0219 23:47:06.571624 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h"] Feb 19 23:47:06 crc kubenswrapper[4795]: I0219 23:47:06.723702 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 23:47:07 crc kubenswrapper[4795]: I0219 23:47:07.091837 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 23:47:07 crc kubenswrapper[4795]: I0219 23:47:07.141619 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"644df2e5-37fd-468b-9e52-316d44e65f69","Type":"ContainerStarted","Data":"0f5fd6cdf83515d2353737625145defcce8f805453e8e49904eda86c74087d38"} Feb 19 23:47:07 crc kubenswrapper[4795]: I0219 23:47:07.141671 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"644df2e5-37fd-468b-9e52-316d44e65f69","Type":"ContainerStarted","Data":"2df44d43566b44017947ac4a40886bbe921efeaa0dfe870b97a2200c128147a8"} Feb 19 23:47:07 crc kubenswrapper[4795]: I0219 23:47:07.141685 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"644df2e5-37fd-468b-9e52-316d44e65f69","Type":"ContainerStarted","Data":"f310f9961bfa7e30dccb9b463be3fd81c2a95eff37d9a32e545f107088030354"} Feb 19 
23:47:07 crc kubenswrapper[4795]: I0219 23:47:07.146698 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d16cd452-43cb-42e4-b4af-6de3271d7194","Type":"ContainerStarted","Data":"bb9261dd0e2bd9d5c273788bbe975558f6574c3d54d777e4a936c18b717d1be3"} Feb 19 23:47:07 crc kubenswrapper[4795]: I0219 23:47:07.165157 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" event={"ID":"59981ca7-620e-4025-b165-4f54f920e8f2","Type":"ContainerStarted","Data":"775470b3cc1f8cc005107bb43cf5801c2166fca168c3fce4a4df488ec55ed38a"} Feb 19 23:47:07 crc kubenswrapper[4795]: I0219 23:47:07.172590 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.172571975 podStartE2EDuration="2.172571975s" podCreationTimestamp="2026-02-19 23:47:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:47:07.168643407 +0000 UTC m=+8338.361161271" watchObservedRunningTime="2026-02-19 23:47:07.172571975 +0000 UTC m=+8338.365089839" Feb 19 23:47:07 crc kubenswrapper[4795]: I0219 23:47:07.187296 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d15a66bd-d8e7-4ad0-a8bc-7575a218f50c","Type":"ContainerStarted","Data":"375b22bf9ffd8064d8642f2b8d04f6a411a967b3310e411f925cc15f15fc29e4"} Feb 19 23:47:07 crc kubenswrapper[4795]: I0219 23:47:07.187356 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d15a66bd-d8e7-4ad0-a8bc-7575a218f50c","Type":"ContainerStarted","Data":"581ca652c9580fd55c1e58d31ee1935d33bc98270dac22244ea7159db91b1f7c"} Feb 19 23:47:07 crc kubenswrapper[4795]: I0219 23:47:07.187368 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"d15a66bd-d8e7-4ad0-a8bc-7575a218f50c","Type":"ContainerStarted","Data":"269ea3a3cc7c782717513c519defdf6e40bbb4f24b5f8b9cc525bbe3b1585b80"} Feb 19 23:47:07 crc kubenswrapper[4795]: I0219 23:47:07.205106 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.205087055 podStartE2EDuration="2.205087055s" podCreationTimestamp="2026-02-19 23:47:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:47:07.203362877 +0000 UTC m=+8338.395880751" watchObservedRunningTime="2026-02-19 23:47:07.205087055 +0000 UTC m=+8338.397604919" Feb 19 23:47:07 crc kubenswrapper[4795]: I0219 23:47:07.551716 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2eb28a2e-eb12-4867-9c26-3416349cc1cc" path="/var/lib/kubelet/pods/2eb28a2e-eb12-4867-9c26-3416349cc1cc/volumes" Feb 19 23:47:08 crc kubenswrapper[4795]: I0219 23:47:08.206104 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d16cd452-43cb-42e4-b4af-6de3271d7194","Type":"ContainerStarted","Data":"435a833fa4fef5d0a6c7a1dbfb25df6372e9d725b8d48ca4e2085b2c4259562a"} Feb 19 23:47:08 crc kubenswrapper[4795]: I0219 23:47:08.210432 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" event={"ID":"59981ca7-620e-4025-b165-4f54f920e8f2","Type":"ContainerStarted","Data":"a825f4508bbcb5fe04299d8794a0d2181819f8e6cdbaee23877c97437e821ffd"} Feb 19 23:47:08 crc kubenswrapper[4795]: I0219 23:47:08.236855 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.236828937 podStartE2EDuration="2.236828937s" podCreationTimestamp="2026-02-19 23:47:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2026-02-19 23:47:08.224329891 +0000 UTC m=+8339.416847745" watchObservedRunningTime="2026-02-19 23:47:08.236828937 +0000 UTC m=+8339.429346801" Feb 19 23:47:08 crc kubenswrapper[4795]: I0219 23:47:08.249994 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" podStartSLOduration=2.8040324500000002 podStartE2EDuration="3.24997587s" podCreationTimestamp="2026-02-19 23:47:05 +0000 UTC" firstStartedPulling="2026-02-19 23:47:06.586369164 +0000 UTC m=+8337.778887028" lastFinishedPulling="2026-02-19 23:47:07.032312594 +0000 UTC m=+8338.224830448" observedRunningTime="2026-02-19 23:47:08.248560381 +0000 UTC m=+8339.441078245" watchObservedRunningTime="2026-02-19 23:47:08.24997587 +0000 UTC m=+8339.442493734" Feb 19 23:47:10 crc kubenswrapper[4795]: I0219 23:47:10.732683 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 19 23:47:10 crc kubenswrapper[4795]: I0219 23:47:10.734085 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 19 23:47:11 crc kubenswrapper[4795]: I0219 23:47:11.724661 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 19 23:47:12 crc kubenswrapper[4795]: I0219 23:47:12.519099 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Feb 19 23:47:13 crc kubenswrapper[4795]: I0219 23:47:13.575120 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Feb 19 23:47:15 crc kubenswrapper[4795]: I0219 23:47:15.618190 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 19 23:47:15 crc kubenswrapper[4795]: I0219 23:47:15.618823 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/nova-api-0" Feb 19 23:47:15 crc kubenswrapper[4795]: I0219 23:47:15.733138 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 19 23:47:15 crc kubenswrapper[4795]: I0219 23:47:15.733232 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 19 23:47:16 crc kubenswrapper[4795]: I0219 23:47:16.701516 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="644df2e5-37fd-468b-9e52-316d44e65f69" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.190:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 23:47:16 crc kubenswrapper[4795]: I0219 23:47:16.701548 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="644df2e5-37fd-468b-9e52-316d44e65f69" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.190:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 23:47:16 crc kubenswrapper[4795]: I0219 23:47:16.724617 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 19 23:47:16 crc kubenswrapper[4795]: I0219 23:47:16.774559 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="d15a66bd-d8e7-4ad0-a8bc-7575a218f50c" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.191:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 23:47:16 crc kubenswrapper[4795]: I0219 23:47:16.798903 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 19 23:47:16 crc kubenswrapper[4795]: I0219 23:47:16.815869 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" 
podUID="d15a66bd-d8e7-4ad0-a8bc-7575a218f50c" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.191:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 23:47:17 crc kubenswrapper[4795]: I0219 23:47:17.344818 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 19 23:47:25 crc kubenswrapper[4795]: I0219 23:47:25.622359 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 19 23:47:25 crc kubenswrapper[4795]: I0219 23:47:25.623795 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 19 23:47:25 crc kubenswrapper[4795]: I0219 23:47:25.626848 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 19 23:47:25 crc kubenswrapper[4795]: I0219 23:47:25.628473 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 19 23:47:25 crc kubenswrapper[4795]: I0219 23:47:25.734436 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 19 23:47:25 crc kubenswrapper[4795]: I0219 23:47:25.734598 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 19 23:47:25 crc kubenswrapper[4795]: I0219 23:47:25.736010 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 19 23:47:25 crc kubenswrapper[4795]: I0219 23:47:25.737027 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 19 23:47:26 crc kubenswrapper[4795]: I0219 23:47:26.401120 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 19 23:47:26 crc kubenswrapper[4795]: I0219 23:47:26.407520 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/nova-api-0" Feb 19 23:47:28 crc kubenswrapper[4795]: I0219 23:47:28.427224 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 23:47:28 crc kubenswrapper[4795]: I0219 23:47:28.427618 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 23:47:58 crc kubenswrapper[4795]: I0219 23:47:58.427706 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 23:47:58 crc kubenswrapper[4795]: I0219 23:47:58.428310 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 23:47:58 crc kubenswrapper[4795]: I0219 23:47:58.428377 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" Feb 19 23:47:58 crc kubenswrapper[4795]: I0219 23:47:58.429554 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"8095d75d862e2b693e66115bd0de83928e8a1abe68615b91e580952d99b88f5a"} pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 23:47:58 crc kubenswrapper[4795]: I0219 23:47:58.429648 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" containerID="cri-o://8095d75d862e2b693e66115bd0de83928e8a1abe68615b91e580952d99b88f5a" gracePeriod=600 Feb 19 23:47:58 crc kubenswrapper[4795]: E0219 23:47:58.554198 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:47:58 crc kubenswrapper[4795]: I0219 23:47:58.772744 4795 generic.go:334] "Generic (PLEG): container finished" podID="7591bc58-96f5-486a-8653-0ad93938b019" containerID="8095d75d862e2b693e66115bd0de83928e8a1abe68615b91e580952d99b88f5a" exitCode=0 Feb 19 23:47:58 crc kubenswrapper[4795]: I0219 23:47:58.772819 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerDied","Data":"8095d75d862e2b693e66115bd0de83928e8a1abe68615b91e580952d99b88f5a"} Feb 19 23:47:58 crc kubenswrapper[4795]: I0219 23:47:58.772895 4795 scope.go:117] "RemoveContainer" containerID="9e31f1c0e7be53ce872e54fe0d6436f7bde017185ed0c455c7619a4196124fcf" Feb 19 23:47:58 crc kubenswrapper[4795]: I0219 23:47:58.774015 4795 
scope.go:117] "RemoveContainer" containerID="8095d75d862e2b693e66115bd0de83928e8a1abe68615b91e580952d99b88f5a" Feb 19 23:47:58 crc kubenswrapper[4795]: E0219 23:47:58.774546 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:48:09 crc kubenswrapper[4795]: I0219 23:48:09.530502 4795 scope.go:117] "RemoveContainer" containerID="8095d75d862e2b693e66115bd0de83928e8a1abe68615b91e580952d99b88f5a" Feb 19 23:48:09 crc kubenswrapper[4795]: E0219 23:48:09.532223 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:48:20 crc kubenswrapper[4795]: I0219 23:48:20.512549 4795 scope.go:117] "RemoveContainer" containerID="8095d75d862e2b693e66115bd0de83928e8a1abe68615b91e580952d99b88f5a" Feb 19 23:48:20 crc kubenswrapper[4795]: E0219 23:48:20.513476 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:48:33 crc kubenswrapper[4795]: I0219 
23:48:33.512286 4795 scope.go:117] "RemoveContainer" containerID="8095d75d862e2b693e66115bd0de83928e8a1abe68615b91e580952d99b88f5a" Feb 19 23:48:33 crc kubenswrapper[4795]: E0219 23:48:33.513551 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:48:47 crc kubenswrapper[4795]: I0219 23:48:47.511963 4795 scope.go:117] "RemoveContainer" containerID="8095d75d862e2b693e66115bd0de83928e8a1abe68615b91e580952d99b88f5a" Feb 19 23:48:47 crc kubenswrapper[4795]: E0219 23:48:47.512997 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:48:58 crc kubenswrapper[4795]: I0219 23:48:58.511509 4795 scope.go:117] "RemoveContainer" containerID="8095d75d862e2b693e66115bd0de83928e8a1abe68615b91e580952d99b88f5a" Feb 19 23:48:58 crc kubenswrapper[4795]: E0219 23:48:58.512450 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:49:11 crc 
kubenswrapper[4795]: I0219 23:49:11.511721 4795 scope.go:117] "RemoveContainer" containerID="8095d75d862e2b693e66115bd0de83928e8a1abe68615b91e580952d99b88f5a" Feb 19 23:49:11 crc kubenswrapper[4795]: E0219 23:49:11.513025 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:49:25 crc kubenswrapper[4795]: I0219 23:49:25.512128 4795 scope.go:117] "RemoveContainer" containerID="8095d75d862e2b693e66115bd0de83928e8a1abe68615b91e580952d99b88f5a" Feb 19 23:49:25 crc kubenswrapper[4795]: E0219 23:49:25.512875 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:49:37 crc kubenswrapper[4795]: I0219 23:49:37.511645 4795 scope.go:117] "RemoveContainer" containerID="8095d75d862e2b693e66115bd0de83928e8a1abe68615b91e580952d99b88f5a" Feb 19 23:49:37 crc kubenswrapper[4795]: E0219 23:49:37.512537 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 
19 23:49:49 crc kubenswrapper[4795]: I0219 23:49:49.518227 4795 scope.go:117] "RemoveContainer" containerID="8095d75d862e2b693e66115bd0de83928e8a1abe68615b91e580952d99b88f5a" Feb 19 23:49:49 crc kubenswrapper[4795]: E0219 23:49:49.519011 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:50:04 crc kubenswrapper[4795]: I0219 23:50:04.512593 4795 scope.go:117] "RemoveContainer" containerID="8095d75d862e2b693e66115bd0de83928e8a1abe68615b91e580952d99b88f5a" Feb 19 23:50:04 crc kubenswrapper[4795]: E0219 23:50:04.513379 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:50:18 crc kubenswrapper[4795]: I0219 23:50:18.512313 4795 scope.go:117] "RemoveContainer" containerID="8095d75d862e2b693e66115bd0de83928e8a1abe68615b91e580952d99b88f5a" Feb 19 23:50:18 crc kubenswrapper[4795]: E0219 23:50:18.513119 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" 
podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:50:23 crc kubenswrapper[4795]: I0219 23:50:23.301142 4795 generic.go:334] "Generic (PLEG): container finished" podID="59981ca7-620e-4025-b165-4f54f920e8f2" containerID="a825f4508bbcb5fe04299d8794a0d2181819f8e6cdbaee23877c97437e821ffd" exitCode=0 Feb 19 23:50:23 crc kubenswrapper[4795]: I0219 23:50:23.301261 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" event={"ID":"59981ca7-620e-4025-b165-4f54f920e8f2","Type":"ContainerDied","Data":"a825f4508bbcb5fe04299d8794a0d2181819f8e6cdbaee23877c97437e821ffd"} Feb 19 23:50:24 crc kubenswrapper[4795]: I0219 23:50:24.779438 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:50:24 crc kubenswrapper[4795]: I0219 23:50:24.956644 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwhdv\" (UniqueName: \"kubernetes.io/projected/59981ca7-620e-4025-b165-4f54f920e8f2-kube-api-access-jwhdv\") pod \"59981ca7-620e-4025-b165-4f54f920e8f2\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " Feb 19 23:50:24 crc kubenswrapper[4795]: I0219 23:50:24.956693 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cell1-compute-config-2\") pod \"59981ca7-620e-4025-b165-4f54f920e8f2\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " Feb 19 23:50:24 crc kubenswrapper[4795]: I0219 23:50:24.956725 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cells-global-config-1\") pod \"59981ca7-620e-4025-b165-4f54f920e8f2\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " 
Feb 19 23:50:24 crc kubenswrapper[4795]: I0219 23:50:24.956743 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cells-global-config-0\") pod \"59981ca7-620e-4025-b165-4f54f920e8f2\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " Feb 19 23:50:24 crc kubenswrapper[4795]: I0219 23:50:24.956802 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cell1-combined-ca-bundle\") pod \"59981ca7-620e-4025-b165-4f54f920e8f2\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " Feb 19 23:50:24 crc kubenswrapper[4795]: I0219 23:50:24.956845 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-ssh-key-openstack-cell1\") pod \"59981ca7-620e-4025-b165-4f54f920e8f2\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " Feb 19 23:50:24 crc kubenswrapper[4795]: I0219 23:50:24.956902 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-migration-ssh-key-1\") pod \"59981ca7-620e-4025-b165-4f54f920e8f2\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " Feb 19 23:50:24 crc kubenswrapper[4795]: I0219 23:50:24.957065 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cell1-compute-config-1\") pod \"59981ca7-620e-4025-b165-4f54f920e8f2\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " Feb 19 23:50:24 crc kubenswrapper[4795]: I0219 23:50:24.957112 4795 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cell1-compute-config-0\") pod \"59981ca7-620e-4025-b165-4f54f920e8f2\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " Feb 19 23:50:24 crc kubenswrapper[4795]: I0219 23:50:24.957145 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-ceph\") pod \"59981ca7-620e-4025-b165-4f54f920e8f2\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " Feb 19 23:50:24 crc kubenswrapper[4795]: I0219 23:50:24.957190 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-migration-ssh-key-0\") pod \"59981ca7-620e-4025-b165-4f54f920e8f2\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " Feb 19 23:50:24 crc kubenswrapper[4795]: I0219 23:50:24.957230 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-inventory\") pod \"59981ca7-620e-4025-b165-4f54f920e8f2\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " Feb 19 23:50:24 crc kubenswrapper[4795]: I0219 23:50:24.957268 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cell1-compute-config-3\") pod \"59981ca7-620e-4025-b165-4f54f920e8f2\" (UID: \"59981ca7-620e-4025-b165-4f54f920e8f2\") " Feb 19 23:50:24 crc kubenswrapper[4795]: I0219 23:50:24.962221 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59981ca7-620e-4025-b165-4f54f920e8f2-kube-api-access-jwhdv" (OuterVolumeSpecName: "kube-api-access-jwhdv") pod 
"59981ca7-620e-4025-b165-4f54f920e8f2" (UID: "59981ca7-620e-4025-b165-4f54f920e8f2"). InnerVolumeSpecName "kube-api-access-jwhdv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:50:24 crc kubenswrapper[4795]: I0219 23:50:24.967514 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "59981ca7-620e-4025-b165-4f54f920e8f2" (UID: "59981ca7-620e-4025-b165-4f54f920e8f2"). InnerVolumeSpecName "nova-cell1-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:50:24 crc kubenswrapper[4795]: I0219 23:50:24.976908 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-ceph" (OuterVolumeSpecName: "ceph") pod "59981ca7-620e-4025-b165-4f54f920e8f2" (UID: "59981ca7-620e-4025-b165-4f54f920e8f2"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:50:24 crc kubenswrapper[4795]: I0219 23:50:24.992414 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-inventory" (OuterVolumeSpecName: "inventory") pod "59981ca7-620e-4025-b165-4f54f920e8f2" (UID: "59981ca7-620e-4025-b165-4f54f920e8f2"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:50:25 crc kubenswrapper[4795]: I0219 23:50:25.004989 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cells-global-config-1" (OuterVolumeSpecName: "nova-cells-global-config-1") pod "59981ca7-620e-4025-b165-4f54f920e8f2" (UID: "59981ca7-620e-4025-b165-4f54f920e8f2"). InnerVolumeSpecName "nova-cells-global-config-1". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:50:25 crc kubenswrapper[4795]: I0219 23:50:25.008921 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "59981ca7-620e-4025-b165-4f54f920e8f2" (UID: "59981ca7-620e-4025-b165-4f54f920e8f2"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:50:25 crc kubenswrapper[4795]: I0219 23:50:25.013342 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "59981ca7-620e-4025-b165-4f54f920e8f2" (UID: "59981ca7-620e-4025-b165-4f54f920e8f2"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:50:25 crc kubenswrapper[4795]: I0219 23:50:25.013957 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "59981ca7-620e-4025-b165-4f54f920e8f2" (UID: "59981ca7-620e-4025-b165-4f54f920e8f2"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:50:25 crc kubenswrapper[4795]: I0219 23:50:25.015462 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "59981ca7-620e-4025-b165-4f54f920e8f2" (UID: "59981ca7-620e-4025-b165-4f54f920e8f2"). InnerVolumeSpecName "nova-migration-ssh-key-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:50:25 crc kubenswrapper[4795]: I0219 23:50:25.015870 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "59981ca7-620e-4025-b165-4f54f920e8f2" (UID: "59981ca7-620e-4025-b165-4f54f920e8f2"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:50:25 crc kubenswrapper[4795]: I0219 23:50:25.015994 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "59981ca7-620e-4025-b165-4f54f920e8f2" (UID: "59981ca7-620e-4025-b165-4f54f920e8f2"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:50:25 crc kubenswrapper[4795]: I0219 23:50:25.019563 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "59981ca7-620e-4025-b165-4f54f920e8f2" (UID: "59981ca7-620e-4025-b165-4f54f920e8f2"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:50:25 crc kubenswrapper[4795]: I0219 23:50:25.023495 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "59981ca7-620e-4025-b165-4f54f920e8f2" (UID: "59981ca7-620e-4025-b165-4f54f920e8f2"). InnerVolumeSpecName "nova-cells-global-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 23:50:25 crc kubenswrapper[4795]: I0219 23:50:25.059887 4795 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 23:50:25 crc kubenswrapper[4795]: I0219 23:50:25.060253 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 19 23:50:25 crc kubenswrapper[4795]: I0219 23:50:25.060268 4795 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Feb 19 23:50:25 crc kubenswrapper[4795]: I0219 23:50:25.060280 4795 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Feb 19 23:50:25 crc kubenswrapper[4795]: I0219 23:50:25.060295 4795 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Feb 19 23:50:25 crc kubenswrapper[4795]: I0219 23:50:25.060308 4795 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-ceph\") on node \"crc\" DevicePath \"\"" Feb 19 23:50:25 crc kubenswrapper[4795]: I0219 23:50:25.060320 4795 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-migration-ssh-key-0\") 
on node \"crc\" DevicePath \"\"" Feb 19 23:50:25 crc kubenswrapper[4795]: I0219 23:50:25.060329 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 23:50:25 crc kubenswrapper[4795]: I0219 23:50:25.060337 4795 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Feb 19 23:50:25 crc kubenswrapper[4795]: I0219 23:50:25.060347 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwhdv\" (UniqueName: \"kubernetes.io/projected/59981ca7-620e-4025-b165-4f54f920e8f2-kube-api-access-jwhdv\") on node \"crc\" DevicePath \"\"" Feb 19 23:50:25 crc kubenswrapper[4795]: I0219 23:50:25.060355 4795 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Feb 19 23:50:25 crc kubenswrapper[4795]: I0219 23:50:25.060366 4795 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cells-global-config-1\") on node \"crc\" DevicePath \"\"" Feb 19 23:50:25 crc kubenswrapper[4795]: I0219 23:50:25.060378 4795 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/59981ca7-620e-4025-b165-4f54f920e8f2-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\"" Feb 19 23:50:25 crc kubenswrapper[4795]: I0219 23:50:25.323701 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" 
event={"ID":"59981ca7-620e-4025-b165-4f54f920e8f2","Type":"ContainerDied","Data":"775470b3cc1f8cc005107bb43cf5801c2166fca168c3fce4a4df488ec55ed38a"} Feb 19 23:50:25 crc kubenswrapper[4795]: I0219 23:50:25.323739 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="775470b3cc1f8cc005107bb43cf5801c2166fca168c3fce4a4df488ec55ed38a" Feb 19 23:50:25 crc kubenswrapper[4795]: I0219 23:50:25.323773 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h" Feb 19 23:50:30 crc kubenswrapper[4795]: I0219 23:50:30.512418 4795 scope.go:117] "RemoveContainer" containerID="8095d75d862e2b693e66115bd0de83928e8a1abe68615b91e580952d99b88f5a" Feb 19 23:50:30 crc kubenswrapper[4795]: E0219 23:50:30.513054 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:50:45 crc kubenswrapper[4795]: I0219 23:50:45.512841 4795 scope.go:117] "RemoveContainer" containerID="8095d75d862e2b693e66115bd0de83928e8a1abe68615b91e580952d99b88f5a" Feb 19 23:50:45 crc kubenswrapper[4795]: E0219 23:50:45.514922 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:50:58 crc kubenswrapper[4795]: I0219 23:50:58.511838 
4795 scope.go:117] "RemoveContainer" containerID="8095d75d862e2b693e66115bd0de83928e8a1abe68615b91e580952d99b88f5a" Feb 19 23:50:58 crc kubenswrapper[4795]: E0219 23:50:58.512692 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:51:10 crc kubenswrapper[4795]: I0219 23:51:10.513284 4795 scope.go:117] "RemoveContainer" containerID="8095d75d862e2b693e66115bd0de83928e8a1abe68615b91e580952d99b88f5a" Feb 19 23:51:10 crc kubenswrapper[4795]: E0219 23:51:10.514372 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:51:18 crc kubenswrapper[4795]: I0219 23:51:18.748586 4795 trace.go:236] Trace[1281311727]: "Calculate volume metrics of ovn-data for pod openstack/ovn-copy-data" (19-Feb-2026 23:51:17.717) (total time: 1030ms): Feb 19 23:51:18 crc kubenswrapper[4795]: Trace[1281311727]: [1.030662002s] [1.030662002s] END Feb 19 23:51:25 crc kubenswrapper[4795]: I0219 23:51:25.514121 4795 scope.go:117] "RemoveContainer" containerID="8095d75d862e2b693e66115bd0de83928e8a1abe68615b91e580952d99b88f5a" Feb 19 23:51:25 crc kubenswrapper[4795]: E0219 23:51:25.515102 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:51:37 crc kubenswrapper[4795]: I0219 23:51:37.512251 4795 scope.go:117] "RemoveContainer" containerID="8095d75d862e2b693e66115bd0de83928e8a1abe68615b91e580952d99b88f5a" Feb 19 23:51:37 crc kubenswrapper[4795]: E0219 23:51:37.513660 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:51:49 crc kubenswrapper[4795]: I0219 23:51:49.520875 4795 scope.go:117] "RemoveContainer" containerID="8095d75d862e2b693e66115bd0de83928e8a1abe68615b91e580952d99b88f5a" Feb 19 23:51:49 crc kubenswrapper[4795]: E0219 23:51:49.524093 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:52:02 crc kubenswrapper[4795]: I0219 23:52:02.514141 4795 scope.go:117] "RemoveContainer" containerID="8095d75d862e2b693e66115bd0de83928e8a1abe68615b91e580952d99b88f5a" Feb 19 23:52:02 crc kubenswrapper[4795]: E0219 23:52:02.515015 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:52:14 crc kubenswrapper[4795]: I0219 23:52:14.511512 4795 scope.go:117] "RemoveContainer" containerID="8095d75d862e2b693e66115bd0de83928e8a1abe68615b91e580952d99b88f5a" Feb 19 23:52:14 crc kubenswrapper[4795]: E0219 23:52:14.512493 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:52:29 crc kubenswrapper[4795]: I0219 23:52:29.525366 4795 scope.go:117] "RemoveContainer" containerID="8095d75d862e2b693e66115bd0de83928e8a1abe68615b91e580952d99b88f5a" Feb 19 23:52:29 crc kubenswrapper[4795]: E0219 23:52:29.526732 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:52:40 crc kubenswrapper[4795]: I0219 23:52:40.512061 4795 scope.go:117] "RemoveContainer" containerID="8095d75d862e2b693e66115bd0de83928e8a1abe68615b91e580952d99b88f5a" Feb 19 23:52:40 crc kubenswrapper[4795]: E0219 23:52:40.512736 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:52:55 crc kubenswrapper[4795]: I0219 23:52:55.512471 4795 scope.go:117] "RemoveContainer" containerID="8095d75d862e2b693e66115bd0de83928e8a1abe68615b91e580952d99b88f5a" Feb 19 23:52:55 crc kubenswrapper[4795]: E0219 23:52:55.513517 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:53:06 crc kubenswrapper[4795]: I0219 23:53:06.512126 4795 scope.go:117] "RemoveContainer" containerID="8095d75d862e2b693e66115bd0de83928e8a1abe68615b91e580952d99b88f5a" Feb 19 23:53:07 crc kubenswrapper[4795]: I0219 23:53:07.050028 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerStarted","Data":"826efb3421363865a6a92aeeb46a1de0d922f1bfcfa12ff4f740f04d52a6b3e6"} Feb 19 23:53:15 crc kubenswrapper[4795]: I0219 23:53:15.569057 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-47mst"] Feb 19 23:53:15 crc kubenswrapper[4795]: E0219 23:53:15.571095 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59981ca7-620e-4025-b165-4f54f920e8f2" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Feb 19 23:53:15 crc kubenswrapper[4795]: I0219 23:53:15.571277 4795 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="59981ca7-620e-4025-b165-4f54f920e8f2" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Feb 19 23:53:15 crc kubenswrapper[4795]: I0219 23:53:15.571557 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="59981ca7-620e-4025-b165-4f54f920e8f2" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Feb 19 23:53:15 crc kubenswrapper[4795]: I0219 23:53:15.573412 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-47mst" Feb 19 23:53:15 crc kubenswrapper[4795]: I0219 23:53:15.602024 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-47mst"] Feb 19 23:53:15 crc kubenswrapper[4795]: I0219 23:53:15.751080 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e0ad945-0620-480a-8200-fce17a619511-catalog-content\") pod \"community-operators-47mst\" (UID: \"3e0ad945-0620-480a-8200-fce17a619511\") " pod="openshift-marketplace/community-operators-47mst" Feb 19 23:53:15 crc kubenswrapper[4795]: I0219 23:53:15.751454 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e0ad945-0620-480a-8200-fce17a619511-utilities\") pod \"community-operators-47mst\" (UID: \"3e0ad945-0620-480a-8200-fce17a619511\") " pod="openshift-marketplace/community-operators-47mst" Feb 19 23:53:15 crc kubenswrapper[4795]: I0219 23:53:15.751666 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fllv\" (UniqueName: \"kubernetes.io/projected/3e0ad945-0620-480a-8200-fce17a619511-kube-api-access-7fllv\") pod \"community-operators-47mst\" (UID: \"3e0ad945-0620-480a-8200-fce17a619511\") " 
pod="openshift-marketplace/community-operators-47mst" Feb 19 23:53:15 crc kubenswrapper[4795]: I0219 23:53:15.853984 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e0ad945-0620-480a-8200-fce17a619511-utilities\") pod \"community-operators-47mst\" (UID: \"3e0ad945-0620-480a-8200-fce17a619511\") " pod="openshift-marketplace/community-operators-47mst" Feb 19 23:53:15 crc kubenswrapper[4795]: I0219 23:53:15.854583 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fllv\" (UniqueName: \"kubernetes.io/projected/3e0ad945-0620-480a-8200-fce17a619511-kube-api-access-7fllv\") pod \"community-operators-47mst\" (UID: \"3e0ad945-0620-480a-8200-fce17a619511\") " pod="openshift-marketplace/community-operators-47mst" Feb 19 23:53:15 crc kubenswrapper[4795]: I0219 23:53:15.854726 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e0ad945-0620-480a-8200-fce17a619511-utilities\") pod \"community-operators-47mst\" (UID: \"3e0ad945-0620-480a-8200-fce17a619511\") " pod="openshift-marketplace/community-operators-47mst" Feb 19 23:53:15 crc kubenswrapper[4795]: I0219 23:53:15.855383 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e0ad945-0620-480a-8200-fce17a619511-catalog-content\") pod \"community-operators-47mst\" (UID: \"3e0ad945-0620-480a-8200-fce17a619511\") " pod="openshift-marketplace/community-operators-47mst" Feb 19 23:53:15 crc kubenswrapper[4795]: I0219 23:53:15.855828 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e0ad945-0620-480a-8200-fce17a619511-catalog-content\") pod \"community-operators-47mst\" (UID: \"3e0ad945-0620-480a-8200-fce17a619511\") " 
pod="openshift-marketplace/community-operators-47mst" Feb 19 23:53:15 crc kubenswrapper[4795]: I0219 23:53:15.885062 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fllv\" (UniqueName: \"kubernetes.io/projected/3e0ad945-0620-480a-8200-fce17a619511-kube-api-access-7fllv\") pod \"community-operators-47mst\" (UID: \"3e0ad945-0620-480a-8200-fce17a619511\") " pod="openshift-marketplace/community-operators-47mst" Feb 19 23:53:15 crc kubenswrapper[4795]: I0219 23:53:15.907383 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-47mst" Feb 19 23:53:16 crc kubenswrapper[4795]: I0219 23:53:16.434821 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-47mst"] Feb 19 23:53:17 crc kubenswrapper[4795]: I0219 23:53:17.155319 4795 generic.go:334] "Generic (PLEG): container finished" podID="3e0ad945-0620-480a-8200-fce17a619511" containerID="c5dcdf6c1cf69827f802b079407c9d34bb28958b817ae561694de607237e5ca3" exitCode=0 Feb 19 23:53:17 crc kubenswrapper[4795]: I0219 23:53:17.155399 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-47mst" event={"ID":"3e0ad945-0620-480a-8200-fce17a619511","Type":"ContainerDied","Data":"c5dcdf6c1cf69827f802b079407c9d34bb28958b817ae561694de607237e5ca3"} Feb 19 23:53:17 crc kubenswrapper[4795]: I0219 23:53:17.155651 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-47mst" event={"ID":"3e0ad945-0620-480a-8200-fce17a619511","Type":"ContainerStarted","Data":"654266e485a03d2ad418348459ffcdd5cbe44c629d72206aae359e312bfb70eb"} Feb 19 23:53:17 crc kubenswrapper[4795]: I0219 23:53:17.157264 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 23:53:18 crc kubenswrapper[4795]: I0219 23:53:18.168980 4795 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/community-operators-47mst" event={"ID":"3e0ad945-0620-480a-8200-fce17a619511","Type":"ContainerStarted","Data":"245578835a2e0844e5239253327e73fad61978cfe7130dfde66805d05d7a4ed8"} Feb 19 23:53:19 crc kubenswrapper[4795]: I0219 23:53:19.182632 4795 generic.go:334] "Generic (PLEG): container finished" podID="3e0ad945-0620-480a-8200-fce17a619511" containerID="245578835a2e0844e5239253327e73fad61978cfe7130dfde66805d05d7a4ed8" exitCode=0 Feb 19 23:53:19 crc kubenswrapper[4795]: I0219 23:53:19.182728 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-47mst" event={"ID":"3e0ad945-0620-480a-8200-fce17a619511","Type":"ContainerDied","Data":"245578835a2e0844e5239253327e73fad61978cfe7130dfde66805d05d7a4ed8"} Feb 19 23:53:20 crc kubenswrapper[4795]: I0219 23:53:20.073221 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-copy-data"] Feb 19 23:53:20 crc kubenswrapper[4795]: I0219 23:53:20.073724 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mariadb-copy-data" podUID="4f232979-ab9c-4b59-8ad8-7756367fe0bf" containerName="adoption" containerID="cri-o://d99c65ed2cf2832977c62a47557eeea8eec734877d891b4ac4fe2f4a681f7224" gracePeriod=30 Feb 19 23:53:20 crc kubenswrapper[4795]: I0219 23:53:20.194479 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-47mst" event={"ID":"3e0ad945-0620-480a-8200-fce17a619511","Type":"ContainerStarted","Data":"bfc69a308b4874cc4701238f4bc637dd5837640b9708a43dce7266489d8f2bbc"} Feb 19 23:53:20 crc kubenswrapper[4795]: I0219 23:53:20.222857 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-47mst" podStartSLOduration=2.815418555 podStartE2EDuration="5.22283707s" podCreationTimestamp="2026-02-19 23:53:15 +0000 UTC" firstStartedPulling="2026-02-19 23:53:17.156997619 +0000 UTC m=+8708.349515493" 
lastFinishedPulling="2026-02-19 23:53:19.564416134 +0000 UTC m=+8710.756934008" observedRunningTime="2026-02-19 23:53:20.215897246 +0000 UTC m=+8711.408415130" watchObservedRunningTime="2026-02-19 23:53:20.22283707 +0000 UTC m=+8711.415354934" Feb 19 23:53:25 crc kubenswrapper[4795]: I0219 23:53:25.908449 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-47mst" Feb 19 23:53:25 crc kubenswrapper[4795]: I0219 23:53:25.909119 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-47mst" Feb 19 23:53:25 crc kubenswrapper[4795]: I0219 23:53:25.965274 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-47mst" Feb 19 23:53:26 crc kubenswrapper[4795]: I0219 23:53:26.303667 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-47mst" Feb 19 23:53:29 crc kubenswrapper[4795]: I0219 23:53:29.210221 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-47mst"] Feb 19 23:53:29 crc kubenswrapper[4795]: I0219 23:53:29.293798 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-47mst" podUID="3e0ad945-0620-480a-8200-fce17a619511" containerName="registry-server" containerID="cri-o://bfc69a308b4874cc4701238f4bc637dd5837640b9708a43dce7266489d8f2bbc" gracePeriod=2 Feb 19 23:53:29 crc kubenswrapper[4795]: I0219 23:53:29.826343 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-47mst" Feb 19 23:53:29 crc kubenswrapper[4795]: I0219 23:53:29.974549 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7fllv\" (UniqueName: \"kubernetes.io/projected/3e0ad945-0620-480a-8200-fce17a619511-kube-api-access-7fllv\") pod \"3e0ad945-0620-480a-8200-fce17a619511\" (UID: \"3e0ad945-0620-480a-8200-fce17a619511\") " Feb 19 23:53:29 crc kubenswrapper[4795]: I0219 23:53:29.974642 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e0ad945-0620-480a-8200-fce17a619511-catalog-content\") pod \"3e0ad945-0620-480a-8200-fce17a619511\" (UID: \"3e0ad945-0620-480a-8200-fce17a619511\") " Feb 19 23:53:29 crc kubenswrapper[4795]: I0219 23:53:29.974747 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e0ad945-0620-480a-8200-fce17a619511-utilities\") pod \"3e0ad945-0620-480a-8200-fce17a619511\" (UID: \"3e0ad945-0620-480a-8200-fce17a619511\") " Feb 19 23:53:29 crc kubenswrapper[4795]: I0219 23:53:29.975902 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e0ad945-0620-480a-8200-fce17a619511-utilities" (OuterVolumeSpecName: "utilities") pod "3e0ad945-0620-480a-8200-fce17a619511" (UID: "3e0ad945-0620-480a-8200-fce17a619511"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:53:29 crc kubenswrapper[4795]: I0219 23:53:29.976582 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e0ad945-0620-480a-8200-fce17a619511-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 23:53:29 crc kubenswrapper[4795]: I0219 23:53:29.982710 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e0ad945-0620-480a-8200-fce17a619511-kube-api-access-7fllv" (OuterVolumeSpecName: "kube-api-access-7fllv") pod "3e0ad945-0620-480a-8200-fce17a619511" (UID: "3e0ad945-0620-480a-8200-fce17a619511"). InnerVolumeSpecName "kube-api-access-7fllv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:53:30 crc kubenswrapper[4795]: I0219 23:53:30.039447 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e0ad945-0620-480a-8200-fce17a619511-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3e0ad945-0620-480a-8200-fce17a619511" (UID: "3e0ad945-0620-480a-8200-fce17a619511"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:53:30 crc kubenswrapper[4795]: I0219 23:53:30.079021 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7fllv\" (UniqueName: \"kubernetes.io/projected/3e0ad945-0620-480a-8200-fce17a619511-kube-api-access-7fllv\") on node \"crc\" DevicePath \"\"" Feb 19 23:53:30 crc kubenswrapper[4795]: I0219 23:53:30.079065 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e0ad945-0620-480a-8200-fce17a619511-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 23:53:30 crc kubenswrapper[4795]: I0219 23:53:30.217008 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4cfwt"] Feb 19 23:53:30 crc kubenswrapper[4795]: E0219 23:53:30.217551 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e0ad945-0620-480a-8200-fce17a619511" containerName="extract-content" Feb 19 23:53:30 crc kubenswrapper[4795]: I0219 23:53:30.217571 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e0ad945-0620-480a-8200-fce17a619511" containerName="extract-content" Feb 19 23:53:30 crc kubenswrapper[4795]: E0219 23:53:30.217598 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e0ad945-0620-480a-8200-fce17a619511" containerName="registry-server" Feb 19 23:53:30 crc kubenswrapper[4795]: I0219 23:53:30.217605 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e0ad945-0620-480a-8200-fce17a619511" containerName="registry-server" Feb 19 23:53:30 crc kubenswrapper[4795]: E0219 23:53:30.217616 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e0ad945-0620-480a-8200-fce17a619511" containerName="extract-utilities" Feb 19 23:53:30 crc kubenswrapper[4795]: I0219 23:53:30.217623 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e0ad945-0620-480a-8200-fce17a619511" containerName="extract-utilities" Feb 19 23:53:30 crc 
kubenswrapper[4795]: I0219 23:53:30.217966 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e0ad945-0620-480a-8200-fce17a619511" containerName="registry-server" Feb 19 23:53:30 crc kubenswrapper[4795]: I0219 23:53:30.220918 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4cfwt" Feb 19 23:53:30 crc kubenswrapper[4795]: I0219 23:53:30.229055 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4cfwt"] Feb 19 23:53:30 crc kubenswrapper[4795]: I0219 23:53:30.309089 4795 generic.go:334] "Generic (PLEG): container finished" podID="3e0ad945-0620-480a-8200-fce17a619511" containerID="bfc69a308b4874cc4701238f4bc637dd5837640b9708a43dce7266489d8f2bbc" exitCode=0 Feb 19 23:53:30 crc kubenswrapper[4795]: I0219 23:53:30.309191 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-47mst" Feb 19 23:53:30 crc kubenswrapper[4795]: I0219 23:53:30.309200 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-47mst" event={"ID":"3e0ad945-0620-480a-8200-fce17a619511","Type":"ContainerDied","Data":"bfc69a308b4874cc4701238f4bc637dd5837640b9708a43dce7266489d8f2bbc"} Feb 19 23:53:30 crc kubenswrapper[4795]: I0219 23:53:30.309267 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-47mst" event={"ID":"3e0ad945-0620-480a-8200-fce17a619511","Type":"ContainerDied","Data":"654266e485a03d2ad418348459ffcdd5cbe44c629d72206aae359e312bfb70eb"} Feb 19 23:53:30 crc kubenswrapper[4795]: I0219 23:53:30.309289 4795 scope.go:117] "RemoveContainer" containerID="bfc69a308b4874cc4701238f4bc637dd5837640b9708a43dce7266489d8f2bbc" Feb 19 23:53:30 crc kubenswrapper[4795]: I0219 23:53:30.343070 4795 scope.go:117] "RemoveContainer" 
containerID="245578835a2e0844e5239253327e73fad61978cfe7130dfde66805d05d7a4ed8" Feb 19 23:53:30 crc kubenswrapper[4795]: I0219 23:53:30.352416 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-47mst"] Feb 19 23:53:30 crc kubenswrapper[4795]: I0219 23:53:30.364752 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-47mst"] Feb 19 23:53:30 crc kubenswrapper[4795]: I0219 23:53:30.375990 4795 scope.go:117] "RemoveContainer" containerID="c5dcdf6c1cf69827f802b079407c9d34bb28958b817ae561694de607237e5ca3" Feb 19 23:53:30 crc kubenswrapper[4795]: I0219 23:53:30.389013 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jdmz\" (UniqueName: \"kubernetes.io/projected/2031ff34-f306-4920-8bb4-a6f0151a9aa3-kube-api-access-2jdmz\") pod \"certified-operators-4cfwt\" (UID: \"2031ff34-f306-4920-8bb4-a6f0151a9aa3\") " pod="openshift-marketplace/certified-operators-4cfwt" Feb 19 23:53:30 crc kubenswrapper[4795]: I0219 23:53:30.389082 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2031ff34-f306-4920-8bb4-a6f0151a9aa3-utilities\") pod \"certified-operators-4cfwt\" (UID: \"2031ff34-f306-4920-8bb4-a6f0151a9aa3\") " pod="openshift-marketplace/certified-operators-4cfwt" Feb 19 23:53:30 crc kubenswrapper[4795]: I0219 23:53:30.389210 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2031ff34-f306-4920-8bb4-a6f0151a9aa3-catalog-content\") pod \"certified-operators-4cfwt\" (UID: \"2031ff34-f306-4920-8bb4-a6f0151a9aa3\") " pod="openshift-marketplace/certified-operators-4cfwt" Feb 19 23:53:30 crc kubenswrapper[4795]: I0219 23:53:30.431091 4795 scope.go:117] "RemoveContainer" 
containerID="bfc69a308b4874cc4701238f4bc637dd5837640b9708a43dce7266489d8f2bbc" Feb 19 23:53:30 crc kubenswrapper[4795]: E0219 23:53:30.431521 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfc69a308b4874cc4701238f4bc637dd5837640b9708a43dce7266489d8f2bbc\": container with ID starting with bfc69a308b4874cc4701238f4bc637dd5837640b9708a43dce7266489d8f2bbc not found: ID does not exist" containerID="bfc69a308b4874cc4701238f4bc637dd5837640b9708a43dce7266489d8f2bbc" Feb 19 23:53:30 crc kubenswrapper[4795]: I0219 23:53:30.431578 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfc69a308b4874cc4701238f4bc637dd5837640b9708a43dce7266489d8f2bbc"} err="failed to get container status \"bfc69a308b4874cc4701238f4bc637dd5837640b9708a43dce7266489d8f2bbc\": rpc error: code = NotFound desc = could not find container \"bfc69a308b4874cc4701238f4bc637dd5837640b9708a43dce7266489d8f2bbc\": container with ID starting with bfc69a308b4874cc4701238f4bc637dd5837640b9708a43dce7266489d8f2bbc not found: ID does not exist" Feb 19 23:53:30 crc kubenswrapper[4795]: I0219 23:53:30.431605 4795 scope.go:117] "RemoveContainer" containerID="245578835a2e0844e5239253327e73fad61978cfe7130dfde66805d05d7a4ed8" Feb 19 23:53:30 crc kubenswrapper[4795]: E0219 23:53:30.431932 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"245578835a2e0844e5239253327e73fad61978cfe7130dfde66805d05d7a4ed8\": container with ID starting with 245578835a2e0844e5239253327e73fad61978cfe7130dfde66805d05d7a4ed8 not found: ID does not exist" containerID="245578835a2e0844e5239253327e73fad61978cfe7130dfde66805d05d7a4ed8" Feb 19 23:53:30 crc kubenswrapper[4795]: I0219 23:53:30.431951 4795 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"245578835a2e0844e5239253327e73fad61978cfe7130dfde66805d05d7a4ed8"} err="failed to get container status \"245578835a2e0844e5239253327e73fad61978cfe7130dfde66805d05d7a4ed8\": rpc error: code = NotFound desc = could not find container \"245578835a2e0844e5239253327e73fad61978cfe7130dfde66805d05d7a4ed8\": container with ID starting with 245578835a2e0844e5239253327e73fad61978cfe7130dfde66805d05d7a4ed8 not found: ID does not exist" Feb 19 23:53:30 crc kubenswrapper[4795]: I0219 23:53:30.431966 4795 scope.go:117] "RemoveContainer" containerID="c5dcdf6c1cf69827f802b079407c9d34bb28958b817ae561694de607237e5ca3" Feb 19 23:53:30 crc kubenswrapper[4795]: E0219 23:53:30.432528 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5dcdf6c1cf69827f802b079407c9d34bb28958b817ae561694de607237e5ca3\": container with ID starting with c5dcdf6c1cf69827f802b079407c9d34bb28958b817ae561694de607237e5ca3 not found: ID does not exist" containerID="c5dcdf6c1cf69827f802b079407c9d34bb28958b817ae561694de607237e5ca3" Feb 19 23:53:30 crc kubenswrapper[4795]: I0219 23:53:30.432548 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5dcdf6c1cf69827f802b079407c9d34bb28958b817ae561694de607237e5ca3"} err="failed to get container status \"c5dcdf6c1cf69827f802b079407c9d34bb28958b817ae561694de607237e5ca3\": rpc error: code = NotFound desc = could not find container \"c5dcdf6c1cf69827f802b079407c9d34bb28958b817ae561694de607237e5ca3\": container with ID starting with c5dcdf6c1cf69827f802b079407c9d34bb28958b817ae561694de607237e5ca3 not found: ID does not exist" Feb 19 23:53:30 crc kubenswrapper[4795]: I0219 23:53:30.491563 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2031ff34-f306-4920-8bb4-a6f0151a9aa3-catalog-content\") pod \"certified-operators-4cfwt\" (UID: 
\"2031ff34-f306-4920-8bb4-a6f0151a9aa3\") " pod="openshift-marketplace/certified-operators-4cfwt" Feb 19 23:53:30 crc kubenswrapper[4795]: I0219 23:53:30.491717 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jdmz\" (UniqueName: \"kubernetes.io/projected/2031ff34-f306-4920-8bb4-a6f0151a9aa3-kube-api-access-2jdmz\") pod \"certified-operators-4cfwt\" (UID: \"2031ff34-f306-4920-8bb4-a6f0151a9aa3\") " pod="openshift-marketplace/certified-operators-4cfwt" Feb 19 23:53:30 crc kubenswrapper[4795]: I0219 23:53:30.491791 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2031ff34-f306-4920-8bb4-a6f0151a9aa3-utilities\") pod \"certified-operators-4cfwt\" (UID: \"2031ff34-f306-4920-8bb4-a6f0151a9aa3\") " pod="openshift-marketplace/certified-operators-4cfwt" Feb 19 23:53:30 crc kubenswrapper[4795]: I0219 23:53:30.492315 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2031ff34-f306-4920-8bb4-a6f0151a9aa3-utilities\") pod \"certified-operators-4cfwt\" (UID: \"2031ff34-f306-4920-8bb4-a6f0151a9aa3\") " pod="openshift-marketplace/certified-operators-4cfwt" Feb 19 23:53:30 crc kubenswrapper[4795]: I0219 23:53:30.493578 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2031ff34-f306-4920-8bb4-a6f0151a9aa3-catalog-content\") pod \"certified-operators-4cfwt\" (UID: \"2031ff34-f306-4920-8bb4-a6f0151a9aa3\") " pod="openshift-marketplace/certified-operators-4cfwt" Feb 19 23:53:30 crc kubenswrapper[4795]: I0219 23:53:30.507740 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jdmz\" (UniqueName: \"kubernetes.io/projected/2031ff34-f306-4920-8bb4-a6f0151a9aa3-kube-api-access-2jdmz\") pod \"certified-operators-4cfwt\" (UID: 
\"2031ff34-f306-4920-8bb4-a6f0151a9aa3\") " pod="openshift-marketplace/certified-operators-4cfwt" Feb 19 23:53:30 crc kubenswrapper[4795]: I0219 23:53:30.595367 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4cfwt" Feb 19 23:53:31 crc kubenswrapper[4795]: I0219 23:53:31.116589 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4cfwt"] Feb 19 23:53:31 crc kubenswrapper[4795]: I0219 23:53:31.319622 4795 generic.go:334] "Generic (PLEG): container finished" podID="2031ff34-f306-4920-8bb4-a6f0151a9aa3" containerID="9b76206f0a3519b063ea89a91d4b6dc9fa7aef45f124e3d60d49621c49a85c75" exitCode=0 Feb 19 23:53:31 crc kubenswrapper[4795]: I0219 23:53:31.319799 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4cfwt" event={"ID":"2031ff34-f306-4920-8bb4-a6f0151a9aa3","Type":"ContainerDied","Data":"9b76206f0a3519b063ea89a91d4b6dc9fa7aef45f124e3d60d49621c49a85c75"} Feb 19 23:53:31 crc kubenswrapper[4795]: I0219 23:53:31.320000 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4cfwt" event={"ID":"2031ff34-f306-4920-8bb4-a6f0151a9aa3","Type":"ContainerStarted","Data":"40a03b7739c80ba579fdcb69feece003a5e727cd18891934c7e249fece457390"} Feb 19 23:53:31 crc kubenswrapper[4795]: I0219 23:53:31.523293 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e0ad945-0620-480a-8200-fce17a619511" path="/var/lib/kubelet/pods/3e0ad945-0620-480a-8200-fce17a619511/volumes" Feb 19 23:53:32 crc kubenswrapper[4795]: I0219 23:53:32.335224 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4cfwt" event={"ID":"2031ff34-f306-4920-8bb4-a6f0151a9aa3","Type":"ContainerStarted","Data":"b5afe98365fbeba289f9d0a07bee96b2746c1c0aa55a58675b59b6f922d5b351"} Feb 19 23:53:33 crc kubenswrapper[4795]: I0219 
23:53:33.349791 4795 generic.go:334] "Generic (PLEG): container finished" podID="2031ff34-f306-4920-8bb4-a6f0151a9aa3" containerID="b5afe98365fbeba289f9d0a07bee96b2746c1c0aa55a58675b59b6f922d5b351" exitCode=0 Feb 19 23:53:33 crc kubenswrapper[4795]: I0219 23:53:33.349921 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4cfwt" event={"ID":"2031ff34-f306-4920-8bb4-a6f0151a9aa3","Type":"ContainerDied","Data":"b5afe98365fbeba289f9d0a07bee96b2746c1c0aa55a58675b59b6f922d5b351"} Feb 19 23:53:34 crc kubenswrapper[4795]: I0219 23:53:34.361946 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4cfwt" event={"ID":"2031ff34-f306-4920-8bb4-a6f0151a9aa3","Type":"ContainerStarted","Data":"b16b8fdd2b82f14fd0ed7943a563799d24eebd99f90be20d9c9ad6bd42d06ba5"} Feb 19 23:53:34 crc kubenswrapper[4795]: I0219 23:53:34.393925 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4cfwt" podStartSLOduration=1.968909531 podStartE2EDuration="4.39390489s" podCreationTimestamp="2026-02-19 23:53:30 +0000 UTC" firstStartedPulling="2026-02-19 23:53:31.321682439 +0000 UTC m=+8722.514200303" lastFinishedPulling="2026-02-19 23:53:33.746677798 +0000 UTC m=+8724.939195662" observedRunningTime="2026-02-19 23:53:34.38356481 +0000 UTC m=+8725.576082674" watchObservedRunningTime="2026-02-19 23:53:34.39390489 +0000 UTC m=+8725.586422754" Feb 19 23:53:40 crc kubenswrapper[4795]: I0219 23:53:40.595980 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4cfwt" Feb 19 23:53:40 crc kubenswrapper[4795]: I0219 23:53:40.596683 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4cfwt" Feb 19 23:53:40 crc kubenswrapper[4795]: I0219 23:53:40.652821 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-marketplace/certified-operators-4cfwt" Feb 19 23:53:41 crc kubenswrapper[4795]: I0219 23:53:41.498292 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4cfwt" Feb 19 23:53:41 crc kubenswrapper[4795]: I0219 23:53:41.560405 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4cfwt"] Feb 19 23:53:43 crc kubenswrapper[4795]: I0219 23:53:43.447122 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4cfwt" podUID="2031ff34-f306-4920-8bb4-a6f0151a9aa3" containerName="registry-server" containerID="cri-o://b16b8fdd2b82f14fd0ed7943a563799d24eebd99f90be20d9c9ad6bd42d06ba5" gracePeriod=2 Feb 19 23:53:44 crc kubenswrapper[4795]: I0219 23:53:44.018958 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4cfwt" Feb 19 23:53:44 crc kubenswrapper[4795]: I0219 23:53:44.101944 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jdmz\" (UniqueName: \"kubernetes.io/projected/2031ff34-f306-4920-8bb4-a6f0151a9aa3-kube-api-access-2jdmz\") pod \"2031ff34-f306-4920-8bb4-a6f0151a9aa3\" (UID: \"2031ff34-f306-4920-8bb4-a6f0151a9aa3\") " Feb 19 23:53:44 crc kubenswrapper[4795]: I0219 23:53:44.102217 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2031ff34-f306-4920-8bb4-a6f0151a9aa3-utilities\") pod \"2031ff34-f306-4920-8bb4-a6f0151a9aa3\" (UID: \"2031ff34-f306-4920-8bb4-a6f0151a9aa3\") " Feb 19 23:53:44 crc kubenswrapper[4795]: I0219 23:53:44.102386 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2031ff34-f306-4920-8bb4-a6f0151a9aa3-catalog-content\") pod 
\"2031ff34-f306-4920-8bb4-a6f0151a9aa3\" (UID: \"2031ff34-f306-4920-8bb4-a6f0151a9aa3\") " Feb 19 23:53:44 crc kubenswrapper[4795]: I0219 23:53:44.103080 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2031ff34-f306-4920-8bb4-a6f0151a9aa3-utilities" (OuterVolumeSpecName: "utilities") pod "2031ff34-f306-4920-8bb4-a6f0151a9aa3" (UID: "2031ff34-f306-4920-8bb4-a6f0151a9aa3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:53:44 crc kubenswrapper[4795]: I0219 23:53:44.107586 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2031ff34-f306-4920-8bb4-a6f0151a9aa3-kube-api-access-2jdmz" (OuterVolumeSpecName: "kube-api-access-2jdmz") pod "2031ff34-f306-4920-8bb4-a6f0151a9aa3" (UID: "2031ff34-f306-4920-8bb4-a6f0151a9aa3"). InnerVolumeSpecName "kube-api-access-2jdmz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:53:44 crc kubenswrapper[4795]: I0219 23:53:44.150706 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2031ff34-f306-4920-8bb4-a6f0151a9aa3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2031ff34-f306-4920-8bb4-a6f0151a9aa3" (UID: "2031ff34-f306-4920-8bb4-a6f0151a9aa3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:53:44 crc kubenswrapper[4795]: I0219 23:53:44.205810 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jdmz\" (UniqueName: \"kubernetes.io/projected/2031ff34-f306-4920-8bb4-a6f0151a9aa3-kube-api-access-2jdmz\") on node \"crc\" DevicePath \"\"" Feb 19 23:53:44 crc kubenswrapper[4795]: I0219 23:53:44.206258 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2031ff34-f306-4920-8bb4-a6f0151a9aa3-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 23:53:44 crc kubenswrapper[4795]: I0219 23:53:44.206379 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2031ff34-f306-4920-8bb4-a6f0151a9aa3-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 23:53:44 crc kubenswrapper[4795]: I0219 23:53:44.458019 4795 generic.go:334] "Generic (PLEG): container finished" podID="2031ff34-f306-4920-8bb4-a6f0151a9aa3" containerID="b16b8fdd2b82f14fd0ed7943a563799d24eebd99f90be20d9c9ad6bd42d06ba5" exitCode=0 Feb 19 23:53:44 crc kubenswrapper[4795]: I0219 23:53:44.458159 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4cfwt" Feb 19 23:53:44 crc kubenswrapper[4795]: I0219 23:53:44.458236 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4cfwt" event={"ID":"2031ff34-f306-4920-8bb4-a6f0151a9aa3","Type":"ContainerDied","Data":"b16b8fdd2b82f14fd0ed7943a563799d24eebd99f90be20d9c9ad6bd42d06ba5"} Feb 19 23:53:44 crc kubenswrapper[4795]: I0219 23:53:44.459735 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4cfwt" event={"ID":"2031ff34-f306-4920-8bb4-a6f0151a9aa3","Type":"ContainerDied","Data":"40a03b7739c80ba579fdcb69feece003a5e727cd18891934c7e249fece457390"} Feb 19 23:53:44 crc kubenswrapper[4795]: I0219 23:53:44.459806 4795 scope.go:117] "RemoveContainer" containerID="b16b8fdd2b82f14fd0ed7943a563799d24eebd99f90be20d9c9ad6bd42d06ba5" Feb 19 23:53:44 crc kubenswrapper[4795]: I0219 23:53:44.494558 4795 scope.go:117] "RemoveContainer" containerID="b5afe98365fbeba289f9d0a07bee96b2746c1c0aa55a58675b59b6f922d5b351" Feb 19 23:53:44 crc kubenswrapper[4795]: I0219 23:53:44.508499 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4cfwt"] Feb 19 23:53:44 crc kubenswrapper[4795]: I0219 23:53:44.519035 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4cfwt"] Feb 19 23:53:44 crc kubenswrapper[4795]: I0219 23:53:44.520494 4795 scope.go:117] "RemoveContainer" containerID="9b76206f0a3519b063ea89a91d4b6dc9fa7aef45f124e3d60d49621c49a85c75" Feb 19 23:53:44 crc kubenswrapper[4795]: I0219 23:53:44.565356 4795 scope.go:117] "RemoveContainer" containerID="b16b8fdd2b82f14fd0ed7943a563799d24eebd99f90be20d9c9ad6bd42d06ba5" Feb 19 23:53:44 crc kubenswrapper[4795]: E0219 23:53:44.566302 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"b16b8fdd2b82f14fd0ed7943a563799d24eebd99f90be20d9c9ad6bd42d06ba5\": container with ID starting with b16b8fdd2b82f14fd0ed7943a563799d24eebd99f90be20d9c9ad6bd42d06ba5 not found: ID does not exist" containerID="b16b8fdd2b82f14fd0ed7943a563799d24eebd99f90be20d9c9ad6bd42d06ba5" Feb 19 23:53:44 crc kubenswrapper[4795]: I0219 23:53:44.566332 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b16b8fdd2b82f14fd0ed7943a563799d24eebd99f90be20d9c9ad6bd42d06ba5"} err="failed to get container status \"b16b8fdd2b82f14fd0ed7943a563799d24eebd99f90be20d9c9ad6bd42d06ba5\": rpc error: code = NotFound desc = could not find container \"b16b8fdd2b82f14fd0ed7943a563799d24eebd99f90be20d9c9ad6bd42d06ba5\": container with ID starting with b16b8fdd2b82f14fd0ed7943a563799d24eebd99f90be20d9c9ad6bd42d06ba5 not found: ID does not exist" Feb 19 23:53:44 crc kubenswrapper[4795]: I0219 23:53:44.566351 4795 scope.go:117] "RemoveContainer" containerID="b5afe98365fbeba289f9d0a07bee96b2746c1c0aa55a58675b59b6f922d5b351" Feb 19 23:53:44 crc kubenswrapper[4795]: E0219 23:53:44.566861 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5afe98365fbeba289f9d0a07bee96b2746c1c0aa55a58675b59b6f922d5b351\": container with ID starting with b5afe98365fbeba289f9d0a07bee96b2746c1c0aa55a58675b59b6f922d5b351 not found: ID does not exist" containerID="b5afe98365fbeba289f9d0a07bee96b2746c1c0aa55a58675b59b6f922d5b351" Feb 19 23:53:44 crc kubenswrapper[4795]: I0219 23:53:44.566912 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5afe98365fbeba289f9d0a07bee96b2746c1c0aa55a58675b59b6f922d5b351"} err="failed to get container status \"b5afe98365fbeba289f9d0a07bee96b2746c1c0aa55a58675b59b6f922d5b351\": rpc error: code = NotFound desc = could not find container \"b5afe98365fbeba289f9d0a07bee96b2746c1c0aa55a58675b59b6f922d5b351\": container with ID 
starting with b5afe98365fbeba289f9d0a07bee96b2746c1c0aa55a58675b59b6f922d5b351 not found: ID does not exist" Feb 19 23:53:44 crc kubenswrapper[4795]: I0219 23:53:44.566952 4795 scope.go:117] "RemoveContainer" containerID="9b76206f0a3519b063ea89a91d4b6dc9fa7aef45f124e3d60d49621c49a85c75" Feb 19 23:53:44 crc kubenswrapper[4795]: E0219 23:53:44.567286 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b76206f0a3519b063ea89a91d4b6dc9fa7aef45f124e3d60d49621c49a85c75\": container with ID starting with 9b76206f0a3519b063ea89a91d4b6dc9fa7aef45f124e3d60d49621c49a85c75 not found: ID does not exist" containerID="9b76206f0a3519b063ea89a91d4b6dc9fa7aef45f124e3d60d49621c49a85c75" Feb 19 23:53:44 crc kubenswrapper[4795]: I0219 23:53:44.567311 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b76206f0a3519b063ea89a91d4b6dc9fa7aef45f124e3d60d49621c49a85c75"} err="failed to get container status \"9b76206f0a3519b063ea89a91d4b6dc9fa7aef45f124e3d60d49621c49a85c75\": rpc error: code = NotFound desc = could not find container \"9b76206f0a3519b063ea89a91d4b6dc9fa7aef45f124e3d60d49621c49a85c75\": container with ID starting with 9b76206f0a3519b063ea89a91d4b6dc9fa7aef45f124e3d60d49621c49a85c75 not found: ID does not exist" Feb 19 23:53:45 crc kubenswrapper[4795]: I0219 23:53:45.527436 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2031ff34-f306-4920-8bb4-a6f0151a9aa3" path="/var/lib/kubelet/pods/2031ff34-f306-4920-8bb4-a6f0151a9aa3/volumes" Feb 19 23:53:50 crc kubenswrapper[4795]: I0219 23:53:50.522033 4795 generic.go:334] "Generic (PLEG): container finished" podID="4f232979-ab9c-4b59-8ad8-7756367fe0bf" containerID="d99c65ed2cf2832977c62a47557eeea8eec734877d891b4ac4fe2f4a681f7224" exitCode=137 Feb 19 23:53:50 crc kubenswrapper[4795]: I0219 23:53:50.522995 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/mariadb-copy-data" event={"ID":"4f232979-ab9c-4b59-8ad8-7756367fe0bf","Type":"ContainerDied","Data":"d99c65ed2cf2832977c62a47557eeea8eec734877d891b4ac4fe2f4a681f7224"} Feb 19 23:53:50 crc kubenswrapper[4795]: I0219 23:53:50.683740 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Feb 19 23:53:50 crc kubenswrapper[4795]: I0219 23:53:50.793619 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mariadb-data\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b6dccae3-37b8-4abc-8831-9e7ebad3b6a8\") pod \"4f232979-ab9c-4b59-8ad8-7756367fe0bf\" (UID: \"4f232979-ab9c-4b59-8ad8-7756367fe0bf\") " Feb 19 23:53:50 crc kubenswrapper[4795]: I0219 23:53:50.793803 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzsrb\" (UniqueName: \"kubernetes.io/projected/4f232979-ab9c-4b59-8ad8-7756367fe0bf-kube-api-access-pzsrb\") pod \"4f232979-ab9c-4b59-8ad8-7756367fe0bf\" (UID: \"4f232979-ab9c-4b59-8ad8-7756367fe0bf\") " Feb 19 23:53:50 crc kubenswrapper[4795]: I0219 23:53:50.803506 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f232979-ab9c-4b59-8ad8-7756367fe0bf-kube-api-access-pzsrb" (OuterVolumeSpecName: "kube-api-access-pzsrb") pod "4f232979-ab9c-4b59-8ad8-7756367fe0bf" (UID: "4f232979-ab9c-4b59-8ad8-7756367fe0bf"). InnerVolumeSpecName "kube-api-access-pzsrb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:53:50 crc kubenswrapper[4795]: I0219 23:53:50.817228 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b6dccae3-37b8-4abc-8831-9e7ebad3b6a8" (OuterVolumeSpecName: "mariadb-data") pod "4f232979-ab9c-4b59-8ad8-7756367fe0bf" (UID: "4f232979-ab9c-4b59-8ad8-7756367fe0bf"). InnerVolumeSpecName "pvc-b6dccae3-37b8-4abc-8831-9e7ebad3b6a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 19 23:53:50 crc kubenswrapper[4795]: I0219 23:53:50.905582 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pzsrb\" (UniqueName: \"kubernetes.io/projected/4f232979-ab9c-4b59-8ad8-7756367fe0bf-kube-api-access-pzsrb\") on node \"crc\" DevicePath \"\"" Feb 19 23:53:50 crc kubenswrapper[4795]: I0219 23:53:50.905639 4795 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-b6dccae3-37b8-4abc-8831-9e7ebad3b6a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b6dccae3-37b8-4abc-8831-9e7ebad3b6a8\") on node \"crc\" " Feb 19 23:53:50 crc kubenswrapper[4795]: I0219 23:53:50.931615 4795 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Feb 19 23:53:50 crc kubenswrapper[4795]: I0219 23:53:50.931782 4795 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-b6dccae3-37b8-4abc-8831-9e7ebad3b6a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b6dccae3-37b8-4abc-8831-9e7ebad3b6a8") on node "crc" Feb 19 23:53:51 crc kubenswrapper[4795]: I0219 23:53:51.007231 4795 reconciler_common.go:293] "Volume detached for volume \"pvc-b6dccae3-37b8-4abc-8831-9e7ebad3b6a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b6dccae3-37b8-4abc-8831-9e7ebad3b6a8\") on node \"crc\" DevicePath \"\"" Feb 19 23:53:51 crc kubenswrapper[4795]: I0219 23:53:51.537131 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"4f232979-ab9c-4b59-8ad8-7756367fe0bf","Type":"ContainerDied","Data":"6654d86759c1aab8319afbb64f442570ab81fe83e940c47c9e22d533fa8c1665"} Feb 19 23:53:51 crc kubenswrapper[4795]: I0219 23:53:51.537217 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-copy-data" Feb 19 23:53:51 crc kubenswrapper[4795]: I0219 23:53:51.537500 4795 scope.go:117] "RemoveContainer" containerID="d99c65ed2cf2832977c62a47557eeea8eec734877d891b4ac4fe2f4a681f7224" Feb 19 23:53:51 crc kubenswrapper[4795]: I0219 23:53:51.604637 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-copy-data"] Feb 19 23:53:51 crc kubenswrapper[4795]: I0219 23:53:51.623998 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-copy-data"] Feb 19 23:53:52 crc kubenswrapper[4795]: I0219 23:53:52.362989 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-copy-data"] Feb 19 23:53:52 crc kubenswrapper[4795]: I0219 23:53:52.363529 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-copy-data" podUID="59d77cc9-140e-4468-9023-0a973155d290" containerName="adoption" containerID="cri-o://a94a997e126652369a7f539bdbf820b6c97b6808304fc5ab088e5b94e32df40f" gracePeriod=30 Feb 19 23:53:53 crc kubenswrapper[4795]: I0219 23:53:53.525469 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f232979-ab9c-4b59-8ad8-7756367fe0bf" path="/var/lib/kubelet/pods/4f232979-ab9c-4b59-8ad8-7756367fe0bf/volumes" Feb 19 23:54:22 crc kubenswrapper[4795]: I0219 23:54:22.853223 4795 generic.go:334] "Generic (PLEG): container finished" podID="59d77cc9-140e-4468-9023-0a973155d290" containerID="a94a997e126652369a7f539bdbf820b6c97b6808304fc5ab088e5b94e32df40f" exitCode=137 Feb 19 23:54:22 crc kubenswrapper[4795]: I0219 23:54:22.853299 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"59d77cc9-140e-4468-9023-0a973155d290","Type":"ContainerDied","Data":"a94a997e126652369a7f539bdbf820b6c97b6808304fc5ab088e5b94e32df40f"} Feb 19 23:54:22 crc kubenswrapper[4795]: I0219 23:54:22.853615 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" 
event={"ID":"59d77cc9-140e-4468-9023-0a973155d290","Type":"ContainerDied","Data":"d1d8d1a7f24d4b32f5d50dbc56c8c75a39927408e5337eb8542c8ba059a2e9d8"} Feb 19 23:54:22 crc kubenswrapper[4795]: I0219 23:54:22.853631 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d1d8d1a7f24d4b32f5d50dbc56c8c75a39927408e5337eb8542c8ba059a2e9d8" Feb 19 23:54:22 crc kubenswrapper[4795]: I0219 23:54:22.940359 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-copy-data" Feb 19 23:54:22 crc kubenswrapper[4795]: I0219 23:54:22.994065 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-data\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fe46f653-3b46-49f1-9da1-d67576b17f42\") pod \"59d77cc9-140e-4468-9023-0a973155d290\" (UID: \"59d77cc9-140e-4468-9023-0a973155d290\") " Feb 19 23:54:22 crc kubenswrapper[4795]: I0219 23:54:22.994432 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jcjlg\" (UniqueName: \"kubernetes.io/projected/59d77cc9-140e-4468-9023-0a973155d290-kube-api-access-jcjlg\") pod \"59d77cc9-140e-4468-9023-0a973155d290\" (UID: \"59d77cc9-140e-4468-9023-0a973155d290\") " Feb 19 23:54:22 crc kubenswrapper[4795]: I0219 23:54:22.994483 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/59d77cc9-140e-4468-9023-0a973155d290-ovn-data-cert\") pod \"59d77cc9-140e-4468-9023-0a973155d290\" (UID: \"59d77cc9-140e-4468-9023-0a973155d290\") " Feb 19 23:54:23 crc kubenswrapper[4795]: I0219 23:54:23.001111 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59d77cc9-140e-4468-9023-0a973155d290-kube-api-access-jcjlg" (OuterVolumeSpecName: "kube-api-access-jcjlg") pod "59d77cc9-140e-4468-9023-0a973155d290" (UID: "59d77cc9-140e-4468-9023-0a973155d290"). 
InnerVolumeSpecName "kube-api-access-jcjlg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:54:23 crc kubenswrapper[4795]: I0219 23:54:23.002310 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59d77cc9-140e-4468-9023-0a973155d290-ovn-data-cert" (OuterVolumeSpecName: "ovn-data-cert") pod "59d77cc9-140e-4468-9023-0a973155d290" (UID: "59d77cc9-140e-4468-9023-0a973155d290"). InnerVolumeSpecName "ovn-data-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 23:54:23 crc kubenswrapper[4795]: I0219 23:54:23.019289 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fe46f653-3b46-49f1-9da1-d67576b17f42" (OuterVolumeSpecName: "ovn-data") pod "59d77cc9-140e-4468-9023-0a973155d290" (UID: "59d77cc9-140e-4468-9023-0a973155d290"). InnerVolumeSpecName "pvc-fe46f653-3b46-49f1-9da1-d67576b17f42". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 19 23:54:23 crc kubenswrapper[4795]: I0219 23:54:23.097419 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jcjlg\" (UniqueName: \"kubernetes.io/projected/59d77cc9-140e-4468-9023-0a973155d290-kube-api-access-jcjlg\") on node \"crc\" DevicePath \"\"" Feb 19 23:54:23 crc kubenswrapper[4795]: I0219 23:54:23.097469 4795 reconciler_common.go:293] "Volume detached for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/59d77cc9-140e-4468-9023-0a973155d290-ovn-data-cert\") on node \"crc\" DevicePath \"\"" Feb 19 23:54:23 crc kubenswrapper[4795]: I0219 23:54:23.097525 4795 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-fe46f653-3b46-49f1-9da1-d67576b17f42\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fe46f653-3b46-49f1-9da1-d67576b17f42\") on node \"crc\" " Feb 19 23:54:23 crc kubenswrapper[4795]: I0219 23:54:23.129353 4795 csi_attacher.go:630] kubernetes.io/csi: 
attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Feb 19 23:54:23 crc kubenswrapper[4795]: I0219 23:54:23.129567 4795 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-fe46f653-3b46-49f1-9da1-d67576b17f42" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fe46f653-3b46-49f1-9da1-d67576b17f42") on node "crc" Feb 19 23:54:23 crc kubenswrapper[4795]: I0219 23:54:23.199741 4795 reconciler_common.go:293] "Volume detached for volume \"pvc-fe46f653-3b46-49f1-9da1-d67576b17f42\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fe46f653-3b46-49f1-9da1-d67576b17f42\") on node \"crc\" DevicePath \"\"" Feb 19 23:54:23 crc kubenswrapper[4795]: I0219 23:54:23.864956 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-copy-data" Feb 19 23:54:23 crc kubenswrapper[4795]: I0219 23:54:23.897915 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-copy-data"] Feb 19 23:54:23 crc kubenswrapper[4795]: I0219 23:54:23.910395 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-copy-data"] Feb 19 23:54:25 crc kubenswrapper[4795]: I0219 23:54:25.522513 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59d77cc9-140e-4468-9023-0a973155d290" path="/var/lib/kubelet/pods/59d77cc9-140e-4468-9023-0a973155d290/volumes" Feb 19 23:54:29 crc kubenswrapper[4795]: I0219 23:54:29.054104 4795 scope.go:117] "RemoveContainer" containerID="a94a997e126652369a7f539bdbf820b6c97b6808304fc5ab088e5b94e32df40f" Feb 19 23:55:28 crc kubenswrapper[4795]: I0219 23:55:28.427485 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 23:55:28 crc kubenswrapper[4795]: I0219 
23:55:28.427936 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 23:55:29 crc kubenswrapper[4795]: I0219 23:55:29.213026 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-gv265/must-gather-ltjmp"] Feb 19 23:55:29 crc kubenswrapper[4795]: E0219 23:55:29.213809 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2031ff34-f306-4920-8bb4-a6f0151a9aa3" containerName="registry-server" Feb 19 23:55:29 crc kubenswrapper[4795]: I0219 23:55:29.213832 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2031ff34-f306-4920-8bb4-a6f0151a9aa3" containerName="registry-server" Feb 19 23:55:29 crc kubenswrapper[4795]: E0219 23:55:29.213858 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59d77cc9-140e-4468-9023-0a973155d290" containerName="adoption" Feb 19 23:55:29 crc kubenswrapper[4795]: I0219 23:55:29.213865 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="59d77cc9-140e-4468-9023-0a973155d290" containerName="adoption" Feb 19 23:55:29 crc kubenswrapper[4795]: E0219 23:55:29.213888 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2031ff34-f306-4920-8bb4-a6f0151a9aa3" containerName="extract-utilities" Feb 19 23:55:29 crc kubenswrapper[4795]: I0219 23:55:29.213894 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2031ff34-f306-4920-8bb4-a6f0151a9aa3" containerName="extract-utilities" Feb 19 23:55:29 crc kubenswrapper[4795]: E0219 23:55:29.213909 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2031ff34-f306-4920-8bb4-a6f0151a9aa3" containerName="extract-content" Feb 19 23:55:29 crc kubenswrapper[4795]: I0219 23:55:29.213914 4795 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="2031ff34-f306-4920-8bb4-a6f0151a9aa3" containerName="extract-content" Feb 19 23:55:29 crc kubenswrapper[4795]: E0219 23:55:29.213928 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f232979-ab9c-4b59-8ad8-7756367fe0bf" containerName="adoption" Feb 19 23:55:29 crc kubenswrapper[4795]: I0219 23:55:29.213937 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f232979-ab9c-4b59-8ad8-7756367fe0bf" containerName="adoption" Feb 19 23:55:29 crc kubenswrapper[4795]: I0219 23:55:29.214157 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="2031ff34-f306-4920-8bb4-a6f0151a9aa3" containerName="registry-server" Feb 19 23:55:29 crc kubenswrapper[4795]: I0219 23:55:29.214171 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f232979-ab9c-4b59-8ad8-7756367fe0bf" containerName="adoption" Feb 19 23:55:29 crc kubenswrapper[4795]: I0219 23:55:29.214204 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="59d77cc9-140e-4468-9023-0a973155d290" containerName="adoption" Feb 19 23:55:29 crc kubenswrapper[4795]: I0219 23:55:29.215406 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-gv265/must-gather-ltjmp" Feb 19 23:55:29 crc kubenswrapper[4795]: I0219 23:55:29.217078 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-gv265"/"default-dockercfg-wm7vn" Feb 19 23:55:29 crc kubenswrapper[4795]: I0219 23:55:29.217560 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-gv265"/"openshift-service-ca.crt" Feb 19 23:55:29 crc kubenswrapper[4795]: I0219 23:55:29.217892 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-gv265"/"kube-root-ca.crt" Feb 19 23:55:29 crc kubenswrapper[4795]: I0219 23:55:29.231447 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gv265/must-gather-ltjmp"] Feb 19 23:55:29 crc kubenswrapper[4795]: I0219 23:55:29.291565 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d06c32e0-d01f-47e9-871b-9fdfb391d796-must-gather-output\") pod \"must-gather-ltjmp\" (UID: \"d06c32e0-d01f-47e9-871b-9fdfb391d796\") " pod="openshift-must-gather-gv265/must-gather-ltjmp" Feb 19 23:55:29 crc kubenswrapper[4795]: I0219 23:55:29.291907 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5kfl\" (UniqueName: \"kubernetes.io/projected/d06c32e0-d01f-47e9-871b-9fdfb391d796-kube-api-access-g5kfl\") pod \"must-gather-ltjmp\" (UID: \"d06c32e0-d01f-47e9-871b-9fdfb391d796\") " pod="openshift-must-gather-gv265/must-gather-ltjmp" Feb 19 23:55:29 crc kubenswrapper[4795]: I0219 23:55:29.393487 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d06c32e0-d01f-47e9-871b-9fdfb391d796-must-gather-output\") pod \"must-gather-ltjmp\" (UID: \"d06c32e0-d01f-47e9-871b-9fdfb391d796\") " 
pod="openshift-must-gather-gv265/must-gather-ltjmp" Feb 19 23:55:29 crc kubenswrapper[4795]: I0219 23:55:29.393600 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5kfl\" (UniqueName: \"kubernetes.io/projected/d06c32e0-d01f-47e9-871b-9fdfb391d796-kube-api-access-g5kfl\") pod \"must-gather-ltjmp\" (UID: \"d06c32e0-d01f-47e9-871b-9fdfb391d796\") " pod="openshift-must-gather-gv265/must-gather-ltjmp" Feb 19 23:55:29 crc kubenswrapper[4795]: I0219 23:55:29.394328 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d06c32e0-d01f-47e9-871b-9fdfb391d796-must-gather-output\") pod \"must-gather-ltjmp\" (UID: \"d06c32e0-d01f-47e9-871b-9fdfb391d796\") " pod="openshift-must-gather-gv265/must-gather-ltjmp" Feb 19 23:55:29 crc kubenswrapper[4795]: I0219 23:55:29.727789 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5kfl\" (UniqueName: \"kubernetes.io/projected/d06c32e0-d01f-47e9-871b-9fdfb391d796-kube-api-access-g5kfl\") pod \"must-gather-ltjmp\" (UID: \"d06c32e0-d01f-47e9-871b-9fdfb391d796\") " pod="openshift-must-gather-gv265/must-gather-ltjmp" Feb 19 23:55:29 crc kubenswrapper[4795]: I0219 23:55:29.837703 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-gv265/must-gather-ltjmp" Feb 19 23:55:30 crc kubenswrapper[4795]: I0219 23:55:30.365286 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gv265/must-gather-ltjmp"] Feb 19 23:55:30 crc kubenswrapper[4795]: I0219 23:55:30.553824 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gv265/must-gather-ltjmp" event={"ID":"d06c32e0-d01f-47e9-871b-9fdfb391d796","Type":"ContainerStarted","Data":"5fd8d3d5525d94b61cf12ae80a853096d968888e1ccdce6628c381ab590f1eb7"} Feb 19 23:55:35 crc kubenswrapper[4795]: I0219 23:55:35.735534 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-sz6pt"] Feb 19 23:55:35 crc kubenswrapper[4795]: I0219 23:55:35.755434 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sz6pt" Feb 19 23:55:35 crc kubenswrapper[4795]: I0219 23:55:35.776326 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sz6pt"] Feb 19 23:55:35 crc kubenswrapper[4795]: I0219 23:55:35.830243 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f2da7b5-d729-463f-9589-455203a5ad9e-utilities\") pod \"redhat-operators-sz6pt\" (UID: \"8f2da7b5-d729-463f-9589-455203a5ad9e\") " pod="openshift-marketplace/redhat-operators-sz6pt" Feb 19 23:55:35 crc kubenswrapper[4795]: I0219 23:55:35.830314 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f2da7b5-d729-463f-9589-455203a5ad9e-catalog-content\") pod \"redhat-operators-sz6pt\" (UID: \"8f2da7b5-d729-463f-9589-455203a5ad9e\") " pod="openshift-marketplace/redhat-operators-sz6pt" Feb 19 23:55:35 crc kubenswrapper[4795]: I0219 23:55:35.830411 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trfkf\" (UniqueName: \"kubernetes.io/projected/8f2da7b5-d729-463f-9589-455203a5ad9e-kube-api-access-trfkf\") pod \"redhat-operators-sz6pt\" (UID: \"8f2da7b5-d729-463f-9589-455203a5ad9e\") " pod="openshift-marketplace/redhat-operators-sz6pt" Feb 19 23:55:35 crc kubenswrapper[4795]: I0219 23:55:35.931968 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f2da7b5-d729-463f-9589-455203a5ad9e-utilities\") pod \"redhat-operators-sz6pt\" (UID: \"8f2da7b5-d729-463f-9589-455203a5ad9e\") " pod="openshift-marketplace/redhat-operators-sz6pt" Feb 19 23:55:35 crc kubenswrapper[4795]: I0219 23:55:35.932045 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f2da7b5-d729-463f-9589-455203a5ad9e-catalog-content\") pod \"redhat-operators-sz6pt\" (UID: \"8f2da7b5-d729-463f-9589-455203a5ad9e\") " pod="openshift-marketplace/redhat-operators-sz6pt" Feb 19 23:55:35 crc kubenswrapper[4795]: I0219 23:55:35.932218 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trfkf\" (UniqueName: \"kubernetes.io/projected/8f2da7b5-d729-463f-9589-455203a5ad9e-kube-api-access-trfkf\") pod \"redhat-operators-sz6pt\" (UID: \"8f2da7b5-d729-463f-9589-455203a5ad9e\") " pod="openshift-marketplace/redhat-operators-sz6pt" Feb 19 23:55:35 crc kubenswrapper[4795]: I0219 23:55:35.932883 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f2da7b5-d729-463f-9589-455203a5ad9e-utilities\") pod \"redhat-operators-sz6pt\" (UID: \"8f2da7b5-d729-463f-9589-455203a5ad9e\") " pod="openshift-marketplace/redhat-operators-sz6pt" Feb 19 23:55:35 crc kubenswrapper[4795]: I0219 23:55:35.932966 4795 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f2da7b5-d729-463f-9589-455203a5ad9e-catalog-content\") pod \"redhat-operators-sz6pt\" (UID: \"8f2da7b5-d729-463f-9589-455203a5ad9e\") " pod="openshift-marketplace/redhat-operators-sz6pt" Feb 19 23:55:35 crc kubenswrapper[4795]: I0219 23:55:35.959396 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trfkf\" (UniqueName: \"kubernetes.io/projected/8f2da7b5-d729-463f-9589-455203a5ad9e-kube-api-access-trfkf\") pod \"redhat-operators-sz6pt\" (UID: \"8f2da7b5-d729-463f-9589-455203a5ad9e\") " pod="openshift-marketplace/redhat-operators-sz6pt" Feb 19 23:55:36 crc kubenswrapper[4795]: I0219 23:55:36.109732 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sz6pt" Feb 19 23:55:37 crc kubenswrapper[4795]: W0219 23:55:37.295407 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f2da7b5_d729_463f_9589_455203a5ad9e.slice/crio-a6d95aca702893e3ea5637611a6e798f2de86b26f77a8aeaf0d2c93be967e572 WatchSource:0}: Error finding container a6d95aca702893e3ea5637611a6e798f2de86b26f77a8aeaf0d2c93be967e572: Status 404 returned error can't find the container with id a6d95aca702893e3ea5637611a6e798f2de86b26f77a8aeaf0d2c93be967e572 Feb 19 23:55:37 crc kubenswrapper[4795]: I0219 23:55:37.297352 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sz6pt"] Feb 19 23:55:37 crc kubenswrapper[4795]: I0219 23:55:37.626560 4795 generic.go:334] "Generic (PLEG): container finished" podID="8f2da7b5-d729-463f-9589-455203a5ad9e" containerID="62deb3f2bf568b395b46bc81de81ee26457a95f02f006897224aa24bc7dac271" exitCode=0 Feb 19 23:55:37 crc kubenswrapper[4795]: I0219 23:55:37.626614 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sz6pt" 
event={"ID":"8f2da7b5-d729-463f-9589-455203a5ad9e","Type":"ContainerDied","Data":"62deb3f2bf568b395b46bc81de81ee26457a95f02f006897224aa24bc7dac271"} Feb 19 23:55:37 crc kubenswrapper[4795]: I0219 23:55:37.626870 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sz6pt" event={"ID":"8f2da7b5-d729-463f-9589-455203a5ad9e","Type":"ContainerStarted","Data":"a6d95aca702893e3ea5637611a6e798f2de86b26f77a8aeaf0d2c93be967e572"} Feb 19 23:55:37 crc kubenswrapper[4795]: I0219 23:55:37.629597 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gv265/must-gather-ltjmp" event={"ID":"d06c32e0-d01f-47e9-871b-9fdfb391d796","Type":"ContainerStarted","Data":"340d2615d25276b88141337bb1421c18e4c691a87d218a924ca2427d0a329e70"} Feb 19 23:55:37 crc kubenswrapper[4795]: I0219 23:55:37.629638 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gv265/must-gather-ltjmp" event={"ID":"d06c32e0-d01f-47e9-871b-9fdfb391d796","Type":"ContainerStarted","Data":"35f5634e272e68850396058b8dbd9d80a146c6ccfa053b21de355451b37156a5"} Feb 19 23:55:37 crc kubenswrapper[4795]: I0219 23:55:37.666379 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-gv265/must-gather-ltjmp" podStartSLOduration=2.196674599 podStartE2EDuration="8.666358667s" podCreationTimestamp="2026-02-19 23:55:29 +0000 UTC" firstStartedPulling="2026-02-19 23:55:30.371206465 +0000 UTC m=+8841.563724329" lastFinishedPulling="2026-02-19 23:55:36.840890543 +0000 UTC m=+8848.033408397" observedRunningTime="2026-02-19 23:55:37.657763495 +0000 UTC m=+8848.850281359" watchObservedRunningTime="2026-02-19 23:55:37.666358667 +0000 UTC m=+8848.858876531" Feb 19 23:55:39 crc kubenswrapper[4795]: I0219 23:55:39.654472 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sz6pt" 
event={"ID":"8f2da7b5-d729-463f-9589-455203a5ad9e","Type":"ContainerStarted","Data":"07a4317117426bae86bf07c8d5e610095ccbae15e950025068b03dd800e0da4a"}
Feb 19 23:55:41 crc kubenswrapper[4795]: I0219 23:55:41.842344 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-gv265/crc-debug-htwdp"]
Feb 19 23:55:41 crc kubenswrapper[4795]: I0219 23:55:41.844223 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gv265/crc-debug-htwdp"
Feb 19 23:55:41 crc kubenswrapper[4795]: I0219 23:55:41.969589 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0993101a-b7bc-40fd-a75c-9d6eefe49025-host\") pod \"crc-debug-htwdp\" (UID: \"0993101a-b7bc-40fd-a75c-9d6eefe49025\") " pod="openshift-must-gather-gv265/crc-debug-htwdp"
Feb 19 23:55:41 crc kubenswrapper[4795]: I0219 23:55:41.969773 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbpz9\" (UniqueName: \"kubernetes.io/projected/0993101a-b7bc-40fd-a75c-9d6eefe49025-kube-api-access-hbpz9\") pod \"crc-debug-htwdp\" (UID: \"0993101a-b7bc-40fd-a75c-9d6eefe49025\") " pod="openshift-must-gather-gv265/crc-debug-htwdp"
Feb 19 23:55:42 crc kubenswrapper[4795]: I0219 23:55:42.076763 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0993101a-b7bc-40fd-a75c-9d6eefe49025-host\") pod \"crc-debug-htwdp\" (UID: \"0993101a-b7bc-40fd-a75c-9d6eefe49025\") " pod="openshift-must-gather-gv265/crc-debug-htwdp"
Feb 19 23:55:42 crc kubenswrapper[4795]: I0219 23:55:42.076877 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0993101a-b7bc-40fd-a75c-9d6eefe49025-host\") pod \"crc-debug-htwdp\" (UID: \"0993101a-b7bc-40fd-a75c-9d6eefe49025\") " pod="openshift-must-gather-gv265/crc-debug-htwdp"
Feb 19 23:55:42 crc kubenswrapper[4795]: I0219 23:55:42.077113 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbpz9\" (UniqueName: \"kubernetes.io/projected/0993101a-b7bc-40fd-a75c-9d6eefe49025-kube-api-access-hbpz9\") pod \"crc-debug-htwdp\" (UID: \"0993101a-b7bc-40fd-a75c-9d6eefe49025\") " pod="openshift-must-gather-gv265/crc-debug-htwdp"
Feb 19 23:55:42 crc kubenswrapper[4795]: I0219 23:55:42.101145 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbpz9\" (UniqueName: \"kubernetes.io/projected/0993101a-b7bc-40fd-a75c-9d6eefe49025-kube-api-access-hbpz9\") pod \"crc-debug-htwdp\" (UID: \"0993101a-b7bc-40fd-a75c-9d6eefe49025\") " pod="openshift-must-gather-gv265/crc-debug-htwdp"
Feb 19 23:55:42 crc kubenswrapper[4795]: I0219 23:55:42.164820 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gv265/crc-debug-htwdp"
Feb 19 23:55:42 crc kubenswrapper[4795]: W0219 23:55:42.212894 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0993101a_b7bc_40fd_a75c_9d6eefe49025.slice/crio-a07f1a9d217d7811f21420408d2460fcdb1ae5287bd4e9acf1ba0f20fa813088 WatchSource:0}: Error finding container a07f1a9d217d7811f21420408d2460fcdb1ae5287bd4e9acf1ba0f20fa813088: Status 404 returned error can't find the container with id a07f1a9d217d7811f21420408d2460fcdb1ae5287bd4e9acf1ba0f20fa813088
Feb 19 23:55:42 crc kubenswrapper[4795]: I0219 23:55:42.680011 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gv265/crc-debug-htwdp" event={"ID":"0993101a-b7bc-40fd-a75c-9d6eefe49025","Type":"ContainerStarted","Data":"a07f1a9d217d7811f21420408d2460fcdb1ae5287bd4e9acf1ba0f20fa813088"}
Feb 19 23:55:43 crc kubenswrapper[4795]: I0219 23:55:43.691045 4795 generic.go:334] "Generic (PLEG): container finished" podID="8f2da7b5-d729-463f-9589-455203a5ad9e" containerID="07a4317117426bae86bf07c8d5e610095ccbae15e950025068b03dd800e0da4a" exitCode=0
Feb 19 23:55:43 crc kubenswrapper[4795]: I0219 23:55:43.691143 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sz6pt" event={"ID":"8f2da7b5-d729-463f-9589-455203a5ad9e","Type":"ContainerDied","Data":"07a4317117426bae86bf07c8d5e610095ccbae15e950025068b03dd800e0da4a"}
Feb 19 23:55:44 crc kubenswrapper[4795]: I0219 23:55:44.702225 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sz6pt" event={"ID":"8f2da7b5-d729-463f-9589-455203a5ad9e","Type":"ContainerStarted","Data":"6d065a9d149243251b6b58ebaf53b077b0910d5423b5a4944782ee17e37027c2"}
Feb 19 23:55:44 crc kubenswrapper[4795]: I0219 23:55:44.733667 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-sz6pt" podStartSLOduration=3.3004902019999998 podStartE2EDuration="9.733652725s" podCreationTimestamp="2026-02-19 23:55:35 +0000 UTC" firstStartedPulling="2026-02-19 23:55:37.628940747 +0000 UTC m=+8848.821458601" lastFinishedPulling="2026-02-19 23:55:44.06210326 +0000 UTC m=+8855.254621124" observedRunningTime="2026-02-19 23:55:44.728475889 +0000 UTC m=+8855.920993743" watchObservedRunningTime="2026-02-19 23:55:44.733652725 +0000 UTC m=+8855.926170579"
Feb 19 23:55:46 crc kubenswrapper[4795]: I0219 23:55:46.110602 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-sz6pt"
Feb 19 23:55:46 crc kubenswrapper[4795]: I0219 23:55:46.110858 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-sz6pt"
Feb 19 23:55:47 crc kubenswrapper[4795]: I0219 23:55:47.197852 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-sz6pt" podUID="8f2da7b5-d729-463f-9589-455203a5ad9e" containerName="registry-server" probeResult="failure" output=<
Feb 19 23:55:47 crc kubenswrapper[4795]: timeout: failed to connect service ":50051" within 1s
Feb 19 23:55:47 crc kubenswrapper[4795]: >
Feb 19 23:55:54 crc kubenswrapper[4795]: I0219 23:55:54.796381 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gv265/crc-debug-htwdp" event={"ID":"0993101a-b7bc-40fd-a75c-9d6eefe49025","Type":"ContainerStarted","Data":"de244572ed412b6d15883ac7be00a681de7409d595350d300ebc7aac8f96c690"}
Feb 19 23:55:54 crc kubenswrapper[4795]: I0219 23:55:54.818157 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-gv265/crc-debug-htwdp" podStartSLOduration=1.7409129650000001 podStartE2EDuration="13.818138079s" podCreationTimestamp="2026-02-19 23:55:41 +0000 UTC" firstStartedPulling="2026-02-19 23:55:42.215635965 +0000 UTC m=+8853.408153839" lastFinishedPulling="2026-02-19 23:55:54.292861099 +0000 UTC m=+8865.485378953" observedRunningTime="2026-02-19 23:55:54.810976878 +0000 UTC m=+8866.003494742" watchObservedRunningTime="2026-02-19 23:55:54.818138079 +0000 UTC m=+8866.010655943"
Feb 19 23:55:57 crc kubenswrapper[4795]: I0219 23:55:57.165275 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-sz6pt" podUID="8f2da7b5-d729-463f-9589-455203a5ad9e" containerName="registry-server" probeResult="failure" output=<
Feb 19 23:55:57 crc kubenswrapper[4795]: timeout: failed to connect service ":50051" within 1s
Feb 19 23:55:57 crc kubenswrapper[4795]: >
Feb 19 23:55:58 crc kubenswrapper[4795]: I0219 23:55:58.427532 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 23:55:58 crc kubenswrapper[4795]: I0219 23:55:58.427849 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 23:56:07 crc kubenswrapper[4795]: I0219 23:56:07.157502 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-sz6pt" podUID="8f2da7b5-d729-463f-9589-455203a5ad9e" containerName="registry-server" probeResult="failure" output=<
Feb 19 23:56:07 crc kubenswrapper[4795]: timeout: failed to connect service ":50051" within 1s
Feb 19 23:56:07 crc kubenswrapper[4795]: >
Feb 19 23:56:17 crc kubenswrapper[4795]: I0219 23:56:17.155780 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-sz6pt" podUID="8f2da7b5-d729-463f-9589-455203a5ad9e" containerName="registry-server" probeResult="failure" output=<
Feb 19 23:56:17 crc kubenswrapper[4795]: timeout: failed to connect service ":50051" within 1s
Feb 19 23:56:17 crc kubenswrapper[4795]: >
Feb 19 23:56:21 crc kubenswrapper[4795]: I0219 23:56:21.088643 4795 generic.go:334] "Generic (PLEG): container finished" podID="0993101a-b7bc-40fd-a75c-9d6eefe49025" containerID="de244572ed412b6d15883ac7be00a681de7409d595350d300ebc7aac8f96c690" exitCode=0
Feb 19 23:56:21 crc kubenswrapper[4795]: I0219 23:56:21.088720 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gv265/crc-debug-htwdp" event={"ID":"0993101a-b7bc-40fd-a75c-9d6eefe49025","Type":"ContainerDied","Data":"de244572ed412b6d15883ac7be00a681de7409d595350d300ebc7aac8f96c690"}
Feb 19 23:56:22 crc kubenswrapper[4795]: I0219 23:56:22.265539 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gv265/crc-debug-htwdp"
Feb 19 23:56:22 crc kubenswrapper[4795]: I0219 23:56:22.314532 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-gv265/crc-debug-htwdp"]
Feb 19 23:56:22 crc kubenswrapper[4795]: I0219 23:56:22.332857 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-gv265/crc-debug-htwdp"]
Feb 19 23:56:22 crc kubenswrapper[4795]: I0219 23:56:22.413257 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hbpz9\" (UniqueName: \"kubernetes.io/projected/0993101a-b7bc-40fd-a75c-9d6eefe49025-kube-api-access-hbpz9\") pod \"0993101a-b7bc-40fd-a75c-9d6eefe49025\" (UID: \"0993101a-b7bc-40fd-a75c-9d6eefe49025\") "
Feb 19 23:56:22 crc kubenswrapper[4795]: I0219 23:56:22.413517 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0993101a-b7bc-40fd-a75c-9d6eefe49025-host\") pod \"0993101a-b7bc-40fd-a75c-9d6eefe49025\" (UID: \"0993101a-b7bc-40fd-a75c-9d6eefe49025\") "
Feb 19 23:56:22 crc kubenswrapper[4795]: I0219 23:56:22.413629 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0993101a-b7bc-40fd-a75c-9d6eefe49025-host" (OuterVolumeSpecName: "host") pod "0993101a-b7bc-40fd-a75c-9d6eefe49025" (UID: "0993101a-b7bc-40fd-a75c-9d6eefe49025"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 19 23:56:22 crc kubenswrapper[4795]: I0219 23:56:22.414087 4795 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0993101a-b7bc-40fd-a75c-9d6eefe49025-host\") on node \"crc\" DevicePath \"\""
Feb 19 23:56:22 crc kubenswrapper[4795]: I0219 23:56:22.421958 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0993101a-b7bc-40fd-a75c-9d6eefe49025-kube-api-access-hbpz9" (OuterVolumeSpecName: "kube-api-access-hbpz9") pod "0993101a-b7bc-40fd-a75c-9d6eefe49025" (UID: "0993101a-b7bc-40fd-a75c-9d6eefe49025"). InnerVolumeSpecName "kube-api-access-hbpz9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 23:56:22 crc kubenswrapper[4795]: I0219 23:56:22.516488 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hbpz9\" (UniqueName: \"kubernetes.io/projected/0993101a-b7bc-40fd-a75c-9d6eefe49025-kube-api-access-hbpz9\") on node \"crc\" DevicePath \"\""
Feb 19 23:56:23 crc kubenswrapper[4795]: I0219 23:56:23.107451 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a07f1a9d217d7811f21420408d2460fcdb1ae5287bd4e9acf1ba0f20fa813088"
Feb 19 23:56:23 crc kubenswrapper[4795]: I0219 23:56:23.107523 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gv265/crc-debug-htwdp"
Feb 19 23:56:23 crc kubenswrapper[4795]: E0219 23:56:23.346987 4795 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0993101a_b7bc_40fd_a75c_9d6eefe49025.slice/crio-a07f1a9d217d7811f21420408d2460fcdb1ae5287bd4e9acf1ba0f20fa813088\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0993101a_b7bc_40fd_a75c_9d6eefe49025.slice\": RecentStats: unable to find data in memory cache]"
Feb 19 23:56:23 crc kubenswrapper[4795]: I0219 23:56:23.502983 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-gv265/crc-debug-b2t7k"]
Feb 19 23:56:23 crc kubenswrapper[4795]: E0219 23:56:23.503475 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0993101a-b7bc-40fd-a75c-9d6eefe49025" containerName="container-00"
Feb 19 23:56:23 crc kubenswrapper[4795]: I0219 23:56:23.503494 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="0993101a-b7bc-40fd-a75c-9d6eefe49025" containerName="container-00"
Feb 19 23:56:23 crc kubenswrapper[4795]: I0219 23:56:23.503708 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="0993101a-b7bc-40fd-a75c-9d6eefe49025" containerName="container-00"
Feb 19 23:56:23 crc kubenswrapper[4795]: I0219 23:56:23.504432 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gv265/crc-debug-b2t7k"
Feb 19 23:56:23 crc kubenswrapper[4795]: I0219 23:56:23.524402 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0993101a-b7bc-40fd-a75c-9d6eefe49025" path="/var/lib/kubelet/pods/0993101a-b7bc-40fd-a75c-9d6eefe49025/volumes"
Feb 19 23:56:23 crc kubenswrapper[4795]: I0219 23:56:23.535775 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c8080932-0083-4fa5-9816-6f8f4c16c917-host\") pod \"crc-debug-b2t7k\" (UID: \"c8080932-0083-4fa5-9816-6f8f4c16c917\") " pod="openshift-must-gather-gv265/crc-debug-b2t7k"
Feb 19 23:56:23 crc kubenswrapper[4795]: I0219 23:56:23.536062 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42v45\" (UniqueName: \"kubernetes.io/projected/c8080932-0083-4fa5-9816-6f8f4c16c917-kube-api-access-42v45\") pod \"crc-debug-b2t7k\" (UID: \"c8080932-0083-4fa5-9816-6f8f4c16c917\") " pod="openshift-must-gather-gv265/crc-debug-b2t7k"
Feb 19 23:56:23 crc kubenswrapper[4795]: I0219 23:56:23.638650 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42v45\" (UniqueName: \"kubernetes.io/projected/c8080932-0083-4fa5-9816-6f8f4c16c917-kube-api-access-42v45\") pod \"crc-debug-b2t7k\" (UID: \"c8080932-0083-4fa5-9816-6f8f4c16c917\") " pod="openshift-must-gather-gv265/crc-debug-b2t7k"
Feb 19 23:56:23 crc kubenswrapper[4795]: I0219 23:56:23.638754 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c8080932-0083-4fa5-9816-6f8f4c16c917-host\") pod \"crc-debug-b2t7k\" (UID: \"c8080932-0083-4fa5-9816-6f8f4c16c917\") " pod="openshift-must-gather-gv265/crc-debug-b2t7k"
Feb 19 23:56:23 crc kubenswrapper[4795]: I0219 23:56:23.638884 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c8080932-0083-4fa5-9816-6f8f4c16c917-host\") pod \"crc-debug-b2t7k\" (UID: \"c8080932-0083-4fa5-9816-6f8f4c16c917\") " pod="openshift-must-gather-gv265/crc-debug-b2t7k"
Feb 19 23:56:23 crc kubenswrapper[4795]: I0219 23:56:23.657283 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42v45\" (UniqueName: \"kubernetes.io/projected/c8080932-0083-4fa5-9816-6f8f4c16c917-kube-api-access-42v45\") pod \"crc-debug-b2t7k\" (UID: \"c8080932-0083-4fa5-9816-6f8f4c16c917\") " pod="openshift-must-gather-gv265/crc-debug-b2t7k"
Feb 19 23:56:23 crc kubenswrapper[4795]: I0219 23:56:23.822972 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gv265/crc-debug-b2t7k"
Feb 19 23:56:24 crc kubenswrapper[4795]: I0219 23:56:24.120256 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gv265/crc-debug-b2t7k" event={"ID":"c8080932-0083-4fa5-9816-6f8f4c16c917","Type":"ContainerStarted","Data":"3644eb2ff205a6ecbf078ee98cc404ff628a9964f7f066ce4f38a2487488db5f"}
Feb 19 23:56:24 crc kubenswrapper[4795]: I0219 23:56:24.120586 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gv265/crc-debug-b2t7k" event={"ID":"c8080932-0083-4fa5-9816-6f8f4c16c917","Type":"ContainerStarted","Data":"0b0224086177e9301d265a5e33b0ab2e6fe6e59ed4f2c42819f86d26d52c4151"}
Feb 19 23:56:24 crc kubenswrapper[4795]: I0219 23:56:24.131353 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-gv265/crc-debug-b2t7k" podStartSLOduration=1.131334908 podStartE2EDuration="1.131334908s" podCreationTimestamp="2026-02-19 23:56:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 23:56:24.130533935 +0000 UTC m=+8895.323051799" watchObservedRunningTime="2026-02-19 23:56:24.131334908 +0000 UTC m=+8895.323852772"
Feb 19 23:56:25 crc kubenswrapper[4795]: I0219 23:56:25.131258 4795 generic.go:334] "Generic (PLEG): container finished" podID="c8080932-0083-4fa5-9816-6f8f4c16c917" containerID="3644eb2ff205a6ecbf078ee98cc404ff628a9964f7f066ce4f38a2487488db5f" exitCode=0
Feb 19 23:56:25 crc kubenswrapper[4795]: I0219 23:56:25.131359 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gv265/crc-debug-b2t7k" event={"ID":"c8080932-0083-4fa5-9816-6f8f4c16c917","Type":"ContainerDied","Data":"3644eb2ff205a6ecbf078ee98cc404ff628a9964f7f066ce4f38a2487488db5f"}
Feb 19 23:56:26 crc kubenswrapper[4795]: I0219 23:56:26.168236 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-sz6pt"
Feb 19 23:56:26 crc kubenswrapper[4795]: I0219 23:56:26.236929 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-sz6pt"
Feb 19 23:56:26 crc kubenswrapper[4795]: I0219 23:56:26.255699 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gv265/crc-debug-b2t7k"
Feb 19 23:56:26 crc kubenswrapper[4795]: I0219 23:56:26.293970 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-gv265/crc-debug-b2t7k"]
Feb 19 23:56:26 crc kubenswrapper[4795]: I0219 23:56:26.302235 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-gv265/crc-debug-b2t7k"]
Feb 19 23:56:26 crc kubenswrapper[4795]: I0219 23:56:26.391488 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c8080932-0083-4fa5-9816-6f8f4c16c917-host\") pod \"c8080932-0083-4fa5-9816-6f8f4c16c917\" (UID: \"c8080932-0083-4fa5-9816-6f8f4c16c917\") "
Feb 19 23:56:26 crc kubenswrapper[4795]: I0219 23:56:26.391615 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42v45\" (UniqueName: \"kubernetes.io/projected/c8080932-0083-4fa5-9816-6f8f4c16c917-kube-api-access-42v45\") pod \"c8080932-0083-4fa5-9816-6f8f4c16c917\" (UID: \"c8080932-0083-4fa5-9816-6f8f4c16c917\") "
Feb 19 23:56:26 crc kubenswrapper[4795]: I0219 23:56:26.391847 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c8080932-0083-4fa5-9816-6f8f4c16c917-host" (OuterVolumeSpecName: "host") pod "c8080932-0083-4fa5-9816-6f8f4c16c917" (UID: "c8080932-0083-4fa5-9816-6f8f4c16c917"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 19 23:56:26 crc kubenswrapper[4795]: I0219 23:56:26.393297 4795 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c8080932-0083-4fa5-9816-6f8f4c16c917-host\") on node \"crc\" DevicePath \"\""
Feb 19 23:56:26 crc kubenswrapper[4795]: I0219 23:56:26.401509 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8080932-0083-4fa5-9816-6f8f4c16c917-kube-api-access-42v45" (OuterVolumeSpecName: "kube-api-access-42v45") pod "c8080932-0083-4fa5-9816-6f8f4c16c917" (UID: "c8080932-0083-4fa5-9816-6f8f4c16c917"). InnerVolumeSpecName "kube-api-access-42v45". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 23:56:26 crc kubenswrapper[4795]: I0219 23:56:26.405890 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sz6pt"]
Feb 19 23:56:26 crc kubenswrapper[4795]: I0219 23:56:26.495092 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42v45\" (UniqueName: \"kubernetes.io/projected/c8080932-0083-4fa5-9816-6f8f4c16c917-kube-api-access-42v45\") on node \"crc\" DevicePath \"\""
Feb 19 23:56:27 crc kubenswrapper[4795]: I0219 23:56:27.150832 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b0224086177e9301d265a5e33b0ab2e6fe6e59ed4f2c42819f86d26d52c4151"
Feb 19 23:56:27 crc kubenswrapper[4795]: I0219 23:56:27.150870 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gv265/crc-debug-b2t7k"
Feb 19 23:56:27 crc kubenswrapper[4795]: I0219 23:56:27.523838 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8080932-0083-4fa5-9816-6f8f4c16c917" path="/var/lib/kubelet/pods/c8080932-0083-4fa5-9816-6f8f4c16c917/volumes"
Feb 19 23:56:27 crc kubenswrapper[4795]: I0219 23:56:27.526914 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-gv265/crc-debug-5lxqn"]
Feb 19 23:56:27 crc kubenswrapper[4795]: E0219 23:56:27.527413 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8080932-0083-4fa5-9816-6f8f4c16c917" containerName="container-00"
Feb 19 23:56:27 crc kubenswrapper[4795]: I0219 23:56:27.527495 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8080932-0083-4fa5-9816-6f8f4c16c917" containerName="container-00"
Feb 19 23:56:27 crc kubenswrapper[4795]: I0219 23:56:27.527787 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8080932-0083-4fa5-9816-6f8f4c16c917" containerName="container-00"
Feb 19 23:56:27 crc kubenswrapper[4795]: I0219 23:56:27.531879 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gv265/crc-debug-5lxqn"
Feb 19 23:56:27 crc kubenswrapper[4795]: I0219 23:56:27.721319 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s26gw\" (UniqueName: \"kubernetes.io/projected/48fd74ed-b96f-4d54-bde4-7812aa6f92be-kube-api-access-s26gw\") pod \"crc-debug-5lxqn\" (UID: \"48fd74ed-b96f-4d54-bde4-7812aa6f92be\") " pod="openshift-must-gather-gv265/crc-debug-5lxqn"
Feb 19 23:56:27 crc kubenswrapper[4795]: I0219 23:56:27.721471 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/48fd74ed-b96f-4d54-bde4-7812aa6f92be-host\") pod \"crc-debug-5lxqn\" (UID: \"48fd74ed-b96f-4d54-bde4-7812aa6f92be\") " pod="openshift-must-gather-gv265/crc-debug-5lxqn"
Feb 19 23:56:27 crc kubenswrapper[4795]: I0219 23:56:27.823954 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s26gw\" (UniqueName: \"kubernetes.io/projected/48fd74ed-b96f-4d54-bde4-7812aa6f92be-kube-api-access-s26gw\") pod \"crc-debug-5lxqn\" (UID: \"48fd74ed-b96f-4d54-bde4-7812aa6f92be\") " pod="openshift-must-gather-gv265/crc-debug-5lxqn"
Feb 19 23:56:27 crc kubenswrapper[4795]: I0219 23:56:27.824132 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/48fd74ed-b96f-4d54-bde4-7812aa6f92be-host\") pod \"crc-debug-5lxqn\" (UID: \"48fd74ed-b96f-4d54-bde4-7812aa6f92be\") " pod="openshift-must-gather-gv265/crc-debug-5lxqn"
Feb 19 23:56:27 crc kubenswrapper[4795]: I0219 23:56:27.824285 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/48fd74ed-b96f-4d54-bde4-7812aa6f92be-host\") pod \"crc-debug-5lxqn\" (UID: \"48fd74ed-b96f-4d54-bde4-7812aa6f92be\") " pod="openshift-must-gather-gv265/crc-debug-5lxqn"
Feb 19 23:56:27 crc kubenswrapper[4795]: I0219 23:56:27.846844 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s26gw\" (UniqueName: \"kubernetes.io/projected/48fd74ed-b96f-4d54-bde4-7812aa6f92be-kube-api-access-s26gw\") pod \"crc-debug-5lxqn\" (UID: \"48fd74ed-b96f-4d54-bde4-7812aa6f92be\") " pod="openshift-must-gather-gv265/crc-debug-5lxqn"
Feb 19 23:56:27 crc kubenswrapper[4795]: I0219 23:56:27.851787 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gv265/crc-debug-5lxqn"
Feb 19 23:56:27 crc kubenswrapper[4795]: W0219 23:56:27.884417 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48fd74ed_b96f_4d54_bde4_7812aa6f92be.slice/crio-bbe1f7b37dea849a53f62bcadc373fd195c3dd7ff3250a51919068d5d7de11e2 WatchSource:0}: Error finding container bbe1f7b37dea849a53f62bcadc373fd195c3dd7ff3250a51919068d5d7de11e2: Status 404 returned error can't find the container with id bbe1f7b37dea849a53f62bcadc373fd195c3dd7ff3250a51919068d5d7de11e2
Feb 19 23:56:28 crc kubenswrapper[4795]: I0219 23:56:28.161413 4795 generic.go:334] "Generic (PLEG): container finished" podID="48fd74ed-b96f-4d54-bde4-7812aa6f92be" containerID="7aaf83518c533d8a7a22f272741bc623bd1d1cbd9f71a60f196c67d16987e279" exitCode=0
Feb 19 23:56:28 crc kubenswrapper[4795]: I0219 23:56:28.161489 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gv265/crc-debug-5lxqn" event={"ID":"48fd74ed-b96f-4d54-bde4-7812aa6f92be","Type":"ContainerDied","Data":"7aaf83518c533d8a7a22f272741bc623bd1d1cbd9f71a60f196c67d16987e279"}
Feb 19 23:56:28 crc kubenswrapper[4795]: I0219 23:56:28.161759 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gv265/crc-debug-5lxqn" event={"ID":"48fd74ed-b96f-4d54-bde4-7812aa6f92be","Type":"ContainerStarted","Data":"bbe1f7b37dea849a53f62bcadc373fd195c3dd7ff3250a51919068d5d7de11e2"}
Feb 19 23:56:28 crc kubenswrapper[4795]: I0219 23:56:28.161917 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-sz6pt" podUID="8f2da7b5-d729-463f-9589-455203a5ad9e" containerName="registry-server" containerID="cri-o://6d065a9d149243251b6b58ebaf53b077b0910d5423b5a4944782ee17e37027c2" gracePeriod=2
Feb 19 23:56:28 crc kubenswrapper[4795]: I0219 23:56:28.226729 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-gv265/crc-debug-5lxqn"]
Feb 19 23:56:28 crc kubenswrapper[4795]: I0219 23:56:28.241479 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-gv265/crc-debug-5lxqn"]
Feb 19 23:56:28 crc kubenswrapper[4795]: I0219 23:56:28.427235 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 23:56:28 crc kubenswrapper[4795]: I0219 23:56:28.427550 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 23:56:28 crc kubenswrapper[4795]: I0219 23:56:28.427597 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d"
Feb 19 23:56:28 crc kubenswrapper[4795]: I0219 23:56:28.428396 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"826efb3421363865a6a92aeeb46a1de0d922f1bfcfa12ff4f740f04d52a6b3e6"} pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 19 23:56:28 crc kubenswrapper[4795]: I0219 23:56:28.428453 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" containerID="cri-o://826efb3421363865a6a92aeeb46a1de0d922f1bfcfa12ff4f740f04d52a6b3e6" gracePeriod=600
Feb 19 23:56:29 crc kubenswrapper[4795]: I0219 23:56:29.172254 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerDied","Data":"826efb3421363865a6a92aeeb46a1de0d922f1bfcfa12ff4f740f04d52a6b3e6"}
Feb 19 23:56:29 crc kubenswrapper[4795]: I0219 23:56:29.172158 4795 generic.go:334] "Generic (PLEG): container finished" podID="7591bc58-96f5-486a-8653-0ad93938b019" containerID="826efb3421363865a6a92aeeb46a1de0d922f1bfcfa12ff4f740f04d52a6b3e6" exitCode=0
Feb 19 23:56:29 crc kubenswrapper[4795]: I0219 23:56:29.172690 4795 scope.go:117] "RemoveContainer" containerID="8095d75d862e2b693e66115bd0de83928e8a1abe68615b91e580952d99b88f5a"
Feb 19 23:56:29 crc kubenswrapper[4795]: I0219 23:56:29.176969 4795 generic.go:334] "Generic (PLEG): container finished" podID="8f2da7b5-d729-463f-9589-455203a5ad9e" containerID="6d065a9d149243251b6b58ebaf53b077b0910d5423b5a4944782ee17e37027c2" exitCode=0
Feb 19 23:56:29 crc kubenswrapper[4795]: I0219 23:56:29.177196 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sz6pt" event={"ID":"8f2da7b5-d729-463f-9589-455203a5ad9e","Type":"ContainerDied","Data":"6d065a9d149243251b6b58ebaf53b077b0910d5423b5a4944782ee17e37027c2"}
Feb 19 23:56:29 crc kubenswrapper[4795]: I0219 23:56:29.177224 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sz6pt" event={"ID":"8f2da7b5-d729-463f-9589-455203a5ad9e","Type":"ContainerDied","Data":"a6d95aca702893e3ea5637611a6e798f2de86b26f77a8aeaf0d2c93be967e572"}
Feb 19 23:56:29 crc kubenswrapper[4795]: I0219 23:56:29.177236 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a6d95aca702893e3ea5637611a6e798f2de86b26f77a8aeaf0d2c93be967e572"
Feb 19 23:56:29 crc kubenswrapper[4795]: I0219 23:56:29.372812 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sz6pt"
Feb 19 23:56:29 crc kubenswrapper[4795]: I0219 23:56:29.387208 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gv265/crc-debug-5lxqn"
Feb 19 23:56:29 crc kubenswrapper[4795]: I0219 23:56:29.559568 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s26gw\" (UniqueName: \"kubernetes.io/projected/48fd74ed-b96f-4d54-bde4-7812aa6f92be-kube-api-access-s26gw\") pod \"48fd74ed-b96f-4d54-bde4-7812aa6f92be\" (UID: \"48fd74ed-b96f-4d54-bde4-7812aa6f92be\") "
Feb 19 23:56:29 crc kubenswrapper[4795]: I0219 23:56:29.560077 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f2da7b5-d729-463f-9589-455203a5ad9e-catalog-content\") pod \"8f2da7b5-d729-463f-9589-455203a5ad9e\" (UID: \"8f2da7b5-d729-463f-9589-455203a5ad9e\") "
Feb 19 23:56:29 crc kubenswrapper[4795]: I0219 23:56:29.560213 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/48fd74ed-b96f-4d54-bde4-7812aa6f92be-host\") pod \"48fd74ed-b96f-4d54-bde4-7812aa6f92be\" (UID: \"48fd74ed-b96f-4d54-bde4-7812aa6f92be\") "
Feb 19 23:56:29 crc kubenswrapper[4795]: I0219 23:56:29.560254 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/48fd74ed-b96f-4d54-bde4-7812aa6f92be-host" (OuterVolumeSpecName: "host") pod "48fd74ed-b96f-4d54-bde4-7812aa6f92be" (UID: "48fd74ed-b96f-4d54-bde4-7812aa6f92be"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 19 23:56:29 crc kubenswrapper[4795]: I0219 23:56:29.560274 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f2da7b5-d729-463f-9589-455203a5ad9e-utilities\") pod \"8f2da7b5-d729-463f-9589-455203a5ad9e\" (UID: \"8f2da7b5-d729-463f-9589-455203a5ad9e\") "
Feb 19 23:56:29 crc kubenswrapper[4795]: I0219 23:56:29.560364 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trfkf\" (UniqueName: \"kubernetes.io/projected/8f2da7b5-d729-463f-9589-455203a5ad9e-kube-api-access-trfkf\") pod \"8f2da7b5-d729-463f-9589-455203a5ad9e\" (UID: \"8f2da7b5-d729-463f-9589-455203a5ad9e\") "
Feb 19 23:56:29 crc kubenswrapper[4795]: I0219 23:56:29.560982 4795 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/48fd74ed-b96f-4d54-bde4-7812aa6f92be-host\") on node \"crc\" DevicePath \"\""
Feb 19 23:56:29 crc kubenswrapper[4795]: I0219 23:56:29.561091 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f2da7b5-d729-463f-9589-455203a5ad9e-utilities" (OuterVolumeSpecName: "utilities") pod "8f2da7b5-d729-463f-9589-455203a5ad9e" (UID: "8f2da7b5-d729-463f-9589-455203a5ad9e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 23:56:29 crc kubenswrapper[4795]: I0219 23:56:29.567804 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48fd74ed-b96f-4d54-bde4-7812aa6f92be-kube-api-access-s26gw" (OuterVolumeSpecName: "kube-api-access-s26gw") pod "48fd74ed-b96f-4d54-bde4-7812aa6f92be" (UID: "48fd74ed-b96f-4d54-bde4-7812aa6f92be"). InnerVolumeSpecName "kube-api-access-s26gw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 23:56:29 crc kubenswrapper[4795]: I0219 23:56:29.588661 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f2da7b5-d729-463f-9589-455203a5ad9e-kube-api-access-trfkf" (OuterVolumeSpecName: "kube-api-access-trfkf") pod "8f2da7b5-d729-463f-9589-455203a5ad9e" (UID: "8f2da7b5-d729-463f-9589-455203a5ad9e"). InnerVolumeSpecName "kube-api-access-trfkf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 23:56:29 crc kubenswrapper[4795]: I0219 23:56:29.662681 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s26gw\" (UniqueName: \"kubernetes.io/projected/48fd74ed-b96f-4d54-bde4-7812aa6f92be-kube-api-access-s26gw\") on node \"crc\" DevicePath \"\""
Feb 19 23:56:29 crc kubenswrapper[4795]: I0219 23:56:29.662713 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f2da7b5-d729-463f-9589-455203a5ad9e-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 23:56:29 crc kubenswrapper[4795]: I0219 23:56:29.662723 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-trfkf\" (UniqueName: \"kubernetes.io/projected/8f2da7b5-d729-463f-9589-455203a5ad9e-kube-api-access-trfkf\") on node \"crc\" DevicePath \"\""
Feb 19 23:56:29 crc kubenswrapper[4795]: I0219 23:56:29.680212 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f2da7b5-d729-463f-9589-455203a5ad9e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8f2da7b5-d729-463f-9589-455203a5ad9e" (UID: "8f2da7b5-d729-463f-9589-455203a5ad9e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 23:56:29 crc kubenswrapper[4795]: I0219 23:56:29.764355 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f2da7b5-d729-463f-9589-455203a5ad9e-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 23:56:30 crc kubenswrapper[4795]: I0219 23:56:30.187394 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gv265/crc-debug-5lxqn"
Feb 19 23:56:30 crc kubenswrapper[4795]: I0219 23:56:30.187400 4795 scope.go:117] "RemoveContainer" containerID="7aaf83518c533d8a7a22f272741bc623bd1d1cbd9f71a60f196c67d16987e279"
Feb 19 23:56:30 crc kubenswrapper[4795]: I0219 23:56:30.191002 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerStarted","Data":"f417d522e5357173b817e8b838d944aa5e17352006ff9b8fed75f316b86d76c7"}
Feb 19 23:56:30 crc kubenswrapper[4795]: I0219 23:56:30.191047 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sz6pt"
Feb 19 23:56:30 crc kubenswrapper[4795]: I0219 23:56:30.279359 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sz6pt"]
Feb 19 23:56:30 crc kubenswrapper[4795]: I0219 23:56:30.293827 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-sz6pt"]
Feb 19 23:56:31 crc kubenswrapper[4795]: I0219 23:56:31.529143 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48fd74ed-b96f-4d54-bde4-7812aa6f92be" path="/var/lib/kubelet/pods/48fd74ed-b96f-4d54-bde4-7812aa6f92be/volumes"
Feb 19 23:56:31 crc kubenswrapper[4795]: I0219 23:56:31.530157 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f2da7b5-d729-463f-9589-455203a5ad9e" path="/var/lib/kubelet/pods/8f2da7b5-d729-463f-9589-455203a5ad9e/volumes"
Feb 19 23:57:48 crc kubenswrapper[4795]: I0219 23:57:48.987680 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-c5b66"]
Feb 19 23:57:48 crc kubenswrapper[4795]: E0219 23:57:48.989211 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f2da7b5-d729-463f-9589-455203a5ad9e" containerName="extract-utilities"
Feb 19 23:57:48 crc kubenswrapper[4795]: I0219 23:57:48.989229 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f2da7b5-d729-463f-9589-455203a5ad9e" containerName="extract-utilities"
Feb 19 23:57:48 crc kubenswrapper[4795]: E0219 23:57:48.989243 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f2da7b5-d729-463f-9589-455203a5ad9e" containerName="extract-content"
Feb 19 23:57:48 crc kubenswrapper[4795]: I0219 23:57:48.989250 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f2da7b5-d729-463f-9589-455203a5ad9e" containerName="extract-content"
Feb 19 23:57:48 crc kubenswrapper[4795]: E0219 23:57:48.989296 4795 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="48fd74ed-b96f-4d54-bde4-7812aa6f92be" containerName="container-00" Feb 19 23:57:48 crc kubenswrapper[4795]: I0219 23:57:48.989303 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="48fd74ed-b96f-4d54-bde4-7812aa6f92be" containerName="container-00" Feb 19 23:57:48 crc kubenswrapper[4795]: E0219 23:57:48.989322 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f2da7b5-d729-463f-9589-455203a5ad9e" containerName="registry-server" Feb 19 23:57:48 crc kubenswrapper[4795]: I0219 23:57:48.989329 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f2da7b5-d729-463f-9589-455203a5ad9e" containerName="registry-server" Feb 19 23:57:48 crc kubenswrapper[4795]: I0219 23:57:48.989571 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="48fd74ed-b96f-4d54-bde4-7812aa6f92be" containerName="container-00" Feb 19 23:57:48 crc kubenswrapper[4795]: I0219 23:57:48.989600 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f2da7b5-d729-463f-9589-455203a5ad9e" containerName="registry-server" Feb 19 23:57:48 crc kubenswrapper[4795]: I0219 23:57:48.991785 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c5b66" Feb 19 23:57:49 crc kubenswrapper[4795]: I0219 23:57:49.019081 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-c5b66"] Feb 19 23:57:49 crc kubenswrapper[4795]: I0219 23:57:49.044742 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2382916-87a1-4ef9-9461-45b6ab2b24a3-utilities\") pod \"redhat-marketplace-c5b66\" (UID: \"a2382916-87a1-4ef9-9461-45b6ab2b24a3\") " pod="openshift-marketplace/redhat-marketplace-c5b66" Feb 19 23:57:49 crc kubenswrapper[4795]: I0219 23:57:49.044803 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-888bn\" (UniqueName: \"kubernetes.io/projected/a2382916-87a1-4ef9-9461-45b6ab2b24a3-kube-api-access-888bn\") pod \"redhat-marketplace-c5b66\" (UID: \"a2382916-87a1-4ef9-9461-45b6ab2b24a3\") " pod="openshift-marketplace/redhat-marketplace-c5b66" Feb 19 23:57:49 crc kubenswrapper[4795]: I0219 23:57:49.044988 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2382916-87a1-4ef9-9461-45b6ab2b24a3-catalog-content\") pod \"redhat-marketplace-c5b66\" (UID: \"a2382916-87a1-4ef9-9461-45b6ab2b24a3\") " pod="openshift-marketplace/redhat-marketplace-c5b66" Feb 19 23:57:49 crc kubenswrapper[4795]: I0219 23:57:49.146411 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2382916-87a1-4ef9-9461-45b6ab2b24a3-catalog-content\") pod \"redhat-marketplace-c5b66\" (UID: \"a2382916-87a1-4ef9-9461-45b6ab2b24a3\") " pod="openshift-marketplace/redhat-marketplace-c5b66" Feb 19 23:57:49 crc kubenswrapper[4795]: I0219 23:57:49.146531 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2382916-87a1-4ef9-9461-45b6ab2b24a3-utilities\") pod \"redhat-marketplace-c5b66\" (UID: \"a2382916-87a1-4ef9-9461-45b6ab2b24a3\") " pod="openshift-marketplace/redhat-marketplace-c5b66" Feb 19 23:57:49 crc kubenswrapper[4795]: I0219 23:57:49.146589 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-888bn\" (UniqueName: \"kubernetes.io/projected/a2382916-87a1-4ef9-9461-45b6ab2b24a3-kube-api-access-888bn\") pod \"redhat-marketplace-c5b66\" (UID: \"a2382916-87a1-4ef9-9461-45b6ab2b24a3\") " pod="openshift-marketplace/redhat-marketplace-c5b66" Feb 19 23:57:49 crc kubenswrapper[4795]: I0219 23:57:49.146886 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2382916-87a1-4ef9-9461-45b6ab2b24a3-catalog-content\") pod \"redhat-marketplace-c5b66\" (UID: \"a2382916-87a1-4ef9-9461-45b6ab2b24a3\") " pod="openshift-marketplace/redhat-marketplace-c5b66" Feb 19 23:57:49 crc kubenswrapper[4795]: I0219 23:57:49.147042 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2382916-87a1-4ef9-9461-45b6ab2b24a3-utilities\") pod \"redhat-marketplace-c5b66\" (UID: \"a2382916-87a1-4ef9-9461-45b6ab2b24a3\") " pod="openshift-marketplace/redhat-marketplace-c5b66" Feb 19 23:57:49 crc kubenswrapper[4795]: I0219 23:57:49.166622 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-888bn\" (UniqueName: \"kubernetes.io/projected/a2382916-87a1-4ef9-9461-45b6ab2b24a3-kube-api-access-888bn\") pod \"redhat-marketplace-c5b66\" (UID: \"a2382916-87a1-4ef9-9461-45b6ab2b24a3\") " pod="openshift-marketplace/redhat-marketplace-c5b66" Feb 19 23:57:49 crc kubenswrapper[4795]: I0219 23:57:49.320848 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c5b66" Feb 19 23:57:49 crc kubenswrapper[4795]: I0219 23:57:49.794733 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-c5b66"] Feb 19 23:57:50 crc kubenswrapper[4795]: I0219 23:57:50.036501 4795 generic.go:334] "Generic (PLEG): container finished" podID="a2382916-87a1-4ef9-9461-45b6ab2b24a3" containerID="f5007b1b1946e5b94234d84aa0f061d3ecb4a8cec36f1001b7a98c4dbc215ab9" exitCode=0 Feb 19 23:57:50 crc kubenswrapper[4795]: I0219 23:57:50.036542 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c5b66" event={"ID":"a2382916-87a1-4ef9-9461-45b6ab2b24a3","Type":"ContainerDied","Data":"f5007b1b1946e5b94234d84aa0f061d3ecb4a8cec36f1001b7a98c4dbc215ab9"} Feb 19 23:57:50 crc kubenswrapper[4795]: I0219 23:57:50.036574 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c5b66" event={"ID":"a2382916-87a1-4ef9-9461-45b6ab2b24a3","Type":"ContainerStarted","Data":"a0f70866ad3d1dff70355360c8352691af4dac5cdb64e7815192fee5af7a7b5f"} Feb 19 23:57:51 crc kubenswrapper[4795]: I0219 23:57:51.049926 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c5b66" event={"ID":"a2382916-87a1-4ef9-9461-45b6ab2b24a3","Type":"ContainerStarted","Data":"c559a71b5fbc7d3a6fc6440c6f68da288b127e1ac01f0253718d300fc1931f7d"} Feb 19 23:57:52 crc kubenswrapper[4795]: I0219 23:57:52.071908 4795 generic.go:334] "Generic (PLEG): container finished" podID="a2382916-87a1-4ef9-9461-45b6ab2b24a3" containerID="c559a71b5fbc7d3a6fc6440c6f68da288b127e1ac01f0253718d300fc1931f7d" exitCode=0 Feb 19 23:57:52 crc kubenswrapper[4795]: I0219 23:57:52.071966 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c5b66" 
event={"ID":"a2382916-87a1-4ef9-9461-45b6ab2b24a3","Type":"ContainerDied","Data":"c559a71b5fbc7d3a6fc6440c6f68da288b127e1ac01f0253718d300fc1931f7d"} Feb 19 23:57:53 crc kubenswrapper[4795]: I0219 23:57:53.083698 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c5b66" event={"ID":"a2382916-87a1-4ef9-9461-45b6ab2b24a3","Type":"ContainerStarted","Data":"5bfe7779c9984cbb85e962016c59906ad2aaeafe4076d4ea66c30a52dc18ac0f"} Feb 19 23:57:53 crc kubenswrapper[4795]: I0219 23:57:53.106458 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-c5b66" podStartSLOduration=2.668373126 podStartE2EDuration="5.106442149s" podCreationTimestamp="2026-02-19 23:57:48 +0000 UTC" firstStartedPulling="2026-02-19 23:57:50.038714508 +0000 UTC m=+8981.231232372" lastFinishedPulling="2026-02-19 23:57:52.476783521 +0000 UTC m=+8983.669301395" observedRunningTime="2026-02-19 23:57:53.101364496 +0000 UTC m=+8984.293882350" watchObservedRunningTime="2026-02-19 23:57:53.106442149 +0000 UTC m=+8984.298960013" Feb 19 23:57:59 crc kubenswrapper[4795]: I0219 23:57:59.321597 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-c5b66" Feb 19 23:57:59 crc kubenswrapper[4795]: I0219 23:57:59.322231 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-c5b66" Feb 19 23:57:59 crc kubenswrapper[4795]: I0219 23:57:59.378283 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-c5b66" Feb 19 23:58:00 crc kubenswrapper[4795]: I0219 23:58:00.200948 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-c5b66" Feb 19 23:58:00 crc kubenswrapper[4795]: I0219 23:58:00.259414 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-c5b66"] Feb 19 23:58:02 crc kubenswrapper[4795]: I0219 23:58:02.173134 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-c5b66" podUID="a2382916-87a1-4ef9-9461-45b6ab2b24a3" containerName="registry-server" containerID="cri-o://5bfe7779c9984cbb85e962016c59906ad2aaeafe4076d4ea66c30a52dc18ac0f" gracePeriod=2 Feb 19 23:58:02 crc kubenswrapper[4795]: I0219 23:58:02.656403 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c5b66" Feb 19 23:58:02 crc kubenswrapper[4795]: I0219 23:58:02.742572 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2382916-87a1-4ef9-9461-45b6ab2b24a3-utilities\") pod \"a2382916-87a1-4ef9-9461-45b6ab2b24a3\" (UID: \"a2382916-87a1-4ef9-9461-45b6ab2b24a3\") " Feb 19 23:58:02 crc kubenswrapper[4795]: I0219 23:58:02.742852 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-888bn\" (UniqueName: \"kubernetes.io/projected/a2382916-87a1-4ef9-9461-45b6ab2b24a3-kube-api-access-888bn\") pod \"a2382916-87a1-4ef9-9461-45b6ab2b24a3\" (UID: \"a2382916-87a1-4ef9-9461-45b6ab2b24a3\") " Feb 19 23:58:02 crc kubenswrapper[4795]: I0219 23:58:02.742955 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2382916-87a1-4ef9-9461-45b6ab2b24a3-catalog-content\") pod \"a2382916-87a1-4ef9-9461-45b6ab2b24a3\" (UID: \"a2382916-87a1-4ef9-9461-45b6ab2b24a3\") " Feb 19 23:58:02 crc kubenswrapper[4795]: I0219 23:58:02.743467 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2382916-87a1-4ef9-9461-45b6ab2b24a3-utilities" (OuterVolumeSpecName: "utilities") pod "a2382916-87a1-4ef9-9461-45b6ab2b24a3" (UID: 
"a2382916-87a1-4ef9-9461-45b6ab2b24a3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:58:02 crc kubenswrapper[4795]: I0219 23:58:02.743578 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2382916-87a1-4ef9-9461-45b6ab2b24a3-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 23:58:02 crc kubenswrapper[4795]: I0219 23:58:02.751347 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2382916-87a1-4ef9-9461-45b6ab2b24a3-kube-api-access-888bn" (OuterVolumeSpecName: "kube-api-access-888bn") pod "a2382916-87a1-4ef9-9461-45b6ab2b24a3" (UID: "a2382916-87a1-4ef9-9461-45b6ab2b24a3"). InnerVolumeSpecName "kube-api-access-888bn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 23:58:02 crc kubenswrapper[4795]: I0219 23:58:02.780716 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2382916-87a1-4ef9-9461-45b6ab2b24a3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a2382916-87a1-4ef9-9461-45b6ab2b24a3" (UID: "a2382916-87a1-4ef9-9461-45b6ab2b24a3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 23:58:02 crc kubenswrapper[4795]: I0219 23:58:02.846302 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-888bn\" (UniqueName: \"kubernetes.io/projected/a2382916-87a1-4ef9-9461-45b6ab2b24a3-kube-api-access-888bn\") on node \"crc\" DevicePath \"\"" Feb 19 23:58:02 crc kubenswrapper[4795]: I0219 23:58:02.846343 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2382916-87a1-4ef9-9461-45b6ab2b24a3-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 23:58:03 crc kubenswrapper[4795]: I0219 23:58:03.184781 4795 generic.go:334] "Generic (PLEG): container finished" podID="a2382916-87a1-4ef9-9461-45b6ab2b24a3" containerID="5bfe7779c9984cbb85e962016c59906ad2aaeafe4076d4ea66c30a52dc18ac0f" exitCode=0 Feb 19 23:58:03 crc kubenswrapper[4795]: I0219 23:58:03.184839 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c5b66" Feb 19 23:58:03 crc kubenswrapper[4795]: I0219 23:58:03.184857 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c5b66" event={"ID":"a2382916-87a1-4ef9-9461-45b6ab2b24a3","Type":"ContainerDied","Data":"5bfe7779c9984cbb85e962016c59906ad2aaeafe4076d4ea66c30a52dc18ac0f"} Feb 19 23:58:03 crc kubenswrapper[4795]: I0219 23:58:03.187628 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c5b66" event={"ID":"a2382916-87a1-4ef9-9461-45b6ab2b24a3","Type":"ContainerDied","Data":"a0f70866ad3d1dff70355360c8352691af4dac5cdb64e7815192fee5af7a7b5f"} Feb 19 23:58:03 crc kubenswrapper[4795]: I0219 23:58:03.187676 4795 scope.go:117] "RemoveContainer" containerID="5bfe7779c9984cbb85e962016c59906ad2aaeafe4076d4ea66c30a52dc18ac0f" Feb 19 23:58:03 crc kubenswrapper[4795]: I0219 23:58:03.223842 4795 scope.go:117] "RemoveContainer" 
containerID="c559a71b5fbc7d3a6fc6440c6f68da288b127e1ac01f0253718d300fc1931f7d" Feb 19 23:58:03 crc kubenswrapper[4795]: I0219 23:58:03.234809 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-c5b66"] Feb 19 23:58:03 crc kubenswrapper[4795]: I0219 23:58:03.247927 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-c5b66"] Feb 19 23:58:03 crc kubenswrapper[4795]: I0219 23:58:03.252495 4795 scope.go:117] "RemoveContainer" containerID="f5007b1b1946e5b94234d84aa0f061d3ecb4a8cec36f1001b7a98c4dbc215ab9" Feb 19 23:58:03 crc kubenswrapper[4795]: I0219 23:58:03.293618 4795 scope.go:117] "RemoveContainer" containerID="5bfe7779c9984cbb85e962016c59906ad2aaeafe4076d4ea66c30a52dc18ac0f" Feb 19 23:58:03 crc kubenswrapper[4795]: E0219 23:58:03.294197 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5bfe7779c9984cbb85e962016c59906ad2aaeafe4076d4ea66c30a52dc18ac0f\": container with ID starting with 5bfe7779c9984cbb85e962016c59906ad2aaeafe4076d4ea66c30a52dc18ac0f not found: ID does not exist" containerID="5bfe7779c9984cbb85e962016c59906ad2aaeafe4076d4ea66c30a52dc18ac0f" Feb 19 23:58:03 crc kubenswrapper[4795]: I0219 23:58:03.294258 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bfe7779c9984cbb85e962016c59906ad2aaeafe4076d4ea66c30a52dc18ac0f"} err="failed to get container status \"5bfe7779c9984cbb85e962016c59906ad2aaeafe4076d4ea66c30a52dc18ac0f\": rpc error: code = NotFound desc = could not find container \"5bfe7779c9984cbb85e962016c59906ad2aaeafe4076d4ea66c30a52dc18ac0f\": container with ID starting with 5bfe7779c9984cbb85e962016c59906ad2aaeafe4076d4ea66c30a52dc18ac0f not found: ID does not exist" Feb 19 23:58:03 crc kubenswrapper[4795]: I0219 23:58:03.294294 4795 scope.go:117] "RemoveContainer" 
containerID="c559a71b5fbc7d3a6fc6440c6f68da288b127e1ac01f0253718d300fc1931f7d" Feb 19 23:58:03 crc kubenswrapper[4795]: E0219 23:58:03.297368 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c559a71b5fbc7d3a6fc6440c6f68da288b127e1ac01f0253718d300fc1931f7d\": container with ID starting with c559a71b5fbc7d3a6fc6440c6f68da288b127e1ac01f0253718d300fc1931f7d not found: ID does not exist" containerID="c559a71b5fbc7d3a6fc6440c6f68da288b127e1ac01f0253718d300fc1931f7d" Feb 19 23:58:03 crc kubenswrapper[4795]: I0219 23:58:03.297406 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c559a71b5fbc7d3a6fc6440c6f68da288b127e1ac01f0253718d300fc1931f7d"} err="failed to get container status \"c559a71b5fbc7d3a6fc6440c6f68da288b127e1ac01f0253718d300fc1931f7d\": rpc error: code = NotFound desc = could not find container \"c559a71b5fbc7d3a6fc6440c6f68da288b127e1ac01f0253718d300fc1931f7d\": container with ID starting with c559a71b5fbc7d3a6fc6440c6f68da288b127e1ac01f0253718d300fc1931f7d not found: ID does not exist" Feb 19 23:58:03 crc kubenswrapper[4795]: I0219 23:58:03.297429 4795 scope.go:117] "RemoveContainer" containerID="f5007b1b1946e5b94234d84aa0f061d3ecb4a8cec36f1001b7a98c4dbc215ab9" Feb 19 23:58:03 crc kubenswrapper[4795]: E0219 23:58:03.297874 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5007b1b1946e5b94234d84aa0f061d3ecb4a8cec36f1001b7a98c4dbc215ab9\": container with ID starting with f5007b1b1946e5b94234d84aa0f061d3ecb4a8cec36f1001b7a98c4dbc215ab9 not found: ID does not exist" containerID="f5007b1b1946e5b94234d84aa0f061d3ecb4a8cec36f1001b7a98c4dbc215ab9" Feb 19 23:58:03 crc kubenswrapper[4795]: I0219 23:58:03.297930 4795 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f5007b1b1946e5b94234d84aa0f061d3ecb4a8cec36f1001b7a98c4dbc215ab9"} err="failed to get container status \"f5007b1b1946e5b94234d84aa0f061d3ecb4a8cec36f1001b7a98c4dbc215ab9\": rpc error: code = NotFound desc = could not find container \"f5007b1b1946e5b94234d84aa0f061d3ecb4a8cec36f1001b7a98c4dbc215ab9\": container with ID starting with f5007b1b1946e5b94234d84aa0f061d3ecb4a8cec36f1001b7a98c4dbc215ab9 not found: ID does not exist" Feb 19 23:58:03 crc kubenswrapper[4795]: I0219 23:58:03.522937 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2382916-87a1-4ef9-9461-45b6ab2b24a3" path="/var/lib/kubelet/pods/a2382916-87a1-4ef9-9461-45b6ab2b24a3/volumes" Feb 19 23:58:58 crc kubenswrapper[4795]: I0219 23:58:58.427120 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 23:58:58 crc kubenswrapper[4795]: I0219 23:58:58.427754 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 23:59:28 crc kubenswrapper[4795]: I0219 23:59:28.427638 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 23:59:28 crc kubenswrapper[4795]: I0219 23:59:28.428252 4795 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 23:59:58 crc kubenswrapper[4795]: I0219 23:59:58.427428 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 23:59:58 crc kubenswrapper[4795]: I0219 23:59:58.428155 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 23:59:58 crc kubenswrapper[4795]: I0219 23:59:58.428225 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" Feb 19 23:59:58 crc kubenswrapper[4795]: I0219 23:59:58.429440 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f417d522e5357173b817e8b838d944aa5e17352006ff9b8fed75f316b86d76c7"} pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 23:59:58 crc kubenswrapper[4795]: I0219 23:59:58.429532 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" 
containerID="cri-o://f417d522e5357173b817e8b838d944aa5e17352006ff9b8fed75f316b86d76c7" gracePeriod=600 Feb 19 23:59:58 crc kubenswrapper[4795]: E0219 23:59:58.861620 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 19 23:59:59 crc kubenswrapper[4795]: I0219 23:59:59.716516 4795 generic.go:334] "Generic (PLEG): container finished" podID="7591bc58-96f5-486a-8653-0ad93938b019" containerID="f417d522e5357173b817e8b838d944aa5e17352006ff9b8fed75f316b86d76c7" exitCode=0 Feb 19 23:59:59 crc kubenswrapper[4795]: I0219 23:59:59.716585 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerDied","Data":"f417d522e5357173b817e8b838d944aa5e17352006ff9b8fed75f316b86d76c7"} Feb 19 23:59:59 crc kubenswrapper[4795]: I0219 23:59:59.716892 4795 scope.go:117] "RemoveContainer" containerID="826efb3421363865a6a92aeeb46a1de0d922f1bfcfa12ff4f740f04d52a6b3e6" Feb 19 23:59:59 crc kubenswrapper[4795]: I0219 23:59:59.718345 4795 scope.go:117] "RemoveContainer" containerID="f417d522e5357173b817e8b838d944aa5e17352006ff9b8fed75f316b86d76c7" Feb 19 23:59:59 crc kubenswrapper[4795]: E0219 23:59:59.719190 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" 
podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.153360 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-purge-29525760-9s2b5"] Feb 20 00:00:00 crc kubenswrapper[4795]: E0220 00:00:00.154396 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2382916-87a1-4ef9-9461-45b6ab2b24a3" containerName="extract-utilities" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.154419 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2382916-87a1-4ef9-9461-45b6ab2b24a3" containerName="extract-utilities" Feb 20 00:00:00 crc kubenswrapper[4795]: E0220 00:00:00.154446 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2382916-87a1-4ef9-9461-45b6ab2b24a3" containerName="extract-content" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.154453 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2382916-87a1-4ef9-9461-45b6ab2b24a3" containerName="extract-content" Feb 20 00:00:00 crc kubenswrapper[4795]: E0220 00:00:00.154477 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2382916-87a1-4ef9-9461-45b6ab2b24a3" containerName="registry-server" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.154485 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2382916-87a1-4ef9-9461-45b6ab2b24a3" containerName="registry-server" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.154678 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2382916-87a1-4ef9-9461-45b6ab2b24a3" containerName="registry-server" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.155565 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-purge-29525760-9s2b5" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.157722 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.168234 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-purge-29525760-n9sd6"] Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.170082 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-purge-29525760-n9sd6" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.172601 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.181849 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-purge-29525760-n9sd6"] Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.194573 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-purge-29525760-9s2b5"] Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.263480 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-pruner-29525760-mtv5d"] Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.265010 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29525760-mtv5d" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.274567 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"pruner-dockercfg-p7bcw" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.274826 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"serviceca" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.296791 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29525760-mtv5d"] Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.319422 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57de3f43-e33f-4734-b02d-372d013b7e80-scripts\") pod \"nova-cell0-db-purge-29525760-n9sd6\" (UID: \"57de3f43-e33f-4734-b02d-372d013b7e80\") " pod="openstack/nova-cell0-db-purge-29525760-n9sd6" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.319549 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2fbbf12e-019a-40d4-9a07-46b3e5b4c814-scripts\") pod \"nova-cell1-db-purge-29525760-9s2b5\" (UID: \"2fbbf12e-019a-40d4-9a07-46b3e5b4c814\") " pod="openstack/nova-cell1-db-purge-29525760-9s2b5" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.319700 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmg5x\" (UniqueName: \"kubernetes.io/projected/2fbbf12e-019a-40d4-9a07-46b3e5b4c814-kube-api-access-qmg5x\") pod \"nova-cell1-db-purge-29525760-9s2b5\" (UID: \"2fbbf12e-019a-40d4-9a07-46b3e5b4c814\") " pod="openstack/nova-cell1-db-purge-29525760-9s2b5" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.319927 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57de3f43-e33f-4734-b02d-372d013b7e80-combined-ca-bundle\") pod \"nova-cell0-db-purge-29525760-n9sd6\" (UID: \"57de3f43-e33f-4734-b02d-372d013b7e80\") " pod="openstack/nova-cell0-db-purge-29525760-n9sd6" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.320029 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwmfj\" (UniqueName: \"kubernetes.io/projected/57de3f43-e33f-4734-b02d-372d013b7e80-kube-api-access-qwmfj\") pod \"nova-cell0-db-purge-29525760-n9sd6\" (UID: \"57de3f43-e33f-4734-b02d-372d013b7e80\") " pod="openstack/nova-cell0-db-purge-29525760-n9sd6" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.320129 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57de3f43-e33f-4734-b02d-372d013b7e80-config-data\") pod \"nova-cell0-db-purge-29525760-n9sd6\" (UID: \"57de3f43-e33f-4734-b02d-372d013b7e80\") " pod="openstack/nova-cell0-db-purge-29525760-n9sd6" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.320228 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fbbf12e-019a-40d4-9a07-46b3e5b4c814-config-data\") pod \"nova-cell1-db-purge-29525760-9s2b5\" (UID: \"2fbbf12e-019a-40d4-9a07-46b3e5b4c814\") " pod="openstack/nova-cell1-db-purge-29525760-9s2b5" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.320262 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fbbf12e-019a-40d4-9a07-46b3e5b4c814-combined-ca-bundle\") pod \"nova-cell1-db-purge-29525760-9s2b5\" (UID: \"2fbbf12e-019a-40d4-9a07-46b3e5b4c814\") " pod="openstack/nova-cell1-db-purge-29525760-9s2b5" Feb 20 00:00:00 crc 
kubenswrapper[4795]: I0220 00:00:00.352897 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525760-rlfxx"] Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.354532 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525760-rlfxx" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.364319 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.365415 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.388157 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525760-rlfxx"] Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.422109 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2fbbf12e-019a-40d4-9a07-46b3e5b4c814-scripts\") pod \"nova-cell1-db-purge-29525760-9s2b5\" (UID: \"2fbbf12e-019a-40d4-9a07-46b3e5b4c814\") " pod="openstack/nova-cell1-db-purge-29525760-9s2b5" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.422223 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmg5x\" (UniqueName: \"kubernetes.io/projected/2fbbf12e-019a-40d4-9a07-46b3e5b4c814-kube-api-access-qmg5x\") pod \"nova-cell1-db-purge-29525760-9s2b5\" (UID: \"2fbbf12e-019a-40d4-9a07-46b3e5b4c814\") " pod="openstack/nova-cell1-db-purge-29525760-9s2b5" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.422303 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/57de3f43-e33f-4734-b02d-372d013b7e80-combined-ca-bundle\") pod \"nova-cell0-db-purge-29525760-n9sd6\" (UID: \"57de3f43-e33f-4734-b02d-372d013b7e80\") " pod="openstack/nova-cell0-db-purge-29525760-n9sd6" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.422339 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwmfj\" (UniqueName: \"kubernetes.io/projected/57de3f43-e33f-4734-b02d-372d013b7e80-kube-api-access-qwmfj\") pod \"nova-cell0-db-purge-29525760-n9sd6\" (UID: \"57de3f43-e33f-4734-b02d-372d013b7e80\") " pod="openstack/nova-cell0-db-purge-29525760-n9sd6" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.422378 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-858m8\" (UniqueName: \"kubernetes.io/projected/29390c83-c5f7-4c7a-8f48-9a02661a1108-kube-api-access-858m8\") pod \"image-pruner-29525760-mtv5d\" (UID: \"29390c83-c5f7-4c7a-8f48-9a02661a1108\") " pod="openshift-image-registry/image-pruner-29525760-mtv5d" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.422407 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57de3f43-e33f-4734-b02d-372d013b7e80-config-data\") pod \"nova-cell0-db-purge-29525760-n9sd6\" (UID: \"57de3f43-e33f-4734-b02d-372d013b7e80\") " pod="openstack/nova-cell0-db-purge-29525760-n9sd6" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.422427 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fbbf12e-019a-40d4-9a07-46b3e5b4c814-config-data\") pod \"nova-cell1-db-purge-29525760-9s2b5\" (UID: \"2fbbf12e-019a-40d4-9a07-46b3e5b4c814\") " pod="openstack/nova-cell1-db-purge-29525760-9s2b5" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.422454 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fbbf12e-019a-40d4-9a07-46b3e5b4c814-combined-ca-bundle\") pod \"nova-cell1-db-purge-29525760-9s2b5\" (UID: \"2fbbf12e-019a-40d4-9a07-46b3e5b4c814\") " pod="openstack/nova-cell1-db-purge-29525760-9s2b5" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.422528 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57de3f43-e33f-4734-b02d-372d013b7e80-scripts\") pod \"nova-cell0-db-purge-29525760-n9sd6\" (UID: \"57de3f43-e33f-4734-b02d-372d013b7e80\") " pod="openstack/nova-cell0-db-purge-29525760-n9sd6" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.422553 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/29390c83-c5f7-4c7a-8f48-9a02661a1108-serviceca\") pod \"image-pruner-29525760-mtv5d\" (UID: \"29390c83-c5f7-4c7a-8f48-9a02661a1108\") " pod="openshift-image-registry/image-pruner-29525760-mtv5d" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.428893 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fbbf12e-019a-40d4-9a07-46b3e5b4c814-config-data\") pod \"nova-cell1-db-purge-29525760-9s2b5\" (UID: \"2fbbf12e-019a-40d4-9a07-46b3e5b4c814\") " pod="openstack/nova-cell1-db-purge-29525760-9s2b5" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.430236 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57de3f43-e33f-4734-b02d-372d013b7e80-combined-ca-bundle\") pod \"nova-cell0-db-purge-29525760-n9sd6\" (UID: \"57de3f43-e33f-4734-b02d-372d013b7e80\") " pod="openstack/nova-cell0-db-purge-29525760-n9sd6" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.430338 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/2fbbf12e-019a-40d4-9a07-46b3e5b4c814-scripts\") pod \"nova-cell1-db-purge-29525760-9s2b5\" (UID: \"2fbbf12e-019a-40d4-9a07-46b3e5b4c814\") " pod="openstack/nova-cell1-db-purge-29525760-9s2b5" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.431982 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fbbf12e-019a-40d4-9a07-46b3e5b4c814-combined-ca-bundle\") pod \"nova-cell1-db-purge-29525760-9s2b5\" (UID: \"2fbbf12e-019a-40d4-9a07-46b3e5b4c814\") " pod="openstack/nova-cell1-db-purge-29525760-9s2b5" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.432296 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57de3f43-e33f-4734-b02d-372d013b7e80-config-data\") pod \"nova-cell0-db-purge-29525760-n9sd6\" (UID: \"57de3f43-e33f-4734-b02d-372d013b7e80\") " pod="openstack/nova-cell0-db-purge-29525760-n9sd6" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.438590 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57de3f43-e33f-4734-b02d-372d013b7e80-scripts\") pod \"nova-cell0-db-purge-29525760-n9sd6\" (UID: \"57de3f43-e33f-4734-b02d-372d013b7e80\") " pod="openstack/nova-cell0-db-purge-29525760-n9sd6" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.441550 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmg5x\" (UniqueName: \"kubernetes.io/projected/2fbbf12e-019a-40d4-9a07-46b3e5b4c814-kube-api-access-qmg5x\") pod \"nova-cell1-db-purge-29525760-9s2b5\" (UID: \"2fbbf12e-019a-40d4-9a07-46b3e5b4c814\") " pod="openstack/nova-cell1-db-purge-29525760-9s2b5" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.443686 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwmfj\" (UniqueName: 
\"kubernetes.io/projected/57de3f43-e33f-4734-b02d-372d013b7e80-kube-api-access-qwmfj\") pod \"nova-cell0-db-purge-29525760-n9sd6\" (UID: \"57de3f43-e33f-4734-b02d-372d013b7e80\") " pod="openstack/nova-cell0-db-purge-29525760-n9sd6" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.524638 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/29390c83-c5f7-4c7a-8f48-9a02661a1108-serviceca\") pod \"image-pruner-29525760-mtv5d\" (UID: \"29390c83-c5f7-4c7a-8f48-9a02661a1108\") " pod="openshift-image-registry/image-pruner-29525760-mtv5d" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.525056 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7809862b-911c-4763-af00-a74f3fbf2500-config-volume\") pod \"collect-profiles-29525760-rlfxx\" (UID: \"7809862b-911c-4763-af00-a74f3fbf2500\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525760-rlfxx" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.525195 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7809862b-911c-4763-af00-a74f3fbf2500-secret-volume\") pod \"collect-profiles-29525760-rlfxx\" (UID: \"7809862b-911c-4763-af00-a74f3fbf2500\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525760-rlfxx" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.525532 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/29390c83-c5f7-4c7a-8f48-9a02661a1108-serviceca\") pod \"image-pruner-29525760-mtv5d\" (UID: \"29390c83-c5f7-4c7a-8f48-9a02661a1108\") " pod="openshift-image-registry/image-pruner-29525760-mtv5d" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.526841 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-858m8\" (UniqueName: \"kubernetes.io/projected/29390c83-c5f7-4c7a-8f48-9a02661a1108-kube-api-access-858m8\") pod \"image-pruner-29525760-mtv5d\" (UID: \"29390c83-c5f7-4c7a-8f48-9a02661a1108\") " pod="openshift-image-registry/image-pruner-29525760-mtv5d" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.527428 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfxmk\" (UniqueName: \"kubernetes.io/projected/7809862b-911c-4763-af00-a74f3fbf2500-kube-api-access-sfxmk\") pod \"collect-profiles-29525760-rlfxx\" (UID: \"7809862b-911c-4763-af00-a74f3fbf2500\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525760-rlfxx" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.544465 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-858m8\" (UniqueName: \"kubernetes.io/projected/29390c83-c5f7-4c7a-8f48-9a02661a1108-kube-api-access-858m8\") pod \"image-pruner-29525760-mtv5d\" (UID: \"29390c83-c5f7-4c7a-8f48-9a02661a1108\") " pod="openshift-image-registry/image-pruner-29525760-mtv5d" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.568694 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-purge-29525760-9s2b5" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.580246 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-purge-29525760-n9sd6" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.597061 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29525760-mtv5d" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.631812 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7809862b-911c-4763-af00-a74f3fbf2500-secret-volume\") pod \"collect-profiles-29525760-rlfxx\" (UID: \"7809862b-911c-4763-af00-a74f3fbf2500\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525760-rlfxx" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.632865 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfxmk\" (UniqueName: \"kubernetes.io/projected/7809862b-911c-4763-af00-a74f3fbf2500-kube-api-access-sfxmk\") pod \"collect-profiles-29525760-rlfxx\" (UID: \"7809862b-911c-4763-af00-a74f3fbf2500\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525760-rlfxx" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.633204 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7809862b-911c-4763-af00-a74f3fbf2500-config-volume\") pod \"collect-profiles-29525760-rlfxx\" (UID: \"7809862b-911c-4763-af00-a74f3fbf2500\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525760-rlfxx" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.649762 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7809862b-911c-4763-af00-a74f3fbf2500-config-volume\") pod \"collect-profiles-29525760-rlfxx\" (UID: \"7809862b-911c-4763-af00-a74f3fbf2500\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525760-rlfxx" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.650015 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/7809862b-911c-4763-af00-a74f3fbf2500-secret-volume\") pod \"collect-profiles-29525760-rlfxx\" (UID: \"7809862b-911c-4763-af00-a74f3fbf2500\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525760-rlfxx" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.653585 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfxmk\" (UniqueName: \"kubernetes.io/projected/7809862b-911c-4763-af00-a74f3fbf2500-kube-api-access-sfxmk\") pod \"collect-profiles-29525760-rlfxx\" (UID: \"7809862b-911c-4763-af00-a74f3fbf2500\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525760-rlfxx" Feb 20 00:00:00 crc kubenswrapper[4795]: I0220 00:00:00.703512 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525760-rlfxx" Feb 20 00:00:01 crc kubenswrapper[4795]: W0220 00:00:01.097389 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2fbbf12e_019a_40d4_9a07_46b3e5b4c814.slice/crio-a0cc8c7bdae191da08aaf6f8bc8f70504815a88c4a974a522c18d8e74303ae0a WatchSource:0}: Error finding container a0cc8c7bdae191da08aaf6f8bc8f70504815a88c4a974a522c18d8e74303ae0a: Status 404 returned error can't find the container with id a0cc8c7bdae191da08aaf6f8bc8f70504815a88c4a974a522c18d8e74303ae0a Feb 20 00:00:01 crc kubenswrapper[4795]: I0220 00:00:01.098764 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-purge-29525760-9s2b5"] Feb 20 00:00:01 crc kubenswrapper[4795]: W0220 00:00:01.102271 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod57de3f43_e33f_4734_b02d_372d013b7e80.slice/crio-0aee228d062bbddc74795824664752ec2183d082ae8088e128256da56508fb59 WatchSource:0}: Error finding container 
0aee228d062bbddc74795824664752ec2183d082ae8088e128256da56508fb59: Status 404 returned error can't find the container with id 0aee228d062bbddc74795824664752ec2183d082ae8088e128256da56508fb59 Feb 20 00:00:01 crc kubenswrapper[4795]: I0220 00:00:01.108521 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-purge-29525760-n9sd6"] Feb 20 00:00:01 crc kubenswrapper[4795]: I0220 00:00:01.261887 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29525760-mtv5d"] Feb 20 00:00:01 crc kubenswrapper[4795]: W0220 00:00:01.267700 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29390c83_c5f7_4c7a_8f48_9a02661a1108.slice/crio-671c229fbeb3efe9e21c8feda58904a1ee23355307111ba6fa2f34dcfa0ca6c1 WatchSource:0}: Error finding container 671c229fbeb3efe9e21c8feda58904a1ee23355307111ba6fa2f34dcfa0ca6c1: Status 404 returned error can't find the container with id 671c229fbeb3efe9e21c8feda58904a1ee23355307111ba6fa2f34dcfa0ca6c1 Feb 20 00:00:01 crc kubenswrapper[4795]: I0220 00:00:01.389503 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525760-rlfxx"] Feb 20 00:00:01 crc kubenswrapper[4795]: W0220 00:00:01.401927 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7809862b_911c_4763_af00_a74f3fbf2500.slice/crio-9805d9298092525a75c78bd99ca910a4bf91a90017b08307551acb4710ef0f68 WatchSource:0}: Error finding container 9805d9298092525a75c78bd99ca910a4bf91a90017b08307551acb4710ef0f68: Status 404 returned error can't find the container with id 9805d9298092525a75c78bd99ca910a4bf91a90017b08307551acb4710ef0f68 Feb 20 00:00:01 crc kubenswrapper[4795]: I0220 00:00:01.758864 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29525760-rlfxx" event={"ID":"7809862b-911c-4763-af00-a74f3fbf2500","Type":"ContainerStarted","Data":"243b5fea31103c3415f81359e3db7f940d7d50a74e1aebe60823f05f7ae58db8"} Feb 20 00:00:01 crc kubenswrapper[4795]: I0220 00:00:01.759252 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525760-rlfxx" event={"ID":"7809862b-911c-4763-af00-a74f3fbf2500","Type":"ContainerStarted","Data":"9805d9298092525a75c78bd99ca910a4bf91a90017b08307551acb4710ef0f68"} Feb 20 00:00:01 crc kubenswrapper[4795]: I0220 00:00:01.762953 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-purge-29525760-n9sd6" event={"ID":"57de3f43-e33f-4734-b02d-372d013b7e80","Type":"ContainerStarted","Data":"83d35cb72b27cfec9969608628b10b81e4e1659431946fa44a79242cb3e941e1"} Feb 20 00:00:01 crc kubenswrapper[4795]: I0220 00:00:01.763020 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-purge-29525760-n9sd6" event={"ID":"57de3f43-e33f-4734-b02d-372d013b7e80","Type":"ContainerStarted","Data":"0aee228d062bbddc74795824664752ec2183d082ae8088e128256da56508fb59"} Feb 20 00:00:01 crc kubenswrapper[4795]: I0220 00:00:01.764872 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29525760-mtv5d" event={"ID":"29390c83-c5f7-4c7a-8f48-9a02661a1108","Type":"ContainerStarted","Data":"a733d24075e34eb1adca6b5da2d53baa0ebfe92c14da71f66c920c375d7b4d91"} Feb 20 00:00:01 crc kubenswrapper[4795]: I0220 00:00:01.765015 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29525760-mtv5d" event={"ID":"29390c83-c5f7-4c7a-8f48-9a02661a1108","Type":"ContainerStarted","Data":"671c229fbeb3efe9e21c8feda58904a1ee23355307111ba6fa2f34dcfa0ca6c1"} Feb 20 00:00:01 crc kubenswrapper[4795]: I0220 00:00:01.768293 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-db-purge-29525760-9s2b5" event={"ID":"2fbbf12e-019a-40d4-9a07-46b3e5b4c814","Type":"ContainerStarted","Data":"2160bcdaff135da8488dd0341e2f0ef84c741af0030bb2e1d363fb7e5d8d2784"} Feb 20 00:00:01 crc kubenswrapper[4795]: I0220 00:00:01.768390 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-purge-29525760-9s2b5" event={"ID":"2fbbf12e-019a-40d4-9a07-46b3e5b4c814","Type":"ContainerStarted","Data":"a0cc8c7bdae191da08aaf6f8bc8f70504815a88c4a974a522c18d8e74303ae0a"} Feb 20 00:00:01 crc kubenswrapper[4795]: I0220 00:00:01.781148 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29525760-rlfxx" podStartSLOduration=1.781127589 podStartE2EDuration="1.781127589s" podCreationTimestamp="2026-02-20 00:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 00:00:01.770949761 +0000 UTC m=+9112.963467625" watchObservedRunningTime="2026-02-20 00:00:01.781127589 +0000 UTC m=+9112.973645453" Feb 20 00:00:01 crc kubenswrapper[4795]: I0220 00:00:01.794915 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-pruner-29525760-mtv5d" podStartSLOduration=1.794894518 podStartE2EDuration="1.794894518s" podCreationTimestamp="2026-02-20 00:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 00:00:01.790694889 +0000 UTC m=+9112.983212773" watchObservedRunningTime="2026-02-20 00:00:01.794894518 +0000 UTC m=+9112.987412382" Feb 20 00:00:01 crc kubenswrapper[4795]: I0220 00:00:01.836507 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-purge-29525760-9s2b5" podStartSLOduration=1.836487255 podStartE2EDuration="1.836487255s" podCreationTimestamp="2026-02-20 
00:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 00:00:01.822143769 +0000 UTC m=+9113.014661643" watchObservedRunningTime="2026-02-20 00:00:01.836487255 +0000 UTC m=+9113.029005129" Feb 20 00:00:01 crc kubenswrapper[4795]: I0220 00:00:01.848030 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-purge-29525760-n9sd6" podStartSLOduration=1.848015301 podStartE2EDuration="1.848015301s" podCreationTimestamp="2026-02-20 00:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 00:00:01.839935572 +0000 UTC m=+9113.032453436" watchObservedRunningTime="2026-02-20 00:00:01.848015301 +0000 UTC m=+9113.040533165" Feb 20 00:00:02 crc kubenswrapper[4795]: I0220 00:00:02.780346 4795 generic.go:334] "Generic (PLEG): container finished" podID="7809862b-911c-4763-af00-a74f3fbf2500" containerID="243b5fea31103c3415f81359e3db7f940d7d50a74e1aebe60823f05f7ae58db8" exitCode=0 Feb 20 00:00:02 crc kubenswrapper[4795]: I0220 00:00:02.782285 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525760-rlfxx" event={"ID":"7809862b-911c-4763-af00-a74f3fbf2500","Type":"ContainerDied","Data":"243b5fea31103c3415f81359e3db7f940d7d50a74e1aebe60823f05f7ae58db8"} Feb 20 00:00:03 crc kubenswrapper[4795]: I0220 00:00:03.791331 4795 generic.go:334] "Generic (PLEG): container finished" podID="29390c83-c5f7-4c7a-8f48-9a02661a1108" containerID="a733d24075e34eb1adca6b5da2d53baa0ebfe92c14da71f66c920c375d7b4d91" exitCode=0 Feb 20 00:00:03 crc kubenswrapper[4795]: I0220 00:00:03.791407 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29525760-mtv5d" 
event={"ID":"29390c83-c5f7-4c7a-8f48-9a02661a1108","Type":"ContainerDied","Data":"a733d24075e34eb1adca6b5da2d53baa0ebfe92c14da71f66c920c375d7b4d91"} Feb 20 00:00:04 crc kubenswrapper[4795]: I0220 00:00:04.248657 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525760-rlfxx" Feb 20 00:00:04 crc kubenswrapper[4795]: I0220 00:00:04.426069 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sfxmk\" (UniqueName: \"kubernetes.io/projected/7809862b-911c-4763-af00-a74f3fbf2500-kube-api-access-sfxmk\") pod \"7809862b-911c-4763-af00-a74f3fbf2500\" (UID: \"7809862b-911c-4763-af00-a74f3fbf2500\") " Feb 20 00:00:04 crc kubenswrapper[4795]: I0220 00:00:04.426184 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7809862b-911c-4763-af00-a74f3fbf2500-config-volume\") pod \"7809862b-911c-4763-af00-a74f3fbf2500\" (UID: \"7809862b-911c-4763-af00-a74f3fbf2500\") " Feb 20 00:00:04 crc kubenswrapper[4795]: I0220 00:00:04.426319 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7809862b-911c-4763-af00-a74f3fbf2500-secret-volume\") pod \"7809862b-911c-4763-af00-a74f3fbf2500\" (UID: \"7809862b-911c-4763-af00-a74f3fbf2500\") " Feb 20 00:00:04 crc kubenswrapper[4795]: I0220 00:00:04.427227 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7809862b-911c-4763-af00-a74f3fbf2500-config-volume" (OuterVolumeSpecName: "config-volume") pod "7809862b-911c-4763-af00-a74f3fbf2500" (UID: "7809862b-911c-4763-af00-a74f3fbf2500"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 00:00:04 crc kubenswrapper[4795]: I0220 00:00:04.445625 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7809862b-911c-4763-af00-a74f3fbf2500-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7809862b-911c-4763-af00-a74f3fbf2500" (UID: "7809862b-911c-4763-af00-a74f3fbf2500"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 00:00:04 crc kubenswrapper[4795]: I0220 00:00:04.445747 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7809862b-911c-4763-af00-a74f3fbf2500-kube-api-access-sfxmk" (OuterVolumeSpecName: "kube-api-access-sfxmk") pod "7809862b-911c-4763-af00-a74f3fbf2500" (UID: "7809862b-911c-4763-af00-a74f3fbf2500"). InnerVolumeSpecName "kube-api-access-sfxmk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 00:00:04 crc kubenswrapper[4795]: I0220 00:00:04.457389 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525715-mrk95"] Feb 20 00:00:04 crc kubenswrapper[4795]: I0220 00:00:04.467689 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525715-mrk95"] Feb 20 00:00:04 crc kubenswrapper[4795]: I0220 00:00:04.529228 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sfxmk\" (UniqueName: \"kubernetes.io/projected/7809862b-911c-4763-af00-a74f3fbf2500-kube-api-access-sfxmk\") on node \"crc\" DevicePath \"\"" Feb 20 00:00:04 crc kubenswrapper[4795]: I0220 00:00:04.529264 4795 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7809862b-911c-4763-af00-a74f3fbf2500-config-volume\") on node \"crc\" DevicePath \"\"" Feb 20 00:00:04 crc kubenswrapper[4795]: I0220 00:00:04.529274 4795 reconciler_common.go:293] "Volume detached 
for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7809862b-911c-4763-af00-a74f3fbf2500-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 20 00:00:04 crc kubenswrapper[4795]: I0220 00:00:04.803405 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525760-rlfxx" Feb 20 00:00:04 crc kubenswrapper[4795]: I0220 00:00:04.810360 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525760-rlfxx" event={"ID":"7809862b-911c-4763-af00-a74f3fbf2500","Type":"ContainerDied","Data":"9805d9298092525a75c78bd99ca910a4bf91a90017b08307551acb4710ef0f68"} Feb 20 00:00:04 crc kubenswrapper[4795]: I0220 00:00:04.810448 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9805d9298092525a75c78bd99ca910a4bf91a90017b08307551acb4710ef0f68" Feb 20 00:00:05 crc kubenswrapper[4795]: I0220 00:00:05.178601 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29525760-mtv5d" Feb 20 00:00:05 crc kubenswrapper[4795]: I0220 00:00:05.341539 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-858m8\" (UniqueName: \"kubernetes.io/projected/29390c83-c5f7-4c7a-8f48-9a02661a1108-kube-api-access-858m8\") pod \"29390c83-c5f7-4c7a-8f48-9a02661a1108\" (UID: \"29390c83-c5f7-4c7a-8f48-9a02661a1108\") " Feb 20 00:00:05 crc kubenswrapper[4795]: I0220 00:00:05.341836 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/29390c83-c5f7-4c7a-8f48-9a02661a1108-serviceca\") pod \"29390c83-c5f7-4c7a-8f48-9a02661a1108\" (UID: \"29390c83-c5f7-4c7a-8f48-9a02661a1108\") " Feb 20 00:00:05 crc kubenswrapper[4795]: I0220 00:00:05.342448 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29390c83-c5f7-4c7a-8f48-9a02661a1108-serviceca" (OuterVolumeSpecName: "serviceca") pod "29390c83-c5f7-4c7a-8f48-9a02661a1108" (UID: "29390c83-c5f7-4c7a-8f48-9a02661a1108"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 00:00:05 crc kubenswrapper[4795]: I0220 00:00:05.349039 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29390c83-c5f7-4c7a-8f48-9a02661a1108-kube-api-access-858m8" (OuterVolumeSpecName: "kube-api-access-858m8") pod "29390c83-c5f7-4c7a-8f48-9a02661a1108" (UID: "29390c83-c5f7-4c7a-8f48-9a02661a1108"). InnerVolumeSpecName "kube-api-access-858m8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 00:00:05 crc kubenswrapper[4795]: I0220 00:00:05.444523 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-858m8\" (UniqueName: \"kubernetes.io/projected/29390c83-c5f7-4c7a-8f48-9a02661a1108-kube-api-access-858m8\") on node \"crc\" DevicePath \"\"" Feb 20 00:00:05 crc kubenswrapper[4795]: I0220 00:00:05.444568 4795 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/29390c83-c5f7-4c7a-8f48-9a02661a1108-serviceca\") on node \"crc\" DevicePath \"\"" Feb 20 00:00:05 crc kubenswrapper[4795]: I0220 00:00:05.524281 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3eafa182-621e-48fe-a019-360c2f94c212" path="/var/lib/kubelet/pods/3eafa182-621e-48fe-a019-360c2f94c212/volumes" Feb 20 00:00:05 crc kubenswrapper[4795]: I0220 00:00:05.813641 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29525760-mtv5d" event={"ID":"29390c83-c5f7-4c7a-8f48-9a02661a1108","Type":"ContainerDied","Data":"671c229fbeb3efe9e21c8feda58904a1ee23355307111ba6fa2f34dcfa0ca6c1"} Feb 20 00:00:05 crc kubenswrapper[4795]: I0220 00:00:05.813693 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="671c229fbeb3efe9e21c8feda58904a1ee23355307111ba6fa2f34dcfa0ca6c1" Feb 20 00:00:05 crc kubenswrapper[4795]: I0220 00:00:05.813759 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29525760-mtv5d" Feb 20 00:00:07 crc kubenswrapper[4795]: I0220 00:00:07.832699 4795 generic.go:334] "Generic (PLEG): container finished" podID="57de3f43-e33f-4734-b02d-372d013b7e80" containerID="83d35cb72b27cfec9969608628b10b81e4e1659431946fa44a79242cb3e941e1" exitCode=0 Feb 20 00:00:07 crc kubenswrapper[4795]: I0220 00:00:07.832774 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-purge-29525760-n9sd6" event={"ID":"57de3f43-e33f-4734-b02d-372d013b7e80","Type":"ContainerDied","Data":"83d35cb72b27cfec9969608628b10b81e4e1659431946fa44a79242cb3e941e1"} Feb 20 00:00:07 crc kubenswrapper[4795]: I0220 00:00:07.836674 4795 generic.go:334] "Generic (PLEG): container finished" podID="2fbbf12e-019a-40d4-9a07-46b3e5b4c814" containerID="2160bcdaff135da8488dd0341e2f0ef84c741af0030bb2e1d363fb7e5d8d2784" exitCode=0 Feb 20 00:00:07 crc kubenswrapper[4795]: I0220 00:00:07.836713 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-purge-29525760-9s2b5" event={"ID":"2fbbf12e-019a-40d4-9a07-46b3e5b4c814","Type":"ContainerDied","Data":"2160bcdaff135da8488dd0341e2f0ef84c741af0030bb2e1d363fb7e5d8d2784"} Feb 20 00:00:09 crc kubenswrapper[4795]: I0220 00:00:09.369997 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-purge-29525760-n9sd6" Feb 20 00:00:09 crc kubenswrapper[4795]: I0220 00:00:09.377232 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-purge-29525760-9s2b5" Feb 20 00:00:09 crc kubenswrapper[4795]: I0220 00:00:09.527591 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57de3f43-e33f-4734-b02d-372d013b7e80-combined-ca-bundle\") pod \"57de3f43-e33f-4734-b02d-372d013b7e80\" (UID: \"57de3f43-e33f-4734-b02d-372d013b7e80\") " Feb 20 00:00:09 crc kubenswrapper[4795]: I0220 00:00:09.527797 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57de3f43-e33f-4734-b02d-372d013b7e80-config-data\") pod \"57de3f43-e33f-4734-b02d-372d013b7e80\" (UID: \"57de3f43-e33f-4734-b02d-372d013b7e80\") " Feb 20 00:00:09 crc kubenswrapper[4795]: I0220 00:00:09.527906 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmg5x\" (UniqueName: \"kubernetes.io/projected/2fbbf12e-019a-40d4-9a07-46b3e5b4c814-kube-api-access-qmg5x\") pod \"2fbbf12e-019a-40d4-9a07-46b3e5b4c814\" (UID: \"2fbbf12e-019a-40d4-9a07-46b3e5b4c814\") " Feb 20 00:00:09 crc kubenswrapper[4795]: I0220 00:00:09.528029 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57de3f43-e33f-4734-b02d-372d013b7e80-scripts\") pod \"57de3f43-e33f-4734-b02d-372d013b7e80\" (UID: \"57de3f43-e33f-4734-b02d-372d013b7e80\") " Feb 20 00:00:09 crc kubenswrapper[4795]: I0220 00:00:09.528109 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fbbf12e-019a-40d4-9a07-46b3e5b4c814-combined-ca-bundle\") pod \"2fbbf12e-019a-40d4-9a07-46b3e5b4c814\" (UID: \"2fbbf12e-019a-40d4-9a07-46b3e5b4c814\") " Feb 20 00:00:09 crc kubenswrapper[4795]: I0220 00:00:09.528284 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/2fbbf12e-019a-40d4-9a07-46b3e5b4c814-scripts\") pod \"2fbbf12e-019a-40d4-9a07-46b3e5b4c814\" (UID: \"2fbbf12e-019a-40d4-9a07-46b3e5b4c814\") " Feb 20 00:00:09 crc kubenswrapper[4795]: I0220 00:00:09.528360 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwmfj\" (UniqueName: \"kubernetes.io/projected/57de3f43-e33f-4734-b02d-372d013b7e80-kube-api-access-qwmfj\") pod \"57de3f43-e33f-4734-b02d-372d013b7e80\" (UID: \"57de3f43-e33f-4734-b02d-372d013b7e80\") " Feb 20 00:00:09 crc kubenswrapper[4795]: I0220 00:00:09.528378 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fbbf12e-019a-40d4-9a07-46b3e5b4c814-config-data\") pod \"2fbbf12e-019a-40d4-9a07-46b3e5b4c814\" (UID: \"2fbbf12e-019a-40d4-9a07-46b3e5b4c814\") " Feb 20 00:00:09 crc kubenswrapper[4795]: I0220 00:00:09.533372 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fbbf12e-019a-40d4-9a07-46b3e5b4c814-scripts" (OuterVolumeSpecName: "scripts") pod "2fbbf12e-019a-40d4-9a07-46b3e5b4c814" (UID: "2fbbf12e-019a-40d4-9a07-46b3e5b4c814"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 00:00:09 crc kubenswrapper[4795]: I0220 00:00:09.534064 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57de3f43-e33f-4734-b02d-372d013b7e80-scripts" (OuterVolumeSpecName: "scripts") pod "57de3f43-e33f-4734-b02d-372d013b7e80" (UID: "57de3f43-e33f-4734-b02d-372d013b7e80"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 00:00:09 crc kubenswrapper[4795]: I0220 00:00:09.534299 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fbbf12e-019a-40d4-9a07-46b3e5b4c814-kube-api-access-qmg5x" (OuterVolumeSpecName: "kube-api-access-qmg5x") pod "2fbbf12e-019a-40d4-9a07-46b3e5b4c814" (UID: "2fbbf12e-019a-40d4-9a07-46b3e5b4c814"). InnerVolumeSpecName "kube-api-access-qmg5x". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 00:00:09 crc kubenswrapper[4795]: I0220 00:00:09.535545 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57de3f43-e33f-4734-b02d-372d013b7e80-kube-api-access-qwmfj" (OuterVolumeSpecName: "kube-api-access-qwmfj") pod "57de3f43-e33f-4734-b02d-372d013b7e80" (UID: "57de3f43-e33f-4734-b02d-372d013b7e80"). InnerVolumeSpecName "kube-api-access-qwmfj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 00:00:09 crc kubenswrapper[4795]: I0220 00:00:09.557087 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fbbf12e-019a-40d4-9a07-46b3e5b4c814-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2fbbf12e-019a-40d4-9a07-46b3e5b4c814" (UID: "2fbbf12e-019a-40d4-9a07-46b3e5b4c814"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 00:00:09 crc kubenswrapper[4795]: I0220 00:00:09.557679 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57de3f43-e33f-4734-b02d-372d013b7e80-config-data" (OuterVolumeSpecName: "config-data") pod "57de3f43-e33f-4734-b02d-372d013b7e80" (UID: "57de3f43-e33f-4734-b02d-372d013b7e80"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 00:00:09 crc kubenswrapper[4795]: I0220 00:00:09.559724 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fbbf12e-019a-40d4-9a07-46b3e5b4c814-config-data" (OuterVolumeSpecName: "config-data") pod "2fbbf12e-019a-40d4-9a07-46b3e5b4c814" (UID: "2fbbf12e-019a-40d4-9a07-46b3e5b4c814"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 00:00:09 crc kubenswrapper[4795]: I0220 00:00:09.562749 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57de3f43-e33f-4734-b02d-372d013b7e80-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "57de3f43-e33f-4734-b02d-372d013b7e80" (UID: "57de3f43-e33f-4734-b02d-372d013b7e80"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 00:00:09 crc kubenswrapper[4795]: I0220 00:00:09.631246 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57de3f43-e33f-4734-b02d-372d013b7e80-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 00:00:09 crc kubenswrapper[4795]: I0220 00:00:09.631275 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fbbf12e-019a-40d4-9a07-46b3e5b4c814-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 00:00:09 crc kubenswrapper[4795]: I0220 00:00:09.631287 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2fbbf12e-019a-40d4-9a07-46b3e5b4c814-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 00:00:09 crc kubenswrapper[4795]: I0220 00:00:09.631295 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwmfj\" (UniqueName: \"kubernetes.io/projected/57de3f43-e33f-4734-b02d-372d013b7e80-kube-api-access-qwmfj\") on node \"crc\" DevicePath \"\"" Feb 20 
00:00:09 crc kubenswrapper[4795]: I0220 00:00:09.631308 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fbbf12e-019a-40d4-9a07-46b3e5b4c814-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 00:00:09 crc kubenswrapper[4795]: I0220 00:00:09.631317 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57de3f43-e33f-4734-b02d-372d013b7e80-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 00:00:09 crc kubenswrapper[4795]: I0220 00:00:09.631325 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57de3f43-e33f-4734-b02d-372d013b7e80-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 00:00:09 crc kubenswrapper[4795]: I0220 00:00:09.631332 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmg5x\" (UniqueName: \"kubernetes.io/projected/2fbbf12e-019a-40d4-9a07-46b3e5b4c814-kube-api-access-qmg5x\") on node \"crc\" DevicePath \"\"" Feb 20 00:00:09 crc kubenswrapper[4795]: I0220 00:00:09.856694 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-purge-29525760-9s2b5" event={"ID":"2fbbf12e-019a-40d4-9a07-46b3e5b4c814","Type":"ContainerDied","Data":"a0cc8c7bdae191da08aaf6f8bc8f70504815a88c4a974a522c18d8e74303ae0a"} Feb 20 00:00:09 crc kubenswrapper[4795]: I0220 00:00:09.856733 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0cc8c7bdae191da08aaf6f8bc8f70504815a88c4a974a522c18d8e74303ae0a" Feb 20 00:00:09 crc kubenswrapper[4795]: I0220 00:00:09.856747 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-purge-29525760-9s2b5" Feb 20 00:00:09 crc kubenswrapper[4795]: I0220 00:00:09.857987 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-purge-29525760-n9sd6" event={"ID":"57de3f43-e33f-4734-b02d-372d013b7e80","Type":"ContainerDied","Data":"0aee228d062bbddc74795824664752ec2183d082ae8088e128256da56508fb59"} Feb 20 00:00:09 crc kubenswrapper[4795]: I0220 00:00:09.858016 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-purge-29525760-n9sd6" Feb 20 00:00:09 crc kubenswrapper[4795]: I0220 00:00:09.858022 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0aee228d062bbddc74795824664752ec2183d082ae8088e128256da56508fb59" Feb 20 00:00:10 crc kubenswrapper[4795]: I0220 00:00:10.511652 4795 scope.go:117] "RemoveContainer" containerID="f417d522e5357173b817e8b838d944aa5e17352006ff9b8fed75f316b86d76c7" Feb 20 00:00:10 crc kubenswrapper[4795]: E0220 00:00:10.512045 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 20 00:00:23 crc kubenswrapper[4795]: I0220 00:00:23.512031 4795 scope.go:117] "RemoveContainer" containerID="f417d522e5357173b817e8b838d944aa5e17352006ff9b8fed75f316b86d76c7" Feb 20 00:00:23 crc kubenswrapper[4795]: E0220 00:00:23.512826 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 20 00:00:29 crc kubenswrapper[4795]: I0220 00:00:29.284775 4795 scope.go:117] "RemoveContainer" containerID="506e26301ea1d8c97cb2f36560ab4d1582f8a10338a4a109885babb88591cfe0" Feb 20 00:00:35 crc kubenswrapper[4795]: I0220 00:00:35.512380 4795 scope.go:117] "RemoveContainer" containerID="f417d522e5357173b817e8b838d944aa5e17352006ff9b8fed75f316b86d76c7" Feb 20 00:00:35 crc kubenswrapper[4795]: E0220 00:00:35.513361 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 20 00:00:48 crc kubenswrapper[4795]: I0220 00:00:48.512043 4795 scope.go:117] "RemoveContainer" containerID="f417d522e5357173b817e8b838d944aa5e17352006ff9b8fed75f316b86d76c7" Feb 20 00:00:48 crc kubenswrapper[4795]: E0220 00:00:48.512802 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.191536 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-purge-29525761-zbskn"] Feb 20 00:01:00 crc kubenswrapper[4795]: E0220 00:01:00.192796 4795 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="57de3f43-e33f-4734-b02d-372d013b7e80" containerName="nova-manage" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.192813 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="57de3f43-e33f-4734-b02d-372d013b7e80" containerName="nova-manage" Feb 20 00:01:00 crc kubenswrapper[4795]: E0220 00:01:00.192829 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7809862b-911c-4763-af00-a74f3fbf2500" containerName="collect-profiles" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.192837 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7809862b-911c-4763-af00-a74f3fbf2500" containerName="collect-profiles" Feb 20 00:01:00 crc kubenswrapper[4795]: E0220 00:01:00.192876 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29390c83-c5f7-4c7a-8f48-9a02661a1108" containerName="image-pruner" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.192885 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="29390c83-c5f7-4c7a-8f48-9a02661a1108" containerName="image-pruner" Feb 20 00:01:00 crc kubenswrapper[4795]: E0220 00:01:00.192900 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fbbf12e-019a-40d4-9a07-46b3e5b4c814" containerName="nova-manage" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.192908 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fbbf12e-019a-40d4-9a07-46b3e5b4c814" containerName="nova-manage" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.193140 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="29390c83-c5f7-4c7a-8f48-9a02661a1108" containerName="image-pruner" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.193193 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="57de3f43-e33f-4734-b02d-372d013b7e80" containerName="nova-manage" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.193214 4795 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="2fbbf12e-019a-40d4-9a07-46b3e5b4c814" containerName="nova-manage" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.193232 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="7809862b-911c-4763-af00-a74f3fbf2500" containerName="collect-profiles" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.194181 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-purge-29525761-zbskn" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.201597 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe324720-6e0b-4d15-bc6e-3875b26bf7f4-config-data\") pod \"cinder-db-purge-29525761-zbskn\" (UID: \"fe324720-6e0b-4d15-bc6e-3875b26bf7f4\") " pod="openstack/cinder-db-purge-29525761-zbskn" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.201747 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe324720-6e0b-4d15-bc6e-3875b26bf7f4-combined-ca-bundle\") pod \"cinder-db-purge-29525761-zbskn\" (UID: \"fe324720-6e0b-4d15-bc6e-3875b26bf7f4\") " pod="openstack/cinder-db-purge-29525761-zbskn" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.201946 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2bpx\" (UniqueName: \"kubernetes.io/projected/fe324720-6e0b-4d15-bc6e-3875b26bf7f4-kube-api-access-r2bpx\") pod \"cinder-db-purge-29525761-zbskn\" (UID: \"fe324720-6e0b-4d15-bc6e-3875b26bf7f4\") " pod="openstack/cinder-db-purge-29525761-zbskn" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.202072 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-purge-config-data\" (UniqueName: 
\"kubernetes.io/secret/fe324720-6e0b-4d15-bc6e-3875b26bf7f4-db-purge-config-data\") pod \"cinder-db-purge-29525761-zbskn\" (UID: \"fe324720-6e0b-4d15-bc6e-3875b26bf7f4\") " pod="openstack/cinder-db-purge-29525761-zbskn" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.212857 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-purge-29525761-5slgz"] Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.215004 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-purge-29525761-5slgz" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.239853 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-purge-29525761-lcwsj"] Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.241807 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29525761-d78q6"] Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.243046 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29525761-d78q6" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.243244 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-purge-29525761-lcwsj" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.248725 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.248917 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-purge-29525761-tq4s7"] Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.252316 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-purge-29525761-tq4s7" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.260657 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-purge-29525761-5slgz"] Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.275578 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-purge-29525761-tq4s7"] Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.288723 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-purge-29525761-lcwsj"] Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.304566 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2bpx\" (UniqueName: \"kubernetes.io/projected/fe324720-6e0b-4d15-bc6e-3875b26bf7f4-kube-api-access-r2bpx\") pod \"cinder-db-purge-29525761-zbskn\" (UID: \"fe324720-6e0b-4d15-bc6e-3875b26bf7f4\") " pod="openstack/cinder-db-purge-29525761-zbskn" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.304657 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-purge-config-data\" (UniqueName: \"kubernetes.io/secret/fe324720-6e0b-4d15-bc6e-3875b26bf7f4-db-purge-config-data\") pod \"cinder-db-purge-29525761-zbskn\" (UID: \"fe324720-6e0b-4d15-bc6e-3875b26bf7f4\") " pod="openstack/cinder-db-purge-29525761-zbskn" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.304728 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe324720-6e0b-4d15-bc6e-3875b26bf7f4-config-data\") pod \"cinder-db-purge-29525761-zbskn\" (UID: \"fe324720-6e0b-4d15-bc6e-3875b26bf7f4\") " pod="openstack/cinder-db-purge-29525761-zbskn" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.304777 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/fe324720-6e0b-4d15-bc6e-3875b26bf7f4-combined-ca-bundle\") pod \"cinder-db-purge-29525761-zbskn\" (UID: \"fe324720-6e0b-4d15-bc6e-3875b26bf7f4\") " pod="openstack/cinder-db-purge-29525761-zbskn" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.314115 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29525761-d78q6"] Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.319052 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-purge-config-data\" (UniqueName: \"kubernetes.io/secret/fe324720-6e0b-4d15-bc6e-3875b26bf7f4-db-purge-config-data\") pod \"cinder-db-purge-29525761-zbskn\" (UID: \"fe324720-6e0b-4d15-bc6e-3875b26bf7f4\") " pod="openstack/cinder-db-purge-29525761-zbskn" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.324929 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe324720-6e0b-4d15-bc6e-3875b26bf7f4-combined-ca-bundle\") pod \"cinder-db-purge-29525761-zbskn\" (UID: \"fe324720-6e0b-4d15-bc6e-3875b26bf7f4\") " pod="openstack/cinder-db-purge-29525761-zbskn" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.335448 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe324720-6e0b-4d15-bc6e-3875b26bf7f4-config-data\") pod \"cinder-db-purge-29525761-zbskn\" (UID: \"fe324720-6e0b-4d15-bc6e-3875b26bf7f4\") " pod="openstack/cinder-db-purge-29525761-zbskn" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.336768 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2bpx\" (UniqueName: \"kubernetes.io/projected/fe324720-6e0b-4d15-bc6e-3875b26bf7f4-kube-api-access-r2bpx\") pod \"cinder-db-purge-29525761-zbskn\" (UID: \"fe324720-6e0b-4d15-bc6e-3875b26bf7f4\") " pod="openstack/cinder-db-purge-29525761-zbskn" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 
00:01:00.398354 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-purge-29525761-zbskn"] Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.409292 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb-config-data\") pod \"heat-db-purge-29525761-tq4s7\" (UID: \"d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb\") " pod="openstack/heat-db-purge-29525761-tq4s7" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.409336 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpt9n\" (UniqueName: \"kubernetes.io/projected/9344b7be-b07a-4660-9352-dfdbcecac424-kube-api-access-kpt9n\") pod \"glance-db-purge-29525761-lcwsj\" (UID: \"9344b7be-b07a-4660-9352-dfdbcecac424\") " pod="openstack/glance-db-purge-29525761-lcwsj" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.409366 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-purge-config-data\" (UniqueName: \"kubernetes.io/secret/9344b7be-b07a-4660-9352-dfdbcecac424-db-purge-config-data\") pod \"glance-db-purge-29525761-lcwsj\" (UID: \"9344b7be-b07a-4660-9352-dfdbcecac424\") " pod="openstack/glance-db-purge-29525761-lcwsj" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.409403 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dc6e226-d501-4698-b49c-f07fc8e80339-combined-ca-bundle\") pod \"manila-db-purge-29525761-5slgz\" (UID: \"8dc6e226-d501-4698-b49c-f07fc8e80339\") " pod="openstack/manila-db-purge-29525761-5slgz" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.409555 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb-combined-ca-bundle\") pod \"heat-db-purge-29525761-tq4s7\" (UID: \"d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb\") " pod="openstack/heat-db-purge-29525761-tq4s7" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.413065 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e307d045-9890-4475-8c51-395484da10ca-fernet-keys\") pod \"keystone-cron-29525761-d78q6\" (UID: \"e307d045-9890-4475-8c51-395484da10ca\") " pod="openstack/keystone-cron-29525761-d78q6" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.413103 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9344b7be-b07a-4660-9352-dfdbcecac424-config-data\") pod \"glance-db-purge-29525761-lcwsj\" (UID: \"9344b7be-b07a-4660-9352-dfdbcecac424\") " pod="openstack/glance-db-purge-29525761-lcwsj" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.413125 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97sdm\" (UniqueName: \"kubernetes.io/projected/8dc6e226-d501-4698-b49c-f07fc8e80339-kube-api-access-97sdm\") pod \"manila-db-purge-29525761-5slgz\" (UID: \"8dc6e226-d501-4698-b49c-f07fc8e80339\") " pod="openstack/manila-db-purge-29525761-5slgz" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.413146 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9344b7be-b07a-4660-9352-dfdbcecac424-combined-ca-bundle\") pod \"glance-db-purge-29525761-lcwsj\" (UID: \"9344b7be-b07a-4660-9352-dfdbcecac424\") " pod="openstack/glance-db-purge-29525761-lcwsj" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.413183 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"db-purge-config-data\" (UniqueName: \"kubernetes.io/secret/8dc6e226-d501-4698-b49c-f07fc8e80339-db-purge-config-data\") pod \"manila-db-purge-29525761-5slgz\" (UID: \"8dc6e226-d501-4698-b49c-f07fc8e80339\") " pod="openstack/manila-db-purge-29525761-5slgz" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.413246 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fd9z\" (UniqueName: \"kubernetes.io/projected/d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb-kube-api-access-4fd9z\") pod \"heat-db-purge-29525761-tq4s7\" (UID: \"d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb\") " pod="openstack/heat-db-purge-29525761-tq4s7" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.413326 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8b8t2\" (UniqueName: \"kubernetes.io/projected/e307d045-9890-4475-8c51-395484da10ca-kube-api-access-8b8t2\") pod \"keystone-cron-29525761-d78q6\" (UID: \"e307d045-9890-4475-8c51-395484da10ca\") " pod="openstack/keystone-cron-29525761-d78q6" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.413354 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e307d045-9890-4475-8c51-395484da10ca-combined-ca-bundle\") pod \"keystone-cron-29525761-d78q6\" (UID: \"e307d045-9890-4475-8c51-395484da10ca\") " pod="openstack/keystone-cron-29525761-d78q6" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.413372 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-purge-config-data\" (UniqueName: \"kubernetes.io/secret/d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb-db-purge-config-data\") pod \"heat-db-purge-29525761-tq4s7\" (UID: \"d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb\") " 
pod="openstack/heat-db-purge-29525761-tq4s7" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.413447 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e307d045-9890-4475-8c51-395484da10ca-config-data\") pod \"keystone-cron-29525761-d78q6\" (UID: \"e307d045-9890-4475-8c51-395484da10ca\") " pod="openstack/keystone-cron-29525761-d78q6" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.511456 4795 scope.go:117] "RemoveContainer" containerID="f417d522e5357173b817e8b838d944aa5e17352006ff9b8fed75f316b86d76c7" Feb 20 00:01:00 crc kubenswrapper[4795]: E0220 00:01:00.511761 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.516950 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb-config-data\") pod \"heat-db-purge-29525761-tq4s7\" (UID: \"d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb\") " pod="openstack/heat-db-purge-29525761-tq4s7" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.517006 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpt9n\" (UniqueName: \"kubernetes.io/projected/9344b7be-b07a-4660-9352-dfdbcecac424-kube-api-access-kpt9n\") pod \"glance-db-purge-29525761-lcwsj\" (UID: \"9344b7be-b07a-4660-9352-dfdbcecac424\") " pod="openstack/glance-db-purge-29525761-lcwsj" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.517038 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"db-purge-config-data\" (UniqueName: \"kubernetes.io/secret/9344b7be-b07a-4660-9352-dfdbcecac424-db-purge-config-data\") pod \"glance-db-purge-29525761-lcwsj\" (UID: \"9344b7be-b07a-4660-9352-dfdbcecac424\") " pod="openstack/glance-db-purge-29525761-lcwsj" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.517116 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dc6e226-d501-4698-b49c-f07fc8e80339-combined-ca-bundle\") pod \"manila-db-purge-29525761-5slgz\" (UID: \"8dc6e226-d501-4698-b49c-f07fc8e80339\") " pod="openstack/manila-db-purge-29525761-5slgz" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.517186 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb-combined-ca-bundle\") pod \"heat-db-purge-29525761-tq4s7\" (UID: \"d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb\") " pod="openstack/heat-db-purge-29525761-tq4s7" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.519268 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e307d045-9890-4475-8c51-395484da10ca-fernet-keys\") pod \"keystone-cron-29525761-d78q6\" (UID: \"e307d045-9890-4475-8c51-395484da10ca\") " pod="openstack/keystone-cron-29525761-d78q6" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.519305 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9344b7be-b07a-4660-9352-dfdbcecac424-config-data\") pod \"glance-db-purge-29525761-lcwsj\" (UID: \"9344b7be-b07a-4660-9352-dfdbcecac424\") " pod="openstack/glance-db-purge-29525761-lcwsj" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.519349 4795 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-97sdm\" (UniqueName: \"kubernetes.io/projected/8dc6e226-d501-4698-b49c-f07fc8e80339-kube-api-access-97sdm\") pod \"manila-db-purge-29525761-5slgz\" (UID: \"8dc6e226-d501-4698-b49c-f07fc8e80339\") " pod="openstack/manila-db-purge-29525761-5slgz" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.519383 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9344b7be-b07a-4660-9352-dfdbcecac424-combined-ca-bundle\") pod \"glance-db-purge-29525761-lcwsj\" (UID: \"9344b7be-b07a-4660-9352-dfdbcecac424\") " pod="openstack/glance-db-purge-29525761-lcwsj" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.519440 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-purge-config-data\" (UniqueName: \"kubernetes.io/secret/8dc6e226-d501-4698-b49c-f07fc8e80339-db-purge-config-data\") pod \"manila-db-purge-29525761-5slgz\" (UID: \"8dc6e226-d501-4698-b49c-f07fc8e80339\") " pod="openstack/manila-db-purge-29525761-5slgz" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.519546 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fd9z\" (UniqueName: \"kubernetes.io/projected/d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb-kube-api-access-4fd9z\") pod \"heat-db-purge-29525761-tq4s7\" (UID: \"d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb\") " pod="openstack/heat-db-purge-29525761-tq4s7" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.520067 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8b8t2\" (UniqueName: \"kubernetes.io/projected/e307d045-9890-4475-8c51-395484da10ca-kube-api-access-8b8t2\") pod \"keystone-cron-29525761-d78q6\" (UID: \"e307d045-9890-4475-8c51-395484da10ca\") " pod="openstack/keystone-cron-29525761-d78q6" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.520372 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e307d045-9890-4475-8c51-395484da10ca-combined-ca-bundle\") pod \"keystone-cron-29525761-d78q6\" (UID: \"e307d045-9890-4475-8c51-395484da10ca\") " pod="openstack/keystone-cron-29525761-d78q6" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.520692 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-purge-config-data\" (UniqueName: \"kubernetes.io/secret/d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb-db-purge-config-data\") pod \"heat-db-purge-29525761-tq4s7\" (UID: \"d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb\") " pod="openstack/heat-db-purge-29525761-tq4s7" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.521127 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e307d045-9890-4475-8c51-395484da10ca-config-data\") pod \"keystone-cron-29525761-d78q6\" (UID: \"e307d045-9890-4475-8c51-395484da10ca\") " pod="openstack/keystone-cron-29525761-d78q6" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.526675 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dc6e226-d501-4698-b49c-f07fc8e80339-combined-ca-bundle\") pod \"manila-db-purge-29525761-5slgz\" (UID: \"8dc6e226-d501-4698-b49c-f07fc8e80339\") " pod="openstack/manila-db-purge-29525761-5slgz" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.535320 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e307d045-9890-4475-8c51-395484da10ca-fernet-keys\") pod \"keystone-cron-29525761-d78q6\" (UID: \"e307d045-9890-4475-8c51-395484da10ca\") " pod="openstack/keystone-cron-29525761-d78q6" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.543098 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-kpt9n\" (UniqueName: \"kubernetes.io/projected/9344b7be-b07a-4660-9352-dfdbcecac424-kube-api-access-kpt9n\") pod \"glance-db-purge-29525761-lcwsj\" (UID: \"9344b7be-b07a-4660-9352-dfdbcecac424\") " pod="openstack/glance-db-purge-29525761-lcwsj" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.553000 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-purge-config-data\" (UniqueName: \"kubernetes.io/secret/9344b7be-b07a-4660-9352-dfdbcecac424-db-purge-config-data\") pod \"glance-db-purge-29525761-lcwsj\" (UID: \"9344b7be-b07a-4660-9352-dfdbcecac424\") " pod="openstack/glance-db-purge-29525761-lcwsj" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.556806 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e307d045-9890-4475-8c51-395484da10ca-combined-ca-bundle\") pod \"keystone-cron-29525761-d78q6\" (UID: \"e307d045-9890-4475-8c51-395484da10ca\") " pod="openstack/keystone-cron-29525761-d78q6" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.557316 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e307d045-9890-4475-8c51-395484da10ca-config-data\") pod \"keystone-cron-29525761-d78q6\" (UID: \"e307d045-9890-4475-8c51-395484da10ca\") " pod="openstack/keystone-cron-29525761-d78q6" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.564188 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-purge-config-data\" (UniqueName: \"kubernetes.io/secret/8dc6e226-d501-4698-b49c-f07fc8e80339-db-purge-config-data\") pod \"manila-db-purge-29525761-5slgz\" (UID: \"8dc6e226-d501-4698-b49c-f07fc8e80339\") " pod="openstack/manila-db-purge-29525761-5slgz" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.564396 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-purge-config-data\" (UniqueName: 
\"kubernetes.io/secret/d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb-db-purge-config-data\") pod \"heat-db-purge-29525761-tq4s7\" (UID: \"d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb\") " pod="openstack/heat-db-purge-29525761-tq4s7" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.580312 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fd9z\" (UniqueName: \"kubernetes.io/projected/d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb-kube-api-access-4fd9z\") pod \"heat-db-purge-29525761-tq4s7\" (UID: \"d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb\") " pod="openstack/heat-db-purge-29525761-tq4s7" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.586816 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-purge-29525761-zbskn" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.588418 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8b8t2\" (UniqueName: \"kubernetes.io/projected/e307d045-9890-4475-8c51-395484da10ca-kube-api-access-8b8t2\") pod \"keystone-cron-29525761-d78q6\" (UID: \"e307d045-9890-4475-8c51-395484da10ca\") " pod="openstack/keystone-cron-29525761-d78q6" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.588900 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9344b7be-b07a-4660-9352-dfdbcecac424-config-data\") pod \"glance-db-purge-29525761-lcwsj\" (UID: \"9344b7be-b07a-4660-9352-dfdbcecac424\") " pod="openstack/glance-db-purge-29525761-lcwsj" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.588985 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9344b7be-b07a-4660-9352-dfdbcecac424-combined-ca-bundle\") pod \"glance-db-purge-29525761-lcwsj\" (UID: \"9344b7be-b07a-4660-9352-dfdbcecac424\") " pod="openstack/glance-db-purge-29525761-lcwsj" Feb 20 00:01:00 crc kubenswrapper[4795]: 
I0220 00:01:00.591824 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97sdm\" (UniqueName: \"kubernetes.io/projected/8dc6e226-d501-4698-b49c-f07fc8e80339-kube-api-access-97sdm\") pod \"manila-db-purge-29525761-5slgz\" (UID: \"8dc6e226-d501-4698-b49c-f07fc8e80339\") " pod="openstack/manila-db-purge-29525761-5slgz" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.597764 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb-combined-ca-bundle\") pod \"heat-db-purge-29525761-tq4s7\" (UID: \"d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb\") " pod="openstack/heat-db-purge-29525761-tq4s7" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.605791 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-purge-29525761-5slgz" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.606690 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb-config-data\") pod \"heat-db-purge-29525761-tq4s7\" (UID: \"d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb\") " pod="openstack/heat-db-purge-29525761-tq4s7" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.737808 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29525761-d78q6" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.746693 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-purge-29525761-lcwsj" Feb 20 00:01:00 crc kubenswrapper[4795]: I0220 00:01:00.755618 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-purge-29525761-tq4s7" Feb 20 00:01:01 crc kubenswrapper[4795]: I0220 00:01:01.101444 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-purge-29525761-zbskn"] Feb 20 00:01:01 crc kubenswrapper[4795]: W0220 00:01:01.322250 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8dc6e226_d501_4698_b49c_f07fc8e80339.slice/crio-839fcc1ae14bfe08de5f8cc957c2df706e67ee9e3944f751bcde5c02a76b7c98 WatchSource:0}: Error finding container 839fcc1ae14bfe08de5f8cc957c2df706e67ee9e3944f751bcde5c02a76b7c98: Status 404 returned error can't find the container with id 839fcc1ae14bfe08de5f8cc957c2df706e67ee9e3944f751bcde5c02a76b7c98 Feb 20 00:01:01 crc kubenswrapper[4795]: I0220 00:01:01.326451 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-purge-29525761-5slgz"] Feb 20 00:01:01 crc kubenswrapper[4795]: I0220 00:01:01.341683 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29525761-d78q6"] Feb 20 00:01:01 crc kubenswrapper[4795]: I0220 00:01:01.389711 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29525761-d78q6" event={"ID":"e307d045-9890-4475-8c51-395484da10ca","Type":"ContainerStarted","Data":"4775fd8acc8285432dc75f9369474eda69ed12d7e07fe84286320ce0fe245768"} Feb 20 00:01:01 crc kubenswrapper[4795]: I0220 00:01:01.415883 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-purge-29525761-lcwsj"] Feb 20 00:01:01 crc kubenswrapper[4795]: I0220 00:01:01.418570 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-purge-29525761-5slgz" event={"ID":"8dc6e226-d501-4698-b49c-f07fc8e80339","Type":"ContainerStarted","Data":"839fcc1ae14bfe08de5f8cc957c2df706e67ee9e3944f751bcde5c02a76b7c98"} Feb 20 00:01:01 crc kubenswrapper[4795]: I0220 00:01:01.422055 4795 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/cinder-db-purge-29525761-zbskn" event={"ID":"fe324720-6e0b-4d15-bc6e-3875b26bf7f4","Type":"ContainerStarted","Data":"d245ab4deae1c20cb273c4781102e64ba81f2352131918cd3603be3874e3b24b"} Feb 20 00:01:01 crc kubenswrapper[4795]: I0220 00:01:01.427941 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-purge-29525761-tq4s7"] Feb 20 00:01:01 crc kubenswrapper[4795]: W0220 00:01:01.429061 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9344b7be_b07a_4660_9352_dfdbcecac424.slice/crio-f6461fca43460d2524cc1869187a90e6e92ad799d2b396d33e0f626bc7a5e79e WatchSource:0}: Error finding container f6461fca43460d2524cc1869187a90e6e92ad799d2b396d33e0f626bc7a5e79e: Status 404 returned error can't find the container with id f6461fca43460d2524cc1869187a90e6e92ad799d2b396d33e0f626bc7a5e79e Feb 20 00:01:01 crc kubenswrapper[4795]: W0220 00:01:01.454830 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd673a8e7_fd1c_4bd1_ad6b_fb18a187b5cb.slice/crio-735a70899c25a3ec127ed998e46a3e97bab5c4645f2d2852a58f1b4d84ae9c13 WatchSource:0}: Error finding container 735a70899c25a3ec127ed998e46a3e97bab5c4645f2d2852a58f1b4d84ae9c13: Status 404 returned error can't find the container with id 735a70899c25a3ec127ed998e46a3e97bab5c4645f2d2852a58f1b4d84ae9c13 Feb 20 00:01:02 crc kubenswrapper[4795]: I0220 00:01:02.455568 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-purge-29525761-lcwsj" event={"ID":"9344b7be-b07a-4660-9352-dfdbcecac424","Type":"ContainerStarted","Data":"f6461fca43460d2524cc1869187a90e6e92ad799d2b396d33e0f626bc7a5e79e"} Feb 20 00:01:02 crc kubenswrapper[4795]: I0220 00:01:02.460778 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-purge-29525761-tq4s7" 
event={"ID":"d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb","Type":"ContainerStarted","Data":"faad2ded953d5657f7a517328a5146fa339ecadd1a2f2c12485311d9ecfd00f1"} Feb 20 00:01:02 crc kubenswrapper[4795]: I0220 00:01:02.460857 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-purge-29525761-tq4s7" event={"ID":"d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb","Type":"ContainerStarted","Data":"735a70899c25a3ec127ed998e46a3e97bab5c4645f2d2852a58f1b4d84ae9c13"} Feb 20 00:01:02 crc kubenswrapper[4795]: I0220 00:01:02.480945 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-purge-29525761-tq4s7" podStartSLOduration=2.480924389 podStartE2EDuration="2.480924389s" podCreationTimestamp="2026-02-20 00:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 00:01:02.478762238 +0000 UTC m=+9173.671280102" watchObservedRunningTime="2026-02-20 00:01:02.480924389 +0000 UTC m=+9173.673442253" Feb 20 00:01:02 crc kubenswrapper[4795]: I0220 00:01:02.521116 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29525761-d78q6" event={"ID":"e307d045-9890-4475-8c51-395484da10ca","Type":"ContainerStarted","Data":"d952c9318d01d0f2fbed23c519883c7b2ce798dd43e62951ff40a36d13484b9f"} Feb 20 00:01:02 crc kubenswrapper[4795]: I0220 00:01:02.525492 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-purge-29525761-5slgz" event={"ID":"8dc6e226-d501-4698-b49c-f07fc8e80339","Type":"ContainerStarted","Data":"7f1b10e04140da2231a173788a781af5c72a39132101aedc853450f4470db79d"} Feb 20 00:01:02 crc kubenswrapper[4795]: I0220 00:01:02.545424 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29525761-d78q6" podStartSLOduration=2.5453996119999998 podStartE2EDuration="2.545399612s" podCreationTimestamp="2026-02-20 00:01:00 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 00:01:02.538974231 +0000 UTC m=+9173.731492095" watchObservedRunningTime="2026-02-20 00:01:02.545399612 +0000 UTC m=+9173.737917476" Feb 20 00:01:02 crc kubenswrapper[4795]: I0220 00:01:02.576124 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-purge-29525761-5slgz" podStartSLOduration=2.576105011 podStartE2EDuration="2.576105011s" podCreationTimestamp="2026-02-20 00:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 00:01:02.565276815 +0000 UTC m=+9173.757794679" watchObservedRunningTime="2026-02-20 00:01:02.576105011 +0000 UTC m=+9173.768622875" Feb 20 00:01:03 crc kubenswrapper[4795]: I0220 00:01:03.537941 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-purge-29525761-lcwsj" event={"ID":"9344b7be-b07a-4660-9352-dfdbcecac424","Type":"ContainerStarted","Data":"d03ab37f975de2c279ad295fe976f236b962ccc100e8c364d616ff19c0f5983e"} Feb 20 00:01:03 crc kubenswrapper[4795]: I0220 00:01:03.540770 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-purge-29525761-zbskn" event={"ID":"fe324720-6e0b-4d15-bc6e-3875b26bf7f4","Type":"ContainerStarted","Data":"4f952e7e5b352c1a9af1933c2d718e36ecf71d34b2f2efc9a84b624480ca61e8"} Feb 20 00:01:03 crc kubenswrapper[4795]: I0220 00:01:03.559626 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-purge-29525761-lcwsj" podStartSLOduration=3.559604756 podStartE2EDuration="3.559604756s" podCreationTimestamp="2026-02-20 00:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 00:01:03.558920027 +0000 UTC m=+9174.751437891" watchObservedRunningTime="2026-02-20 
00:01:03.559604756 +0000 UTC m=+9174.752122620" Feb 20 00:01:03 crc kubenswrapper[4795]: I0220 00:01:03.579099 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-purge-29525761-zbskn" podStartSLOduration=3.579078407 podStartE2EDuration="3.579078407s" podCreationTimestamp="2026-02-20 00:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 00:01:03.575502726 +0000 UTC m=+9174.768020590" watchObservedRunningTime="2026-02-20 00:01:03.579078407 +0000 UTC m=+9174.771596261" Feb 20 00:01:04 crc kubenswrapper[4795]: I0220 00:01:04.551231 4795 generic.go:334] "Generic (PLEG): container finished" podID="8dc6e226-d501-4698-b49c-f07fc8e80339" containerID="7f1b10e04140da2231a173788a781af5c72a39132101aedc853450f4470db79d" exitCode=0 Feb 20 00:01:04 crc kubenswrapper[4795]: I0220 00:01:04.551313 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-purge-29525761-5slgz" event={"ID":"8dc6e226-d501-4698-b49c-f07fc8e80339","Type":"ContainerDied","Data":"7f1b10e04140da2231a173788a781af5c72a39132101aedc853450f4470db79d"} Feb 20 00:01:05 crc kubenswrapper[4795]: I0220 00:01:05.565126 4795 generic.go:334] "Generic (PLEG): container finished" podID="9344b7be-b07a-4660-9352-dfdbcecac424" containerID="d03ab37f975de2c279ad295fe976f236b962ccc100e8c364d616ff19c0f5983e" exitCode=0 Feb 20 00:01:05 crc kubenswrapper[4795]: I0220 00:01:05.565509 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-purge-29525761-lcwsj" event={"ID":"9344b7be-b07a-4660-9352-dfdbcecac424","Type":"ContainerDied","Data":"d03ab37f975de2c279ad295fe976f236b962ccc100e8c364d616ff19c0f5983e"} Feb 20 00:01:05 crc kubenswrapper[4795]: I0220 00:01:05.568644 4795 generic.go:334] "Generic (PLEG): container finished" podID="d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb" 
containerID="faad2ded953d5657f7a517328a5146fa339ecadd1a2f2c12485311d9ecfd00f1" exitCode=0 Feb 20 00:01:05 crc kubenswrapper[4795]: I0220 00:01:05.568699 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-purge-29525761-tq4s7" event={"ID":"d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb","Type":"ContainerDied","Data":"faad2ded953d5657f7a517328a5146fa339ecadd1a2f2c12485311d9ecfd00f1"} Feb 20 00:01:05 crc kubenswrapper[4795]: I0220 00:01:05.570446 4795 generic.go:334] "Generic (PLEG): container finished" podID="e307d045-9890-4475-8c51-395484da10ca" containerID="d952c9318d01d0f2fbed23c519883c7b2ce798dd43e62951ff40a36d13484b9f" exitCode=0 Feb 20 00:01:05 crc kubenswrapper[4795]: I0220 00:01:05.570515 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29525761-d78q6" event={"ID":"e307d045-9890-4475-8c51-395484da10ca","Type":"ContainerDied","Data":"d952c9318d01d0f2fbed23c519883c7b2ce798dd43e62951ff40a36d13484b9f"} Feb 20 00:01:06 crc kubenswrapper[4795]: I0220 00:01:06.024740 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-purge-29525761-5slgz" Feb 20 00:01:06 crc kubenswrapper[4795]: I0220 00:01:06.118568 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dc6e226-d501-4698-b49c-f07fc8e80339-combined-ca-bundle\") pod \"8dc6e226-d501-4698-b49c-f07fc8e80339\" (UID: \"8dc6e226-d501-4698-b49c-f07fc8e80339\") " Feb 20 00:01:06 crc kubenswrapper[4795]: I0220 00:01:06.118644 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-purge-config-data\" (UniqueName: \"kubernetes.io/secret/8dc6e226-d501-4698-b49c-f07fc8e80339-db-purge-config-data\") pod \"8dc6e226-d501-4698-b49c-f07fc8e80339\" (UID: \"8dc6e226-d501-4698-b49c-f07fc8e80339\") " Feb 20 00:01:06 crc kubenswrapper[4795]: I0220 00:01:06.118873 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-97sdm\" (UniqueName: \"kubernetes.io/projected/8dc6e226-d501-4698-b49c-f07fc8e80339-kube-api-access-97sdm\") pod \"8dc6e226-d501-4698-b49c-f07fc8e80339\" (UID: \"8dc6e226-d501-4698-b49c-f07fc8e80339\") " Feb 20 00:01:06 crc kubenswrapper[4795]: I0220 00:01:06.127306 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dc6e226-d501-4698-b49c-f07fc8e80339-db-purge-config-data" (OuterVolumeSpecName: "db-purge-config-data") pod "8dc6e226-d501-4698-b49c-f07fc8e80339" (UID: "8dc6e226-d501-4698-b49c-f07fc8e80339"). InnerVolumeSpecName "db-purge-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 00:01:06 crc kubenswrapper[4795]: I0220 00:01:06.134578 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8dc6e226-d501-4698-b49c-f07fc8e80339-kube-api-access-97sdm" (OuterVolumeSpecName: "kube-api-access-97sdm") pod "8dc6e226-d501-4698-b49c-f07fc8e80339" (UID: "8dc6e226-d501-4698-b49c-f07fc8e80339"). 
InnerVolumeSpecName "kube-api-access-97sdm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 00:01:06 crc kubenswrapper[4795]: I0220 00:01:06.166360 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dc6e226-d501-4698-b49c-f07fc8e80339-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8dc6e226-d501-4698-b49c-f07fc8e80339" (UID: "8dc6e226-d501-4698-b49c-f07fc8e80339"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 00:01:06 crc kubenswrapper[4795]: I0220 00:01:06.221200 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-97sdm\" (UniqueName: \"kubernetes.io/projected/8dc6e226-d501-4698-b49c-f07fc8e80339-kube-api-access-97sdm\") on node \"crc\" DevicePath \"\"" Feb 20 00:01:06 crc kubenswrapper[4795]: I0220 00:01:06.221231 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dc6e226-d501-4698-b49c-f07fc8e80339-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 00:01:06 crc kubenswrapper[4795]: I0220 00:01:06.221240 4795 reconciler_common.go:293] "Volume detached for volume \"db-purge-config-data\" (UniqueName: \"kubernetes.io/secret/8dc6e226-d501-4698-b49c-f07fc8e80339-db-purge-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 00:01:06 crc kubenswrapper[4795]: I0220 00:01:06.592904 4795 generic.go:334] "Generic (PLEG): container finished" podID="fe324720-6e0b-4d15-bc6e-3875b26bf7f4" containerID="4f952e7e5b352c1a9af1933c2d718e36ecf71d34b2f2efc9a84b624480ca61e8" exitCode=0 Feb 20 00:01:06 crc kubenswrapper[4795]: I0220 00:01:06.595141 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-purge-29525761-zbskn" event={"ID":"fe324720-6e0b-4d15-bc6e-3875b26bf7f4","Type":"ContainerDied","Data":"4f952e7e5b352c1a9af1933c2d718e36ecf71d34b2f2efc9a84b624480ca61e8"} Feb 20 00:01:06 crc kubenswrapper[4795]: I0220 
00:01:06.597899 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-purge-29525761-5slgz" event={"ID":"8dc6e226-d501-4698-b49c-f07fc8e80339","Type":"ContainerDied","Data":"839fcc1ae14bfe08de5f8cc957c2df706e67ee9e3944f751bcde5c02a76b7c98"} Feb 20 00:01:06 crc kubenswrapper[4795]: I0220 00:01:06.597964 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="839fcc1ae14bfe08de5f8cc957c2df706e67ee9e3944f751bcde5c02a76b7c98" Feb 20 00:01:06 crc kubenswrapper[4795]: I0220 00:01:06.598020 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-purge-29525761-5slgz" Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.077997 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29525761-d78q6" Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.171592 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e307d045-9890-4475-8c51-395484da10ca-config-data\") pod \"e307d045-9890-4475-8c51-395484da10ca\" (UID: \"e307d045-9890-4475-8c51-395484da10ca\") " Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.171686 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8b8t2\" (UniqueName: \"kubernetes.io/projected/e307d045-9890-4475-8c51-395484da10ca-kube-api-access-8b8t2\") pod \"e307d045-9890-4475-8c51-395484da10ca\" (UID: \"e307d045-9890-4475-8c51-395484da10ca\") " Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.171769 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e307d045-9890-4475-8c51-395484da10ca-fernet-keys\") pod \"e307d045-9890-4475-8c51-395484da10ca\" (UID: \"e307d045-9890-4475-8c51-395484da10ca\") " Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.171788 4795 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e307d045-9890-4475-8c51-395484da10ca-combined-ca-bundle\") pod \"e307d045-9890-4475-8c51-395484da10ca\" (UID: \"e307d045-9890-4475-8c51-395484da10ca\") " Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.177544 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e307d045-9890-4475-8c51-395484da10ca-kube-api-access-8b8t2" (OuterVolumeSpecName: "kube-api-access-8b8t2") pod "e307d045-9890-4475-8c51-395484da10ca" (UID: "e307d045-9890-4475-8c51-395484da10ca"). InnerVolumeSpecName "kube-api-access-8b8t2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.177836 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e307d045-9890-4475-8c51-395484da10ca-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "e307d045-9890-4475-8c51-395484da10ca" (UID: "e307d045-9890-4475-8c51-395484da10ca"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.201306 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e307d045-9890-4475-8c51-395484da10ca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e307d045-9890-4475-8c51-395484da10ca" (UID: "e307d045-9890-4475-8c51-395484da10ca"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.236140 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e307d045-9890-4475-8c51-395484da10ca-config-data" (OuterVolumeSpecName: "config-data") pod "e307d045-9890-4475-8c51-395484da10ca" (UID: "e307d045-9890-4475-8c51-395484da10ca"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.258266 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-purge-29525761-tq4s7" Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.264462 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-purge-29525761-lcwsj" Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.279176 4795 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e307d045-9890-4475-8c51-395484da10ca-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.279209 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e307d045-9890-4475-8c51-395484da10ca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.279222 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e307d045-9890-4475-8c51-395484da10ca-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.279232 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8b8t2\" (UniqueName: \"kubernetes.io/projected/e307d045-9890-4475-8c51-395484da10ca-kube-api-access-8b8t2\") on node \"crc\" DevicePath \"\"" Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.380159 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9344b7be-b07a-4660-9352-dfdbcecac424-combined-ca-bundle\") pod \"9344b7be-b07a-4660-9352-dfdbcecac424\" (UID: \"9344b7be-b07a-4660-9352-dfdbcecac424\") " Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.380256 4795 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"db-purge-config-data\" (UniqueName: \"kubernetes.io/secret/d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb-db-purge-config-data\") pod \"d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb\" (UID: \"d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb\") " Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.380302 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-purge-config-data\" (UniqueName: \"kubernetes.io/secret/9344b7be-b07a-4660-9352-dfdbcecac424-db-purge-config-data\") pod \"9344b7be-b07a-4660-9352-dfdbcecac424\" (UID: \"9344b7be-b07a-4660-9352-dfdbcecac424\") " Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.380345 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9344b7be-b07a-4660-9352-dfdbcecac424-config-data\") pod \"9344b7be-b07a-4660-9352-dfdbcecac424\" (UID: \"9344b7be-b07a-4660-9352-dfdbcecac424\") " Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.380479 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb-combined-ca-bundle\") pod \"d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb\" (UID: \"d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb\") " Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.380508 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb-config-data\") pod \"d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb\" (UID: \"d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb\") " Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.380555 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4fd9z\" (UniqueName: \"kubernetes.io/projected/d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb-kube-api-access-4fd9z\") pod \"d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb\" 
(UID: \"d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb\") " Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.380592 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kpt9n\" (UniqueName: \"kubernetes.io/projected/9344b7be-b07a-4660-9352-dfdbcecac424-kube-api-access-kpt9n\") pod \"9344b7be-b07a-4660-9352-dfdbcecac424\" (UID: \"9344b7be-b07a-4660-9352-dfdbcecac424\") " Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.384706 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9344b7be-b07a-4660-9352-dfdbcecac424-kube-api-access-kpt9n" (OuterVolumeSpecName: "kube-api-access-kpt9n") pod "9344b7be-b07a-4660-9352-dfdbcecac424" (UID: "9344b7be-b07a-4660-9352-dfdbcecac424"). InnerVolumeSpecName "kube-api-access-kpt9n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.385781 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9344b7be-b07a-4660-9352-dfdbcecac424-db-purge-config-data" (OuterVolumeSpecName: "db-purge-config-data") pod "9344b7be-b07a-4660-9352-dfdbcecac424" (UID: "9344b7be-b07a-4660-9352-dfdbcecac424"). InnerVolumeSpecName "db-purge-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.386307 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb-db-purge-config-data" (OuterVolumeSpecName: "db-purge-config-data") pod "d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb" (UID: "d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb"). InnerVolumeSpecName "db-purge-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.386460 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb-kube-api-access-4fd9z" (OuterVolumeSpecName: "kube-api-access-4fd9z") pod "d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb" (UID: "d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb"). InnerVolumeSpecName "kube-api-access-4fd9z". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.412426 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb-config-data" (OuterVolumeSpecName: "config-data") pod "d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb" (UID: "d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.413038 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9344b7be-b07a-4660-9352-dfdbcecac424-config-data" (OuterVolumeSpecName: "config-data") pod "9344b7be-b07a-4660-9352-dfdbcecac424" (UID: "9344b7be-b07a-4660-9352-dfdbcecac424"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.416520 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9344b7be-b07a-4660-9352-dfdbcecac424-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9344b7be-b07a-4660-9352-dfdbcecac424" (UID: "9344b7be-b07a-4660-9352-dfdbcecac424"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.419477 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb" (UID: "d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.490736 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.490767 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.490786 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4fd9z\" (UniqueName: \"kubernetes.io/projected/d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb-kube-api-access-4fd9z\") on node \"crc\" DevicePath \"\"" Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.490804 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kpt9n\" (UniqueName: \"kubernetes.io/projected/9344b7be-b07a-4660-9352-dfdbcecac424-kube-api-access-kpt9n\") on node \"crc\" DevicePath \"\"" Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.490813 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9344b7be-b07a-4660-9352-dfdbcecac424-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.490822 4795 reconciler_common.go:293] "Volume detached for volume 
\"db-purge-config-data\" (UniqueName: \"kubernetes.io/secret/d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb-db-purge-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.490832 4795 reconciler_common.go:293] "Volume detached for volume \"db-purge-config-data\" (UniqueName: \"kubernetes.io/secret/9344b7be-b07a-4660-9352-dfdbcecac424-db-purge-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.490844 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9344b7be-b07a-4660-9352-dfdbcecac424-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.618456 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-purge-29525761-lcwsj" event={"ID":"9344b7be-b07a-4660-9352-dfdbcecac424","Type":"ContainerDied","Data":"f6461fca43460d2524cc1869187a90e6e92ad799d2b396d33e0f626bc7a5e79e"} Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.618823 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6461fca43460d2524cc1869187a90e6e92ad799d2b396d33e0f626bc7a5e79e" Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.619226 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-purge-29525761-lcwsj" Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.621653 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-purge-29525761-tq4s7" Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.621692 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-purge-29525761-tq4s7" event={"ID":"d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb","Type":"ContainerDied","Data":"735a70899c25a3ec127ed998e46a3e97bab5c4645f2d2852a58f1b4d84ae9c13"} Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.621727 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="735a70899c25a3ec127ed998e46a3e97bab5c4645f2d2852a58f1b4d84ae9c13" Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.630150 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29525761-d78q6" event={"ID":"e307d045-9890-4475-8c51-395484da10ca","Type":"ContainerDied","Data":"4775fd8acc8285432dc75f9369474eda69ed12d7e07fe84286320ce0fe245768"} Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.630196 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29525761-d78q6" Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.630216 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4775fd8acc8285432dc75f9369474eda69ed12d7e07fe84286320ce0fe245768" Feb 20 00:01:07 crc kubenswrapper[4795]: I0220 00:01:07.865294 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-purge-29525761-zbskn" Feb 20 00:01:08 crc kubenswrapper[4795]: I0220 00:01:08.003099 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-purge-config-data\" (UniqueName: \"kubernetes.io/secret/fe324720-6e0b-4d15-bc6e-3875b26bf7f4-db-purge-config-data\") pod \"fe324720-6e0b-4d15-bc6e-3875b26bf7f4\" (UID: \"fe324720-6e0b-4d15-bc6e-3875b26bf7f4\") " Feb 20 00:01:08 crc kubenswrapper[4795]: I0220 00:01:08.003209 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2bpx\" (UniqueName: \"kubernetes.io/projected/fe324720-6e0b-4d15-bc6e-3875b26bf7f4-kube-api-access-r2bpx\") pod \"fe324720-6e0b-4d15-bc6e-3875b26bf7f4\" (UID: \"fe324720-6e0b-4d15-bc6e-3875b26bf7f4\") " Feb 20 00:01:08 crc kubenswrapper[4795]: I0220 00:01:08.003374 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe324720-6e0b-4d15-bc6e-3875b26bf7f4-combined-ca-bundle\") pod \"fe324720-6e0b-4d15-bc6e-3875b26bf7f4\" (UID: \"fe324720-6e0b-4d15-bc6e-3875b26bf7f4\") " Feb 20 00:01:08 crc kubenswrapper[4795]: I0220 00:01:08.003432 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe324720-6e0b-4d15-bc6e-3875b26bf7f4-config-data\") pod \"fe324720-6e0b-4d15-bc6e-3875b26bf7f4\" (UID: \"fe324720-6e0b-4d15-bc6e-3875b26bf7f4\") " Feb 20 00:01:08 crc kubenswrapper[4795]: I0220 00:01:08.008709 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe324720-6e0b-4d15-bc6e-3875b26bf7f4-db-purge-config-data" (OuterVolumeSpecName: "db-purge-config-data") pod "fe324720-6e0b-4d15-bc6e-3875b26bf7f4" (UID: "fe324720-6e0b-4d15-bc6e-3875b26bf7f4"). InnerVolumeSpecName "db-purge-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 00:01:08 crc kubenswrapper[4795]: I0220 00:01:08.008787 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe324720-6e0b-4d15-bc6e-3875b26bf7f4-kube-api-access-r2bpx" (OuterVolumeSpecName: "kube-api-access-r2bpx") pod "fe324720-6e0b-4d15-bc6e-3875b26bf7f4" (UID: "fe324720-6e0b-4d15-bc6e-3875b26bf7f4"). InnerVolumeSpecName "kube-api-access-r2bpx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 00:01:08 crc kubenswrapper[4795]: I0220 00:01:08.031767 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe324720-6e0b-4d15-bc6e-3875b26bf7f4-config-data" (OuterVolumeSpecName: "config-data") pod "fe324720-6e0b-4d15-bc6e-3875b26bf7f4" (UID: "fe324720-6e0b-4d15-bc6e-3875b26bf7f4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 00:01:08 crc kubenswrapper[4795]: I0220 00:01:08.036845 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe324720-6e0b-4d15-bc6e-3875b26bf7f4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fe324720-6e0b-4d15-bc6e-3875b26bf7f4" (UID: "fe324720-6e0b-4d15-bc6e-3875b26bf7f4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 00:01:08 crc kubenswrapper[4795]: I0220 00:01:08.105923 4795 reconciler_common.go:293] "Volume detached for volume \"db-purge-config-data\" (UniqueName: \"kubernetes.io/secret/fe324720-6e0b-4d15-bc6e-3875b26bf7f4-db-purge-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 00:01:08 crc kubenswrapper[4795]: I0220 00:01:08.105959 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r2bpx\" (UniqueName: \"kubernetes.io/projected/fe324720-6e0b-4d15-bc6e-3875b26bf7f4-kube-api-access-r2bpx\") on node \"crc\" DevicePath \"\"" Feb 20 00:01:08 crc kubenswrapper[4795]: I0220 00:01:08.105969 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe324720-6e0b-4d15-bc6e-3875b26bf7f4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 00:01:08 crc kubenswrapper[4795]: I0220 00:01:08.105978 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe324720-6e0b-4d15-bc6e-3875b26bf7f4-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 00:01:08 crc kubenswrapper[4795]: I0220 00:01:08.645322 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-purge-29525761-zbskn" event={"ID":"fe324720-6e0b-4d15-bc6e-3875b26bf7f4","Type":"ContainerDied","Data":"d245ab4deae1c20cb273c4781102e64ba81f2352131918cd3603be3874e3b24b"} Feb 20 00:01:08 crc kubenswrapper[4795]: I0220 00:01:08.645363 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d245ab4deae1c20cb273c4781102e64ba81f2352131918cd3603be3874e3b24b" Feb 20 00:01:08 crc kubenswrapper[4795]: I0220 00:01:08.645423 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-purge-29525761-zbskn" Feb 20 00:01:15 crc kubenswrapper[4795]: I0220 00:01:15.511907 4795 scope.go:117] "RemoveContainer" containerID="f417d522e5357173b817e8b838d944aa5e17352006ff9b8fed75f316b86d76c7" Feb 20 00:01:15 crc kubenswrapper[4795]: E0220 00:01:15.512779 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 20 00:01:27 crc kubenswrapper[4795]: I0220 00:01:27.511746 4795 scope.go:117] "RemoveContainer" containerID="f417d522e5357173b817e8b838d944aa5e17352006ff9b8fed75f316b86d76c7" Feb 20 00:01:27 crc kubenswrapper[4795]: E0220 00:01:27.512503 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 20 00:01:40 crc kubenswrapper[4795]: I0220 00:01:40.511952 4795 scope.go:117] "RemoveContainer" containerID="f417d522e5357173b817e8b838d944aa5e17352006ff9b8fed75f316b86d76c7" Feb 20 00:01:40 crc kubenswrapper[4795]: E0220 00:01:40.513864 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 20 00:01:51 crc kubenswrapper[4795]: I0220 00:01:51.512476 4795 scope.go:117] "RemoveContainer" containerID="f417d522e5357173b817e8b838d944aa5e17352006ff9b8fed75f316b86d76c7" Feb 20 00:01:51 crc kubenswrapper[4795]: E0220 00:01:51.513618 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 20 00:02:02 crc kubenswrapper[4795]: I0220 00:02:02.512211 4795 scope.go:117] "RemoveContainer" containerID="f417d522e5357173b817e8b838d944aa5e17352006ff9b8fed75f316b86d76c7" Feb 20 00:02:02 crc kubenswrapper[4795]: E0220 00:02:02.512971 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 20 00:02:13 crc kubenswrapper[4795]: I0220 00:02:13.511981 4795 scope.go:117] "RemoveContainer" containerID="f417d522e5357173b817e8b838d944aa5e17352006ff9b8fed75f316b86d76c7" Feb 20 00:02:13 crc kubenswrapper[4795]: E0220 00:02:13.512876 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 20 00:02:27 crc kubenswrapper[4795]: I0220 00:02:27.513366 4795 scope.go:117] "RemoveContainer" containerID="f417d522e5357173b817e8b838d944aa5e17352006ff9b8fed75f316b86d76c7" Feb 20 00:02:27 crc kubenswrapper[4795]: E0220 00:02:27.514732 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 20 00:02:29 crc kubenswrapper[4795]: I0220 00:02:29.357845 4795 scope.go:117] "RemoveContainer" containerID="de244572ed412b6d15883ac7be00a681de7409d595350d300ebc7aac8f96c690" Feb 20 00:02:29 crc kubenswrapper[4795]: I0220 00:02:29.381859 4795 scope.go:117] "RemoveContainer" containerID="62deb3f2bf568b395b46bc81de81ee26457a95f02f006897224aa24bc7dac271" Feb 20 00:02:29 crc kubenswrapper[4795]: I0220 00:02:29.425719 4795 scope.go:117] "RemoveContainer" containerID="3644eb2ff205a6ecbf078ee98cc404ff628a9964f7f066ce4f38a2487488db5f" Feb 20 00:02:29 crc kubenswrapper[4795]: I0220 00:02:29.479240 4795 scope.go:117] "RemoveContainer" containerID="6d065a9d149243251b6b58ebaf53b077b0910d5423b5a4944782ee17e37027c2" Feb 20 00:02:29 crc kubenswrapper[4795]: I0220 00:02:29.523358 4795 scope.go:117] "RemoveContainer" containerID="07a4317117426bae86bf07c8d5e610095ccbae15e950025068b03dd800e0da4a" Feb 20 00:02:38 crc kubenswrapper[4795]: I0220 00:02:38.512543 4795 scope.go:117] "RemoveContainer" containerID="f417d522e5357173b817e8b838d944aa5e17352006ff9b8fed75f316b86d76c7" Feb 20 00:02:38 crc 
kubenswrapper[4795]: E0220 00:02:38.513662 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 20 00:02:52 crc kubenswrapper[4795]: I0220 00:02:52.512137 4795 scope.go:117] "RemoveContainer" containerID="f417d522e5357173b817e8b838d944aa5e17352006ff9b8fed75f316b86d76c7" Feb 20 00:02:52 crc kubenswrapper[4795]: E0220 00:02:52.513264 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 20 00:03:05 crc kubenswrapper[4795]: I0220 00:03:05.512364 4795 scope.go:117] "RemoveContainer" containerID="f417d522e5357173b817e8b838d944aa5e17352006ff9b8fed75f316b86d76c7" Feb 20 00:03:05 crc kubenswrapper[4795]: E0220 00:03:05.513370 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 20 00:03:20 crc kubenswrapper[4795]: I0220 00:03:20.512148 4795 scope.go:117] "RemoveContainer" containerID="f417d522e5357173b817e8b838d944aa5e17352006ff9b8fed75f316b86d76c7" Feb 
20 00:03:20 crc kubenswrapper[4795]: E0220 00:03:20.513141 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 20 00:03:32 crc kubenswrapper[4795]: I0220 00:03:32.512555 4795 scope.go:117] "RemoveContainer" containerID="f417d522e5357173b817e8b838d944aa5e17352006ff9b8fed75f316b86d76c7" Feb 20 00:03:32 crc kubenswrapper[4795]: E0220 00:03:32.513409 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 20 00:03:45 crc kubenswrapper[4795]: I0220 00:03:45.512914 4795 scope.go:117] "RemoveContainer" containerID="f417d522e5357173b817e8b838d944aa5e17352006ff9b8fed75f316b86d76c7" Feb 20 00:03:45 crc kubenswrapper[4795]: E0220 00:03:45.513848 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 20 00:03:59 crc kubenswrapper[4795]: I0220 00:03:59.519010 4795 scope.go:117] "RemoveContainer" 
containerID="f417d522e5357173b817e8b838d944aa5e17352006ff9b8fed75f316b86d76c7" Feb 20 00:03:59 crc kubenswrapper[4795]: E0220 00:03:59.520122 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 20 00:04:12 crc kubenswrapper[4795]: I0220 00:04:12.512071 4795 scope.go:117] "RemoveContainer" containerID="f417d522e5357173b817e8b838d944aa5e17352006ff9b8fed75f316b86d76c7" Feb 20 00:04:12 crc kubenswrapper[4795]: E0220 00:04:12.528860 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 20 00:04:25 crc kubenswrapper[4795]: I0220 00:04:25.798371 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-t6sqk"] Feb 20 00:04:25 crc kubenswrapper[4795]: E0220 00:04:25.799241 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe324720-6e0b-4d15-bc6e-3875b26bf7f4" containerName="cinder-db-purge" Feb 20 00:04:25 crc kubenswrapper[4795]: I0220 00:04:25.799254 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe324720-6e0b-4d15-bc6e-3875b26bf7f4" containerName="cinder-db-purge" Feb 20 00:04:25 crc kubenswrapper[4795]: E0220 00:04:25.799269 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dc6e226-d501-4698-b49c-f07fc8e80339" 
containerName="manila-db-purge" Feb 20 00:04:25 crc kubenswrapper[4795]: I0220 00:04:25.799275 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dc6e226-d501-4698-b49c-f07fc8e80339" containerName="manila-db-purge" Feb 20 00:04:25 crc kubenswrapper[4795]: E0220 00:04:25.799299 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9344b7be-b07a-4660-9352-dfdbcecac424" containerName="glance-dbpurge" Feb 20 00:04:25 crc kubenswrapper[4795]: I0220 00:04:25.799305 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9344b7be-b07a-4660-9352-dfdbcecac424" containerName="glance-dbpurge" Feb 20 00:04:25 crc kubenswrapper[4795]: E0220 00:04:25.799320 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e307d045-9890-4475-8c51-395484da10ca" containerName="keystone-cron" Feb 20 00:04:25 crc kubenswrapper[4795]: I0220 00:04:25.799326 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e307d045-9890-4475-8c51-395484da10ca" containerName="keystone-cron" Feb 20 00:04:25 crc kubenswrapper[4795]: E0220 00:04:25.799333 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb" containerName="heat-dbpurge" Feb 20 00:04:25 crc kubenswrapper[4795]: I0220 00:04:25.799338 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb" containerName="heat-dbpurge" Feb 20 00:04:25 crc kubenswrapper[4795]: I0220 00:04:25.799515 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe324720-6e0b-4d15-bc6e-3875b26bf7f4" containerName="cinder-db-purge" Feb 20 00:04:25 crc kubenswrapper[4795]: I0220 00:04:25.799530 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="e307d045-9890-4475-8c51-395484da10ca" containerName="keystone-cron" Feb 20 00:04:25 crc kubenswrapper[4795]: I0220 00:04:25.799539 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb" 
containerName="heat-dbpurge" Feb 20 00:04:25 crc kubenswrapper[4795]: I0220 00:04:25.799558 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="8dc6e226-d501-4698-b49c-f07fc8e80339" containerName="manila-db-purge" Feb 20 00:04:25 crc kubenswrapper[4795]: I0220 00:04:25.799568 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="9344b7be-b07a-4660-9352-dfdbcecac424" containerName="glance-dbpurge" Feb 20 00:04:25 crc kubenswrapper[4795]: I0220 00:04:25.801022 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t6sqk" Feb 20 00:04:25 crc kubenswrapper[4795]: I0220 00:04:25.826898 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-t6sqk"] Feb 20 00:04:25 crc kubenswrapper[4795]: I0220 00:04:25.913944 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77dfee48-ba66-4c86-80c3-47b740e7e1c3-catalog-content\") pod \"certified-operators-t6sqk\" (UID: \"77dfee48-ba66-4c86-80c3-47b740e7e1c3\") " pod="openshift-marketplace/certified-operators-t6sqk" Feb 20 00:04:25 crc kubenswrapper[4795]: I0220 00:04:25.914024 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77dfee48-ba66-4c86-80c3-47b740e7e1c3-utilities\") pod \"certified-operators-t6sqk\" (UID: \"77dfee48-ba66-4c86-80c3-47b740e7e1c3\") " pod="openshift-marketplace/certified-operators-t6sqk" Feb 20 00:04:25 crc kubenswrapper[4795]: I0220 00:04:25.914335 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pkd6\" (UniqueName: \"kubernetes.io/projected/77dfee48-ba66-4c86-80c3-47b740e7e1c3-kube-api-access-9pkd6\") pod \"certified-operators-t6sqk\" (UID: \"77dfee48-ba66-4c86-80c3-47b740e7e1c3\") " 
pod="openshift-marketplace/certified-operators-t6sqk" Feb 20 00:04:26 crc kubenswrapper[4795]: I0220 00:04:26.017111 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77dfee48-ba66-4c86-80c3-47b740e7e1c3-utilities\") pod \"certified-operators-t6sqk\" (UID: \"77dfee48-ba66-4c86-80c3-47b740e7e1c3\") " pod="openshift-marketplace/certified-operators-t6sqk" Feb 20 00:04:26 crc kubenswrapper[4795]: I0220 00:04:26.017231 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pkd6\" (UniqueName: \"kubernetes.io/projected/77dfee48-ba66-4c86-80c3-47b740e7e1c3-kube-api-access-9pkd6\") pod \"certified-operators-t6sqk\" (UID: \"77dfee48-ba66-4c86-80c3-47b740e7e1c3\") " pod="openshift-marketplace/certified-operators-t6sqk" Feb 20 00:04:26 crc kubenswrapper[4795]: I0220 00:04:26.017382 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77dfee48-ba66-4c86-80c3-47b740e7e1c3-catalog-content\") pod \"certified-operators-t6sqk\" (UID: \"77dfee48-ba66-4c86-80c3-47b740e7e1c3\") " pod="openshift-marketplace/certified-operators-t6sqk" Feb 20 00:04:26 crc kubenswrapper[4795]: I0220 00:04:26.017915 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77dfee48-ba66-4c86-80c3-47b740e7e1c3-utilities\") pod \"certified-operators-t6sqk\" (UID: \"77dfee48-ba66-4c86-80c3-47b740e7e1c3\") " pod="openshift-marketplace/certified-operators-t6sqk" Feb 20 00:04:26 crc kubenswrapper[4795]: I0220 00:04:26.017923 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77dfee48-ba66-4c86-80c3-47b740e7e1c3-catalog-content\") pod \"certified-operators-t6sqk\" (UID: \"77dfee48-ba66-4c86-80c3-47b740e7e1c3\") " 
pod="openshift-marketplace/certified-operators-t6sqk" Feb 20 00:04:26 crc kubenswrapper[4795]: I0220 00:04:26.043038 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pkd6\" (UniqueName: \"kubernetes.io/projected/77dfee48-ba66-4c86-80c3-47b740e7e1c3-kube-api-access-9pkd6\") pod \"certified-operators-t6sqk\" (UID: \"77dfee48-ba66-4c86-80c3-47b740e7e1c3\") " pod="openshift-marketplace/certified-operators-t6sqk" Feb 20 00:04:26 crc kubenswrapper[4795]: I0220 00:04:26.184277 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t6sqk" Feb 20 00:04:26 crc kubenswrapper[4795]: I0220 00:04:26.513434 4795 scope.go:117] "RemoveContainer" containerID="f417d522e5357173b817e8b838d944aa5e17352006ff9b8fed75f316b86d76c7" Feb 20 00:04:26 crc kubenswrapper[4795]: E0220 00:04:26.513990 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 20 00:04:26 crc kubenswrapper[4795]: I0220 00:04:26.890347 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-t6sqk"] Feb 20 00:04:27 crc kubenswrapper[4795]: I0220 00:04:27.860070 4795 generic.go:334] "Generic (PLEG): container finished" podID="77dfee48-ba66-4c86-80c3-47b740e7e1c3" containerID="ffbea51d0633b74fb6f4a168487b22e2cd9e687d22b452b5402426703436c8ae" exitCode=0 Feb 20 00:04:27 crc kubenswrapper[4795]: I0220 00:04:27.860141 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t6sqk" 
event={"ID":"77dfee48-ba66-4c86-80c3-47b740e7e1c3","Type":"ContainerDied","Data":"ffbea51d0633b74fb6f4a168487b22e2cd9e687d22b452b5402426703436c8ae"} Feb 20 00:04:27 crc kubenswrapper[4795]: I0220 00:04:27.860425 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t6sqk" event={"ID":"77dfee48-ba66-4c86-80c3-47b740e7e1c3","Type":"ContainerStarted","Data":"7ac04f15babc163d9a654978f2668463945a212502dd56a6aa7609a7a093b1d6"} Feb 20 00:04:27 crc kubenswrapper[4795]: I0220 00:04:27.862821 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 00:04:29 crc kubenswrapper[4795]: I0220 00:04:29.883444 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t6sqk" event={"ID":"77dfee48-ba66-4c86-80c3-47b740e7e1c3","Type":"ContainerStarted","Data":"d1a5d4597ee012843d01343335af26b459961559661e432712580b5de4255a09"} Feb 20 00:04:30 crc kubenswrapper[4795]: I0220 00:04:30.928933 4795 generic.go:334] "Generic (PLEG): container finished" podID="77dfee48-ba66-4c86-80c3-47b740e7e1c3" containerID="d1a5d4597ee012843d01343335af26b459961559661e432712580b5de4255a09" exitCode=0 Feb 20 00:04:30 crc kubenswrapper[4795]: I0220 00:04:30.929047 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t6sqk" event={"ID":"77dfee48-ba66-4c86-80c3-47b740e7e1c3","Type":"ContainerDied","Data":"d1a5d4597ee012843d01343335af26b459961559661e432712580b5de4255a09"} Feb 20 00:04:31 crc kubenswrapper[4795]: I0220 00:04:31.943719 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t6sqk" event={"ID":"77dfee48-ba66-4c86-80c3-47b740e7e1c3","Type":"ContainerStarted","Data":"221073919b4e858db2f48f769cfa321f21dd303aee546b34150f0cffd1381916"} Feb 20 00:04:31 crc kubenswrapper[4795]: I0220 00:04:31.977838 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/certified-operators-t6sqk" podStartSLOduration=3.365831653 podStartE2EDuration="6.977820366s" podCreationTimestamp="2026-02-20 00:04:25 +0000 UTC" firstStartedPulling="2026-02-20 00:04:27.862479948 +0000 UTC m=+9379.054997812" lastFinishedPulling="2026-02-20 00:04:31.474468641 +0000 UTC m=+9382.666986525" observedRunningTime="2026-02-20 00:04:31.967973488 +0000 UTC m=+9383.160491382" watchObservedRunningTime="2026-02-20 00:04:31.977820366 +0000 UTC m=+9383.170338230" Feb 20 00:04:36 crc kubenswrapper[4795]: I0220 00:04:36.184521 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-t6sqk" Feb 20 00:04:36 crc kubenswrapper[4795]: I0220 00:04:36.185005 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-t6sqk" Feb 20 00:04:36 crc kubenswrapper[4795]: I0220 00:04:36.236533 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-t6sqk" Feb 20 00:04:37 crc kubenswrapper[4795]: I0220 00:04:37.057909 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-t6sqk" Feb 20 00:04:37 crc kubenswrapper[4795]: I0220 00:04:37.112567 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-t6sqk"] Feb 20 00:04:39 crc kubenswrapper[4795]: I0220 00:04:39.019065 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-t6sqk" podUID="77dfee48-ba66-4c86-80c3-47b740e7e1c3" containerName="registry-server" containerID="cri-o://221073919b4e858db2f48f769cfa321f21dd303aee546b34150f0cffd1381916" gracePeriod=2 Feb 20 00:04:39 crc kubenswrapper[4795]: I0220 00:04:39.522582 4795 scope.go:117] "RemoveContainer" containerID="f417d522e5357173b817e8b838d944aa5e17352006ff9b8fed75f316b86d76c7" Feb 20 00:04:39 
crc kubenswrapper[4795]: E0220 00:04:39.523423 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 20 00:04:39 crc kubenswrapper[4795]: I0220 00:04:39.581273 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t6sqk" Feb 20 00:04:39 crc kubenswrapper[4795]: I0220 00:04:39.623023 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9pkd6\" (UniqueName: \"kubernetes.io/projected/77dfee48-ba66-4c86-80c3-47b740e7e1c3-kube-api-access-9pkd6\") pod \"77dfee48-ba66-4c86-80c3-47b740e7e1c3\" (UID: \"77dfee48-ba66-4c86-80c3-47b740e7e1c3\") " Feb 20 00:04:39 crc kubenswrapper[4795]: I0220 00:04:39.623233 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77dfee48-ba66-4c86-80c3-47b740e7e1c3-catalog-content\") pod \"77dfee48-ba66-4c86-80c3-47b740e7e1c3\" (UID: \"77dfee48-ba66-4c86-80c3-47b740e7e1c3\") " Feb 20 00:04:39 crc kubenswrapper[4795]: I0220 00:04:39.623371 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77dfee48-ba66-4c86-80c3-47b740e7e1c3-utilities\") pod \"77dfee48-ba66-4c86-80c3-47b740e7e1c3\" (UID: \"77dfee48-ba66-4c86-80c3-47b740e7e1c3\") " Feb 20 00:04:39 crc kubenswrapper[4795]: I0220 00:04:39.625096 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77dfee48-ba66-4c86-80c3-47b740e7e1c3-utilities" (OuterVolumeSpecName: "utilities") pod 
"77dfee48-ba66-4c86-80c3-47b740e7e1c3" (UID: "77dfee48-ba66-4c86-80c3-47b740e7e1c3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 00:04:39 crc kubenswrapper[4795]: I0220 00:04:39.632386 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77dfee48-ba66-4c86-80c3-47b740e7e1c3-kube-api-access-9pkd6" (OuterVolumeSpecName: "kube-api-access-9pkd6") pod "77dfee48-ba66-4c86-80c3-47b740e7e1c3" (UID: "77dfee48-ba66-4c86-80c3-47b740e7e1c3"). InnerVolumeSpecName "kube-api-access-9pkd6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 00:04:39 crc kubenswrapper[4795]: I0220 00:04:39.685581 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77dfee48-ba66-4c86-80c3-47b740e7e1c3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "77dfee48-ba66-4c86-80c3-47b740e7e1c3" (UID: "77dfee48-ba66-4c86-80c3-47b740e7e1c3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 00:04:39 crc kubenswrapper[4795]: I0220 00:04:39.726085 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9pkd6\" (UniqueName: \"kubernetes.io/projected/77dfee48-ba66-4c86-80c3-47b740e7e1c3-kube-api-access-9pkd6\") on node \"crc\" DevicePath \"\"" Feb 20 00:04:39 crc kubenswrapper[4795]: I0220 00:04:39.726127 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77dfee48-ba66-4c86-80c3-47b740e7e1c3-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 00:04:39 crc kubenswrapper[4795]: I0220 00:04:39.726139 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77dfee48-ba66-4c86-80c3-47b740e7e1c3-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 00:04:40 crc kubenswrapper[4795]: I0220 00:04:40.067710 4795 generic.go:334] "Generic (PLEG): container finished" podID="77dfee48-ba66-4c86-80c3-47b740e7e1c3" containerID="221073919b4e858db2f48f769cfa321f21dd303aee546b34150f0cffd1381916" exitCode=0 Feb 20 00:04:40 crc kubenswrapper[4795]: I0220 00:04:40.067811 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-t6sqk" Feb 20 00:04:40 crc kubenswrapper[4795]: I0220 00:04:40.067802 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t6sqk" event={"ID":"77dfee48-ba66-4c86-80c3-47b740e7e1c3","Type":"ContainerDied","Data":"221073919b4e858db2f48f769cfa321f21dd303aee546b34150f0cffd1381916"} Feb 20 00:04:40 crc kubenswrapper[4795]: I0220 00:04:40.068409 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t6sqk" event={"ID":"77dfee48-ba66-4c86-80c3-47b740e7e1c3","Type":"ContainerDied","Data":"7ac04f15babc163d9a654978f2668463945a212502dd56a6aa7609a7a093b1d6"} Feb 20 00:04:40 crc kubenswrapper[4795]: I0220 00:04:40.068431 4795 scope.go:117] "RemoveContainer" containerID="221073919b4e858db2f48f769cfa321f21dd303aee546b34150f0cffd1381916" Feb 20 00:04:40 crc kubenswrapper[4795]: I0220 00:04:40.092468 4795 scope.go:117] "RemoveContainer" containerID="d1a5d4597ee012843d01343335af26b459961559661e432712580b5de4255a09" Feb 20 00:04:40 crc kubenswrapper[4795]: I0220 00:04:40.117046 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-t6sqk"] Feb 20 00:04:40 crc kubenswrapper[4795]: I0220 00:04:40.122987 4795 scope.go:117] "RemoveContainer" containerID="ffbea51d0633b74fb6f4a168487b22e2cd9e687d22b452b5402426703436c8ae" Feb 20 00:04:40 crc kubenswrapper[4795]: I0220 00:04:40.141133 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-t6sqk"] Feb 20 00:04:40 crc kubenswrapper[4795]: I0220 00:04:40.176220 4795 scope.go:117] "RemoveContainer" containerID="221073919b4e858db2f48f769cfa321f21dd303aee546b34150f0cffd1381916" Feb 20 00:04:40 crc kubenswrapper[4795]: E0220 00:04:40.176742 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"221073919b4e858db2f48f769cfa321f21dd303aee546b34150f0cffd1381916\": container with ID starting with 221073919b4e858db2f48f769cfa321f21dd303aee546b34150f0cffd1381916 not found: ID does not exist" containerID="221073919b4e858db2f48f769cfa321f21dd303aee546b34150f0cffd1381916" Feb 20 00:04:40 crc kubenswrapper[4795]: I0220 00:04:40.176785 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"221073919b4e858db2f48f769cfa321f21dd303aee546b34150f0cffd1381916"} err="failed to get container status \"221073919b4e858db2f48f769cfa321f21dd303aee546b34150f0cffd1381916\": rpc error: code = NotFound desc = could not find container \"221073919b4e858db2f48f769cfa321f21dd303aee546b34150f0cffd1381916\": container with ID starting with 221073919b4e858db2f48f769cfa321f21dd303aee546b34150f0cffd1381916 not found: ID does not exist" Feb 20 00:04:40 crc kubenswrapper[4795]: I0220 00:04:40.176808 4795 scope.go:117] "RemoveContainer" containerID="d1a5d4597ee012843d01343335af26b459961559661e432712580b5de4255a09" Feb 20 00:04:40 crc kubenswrapper[4795]: E0220 00:04:40.177223 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1a5d4597ee012843d01343335af26b459961559661e432712580b5de4255a09\": container with ID starting with d1a5d4597ee012843d01343335af26b459961559661e432712580b5de4255a09 not found: ID does not exist" containerID="d1a5d4597ee012843d01343335af26b459961559661e432712580b5de4255a09" Feb 20 00:04:40 crc kubenswrapper[4795]: I0220 00:04:40.178770 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1a5d4597ee012843d01343335af26b459961559661e432712580b5de4255a09"} err="failed to get container status \"d1a5d4597ee012843d01343335af26b459961559661e432712580b5de4255a09\": rpc error: code = NotFound desc = could not find container \"d1a5d4597ee012843d01343335af26b459961559661e432712580b5de4255a09\": container with ID 
starting with d1a5d4597ee012843d01343335af26b459961559661e432712580b5de4255a09 not found: ID does not exist" Feb 20 00:04:40 crc kubenswrapper[4795]: I0220 00:04:40.178826 4795 scope.go:117] "RemoveContainer" containerID="ffbea51d0633b74fb6f4a168487b22e2cd9e687d22b452b5402426703436c8ae" Feb 20 00:04:40 crc kubenswrapper[4795]: E0220 00:04:40.179239 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffbea51d0633b74fb6f4a168487b22e2cd9e687d22b452b5402426703436c8ae\": container with ID starting with ffbea51d0633b74fb6f4a168487b22e2cd9e687d22b452b5402426703436c8ae not found: ID does not exist" containerID="ffbea51d0633b74fb6f4a168487b22e2cd9e687d22b452b5402426703436c8ae" Feb 20 00:04:40 crc kubenswrapper[4795]: I0220 00:04:40.179273 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffbea51d0633b74fb6f4a168487b22e2cd9e687d22b452b5402426703436c8ae"} err="failed to get container status \"ffbea51d0633b74fb6f4a168487b22e2cd9e687d22b452b5402426703436c8ae\": rpc error: code = NotFound desc = could not find container \"ffbea51d0633b74fb6f4a168487b22e2cd9e687d22b452b5402426703436c8ae\": container with ID starting with ffbea51d0633b74fb6f4a168487b22e2cd9e687d22b452b5402426703436c8ae not found: ID does not exist" Feb 20 00:04:41 crc kubenswrapper[4795]: I0220 00:04:41.524957 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77dfee48-ba66-4c86-80c3-47b740e7e1c3" path="/var/lib/kubelet/pods/77dfee48-ba66-4c86-80c3-47b740e7e1c3/volumes" Feb 20 00:04:53 crc kubenswrapper[4795]: I0220 00:04:53.513936 4795 scope.go:117] "RemoveContainer" containerID="f417d522e5357173b817e8b838d944aa5e17352006ff9b8fed75f316b86d76c7" Feb 20 00:04:53 crc kubenswrapper[4795]: E0220 00:04:53.514866 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" Feb 20 00:05:02 crc kubenswrapper[4795]: I0220 00:05:02.042951 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_b5b60b6d-7ecf-424d-a297-f98fae5ef0a3/init-config-reloader/0.log" Feb 20 00:05:02 crc kubenswrapper[4795]: I0220 00:05:02.192305 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_b5b60b6d-7ecf-424d-a297-f98fae5ef0a3/init-config-reloader/0.log" Feb 20 00:05:02 crc kubenswrapper[4795]: I0220 00:05:02.270002 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_b5b60b6d-7ecf-424d-a297-f98fae5ef0a3/alertmanager/0.log" Feb 20 00:05:02 crc kubenswrapper[4795]: I0220 00:05:02.328934 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_b5b60b6d-7ecf-424d-a297-f98fae5ef0a3/config-reloader/0.log" Feb 20 00:05:02 crc kubenswrapper[4795]: I0220 00:05:02.462906 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_acb98719-7401-4241-8361-070eb67879c7/aodh-api/0.log" Feb 20 00:05:03 crc kubenswrapper[4795]: I0220 00:05:03.088501 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_acb98719-7401-4241-8361-070eb67879c7/aodh-evaluator/0.log" Feb 20 00:05:03 crc kubenswrapper[4795]: I0220 00:05:03.111580 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_acb98719-7401-4241-8361-070eb67879c7/aodh-notifier/0.log" Feb 20 00:05:03 crc kubenswrapper[4795]: I0220 00:05:03.124364 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_acb98719-7401-4241-8361-070eb67879c7/aodh-listener/0.log" Feb 20 
00:05:03 crc kubenswrapper[4795]: I0220 00:05:03.292243 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-57b58f479d-8dz8t_30f7c03f-5289-48c5-987e-b808897adc6d/barbican-api/0.log" Feb 20 00:05:03 crc kubenswrapper[4795]: I0220 00:05:03.382723 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-57b58f479d-8dz8t_30f7c03f-5289-48c5-987e-b808897adc6d/barbican-api-log/0.log" Feb 20 00:05:03 crc kubenswrapper[4795]: I0220 00:05:03.467009 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6b9b47c4f6-2kzbr_5bb2f008-145f-4fc9-9d51-065874ab1b1e/barbican-keystone-listener/0.log" Feb 20 00:05:03 crc kubenswrapper[4795]: I0220 00:05:03.519082 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6b9b47c4f6-2kzbr_5bb2f008-145f-4fc9-9d51-065874ab1b1e/barbican-keystone-listener-log/0.log" Feb 20 00:05:03 crc kubenswrapper[4795]: I0220 00:05:03.601326 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7c6d9dfdbf-zg9wc_3e6a3af4-fd31-411b-833c-5a39501f5d63/barbican-worker/0.log" Feb 20 00:05:03 crc kubenswrapper[4795]: I0220 00:05:03.897123 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7c6d9dfdbf-zg9wc_3e6a3af4-fd31-411b-833c-5a39501f5d63/barbican-worker-log/0.log" Feb 20 00:05:04 crc kubenswrapper[4795]: I0220 00:05:04.079477 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-openstack-openstack-cell1-rmhfb_a340b6fa-989d-4c4a-ae2f-3fbf6339fdaa/bootstrap-openstack-openstack-cell1/0.log" Feb 20 00:05:04 crc kubenswrapper[4795]: I0220 00:05:04.186835 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_87a2d0b8-5866-4d88-ab91-cd94c2136c6c/ceilometer-central-agent/0.log" Feb 20 00:05:04 crc kubenswrapper[4795]: I0220 00:05:04.194752 4795 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_ceilometer-0_87a2d0b8-5866-4d88-ab91-cd94c2136c6c/ceilometer-notification-agent/0.log" Feb 20 00:05:04 crc kubenswrapper[4795]: I0220 00:05:04.330364 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_87a2d0b8-5866-4d88-ab91-cd94c2136c6c/sg-core/0.log" Feb 20 00:05:04 crc kubenswrapper[4795]: I0220 00:05:04.336341 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_87a2d0b8-5866-4d88-ab91-cd94c2136c6c/proxy-httpd/0.log" Feb 20 00:05:04 crc kubenswrapper[4795]: I0220 00:05:04.396926 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-openstack-openstack-cell1-zjjhg_532484aa-8294-4c2d-b257-082b09bafb14/ceph-client-openstack-openstack-cell1/0.log" Feb 20 00:05:04 crc kubenswrapper[4795]: I0220 00:05:04.511662 4795 scope.go:117] "RemoveContainer" containerID="f417d522e5357173b817e8b838d944aa5e17352006ff9b8fed75f316b86d76c7" Feb 20 00:05:04 crc kubenswrapper[4795]: I0220 00:05:04.636960 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_daeb9555-6d76-45ca-b3da-b6dd91c33e00/cinder-api/0.log" Feb 20 00:05:04 crc kubenswrapper[4795]: I0220 00:05:04.651584 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_daeb9555-6d76-45ca-b3da-b6dd91c33e00/cinder-api-log/0.log" Feb 20 00:05:04 crc kubenswrapper[4795]: I0220 00:05:04.915868 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_12de80a7-e42b-4768-83d4-0ed7d7490c30/probe/0.log" Feb 20 00:05:04 crc kubenswrapper[4795]: I0220 00:05:04.942794 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_12de80a7-e42b-4768-83d4-0ed7d7490c30/cinder-backup/0.log" Feb 20 00:05:04 crc kubenswrapper[4795]: I0220 00:05:04.971799 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-db-purge-29525761-zbskn_fe324720-6e0b-4d15-bc6e-3875b26bf7f4/cinder-db-purge/0.log" Feb 20 00:05:05 crc kubenswrapper[4795]: I0220 00:05:05.151388 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_85502c41-99ab-4a8f-9c36-f4d839b931a1/cinder-scheduler/0.log" Feb 20 00:05:05 crc kubenswrapper[4795]: I0220 00:05:05.223602 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_85502c41-99ab-4a8f-9c36-f4d839b931a1/probe/0.log" Feb 20 00:05:05 crc kubenswrapper[4795]: I0220 00:05:05.328966 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerStarted","Data":"bbe670b74801c9c3513113ce23971acfb0538080cd5b8205d654717a9d255d0d"} Feb 20 00:05:05 crc kubenswrapper[4795]: I0220 00:05:05.474337 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_90e22321-4464-4199-b873-8998821a02ed/probe/0.log" Feb 20 00:05:05 crc kubenswrapper[4795]: I0220 00:05:05.493743 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_90e22321-4464-4199-b873-8998821a02ed/cinder-volume/0.log" Feb 20 00:05:05 crc kubenswrapper[4795]: I0220 00:05:05.509777 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-openstack-openstack-cell1-2twf8_5a293bce-3326-47c0-a9b5-b5af13dc46c8/configure-network-openstack-openstack-cell1/0.log" Feb 20 00:05:05 crc kubenswrapper[4795]: I0220 00:05:05.690080 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-openstack-openstack-cell1-lbssf_0d850dd7-a1bb-42db-893b-b96eebee4c9c/configure-os-openstack-openstack-cell1/0.log" Feb 20 00:05:05 crc kubenswrapper[4795]: I0220 00:05:05.768690 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_dnsmasq-dns-67d97dc55-pvbjd_f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0/init/0.log" Feb 20 00:05:05 crc kubenswrapper[4795]: I0220 00:05:05.914725 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-67d97dc55-pvbjd_f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0/init/0.log" Feb 20 00:05:05 crc kubenswrapper[4795]: I0220 00:05:05.948091 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-67d97dc55-pvbjd_f23b564a-cce9-4b4f-8d1a-73aa7e2a4be0/dnsmasq-dns/0.log" Feb 20 00:05:05 crc kubenswrapper[4795]: I0220 00:05:05.980004 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-openstack-openstack-cell1-wfcx4_69464400-c61c-41bd-aeeb-984f7f948a16/download-cache-openstack-openstack-cell1/0.log" Feb 20 00:05:06 crc kubenswrapper[4795]: I0220 00:05:06.186762 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-db-purge-29525761-lcwsj_9344b7be-b07a-4660-9352-dfdbcecac424/glance-dbpurge/0.log" Feb 20 00:05:06 crc kubenswrapper[4795]: I0220 00:05:06.202036 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_bd4ac280-c0e4-46e3-95c8-5e051c96f32e/glance-httpd/0.log" Feb 20 00:05:06 crc kubenswrapper[4795]: I0220 00:05:06.254066 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_bd4ac280-c0e4-46e3-95c8-5e051c96f32e/glance-log/0.log" Feb 20 00:05:06 crc kubenswrapper[4795]: I0220 00:05:06.407076 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_b7d41b06-abb7-4a30-a29c-3b9d66706d8f/glance-httpd/0.log" Feb 20 00:05:06 crc kubenswrapper[4795]: I0220 00:05:06.432714 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_b7d41b06-abb7-4a30-a29c-3b9d66706d8f/glance-log/0.log" Feb 20 00:05:06 crc kubenswrapper[4795]: I0220 00:05:06.587726 4795 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-57679899bc-rj6x7_058c5b61-3ec2-4a88-bea8-59843d00750c/heat-api/0.log" Feb 20 00:05:06 crc kubenswrapper[4795]: I0220 00:05:06.719000 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-6565dd9f4d-w85dm_94ea6e46-bacd-40ca-bce9-0f28656581af/heat-cfnapi/0.log" Feb 20 00:05:06 crc kubenswrapper[4795]: I0220 00:05:06.775188 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-db-purge-29525761-tq4s7_d673a8e7-fd1c-4bd1-ad6b-fb18a187b5cb/heat-dbpurge/0.log" Feb 20 00:05:06 crc kubenswrapper[4795]: I0220 00:05:06.904054 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-54b48c7f4c-97pnj_a380f130-e904-41e8-90e2-93bdeb0615d6/heat-engine/0.log" Feb 20 00:05:07 crc kubenswrapper[4795]: I0220 00:05:07.061027 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6f75767dd9-c8js2_53ce70ba-9e61-4dbd-b858-7059c82eed67/horizon/0.log" Feb 20 00:05:07 crc kubenswrapper[4795]: I0220 00:05:07.110325 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-openstack-openstack-cell1-tjvng_dc08b8d0-e577-4674-9ca5-b1a02818725c/install-certs-openstack-openstack-cell1/0.log" Feb 20 00:05:07 crc kubenswrapper[4795]: I0220 00:05:07.170401 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6f75767dd9-c8js2_53ce70ba-9e61-4dbd-b858-7059c82eed67/horizon-log/0.log" Feb 20 00:05:07 crc kubenswrapper[4795]: I0220 00:05:07.385880 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-openstack-openstack-cell1-6ctfc_41df3556-7d70-47f5-bd79-bec74fbd269c/install-os-openstack-openstack-cell1/0.log" Feb 20 00:05:07 crc kubenswrapper[4795]: I0220 00:05:07.552960 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-689ff8fbd7-j2v4l_57c39d61-cab0-49e7-8938-06952896387e/keystone-api/0.log" Feb 
20 00:05:07 crc kubenswrapper[4795]: I0220 00:05:07.679255 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29525701-nmz5v_5f2d7932-b11f-4e9b-a6e0-2a9a069a3459/keystone-cron/0.log" Feb 20 00:05:07 crc kubenswrapper[4795]: I0220 00:05:07.806135 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29525761-d78q6_e307d045-9890-4475-8c51-395484da10ca/keystone-cron/0.log" Feb 20 00:05:07 crc kubenswrapper[4795]: I0220 00:05:07.907338 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_5a2a47de-c40d-40c9-8556-ea7033a4033b/kube-state-metrics/0.log" Feb 20 00:05:08 crc kubenswrapper[4795]: I0220 00:05:08.016136 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-openstack-openstack-cell1-drll7_b0d21af1-5c6a-4260-ae1b-29b4e0c1cd8d/libvirt-openstack-openstack-cell1/0.log" Feb 20 00:05:08 crc kubenswrapper[4795]: I0220 00:05:08.163402 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_b7cef1f6-95e4-4ccd-8a2a-49c27373a96d/manila-api-log/0.log" Feb 20 00:05:08 crc kubenswrapper[4795]: I0220 00:05:08.239525 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_b7cef1f6-95e4-4ccd-8a2a-49c27373a96d/manila-api/0.log" Feb 20 00:05:08 crc kubenswrapper[4795]: I0220 00:05:08.276730 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-db-purge-29525761-5slgz_8dc6e226-d501-4698-b49c-f07fc8e80339/manila-db-purge/0.log" Feb 20 00:05:08 crc kubenswrapper[4795]: I0220 00:05:08.433576 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_25554074-26bb-4b62-a1f9-dac4cd6308b4/probe/0.log" Feb 20 00:05:08 crc kubenswrapper[4795]: I0220 00:05:08.460784 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_25554074-26bb-4b62-a1f9-dac4cd6308b4/manila-scheduler/0.log" Feb 20 
00:05:08 crc kubenswrapper[4795]: I0220 00:05:08.522742 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864/manila-share/0.log" Feb 20 00:05:08 crc kubenswrapper[4795]: I0220 00:05:08.627071 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_48ba2f2b-8ca4-4bd6-9bc5-0951b1f9f864/probe/0.log" Feb 20 00:05:08 crc kubenswrapper[4795]: I0220 00:05:08.858381 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7945766d5c-fjptf_dea0417f-0988-4d82-80cc-03298be367bd/neutron-api/0.log" Feb 20 00:05:08 crc kubenswrapper[4795]: I0220 00:05:08.946913 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7945766d5c-fjptf_dea0417f-0988-4d82-80cc-03298be367bd/neutron-httpd/0.log" Feb 20 00:05:09 crc kubenswrapper[4795]: I0220 00:05:09.035760 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-dhcp-openstack-openstack-cell1-khvh4_ff3df901-a0ae-456e-8103-60aaa6439785/neutron-dhcp-openstack-openstack-cell1/0.log" Feb 20 00:05:09 crc kubenswrapper[4795]: I0220 00:05:09.177429 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-openstack-openstack-cell1-5855g_a23f1a80-1645-454d-b9cf-e039928b84cb/neutron-metadata-openstack-openstack-cell1/0.log" Feb 20 00:05:09 crc kubenswrapper[4795]: I0220 00:05:09.273629 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-sriov-openstack-openstack-cell1-56tm4_a29cf217-b932-4515-a8e6-4bb762611d24/neutron-sriov-openstack-openstack-cell1/0.log" Feb 20 00:05:09 crc kubenswrapper[4795]: I0220 00:05:09.626740 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_644df2e5-37fd-468b-9e52-316d44e65f69/nova-api-log/0.log" Feb 20 00:05:09 crc kubenswrapper[4795]: I0220 00:05:09.651437 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-api-0_644df2e5-37fd-468b-9e52-316d44e65f69/nova-api-api/0.log" Feb 20 00:05:09 crc kubenswrapper[4795]: I0220 00:05:09.732342 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_d27d8041-4940-4cd2-bf9e-02b7aa924067/nova-cell0-conductor-conductor/0.log" Feb 20 00:05:09 crc kubenswrapper[4795]: I0220 00:05:09.851306 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-db-purge-29525760-n9sd6_57de3f43-e33f-4734-b02d-372d013b7e80/nova-manage/0.log" Feb 20 00:05:10 crc kubenswrapper[4795]: I0220 00:05:10.042472 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_0fa49294-8a0c-4d98-a388-067bdce0ac1b/nova-cell1-conductor-conductor/0.log" Feb 20 00:05:10 crc kubenswrapper[4795]: I0220 00:05:10.071456 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-db-purge-29525760-9s2b5_2fbbf12e-019a-40d4-9a07-46b3e5b4c814/nova-manage/0.log" Feb 20 00:05:10 crc kubenswrapper[4795]: I0220 00:05:10.341152 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_a96a8189-2b04-4ce7-908b-3544dc3b7ec4/nova-cell1-novncproxy-novncproxy/0.log" Feb 20 00:05:10 crc kubenswrapper[4795]: I0220 00:05:10.378730 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellxz55h_59981ca7-620e-4025-b165-4f54f920e8f2/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1/0.log" Feb 20 00:05:10 crc kubenswrapper[4795]: I0220 00:05:10.568400 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-openstack-openstack-cell1-q26dr_df818d88-cec5-4daf-8b17-cc4bb298b498/nova-cell1-openstack-openstack-cell1/0.log" Feb 20 00:05:10 crc kubenswrapper[4795]: I0220 00:05:10.696796 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-metadata-0_d15a66bd-d8e7-4ad0-a8bc-7575a218f50c/nova-metadata-log/0.log" Feb 20 00:05:10 crc kubenswrapper[4795]: I0220 00:05:10.792823 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_d15a66bd-d8e7-4ad0-a8bc-7575a218f50c/nova-metadata-metadata/0.log" Feb 20 00:05:11 crc kubenswrapper[4795]: I0220 00:05:11.328825 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-556fc55b45-7gxcm_c95e6fdb-6007-4490-9572-a2709f8b7daf/init/0.log" Feb 20 00:05:11 crc kubenswrapper[4795]: I0220 00:05:11.373058 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_d16cd452-43cb-42e4-b4af-6de3271d7194/nova-scheduler-scheduler/0.log" Feb 20 00:05:11 crc kubenswrapper[4795]: I0220 00:05:11.645938 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-556fc55b45-7gxcm_c95e6fdb-6007-4490-9572-a2709f8b7daf/init/0.log" Feb 20 00:05:11 crc kubenswrapper[4795]: I0220 00:05:11.847422 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-556fc55b45-7gxcm_c95e6fdb-6007-4490-9572-a2709f8b7daf/octavia-api-provider-agent/0.log" Feb 20 00:05:11 crc kubenswrapper[4795]: I0220 00:05:11.880415 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-g59hr_107db266-c130-4312-be67-ffe75016fd44/init/0.log" Feb 20 00:05:11 crc kubenswrapper[4795]: I0220 00:05:11.995449 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-556fc55b45-7gxcm_c95e6fdb-6007-4490-9572-a2709f8b7daf/octavia-api/0.log" Feb 20 00:05:12 crc kubenswrapper[4795]: I0220 00:05:12.049575 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-g59hr_107db266-c130-4312-be67-ffe75016fd44/init/0.log" Feb 20 00:05:12 crc kubenswrapper[4795]: I0220 00:05:12.160866 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_octavia-healthmanager-g59hr_107db266-c130-4312-be67-ffe75016fd44/octavia-healthmanager/0.log" Feb 20 00:05:12 crc kubenswrapper[4795]: I0220 00:05:12.232847 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-cm4g6_7167d9ee-5127-43c9-957a-598d9dcfecb3/init/0.log" Feb 20 00:05:12 crc kubenswrapper[4795]: I0220 00:05:12.499159 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-cm4g6_7167d9ee-5127-43c9-957a-598d9dcfecb3/octavia-housekeeping/0.log" Feb 20 00:05:12 crc kubenswrapper[4795]: I0220 00:05:12.543808 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-image-upload-8d4564f8f-vp2hp_73c5ad0c-a7f2-414d-a1f8-041a807d82b9/init/0.log" Feb 20 00:05:12 crc kubenswrapper[4795]: I0220 00:05:12.559867 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-cm4g6_7167d9ee-5127-43c9-957a-598d9dcfecb3/init/0.log" Feb 20 00:05:12 crc kubenswrapper[4795]: I0220 00:05:12.735518 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-image-upload-8d4564f8f-vp2hp_73c5ad0c-a7f2-414d-a1f8-041a807d82b9/octavia-amphora-httpd/0.log" Feb 20 00:05:12 crc kubenswrapper[4795]: I0220 00:05:12.796834 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-image-upload-8d4564f8f-vp2hp_73c5ad0c-a7f2-414d-a1f8-041a807d82b9/init/0.log" Feb 20 00:05:12 crc kubenswrapper[4795]: I0220 00:05:12.828127 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-gzbmf_b3ffcac3-ee64-440c-983d-67404e5f47fd/init/0.log" Feb 20 00:05:13 crc kubenswrapper[4795]: I0220 00:05:13.320430 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-gzbmf_b3ffcac3-ee64-440c-983d-67404e5f47fd/octavia-rsyslog/0.log" Feb 20 00:05:13 crc kubenswrapper[4795]: I0220 00:05:13.364619 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_octavia-rsyslog-gzbmf_b3ffcac3-ee64-440c-983d-67404e5f47fd/init/0.log" Feb 20 00:05:13 crc kubenswrapper[4795]: I0220 00:05:13.383418 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-pktjg_cfbfd9d0-564e-41d0-8171-5f32f380a3df/init/0.log" Feb 20 00:05:13 crc kubenswrapper[4795]: I0220 00:05:13.585762 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-pktjg_cfbfd9d0-564e-41d0-8171-5f32f380a3df/init/0.log" Feb 20 00:05:13 crc kubenswrapper[4795]: I0220 00:05:13.692865 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_c8f55130-d799-45ef-b174-450b6c3b52ff/mysql-bootstrap/0.log" Feb 20 00:05:13 crc kubenswrapper[4795]: I0220 00:05:13.769332 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-pktjg_cfbfd9d0-564e-41d0-8171-5f32f380a3df/octavia-worker/0.log" Feb 20 00:05:13 crc kubenswrapper[4795]: I0220 00:05:13.948094 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_c8f55130-d799-45ef-b174-450b6c3b52ff/galera/0.log" Feb 20 00:05:13 crc kubenswrapper[4795]: I0220 00:05:13.973191 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_c8f55130-d799-45ef-b174-450b6c3b52ff/mysql-bootstrap/0.log" Feb 20 00:05:14 crc kubenswrapper[4795]: I0220 00:05:14.050485 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_24345708-df30-4486-bc7e-44eaa7722ffd/mysql-bootstrap/0.log" Feb 20 00:05:14 crc kubenswrapper[4795]: I0220 00:05:14.217041 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_24345708-df30-4486-bc7e-44eaa7722ffd/mysql-bootstrap/0.log" Feb 20 00:05:14 crc kubenswrapper[4795]: I0220 00:05:14.309379 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstackclient_f1d06b1e-9114-47b8-913d-86144f6314c3/openstackclient/0.log" Feb 20 00:05:14 crc kubenswrapper[4795]: I0220 00:05:14.309469 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_24345708-df30-4486-bc7e-44eaa7722ffd/galera/0.log" Feb 20 00:05:14 crc kubenswrapper[4795]: I0220 00:05:14.516986 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-knqfl_3bbc323f-3f18-42bc-b0d8-12f021d91d6b/ovn-controller/0.log" Feb 20 00:05:14 crc kubenswrapper[4795]: I0220 00:05:14.575521 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-kx9qd_b48804d5-a275-45dd-896c-f35b7a322690/openstack-network-exporter/0.log" Feb 20 00:05:14 crc kubenswrapper[4795]: I0220 00:05:14.745138 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-lrv52_9e98c62c-20fc-462c-9973-2616cb184032/ovsdb-server-init/0.log" Feb 20 00:05:14 crc kubenswrapper[4795]: I0220 00:05:14.982271 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-lrv52_9e98c62c-20fc-462c-9973-2616cb184032/ovsdb-server/0.log" Feb 20 00:05:14 crc kubenswrapper[4795]: I0220 00:05:14.995272 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-lrv52_9e98c62c-20fc-462c-9973-2616cb184032/ovsdb-server-init/0.log" Feb 20 00:05:15 crc kubenswrapper[4795]: I0220 00:05:15.005434 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-lrv52_9e98c62c-20fc-462c-9973-2616cb184032/ovs-vswitchd/0.log" Feb 20 00:05:15 crc kubenswrapper[4795]: I0220 00:05:15.196655 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_b989c1be-7a74-42ee-a27b-dc34ce8d727a/openstack-network-exporter/0.log" Feb 20 00:05:15 crc kubenswrapper[4795]: I0220 00:05:15.198940 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-northd-0_b989c1be-7a74-42ee-a27b-dc34ce8d727a/ovn-northd/0.log" Feb 20 00:05:15 crc kubenswrapper[4795]: I0220 00:05:15.300270 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-openstack-openstack-cell1-b9rq5_d02efd94-2196-48fe-85d5-e2c65d186d6e/ovn-openstack-openstack-cell1/0.log" Feb 20 00:05:15 crc kubenswrapper[4795]: I0220 00:05:15.401050 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_9b32c19b-2b8b-4587-9327-1ddf5b074ad6/openstack-network-exporter/0.log" Feb 20 00:05:15 crc kubenswrapper[4795]: I0220 00:05:15.536028 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_9b32c19b-2b8b-4587-9327-1ddf5b074ad6/ovsdbserver-nb/0.log" Feb 20 00:05:15 crc kubenswrapper[4795]: I0220 00:05:15.639944 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_f814768e-2961-4d2a-ba3b-615dea717cf8/openstack-network-exporter/0.log" Feb 20 00:05:15 crc kubenswrapper[4795]: I0220 00:05:15.643126 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_f814768e-2961-4d2a-ba3b-615dea717cf8/ovsdbserver-nb/0.log" Feb 20 00:05:15 crc kubenswrapper[4795]: I0220 00:05:15.771680 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_188a11e4-50de-4672-baaf-89a3a512cd0c/openstack-network-exporter/0.log" Feb 20 00:05:15 crc kubenswrapper[4795]: I0220 00:05:15.891725 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_188a11e4-50de-4672-baaf-89a3a512cd0c/ovsdbserver-nb/0.log" Feb 20 00:05:15 crc kubenswrapper[4795]: I0220 00:05:15.995317 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_924d2a8a-2ae7-417a-9770-054662474286/ovsdbserver-sb/0.log" Feb 20 00:05:16 crc kubenswrapper[4795]: I0220 00:05:16.024894 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-sb-0_924d2a8a-2ae7-417a-9770-054662474286/openstack-network-exporter/0.log" Feb 20 00:05:16 crc kubenswrapper[4795]: I0220 00:05:16.160234 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_9c5bddd7-705d-41b3-ad43-1889c6c34ab0/openstack-network-exporter/0.log" Feb 20 00:05:16 crc kubenswrapper[4795]: I0220 00:05:16.237931 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_9c5bddd7-705d-41b3-ad43-1889c6c34ab0/ovsdbserver-sb/0.log" Feb 20 00:05:16 crc kubenswrapper[4795]: I0220 00:05:16.388628 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_9c8cf9f5-7499-4c52-9710-91b96d49b0fc/openstack-network-exporter/0.log" Feb 20 00:05:16 crc kubenswrapper[4795]: I0220 00:05:16.391782 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_9c8cf9f5-7499-4c52-9710-91b96d49b0fc/ovsdbserver-sb/0.log" Feb 20 00:05:16 crc kubenswrapper[4795]: I0220 00:05:16.623432 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-764895875b-czlhk_f4bb335d-ad73-403a-a25f-8e6f33f60ecb/placement-api/0.log" Feb 20 00:05:16 crc kubenswrapper[4795]: I0220 00:05:16.714372 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-764895875b-czlhk_f4bb335d-ad73-403a-a25f-8e6f33f60ecb/placement-log/0.log" Feb 20 00:05:16 crc kubenswrapper[4795]: I0220 00:05:16.950136 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_pre-adoption-validation-openstack-pre-adoption-openstack-ckn92n_7ba2e854-6881-4f7f-8068-7abf4df26229/pre-adoption-validation-openstack-pre-adoption-openstack-cell1/0.log" Feb 20 00:05:17 crc kubenswrapper[4795]: I0220 00:05:17.092292 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_281b5fc0-7da4-4d5a-89d4-b073b1500865/init-config-reloader/0.log" Feb 20 00:05:17 crc 
kubenswrapper[4795]: I0220 00:05:17.274250 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_281b5fc0-7da4-4d5a-89d4-b073b1500865/init-config-reloader/0.log" Feb 20 00:05:17 crc kubenswrapper[4795]: I0220 00:05:17.302971 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_281b5fc0-7da4-4d5a-89d4-b073b1500865/config-reloader/0.log" Feb 20 00:05:17 crc kubenswrapper[4795]: I0220 00:05:17.341454 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_281b5fc0-7da4-4d5a-89d4-b073b1500865/prometheus/0.log" Feb 20 00:05:17 crc kubenswrapper[4795]: I0220 00:05:17.362273 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_281b5fc0-7da4-4d5a-89d4-b073b1500865/thanos-sidecar/0.log" Feb 20 00:05:17 crc kubenswrapper[4795]: I0220 00:05:17.534703 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_ce76f8f5-4383-4be1-ab7b-cf862ae77025/setup-container/0.log" Feb 20 00:05:17 crc kubenswrapper[4795]: I0220 00:05:17.729243 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_ce76f8f5-4383-4be1-ab7b-cf862ae77025/setup-container/0.log" Feb 20 00:05:17 crc kubenswrapper[4795]: I0220 00:05:17.805727 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b/setup-container/0.log" Feb 20 00:05:17 crc kubenswrapper[4795]: I0220 00:05:17.891365 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_ce76f8f5-4383-4be1-ab7b-cf862ae77025/rabbitmq/0.log" Feb 20 00:05:18 crc kubenswrapper[4795]: I0220 00:05:18.019523 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b/setup-container/0.log" Feb 20 00:05:18 crc 
kubenswrapper[4795]: I0220 00:05:18.046837 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_8c4cf8fb-c71a-4741-bc31-f5505ab7fa6b/rabbitmq/0.log" Feb 20 00:05:18 crc kubenswrapper[4795]: I0220 00:05:18.160848 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-openstack-openstack-cell1-bdhcf_99fb1ef3-d414-4a7e-9db8-54edf1aad197/reboot-os-openstack-openstack-cell1/0.log" Feb 20 00:05:18 crc kubenswrapper[4795]: I0220 00:05:18.270923 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-openstack-openstack-cell1-7vb2g_d82522ab-bf1a-47f9-902b-c82105b5d09b/run-os-openstack-openstack-cell1/0.log" Feb 20 00:05:18 crc kubenswrapper[4795]: I0220 00:05:18.444952 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-openstack-qzhll_adb280f6-14e8-45d3-91a1-1bf325d84aef/ssh-known-hosts-openstack/0.log" Feb 20 00:05:18 crc kubenswrapper[4795]: I0220 00:05:18.615081 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-openstack-openstack-cell1-45fwk_8272a408-0416-4077-9e85-b2962992b3f4/telemetry-openstack-openstack-cell1/0.log" Feb 20 00:05:18 crc kubenswrapper[4795]: I0220 00:05:18.816810 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tripleo-cleanup-tripleo-cleanup-openstack-cell1-z6rcg_c3cbdd11-d93f-4025-9c08-7530a68f6113/tripleo-cleanup-tripleo-cleanup-openstack-cell1/0.log" Feb 20 00:05:18 crc kubenswrapper[4795]: I0220 00:05:18.913091 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-openstack-openstack-cell1-44t2w_94dbf6be-911e-46d9-a950-fa19fa137490/validate-network-openstack-openstack-cell1/0.log" Feb 20 00:05:19 crc kubenswrapper[4795]: I0220 00:05:19.487931 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_3d516d65-1efc-42ee-ab17-971e2d94e4a7/memcached/0.log" Feb 20 00:05:42 crc kubenswrapper[4795]: 
I0220 00:05:42.422059 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967zj6x4_6afab948-ae77-464b-aa33-b8d45ddc01ff/util/0.log" Feb 20 00:05:42 crc kubenswrapper[4795]: I0220 00:05:42.637226 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967zj6x4_6afab948-ae77-464b-aa33-b8d45ddc01ff/util/0.log" Feb 20 00:05:42 crc kubenswrapper[4795]: I0220 00:05:42.698528 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967zj6x4_6afab948-ae77-464b-aa33-b8d45ddc01ff/pull/0.log" Feb 20 00:05:42 crc kubenswrapper[4795]: I0220 00:05:42.698915 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967zj6x4_6afab948-ae77-464b-aa33-b8d45ddc01ff/pull/0.log" Feb 20 00:05:42 crc kubenswrapper[4795]: I0220 00:05:42.896599 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967zj6x4_6afab948-ae77-464b-aa33-b8d45ddc01ff/pull/0.log" Feb 20 00:05:42 crc kubenswrapper[4795]: I0220 00:05:42.917801 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967zj6x4_6afab948-ae77-464b-aa33-b8d45ddc01ff/util/0.log" Feb 20 00:05:42 crc kubenswrapper[4795]: I0220 00:05:42.918623 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967zj6x4_6afab948-ae77-464b-aa33-b8d45ddc01ff/extract/0.log" Feb 20 00:05:43 crc kubenswrapper[4795]: I0220 00:05:43.385540 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6d8bf5c495-d8wqs_5c867f91-2ab2-43ce-8291-6d01825610d1/manager/0.log" Feb 20 00:05:43 crc kubenswrapper[4795]: I0220 00:05:43.954836 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987464f4-5cnjr_d19ed31e-e599-40ec-935d-d1d404e4c7a5/manager/0.log" Feb 20 00:05:44 crc kubenswrapper[4795]: I0220 00:05:44.024705 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-69f49c598c-shb4d_268c2664-09cc-4616-9280-0dd6ae4159dc/manager/0.log" Feb 20 00:05:44 crc kubenswrapper[4795]: I0220 00:05:44.344669 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5b9b8895d5-fdd85_e37494c1-8780-4612-8569-fada28f0e772/manager/0.log" Feb 20 00:05:44 crc kubenswrapper[4795]: I0220 00:05:44.848367 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-868647ff47-z7hnk_4cc5be3d-87d8-46a4-ba7d-d95143c11857/manager/0.log" Feb 20 00:05:44 crc kubenswrapper[4795]: I0220 00:05:44.995078 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-554564d7fc-t87bb_02592cbe-e1d4-4b62-8795-a204d5335594/manager/0.log" Feb 20 00:05:45 crc kubenswrapper[4795]: I0220 00:05:45.426903 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-b4d948c87-t6hpt_c7e19956-a3fb-4ed2-bc2a-72084ed62ac2/manager/0.log" Feb 20 00:05:45 crc kubenswrapper[4795]: I0220 00:05:45.496691 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-79d975b745-qjgvw_2e80963b-888b-4bb9-9259-864e38dd10ed/manager/0.log" Feb 20 00:05:45 crc kubenswrapper[4795]: I0220 00:05:45.626485 4795 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-54f6768c69-7n98g_1d6085d5-f9db-4129-8662-b3ae045decfc/manager/0.log" Feb 20 00:05:45 crc kubenswrapper[4795]: I0220 00:05:45.827415 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6994f66f48-mr8mh_0bdb1789-27ad-4535-86d3-fd2fb7cebba2/manager/0.log" Feb 20 00:05:45 crc kubenswrapper[4795]: I0220 00:05:45.965560 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-64ddbf8bb-4cf2p_c2c4435e-a135-4c1f-bad4-121458c09bc3/manager/0.log" Feb 20 00:05:46 crc kubenswrapper[4795]: I0220 00:05:46.425450 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-567668f5cf-5b89b_5a6d3cc3-7e00-4013-b568-c2b835d8e2b9/manager/0.log" Feb 20 00:05:46 crc kubenswrapper[4795]: I0220 00:05:46.773284 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-fb5fcc5b8-qlz2p_26db9cb2-1ed4-44e4-afac-404ce0f7d445/manager/0.log" Feb 20 00:05:47 crc kubenswrapper[4795]: I0220 00:05:47.316332 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-5d946d989d-vwgdm_54a55994-69ff-48f1-8d75-24b2a828cdc9/manager/0.log" Feb 20 00:05:47 crc kubenswrapper[4795]: I0220 00:05:47.420026 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-6679bf9b57-vbjcq_c6c44d2f-3e8f-42de-babe-85a8fc1a97ec/operator/0.log" Feb 20 00:05:47 crc kubenswrapper[4795]: I0220 00:05:47.715385 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-tf75g_91c93ffc-fbe2-486e-92a9-ca5737dc7875/registry-server/0.log" Feb 20 00:05:47 crc kubenswrapper[4795]: I0220 00:05:47.827858 4795 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-d44cf6b75-slqxz_e0cad59b-249e-446f-b3fa-6be8aac2a858/manager/0.log" Feb 20 00:05:47 crc kubenswrapper[4795]: I0220 00:05:47.975651 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-8497b45c89-bwsj2_7b637620-f307-4e2b-b92d-f1e0d50b0071/manager/0.log" Feb 20 00:05:48 crc kubenswrapper[4795]: I0220 00:05:48.092064 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-vv89z_80ce3bc1-0926-47a3-acc2-6f2d8be4089c/operator/0.log" Feb 20 00:05:48 crc kubenswrapper[4795]: I0220 00:05:48.219355 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-68f46476f-dqpjx_98979ac7-9fb1-49f8-8022-562082fc76f7/manager/0.log" Feb 20 00:05:48 crc kubenswrapper[4795]: I0220 00:05:48.541347 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-7866795846-rcjgz_6bdc9c62-d8c1-42d5-8696-324fdc7abc2f/manager/0.log" Feb 20 00:05:48 crc kubenswrapper[4795]: I0220 00:05:48.682103 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-7f45b4ff68-slj65_5f4d8698-27a0-44a4-87f6-c75d4c3407bc/manager/0.log" Feb 20 00:05:49 crc kubenswrapper[4795]: I0220 00:05:49.263729 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-5db88f68c-bbtgm_09ce2dcf-0fb0-4180-a019-09d1abfec00e/manager/0.log" Feb 20 00:05:50 crc kubenswrapper[4795]: I0220 00:05:50.610331 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-69ff7bc449-62xdp_9f7c4d4b-cf83-47e2-a75f-e3a2c9658bb4/manager/0.log" Feb 20 00:05:50 crc kubenswrapper[4795]: I0220 00:05:50.837564 4795 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-69f8888797-ckxlw_b22b5096-41cf-40c9-94f6-8e546ca96a96/manager/0.log" Feb 20 00:05:57 crc kubenswrapper[4795]: I0220 00:05:57.466353 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7hdhn"] Feb 20 00:05:57 crc kubenswrapper[4795]: E0220 00:05:57.467370 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77dfee48-ba66-4c86-80c3-47b740e7e1c3" containerName="extract-utilities" Feb 20 00:05:57 crc kubenswrapper[4795]: I0220 00:05:57.467384 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="77dfee48-ba66-4c86-80c3-47b740e7e1c3" containerName="extract-utilities" Feb 20 00:05:57 crc kubenswrapper[4795]: E0220 00:05:57.467408 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77dfee48-ba66-4c86-80c3-47b740e7e1c3" containerName="registry-server" Feb 20 00:05:57 crc kubenswrapper[4795]: I0220 00:05:57.467413 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="77dfee48-ba66-4c86-80c3-47b740e7e1c3" containerName="registry-server" Feb 20 00:05:57 crc kubenswrapper[4795]: E0220 00:05:57.467447 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77dfee48-ba66-4c86-80c3-47b740e7e1c3" containerName="extract-content" Feb 20 00:05:57 crc kubenswrapper[4795]: I0220 00:05:57.467454 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="77dfee48-ba66-4c86-80c3-47b740e7e1c3" containerName="extract-content" Feb 20 00:05:57 crc kubenswrapper[4795]: I0220 00:05:57.467631 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="77dfee48-ba66-4c86-80c3-47b740e7e1c3" containerName="registry-server" Feb 20 00:05:57 crc kubenswrapper[4795]: I0220 00:05:57.471783 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7hdhn" Feb 20 00:05:57 crc kubenswrapper[4795]: I0220 00:05:57.489696 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7hdhn"] Feb 20 00:05:57 crc kubenswrapper[4795]: I0220 00:05:57.546246 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20cdd942-7d19-4f30-9952-d7f228b9ce25-catalog-content\") pod \"redhat-operators-7hdhn\" (UID: \"20cdd942-7d19-4f30-9952-d7f228b9ce25\") " pod="openshift-marketplace/redhat-operators-7hdhn" Feb 20 00:05:57 crc kubenswrapper[4795]: I0220 00:05:57.546316 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20cdd942-7d19-4f30-9952-d7f228b9ce25-utilities\") pod \"redhat-operators-7hdhn\" (UID: \"20cdd942-7d19-4f30-9952-d7f228b9ce25\") " pod="openshift-marketplace/redhat-operators-7hdhn" Feb 20 00:05:57 crc kubenswrapper[4795]: I0220 00:05:57.546394 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-448mb\" (UniqueName: \"kubernetes.io/projected/20cdd942-7d19-4f30-9952-d7f228b9ce25-kube-api-access-448mb\") pod \"redhat-operators-7hdhn\" (UID: \"20cdd942-7d19-4f30-9952-d7f228b9ce25\") " pod="openshift-marketplace/redhat-operators-7hdhn" Feb 20 00:05:57 crc kubenswrapper[4795]: I0220 00:05:57.648749 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20cdd942-7d19-4f30-9952-d7f228b9ce25-catalog-content\") pod \"redhat-operators-7hdhn\" (UID: \"20cdd942-7d19-4f30-9952-d7f228b9ce25\") " pod="openshift-marketplace/redhat-operators-7hdhn" Feb 20 00:05:57 crc kubenswrapper[4795]: I0220 00:05:57.648828 4795 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20cdd942-7d19-4f30-9952-d7f228b9ce25-utilities\") pod \"redhat-operators-7hdhn\" (UID: \"20cdd942-7d19-4f30-9952-d7f228b9ce25\") " pod="openshift-marketplace/redhat-operators-7hdhn" Feb 20 00:05:57 crc kubenswrapper[4795]: I0220 00:05:57.648936 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-448mb\" (UniqueName: \"kubernetes.io/projected/20cdd942-7d19-4f30-9952-d7f228b9ce25-kube-api-access-448mb\") pod \"redhat-operators-7hdhn\" (UID: \"20cdd942-7d19-4f30-9952-d7f228b9ce25\") " pod="openshift-marketplace/redhat-operators-7hdhn" Feb 20 00:05:57 crc kubenswrapper[4795]: I0220 00:05:57.649628 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20cdd942-7d19-4f30-9952-d7f228b9ce25-utilities\") pod \"redhat-operators-7hdhn\" (UID: \"20cdd942-7d19-4f30-9952-d7f228b9ce25\") " pod="openshift-marketplace/redhat-operators-7hdhn" Feb 20 00:05:57 crc kubenswrapper[4795]: I0220 00:05:57.649667 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20cdd942-7d19-4f30-9952-d7f228b9ce25-catalog-content\") pod \"redhat-operators-7hdhn\" (UID: \"20cdd942-7d19-4f30-9952-d7f228b9ce25\") " pod="openshift-marketplace/redhat-operators-7hdhn" Feb 20 00:05:57 crc kubenswrapper[4795]: I0220 00:05:57.680225 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-448mb\" (UniqueName: \"kubernetes.io/projected/20cdd942-7d19-4f30-9952-d7f228b9ce25-kube-api-access-448mb\") pod \"redhat-operators-7hdhn\" (UID: \"20cdd942-7d19-4f30-9952-d7f228b9ce25\") " pod="openshift-marketplace/redhat-operators-7hdhn" Feb 20 00:05:57 crc kubenswrapper[4795]: I0220 00:05:57.795568 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7hdhn" Feb 20 00:05:58 crc kubenswrapper[4795]: I0220 00:05:58.297343 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7hdhn"] Feb 20 00:05:58 crc kubenswrapper[4795]: W0220 00:05:58.309074 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20cdd942_7d19_4f30_9952_d7f228b9ce25.slice/crio-7b602e85bc557e898c3ca566f65c7dc4a8e26a4c0fa1a330c88c268287d48d81 WatchSource:0}: Error finding container 7b602e85bc557e898c3ca566f65c7dc4a8e26a4c0fa1a330c88c268287d48d81: Status 404 returned error can't find the container with id 7b602e85bc557e898c3ca566f65c7dc4a8e26a4c0fa1a330c88c268287d48d81 Feb 20 00:05:58 crc kubenswrapper[4795]: I0220 00:05:58.835105 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7hdhn" event={"ID":"20cdd942-7d19-4f30-9952-d7f228b9ce25","Type":"ContainerStarted","Data":"5b1794b84c59b23398b947ba30b2bc5663729806a9fcfb22249c57f13530b50f"} Feb 20 00:05:58 crc kubenswrapper[4795]: I0220 00:05:58.835432 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7hdhn" event={"ID":"20cdd942-7d19-4f30-9952-d7f228b9ce25","Type":"ContainerStarted","Data":"7b602e85bc557e898c3ca566f65c7dc4a8e26a4c0fa1a330c88c268287d48d81"} Feb 20 00:05:59 crc kubenswrapper[4795]: I0220 00:05:59.850489 4795 generic.go:334] "Generic (PLEG): container finished" podID="20cdd942-7d19-4f30-9952-d7f228b9ce25" containerID="5b1794b84c59b23398b947ba30b2bc5663729806a9fcfb22249c57f13530b50f" exitCode=0 Feb 20 00:05:59 crc kubenswrapper[4795]: I0220 00:05:59.850590 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7hdhn" 
event={"ID":"20cdd942-7d19-4f30-9952-d7f228b9ce25","Type":"ContainerDied","Data":"5b1794b84c59b23398b947ba30b2bc5663729806a9fcfb22249c57f13530b50f"} Feb 20 00:06:01 crc kubenswrapper[4795]: I0220 00:06:01.873273 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7hdhn" event={"ID":"20cdd942-7d19-4f30-9952-d7f228b9ce25","Type":"ContainerStarted","Data":"3b20e09d048f1337f24d9d1e276d0fa4a7957311695be0063853e435b8104822"} Feb 20 00:06:04 crc kubenswrapper[4795]: I0220 00:06:04.911986 4795 generic.go:334] "Generic (PLEG): container finished" podID="20cdd942-7d19-4f30-9952-d7f228b9ce25" containerID="3b20e09d048f1337f24d9d1e276d0fa4a7957311695be0063853e435b8104822" exitCode=0 Feb 20 00:06:04 crc kubenswrapper[4795]: I0220 00:06:04.912079 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7hdhn" event={"ID":"20cdd942-7d19-4f30-9952-d7f228b9ce25","Type":"ContainerDied","Data":"3b20e09d048f1337f24d9d1e276d0fa4a7957311695be0063853e435b8104822"} Feb 20 00:06:05 crc kubenswrapper[4795]: I0220 00:06:05.923108 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7hdhn" event={"ID":"20cdd942-7d19-4f30-9952-d7f228b9ce25","Type":"ContainerStarted","Data":"2a956df3d891332966df5ad18e909d8da26b1727c8e93c48b762296a8b0255d5"} Feb 20 00:06:05 crc kubenswrapper[4795]: I0220 00:06:05.940906 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7hdhn" podStartSLOduration=3.462705052 podStartE2EDuration="8.940884878s" podCreationTimestamp="2026-02-20 00:05:57 +0000 UTC" firstStartedPulling="2026-02-20 00:05:59.853142657 +0000 UTC m=+9471.045660521" lastFinishedPulling="2026-02-20 00:06:05.331322483 +0000 UTC m=+9476.523840347" observedRunningTime="2026-02-20 00:06:05.939870299 +0000 UTC m=+9477.132388163" watchObservedRunningTime="2026-02-20 00:06:05.940884878 +0000 UTC m=+9477.133402742" 
Feb 20 00:06:07 crc kubenswrapper[4795]: I0220 00:06:07.796005 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7hdhn" Feb 20 00:06:07 crc kubenswrapper[4795]: I0220 00:06:07.796613 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7hdhn" Feb 20 00:06:08 crc kubenswrapper[4795]: I0220 00:06:08.845252 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7hdhn" podUID="20cdd942-7d19-4f30-9952-d7f228b9ce25" containerName="registry-server" probeResult="failure" output=< Feb 20 00:06:08 crc kubenswrapper[4795]: timeout: failed to connect service ":50051" within 1s Feb 20 00:06:08 crc kubenswrapper[4795]: > Feb 20 00:06:10 crc kubenswrapper[4795]: I0220 00:06:10.494613 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-42wfj_0da0af7f-f8f8-492d-bd44-1e81ab242a24/control-plane-machine-set-operator/0.log" Feb 20 00:06:10 crc kubenswrapper[4795]: I0220 00:06:10.685486 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-7mzng_ab78fbf6-65df-4306-a7b8-c7bd98cfdf49/kube-rbac-proxy/0.log" Feb 20 00:06:10 crc kubenswrapper[4795]: I0220 00:06:10.719184 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-7mzng_ab78fbf6-65df-4306-a7b8-c7bd98cfdf49/machine-api-operator/0.log" Feb 20 00:06:17 crc kubenswrapper[4795]: I0220 00:06:17.835927 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wqgfs"] Feb 20 00:06:17 crc kubenswrapper[4795]: I0220 00:06:17.839598 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wqgfs" Feb 20 00:06:17 crc kubenswrapper[4795]: I0220 00:06:17.849728 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wqgfs"] Feb 20 00:06:17 crc kubenswrapper[4795]: I0220 00:06:17.916849 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9gk7\" (UniqueName: \"kubernetes.io/projected/40dbe861-a8a0-4f3f-9d04-5a592dbce6ca-kube-api-access-v9gk7\") pod \"community-operators-wqgfs\" (UID: \"40dbe861-a8a0-4f3f-9d04-5a592dbce6ca\") " pod="openshift-marketplace/community-operators-wqgfs" Feb 20 00:06:17 crc kubenswrapper[4795]: I0220 00:06:17.916928 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40dbe861-a8a0-4f3f-9d04-5a592dbce6ca-utilities\") pod \"community-operators-wqgfs\" (UID: \"40dbe861-a8a0-4f3f-9d04-5a592dbce6ca\") " pod="openshift-marketplace/community-operators-wqgfs" Feb 20 00:06:17 crc kubenswrapper[4795]: I0220 00:06:17.916963 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40dbe861-a8a0-4f3f-9d04-5a592dbce6ca-catalog-content\") pod \"community-operators-wqgfs\" (UID: \"40dbe861-a8a0-4f3f-9d04-5a592dbce6ca\") " pod="openshift-marketplace/community-operators-wqgfs" Feb 20 00:06:18 crc kubenswrapper[4795]: I0220 00:06:18.020732 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9gk7\" (UniqueName: \"kubernetes.io/projected/40dbe861-a8a0-4f3f-9d04-5a592dbce6ca-kube-api-access-v9gk7\") pod \"community-operators-wqgfs\" (UID: \"40dbe861-a8a0-4f3f-9d04-5a592dbce6ca\") " pod="openshift-marketplace/community-operators-wqgfs" Feb 20 00:06:18 crc kubenswrapper[4795]: I0220 00:06:18.020826 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40dbe861-a8a0-4f3f-9d04-5a592dbce6ca-utilities\") pod \"community-operators-wqgfs\" (UID: \"40dbe861-a8a0-4f3f-9d04-5a592dbce6ca\") " pod="openshift-marketplace/community-operators-wqgfs" Feb 20 00:06:18 crc kubenswrapper[4795]: I0220 00:06:18.020848 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40dbe861-a8a0-4f3f-9d04-5a592dbce6ca-catalog-content\") pod \"community-operators-wqgfs\" (UID: \"40dbe861-a8a0-4f3f-9d04-5a592dbce6ca\") " pod="openshift-marketplace/community-operators-wqgfs" Feb 20 00:06:18 crc kubenswrapper[4795]: I0220 00:06:18.021418 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40dbe861-a8a0-4f3f-9d04-5a592dbce6ca-utilities\") pod \"community-operators-wqgfs\" (UID: \"40dbe861-a8a0-4f3f-9d04-5a592dbce6ca\") " pod="openshift-marketplace/community-operators-wqgfs" Feb 20 00:06:18 crc kubenswrapper[4795]: I0220 00:06:18.021620 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40dbe861-a8a0-4f3f-9d04-5a592dbce6ca-catalog-content\") pod \"community-operators-wqgfs\" (UID: \"40dbe861-a8a0-4f3f-9d04-5a592dbce6ca\") " pod="openshift-marketplace/community-operators-wqgfs" Feb 20 00:06:18 crc kubenswrapper[4795]: I0220 00:06:18.043993 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9gk7\" (UniqueName: \"kubernetes.io/projected/40dbe861-a8a0-4f3f-9d04-5a592dbce6ca-kube-api-access-v9gk7\") pod \"community-operators-wqgfs\" (UID: \"40dbe861-a8a0-4f3f-9d04-5a592dbce6ca\") " pod="openshift-marketplace/community-operators-wqgfs" Feb 20 00:06:18 crc kubenswrapper[4795]: I0220 00:06:18.177401 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wqgfs" Feb 20 00:06:18 crc kubenswrapper[4795]: I0220 00:06:18.785966 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wqgfs"] Feb 20 00:06:18 crc kubenswrapper[4795]: I0220 00:06:18.869828 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7hdhn" podUID="20cdd942-7d19-4f30-9952-d7f228b9ce25" containerName="registry-server" probeResult="failure" output=< Feb 20 00:06:18 crc kubenswrapper[4795]: timeout: failed to connect service ":50051" within 1s Feb 20 00:06:18 crc kubenswrapper[4795]: > Feb 20 00:06:20 crc kubenswrapper[4795]: I0220 00:06:20.058859 4795 generic.go:334] "Generic (PLEG): container finished" podID="40dbe861-a8a0-4f3f-9d04-5a592dbce6ca" containerID="29b21899c393a3503ba5ae6dcf3912d8e17c4a592516cd085863f44ad997cd91" exitCode=0 Feb 20 00:06:20 crc kubenswrapper[4795]: I0220 00:06:20.059018 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wqgfs" event={"ID":"40dbe861-a8a0-4f3f-9d04-5a592dbce6ca","Type":"ContainerDied","Data":"29b21899c393a3503ba5ae6dcf3912d8e17c4a592516cd085863f44ad997cd91"} Feb 20 00:06:20 crc kubenswrapper[4795]: I0220 00:06:20.059400 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wqgfs" event={"ID":"40dbe861-a8a0-4f3f-9d04-5a592dbce6ca","Type":"ContainerStarted","Data":"c26d2d03a0a7f24ca832726bbb2107bca07d829795090d5a9ba6e14ef44f0230"} Feb 20 00:06:22 crc kubenswrapper[4795]: I0220 00:06:22.094998 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wqgfs" event={"ID":"40dbe861-a8a0-4f3f-9d04-5a592dbce6ca","Type":"ContainerStarted","Data":"1bcbbf1fb3a51b7809b2c9d3857d7d30bd666a5c7ea1fd4ea2c2765511ea4288"} Feb 20 00:06:23 crc kubenswrapper[4795]: I0220 00:06:23.109669 4795 generic.go:334] "Generic (PLEG): 
container finished" podID="40dbe861-a8a0-4f3f-9d04-5a592dbce6ca" containerID="1bcbbf1fb3a51b7809b2c9d3857d7d30bd666a5c7ea1fd4ea2c2765511ea4288" exitCode=0 Feb 20 00:06:23 crc kubenswrapper[4795]: I0220 00:06:23.109762 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wqgfs" event={"ID":"40dbe861-a8a0-4f3f-9d04-5a592dbce6ca","Type":"ContainerDied","Data":"1bcbbf1fb3a51b7809b2c9d3857d7d30bd666a5c7ea1fd4ea2c2765511ea4288"} Feb 20 00:06:24 crc kubenswrapper[4795]: I0220 00:06:24.121246 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wqgfs" event={"ID":"40dbe861-a8a0-4f3f-9d04-5a592dbce6ca","Type":"ContainerStarted","Data":"8c4945d0f178c5671668c808efa2896dd2d5275ec4ad197350f883d68eb9ae13"} Feb 20 00:06:24 crc kubenswrapper[4795]: I0220 00:06:24.148306 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wqgfs" podStartSLOduration=3.684358736 podStartE2EDuration="7.148288457s" podCreationTimestamp="2026-02-20 00:06:17 +0000 UTC" firstStartedPulling="2026-02-20 00:06:20.061304952 +0000 UTC m=+9491.253822806" lastFinishedPulling="2026-02-20 00:06:23.525234663 +0000 UTC m=+9494.717752527" observedRunningTime="2026-02-20 00:06:24.138367398 +0000 UTC m=+9495.330885262" watchObservedRunningTime="2026-02-20 00:06:24.148288457 +0000 UTC m=+9495.340806321" Feb 20 00:06:24 crc kubenswrapper[4795]: I0220 00:06:24.269291 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-545d4d4674-jdbs7_c1df7da5-3926-430a-8085-202bccbc4d73/cert-manager-controller/0.log" Feb 20 00:06:24 crc kubenswrapper[4795]: I0220 00:06:24.565114 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-6888856db4-bjb5c_7b645fbc-f078-4ebe-958d-cd7a8f8ba1ee/cert-manager-webhook/0.log" Feb 20 00:06:24 crc kubenswrapper[4795]: I0220 00:06:24.727645 4795 log.go:25] 
"Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-5545bd876-n8skq_35b44919-239d-4fe8-8c53-a3698e24f753/cert-manager-cainjector/0.log" Feb 20 00:06:27 crc kubenswrapper[4795]: I0220 00:06:27.851106 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7hdhn" Feb 20 00:06:27 crc kubenswrapper[4795]: I0220 00:06:27.911839 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7hdhn" Feb 20 00:06:28 crc kubenswrapper[4795]: I0220 00:06:28.178091 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wqgfs" Feb 20 00:06:28 crc kubenswrapper[4795]: I0220 00:06:28.178148 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wqgfs" Feb 20 00:06:28 crc kubenswrapper[4795]: I0220 00:06:28.222362 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wqgfs" Feb 20 00:06:29 crc kubenswrapper[4795]: I0220 00:06:29.069262 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7hdhn"] Feb 20 00:06:29 crc kubenswrapper[4795]: I0220 00:06:29.166769 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7hdhn" podUID="20cdd942-7d19-4f30-9952-d7f228b9ce25" containerName="registry-server" containerID="cri-o://2a956df3d891332966df5ad18e909d8da26b1727c8e93c48b762296a8b0255d5" gracePeriod=2 Feb 20 00:06:29 crc kubenswrapper[4795]: I0220 00:06:29.226565 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wqgfs" Feb 20 00:06:29 crc kubenswrapper[4795]: I0220 00:06:29.673365 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7hdhn" Feb 20 00:06:29 crc kubenswrapper[4795]: I0220 00:06:29.763127 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-448mb\" (UniqueName: \"kubernetes.io/projected/20cdd942-7d19-4f30-9952-d7f228b9ce25-kube-api-access-448mb\") pod \"20cdd942-7d19-4f30-9952-d7f228b9ce25\" (UID: \"20cdd942-7d19-4f30-9952-d7f228b9ce25\") " Feb 20 00:06:29 crc kubenswrapper[4795]: I0220 00:06:29.763523 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20cdd942-7d19-4f30-9952-d7f228b9ce25-catalog-content\") pod \"20cdd942-7d19-4f30-9952-d7f228b9ce25\" (UID: \"20cdd942-7d19-4f30-9952-d7f228b9ce25\") " Feb 20 00:06:29 crc kubenswrapper[4795]: I0220 00:06:29.763603 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20cdd942-7d19-4f30-9952-d7f228b9ce25-utilities\") pod \"20cdd942-7d19-4f30-9952-d7f228b9ce25\" (UID: \"20cdd942-7d19-4f30-9952-d7f228b9ce25\") " Feb 20 00:06:29 crc kubenswrapper[4795]: I0220 00:06:29.764182 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20cdd942-7d19-4f30-9952-d7f228b9ce25-utilities" (OuterVolumeSpecName: "utilities") pod "20cdd942-7d19-4f30-9952-d7f228b9ce25" (UID: "20cdd942-7d19-4f30-9952-d7f228b9ce25"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 00:06:29 crc kubenswrapper[4795]: I0220 00:06:29.866026 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20cdd942-7d19-4f30-9952-d7f228b9ce25-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 00:06:29 crc kubenswrapper[4795]: I0220 00:06:29.880911 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20cdd942-7d19-4f30-9952-d7f228b9ce25-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "20cdd942-7d19-4f30-9952-d7f228b9ce25" (UID: "20cdd942-7d19-4f30-9952-d7f228b9ce25"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 00:06:29 crc kubenswrapper[4795]: I0220 00:06:29.968598 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20cdd942-7d19-4f30-9952-d7f228b9ce25-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 00:06:30 crc kubenswrapper[4795]: I0220 00:06:30.178071 4795 generic.go:334] "Generic (PLEG): container finished" podID="20cdd942-7d19-4f30-9952-d7f228b9ce25" containerID="2a956df3d891332966df5ad18e909d8da26b1727c8e93c48b762296a8b0255d5" exitCode=0 Feb 20 00:06:30 crc kubenswrapper[4795]: I0220 00:06:30.178196 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7hdhn" Feb 20 00:06:30 crc kubenswrapper[4795]: I0220 00:06:30.178281 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7hdhn" event={"ID":"20cdd942-7d19-4f30-9952-d7f228b9ce25","Type":"ContainerDied","Data":"2a956df3d891332966df5ad18e909d8da26b1727c8e93c48b762296a8b0255d5"} Feb 20 00:06:30 crc kubenswrapper[4795]: I0220 00:06:30.178310 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7hdhn" event={"ID":"20cdd942-7d19-4f30-9952-d7f228b9ce25","Type":"ContainerDied","Data":"7b602e85bc557e898c3ca566f65c7dc4a8e26a4c0fa1a330c88c268287d48d81"} Feb 20 00:06:30 crc kubenswrapper[4795]: I0220 00:06:30.178349 4795 scope.go:117] "RemoveContainer" containerID="2a956df3d891332966df5ad18e909d8da26b1727c8e93c48b762296a8b0255d5" Feb 20 00:06:30 crc kubenswrapper[4795]: I0220 00:06:30.197327 4795 scope.go:117] "RemoveContainer" containerID="3b20e09d048f1337f24d9d1e276d0fa4a7957311695be0063853e435b8104822" Feb 20 00:06:30 crc kubenswrapper[4795]: I0220 00:06:30.332625 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20cdd942-7d19-4f30-9952-d7f228b9ce25-kube-api-access-448mb" (OuterVolumeSpecName: "kube-api-access-448mb") pod "20cdd942-7d19-4f30-9952-d7f228b9ce25" (UID: "20cdd942-7d19-4f30-9952-d7f228b9ce25"). InnerVolumeSpecName "kube-api-access-448mb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 00:06:30 crc kubenswrapper[4795]: I0220 00:06:30.347646 4795 scope.go:117] "RemoveContainer" containerID="5b1794b84c59b23398b947ba30b2bc5663729806a9fcfb22249c57f13530b50f" Feb 20 00:06:30 crc kubenswrapper[4795]: I0220 00:06:30.378468 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-448mb\" (UniqueName: \"kubernetes.io/projected/20cdd942-7d19-4f30-9952-d7f228b9ce25-kube-api-access-448mb\") on node \"crc\" DevicePath \"\"" Feb 20 00:06:30 crc kubenswrapper[4795]: I0220 00:06:30.430219 4795 scope.go:117] "RemoveContainer" containerID="2a956df3d891332966df5ad18e909d8da26b1727c8e93c48b762296a8b0255d5" Feb 20 00:06:30 crc kubenswrapper[4795]: E0220 00:06:30.430615 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a956df3d891332966df5ad18e909d8da26b1727c8e93c48b762296a8b0255d5\": container with ID starting with 2a956df3d891332966df5ad18e909d8da26b1727c8e93c48b762296a8b0255d5 not found: ID does not exist" containerID="2a956df3d891332966df5ad18e909d8da26b1727c8e93c48b762296a8b0255d5" Feb 20 00:06:30 crc kubenswrapper[4795]: I0220 00:06:30.430638 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a956df3d891332966df5ad18e909d8da26b1727c8e93c48b762296a8b0255d5"} err="failed to get container status \"2a956df3d891332966df5ad18e909d8da26b1727c8e93c48b762296a8b0255d5\": rpc error: code = NotFound desc = could not find container \"2a956df3d891332966df5ad18e909d8da26b1727c8e93c48b762296a8b0255d5\": container with ID starting with 2a956df3d891332966df5ad18e909d8da26b1727c8e93c48b762296a8b0255d5 not found: ID does not exist" Feb 20 00:06:30 crc kubenswrapper[4795]: I0220 00:06:30.430658 4795 scope.go:117] "RemoveContainer" containerID="3b20e09d048f1337f24d9d1e276d0fa4a7957311695be0063853e435b8104822" Feb 20 00:06:30 crc kubenswrapper[4795]: E0220 00:06:30.431108 
4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b20e09d048f1337f24d9d1e276d0fa4a7957311695be0063853e435b8104822\": container with ID starting with 3b20e09d048f1337f24d9d1e276d0fa4a7957311695be0063853e435b8104822 not found: ID does not exist" containerID="3b20e09d048f1337f24d9d1e276d0fa4a7957311695be0063853e435b8104822" Feb 20 00:06:30 crc kubenswrapper[4795]: I0220 00:06:30.431134 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b20e09d048f1337f24d9d1e276d0fa4a7957311695be0063853e435b8104822"} err="failed to get container status \"3b20e09d048f1337f24d9d1e276d0fa4a7957311695be0063853e435b8104822\": rpc error: code = NotFound desc = could not find container \"3b20e09d048f1337f24d9d1e276d0fa4a7957311695be0063853e435b8104822\": container with ID starting with 3b20e09d048f1337f24d9d1e276d0fa4a7957311695be0063853e435b8104822 not found: ID does not exist" Feb 20 00:06:30 crc kubenswrapper[4795]: I0220 00:06:30.431149 4795 scope.go:117] "RemoveContainer" containerID="5b1794b84c59b23398b947ba30b2bc5663729806a9fcfb22249c57f13530b50f" Feb 20 00:06:30 crc kubenswrapper[4795]: E0220 00:06:30.431543 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b1794b84c59b23398b947ba30b2bc5663729806a9fcfb22249c57f13530b50f\": container with ID starting with 5b1794b84c59b23398b947ba30b2bc5663729806a9fcfb22249c57f13530b50f not found: ID does not exist" containerID="5b1794b84c59b23398b947ba30b2bc5663729806a9fcfb22249c57f13530b50f" Feb 20 00:06:30 crc kubenswrapper[4795]: I0220 00:06:30.431645 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b1794b84c59b23398b947ba30b2bc5663729806a9fcfb22249c57f13530b50f"} err="failed to get container status \"5b1794b84c59b23398b947ba30b2bc5663729806a9fcfb22249c57f13530b50f\": rpc error: code = 
NotFound desc = could not find container \"5b1794b84c59b23398b947ba30b2bc5663729806a9fcfb22249c57f13530b50f\": container with ID starting with 5b1794b84c59b23398b947ba30b2bc5663729806a9fcfb22249c57f13530b50f not found: ID does not exist" Feb 20 00:06:30 crc kubenswrapper[4795]: I0220 00:06:30.469094 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wqgfs"] Feb 20 00:06:30 crc kubenswrapper[4795]: I0220 00:06:30.518663 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7hdhn"] Feb 20 00:06:30 crc kubenswrapper[4795]: I0220 00:06:30.528346 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7hdhn"] Feb 20 00:06:31 crc kubenswrapper[4795]: I0220 00:06:31.188901 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wqgfs" podUID="40dbe861-a8a0-4f3f-9d04-5a592dbce6ca" containerName="registry-server" containerID="cri-o://8c4945d0f178c5671668c808efa2896dd2d5275ec4ad197350f883d68eb9ae13" gracePeriod=2 Feb 20 00:06:31 crc kubenswrapper[4795]: I0220 00:06:31.524568 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20cdd942-7d19-4f30-9952-d7f228b9ce25" path="/var/lib/kubelet/pods/20cdd942-7d19-4f30-9952-d7f228b9ce25/volumes" Feb 20 00:06:32 crc kubenswrapper[4795]: I0220 00:06:32.206308 4795 generic.go:334] "Generic (PLEG): container finished" podID="40dbe861-a8a0-4f3f-9d04-5a592dbce6ca" containerID="8c4945d0f178c5671668c808efa2896dd2d5275ec4ad197350f883d68eb9ae13" exitCode=0 Feb 20 00:06:32 crc kubenswrapper[4795]: I0220 00:06:32.206689 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wqgfs" event={"ID":"40dbe861-a8a0-4f3f-9d04-5a592dbce6ca","Type":"ContainerDied","Data":"8c4945d0f178c5671668c808efa2896dd2d5275ec4ad197350f883d68eb9ae13"} Feb 20 00:06:32 crc kubenswrapper[4795]: I0220 
00:06:32.206716 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wqgfs" event={"ID":"40dbe861-a8a0-4f3f-9d04-5a592dbce6ca","Type":"ContainerDied","Data":"c26d2d03a0a7f24ca832726bbb2107bca07d829795090d5a9ba6e14ef44f0230"} Feb 20 00:06:32 crc kubenswrapper[4795]: I0220 00:06:32.206728 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c26d2d03a0a7f24ca832726bbb2107bca07d829795090d5a9ba6e14ef44f0230" Feb 20 00:06:32 crc kubenswrapper[4795]: I0220 00:06:32.376370 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wqgfs" Feb 20 00:06:32 crc kubenswrapper[4795]: I0220 00:06:32.523630 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9gk7\" (UniqueName: \"kubernetes.io/projected/40dbe861-a8a0-4f3f-9d04-5a592dbce6ca-kube-api-access-v9gk7\") pod \"40dbe861-a8a0-4f3f-9d04-5a592dbce6ca\" (UID: \"40dbe861-a8a0-4f3f-9d04-5a592dbce6ca\") " Feb 20 00:06:32 crc kubenswrapper[4795]: I0220 00:06:32.523732 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40dbe861-a8a0-4f3f-9d04-5a592dbce6ca-utilities\") pod \"40dbe861-a8a0-4f3f-9d04-5a592dbce6ca\" (UID: \"40dbe861-a8a0-4f3f-9d04-5a592dbce6ca\") " Feb 20 00:06:32 crc kubenswrapper[4795]: I0220 00:06:32.523892 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40dbe861-a8a0-4f3f-9d04-5a592dbce6ca-catalog-content\") pod \"40dbe861-a8a0-4f3f-9d04-5a592dbce6ca\" (UID: \"40dbe861-a8a0-4f3f-9d04-5a592dbce6ca\") " Feb 20 00:06:32 crc kubenswrapper[4795]: I0220 00:06:32.524386 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40dbe861-a8a0-4f3f-9d04-5a592dbce6ca-utilities" (OuterVolumeSpecName: 
"utilities") pod "40dbe861-a8a0-4f3f-9d04-5a592dbce6ca" (UID: "40dbe861-a8a0-4f3f-9d04-5a592dbce6ca"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 00:06:32 crc kubenswrapper[4795]: I0220 00:06:32.556693 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40dbe861-a8a0-4f3f-9d04-5a592dbce6ca-kube-api-access-v9gk7" (OuterVolumeSpecName: "kube-api-access-v9gk7") pod "40dbe861-a8a0-4f3f-9d04-5a592dbce6ca" (UID: "40dbe861-a8a0-4f3f-9d04-5a592dbce6ca"). InnerVolumeSpecName "kube-api-access-v9gk7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 00:06:32 crc kubenswrapper[4795]: I0220 00:06:32.592255 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40dbe861-a8a0-4f3f-9d04-5a592dbce6ca-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "40dbe861-a8a0-4f3f-9d04-5a592dbce6ca" (UID: "40dbe861-a8a0-4f3f-9d04-5a592dbce6ca"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 00:06:32 crc kubenswrapper[4795]: I0220 00:06:32.626741 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9gk7\" (UniqueName: \"kubernetes.io/projected/40dbe861-a8a0-4f3f-9d04-5a592dbce6ca-kube-api-access-v9gk7\") on node \"crc\" DevicePath \"\"" Feb 20 00:06:32 crc kubenswrapper[4795]: I0220 00:06:32.626779 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40dbe861-a8a0-4f3f-9d04-5a592dbce6ca-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 00:06:32 crc kubenswrapper[4795]: I0220 00:06:32.626791 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40dbe861-a8a0-4f3f-9d04-5a592dbce6ca-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 00:06:33 crc kubenswrapper[4795]: I0220 00:06:33.218055 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wqgfs" Feb 20 00:06:33 crc kubenswrapper[4795]: I0220 00:06:33.261508 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wqgfs"] Feb 20 00:06:33 crc kubenswrapper[4795]: I0220 00:06:33.273354 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wqgfs"] Feb 20 00:06:33 crc kubenswrapper[4795]: I0220 00:06:33.526541 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40dbe861-a8a0-4f3f-9d04-5a592dbce6ca" path="/var/lib/kubelet/pods/40dbe861-a8a0-4f3f-9d04-5a592dbce6ca/volumes" Feb 20 00:06:38 crc kubenswrapper[4795]: I0220 00:06:38.293905 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5c78fc5d65-gp2td_f991b7a1-8af0-4ba7-aa0a-a5e08bdc11d8/nmstate-console-plugin/0.log" Feb 20 00:06:38 crc kubenswrapper[4795]: I0220 00:06:38.477924 4795 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-zqk47_06d09723-c7bd-422c-b447-70dee244cc05/nmstate-handler/0.log" Feb 20 00:06:38 crc kubenswrapper[4795]: I0220 00:06:38.519739 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-kgnfd_1dfc7b5c-9302-4774-a6c8-e76ff4d60385/kube-rbac-proxy/0.log" Feb 20 00:06:38 crc kubenswrapper[4795]: I0220 00:06:38.561681 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-kgnfd_1dfc7b5c-9302-4774-a6c8-e76ff4d60385/nmstate-metrics/0.log" Feb 20 00:06:38 crc kubenswrapper[4795]: I0220 00:06:38.710898 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-694c9596b7-rzkrx_6b614198-6804-46a3-bb1e-d8495c0d53d6/nmstate-operator/0.log" Feb 20 00:06:38 crc kubenswrapper[4795]: I0220 00:06:38.772414 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-866bcb46dc-nk7zb_6c89273b-007f-44e6-88da-f48de3a5f03b/nmstate-webhook/0.log" Feb 20 00:06:54 crc kubenswrapper[4795]: I0220 00:06:54.314427 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-s52kw_0a29e309-2974-42a7-afd9-c77d17f414d0/prometheus-operator/0.log" Feb 20 00:06:54 crc kubenswrapper[4795]: I0220 00:06:54.746256 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-c6bdf87db-l58fd_658abf91-1e8b-4182-998f-76d3ed17b836/prometheus-operator-admission-webhook/0.log" Feb 20 00:06:54 crc kubenswrapper[4795]: I0220 00:06:54.776736 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-c6bdf87db-ssgfb_fb0e3807-a209-43ca-a245-64283a1d021f/prometheus-operator-admission-webhook/0.log" Feb 20 00:06:54 crc kubenswrapper[4795]: I0220 00:06:54.942244 4795 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-ls2vk_5d03bcc6-aa94-401a-9a3b-4970f64537cd/operator/0.log" Feb 20 00:06:54 crc kubenswrapper[4795]: I0220 00:06:54.986109 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-w6sln_e8a44bc2-38b3-4e8d-ab95-e5e6f861b13b/perses-operator/0.log" Feb 20 00:07:09 crc kubenswrapper[4795]: I0220 00:07:09.219738 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-xrsfh_1de9d50a-905a-43f1-bf1b-94d9f8ee7cc4/kube-rbac-proxy/0.log" Feb 20 00:07:09 crc kubenswrapper[4795]: I0220 00:07:09.523798 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b7csh_3c79ff86-c25b-45b2-9f84-d33c6264cc0a/cp-frr-files/0.log" Feb 20 00:07:09 crc kubenswrapper[4795]: I0220 00:07:09.721440 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-xrsfh_1de9d50a-905a-43f1-bf1b-94d9f8ee7cc4/controller/0.log" Feb 20 00:07:09 crc kubenswrapper[4795]: I0220 00:07:09.732761 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b7csh_3c79ff86-c25b-45b2-9f84-d33c6264cc0a/cp-frr-files/0.log" Feb 20 00:07:09 crc kubenswrapper[4795]: I0220 00:07:09.734543 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b7csh_3c79ff86-c25b-45b2-9f84-d33c6264cc0a/cp-reloader/0.log" Feb 20 00:07:09 crc kubenswrapper[4795]: I0220 00:07:09.778344 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b7csh_3c79ff86-c25b-45b2-9f84-d33c6264cc0a/cp-metrics/0.log" Feb 20 00:07:09 crc kubenswrapper[4795]: I0220 00:07:09.879474 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b7csh_3c79ff86-c25b-45b2-9f84-d33c6264cc0a/cp-reloader/0.log" Feb 20 00:07:10 crc kubenswrapper[4795]: I0220 00:07:10.053375 4795 
log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b7csh_3c79ff86-c25b-45b2-9f84-d33c6264cc0a/cp-reloader/0.log" Feb 20 00:07:10 crc kubenswrapper[4795]: I0220 00:07:10.087959 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b7csh_3c79ff86-c25b-45b2-9f84-d33c6264cc0a/cp-frr-files/0.log" Feb 20 00:07:10 crc kubenswrapper[4795]: I0220 00:07:10.096129 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b7csh_3c79ff86-c25b-45b2-9f84-d33c6264cc0a/cp-metrics/0.log" Feb 20 00:07:10 crc kubenswrapper[4795]: I0220 00:07:10.103762 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b7csh_3c79ff86-c25b-45b2-9f84-d33c6264cc0a/cp-metrics/0.log" Feb 20 00:07:10 crc kubenswrapper[4795]: I0220 00:07:10.690380 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b7csh_3c79ff86-c25b-45b2-9f84-d33c6264cc0a/cp-frr-files/0.log" Feb 20 00:07:10 crc kubenswrapper[4795]: I0220 00:07:10.707888 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b7csh_3c79ff86-c25b-45b2-9f84-d33c6264cc0a/cp-metrics/0.log" Feb 20 00:07:10 crc kubenswrapper[4795]: I0220 00:07:10.707942 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b7csh_3c79ff86-c25b-45b2-9f84-d33c6264cc0a/cp-reloader/0.log" Feb 20 00:07:10 crc kubenswrapper[4795]: I0220 00:07:10.757192 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b7csh_3c79ff86-c25b-45b2-9f84-d33c6264cc0a/controller/0.log" Feb 20 00:07:10 crc kubenswrapper[4795]: I0220 00:07:10.911940 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b7csh_3c79ff86-c25b-45b2-9f84-d33c6264cc0a/frr-metrics/0.log" Feb 20 00:07:10 crc kubenswrapper[4795]: I0220 00:07:10.917552 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-b7csh_3c79ff86-c25b-45b2-9f84-d33c6264cc0a/kube-rbac-proxy/0.log" Feb 20 00:07:10 crc kubenswrapper[4795]: I0220 00:07:10.967581 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b7csh_3c79ff86-c25b-45b2-9f84-d33c6264cc0a/kube-rbac-proxy-frr/0.log" Feb 20 00:07:11 crc kubenswrapper[4795]: I0220 00:07:11.131624 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b7csh_3c79ff86-c25b-45b2-9f84-d33c6264cc0a/reloader/0.log" Feb 20 00:07:11 crc kubenswrapper[4795]: I0220 00:07:11.168865 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-78b44bf5bb-5qtpm_e32c1521-9c29-4d70-b4bb-54af4127daaf/frr-k8s-webhook-server/0.log" Feb 20 00:07:11 crc kubenswrapper[4795]: I0220 00:07:11.346840 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-549cf7d797-tscrj_2eb889b2-1f23-4497-a779-5312fcd470b1/manager/0.log" Feb 20 00:07:11 crc kubenswrapper[4795]: I0220 00:07:11.606295 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7d8d766c-z8q6x_94a7e477-a2bd-4c46-8eb0-084260fade4a/webhook-server/0.log" Feb 20 00:07:11 crc kubenswrapper[4795]: I0220 00:07:11.702145 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-kmbww_32ed0d55-a2df-4643-9283-e5bc8d1c993e/kube-rbac-proxy/0.log" Feb 20 00:07:12 crc kubenswrapper[4795]: I0220 00:07:12.702478 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-kmbww_32ed0d55-a2df-4643-9283-e5bc8d1c993e/speaker/0.log" Feb 20 00:07:14 crc kubenswrapper[4795]: I0220 00:07:14.348733 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b7csh_3c79ff86-c25b-45b2-9f84-d33c6264cc0a/frr/0.log" Feb 20 00:07:25 crc kubenswrapper[4795]: I0220 00:07:25.563443 4795 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twkn6_5aba51a3-e783-497f-b58e-dcd4e631b0e9/util/0.log" Feb 20 00:07:25 crc kubenswrapper[4795]: I0220 00:07:25.720474 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twkn6_5aba51a3-e783-497f-b58e-dcd4e631b0e9/util/0.log" Feb 20 00:07:25 crc kubenswrapper[4795]: I0220 00:07:25.777362 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twkn6_5aba51a3-e783-497f-b58e-dcd4e631b0e9/pull/0.log" Feb 20 00:07:25 crc kubenswrapper[4795]: I0220 00:07:25.777459 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twkn6_5aba51a3-e783-497f-b58e-dcd4e631b0e9/pull/0.log" Feb 20 00:07:25 crc kubenswrapper[4795]: I0220 00:07:25.940030 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twkn6_5aba51a3-e783-497f-b58e-dcd4e631b0e9/util/0.log" Feb 20 00:07:26 crc kubenswrapper[4795]: I0220 00:07:26.012501 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twkn6_5aba51a3-e783-497f-b58e-dcd4e631b0e9/extract/0.log" Feb 20 00:07:26 crc kubenswrapper[4795]: I0220 00:07:26.025961 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5twkn6_5aba51a3-e783-497f-b58e-dcd4e631b0e9/pull/0.log" Feb 20 00:07:26 crc kubenswrapper[4795]: I0220 00:07:26.134513 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08fpfwk_42ff3cee-7522-42a5-8cc9-b52b30d45220/util/0.log" Feb 20 00:07:26 crc kubenswrapper[4795]: I0220 00:07:26.303539 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08fpfwk_42ff3cee-7522-42a5-8cc9-b52b30d45220/pull/0.log" Feb 20 00:07:26 crc kubenswrapper[4795]: I0220 00:07:26.335599 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08fpfwk_42ff3cee-7522-42a5-8cc9-b52b30d45220/util/0.log" Feb 20 00:07:26 crc kubenswrapper[4795]: I0220 00:07:26.403332 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08fpfwk_42ff3cee-7522-42a5-8cc9-b52b30d45220/pull/0.log" Feb 20 00:07:26 crc kubenswrapper[4795]: I0220 00:07:26.511156 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08fpfwk_42ff3cee-7522-42a5-8cc9-b52b30d45220/extract/0.log" Feb 20 00:07:26 crc kubenswrapper[4795]: I0220 00:07:26.516785 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08fpfwk_42ff3cee-7522-42a5-8cc9-b52b30d45220/pull/0.log" Feb 20 00:07:26 crc kubenswrapper[4795]: I0220 00:07:26.574190 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08fpfwk_42ff3cee-7522-42a5-8cc9-b52b30d45220/util/0.log" Feb 20 00:07:26 crc kubenswrapper[4795]: I0220 00:07:26.713436 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b9gzj_8f281e72-3a5e-4abb-bbcb-7555808866be/util/0.log" Feb 20 
00:07:26 crc kubenswrapper[4795]: I0220 00:07:26.901957 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b9gzj_8f281e72-3a5e-4abb-bbcb-7555808866be/pull/0.log" Feb 20 00:07:26 crc kubenswrapper[4795]: I0220 00:07:26.904302 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b9gzj_8f281e72-3a5e-4abb-bbcb-7555808866be/util/0.log" Feb 20 00:07:26 crc kubenswrapper[4795]: I0220 00:07:26.907185 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b9gzj_8f281e72-3a5e-4abb-bbcb-7555808866be/pull/0.log" Feb 20 00:07:27 crc kubenswrapper[4795]: I0220 00:07:27.146797 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b9gzj_8f281e72-3a5e-4abb-bbcb-7555808866be/util/0.log" Feb 20 00:07:27 crc kubenswrapper[4795]: I0220 00:07:27.162694 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b9gzj_8f281e72-3a5e-4abb-bbcb-7555808866be/extract/0.log" Feb 20 00:07:27 crc kubenswrapper[4795]: I0220 00:07:27.189447 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213b9gzj_8f281e72-3a5e-4abb-bbcb-7555808866be/pull/0.log" Feb 20 00:07:27 crc kubenswrapper[4795]: I0220 00:07:27.339760 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mdtnr_f457fe15-4099-4d77-8140-3297bee0a182/extract-utilities/0.log" Feb 20 00:07:27 crc kubenswrapper[4795]: I0220 00:07:27.544278 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-mdtnr_f457fe15-4099-4d77-8140-3297bee0a182/extract-content/0.log" Feb 20 00:07:27 crc kubenswrapper[4795]: I0220 00:07:27.567858 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mdtnr_f457fe15-4099-4d77-8140-3297bee0a182/extract-utilities/0.log" Feb 20 00:07:27 crc kubenswrapper[4795]: I0220 00:07:27.588959 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mdtnr_f457fe15-4099-4d77-8140-3297bee0a182/extract-content/0.log" Feb 20 00:07:27 crc kubenswrapper[4795]: I0220 00:07:27.720057 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mdtnr_f457fe15-4099-4d77-8140-3297bee0a182/extract-content/0.log" Feb 20 00:07:27 crc kubenswrapper[4795]: I0220 00:07:27.727733 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mdtnr_f457fe15-4099-4d77-8140-3297bee0a182/extract-utilities/0.log" Feb 20 00:07:27 crc kubenswrapper[4795]: I0220 00:07:27.987413 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-p266t_432a371d-d143-4da7-9332-682f52b39381/extract-utilities/0.log" Feb 20 00:07:28 crc kubenswrapper[4795]: I0220 00:07:28.112447 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-p266t_432a371d-d143-4da7-9332-682f52b39381/extract-utilities/0.log" Feb 20 00:07:28 crc kubenswrapper[4795]: I0220 00:07:28.134183 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-p266t_432a371d-d143-4da7-9332-682f52b39381/extract-content/0.log" Feb 20 00:07:28 crc kubenswrapper[4795]: I0220 00:07:28.189409 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-p266t_432a371d-d143-4da7-9332-682f52b39381/extract-content/0.log" Feb 20 00:07:28 crc kubenswrapper[4795]: I0220 00:07:28.427091 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 00:07:28 crc kubenswrapper[4795]: I0220 00:07:28.427186 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 00:07:28 crc kubenswrapper[4795]: I0220 00:07:28.487070 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-p266t_432a371d-d143-4da7-9332-682f52b39381/extract-utilities/0.log" Feb 20 00:07:28 crc kubenswrapper[4795]: I0220 00:07:28.507539 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-p266t_432a371d-d143-4da7-9332-682f52b39381/extract-content/0.log" Feb 20 00:07:28 crc kubenswrapper[4795]: I0220 00:07:28.774575 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab2g72_bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a/util/0.log" Feb 20 00:07:28 crc kubenswrapper[4795]: I0220 00:07:28.961359 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab2g72_bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a/util/0.log" Feb 20 00:07:29 crc kubenswrapper[4795]: I0220 00:07:29.021382 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab2g72_bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a/pull/0.log" Feb 20 00:07:29 crc kubenswrapper[4795]: I0220 00:07:29.103280 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mdtnr_f457fe15-4099-4d77-8140-3297bee0a182/registry-server/0.log" Feb 20 00:07:29 crc kubenswrapper[4795]: I0220 00:07:29.180822 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab2g72_bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a/pull/0.log" Feb 20 00:07:29 crc kubenswrapper[4795]: I0220 00:07:29.371587 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab2g72_bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a/util/0.log" Feb 20 00:07:29 crc kubenswrapper[4795]: I0220 00:07:29.442618 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab2g72_bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a/pull/0.log" Feb 20 00:07:29 crc kubenswrapper[4795]: I0220 00:07:29.484725 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecab2g72_bf7ae7ea-19bd-44a3-b45f-7f5e2eb30c3a/extract/0.log" Feb 20 00:07:29 crc kubenswrapper[4795]: I0220 00:07:29.648930 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-n9qlf_c91304a6-fa59-4df4-aa17-d7d2f73d9103/marketplace-operator/0.log" Feb 20 00:07:29 crc kubenswrapper[4795]: I0220 00:07:29.752308 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-r8t7x_4941d783-94cd-4a5c-a124-5c8751cc8494/extract-utilities/0.log" Feb 20 00:07:29 crc kubenswrapper[4795]: I0220 00:07:29.826109 4795 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-p266t_432a371d-d143-4da7-9332-682f52b39381/registry-server/0.log" Feb 20 00:07:29 crc kubenswrapper[4795]: I0220 00:07:29.921285 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-r8t7x_4941d783-94cd-4a5c-a124-5c8751cc8494/extract-content/0.log" Feb 20 00:07:29 crc kubenswrapper[4795]: I0220 00:07:29.944594 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-r8t7x_4941d783-94cd-4a5c-a124-5c8751cc8494/extract-content/0.log" Feb 20 00:07:29 crc kubenswrapper[4795]: I0220 00:07:29.958834 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-r8t7x_4941d783-94cd-4a5c-a124-5c8751cc8494/extract-utilities/0.log" Feb 20 00:07:30 crc kubenswrapper[4795]: I0220 00:07:30.139933 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-r8t7x_4941d783-94cd-4a5c-a124-5c8751cc8494/extract-utilities/0.log" Feb 20 00:07:30 crc kubenswrapper[4795]: I0220 00:07:30.140478 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-r8t7x_4941d783-94cd-4a5c-a124-5c8751cc8494/extract-content/0.log" Feb 20 00:07:30 crc kubenswrapper[4795]: I0220 00:07:30.245297 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4v92x_4002b94b-8679-454c-a721-fa900f6cde3b/extract-utilities/0.log" Feb 20 00:07:30 crc kubenswrapper[4795]: I0220 00:07:30.411829 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4v92x_4002b94b-8679-454c-a721-fa900f6cde3b/extract-utilities/0.log" Feb 20 00:07:30 crc kubenswrapper[4795]: I0220 00:07:30.438772 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-4v92x_4002b94b-8679-454c-a721-fa900f6cde3b/extract-content/0.log" Feb 20 00:07:30 crc kubenswrapper[4795]: I0220 00:07:30.527572 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4v92x_4002b94b-8679-454c-a721-fa900f6cde3b/extract-content/0.log" Feb 20 00:07:30 crc kubenswrapper[4795]: I0220 00:07:30.605936 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-r8t7x_4941d783-94cd-4a5c-a124-5c8751cc8494/registry-server/0.log" Feb 20 00:07:30 crc kubenswrapper[4795]: I0220 00:07:30.643527 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4v92x_4002b94b-8679-454c-a721-fa900f6cde3b/extract-content/0.log" Feb 20 00:07:30 crc kubenswrapper[4795]: I0220 00:07:30.695547 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4v92x_4002b94b-8679-454c-a721-fa900f6cde3b/extract-utilities/0.log" Feb 20 00:07:31 crc kubenswrapper[4795]: I0220 00:07:31.827488 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4v92x_4002b94b-8679-454c-a721-fa900f6cde3b/registry-server/0.log" Feb 20 00:07:45 crc kubenswrapper[4795]: I0220 00:07:45.041210 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-c6bdf87db-l58fd_658abf91-1e8b-4182-998f-76d3ed17b836/prometheus-operator-admission-webhook/0.log" Feb 20 00:07:45 crc kubenswrapper[4795]: I0220 00:07:45.044302 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-s52kw_0a29e309-2974-42a7-afd9-c77d17f414d0/prometheus-operator/0.log" Feb 20 00:07:45 crc kubenswrapper[4795]: I0220 00:07:45.119340 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-c6bdf87db-ssgfb_fb0e3807-a209-43ca-a245-64283a1d021f/prometheus-operator-admission-webhook/0.log" Feb 20 00:07:45 crc kubenswrapper[4795]: I0220 00:07:45.261152 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-ls2vk_5d03bcc6-aa94-401a-9a3b-4970f64537cd/operator/0.log" Feb 20 00:07:45 crc kubenswrapper[4795]: I0220 00:07:45.287278 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-w6sln_e8a44bc2-38b3-4e8d-ab95-e5e6f861b13b/perses-operator/0.log" Feb 20 00:07:52 crc kubenswrapper[4795]: E0220 00:07:52.282635 4795 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.69:51250->38.102.83.69:37561: write tcp 38.102.83.69:51250->38.102.83.69:37561: write: broken pipe Feb 20 00:07:58 crc kubenswrapper[4795]: I0220 00:07:58.427644 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 00:07:58 crc kubenswrapper[4795]: I0220 00:07:58.429154 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 00:08:28 crc kubenswrapper[4795]: I0220 00:08:28.427661 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Feb 20 00:08:28 crc kubenswrapper[4795]: I0220 00:08:28.428207 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 00:08:28 crc kubenswrapper[4795]: I0220 00:08:28.428253 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" Feb 20 00:08:28 crc kubenswrapper[4795]: I0220 00:08:28.429077 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bbe670b74801c9c3513113ce23971acfb0538080cd5b8205d654717a9d255d0d"} pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 00:08:28 crc kubenswrapper[4795]: I0220 00:08:28.429128 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" containerID="cri-o://bbe670b74801c9c3513113ce23971acfb0538080cd5b8205d654717a9d255d0d" gracePeriod=600 Feb 20 00:08:29 crc kubenswrapper[4795]: I0220 00:08:29.403610 4795 generic.go:334] "Generic (PLEG): container finished" podID="7591bc58-96f5-486a-8653-0ad93938b019" containerID="bbe670b74801c9c3513113ce23971acfb0538080cd5b8205d654717a9d255d0d" exitCode=0 Feb 20 00:08:29 crc kubenswrapper[4795]: I0220 00:08:29.403704 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" 
event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerDied","Data":"bbe670b74801c9c3513113ce23971acfb0538080cd5b8205d654717a9d255d0d"} Feb 20 00:08:29 crc kubenswrapper[4795]: I0220 00:08:29.404281 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerStarted","Data":"7fd14f8119c4c7db340cc279acfb5316ac699c881833b459f5656e39cece69be"} Feb 20 00:08:29 crc kubenswrapper[4795]: I0220 00:08:29.404306 4795 scope.go:117] "RemoveContainer" containerID="f417d522e5357173b817e8b838d944aa5e17352006ff9b8fed75f316b86d76c7" Feb 20 00:08:44 crc kubenswrapper[4795]: I0220 00:08:44.850619 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8hsl9"] Feb 20 00:08:44 crc kubenswrapper[4795]: E0220 00:08:44.851671 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40dbe861-a8a0-4f3f-9d04-5a592dbce6ca" containerName="extract-utilities" Feb 20 00:08:44 crc kubenswrapper[4795]: I0220 00:08:44.851686 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="40dbe861-a8a0-4f3f-9d04-5a592dbce6ca" containerName="extract-utilities" Feb 20 00:08:44 crc kubenswrapper[4795]: E0220 00:08:44.851703 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40dbe861-a8a0-4f3f-9d04-5a592dbce6ca" containerName="extract-content" Feb 20 00:08:44 crc kubenswrapper[4795]: I0220 00:08:44.851711 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="40dbe861-a8a0-4f3f-9d04-5a592dbce6ca" containerName="extract-content" Feb 20 00:08:44 crc kubenswrapper[4795]: E0220 00:08:44.851731 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20cdd942-7d19-4f30-9952-d7f228b9ce25" containerName="extract-content" Feb 20 00:08:44 crc kubenswrapper[4795]: I0220 00:08:44.851737 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="20cdd942-7d19-4f30-9952-d7f228b9ce25" 
containerName="extract-content" Feb 20 00:08:44 crc kubenswrapper[4795]: E0220 00:08:44.851750 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20cdd942-7d19-4f30-9952-d7f228b9ce25" containerName="registry-server" Feb 20 00:08:44 crc kubenswrapper[4795]: I0220 00:08:44.851757 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="20cdd942-7d19-4f30-9952-d7f228b9ce25" containerName="registry-server" Feb 20 00:08:44 crc kubenswrapper[4795]: E0220 00:08:44.851776 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40dbe861-a8a0-4f3f-9d04-5a592dbce6ca" containerName="registry-server" Feb 20 00:08:44 crc kubenswrapper[4795]: I0220 00:08:44.851782 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="40dbe861-a8a0-4f3f-9d04-5a592dbce6ca" containerName="registry-server" Feb 20 00:08:44 crc kubenswrapper[4795]: E0220 00:08:44.851789 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20cdd942-7d19-4f30-9952-d7f228b9ce25" containerName="extract-utilities" Feb 20 00:08:44 crc kubenswrapper[4795]: I0220 00:08:44.851795 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="20cdd942-7d19-4f30-9952-d7f228b9ce25" containerName="extract-utilities" Feb 20 00:08:44 crc kubenswrapper[4795]: I0220 00:08:44.852032 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="20cdd942-7d19-4f30-9952-d7f228b9ce25" containerName="registry-server" Feb 20 00:08:44 crc kubenswrapper[4795]: I0220 00:08:44.852056 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="40dbe861-a8a0-4f3f-9d04-5a592dbce6ca" containerName="registry-server" Feb 20 00:08:44 crc kubenswrapper[4795]: I0220 00:08:44.853798 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8hsl9" Feb 20 00:08:44 crc kubenswrapper[4795]: I0220 00:08:44.867788 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8hsl9"] Feb 20 00:08:44 crc kubenswrapper[4795]: I0220 00:08:44.887803 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99349ff2-bb13-4040-a617-0e7f78e9e3ed-catalog-content\") pod \"redhat-marketplace-8hsl9\" (UID: \"99349ff2-bb13-4040-a617-0e7f78e9e3ed\") " pod="openshift-marketplace/redhat-marketplace-8hsl9" Feb 20 00:08:44 crc kubenswrapper[4795]: I0220 00:08:44.887869 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8czgq\" (UniqueName: \"kubernetes.io/projected/99349ff2-bb13-4040-a617-0e7f78e9e3ed-kube-api-access-8czgq\") pod \"redhat-marketplace-8hsl9\" (UID: \"99349ff2-bb13-4040-a617-0e7f78e9e3ed\") " pod="openshift-marketplace/redhat-marketplace-8hsl9" Feb 20 00:08:44 crc kubenswrapper[4795]: I0220 00:08:44.888001 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99349ff2-bb13-4040-a617-0e7f78e9e3ed-utilities\") pod \"redhat-marketplace-8hsl9\" (UID: \"99349ff2-bb13-4040-a617-0e7f78e9e3ed\") " pod="openshift-marketplace/redhat-marketplace-8hsl9" Feb 20 00:08:44 crc kubenswrapper[4795]: I0220 00:08:44.989878 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99349ff2-bb13-4040-a617-0e7f78e9e3ed-catalog-content\") pod \"redhat-marketplace-8hsl9\" (UID: \"99349ff2-bb13-4040-a617-0e7f78e9e3ed\") " pod="openshift-marketplace/redhat-marketplace-8hsl9" Feb 20 00:08:44 crc kubenswrapper[4795]: I0220 00:08:44.990160 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-8czgq\" (UniqueName: \"kubernetes.io/projected/99349ff2-bb13-4040-a617-0e7f78e9e3ed-kube-api-access-8czgq\") pod \"redhat-marketplace-8hsl9\" (UID: \"99349ff2-bb13-4040-a617-0e7f78e9e3ed\") " pod="openshift-marketplace/redhat-marketplace-8hsl9" Feb 20 00:08:44 crc kubenswrapper[4795]: I0220 00:08:44.990306 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99349ff2-bb13-4040-a617-0e7f78e9e3ed-utilities\") pod \"redhat-marketplace-8hsl9\" (UID: \"99349ff2-bb13-4040-a617-0e7f78e9e3ed\") " pod="openshift-marketplace/redhat-marketplace-8hsl9" Feb 20 00:08:44 crc kubenswrapper[4795]: I0220 00:08:44.990441 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99349ff2-bb13-4040-a617-0e7f78e9e3ed-catalog-content\") pod \"redhat-marketplace-8hsl9\" (UID: \"99349ff2-bb13-4040-a617-0e7f78e9e3ed\") " pod="openshift-marketplace/redhat-marketplace-8hsl9" Feb 20 00:08:44 crc kubenswrapper[4795]: I0220 00:08:44.990652 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99349ff2-bb13-4040-a617-0e7f78e9e3ed-utilities\") pod \"redhat-marketplace-8hsl9\" (UID: \"99349ff2-bb13-4040-a617-0e7f78e9e3ed\") " pod="openshift-marketplace/redhat-marketplace-8hsl9" Feb 20 00:08:45 crc kubenswrapper[4795]: I0220 00:08:45.009575 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8czgq\" (UniqueName: \"kubernetes.io/projected/99349ff2-bb13-4040-a617-0e7f78e9e3ed-kube-api-access-8czgq\") pod \"redhat-marketplace-8hsl9\" (UID: \"99349ff2-bb13-4040-a617-0e7f78e9e3ed\") " pod="openshift-marketplace/redhat-marketplace-8hsl9" Feb 20 00:08:45 crc kubenswrapper[4795]: I0220 00:08:45.196641 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8hsl9"
Feb 20 00:08:45 crc kubenswrapper[4795]: I0220 00:08:45.689543 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8hsl9"]
Feb 20 00:08:46 crc kubenswrapper[4795]: I0220 00:08:46.598366 4795 generic.go:334] "Generic (PLEG): container finished" podID="99349ff2-bb13-4040-a617-0e7f78e9e3ed" containerID="cc04e107df9387fa933d9b2754e61d9ccb95dd1a55a0a1b37be3a3de02132c69" exitCode=0
Feb 20 00:08:46 crc kubenswrapper[4795]: I0220 00:08:46.598412 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8hsl9" event={"ID":"99349ff2-bb13-4040-a617-0e7f78e9e3ed","Type":"ContainerDied","Data":"cc04e107df9387fa933d9b2754e61d9ccb95dd1a55a0a1b37be3a3de02132c69"}
Feb 20 00:08:46 crc kubenswrapper[4795]: I0220 00:08:46.598774 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8hsl9" event={"ID":"99349ff2-bb13-4040-a617-0e7f78e9e3ed","Type":"ContainerStarted","Data":"4adb63b598af0614119b00d6faa7a9924ef2d0a6f681c9bc978a176a914368ef"}
Feb 20 00:08:48 crc kubenswrapper[4795]: I0220 00:08:48.619372 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8hsl9" event={"ID":"99349ff2-bb13-4040-a617-0e7f78e9e3ed","Type":"ContainerStarted","Data":"56c2f512ffb3e5146cc6b5ad4d6b00d311420e2ad85ca35ad134f00ed3ab8011"}
Feb 20 00:08:49 crc kubenswrapper[4795]: I0220 00:08:49.631094 4795 generic.go:334] "Generic (PLEG): container finished" podID="99349ff2-bb13-4040-a617-0e7f78e9e3ed" containerID="56c2f512ffb3e5146cc6b5ad4d6b00d311420e2ad85ca35ad134f00ed3ab8011" exitCode=0
Feb 20 00:08:49 crc kubenswrapper[4795]: I0220 00:08:49.631364 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8hsl9" event={"ID":"99349ff2-bb13-4040-a617-0e7f78e9e3ed","Type":"ContainerDied","Data":"56c2f512ffb3e5146cc6b5ad4d6b00d311420e2ad85ca35ad134f00ed3ab8011"}
Feb 20 00:08:51 crc kubenswrapper[4795]: I0220 00:08:51.655197 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8hsl9" event={"ID":"99349ff2-bb13-4040-a617-0e7f78e9e3ed","Type":"ContainerStarted","Data":"8630eb78752030da728fb607630fea61e7430267e5b9298cff9f74b4653bba09"}
Feb 20 00:08:51 crc kubenswrapper[4795]: I0220 00:08:51.679593 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8hsl9" podStartSLOduration=4.211116367 podStartE2EDuration="7.679574046s" podCreationTimestamp="2026-02-20 00:08:44 +0000 UTC" firstStartedPulling="2026-02-20 00:08:46.600484389 +0000 UTC m=+9637.793002253" lastFinishedPulling="2026-02-20 00:08:50.068942058 +0000 UTC m=+9641.261459932" observedRunningTime="2026-02-20 00:08:51.67937744 +0000 UTC m=+9642.871895304" watchObservedRunningTime="2026-02-20 00:08:51.679574046 +0000 UTC m=+9642.872091900"
Feb 20 00:08:55 crc kubenswrapper[4795]: I0220 00:08:55.197474 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8hsl9"
Feb 20 00:08:55 crc kubenswrapper[4795]: I0220 00:08:55.197974 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8hsl9"
Feb 20 00:08:55 crc kubenswrapper[4795]: I0220 00:08:55.250899 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8hsl9"
Feb 20 00:08:55 crc kubenswrapper[4795]: I0220 00:08:55.761942 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8hsl9"
Feb 20 00:08:56 crc kubenswrapper[4795]: I0220 00:08:56.036487 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8hsl9"]
Feb 20 00:08:57 crc kubenswrapper[4795]: I0220 00:08:57.726673 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8hsl9" podUID="99349ff2-bb13-4040-a617-0e7f78e9e3ed" containerName="registry-server" containerID="cri-o://8630eb78752030da728fb607630fea61e7430267e5b9298cff9f74b4653bba09" gracePeriod=2
Feb 20 00:08:58 crc kubenswrapper[4795]: I0220 00:08:58.248477 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8hsl9"
Feb 20 00:08:58 crc kubenswrapper[4795]: I0220 00:08:58.377656 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99349ff2-bb13-4040-a617-0e7f78e9e3ed-utilities\") pod \"99349ff2-bb13-4040-a617-0e7f78e9e3ed\" (UID: \"99349ff2-bb13-4040-a617-0e7f78e9e3ed\") "
Feb 20 00:08:58 crc kubenswrapper[4795]: I0220 00:08:58.377710 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99349ff2-bb13-4040-a617-0e7f78e9e3ed-catalog-content\") pod \"99349ff2-bb13-4040-a617-0e7f78e9e3ed\" (UID: \"99349ff2-bb13-4040-a617-0e7f78e9e3ed\") "
Feb 20 00:08:58 crc kubenswrapper[4795]: I0220 00:08:58.377955 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8czgq\" (UniqueName: \"kubernetes.io/projected/99349ff2-bb13-4040-a617-0e7f78e9e3ed-kube-api-access-8czgq\") pod \"99349ff2-bb13-4040-a617-0e7f78e9e3ed\" (UID: \"99349ff2-bb13-4040-a617-0e7f78e9e3ed\") "
Feb 20 00:08:58 crc kubenswrapper[4795]: I0220 00:08:58.378656 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99349ff2-bb13-4040-a617-0e7f78e9e3ed-utilities" (OuterVolumeSpecName: "utilities") pod "99349ff2-bb13-4040-a617-0e7f78e9e3ed" (UID: "99349ff2-bb13-4040-a617-0e7f78e9e3ed"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 00:08:58 crc kubenswrapper[4795]: I0220 00:08:58.382933 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99349ff2-bb13-4040-a617-0e7f78e9e3ed-kube-api-access-8czgq" (OuterVolumeSpecName: "kube-api-access-8czgq") pod "99349ff2-bb13-4040-a617-0e7f78e9e3ed" (UID: "99349ff2-bb13-4040-a617-0e7f78e9e3ed"). InnerVolumeSpecName "kube-api-access-8czgq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 00:08:58 crc kubenswrapper[4795]: I0220 00:08:58.409647 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99349ff2-bb13-4040-a617-0e7f78e9e3ed-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "99349ff2-bb13-4040-a617-0e7f78e9e3ed" (UID: "99349ff2-bb13-4040-a617-0e7f78e9e3ed"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 00:08:58 crc kubenswrapper[4795]: I0220 00:08:58.480279 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8czgq\" (UniqueName: \"kubernetes.io/projected/99349ff2-bb13-4040-a617-0e7f78e9e3ed-kube-api-access-8czgq\") on node \"crc\" DevicePath \"\""
Feb 20 00:08:58 crc kubenswrapper[4795]: I0220 00:08:58.480316 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99349ff2-bb13-4040-a617-0e7f78e9e3ed-utilities\") on node \"crc\" DevicePath \"\""
Feb 20 00:08:58 crc kubenswrapper[4795]: I0220 00:08:58.480326 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99349ff2-bb13-4040-a617-0e7f78e9e3ed-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 20 00:08:58 crc kubenswrapper[4795]: I0220 00:08:58.741138 4795 generic.go:334] "Generic (PLEG): container finished" podID="99349ff2-bb13-4040-a617-0e7f78e9e3ed" containerID="8630eb78752030da728fb607630fea61e7430267e5b9298cff9f74b4653bba09" exitCode=0
Feb 20 00:08:58 crc kubenswrapper[4795]: I0220 00:08:58.741778 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8hsl9" event={"ID":"99349ff2-bb13-4040-a617-0e7f78e9e3ed","Type":"ContainerDied","Data":"8630eb78752030da728fb607630fea61e7430267e5b9298cff9f74b4653bba09"}
Feb 20 00:08:58 crc kubenswrapper[4795]: I0220 00:08:58.741834 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8hsl9" event={"ID":"99349ff2-bb13-4040-a617-0e7f78e9e3ed","Type":"ContainerDied","Data":"4adb63b598af0614119b00d6faa7a9924ef2d0a6f681c9bc978a176a914368ef"}
Feb 20 00:08:58 crc kubenswrapper[4795]: I0220 00:08:58.741873 4795 scope.go:117] "RemoveContainer" containerID="8630eb78752030da728fb607630fea61e7430267e5b9298cff9f74b4653bba09"
Feb 20 00:08:58 crc kubenswrapper[4795]: I0220 00:08:58.742130 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8hsl9"
Feb 20 00:08:58 crc kubenswrapper[4795]: I0220 00:08:58.775256 4795 scope.go:117] "RemoveContainer" containerID="56c2f512ffb3e5146cc6b5ad4d6b00d311420e2ad85ca35ad134f00ed3ab8011"
Feb 20 00:08:58 crc kubenswrapper[4795]: I0220 00:08:58.801051 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8hsl9"]
Feb 20 00:08:58 crc kubenswrapper[4795]: I0220 00:08:58.811144 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8hsl9"]
Feb 20 00:08:58 crc kubenswrapper[4795]: I0220 00:08:58.822824 4795 scope.go:117] "RemoveContainer" containerID="cc04e107df9387fa933d9b2754e61d9ccb95dd1a55a0a1b37be3a3de02132c69"
Feb 20 00:08:58 crc kubenswrapper[4795]: I0220 00:08:58.867341 4795 scope.go:117] "RemoveContainer" containerID="8630eb78752030da728fb607630fea61e7430267e5b9298cff9f74b4653bba09"
Feb 20 00:08:58 crc kubenswrapper[4795]: E0220 00:08:58.867703 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8630eb78752030da728fb607630fea61e7430267e5b9298cff9f74b4653bba09\": container with ID starting with 8630eb78752030da728fb607630fea61e7430267e5b9298cff9f74b4653bba09 not found: ID does not exist" containerID="8630eb78752030da728fb607630fea61e7430267e5b9298cff9f74b4653bba09"
Feb 20 00:08:58 crc kubenswrapper[4795]: I0220 00:08:58.867743 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8630eb78752030da728fb607630fea61e7430267e5b9298cff9f74b4653bba09"} err="failed to get container status \"8630eb78752030da728fb607630fea61e7430267e5b9298cff9f74b4653bba09\": rpc error: code = NotFound desc = could not find container \"8630eb78752030da728fb607630fea61e7430267e5b9298cff9f74b4653bba09\": container with ID starting with 8630eb78752030da728fb607630fea61e7430267e5b9298cff9f74b4653bba09 not found: ID does not exist"
Feb 20 00:08:58 crc kubenswrapper[4795]: I0220 00:08:58.867763 4795 scope.go:117] "RemoveContainer" containerID="56c2f512ffb3e5146cc6b5ad4d6b00d311420e2ad85ca35ad134f00ed3ab8011"
Feb 20 00:08:58 crc kubenswrapper[4795]: E0220 00:08:58.868075 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56c2f512ffb3e5146cc6b5ad4d6b00d311420e2ad85ca35ad134f00ed3ab8011\": container with ID starting with 56c2f512ffb3e5146cc6b5ad4d6b00d311420e2ad85ca35ad134f00ed3ab8011 not found: ID does not exist" containerID="56c2f512ffb3e5146cc6b5ad4d6b00d311420e2ad85ca35ad134f00ed3ab8011"
Feb 20 00:08:58 crc kubenswrapper[4795]: I0220 00:08:58.868105 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56c2f512ffb3e5146cc6b5ad4d6b00d311420e2ad85ca35ad134f00ed3ab8011"} err="failed to get container status \"56c2f512ffb3e5146cc6b5ad4d6b00d311420e2ad85ca35ad134f00ed3ab8011\": rpc error: code = NotFound desc = could not find container \"56c2f512ffb3e5146cc6b5ad4d6b00d311420e2ad85ca35ad134f00ed3ab8011\": container with ID starting with 56c2f512ffb3e5146cc6b5ad4d6b00d311420e2ad85ca35ad134f00ed3ab8011 not found: ID does not exist"
Feb 20 00:08:58 crc kubenswrapper[4795]: I0220 00:08:58.868125 4795 scope.go:117] "RemoveContainer" containerID="cc04e107df9387fa933d9b2754e61d9ccb95dd1a55a0a1b37be3a3de02132c69"
Feb 20 00:08:58 crc kubenswrapper[4795]: E0220 00:08:58.868397 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc04e107df9387fa933d9b2754e61d9ccb95dd1a55a0a1b37be3a3de02132c69\": container with ID starting with cc04e107df9387fa933d9b2754e61d9ccb95dd1a55a0a1b37be3a3de02132c69 not found: ID does not exist" containerID="cc04e107df9387fa933d9b2754e61d9ccb95dd1a55a0a1b37be3a3de02132c69"
Feb 20 00:08:58 crc kubenswrapper[4795]: I0220 00:08:58.868430 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc04e107df9387fa933d9b2754e61d9ccb95dd1a55a0a1b37be3a3de02132c69"} err="failed to get container status \"cc04e107df9387fa933d9b2754e61d9ccb95dd1a55a0a1b37be3a3de02132c69\": rpc error: code = NotFound desc = could not find container \"cc04e107df9387fa933d9b2754e61d9ccb95dd1a55a0a1b37be3a3de02132c69\": container with ID starting with cc04e107df9387fa933d9b2754e61d9ccb95dd1a55a0a1b37be3a3de02132c69 not found: ID does not exist"
Feb 20 00:08:59 crc kubenswrapper[4795]: I0220 00:08:59.524772 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99349ff2-bb13-4040-a617-0e7f78e9e3ed" path="/var/lib/kubelet/pods/99349ff2-bb13-4040-a617-0e7f78e9e3ed/volumes"
Feb 20 00:09:58 crc kubenswrapper[4795]: I0220 00:09:58.427833 4795 generic.go:334] "Generic (PLEG): container finished" podID="d06c32e0-d01f-47e9-871b-9fdfb391d796" containerID="35f5634e272e68850396058b8dbd9d80a146c6ccfa053b21de355451b37156a5" exitCode=0
Feb 20 00:09:58 crc kubenswrapper[4795]: I0220 00:09:58.427937 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gv265/must-gather-ltjmp" event={"ID":"d06c32e0-d01f-47e9-871b-9fdfb391d796","Type":"ContainerDied","Data":"35f5634e272e68850396058b8dbd9d80a146c6ccfa053b21de355451b37156a5"}
Feb 20 00:09:58 crc kubenswrapper[4795]: I0220 00:09:58.429226 4795 scope.go:117] "RemoveContainer" containerID="35f5634e272e68850396058b8dbd9d80a146c6ccfa053b21de355451b37156a5"
Feb 20 00:09:58 crc kubenswrapper[4795]: I0220 00:09:58.785138 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-gv265_must-gather-ltjmp_d06c32e0-d01f-47e9-871b-9fdfb391d796/gather/0.log"
Feb 20 00:10:08 crc kubenswrapper[4795]: I0220 00:10:08.041775 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-gv265/must-gather-ltjmp"]
Feb 20 00:10:08 crc kubenswrapper[4795]: I0220 00:10:08.042793 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-gv265/must-gather-ltjmp" podUID="d06c32e0-d01f-47e9-871b-9fdfb391d796" containerName="copy" containerID="cri-o://340d2615d25276b88141337bb1421c18e4c691a87d218a924ca2427d0a329e70" gracePeriod=2
Feb 20 00:10:08 crc kubenswrapper[4795]: I0220 00:10:08.067000 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-gv265/must-gather-ltjmp"]
Feb 20 00:10:08 crc kubenswrapper[4795]: I0220 00:10:08.511436 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-gv265_must-gather-ltjmp_d06c32e0-d01f-47e9-871b-9fdfb391d796/copy/0.log"
Feb 20 00:10:08 crc kubenswrapper[4795]: I0220 00:10:08.513145 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gv265/must-gather-ltjmp"
Feb 20 00:10:08 crc kubenswrapper[4795]: I0220 00:10:08.516160 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5kfl\" (UniqueName: \"kubernetes.io/projected/d06c32e0-d01f-47e9-871b-9fdfb391d796-kube-api-access-g5kfl\") pod \"d06c32e0-d01f-47e9-871b-9fdfb391d796\" (UID: \"d06c32e0-d01f-47e9-871b-9fdfb391d796\") "
Feb 20 00:10:08 crc kubenswrapper[4795]: I0220 00:10:08.516390 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d06c32e0-d01f-47e9-871b-9fdfb391d796-must-gather-output\") pod \"d06c32e0-d01f-47e9-871b-9fdfb391d796\" (UID: \"d06c32e0-d01f-47e9-871b-9fdfb391d796\") "
Feb 20 00:10:08 crc kubenswrapper[4795]: I0220 00:10:08.552374 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d06c32e0-d01f-47e9-871b-9fdfb391d796-kube-api-access-g5kfl" (OuterVolumeSpecName: "kube-api-access-g5kfl") pod "d06c32e0-d01f-47e9-871b-9fdfb391d796" (UID: "d06c32e0-d01f-47e9-871b-9fdfb391d796"). InnerVolumeSpecName "kube-api-access-g5kfl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 00:10:08 crc kubenswrapper[4795]: I0220 00:10:08.567836 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-gv265_must-gather-ltjmp_d06c32e0-d01f-47e9-871b-9fdfb391d796/copy/0.log"
Feb 20 00:10:08 crc kubenswrapper[4795]: I0220 00:10:08.570836 4795 generic.go:334] "Generic (PLEG): container finished" podID="d06c32e0-d01f-47e9-871b-9fdfb391d796" containerID="340d2615d25276b88141337bb1421c18e4c691a87d218a924ca2427d0a329e70" exitCode=143
Feb 20 00:10:08 crc kubenswrapper[4795]: I0220 00:10:08.570888 4795 scope.go:117] "RemoveContainer" containerID="340d2615d25276b88141337bb1421c18e4c691a87d218a924ca2427d0a329e70"
Feb 20 00:10:08 crc kubenswrapper[4795]: I0220 00:10:08.571014 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gv265/must-gather-ltjmp"
Feb 20 00:10:08 crc kubenswrapper[4795]: I0220 00:10:08.604685 4795 scope.go:117] "RemoveContainer" containerID="35f5634e272e68850396058b8dbd9d80a146c6ccfa053b21de355451b37156a5"
Feb 20 00:10:08 crc kubenswrapper[4795]: I0220 00:10:08.620482 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5kfl\" (UniqueName: \"kubernetes.io/projected/d06c32e0-d01f-47e9-871b-9fdfb391d796-kube-api-access-g5kfl\") on node \"crc\" DevicePath \"\""
Feb 20 00:10:08 crc kubenswrapper[4795]: I0220 00:10:08.670807 4795 scope.go:117] "RemoveContainer" containerID="340d2615d25276b88141337bb1421c18e4c691a87d218a924ca2427d0a329e70"
Feb 20 00:10:08 crc kubenswrapper[4795]: E0220 00:10:08.671312 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"340d2615d25276b88141337bb1421c18e4c691a87d218a924ca2427d0a329e70\": container with ID starting with 340d2615d25276b88141337bb1421c18e4c691a87d218a924ca2427d0a329e70 not found: ID does not exist" containerID="340d2615d25276b88141337bb1421c18e4c691a87d218a924ca2427d0a329e70"
Feb 20 00:10:08 crc kubenswrapper[4795]: I0220 00:10:08.671384 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"340d2615d25276b88141337bb1421c18e4c691a87d218a924ca2427d0a329e70"} err="failed to get container status \"340d2615d25276b88141337bb1421c18e4c691a87d218a924ca2427d0a329e70\": rpc error: code = NotFound desc = could not find container \"340d2615d25276b88141337bb1421c18e4c691a87d218a924ca2427d0a329e70\": container with ID starting with 340d2615d25276b88141337bb1421c18e4c691a87d218a924ca2427d0a329e70 not found: ID does not exist"
Feb 20 00:10:08 crc kubenswrapper[4795]: I0220 00:10:08.671413 4795 scope.go:117] "RemoveContainer" containerID="35f5634e272e68850396058b8dbd9d80a146c6ccfa053b21de355451b37156a5"
Feb 20 00:10:08 crc kubenswrapper[4795]: E0220 00:10:08.671713 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35f5634e272e68850396058b8dbd9d80a146c6ccfa053b21de355451b37156a5\": container with ID starting with 35f5634e272e68850396058b8dbd9d80a146c6ccfa053b21de355451b37156a5 not found: ID does not exist" containerID="35f5634e272e68850396058b8dbd9d80a146c6ccfa053b21de355451b37156a5"
Feb 20 00:10:08 crc kubenswrapper[4795]: I0220 00:10:08.671747 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35f5634e272e68850396058b8dbd9d80a146c6ccfa053b21de355451b37156a5"} err="failed to get container status \"35f5634e272e68850396058b8dbd9d80a146c6ccfa053b21de355451b37156a5\": rpc error: code = NotFound desc = could not find container \"35f5634e272e68850396058b8dbd9d80a146c6ccfa053b21de355451b37156a5\": container with ID starting with 35f5634e272e68850396058b8dbd9d80a146c6ccfa053b21de355451b37156a5 not found: ID does not exist"
Feb 20 00:10:08 crc kubenswrapper[4795]: I0220 00:10:08.749055 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d06c32e0-d01f-47e9-871b-9fdfb391d796-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "d06c32e0-d01f-47e9-871b-9fdfb391d796" (UID: "d06c32e0-d01f-47e9-871b-9fdfb391d796"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 00:10:08 crc kubenswrapper[4795]: I0220 00:10:08.825215 4795 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d06c32e0-d01f-47e9-871b-9fdfb391d796-must-gather-output\") on node \"crc\" DevicePath \"\""
Feb 20 00:10:09 crc kubenswrapper[4795]: I0220 00:10:09.537606 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d06c32e0-d01f-47e9-871b-9fdfb391d796" path="/var/lib/kubelet/pods/d06c32e0-d01f-47e9-871b-9fdfb391d796/volumes"
Feb 20 00:10:28 crc kubenswrapper[4795]: I0220 00:10:28.428104 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 20 00:10:28 crc kubenswrapper[4795]: I0220 00:10:28.428587 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 20 00:10:58 crc kubenswrapper[4795]: I0220 00:10:58.427794 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 20 00:10:58 crc kubenswrapper[4795]: I0220 00:10:58.429440 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 20 00:11:28 crc kubenswrapper[4795]: I0220 00:11:28.427112 4795 patch_prober.go:28] interesting pod/machine-config-daemon-fxj5d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 20 00:11:28 crc kubenswrapper[4795]: I0220 00:11:28.427715 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 20 00:11:28 crc kubenswrapper[4795]: I0220 00:11:28.427759 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d"
Feb 20 00:11:28 crc kubenswrapper[4795]: I0220 00:11:28.428588 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7fd14f8119c4c7db340cc279acfb5316ac699c881833b459f5656e39cece69be"} pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 20 00:11:28 crc kubenswrapper[4795]: I0220 00:11:28.428637 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019" containerName="machine-config-daemon" containerID="cri-o://7fd14f8119c4c7db340cc279acfb5316ac699c881833b459f5656e39cece69be" gracePeriod=600
Feb 20 00:11:29 crc kubenswrapper[4795]: I0220 00:11:29.360500 4795 generic.go:334] "Generic (PLEG): container finished" podID="7591bc58-96f5-486a-8653-0ad93938b019" containerID="7fd14f8119c4c7db340cc279acfb5316ac699c881833b459f5656e39cece69be" exitCode=0
Feb 20 00:11:29 crc kubenswrapper[4795]: I0220 00:11:29.360545 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" event={"ID":"7591bc58-96f5-486a-8653-0ad93938b019","Type":"ContainerDied","Data":"7fd14f8119c4c7db340cc279acfb5316ac699c881833b459f5656e39cece69be"}
Feb 20 00:11:29 crc kubenswrapper[4795]: I0220 00:11:29.360583 4795 scope.go:117] "RemoveContainer" containerID="bbe670b74801c9c3513113ce23971acfb0538080cd5b8205d654717a9d255d0d"
Feb 20 00:11:29 crc kubenswrapper[4795]: E0220 00:11:29.544550 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 20 00:11:30 crc kubenswrapper[4795]: I0220 00:11:30.372433 4795 scope.go:117] "RemoveContainer" containerID="7fd14f8119c4c7db340cc279acfb5316ac699c881833b459f5656e39cece69be"
Feb 20 00:11:30 crc kubenswrapper[4795]: E0220 00:11:30.373077 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 20 00:11:45 crc kubenswrapper[4795]: I0220 00:11:45.512027 4795 scope.go:117] "RemoveContainer" containerID="7fd14f8119c4c7db340cc279acfb5316ac699c881833b459f5656e39cece69be"
Feb 20 00:11:45 crc kubenswrapper[4795]: E0220 00:11:45.512838 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 20 00:11:58 crc kubenswrapper[4795]: I0220 00:11:58.512017 4795 scope.go:117] "RemoveContainer" containerID="7fd14f8119c4c7db340cc279acfb5316ac699c881833b459f5656e39cece69be"
Feb 20 00:11:58 crc kubenswrapper[4795]: E0220 00:11:58.512873 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 20 00:12:09 crc kubenswrapper[4795]: I0220 00:12:09.522958 4795 scope.go:117] "RemoveContainer" containerID="7fd14f8119c4c7db340cc279acfb5316ac699c881833b459f5656e39cece69be"
Feb 20 00:12:09 crc kubenswrapper[4795]: E0220 00:12:09.523739 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 20 00:12:24 crc kubenswrapper[4795]: I0220 00:12:24.511732 4795 scope.go:117] "RemoveContainer" containerID="7fd14f8119c4c7db340cc279acfb5316ac699c881833b459f5656e39cece69be"
Feb 20 00:12:24 crc kubenswrapper[4795]: E0220 00:12:24.512685 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 20 00:12:29 crc kubenswrapper[4795]: I0220 00:12:29.850319 4795 scope.go:117] "RemoveContainer" containerID="8c4945d0f178c5671668c808efa2896dd2d5275ec4ad197350f883d68eb9ae13"
Feb 20 00:12:29 crc kubenswrapper[4795]: I0220 00:12:29.871339 4795 scope.go:117] "RemoveContainer" containerID="29b21899c393a3503ba5ae6dcf3912d8e17c4a592516cd085863f44ad997cd91"
Feb 20 00:12:29 crc kubenswrapper[4795]: I0220 00:12:29.891403 4795 scope.go:117] "RemoveContainer" containerID="1bcbbf1fb3a51b7809b2c9d3857d7d30bd666a5c7ea1fd4ea2c2765511ea4288"
Feb 20 00:12:39 crc kubenswrapper[4795]: I0220 00:12:39.518625 4795 scope.go:117] "RemoveContainer" containerID="7fd14f8119c4c7db340cc279acfb5316ac699c881833b459f5656e39cece69be"
Feb 20 00:12:39 crc kubenswrapper[4795]: E0220 00:12:39.519635 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 20 00:12:53 crc kubenswrapper[4795]: I0220 00:12:53.514879 4795 scope.go:117] "RemoveContainer" containerID="7fd14f8119c4c7db340cc279acfb5316ac699c881833b459f5656e39cece69be"
Feb 20 00:12:53 crc kubenswrapper[4795]: E0220 00:12:53.516065 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 20 00:13:06 crc kubenswrapper[4795]: I0220 00:13:06.511775 4795 scope.go:117] "RemoveContainer" containerID="7fd14f8119c4c7db340cc279acfb5316ac699c881833b459f5656e39cece69be"
Feb 20 00:13:06 crc kubenswrapper[4795]: E0220 00:13:06.512771 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 20 00:13:21 crc kubenswrapper[4795]: I0220 00:13:21.512321 4795 scope.go:117] "RemoveContainer" containerID="7fd14f8119c4c7db340cc279acfb5316ac699c881833b459f5656e39cece69be"
Feb 20 00:13:21 crc kubenswrapper[4795]: E0220 00:13:21.513000 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 20 00:13:33 crc kubenswrapper[4795]: I0220 00:13:33.512746 4795 scope.go:117] "RemoveContainer" containerID="7fd14f8119c4c7db340cc279acfb5316ac699c881833b459f5656e39cece69be"
Feb 20 00:13:33 crc kubenswrapper[4795]: E0220 00:13:33.513854 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 20 00:13:48 crc kubenswrapper[4795]: I0220 00:13:48.512022 4795 scope.go:117] "RemoveContainer" containerID="7fd14f8119c4c7db340cc279acfb5316ac699c881833b459f5656e39cece69be"
Feb 20 00:13:48 crc kubenswrapper[4795]: E0220 00:13:48.512968 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 20 00:14:01 crc kubenswrapper[4795]: I0220 00:14:01.511800 4795 scope.go:117] "RemoveContainer" containerID="7fd14f8119c4c7db340cc279acfb5316ac699c881833b459f5656e39cece69be"
Feb 20 00:14:01 crc kubenswrapper[4795]: E0220 00:14:01.512834 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 20 00:14:15 crc kubenswrapper[4795]: I0220 00:14:15.511553 4795 scope.go:117] "RemoveContainer" containerID="7fd14f8119c4c7db340cc279acfb5316ac699c881833b459f5656e39cece69be"
Feb 20 00:14:15 crc kubenswrapper[4795]: E0220 00:14:15.512516 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"
Feb 20 00:14:26 crc kubenswrapper[4795]: I0220 00:14:26.512528 4795 scope.go:117] "RemoveContainer" containerID="7fd14f8119c4c7db340cc279acfb5316ac699c881833b459f5656e39cece69be"
Feb 20 00:14:26 crc kubenswrapper[4795]: E0220 00:14:26.513821 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fxj5d_openshift-machine-config-operator(7591bc58-96f5-486a-8653-0ad93938b019)\"" pod="openshift-machine-config-operator/machine-config-daemon-fxj5d" podUID="7591bc58-96f5-486a-8653-0ad93938b019"